A lot of computations are really higher-complexity functions with stair steps at hardware-determined intervals, pretending to be constant time. Every operation costs the same until n doubles past a word size or cache level, and then it gets slower. If you zoom out toward infinity, the stair steps smooth into a curve: logarithmic for arithmetic, where cost tracks the bit length of the operands, and square root for memory lookups, where cost tracks physical distance on the chip. Even dividing two numbers or doing a memory address lookup stops being constant, which is part of why prime factoring worked for RSA for so long.
If anyone had made clockless logic work, you would see that adding 1 + 1 is in fact faster than adding 2^63 + 1.
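A software sketch of the idea (my own illustration, not from the original comment): simulate a ripple-carry adder and count the steps until the carry chain settles. The step count stands in for the gate delays an asynchronous adder would actually wait out, so small operands finish sooner than large ones.

```python
def ripple_add(a, b):
    """Add two non-negative ints bit by bit, counting carry-chain steps."""
    result, carry, pos, steps = 0, 0, 0, 0
    while a or b or carry:
        total = (a & 1) + (b & 1) + carry   # one full-adder stage
        result |= (total & 1) << pos
        carry = total >> 1
        a >>= 1
        b >>= 1
        pos += 1
        steps += 1                           # one gate delay in a clockless adder
    return result, steps

print(ripple_add(1, 1))        # (2, 2): settles in 2 steps
print(ripple_add(2**63, 1))    # (2**63 + 1, 64): 64 steps to reach bit 63
```

A clocked CPU pads every add out to the worst case; the step counter here is what a clockless design would let you stop paying for.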
If you put enough data into a hash table, the key length has to grow logarithmically with the table size in order to keep one distinct key per record. Even Knuth points out that hash tables are really O(n log n), something I'm pretty sure my CS professors left out. In multiple classes. Man, did I get tired of hash tables, but I see now why they harped on them. Case in point: this article.
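The back-of-envelope version (my numbers, not the comment's): n distinct records need keys of at least ceil(log2 n) bits, so every hash or comparison touches O(log n) bits and n insertions cost O(n log n) bit operations, not O(n).

```python
import math

def min_key_bits(n):
    """Minimum key length, in bits, to give n records distinct keys."""
    return math.ceil(math.log2(n))

for n in (10**3, 10**6, 10**9):
    print(f"{n:>10} records -> {min_key_bits(n)} bits per key")
# 1000 records need 10 bits, a million need 20, a billion need 30
```

The per-key cost only doubles as the table grows a thousandfold, which is exactly why the log factor is so easy to wave away in a lecture.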