The limits on memory access are physical, as illustrated by Grace Hopper's famous video about nanoseconds and the speed of light.
Should algorithm analysis always assume it needs to model caches, since they are never going away? When determining computational complexity, every memory access is treated as taking uniform time, but real memory doesn't behave that way and never will.
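A minimal sketch of that gap, not from the original comment: two loops that perform the same number of loads and additions, so the uniform-cost RAM model counts them as equal work, yet the cache-hostile one is typically several times slower on real hardware. The array size and stride are arbitrary illustrative choices.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 26)        /* 64M ints, well beyond typical cache sizes */
#define STRIDE 4096        /* stride chosen to defeat spatial locality  */

/* Sum every element exactly once, visiting them with the given stride. */
static double time_sum(const int *a, size_t step, long long *sink) {
    clock_t start = clock();
    long long sum = 0;
    for (size_t offset = 0; offset < step; offset++)
        for (size_t i = offset; i < N; i += step)
            sum += a[i];
    *sink = sum;                       /* keep the work observable */
    return (double)(clock() - start) / CLOCKS_PER_SEC;
}

int main(void) {
    int *a = malloc((size_t)N * sizeof *a);
    if (!a) return 1;
    for (size_t i = 0; i < N; i++) a[i] = (int)i;

    long long s1, s2;
    double sequential = time_sum(a, 1, &s1);       /* cache-friendly */
    double strided    = time_sum(a, STRIDE, &s2);  /* cache-hostile  */

    /* Same N loads and N additions in both cases; only the access
     * pattern, and therefore the cache behavior, differs. */
    printf("sequential: %.3fs  strided: %.3fs  (sums %lld %lld)\n",
           sequential, strided, s1, s2);
    free(a);
    return 0;
}
```

The exact ratio depends on the machine, but the point stands regardless: the cost model that treats all memory accesses alike hides exactly the effect the question is asking about.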