This post goes into a lot of detail about their point of view - that the programmer developing software has to run it many times, so if it's inefficient (in terms of CPU cycles) it slows them down. I was hoping they'd quantify that, but they don't.
Overall their argument makes sense, but if the slowdown from "too many CPU cycles" only amounts to a few extra milliseconds overall, then IMO the programmer isn't slowed down in any practical sense.