
> That humans have invented timezones and DST won't change the physics of a CPU's internal clock ticking x billion times per second.

Increasingly we are programming distributed systems. One millisecond or nanosecond on one node is not the same millisecond or nanosecond on another node, and that is physics that is even more inviolable.
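To make that concrete, here's a minimal sketch of how two ordinary quartz clocks diverge (the drift rates are assumed example values, in parts per million):

    TRUE_SECONDS = 3600  # one hour of real time

    # Typical crystal oscillators drift on the order of tens of ppm.
    drift_ppm = {"node_a": +20, "node_b": -35}  # assumed example values

    for node, ppm in drift_ppm.items():
        elapsed = TRUE_SECONDS * (1 + ppm / 1_000_000)
        skew_ms = (elapsed - TRUE_SECONDS) * 1000
        print(f"{node}: reports {elapsed:.3f}s after 1h (skew {skew_ms:+.1f} ms)")

    # After one hour the two nodes disagree by ~200 ms, even though
    # neither clock has failed -- that's ordinary oscillator drift.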



In which case, does being off by a few milliseconds actually matter in any significant number of those distributed systems? No measurement is perfectly precise, so near enough should generally be near enough for most things.

It may matter in some cases, but as soon as you add network latency there will be variance, regardless of the tool you use to correct for it.
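Roughly, the best a one-round-trip correction can do looks like this (a minimal NTP-style sketch; request_server_time is a hypothetical transport call, not any real library's API):

    import time

    def estimate_offset(request_server_time):
        t0 = time.time()                     # client send time
        server_time = request_server_time()  # remote clock reading
        t1 = time.time()                     # client receive time
        rtt = t1 - t0
        # Assume symmetric delay: the server's reading is taken roughly
        # halfway through the round trip.
        offset = server_time - (t0 + rtt / 2)
        # Real networks are asymmetric, so the estimate carries an
        # irreducible error bound of +/- rtt/2 -- and rtt itself varies.
        return offset, rtt / 2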


Important for some consistency algorithms, for example Google Spanner. (Not necessarily advocating for those algorithms.)
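The trick Spanner's TrueTime plays is exposing clock uncertainty as an interval and waiting it out before commit. A minimal sketch of the idea (not Google's actual API; EPSILON is an assumed bound for illustration):

    import time

    EPSILON = 0.007  # assumed uncertainty bound (~7 ms) for illustration

    def tt_now():
        """Return an interval [earliest, latest] containing true time."""
        t = time.time()
        return (t - EPSILON, t + EPSILON)

    def commit_wait(commit_timestamp):
        """Block until commit_timestamp is definitely in the past."""
        while tt_now()[0] <= commit_timestamp:
            time.sleep(0.001)

    ts = tt_now()[1]  # latest possible "now" becomes the commit timestamp
    commit_wait(ts)   # afterwards, no node's clock can read earlier than ts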



