
This is the bit I found curious: "adding a small amount of random latency to requests to storage servers counter-intuitively reduced the average latency and the outliers due to the smoothing effect it has on the network".

Can anyone explain why?



Synchronized network traffic can cause incast (many servers replying at the same instant and overflowing the switch buffer) or other buffer overflows.


Yeah, jitter is generally used to mitigate "thundering herd"-type problems: spreading the requests out over time reduces the peak load.
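
To make that concrete, here's a minimal client-side sketch of the idea (the names `fetch_with_jitter`, `fetch_block`, and the 5 ms jitter window are illustrative assumptions, not from the article): each request sleeps a small random amount before being sent, so a burst of simultaneous requests arrives at the server spread over a short window instead of all at once.

    import asyncio
    import random

    async def fetch_block(block_id: int) -> bytes:
        # Stand-in for the real network call to a storage server.
        await asyncio.sleep(0.001)
        return f"data-{block_id}".encode()

    async def fetch_with_jitter(block_id: int, max_jitter_s: float = 0.005) -> bytes:
        # Delay each request by a small random amount so simultaneous
        # requests don't hit the server (and switch buffers) in the
        # same instant.
        await asyncio.sleep(random.uniform(0.0, max_jitter_s))
        return await fetch_block(block_id)

    async def main() -> None:
        # Without the jitter, all 100 requests would be issued at once;
        # with it they arrive over a ~5 ms window.
        results = await asyncio.gather(*(fetch_with_jitter(i) for i in range(100)))
        print(len(results), "blocks fetched")

    if __name__ == "__main__":
        asyncio.run(main())

The per-request cost is a few milliseconds of added delay, but the peak arrival rate drops, which is why the tail (and even the average) latency can improve.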


Thanks to both of you - makes sense



