
I used to work for a data-scraping firm, and we would often accidentally knock websites offline when we pointed our crawlers at them.

I'd love to agree with you, but the crawler problem is 100x worse today than it was a decade ago.



This would be much better solved with IP-based rate limits. And if IP-based limiting doesn't work, then you're dealing with a DDoS, and it doesn't sound like this case was DDoS protection.
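
(For anyone unfamiliar: per-IP rate limiting usually means something like the token-bucket sketch below. This is a minimal illustration, not anyone's production setup; RATE and BURST are made-up values.)

    import time
    from collections import defaultdict

    RATE = 5.0    # tokens refilled per second (illustrative)
    BURST = 20.0  # maximum bucket size (illustrative)

    # One bucket per client IP.
    buckets = defaultdict(lambda: {"tokens": BURST, "last": time.monotonic()})

    def allow_request(ip: str) -> bool:
        """Refill the bucket by elapsed time, then spend one token per request."""
        b = buckets[ip]
        now = time.monotonic()
        b["tokens"] = min(BURST, b["tokens"] + (now - b["last"]) * RATE)
        b["last"] = now
        if b["tokens"] >= 1.0:
            b["tokens"] -= 1.0
            return True
        return False

Requests from a well-behaved client pass; a single IP hammering the server drains its bucket and gets refused until it slows down.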


IP-based rate limiting is easily foiled via proxies, VPN services, Tor, or botnets.
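
To illustrate: a crawler with access to a proxy pool can exit from a different address on every request, so no per-IP bucket ever fills. A sketch, assuming a hypothetical pool (the 203.0.113.x addresses are documentation placeholders):

    import itertools
    import requests

    # Hypothetical proxy pool; commercial services rent out thousands of these.
    PROXIES = itertools.cycle([
        "http://203.0.113.10:8080",
        "http://203.0.113.11:8080",
        "http://203.0.113.12:8080",
    ])

    def fetch(url: str) -> requests.Response:
        """Each request goes out through the next proxy in the rotation."""
        proxy = next(PROXIES)
        return requests.get(url, proxies={"http": proxy, "https": proxy}, timeout=10)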


And user-agent-string-based protections are even more easily foiled, which is why I don't believe this can plausibly be counted as DDoS protection.
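
The server only ever sees whatever string the client chooses to send; spoofing it is a single header (the UA value here is just a copied browser string):

    import requests

    headers = {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                      "AppleWebKit/537.36 (KHTML, like Gecko) "
                      "Chrome/124.0.0.0 Safari/537.36"
    }
    resp = requests.get("https://example.com", headers=headers, timeout=10)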





