
I would have thought that the number of blogs/total amount of content any site can serve is theoretically infinite.

It’s the rate of requests it can handle that matters, which may explain why I’m getting an “Application error” page at the moment



There are, I imagine, some tricks for lots of virtual hosts. Like not running out of file descriptors if they each have their own log. Or watching for bloat in config files. Apache, at least, used to bog down on startup with thousands of VirtualHost directives. Or perhaps an ACME cert-renewal script that runs serially and takes a looong time to finish.
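To make the file-descriptor point concrete, here's a rough back-of-the-envelope sketch (the vhost count and per-vhost log count are illustrative assumptions, not from any particular setup):

```shell
#!/bin/sh
# Rough estimate: each vhost with its own access + error log
# holds two file descriptors open for the life of the server.
vhosts=5000
fds_per_vhost=2                    # access log + error log (assumed)
needed=$((vhosts * fds_per_vhost))
limit=$(ulimit -n)                 # current soft limit for this shell
echo "need roughly $needed descriptors; soft limit is $limit"
```

On many distros the default soft limit is 1024, so a few thousand per-vhost logs blows past it; the usual fixes are a shared log with the hostname as a field, or piped logging to a single process.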


mod_vhost_alias has existed for something like 20 years: https://httpd.apache.org/docs/2.4/mod/mod_vhost_alias.html. Back in the day, ISPs like Demon were hosting thousands of customer websites off single large machines (on the hardware of 20 years ago!) using this technique.
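For anyone who hasn't used it: the idea is one VirtualHost block whose document root is interpolated from the requested hostname, so adding a site is just adding a directory. A minimal sketch along the lines of the linked docs (paths and log location are illustrative):

```apache
# Load the module (module path varies by distro)
LoadModule vhost_alias_module modules/mod_vhost_alias.so

UseCanonicalName Off

# %0 expands to the full requested hostname, so a request for
# www.example.com is served from /var/www/hosts/www.example.com/docs
VirtualDocumentRoot "/var/www/hosts/%0/docs"

# One shared log with the hostname (%V) as the first field,
# instead of one open log file per site
LogFormat "%V %h %l %u %t \"%r\" %s %b" vcommon
CustomLog "logs/access_log" vcommon
```

The shared-log trick is also what sidesteps the file-descriptor problem mentioned upthread.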


Sure, though that page you linked contains some tips on how to avoid performance issues with the directory substitution. My point was that some thought is required to do thousands of vhosts.


I would pick nginx in the first place.


Nginx may need tuning for high numbers of server blocks, so it's not completely immune: http://nginx.org/en/docs/http/server_names.html#optimization
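The linked page boils down to enlarging the hash tables nginx builds for server_name lookup; a sketch of the relevant directives (the values are illustrative and should be tuned to your actual name set):

```nginx
# In the http block: with thousands of server_name entries, nginx
# may refuse to start or warn until these hash tables are enlarged.
http {
    server_names_hash_max_size    4096;  # total hash size (default 512)
    server_names_hash_bucket_size 128;   # raise if you have long hostnames
}
```

The docs suggest increasing max_size first and only bumping bucket_size when long names demand it, since larger buckets cost more per lookup.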



