Bad analogies are bad analogies. ollama is a server system; it should expect connections from more than one client, and by now they know very well that this also means networked clients. If you create a client-server protocol, implementing security is your job.
Any decent router is going to block connections from the internet to your local network by default. For ollama to be accessible from the outside, the user had to allow it explicitly. There's no way to blame ollama for this.
I cannot express how deeply wrong you are about this; a "server system" is not some mandate that it should be production ready for a ton of people on the internet.
This is a program that many different kinds of people want or need to try out, and it just so happens to involve a client-server architecture.
As cynical as I am, I honestly don't think there is much to wonder about here. The product's initial adoption relied on low friction and minimal setup. That they wanted to keep it that way for as long as possible is just an extension of this.
Ollama doesn't run a web server that is "broadcasting across the internet". It runs a server that, by default, is only accessible locally. You have to deliberately deploy it onto a public server, or change its bind address, for it to be accessible from the internet.
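To make that concrete, here is a rough Python sketch (stdlib only, my own illustration, not anything Ollama ships) that probes a default install: the API answers on 127.0.0.1:11434 but not on a LAN address unless someone explicitly set OLLAMA_HOST to a non-loopback bind. The LAN IP below is just a placeholder.

    import urllib.error
    import urllib.request

    def reachable(host: str, port: int = 11434, timeout: float = 2.0) -> bool:
        """True if an Ollama-style API answers at http://host:port/api/tags."""
        try:
            with urllib.request.urlopen(f"http://{host}:{port}/api/tags", timeout=timeout):
                return True
        except (urllib.error.URLError, OSError):
            return False

    if __name__ == "__main__":
        # Default install: the server listens on loopback only, port 11434.
        print("loopback:", reachable("127.0.0.1"))
        # 192.168.1.50 is a placeholder LAN address; this only succeeds if someone
        # deliberately set OLLAMA_HOST to a non-loopback bind (e.g. 0.0.0.0).
        print("LAN addr:", reachable("192.168.1.50"))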
In all cases, having zero auth at all [0], even when people want to use it as a service exposed across the internet, is ridiculous. That leads to problems like this [1]: instances sitting exposed without any protection.
Even allowing others to change the $OLLAMA_HOST env var is a security footgun.
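For what it's worth, even a trivial auth layer in front would help. Here's a hedged sketch (plain Python stdlib, not anything Ollama provides) of a token-checking reverse proxy that remote clients hit instead of a loopback-only Ollama; PROXY_TOKEN and port 8443 are made-up names for the example.

    import os
    import urllib.error
    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer

    OLLAMA_UPSTREAM = "http://127.0.0.1:11434"       # Ollama's default local address
    PROXY_TOKEN = os.environ.get("PROXY_TOKEN", "")  # shared secret clients must send

    class AuthProxy(BaseHTTPRequestHandler):
        def _forward(self) -> None:
            # Reject any request that lacks the expected bearer token.
            if self.headers.get("Authorization") != f"Bearer {PROXY_TOKEN}":
                self.send_response(401)
                self.end_headers()
                return
            # Forward the path and body to the loopback-only Ollama server.
            length = int(self.headers.get("Content-Length", 0))
            body = self.rfile.read(length) if length else None
            req = urllib.request.Request(
                OLLAMA_UPSTREAM + self.path, data=body, method=self.command
            )
            try:
                with urllib.request.urlopen(req) as upstream:
                    self.send_response(upstream.status)
                    self.end_headers()
                    self.wfile.write(upstream.read())
            except urllib.error.HTTPError as err:
                self.send_response(err.code)
                self.end_headers()
            except urllib.error.URLError:
                # Upstream Ollama unreachable.
                self.send_response(502)
                self.end_headers()

        do_GET = do_POST = _forward

    if __name__ == "__main__":
        # Remote clients reach the proxy on all interfaces; Ollama itself stays on loopback.
        HTTPServer(("0.0.0.0", 8443), AuthProxy).serve_forever()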