I think this is a great entry point into the static/dynamic argument and I'd love to argue with some people about it. I believe dynamic linking used to make sense but no longer does in the vast majority of cases. Static binaries have their costs, but they are so much easier to reason about.
This is a really silly entry point into the static/dynamic argument. Static linking does not protect anything here; it only makes things harder for developers.
They each have tradeoffs, even when considering security alone.
Consider a situation in which there is a new vulnerability in OpenSSL. You can treat this as a hypothetical question, or just recall any of your past experiences with the many OpenSSL vulnerabilities.
How many binaries on your server use the vulnerable version? If all binaries are dynamically linked, you can answer this fairly trivially with a shell script: enumerate the binaries, pass them to ldd, and do a little grepping.
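A minimal sketch of that approach, assuming a Linux box with ldd available; the directories scanned are only an example:

```sh
#!/bin/sh
# Sketch: list dynamically linked executables that pull in libssl.
# Assumes a Linux system with ldd on the PATH; the directories scanned
# here are only an example.
for dir in /usr/bin /usr/sbin /usr/local/bin; do
  for bin in "$dir"/*; do
    [ -f "$bin" ] && [ -x "$bin" ] || continue
    # ldd prints "not a dynamic executable" for static binaries,
    # so those simply won't match the grep.
    if ldd "$bin" 2>/dev/null | grep -q 'libssl'; then
      echo "$bin"
    fi
  done
done
```

From there you can compare the resolved libssl path against the version you know to be vulnerable.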
If all of your binaries are statically linked, what do you do? Ideally you pull the build info from your build server that shows every version of everything that went into each binary... which is data that just doesn't exist for most people.
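For what it's worth, some toolchains do embed this data: Go binaries, for instance, carry their module dependency versions, readable with `go version -m`; most C/C++ static binaries give you nothing comparable unless you produce an SBOM at build time. A sketch, where the binary path and module name are placeholders:

```sh
# Sketch: Go embeds module dependency versions in its binaries at build time;
# `go version -m` reads them back. The path below is a placeholder.
go version -m /usr/local/bin/myservice

# The same idea in bulk: flag binaries that embed a known-vulnerable module
# (the module name is a placeholder).
for bin in /usr/local/bin/*; do
  go version -m "$bin" 2>/dev/null | grep -q 'example.com/vulnerable/module' \
    && echo "$bin"
done
```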
Maybe you scan the binaries with some kind of signature analysis... but I would not be confident that the results are free of false positives and false negatives.
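One such heuristic, with exactly those caveats: a statically linked OpenSSL usually embeds its version text as a literal string, so `strings` plus grep can flag candidates, but stripped or unusual builds will slip through.

```sh
# Heuristic sketch: statically linked OpenSSL normally carries a version
# string such as "OpenSSL 1.1.1k  25 Mar 2021". Expect false positives
# (the string can appear for other reasons) and false negatives (stripped
# or custom builds). The directory is only an example.
for bin in /usr/local/bin/*; do
  [ -f "$bin" ] || continue
  ver=$(strings "$bin" | grep -E -m1 '^OpenSSL [0-9]+\.[0-9]+\.[0-9]+')
  [ -n "$ver" ] && printf '%s: %s\n' "$bin" "$ver"
done
```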
Now let's patch it. How quickly can you recompile every static binary on your server? Can you even easily cut new builds of the existing versions with just a small patch increment, or will your dev teams rush out a new release that includes whatever changes they're working on?
Or, with dynamic libraries, you update the library on your server and are done with it.
... or so you thought. You didn't check which processes were still running with the old library open in memory and restart them, so you're still vulnerable :)
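That check is easy to forget but straightforward on Linux: after the upgrade the old library file is unlinked but stays mapped, so it shows up as "(deleted)" in /proc/*/maps. A sketch (run as root so every process is visible):

```sh
# Sketch: find processes still mapping a replaced (now deleted) libssl,
# i.e. the ones that need a restart after the upgrade. Run as root so
# every process's maps file is readable.
grep -l 'libssl.*(deleted)' /proc/[0-9]*/maps 2>/dev/null |
while read -r maps; do
  pid=${maps#/proc/}
  pid=${pid%/maps}
  ps -o pid=,comm= -p "$pid"
done
```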
> I think this is a great entrypoint into the static/dynamic argument and I'd love to argue with some people about it.
I don't think it is. You start from an irrational and unsubstantiated belief that ignores the basic use cases of shared libraries.
> I believe dynamic used to make sense but no longer does in the vast majority of cases.
It's your personal belief, and one that's unsubstantiated and based on ignorance.
> Static binaries have their costs, but are so much easier to reason about.
That assertion is completely irrelevant, as it fails to address any of the use cases for shared libraries. Being able to run code, and other dubious claims of simplicity, don't even qualify as a challenge to the purpose of shared libraries.
There are also other security considerations. As an operator or builder, do you want to patch one library (say, OpenSSL) to keep your system up to date, or patch every binary? If changing a dependency requires rebuilding all of its consumers recursively, then there's not a huge benefit.
I think in the specific case of security issues, more bugs have been fixed by upgrading dynamic dependencies than introduced. That's just my gut feeling though, and I'd like to see data.
> I think in the specific case of security issues, more bugs have been fixed by upgrading dynamic dependencies than introduced.
That's just your personal assertion, which is entirely baseless and unsubstantiated. It's ok to have beliefs, but instead of pushing them as truths you should at least start by doing some cursory research to see if they are even plausible. And yours isn't.
It's not entirely unsubstantiated, as my experience is that the former is very common. The latter is much harder to observe though, so it's just an impression.
I'm very interested in your assertion that my impression is implausible though. What evidence do you have?
It seems to me that deploying a static binary is for situations where one doesn't have control over the underlying system, or where shipping dependencies hasn't been solved, i.e., you just want to ship one binary.
It's only cheap if you're running on huge servers. End-user machines and edge compute are more constrained, so one needs to be more polite with resource use there.