> After getting burned by the performance limitations of Python applications at scale, I switched to Go and never looked back
Same here. Not just performance, but also tooling.
We used to use pipenv for reproducible dependency management, but literally any change to the lockfile would take 30+ minutes to rebuild--and that's not counting the time to pull the dependencies in the first place, which was only a few minutes anyway. This meant that our CI jobs would take ~45 minutes, and we would spend a ton of time trying different caching tricks and adding complexity to our CI process to bring those times down, but they were still painful. With Go, we don't even need caching, and builds plus the full test suite finish in just a few minutes (end to end).
Not only that, but Go's static typing was a boon. Years ago we tried keeping our types up to date with Sphinx (and code review checks), but inevitably they would fall out of date, and the only thing worse than no type documentation is incorrect type documentation. Further, this didn't stop people from writing stupid code nearly as well as a type checker does (think "code whose return type varies based on the value of some input parameter" and other such things). We eventually tried mypy, but it was immature (couldn't express recursive types), cumbersome (e.g., expressing a callback that takes kwargs was tedious), confusing (I don't think I ever figured out how to publish a PyPI package with type annotations that would be picked up automatically by mypy), slow, etc. I'm sure these things will improve with time, and there are various other Python type checkers (although, as with everything in Python, each tool seems to have its own hidden pitfalls). On the plus side, Python's Unions are better than Go's various patterns for emulating algebraic data types (aka enums, aka sum types).
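For anyone who hasn't run into it, here's a minimal sketch of the kind of pattern I mean by "emulating sum types" in Go: a sealed interface (unexported marker method) plus a type switch. The Shape/Circle/Rect names are purely illustrative, not from any real codebase.

    package shapes

    import "math"

    // Shape is the "sum" type; the unexported method seals it so only
    // types in this package can implement it.
    type Shape interface {
        isShape()
    }

    type Circle struct{ Radius float64 }
    type Rect struct{ W, H float64 }

    func (Circle) isShape() {}
    func (Rect) isShape()   {}

    // Area "pattern matches" with a type switch. Note the compiler does
    // not check exhaustiveness: add a new variant and forget a case here
    // and you silently fall through to the default branch.
    func Area(s Shape) float64 {
        switch v := s.(type) {
        case Circle:
            return math.Pi * v.Radius * v.Radius
        case Rect:
            return v.W * v.H
        default:
            return 0
        }
    }

That missing exhaustiveness check is the main place where Python's Unions (with a type checker that understands them) come out ahead.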
Similarly, with Python we would try to deploy AWS Lambdas, but it didn't take long before these busted the (at the time) 250MB limit on the unpacked deployment package. I'm guessing Lambda has since raised this limit, but even still: Go binaries with just as many (direct and transitive) dependencies would yield an artifact two orders of magnitude smaller, and that includes the standard library and runtime. This also meant our Docker images could be a lot smaller, which has a whole bunch of other benefits as discussed here: https://news.ycombinator.com/item?id=30209023.
With respect to performance, beyond the obvious problems, it also made tests far slower, and slow tests (and slow test suites) get run less frequently and later in the software development lifecycle, which has a lot of knock-on effects (especially in a language that relies so heavily on tests rather than static analysis to catch bugs). We would also have to go through and purge tests or move them out of the CI critical path because we wanted to be able to do Continuous Deployment (and even if you don't want to continuously deploy to production, being able to rapidly iterate in lower/ephemeral environments is really nice--getting high-fidelity local dev environments is either futile or very close to it).
It's really surprising how many advantages Go confers (at least relative to Python) beyond those which are obvious from the marketing.
Your experience with Python perfectly reflects mine when I used it to build large-scale applications.
Except for a few niche areas such as ML, DS, algorithm prototyping, or archaic fintech systems that expect Python code as input, I think Go is a perfect replacement for Python.
Whenever this topic comes up, people bring up Python's beginner-friendly web frameworks, which I admit has some merit, but Go as a language is beginner friendly 'enough', and not having to rely on large monolithic third-party frameworks is one of its key advantages.