At my work we use Docker for everything now, and it makes no sense. We literally have a dedicated server for each application. Instead of copying Python files to the server in 5 seconds, we take 30 minutes to build a Docker image, push it to the registry, scan it, and deploy.
But you’ve gained the ability to do static analysis and quality control, to enforce authorisation over who can deploy to the prod server, accountability and a history of changes by virtue of having it in the repo, and the ability for everyone to run the image locally regardless of their local environment…
To be fair, this can all be achieved with other tools, while still rsyncing the Python to prod.
One advantage is being able to run the same build anywhere. Two runs of the same build process won't necessarily produce identical results, but a given Docker image is the same bytes everywhere you run it.
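For example (the image name and digest below are placeholders, not anything real), pulling by digest instead of by tag pins the exact bytes, so every host runs the identical artifact:

    # Pull a content-addressed build; the digest is a hash of the image
    # contents, so every host gets byte-for-byte the same thing.
    docker pull registry.example.com/myapp@sha256:<digest>
    docker run registry.example.com/myapp@sha256:<digest>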
Not saying you're wrong here, but if that 30 minutes doesn't include running tests, it sounds like your Dockerfile is set up poorly. For a CI/CD setup it should take like... a minute at most, if you don't have heavy ML dependencies, etc. Maybe less, depending on how well you cache things.
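For what it's worth, the usual layer-caching setup for a Python image looks roughly like this (file names here are just common conventions, not anything from your repo): dependencies get their own layer, so a source-only change skips the slow pip install entirely.

    # Sketch of a cache-friendly Dockerfile for a Python app.
    FROM python:3.12-slim
    WORKDIR /app

    # Copy only the dependency manifest first. This layer, and the
    # install below it, stay cached until requirements.txt changes.
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt

    # Source changes only invalidate layers from here down, so a
    # typical code-only rebuild takes seconds, not minutes.
    COPY . .
    CMD ["python", "main.py"]

With a warm build cache in CI on top of that, most rebuilds are just the final COPY and nothing else.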
Thanks, yes, probably. Our company is great at adding new functionality but has a huge amount of tech debt, including infrastructure and SDLC processes that are really slow.