One gotcha I caught myself in with this technique was using it in a script meant to remediate a situation where my home had lost internet and the router needed to be power cycled. When the internet is out, `uv` cannot download the dependencies declared in the script, so the script would fail exactly when it was needed. Thankfully I noticed the problem after writing it but before needing it to actually work, and refactored my setup to pre-install the needed dependencies. But don't make the same mistake I almost made: don't use this for code that may need to run airgapped! Even with uv caching you may still get a cache miss.
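For context, here's a minimal sketch of what such a script looks like with inline (PEP 723) metadata. The router URL, reboot endpoint, and the `requests` dependency are illustrative assumptions, not my actual script:

```python
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = ["requests"]
# ///
# The catch: before any of this runs, uv must resolve and install
# "requests". If it isn't already in uv's cache, that means a network
# fetch -- which fails precisely when this script needs to run
# (i.e., when the internet is down).
import requests

# Hypothetical router admin endpoint; substitute your own.
ROUTER = "http://192.168.1.1"

def power_cycle() -> None:
    requests.post(f"{ROUTER}/api/reboot", timeout=5)

if __name__ == "__main__":
    power_cycle()
```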
If you indefinitely grow the collection of packages you have in use in permanent virtual environments, then yes, your disk usage will (and should) grow indefinitely anyway.
There's not much point reclaiming space from a package cache unless you currently don't have a venv using a given package and also don't reasonably expect to have one in the near future.
And I'm saying this as the guy who's always complaining about things like the size of NumPy wheels.
Here's a fun one (says he, in a panic): will we get to a point where (through fault or, gasp, design) a given AI will generatively code a missing dependency on the fly, perhaps as a "last ditch" effort?
(I can imagine languages having official LLMs which would more or less "compress/know" enough of the language to be an import of last resort, of sorts, by virtue of which an approximation to the missing code would be provided.)
I'm pretty sure we've already reached the point where people generatively code possibly-malicious packages and publish them under the names of non-existent packages commonly hallucinated by LLMs when prompted by other devs.