
Once again, I have to warn users who think they have found the silver bullet: pyenv comes with a big caveat. It compiles Python on your machine.

The number of possible modes of failure in this situation is huge.

See also: "Why not tell people to "simply" use pyenv, poetry or anaconda"

https://www.bitecode.dev/p/why-not-tell-people-to-simply-use

I'm not saying pyenv is not a useful tool, but it is not a tool for beginners fighting with python packaging problems. It's a specialist tool to normalize your setup.

Very often, people tell me they don't have problems with pyenv, but later on they have other, seemingly unrelated problems with their dependencies. Analysis then proves it was because of pyenv; they just didn't know it. The cost is not obvious.



Actually, it only builds it locally if it can't find a pre-packaged version for your system/arch. Admittedly that's most of the recent ones on a Mac, but there is a difference (I've been using pyenv for nearly ten years[1] now).

The big advantage for me is that I can match whatever runtime and standard library a target has (and yes, that's needed more times than not, even in this new age of Docker).

Additionally, you can build an _optimized_ Python. I have this set for my builds:

    env PYTHON_CONFIGURE_OPTS='--enable-optimizations --with-lto' PYTHON_CFLAGS='-march=native -mtune=native' pyenv install 3.12.2
[1]: https://taoofmac.com/space/blog/2015/10/03/1245


That happens way more often than people want to admit, and I think it's disingenuous to present the tool that way.

E.g., I'm on Ubuntu 20.04 on a Dell XPS, a fairly standard machine, and I'll get:

    pyenv install 3.9 -v
    /tmp/python-build.20240325124651.73089 ~
    Downloading Python-3.9.19.tar.xz...
    -> https://www.python.org/ftp/python/3.9.19/Python-3.9.19.tar.xz
    ...
    LD_LIBRARY_PATH=/tmp/python-build.20240325124651.73089/Python-3.9.19 CC='gcc -pthread' LDSHARED='gcc -pthread -shared -L/home/user/.pyenv/versions/3.9.19/lib -Wl,-rpath,/home/user/.pyenv/versions/3.9.19/lib -L/home/user/.pyenv/versions/3.9.19/lib -Wl,-rpath,/home/user/.pyenv/versions/3.9.19/lib  ' OPT='-DNDEBUG -g -fwrapv -O3 -Wall'  _TCLTK_INCLUDES='' _TCLTK_LIBS=''  ./python -E ./setup.py  build
    running build
    running build_ext
    ... etc
There are obviously binaries for this Python, since I can apt install it and I didn't specify the minor version.

But by default it downloads the source, and compiles it.

On top of that it will use a shim which comes with its own world of possible pain.
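
To see the shim indirection for yourself, something like this (paths are illustrative, not from a real box):

    $ which python
    /home/user/.pyenv/shims/python          # a shim script, not an interpreter
    $ pyenv which python                    # resolves what the shim would actually run
    /home/user/.pyenv/versions/3.9.19/bin/python
    $ pyenv version
    3.9.19 (set by /home/user/project/.python-version)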

Again, I don't want to bash on pyenv.

But I do want to lower people's expectations.

It's a tool for experts, not beginners.


I’ve quite literally never seen it download a pre-compiled Python - I didn’t actually know it had that functionality, and have been using it for probably a decade.


Ditto. I had no idea that it ever looked for a pre-compiled version.


Actually, I'm pretty sure that binaries for Python 3.9 aren't available on Ubuntu 20.04 without adding an apt repository to get them from, so your statement that you "can apt install it" isn't entirely correct in this case - by default, binaries aren't available, and therefore pyenv has to compile Python 3.9.


Just tried, I get the same result with 3.7, 3.8, 3.10 and 3.11.


Currently going through this myself on a new mac I bought lol. Still haven't gotten anywhere. Oh well

    $ PYTHON_CONFIGURE_OPTS='--disable-ipv6 --enable-optimizations --with-lto' PYTHON_CFLAGS='-march=native -mtune=native' pyenv install 3.12.2
    python-build: use openssl@3 from homebrew
    python-build: use readline from homebrew
    Downloading Python-3.12.2.tar.xz...
    -> https://www.python.org/ftp/python/3.12.2/Python-3.12.2.tar.x...
    Installing Python-3.12.2...
    python-build: use readline from homebrew
    python-build: use ncurses from homebrew
    python-build: use zlib from xcode sdk

    BUILD FAILED (OS X 14.4 using python-build 20180424)

    Inspect or clean up the working tree at /var/folders/w9/xvxzj68j6kx7m480rnwq6hvh0000gn/T/python-build.20240325114837.43577
    Results logged to /var/folders/w9/xvxzj68j6kx7m480rnwq6hvh0000gn/T/python-build.20240325114837.43577.log

    Last 10 log lines:
    ./Include/internal/pycore_interp.h:193:24: error: field has incomplete type 'struct _dtoa_state'
        struct _dtoa_state dtoa;
                           ^
    ./Include/internal/pycore_interp.h:193:12: note: forward declaration of 'struct _dtoa_state'
        struct _dtoa_state dtoa;
               ^
    1 error generated.
    make[2]: *** [Objects/boolobject.o] Error 1
    make[1]: *** [profile-gen-stamp] Error 2
    make: *** [profile-run-stamp] Error 2


Actually, I installed rye and got everything working. There goes any usage of pyenv going forward lol.


  PYTHON_CONFIGURE_OPTS='--disable-ipv6
LOL, the strategy of disabling ipv6 even finds its way into Python builds


> Actually, it only builds it locally if it can't find a pre-packaged version for your system/arch. Admittedly that's most of the recent ones on a Mac, but there is a difference (I've been using pyenv for nearly ten years[1] now).

Really? I regularly use this on Linux precisely because it compiles from source rather than using the system-provided Python, which often has patches in it that break pip's test suite.

I've never encountered it trying to use the system provided Python, but maybe there's a weird quirk in the way I am using it.


If you don’t care which specific version of python you are using, do not use pyenv.

You will know when you care. These days, doing common tasks, the constraint solver in poetry will often tell you:

“Hey! I can’t find a version of sentencepiece with metadata that lets me combine this with python 3.12.2. Sorry! I give up!”

Now, if you aren't concerned with using your bare metal for CUDA or fancy new MPS or AMD stuff, just ignore this and use containers. I'd use podman compose.

However, I use pyenv on every machine. Because it compiles specific versions I actually need, even to create various virtual environments. If compiling python automatically sounds tough, you probably don’t need to anyway.

To describe the problem you'd see: I try to use poetry by default, though I think it became popular before it was PEP-compliant or useful in devops. It is impossible to control the behavior of other package managers, and poetry is/was strict about that, which means you can't force deploy in many cases. (Better lately.)

For the problem pyenv helps to solve, I back up my pyproject.toml setuptools backend with pip and requirements.txt. These days, requirements-cuda.txt and requirements-mps.txt.
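
A rough sketch of that fallback, assuming poetry's export command/plugin is available (the file names are just my convention):

    # resolve with poetry, then export a lockfile that plain pip can consume
    poetry export -f requirements.txt -o requirements.txt
    # on machines where poetry can't be forced, fall back to pip
    pip install -r requirements.txt
    pip install -r requirements-cuda.txt   # CUDA boxes
    pip install -r requirements-mps.txt    # Apple Silicon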

The landscape is still a disaster for binary compatibility, but it can be done lol. (I’ve been doing python packaging professionally since prom, which was python 2.6 give or take.)


This kind of error surprises me:

> I can’t find a version of sentencepiece with metadata that lets me combine this with python 3.12.2

I'm a reasonably advanced python user. I've shipped web apps, Desktop GUIs, cli-tools, and even written cpython extension modules for custom hardware control. I typically target the system `python3` of whatever linux distribution I'm shipping to and I use the system 'python3-virtualenv' for venvs.

But I have never encountered a dependency resolution issue or been forced to use poetry. What am I doing wrong?


Sometimes, an app or library developer wants to target multiple Linux distributions.

For example, I want my own Python apps to work equally well on Debian oldstable (which currently provides Python 3.9) and Arch Linux (currently on 3.11). That means that I’m going to choose the lowest common version (3.9) as the language level for my app or library.

And if I program against a 3.9 language level, I absolutely *refuse* to use any other interpreter than a 3.9 one at development time. (If I used a newer one, then my linter and type checker would helpfully give me false alarms all the time, unaware of the actual language level.)

Hence, I use pyenv and poetry to get exactly the interpreter for the language level I want, and to allow pylint and mypy to be perfectly aware of the language level I’m using at development time, allowing them to give me findings that actually make sense.
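
Concretely, that setup is something like this (version numbers are just examples):

    pyenv install 3.9.18                   # oldest interpreter among my targets
    pyenv local 3.9.18                     # pin it for this project via .python-version
    poetry env use $(pyenv which python)   # point poetry's venv at that interpreter
    poetry install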


> But I have never encountered a dependency resolution issue or been forced to use poetry. What am I doing wrong?

Not using python on a system without the necessary build tools while trying to use dependencies with native code that don't have binaries built for your combination of python version and platform?

This used to be a huge problem with Python on Windows, as things that weren't pure Python would very often not have binary packages or have them for a narrow set of python versions, often not the same set as other dependencies. (Not just a windows problem, but it was definitely big on windows.)

Tooling and practices have advanced so that more packages are automatically built against a wider set of targets, so the problem is a lot smaller than it used to be.


(Edit2 tl;dr: it's that other package maintainers don't always keep up with new semver constraints in their metadata, particularly on anything cutting-edge)

It’s always the inclusion of a specific dependency we added for a feature, and based on that dev’s knowledge and experience. It’s often me, but not always.

This doesn’t happen in ecosystems with a base package versioning. This is arguably why anaconda became popular, and why we target base docker images of ubuntu.

Doesn’t work in complex deployments based on money rather than ideals, every time. At least in my career.

Edit: first time I dealt with this, we ended up forking a dependency chain instead of using pip. I lost that war and they ended up reviving a legacy PHP app instead of funding python dev.


So, what is "the tool for beginners fighting with python packaging problems"?

That is, the pattern seems to be that someone mentions a solution, then a zillion responses as to why it sucks.

Is there any tool or pair that sucks least for most cases and beginners? I get that every case is different, but perhaps there are some useful starting points?


While there is no silver bullet, for beginners I found that "Relieving your Python packaging pain" (https://www.bitecode.dev/p/relieving-your-python-packaging-p...) is the Pareto solution. That is, the solution with the best ratio of effort to reward, and also the lowest risk of failure. It's not risk-free, but I've been helping beginners with Python for 15 years, and have tried everything you can think of.

It does imply, on Linux, limiting yourself to the choices of Python you can install. This constraint is, for most people, preferable to the alternative, even if it gets frustrating to our geeky souls.


There isn't one. Python distribution and packaging is just fundamentally horribly broken.

I think his point was that you shouldn't pretend to users that just switching to pyenv is the solution.


> Python distribution and packaging is just fundamentally horribly broken

It's clearly not, because most people use it successfully.

The problem of distribution and packaging is often a matter of user expectations vs. the actual problem.

The user expectation is that Python is a high-level language and will run the same across different machines regardless of OS and hardware.

The actual problem is that Python is a glue language, often depending on lots of libraries that are sensitive to how they were compiled and what hardware they are targeting. So people can end up moving the glue of the project first and then expect everything else to just work.

Things are getting better (e.g. https://lukeplant.me.uk/blog/posts/python-packaging-must-be-...), a lot of people have put a lot of work into standards and improving the ecosystem, but it is a hard problem that most other popular languages don't interface with nearly as much.


> It's clearly not because most people successfully use it fine.

Well, no, because it's perfectly possible to "successfully" use a horribly broken system. I use Python "successfully", it just means I have spent probably literal weeks of my life fighting pip and virtualenv and relative imports, and finding I need flags like `--config-settings editable_mode=compat`.

> The problem of distribution and packaging is often a matter of user expectations vs. the actual problem.

Ha yes, I expect it to work reliably and simply and it doesn't!

> The actual problem is Python is a glue language often depending on lots of libraries that are sensitive to how they were compiled and what hardware they are targeting.

That's completely irrelevant to the kind of problems I was talking about. I run into issues with Python failing to compile C libraries relatively rarely! Even compiling Python itself seems to work quite well (maybe not surprising since that's one of the only ways to get a new version on Linux).

It's all the packaging infrastructure that's a mess. Pip, virtualenv, setuptools, and also the module import system is a total disaster. You don't see questions like this for Go:

https://stackoverflow.com/questions/14132789/relative-import...


> It's all the packaging infrastructure that's a mess. Pip, virtualenv, setuptools, and also the module import system is a total disaster. You don't see questions like this for Go:

> https://stackoverflow.com/questions/14132789/relative-import...

I think you'll find that the further you delve into it, the less distinct the problems are: a lot of the issues with the module import system, pip, virtualenv, setuptools, etc. exist because they are designed to support a vast range of things, from being depended on to interact with system libraries, to downloading sdists and compiling arbitrary languages.

Though the specific example you linked was largely solved with Python 3, there was a lot of confusion during the 2 to 3 transition because people had to support both behaviors, but most people don't have to think about Python 2 any more.


> Though the specific example you linked was largely solved with Python 3

I can assure you it absolutely was not.

> I think you'll find the further you delve into it the less the problems are distinct, a lot of the issues with the module import system, pip, virtualenv, setuptools, etc. is because they are designed with having to support a vast range of things, from being depended on to interact with system libraries to downloading sdists and compiling arbitrary languages, etc.

Not really. There are plenty of systems that have to "support a vast range of things" that aren't this bad.

In my opinion it's because the core Python devs didn't particularly care about the issue, never really tried to solve it, and as a result we have 10 incompatible half-baked third party solutions.

It's similar to the situation with C/C++ - worse in some ways, better in others (at least there is a de facto package registry in Python). In some ways it's because both languages are very old and predate the idea that packaging should be easy and reliable. That's fine, but please don't pretend that it is easy and reliable now.


> I can assure you it absolutely was not.

The question, as posted, was a confusion about how Python 2 relative importing worked, which was indeed bad. I don't know what you think you are pointing out, you haven't said, and the question *is* about Python 2.

> Not really. There are plenty of systems that have to "support a vast range of things" that aren't this bad.

> In my opinion it's because the core Python devs didn't particularly care about the issue, never really tried to solve it, and as a result we have 10 incompatible half-baked third party solutions.

I agree that a lot of the solutions were created when there was not yet an understanding, or a thought-out design, of what a good packaging solution would be.

But these ill-thought-out solutions exist exactly because of trying to support this wide range of situations, from working on weird OSes to integrating with strange build systems.

However, what you seemed to have missed is there is now a first party standard on:

* How package installers (pip, poetry, PDM, etc.) should interact with package builders (setuptools, hatchling, etc.)

* How and where build configuration and project metadata should be stored (pyproject.toml)

Almost every popular Python package tool now supports these standards, meaning they all interact with each other pretty well.

Dropping all legacy configuration, and updating the standards to support edge cases is still a long road, but it is a road being travelled and things are getting better.
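
For a sense of what that standardization looks like in practice, here's a minimal sketch (name, version and dependency are placeholders):

    cat > pyproject.toml <<'EOF'
    [build-system]
    requires = ["setuptools>=64"]
    build-backend = "setuptools.build_meta"

    [project]
    name = "example"
    version = "0.1.0"
    dependencies = ["requests"]
    EOF
    pip install .   # any standards-compliant installer can now build and install it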


I don't disagree with what you're saying but that article seems odd. It's just a story of someone installing something once that didn't break immediately. Not only is it anecdotal, it doesn't even seem to confirm whether the pip story is getting better or if they just got lucky.


I agree, but there is no large scale study on this.

As someone who has managed Python distributions in a large company and who triages issues on the pip GitHub issue page, my anecdotal experience is that things are getting better.

The only hard statistic I can point to is that the number of top packages on PyPI offering wheels has gone up substantially, and is close to 100% in the top 500.


They use it fine by using Docker. So many major Python repos on GitHub come with a Dockerfile compared to, say, NodeJS. It's unfortunate, but having dealt with Python packages before, I don't blame them.


Lots of analysts, data scientists, traders, engineers, etc., use Python third-party packages successfully, and have never touched, or maybe even heard of, Docker.

And yeah, in general there are significantly fewer NodeJS third-party packages interfacing with packages that directly depend on OSes and hardware. Python has many third-party packages that are older than NodeJS and that depend on foreign function interfaces, win32 COM APIs, directly talking to graphics shaders, etc.


Python packages are painful even if nothing native is involved. There are NodeJS packages that rely on native code too; the difference is that the packaging system is a lot simpler to use, though it's starting to get worse with the `import` vs `require` nonsense and TypeScript.


The silver bullet is Docker. It's not the best solution in every situation, but at least it'll work.

For your own projects built from scratch (rather than big multi-dep projects off GitHub), there's a smaller learning curve if you go to the vanilla https://www.python.org/downloads/ , install the latest, and use the included pip to install packages. That'll probably get you very far. It's not like the older days when you needed both Py3 and Py2.
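
A sketch of that vanilla flow (package and script names are just examples):

    python3 -m venv .venv             # venv and pip ship with the python.org installer
    source .venv/bin/activate         # on Windows: .venv\Scripts\activate
    python -m pip install requests
    python myscript.py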

For experts working frequently in Python, tools like Pyenv can make more sense.


You could just give up and resort to using Docker.


Using docker is not free; you are effectively trading one type of complexity for another.

And of course you'll have to make sure people equally have good practices, since incorrectly using sudo pip install inside docker instead of a venv is very common.

So again, one possible solution for a certain context, but I wouldn't sell that to most people. Certainly not to beginners.

People writing their pythonanywhere website won't spin up a container, will they?


For a beginner trying to run someone else's Python project, it is free. You run one command and it works. There's tons more complexity under the hood, but you don't have to deal with it. This is why so many Python projects do resort to Docker.
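
i.e., the beginner experience is roughly this (repo name hypothetical):

    git clone https://github.com/someone/someproject
    cd someproject
    docker compose up   # or: docker build -t someproject . && docker run someproject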

And btw, JS doesn't have this problem, they use NPM.


> For a beginner trying to run someone else's Python project, it is free.

It's absolutely, by far, the highest-friction alternative on Windows for anyone that doesn't use Docker for other purposes.


- Not if you have to edit the code. Try explaining to your beginner with Notepad++ how to do that.

- Not if you have to deploy the code.

- Not if you have to import the code in your own project.

Basically, not if you want to do anything besides just running the code. And for just running it, I would suggest an installer instead.


I don't see what editing has to do with this. Whether you're running the main.py or the Dockerfile, if you want to edit the code then you gotta use Notepad.


Is it not just install and run? The alternative is installing Python, installing pip, etc.


Installing python with the python.org installer installs the entirety of Python and its standard library and tools, including pip, venv, etc., and the py launcher which supports selecting from multiple python versions. Linux distros may split this up into different packages, but on Windows (or Mac, afaik, though the py launcher is a Windows-only feature) using the official installers is one-stop shopping.
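
For reference, the py launcher side of that looks roughly like this (assuming 3.11 and 3.12 are both installed):

    :: list the interpreters the launcher knows about
    py -0
    :: create a venv with a specific version, then activate it
    py -3.11 -m venv .venv
    .venv\Scripts\activate
    :: run a one-off script under another version
    py -3.12 script.py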

And, no, docker isn't just install and run on Windows. Before Docker made a heavy push for paid Docker Desktop for even personal use, it was close to that.

But now the way to get a free usable docker command line is to install WSL and a Linux environment, install docker there, and then invoke docker via wsl. (Which, of course, you will not find via Docker’s own information, which will try to sell you a paid subscription.)


I tried it just now on my spare Win10 desktop, and it went like this:

1. Download and install Docker desktop for Windows

2. Restart Windows (guess you don't have to do this with just Python)

3. Run Docker Desktop

4. Say "no" to signing into a Docker account

5. Wait for engine to start, which took a few minutes the first time, a bit annoying

6. Pull and run an image (I tried nginx)

It was weird being asked to log in, but the "no" button was pretty clear. I didn't feel like I was forced to use WSL to avoid paying; maybe they've backtracked from something. Unity was far more convincing that I had to pay.


Even at that, there are edge cases, like when a python package such as Tensorflow uses an instruction set that is not available on an Arm machine.


Yeah, personally I wouldn't develop on a different ISA than what I deployed to in production.


Honestly, this. I've moved three companies now to docker-based dev for Python teams. It's better for teams who want to customize how they in particular work without inadvertently blocking themselves from quickly contributing to another team because that team uses a totally different python management scheme. So a team can be conda-in-docker and another team can be virtualenv-in-docker and another team can be raw-python-in-docker, and that one group of weirdos can be jython or whatever other sad shit they cook up.

The common interface is they all use `docker-compose up` and have their editors hooked into the containers.
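
The day-to-day loop being roughly this ("app" is a placeholder service name):

    docker-compose up -d            # bring up the team's stack, whatever is inside
    docker-compose exec app bash    # shell into the container the editor is hooked to
    docker-compose logs -f app      # tail the service's logs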


this is the way.


If you use Homebrew: you can use ‘brew pyenv-sync’ to use Homebrew’s Pythons with pyenv. Similar commands are available for rbenv/nodenv (which always feels like it is missing an ‘e’ to me).


> it compiles python on your machine

What is the alternative? Every solution that I know of on Linux requires you to build Python on the machine: asdf, official Python downloads, etc.


For now most people would do well to stick to python.org installers for mac and windows.

For Linux, official repos are ideal. If you really, really can't use them (which is different from not wanting to), Ubuntu's deadsnakes and Red Hat's EPEL are the best second plan, though already more finicky. See the sketch below.
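
For the deadsnakes route on Ubuntu, that's roughly (pick your version):

    sudo add-apt-repository ppa:deadsnakes/ppa
    sudo apt update
    sudo apt install python3.11 python3.11-venv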

If you use something more exotic, you chose hardship, and you will have to be up to the task.

Anything else will come with bigger caveats than the people promoting them will really admit to.

The story is of course a bit richer, because linux packaging is fun, so I'll round it out with:

https://www.bitecode.dev/p/installing-python-the-bare-minimu...

There is currently no silver bullet to bootstrap Python on Linux. I tried all of them with hundreds of colleagues and trainees.

I do have hope for astral to come up with one, but don't jump on rye thinking your life is saved.


Even on Macs, you can use MacPorts. They provide a lot of versions pre-compiled (and everything is BSD-solid).


I know you mean well, but such advice is why so many beginners have painful Python experience.

Macports and homebrew Pythons are dependencies of other packages. They can be used by you, but they are not meant for you.

This means at some point, they will contain a surprise, and not a good one.


> but such advice is why so many beginners have painful Python experience

Sample of one, but I never encountered anything broken in MacPorts. They seem to embrace the BSD ethos of doing everything right (even if at a slower pace, as some packages lag a few releases behind Homebrew).


Personally I only use MacPorts or Homebrew to install random smaller tools. It's easier to install Python, NodeJS, Postgres, etc binaries from the main website.


These builds are an alternative: https://github.com/indygreg/python-build-standalone

Those are what Rye and hatch use.

Drawbacks: late availability of patch versions, various quirks from how they are built (missing readline, missing some build info that self-compiled C python modules might need.)


I think PEP711 (https://peps.python.org/pep-0711/) is (eventually) a better alternative, because it builds on top of the proven manylinux approach to binary compatibility.


Those are proof of concept builds now and will eventually hopefully replace indygreg's builds.

Not only does the format need to exist but the service of building and publishing them is needed too.


Very cool, this is the first time I'm hearing of PEP 711! I hope this (or some other PEP like it) will get accepted eventually!


I hope this will be merged with the indygreg builds at some point, but the community moves slowly on those issues, so it may take years.


> Every solution that I know of on Linux requires you to build Python on the machine

Unless you need a Python that's not supported by your Linux distribution, you can just use what's available.

On macOS, MacPorts provides compiled versions for 3.2 all the way to 3.13, as well as 2.6 and 2.7. Right now, I have 3.8, 3.9, 3.10, 3.11, 3.12, and a 3.13 development build. The fact it's not Linux (or x86) might cause some frustration.
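
e.g., roughly this, if I remember the select group right:

    sudo port install python312
    sudo port select --set python3 python312   # make it the default python3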


If you’re lucky enough to be on a Linux system that uses apt, some thankless soul maintains a repo called deadsnakes with all these binaries. Fabulous if you’re using any somewhat old version of Python in CI, for instance. Yum-based systems are SOL as far as I can tell; build and host your own binary for that. Apk doesn’t have this either, IIRC.


If you are deploying to such an ancient OS, it's perhaps easier to have the whole OS packaged as a container or a VM and use that. It's not only different Python versions that might bite you.


What makes you think that GP’s comment is talking about older OSes?

My understanding of their comment is that they’re talking about getting older Python interpreters to run on more modern OSes, modern enough that they don’t carry the older Python as a system package anymore. Hence, deadsnakes.


This is not about system packages or OSes; this is about your application needing Python 3.9.7 specifically, and locking to that, and 3.9.7 not being available in repositories anymore (3.9.7 is just an example). So normally you would either need to self-host 3.9.7 somewhere or compile it from source on every new machine, which is terrible for CI but fine for local dev. A one-off local build of a Python version is nothing, but paying 4 minutes on every CI run to compile your version of Python from source makes a lot of people very angry.


> needing Python 3.9.7 specifically

That's indeed extremely specific. I'd imagine this pain could be self-inflicted with C-based extensions that were compiled (and can't be recompiled) with structures that don't exist in other versions.

I don't want to imagine what other eldritch horrors await developers working on this application.


Usually it is not the patch version that is important, e.g. being specifically 3.9.7, but rather picking one supported version and pinning it. Yes, we should all be on the latest Python version that is available in the mainline repos and distros as a binary, but a lot of orgs I’ve seen don’t invest the time and effort to properly upgrade to the latest versions of their deps. So you usually end up with one specific supported Python version that everyone builds against, which may or may not be new enough to be in the official repo.

I can also imagine the scenario you mentioned causing a lock to a patch version. But I see this as just the normal steady state a lot of orgs drift to if they don’t stay on top of version updates.


The thankless soul is the awesome Anthony Sottile, who is also the author of pre-commit.ci: https://twitter.com/codewithanthony

Drop by and send thanks cause he really needs them.


deadsnakes is useful, but also a bit of a misnomer. One can use it to get pretty much any python version, even ones that are too new to be available in your default apt repos. That is, it's not just for 'dead'/EOL versions.


That's exactly why I don't rely on that for Ubuntu, though. It's a single point of failure, when I can just use `asdf` to install Python and easily switch between versions, automatically if set up correctly on projects.


I use asdf on my local machine; stuff like deadsnakes is for when you need your CI to use a VaguelyOldButStillSupported Python version and don't want to either compile Python from source or host your own binary.


My use case is that I'm almost always using the latest Python version, which many Linux distros don't have in their packaging yet, and I'm upgrading code using old versions to the latest. asdf is the best tool for that.
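
For anyone curious, the asdf flow is roughly:

    asdf plugin add python
    asdf install python 3.12.2    # note: the python plugin compiles via python-build
    asdf local python 3.12.2      # writes .tool-versions for the project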


Rye downloads pre-built binaries.

I'll leave further discussion of the reliability and provenance of said binaries to someone else.


Binaries are quite reliable and from a serious project, but rye is young and uses shims.

Still, it's part of that new generation of tooling that brings hope for the next few years.


> What is the alternative?

Using per-distro official installation channels.

E.g. using deadsnakes PPA on Ubuntu, AUR on Arch Linux, etc.


try pyoxidizer


I think that you are painting a bit of an unreasonably bleak view of pyenv. I think that one can easily get value out of it without being a Python expert. I’m not sure how I can refute your “yes, but you’ll eventually run into trouble. You just haven’t clocked enough hours yet”. I can’t prove a negative. But I’ll say that I’ve been writing Python in my day job for a decade.

But, to add to the list of problems, these Python versions IIRC do not compile with optimisation turned on, so they’re by default quite a bit slower than they need to be.


I have not suffered any issues due to pyenv compiling. In our prod boxes we compile Python ourselves anyway. It's a really trivial build process tbh.


I generally like Bite Code's newsletter, but this one was a miss for me.

I’m sure he is trying to promote some sort of workflow in that article, but I don’t understand which one.


I call it the curse of Python.

When the God Of Programming made Python, all the other languages were jealous of its elegance, simplicity and intuitive beauty. When the other programming languages went to complain, he said... "Wait until you see what package systems I will give them... They will never get an environment properly set up..." :-)


And then the Devil made CUDA…

”The CUDA version on your system does not match the CUDA version the realm of mortal men was compiled with”


Oh man, you just confirmed that I’m not crazy. Every single time I use pyenv, it just downloads the source and builds. I got tangled up in so many issues, such as TLS headers on my AL2 boxes. I thought the whole thing was about getting the official binary and installing it, not compiling from source, even for a 2-year-old release.


Data point: pyenv works just fine for me. It helps me with installing/managing many python versions (multiple years).

I guess I'm in the "expert" category. I'm saying it so that people won't be afraid to try it.


Python is quite self-contained; it's not a chore to build it.


Can I ask - why the dislike for virtualenvwrapper? If you are saying that it can have occasional problems, I agree with you. But it makes the process so much simpler. The occasional problems I’ve had (an issue once or twice deleting a venv) pale in comparison to the advantage I get from remembering a couple of fast commands. Is there something better, or am I relegated to “source what/directory/was/it/again/bin/activate”?
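
For context, the couple of fast commands in question, roughly (assuming virtualenvwrapper is installed and its script sourced in your shell rc):

    mkvirtualenv myproject    # create and activate in one go
    workon myproject          # activate from anywhere, no path to remember
    deactivate
    rmvirtualenv myproject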


> See also: "Why not tell people to "simply" use pyenv, poetry or anaconda"

> https://www.bitecode.dev/p/why-not-tell-people-to-simply-use

Just curious: what are the downsides of poetry installed with pipx? The article mentions having to install poetry in another venv, but that's hardly an issue with pipx (you just add an 'x' after 'pip'), and installing pipx is as straight-forward as it can be.
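
For the record, the pipx route is roughly:

    python -m pip install --user pipx
    python -m pipx ensurepath    # makes sure pipx's bin dir is on PATH
    pipx install poetry          # poetry gets its own isolated venv
    pipx upgrade poetry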


I eventually landed on pipx as fighting with Pyenv and Anaconda - via Miniconda - was an exercise in frustration. There's some mucking about in `$HOME/.local`, but this is mostly self-contained and not a huge chore to keep running.

Coming from the Homebrew/Ruby ecosystem - Hey @mikemcquaid - installing an entirely separate package manager just to deal with a few projects felt like the wrong thing to do.

Occasionally, I have still needed to compile Python myself in order to get things to work, which isn't guaranteed not to blow up w/ `brew`, but this has become far less common of late.


Agreed, pipx solves a lot of packaging issues with no downside to speak of. Not just with poetry, but also for tools like virtualenv, ruff and black, and non-dev command line tools.


> People that love pyenv are so adamant at telling everybody they should use it. How it solved all their problems, and got rich and lost 5 pounds.

> It doesn’t support Windows. That’s game over right there, for half of the community.

What do you mean by this? I use pyenv on windows all the time.


Why does it compile Python on your machine?


What's the problem? The repo clearly states that you need to install such and such build prerequisites.

Also, the issues in the article you link are things that damn near every programmer will eventually need to know. Things like PATH are IMO the basics -- if you don't understand this or how to `$packageManager install` something, you're gonna have a rough time programming in general, regardless of language.



