
> which package manager/distribution I need to use, and what system libraries those dependencies need to function properly

I don't understand why things are so complicated in Python+ML world.

Normally, when I have a Python project, I just pick the latest Python version - unless the documentation specifically tells me otherwise (say, it's still Python 2, or 3.11 isn't supported yet). If the project maintainer had any sense, it will ship a dependency list with exact locked versions, so I run `pip install -r requirements.txt` (if there's a requirements.txt), `pipenv sync` (if there's a Pipfile), or `poetry install` (if there's a pyproject.toml). That's three commands to remember, and the only reason it isn't one is that pip (the de-facto package manager) has its limitations and the community hasn't really settled on a successor. Kinda like `make` vs automake vs `cmake` (vs `bazel` and other less common stuff; same story with Python).
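That lockfile-to-command mapping can be sketched as a tiny shell helper (the function name and the precedence order are my own, purely illustrative):

```shell
# Illustrative sketch: pick the install command from whichever
# lockfile the project ships. Checked in an arbitrary order; real
# projects sometimes carry more than one of these files.
detect_installer() {
  if [ -f requirements.txt ]; then
    echo "pip install -r requirements.txt"
  elif [ -f Pipfile ]; then
    echo "pipenv sync"
  elif [ -f pyproject.toml ]; then
    echo "poetry install"
  else
    echo "no lockfile found" >&2
    return 1
  fi
}
```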

External libraries are typically not needed - dependencies either come in binary form as wheels (prebuilt for the most common system types), or get built automatically during installation, assuming `gcc`, `pkgconfig` and the essential headers are available.
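Checking that "assuming" part up front saves a cryptic compile error halfway through `pip install`. A sketch (the helper name is mine; package names in the hint are Debian/Ubuntu ones):

```shell
# Sketch: report which build prerequisites are missing before pip
# falls back to compiling an sdist. Returns nonzero if any are absent.
check_build_deps() {
  missing=""
  for tool in "$@"; do
    command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
  done
  if [ -n "$missing" ]; then
    echo "missing:$missing (on Debian, try: apt install build-essential pkg-config python3-dev)"
    return 1
  fi
  return 0
}
# Usage: check_build_deps gcc pkg-config
```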

Although, I guess, maybe binary wheels don't cover all those Nvidia driver/CUDA variations? I'm not an ML guy, so I'm not sure how this is handled - I've heard there are binary wheels for CUDA libraries, but never used them.
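For what it's worth, PyTorch handles this by publishing separate wheel indexes per CUDA build; picking one can be sketched like this (the version-to-URL table is a snapshot and my own helper, which builds actually exist changes over time):

```shell
# Sketch: map a CUDA toolkit version to the PyTorch wheel index that
# ships matching binaries. The URLs follow the pattern PyTorch
# documents; the set of supported versions is a moving target.
cuda_wheel_index() {
  case "$1" in
    11.8) echo "https://download.pytorch.org/whl/cu118" ;;
    12.1) echo "https://download.pytorch.org/whl/cu121" ;;
    cpu)  echo "https://download.pytorch.org/whl/cpu" ;;
    *)    return 1 ;;  # unknown/unsupported version
  esac
}
# Usage: pip install torch --index-url "$(cuda_wheel_index 12.1)"
```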

Ideally, there's Nix (and poetry2nix) that could take care of everything, but only a few folks write Flakes for their projects.

> Usually the most complicated part is running `make` or `cmake` with `pkgconfig` somewhere in my path

Getting the correct versions of all the dependencies is the trickiest part, as there is no universal package manager - so it's all highly OS/distro specific. Some projects vendor their dependencies just to avoid this (and risk getting stuck with awfully out-of-date stuff).

> Maybe install some missing system libraries if necessary.

And hope their ABIs (if they're dynamically loaded) / headers (if linked against) are still compatible with what the project expects. At least that's my primary frustration when I try to build something and it says it no longer works with whatever the OS provides (mostly Debian stable's fault, lol). Backporting a Debian package is not exactly fun (doubly so if you do it properly instead of handwaving it with checkinstall).
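At least the dynamic-loading half is mechanical to diagnose: `ldd` marks every soname the loader can't resolve as "not found". A small filter for that output (just a sketch; the helper name is mine):

```shell
# Sketch: ldd prints one line per required soname; unresolved ones
# end in "not found". This reads ldd output on stdin and prints only
# the missing library names.
missing_sonames() {
  grep "not found" | awk '{print $1}'
}
# Usage: ldd ./some_prebuilt_binary | missing_sonames
```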



> Ideally, there's Nix (and poetry2nix) that could take care of everything, but only a few folks write Flakes for their projects.

Relevant to "AI, Python, setting up is hard ... nix", there's stuff like:

https://github.com/nixified-ai/flake


The right combo of Nvidia driver/CUDA/random Python ML library is a nightmare at times. This is especially true if you want to use older hardware like a Tesla M40 (dirt cheap, still capable). And may your maker be with you if you tried to use your distro's native drivers first.

It's fair to say part of the blame is on Nvidia, but wow is it frustrating when you have to hunt down these eclectic mixes of driver, toolkit, and library versions.



