
I used Python to write a small utility that I wanted to share with a coworker, and ended up having to rewrite it in Go because I spent 3 hours trying to compile the Python version to an exe so that it would be easier to run on their machine.

I wonder why there isn't an easy way to just package the interpreter inside an EXE for people who don't care about binary size, so that it would be seamless to package up utilities written in Python.



It's kind of frustrating to deal with Python stuff as a user. I wish it didn't have to be like this, because there's a lot of interesting stuff written in Python that's nearly impossible to get to run.

There are too many environment systems (conda, venv, I think I'm forgetting one more), not all of them build or ship on all systems, and then you need to manage them. Sometimes you need the exact correct Python version installed, because god help you if you're on 3.12 and not 3.11, and it has to be on PATH so pip finds the correct dependencies; but you need to do that before you set up your venv and run pip, otherwise you have to tear it all down and start over. Sometimes the dependencies have to build, and they break, because some package is missing from your system.
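For illustration, the usual dance looks something like this (the version number and requirements file are just examples):

  python3.11 -m venv .venv
  source .venv/bin/activate      # or .venv\Scripts\activate on Windows
  pip install -r requirements.txt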

It's kind of a miserable experience.


With luck, `uv tool install` [1] will solve most of the frustrations with Python programs on Linux/Mac/Windows, since uv combines the functionality of pipx and pyenv and doesn't itself depend on Python. It is a single binary that manages Python interpreters and project environments. uv is very actively developed and seems to be rapidly gaining adoption. It makes me optimistic about its future (and, correspondingly, pessimistic about the future of Python tooling that competes with it).

I already have a success story of helping a friend install ArchiveBox [2] with uv. ArchiveBox required an older version of Python than their system had. The quickest and easiest solution I could think of was:

  pipx install uv
  uv tool install --python 3.8 archivebox
You could also install uv by other means.

[1] https://docs.astral.sh/uv/concepts/tools/

[2] https://archivebox.io/


> imagine here the xkcd comic about standards

These tools come and go in the Python world... It's basically like JavaScript frameworks. It seems like people recommend a different one every month.


Yes, just like that. Since 2015, when I started working professionally with Python, every year a new project shows some kind of "messiah complex" around the Python software-distribution problem... They usually say somewhere in the README: "This time we will be saved! We have the right tool here that does this and that."

No, it's not solved! And it probably never will be.

Honestly, things do get better; 2015 was far worse than now, but it's still very far from perfect.

For instance, Go static builds are much superior. Or even Rust: for all the language's issues, at least the tooling is good and software distribution works great.


Yeah, uv works great - the --python in your example can even be awkward Pythons like [email protected] (provided PyPy works on your OS/arch combo).
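For example, something along these lines (the tool name is hypothetical, and the exact PyPy version depends on what uv can fetch for your platform):

  uv tool install --python pypy@3.10 some-cli-tool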


I've come to realize that putting everything into a container is the only viable way to share a python program.
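For what it's worth, a minimal sketch of that approach, assuming a hypothetical requirements.txt and main.py entry point:

  FROM python:3.12-slim
  WORKDIR /app
  COPY requirements.txt .
  RUN pip install --no-cache-dir -r requirements.txt
  COPY . .
  ENTRYPOINT ["python", "main.py"]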

I'll certainly check out PEX. I think distribution as a single binary is likely the largest gap for Python right now. Java solved it with the JAR file, and most compiled languages have statically compiled binaries.

At Google, PAR files have been quite a useful way to handle this for at least 18 years now; I hope this gets reasonably solved everywhere.


Docker exists because running a Python program is so brutally difficult that packaging up half an OS is the only way to reliably run one.


I would say it's not just a Python problem; Node.js and Ruby seem to have almost the same problems. I'd say Docker exists because packaging C/C++ dependencies is an impossible problem that hasn't been solved in 5 decades. The build tools absolutely suck.


It isn't just the build tools. Unix, as the platform for Worse Is Better, has decided never to solve this problem. Docker solves the problem of having an underpowered, broken dynamic linker.

Docker should have never been necessary.


> under powered broken dynamic linker.

In your mind what would a good dynamic linker look like?


> because packaging C/C++ dependencies is an impossible problem that hasn't been solved in 5 decades

C/C++ build systems suck. But deploying a C++ program is much, much easier than deploying a Python one. And Linux is waaaay worse than Windows for both building and deploying, imho.


> Java solved it with a JAR file

JARs still require a JRE to run, and the runtime needs to be invoked. The closest equivalent would probably be a Python zipapp (which is just Python files zipped into a .pyz that the interpreter can run).

Static binaries are one advantage that languages like Go and Rust have though, yeah.
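For reference, building a zipapp is roughly this (assuming a hypothetical package directory myapp/ with a main() in myapp/cli.py):

  pip install -r requirements.txt --target myapp/   # vendor pure-Python dependencies into the source dir
  python -m zipapp myapp -m "cli:main" -o myapp.pyz
  python myapp.pyz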


You can bundle a JRE with a JAR to get a self contained executable.
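For example, with the JDK's own jpackage tool (the names and paths here are hypothetical):

  jpackage --type app-image --name mytool --input target/ --main-jar app.jar --main-class com.example.Main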

Also don’t forget that many compiled languages depend on a particular libc so you’re not totally in the clear there.


A JRE + binary deployment weighs in at over 100 MB.

An ordinary compiled binary would be a fraction of that size, saving you time in provisioning and deployment.


Small price to pay if you have a JVM team already.

Most programmers (HN is a bubble) hate learning new languages.


I also end up avoiding Python not because it's a bad language, but because it's so much more convenient to work with compiled artifacts than to worry about shipping around a bunch of source or object files and a full compiler. Not to mention the speed of Go and the ability to write concurrent code without a lot of effort.

The only time I really use python is when I need a quick and dirty script.


PyApp is really good for this: https://github.com/ofek/pyapp

It is just a way to wrap your Python code into a self-bootstrapping executable: https://ofek.dev/pyapp/latest/runtime/

You can get it set up pretty easily with hatch: https://hatch.pypa.io/1.9/plugins/builder/app/
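Roughly (a hedged sketch; the app build target has to be enabled in pyproject.toml per the docs above):

  hatch build -t app    # builds a PyApp-wrapped, self-bootstrapping executable under dist/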

But yeah, I agree the problem with Python is that there isn't one tool like Deno that just does everything that has to do with packaging, locking, venvs, and building a cross-platform binary.


As of a few months ago, pex supports including an interpreter, either directly inline or lazily downloaded (and cached): https://docs.pex-tool.org/scie.html
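For example, something like this (package and entry-point names are hypothetical):

  pex mytool -c mytool --scie lazy -o mytool
  ./mytool    # fetches and caches a CPython interpreter on first run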


It's probably on the list of things you've tried, but I've had the most luck with Nuitka (in standalone mode, not onefile). Unlike PyInstaller, it really is the executable it claims to be, not just an elaborate zip file containing python.exe and your source files.


Nuitka (and a couple of other Python distribution techniques) tripped my employer's antivirus.

The packaging/deployment of python keeps me from using it for anything other than one-off, disposable programs.


That often happens when a program is just a self-extractor that puts the real executable in a temp directory and runs that. That's how most Python packagers work (including PyInstaller), but Nuitka only does that in "onefile" mode (it self-extracts the files that make up "standalone" mode).

In "standalone" mode, the executable really is just a normal program. It's actually the various fragments of the Python bytecode interpreter unrolled according to the Python bytecode of your program and the packages it uses. All the C extension modules are shared libraries in the same directory (this means you can use LGPL C extension modules like PyQt BTW) and any package data files are alongside in that directory too. This makes it seem a bit messy but in reality it's cleaner than what onefile does under the hood.

(This is what I was alluding to in my parent comment but I didn't explain myself.)
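For reference, the two modes look roughly like this (main.py is a hypothetical entry script):

  python -m nuitka --standalone main.py   # produces main.dist/ with a real executable plus shared libs
  python -m nuitka --onefile main.py      # wraps that same dist into a self-extracting single file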


While it does proper transpiling, Nuitka is still also a zip containing Python and Python files, because there are things it can't deal with.


No it's not, in standalone mode. Even in onefile mode, which is what you're referring to, the contents on the inside are the same as standalone mode, so it's still more than just Python and your source files. It does include the Python interpreter as a library so it can handle calls to `exec()`, but almost all code is unrolled to C and compiled directly.


I've always been curious why Nim never took off as the solution for use cases like this. It combines the simplicity and clarity of Python-like syntax with the performance of a natively compiled language, which looks like the best of both worlds.


Python's strength is not the syntax but the standard and 3rd-party libraries.


3rd-party libraries don't spring from nowhere, and no language starts out with them in abundance. People have to be motivated enough to write all those libraries in the first place, and a lot of them were written just to get Python syntax over C code.

I'm not saying it's the only reason to choose python now but it's definitely among the biggest reasons.


Not necessarily. Python had the best C-API; that was the main reason. If Nim or Lisp copied that C-API, people might move.

It is safe to say in 2024 that people do not want FFIs.


Yes but we're still getting to the same point.

Why not just call the C code you've already written in C? Because they would rather use Python (or Python-like) syntax.

I don't think we actually disagree here. Even your point about the better C-API doesn't indicate that syntax wasn't a deciding factor, just that one of several options had better compatibility.


My friend is building a tool to do something like this using the Actually Portable Python from Cosmopolitan: https://github.com/metaist/cosmofy

You run one command and it generates a single executable that runs on Mac, Linux, and Windows alike. Pretty nice for deploying simple Python scripts.


There is: PyInstaller


Seems like I have it in my browsing history. I remember not being able to run the executable it produced; it couldn't find a library it was supposed to have. This was just using the default "pyinstaller your_program.py", and I was frustrated enough not to dig deeper into why that was. Will definitely give it another try in the future.


PyInstaller-made executables also used to have a habit of getting flagged by security software as malicious (maybe that's why you couldn't run it?) -- apparently, so many malware writers used it that it ruined the party for everyone.

Fortunately, that was only the 32-bit version of Python 2.7. Using 64-bit versions or Python 3 was enough to not get flagged as malicious. I figured that out when I decided I didn't want to teach myself Go just then to deploy something that had worked the day before.


Pyinstaller can be a bit fiddly to get right initially if you have a lot of package dependencies, but other than that it works really well.
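For example, modules that are imported dynamically usually just need to be declared (the module name here is hypothetical):

  pyinstaller --onefile --hidden-import some_dynamic_module your_program.py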


Using the spec files for persistent, readable configuration also goes a long way. If you treat PyInstaller as a Python module you can automate the whole thing with just Python, including the spec files, since it executes them as Python scripts.

I've had automated builds running mostly untouched for years here https://github.com/Numerlor/Auto_Neutron/blob/master/pyinsta..., though it doesn't have any particularly troublesome packages other than PySide6
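A hedged sketch of what driving it from Python looks like (the spec file name is hypothetical):

  import PyInstaller.__main__

  # equivalent to running "pyinstaller build.spec --noconfirm" on the command line
  PyInstaller.__main__.run(["build.spec", "--noconfirm"])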


Yep I used it at my last job and it worked great! Startup times were horrible, but that didn't matter so much to us and it solved tons of problems we had with people messing up their python environments. Takes some tweaking to get certain modules (like scikit-rf) to work, but never found an issue that couldn't be solved.

Would recommend!


Same, but I made the mistake of rewriting in Rust. In the end I had to waste 4+ hours getting a Windows VM and building there to have a working program.

Is Go better at building truly multiplatform binaries from Linux, in practice?


Can you provide any more specifics? I've done the same and had good luck with Rust across most Linux targets, but found Go easier for musl, armv7, and FreeBSD.
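For context, cross-building Go from Linux is usually just a matter of environment variables (output names are hypothetical):

  GOOS=windows GOARCH=amd64 go build -o tool.exe .
  GOOS=linux GOARCH=arm GOARM=7 go build -o tool-armv7 .
  CGO_ENABLED=0 go build -o tool-static .   # fully static, runs on musl-based systems too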


You can always use cx_Freeze: https://pypi.org/project/cx-Freeze/


I love to hear these kinds of success stories! Glad you found proper tooling to stay productive with.


This is more a failure story of python than a success story.


Fwiw that's what we do at work but I don't think it's open source


Why not just use Docker?

It's been around for years now, is super battle-tested, and the user tooling continues to get easier. As a bonus, it works not only for Python but for other tooling as well.


Depends on which part the coworker needed. If it was passing around a reproducible environment, sure, Docker works. But if they needed "here's a thing I can double-click and it'll just work" then Docker has no real advantages.



