When I drop into a Node.js project, usually some things have changed, but I always know that if I need to, I can find all of my dependencies in my node_modules folder, and I can package up that folder and move it wherever I need to without breaking anything, needing to reset my PATH or needing to call `source` inside a Dockerfile (oh lord). Many people complain about Node and npm, but as someone who works on a million things, Node/npm is never something I need to think about.
Python/pip though… Every time I need to containerize or set up a Python project for some arbitrary task, there’s always an issue with “Your Linux distro doesn’t support that version of Python anymore”, forcing me to use a newer version than the project wants and triggering an avalanche of new “you really shouldn’t install packages globally” messages, demanding new --yes-destroy-my-computer-dangerously-and-step-on-my-face-daddy flags and crashing my automated scripts from last year.
And then there’s Conda, which has all of these problems and is also closed source (I think?) and has a EULA, which makes it an even bigger pain to automate cleanly (And yes I know about mamba, and miniconda, but the default tool everyone uses should be the one that’s easy to work with).
And yes, I know that if I was a full-time Python dev there’s a “better way” that I’d know about. But I think a desirable quality for languages/ecosystems is the ability for an outsider to drop in with general Linux/Docker knowledge and be able to package things up in a sometimes unusual way. And until uv, Python absolutely failed in this regard.
I think a lot of the decades-old farce of Python package management would have been solved by this:
https://peps.python.org/pep-0582/
https://discuss.python.org/t/pep-582-python-local-packages-d...
Having a dependency cache and a build tool that knows where to look for it is a much superior solution.
If you have a local dependency repo and a dependency manifest, then during the build you can either:
1. Check if local repo is in sync - correct build, takes more time
2. Skip the check - risky build, but fast
If the dependencies are only in the cache directory, you can have both - correct and fast builds.
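In uv terms, that trade-off maps roughly onto two flags (a sketch; assuming uv's documented `--locked` and `--frozen` behaviour):
# correct build: fail if uv.lock is out of sync with pyproject.toml
uv sync --locked
# fast but risky build: install exactly what uv.lock says, skip the consistency check
uv sync --frozen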
What's the "this" that is supposedly always your issue? Your comment is phrased as if you're agreeing with the parent comment but I think you actually have totally different requirements.
The parent comment wants a way to have Python packages on their computer that persist across projects, or don't even have a notion of projects. venv is ideal for that. You can make some "main" venv in your user directory, or a few different venvs (e.g. one for deep learning, one for GUIs, etc.), or organise it however you like. Before making or running a script, you can activate whichever one you prefer and do exactly what the parent commenter requested - make use of already-installed packages, or install new ones (just pip install) and they'll persist for other work. You can even switch back and forth between your venvs for the same script. Totally slapdash, because there's no formal record of which scripts need which packages, but also no ceremony to writing new code.
Whereas your requirements seem to be very project-based - that sounds to me like exactly the opposite point of view. Maybe I misunderstood you?
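For example, the "main venv" workflow described above is just this (a sketch; the paths and package names are placeholders):
python3 -m venv ~/venvs/main          # one-time setup of a general-purpose venv
source ~/venvs/main/bin/activate      # or ~/venvs/deeplearning, ~/venvs/gui, ...
pip install requests numpy            # persists in that venv for later sessions
python some_script.py
deactivate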
> Python/pip though… Every time I need to containerize or setup a Python project for some arbitrary task, there’s always an issue with “Your Linux distro doesn’t support that version of Python anymore” [...]
How are you containerizing Python projects? What confuses me about your statement are the following things:
(1) How old must the Python version of those projects be to no longer be supported by any decent GNU/Linux distribution?
(2) Are you not using official Python docker images?
(3) What's pip gotta do with a Python version being supported?
(4) How does that "Your Linux distro doesn’t support that version of Python anymore" show itself? Is that a literal error message you are seeing?
> [...] demanding new --yes-destroy-my-computer-dangerously-and-step-on-my-face-daddy flags and crashing my automated scripts from last year
It seems you are talking about installing things in system Python, which you shouldn't do. More questions:
(1) Why are you not using virtual environments?
(2) You are claiming Node.js projects to be better in this regard, but actually they are just creating a `node_modules` folder. Why then is it a problem for you to create a virtual environment folder? Is it merely, that one is automatic, and the other isn't?
> This was always my issue with pip and venv: I don’t want a thing that hijacks my terminal and PATH, flips my world upside down and makes writing automated headless scripts and systemd services a huge pain.
It is very easy to activate a venv just for one command. Use a subshell, where you `. venv/bin/activate && python ...(your program invocation here)...`. Aside from that, projects can be set up so that you don't even see that they are using a venv. For example I usually create a Makefile that does the venv activating and running and all that for me. Rarely, if ever, do I have to activate it manually. Since each line in a Makefile target runs in its own shell, nothing ever pollutes my actual top-level shell.

Debian 13 defaults to Python 3.13. Between Python 3.12 and Python 3.13 the support for `pkg_config` got dropped, so pip projects like
https://pypi.org/project/remt/
break. What I was not aware of: `venv`s need to be created with the version of Python they are supposed to be run with. So you need to have a downgraded Python executable first.
This is one of uv’s selling points. It will download the correct python version automatically, and create the venv using it, and ensure that venv has your dependencies installed, and ensure that venv is active whenever you run your code. I’ve also been bit by the issue you’re describing many times before, and previously had to use a mix of tools (eg pyenv + pipenv). Now uv does it all, and much better than any previous solution.
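For the version problem above, the uv flow is roughly (a sketch; assumes uv's managed-Python downloads work as documented):
uv python install 3.12                 # fetches a standalone CPython 3.12 if needed
uv venv --python 3.12                  # creates .venv using that interpreter
uv pip install -r requirements.txt     # installs into .venv, no activation required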
Would you help me make it work?
docker run -it --rm -v$(pwd):/venv --entrypoint python python:3.12-alpine -m venv /venv/remt-docker-venv
How do I source it?
cd remt-docker-venv/
source bin/activate
python --version
bash: python: command not found

You could also pass the `--copies` parameter when creating the initial venv, so it's a copy and not symlinks, but that is not going to work if you're on macOS or Windows (because the binary platform is different to the Linux that's running the container), or if your development Python is built with different library versions than the container you're starting.
The problem is you are mounting a virtual environment you have built in your development environment into a Docker container. Inside your virtual environment there's a `python` binary that in reality is a symlink to the python binary in your OS:
cd .venv
ls -l bin/python
lrwxr-xr-x@ 1 myuser staff 85 Oct 29 13:13 bin/python -> /Users/myuser/.local/share/uv/python/cpython-3.13.5-macos-aarch64-none/bin/python3.13
So, when you mount that virtual environment in a container, it won't find the path to the python binary.

The most basic fix would be recreating the virtual environment inside the container, so from your project (approximately, I don't know the structure):
docker run -it --rm -v$(pwd):/app --entrypoint ash ghcr.io/astral-sh/uv:python3.12-alpine
/ # cd /app
/app # uv pip install --system -r requirements.txt
Using Python 3.12.12 environment at: /usr/local
Resolved 23 packages in 97ms
Prepared 23 packages in 975ms
Installed 23 packages in 7ms
[...]
/app # python
Python 3.12.12 (main, Oct 9 2025, 22:34:22) [GCC 14.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
But, if you're developing and don't wanna build the virtual environment each time you start the container, you could create a cache volume for uv, and after the first installation everything is going to be way faster:
# First run
docker run -ti --rm --volume .:/app --volume uvcache:/uvcache -e UV_CACHE_DIR="/uvcache" -e UV_LINK_MODE="copy" --entrypoint ash ghcr.io/astral-sh/uv:python3.12-alpine
/ # cd /app
/app # uv pip install -r requirements.txt --system
Using Python 3.12.12 environment at: /usr/local
Resolved 23 packages in 103ms
Prepared 23 packages in 968ms
Installed 23 packages in 16ms
[...]
# Second run
docker run -ti --rm --volume .:/app --volume uvcache:/uvcache -e UV_CACHE_DIR="/uvcache" -e UV_LINK_MODE="copy" --entrypoint ash ghcr.io/astral-sh/uv:python3.12-alpine
/ # cd /app
/app # uv pip install -r requirements.txt --system
Using Python 3.12.12 environment at: /usr/local
Resolved 23 packages in 10ms
Installed 23 packages in 21ms
You can also see some other examples, including a Docker Compose one that automatically updates your packages, here: https://docs.astral.sh/uv/guides/integration/docker/#develop...
---
Edit notes:
- UV_LINK_MODE="copy" is to avoid a warning when using the cache volume
- Creating the venv with `--copies` and mounting it into the container would fail
if your host OS is not exactly the same as the containers, and also defeats in a
way the use of a versioned Python container

Literally my case. I recently had to compile an abandoned six-year-old scientific package written in C with Python bindings. I wasn’t aware that modern versions of pip handle builds differently than they did six years ago — specifically, that pip now compiles wheels within an isolated environment. I was surprised to see a message indicating that %package_name% was not installed, yet I was still able to import it. By the second day, I eventually discovered the --no-build-isolation option of pip.
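In practice that fix looks something like this (a sketch; the package path and the build dependencies are placeholders for whatever the old package expects at build time):
pip install numpy cython setuptools wheel                   # build-time deps, visible because isolation is off
pip install --no-build-isolation ./old-scientific-package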
This works because of the relative path to the pyvenv.cfg file.
Python sticks out for having the arrogance to think that it’s special, that “if you’re using Python you don’t need Docker, we already solved that problem with venv and conda”. And like, that’s cute and all, but I frequently need to package Python code and code in another language into one environment, and the fact that their choice for “containerizing” things (venv/conda) plays rudely with every other language’s choice (Docker) is really annoying.
If that's not good enough for you, you could do some devops stuff and build a docker container in which you compile Python.
I don't see where it is different from some npm project. You just need to use the available resources correctly.
pip and venv are not such things. The activation script is completely unnecessary, and provided as a convenience for those to whom that workflow makes more sense.
> Every time I need to containerize or setup a Python project for some arbitrary task, there’s always an issue with “Your Linux distro doesn’t support that version of Python anymore“
I can't fathom why. First off, surely your container image can just pin an older version of the distro? Second, right now I have Python versions 3.3 through 3.14 inclusive built from source on a very not-special consumer Linux distro, and 2.7 as well.
> and triggering an avalanche of new “you really shouldn’t install packages globally” messages, demanding new --yes-destroy-my-computer-dangerously-and-step-on-my-face-daddy flags and crashing my automated scripts from last year.
Literally all you need to do is make one virtual environment and install everything there, which again can use direct paths to pip and python without sourcing anything or worrying about environment variables. Oh, and fix your automated scripts so that they'll do the right thing next time.
> I know that if I was a full-time Python dev there’s a “better way” that I’d know about.
Or, when you get the "you really shouldn't install packages globally" message, you could read it — as it gives you detailed instructions about what to do, including pointing you at the documentation (https://peps.python.org/pep-0668/) for the policy change. Or do a minimum of research. You found out that venvs were a thing; search queries like "python venv best practices" or "python why do I need a venv" or "python pep 668 motivation" or "python why activate virtual environment" give lots of useful information.
The shame is ... it never had to be that way. A venv is just a directory with a pyvenv.cfg, symlinks to an interpreter in bin, and a site-packages directory in lib. Running anything with venv/bin/python _is_ running in the virtual environment. Pip operations in the venv are just venv/bin/python -m pip ... . All the source/deactivate/shell nonsense obfuscating that reality did a disservice to a generation of python programmers.
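Concretely (a sketch; the script and package names are placeholders):
python3 -m venv .venv
.venv/bin/python -m pip install requests    # no activation, no PATH games
.venv/bin/python my_script.py               # runs "inside" the venv all the same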
It isn't that way. Nothing is preventing you from running the venv's python executable directly.
But the original designer of the concept appears to have thought that activation was a useful abstraction. Setting environment variables certainly does a lot to create the feeling of being "in" the virtual environment.
Seriously, this is why we have trademarks. If Anaconda and Conda (a made-up word that only makes sense as a nickname for Anaconda and thus sounds like it’s the same thing) are two projects by different entities, then whoever came second needs to change their name, and whoever came first should sue them to force them. Footguns like this should not be allowed to exist.
Anaconda suddenly increased its licensing fees, like Broadcom did with VMware; many companies stopped using it because of the sudden increase in costs.
https://blog.fulcrumgenomics.com/p/anaconda-licensing-change... https://www.theregister.com/2024/08/08/anaconda_puts_the_squ...
This is not anything like a fact. For three years now (since the 3.11 release) Python distributions on Linux have in fact taken special measures to prevent the user from using tools other than the system package manager to install into the global environment. And for thirteen years (since the 3.3 release) Python has offered standard library functionality to create isolated environments specifically to avoid that problem. (And that functionality is based on a third party library with public releases going back eighteen years.)
Pip is designed around giving you the freedom to choose where those environments are (by separately creating them) and your strategy for maintaining them (from a separate environment per-project as a dev, to a single-environment-still-isolated-from-the-system for everything as a data scientist, and everything in between).
Treating python as a project level dependency rather than a system level dependency is just an excellent design choice.
Nobody is treating Python as a project level dependency. Your Linux distro treats it as a system level dependency, which is exactly why you encountered the problem you did.
When you create a virtual environment, that does not install a Python version. It just makes symlinks to a base Python.
Building Python from source, and setting it up in a way that doesn't interfere with the package manager's space and will cause no problems, is easy on major distros. I have access to over a dozen builds right now, on Mint which is not exactly a "power user" distro (I didn't want to think about it too much when I initially switched from Windows).
Only if that "program that uses Python" is itself provided by a system package for global installation.
> so you have python packages bundled as system packages which can conflict with that same package installed with pip.
Right; you can choose whether to use system packages entirely within the system-package ecosystem (and treat "it's written in Python" as an irrelevant implementation detail); or you can create an isolated environment so as to use Python's native ecosystem, native packaging, and native tools.
I don't know why anyone would expect to mingle the two without problems. Do you do that for other languages? When I tried out Ruby, its "Bundler" and "gem" system was similarly isolated from the system environment.
https://uploads.dailydot.com/2024/04/damn-bitch-you-live-lik...
You can create venvs wherever you please and then just install stuff into them. Nobody forces the project onto you. At work we don't even use the .toml yet because it's relatively new; we still use a python_requirements.txt and install into a venv that is global to the system.
At work, we use uv pip freeze to generate a stricter requirements file.
> [...] at work we don't even use the .toml yet because it's relatively new, we still use a python_requirements.txt and install into a venv that is global to the system.
Unless your `python_requirements.txt` also carries checksums, like uv's lock files or poetry's lock files do, that is. Though of course things get spicy and non-reproducible again if you then have multiple projects/services, each with their own `python_requirements.txt`, all installing into that same global venv ...
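If you want checksums without leaving the requirements-file workflow, something like this should do it (a sketch; the .in filename is made up, and I'm assuming `--generate-hashes` and `--require-hashes` in uv's pip interface behave like their pip-tools/pip counterparts):
uv pip compile python_requirements.in --generate-hashes -o python_requirements.txt
uv pip install --require-hashes -r python_requirements.txt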
I think you're basically suggesting that you'd have a VM or something that has system-wide packages already preinstalled and then use uv on top of it?
I don't see anything resembling "environments" in the list of features or in the table of contents. In some sections there is stuff like "When working on a project with uv, uv will create a virtual environment as needed", but it's all about environments as tied to particular projects (and maybe tools).
You can use the `uv venv` and the `uv pip` stuff to create an environment and install stuff into it, but this isn't really different from normal venvs. And in particular it doesn't give me much benefit over conda/mamba.
I get that the project-based workflow is what a lot of people want, and I might even want it sometimes, but I don't want to be forced into foregrounding the project.
The advantage of being forced to do this is that other people (including yourself on a new laptop) can clone your project, run `uv sync` and get working. It's the death of "works on my machine" and "well it'll take them a couple of weeks to properly get up and running".
I know this might be a strange idea on HN, but tons of people writing code in Python, who need access to PyPI packages to do what they're doing, have no intention whatsoever of providing a clonable project to others, or sharing it in any other way, or indeed expecting anyone else on the planet to run the code.
It takes a couple of seconds to set up, and then you just use uv add instead of (uv) pip install to add things to the environment, and the project file is kept in sync with the state of the environment. I'm not really understanding what it is, for the workflow you describe, that you expect a tool to do that uv isn’t providing.
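That setup is roughly (a sketch; project and package names are placeholders):
uv init myproject && cd myproject      # writes a minimal pyproject.toml
uv add requests                        # updates pyproject.toml and uv.lock, installs into .venv
uv run python -c "import requests; print(requests.__version__)"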
How about the advantage of not taking an entire lunch break to resolve the environment every time you go to install a new library?
That was the biggest sticking point with conda/mamba for me. It's been a few years since I last used them but in particular with geospatial packages I would often run into issues.
uv venv fooenv
Activate the virtual environment on Windows (yes I'm sorry, that's what I'm currently typing on!):
.\fooenv\Scripts\activate
Run some environment commands:
uv pip install numpy
uv pip freeze
uv pip uninstall numpy
If you run python now, it will be in this virtual environment. You can deactivate it with:
deactivate
Admittedly, there doesn't seem to be much benefit compared to traditional venv/pip for this use case.

This is covered in the section of the docs titled "The pip interface": https://docs.astral.sh/uv/pip/
Performance?
It is still beneficial not to install stuff system-wide, since installing globally makes it easy to forget which stuff you already have installed and which is a missing dependency.
Keeping track of dependencies is kind of part of a programmer's work, so as long as you're writing these things mostly for yourself, do whatever you like. And I say that as someone who treats everything like a project that I will forget about in 3 days and need to deploy on some server a year later.
Note that I'm mostly in the research/hobby environments - I think this approach (and Python in general, re: some other discussions here about the language) works really well, especially for the latter, but the closer you get to "serious" work, the more sense the project-environment approach makes, of course.
Example: https://treyhunner.com/2024/12/lazy-self-installing-python-s...
If not, where do you see a meaningful difference?
tbh this has been a sticking point for me too with uv (though I use it for everything now). I just want to start a repl with a bunch of stuff installed so I can try out a bunch of stuff. My solution now is to have a ~/tmp dir where I can mess around with all kinds of stuff (not just python) and there I have a uv virtualenv with all kinds of packages pre-installed.
Right, it's this. I get the feeling a lot of people here don't work that way though. I mean I can understand why in a sense, because if you're doing something for your job where your boss says "the project is X" then it's natural to start with a project structure for X. But when I'm going "I wonder if this will work..." then I want to start with the code itself and only "productionize" it later if it turns out to work.
I hope the people behind uv or someone else address this. A repl/notebook thing that runs on a .venv preinstalled with stuff defined in some config file.
So, create a project as a playground, put what you want it to include (including something like Jupyter if you want notebooks) in the pyproject.toml and... use it for that?
What do you want a tool to do for that style of exploration that uv doesn't already do? If you want to extract stuff from that into a new, regular project, that maybe could use some tooling, sure, that would take some new tooling.
Do you need a prepackaged set of things to define the right “bunch of stuff” for the starting point? Because that will vary a lot by what your area of exploration is.
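Something like this would cover the playground described above (a sketch; the package list is just one possible "bunch of stuff"):
uv init playground && cd playground
uv add jupyterlab ipython numpy pandas matplotlib
uv run ipython                         # or: uv run jupyter lab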
uv run --with=numpy,pandas python

uv.lock is a blessing
I use the 'bare' option for this
uv has a script mode, a temp env mode, and a way to superimpose a temp env on top of an existing env.
See: https://www.bitecode.dev/p/uv-tricks
That's one of the selling points of the tool: you don't need a project, you don't need to activate anything, you don't even need to keep code around.
Yesterday I wanted to mess around with loguru in ipython. I just ran `uvx --with loguru ipython` and I was ready to go.
Not even a code file to open. Nothing to explicitly install nor to clean up.
For a tool that is that fantastic and creates such enthusiasm, I'm always surprised by how little of its feature set people know about. It can do crazy stuff.
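The script mode mentioned above looks roughly like this (a sketch; assumes uv's support for PEP 723 inline metadata):
cat > scratch.py <<'EOF'
# /// script
# dependencies = ["requests"]
# ///
import requests
print(requests.get("https://example.com").status_code)
EOF
uv run scratch.py                      # uv builds a throwaway env with requests and runs the script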
or `uvx --with my-package ipython`
That gets problematic if environments go out of sync, or you need different versions of python or dependencies.
So you are right, you probably won't benefit a lot if you just have one big environment and that works for you, but once you pull things into a project, uv is the best tool out there atm.
You could also just create a starter project that has all the things you want, and then later on pull it out; that would be the same thing.
Could it be that you’re just used to separate environments causing so much pain that you avoid it unless you’re serious about what you’re doing?
That is exactly 100% what I demand. Projects should be - must be - completely isolated from one another.
Quite frankly anything else seems utterly insane to me.