You got three other responses before me all pointing at uv. They are all wrong, because uv did not introduce this functionality to the Python ecosystem. It is a standard defined by https://peps.python.org/pep-0723/, implemented by multiple other tools, notably pipx.
You're making the common mistake of conflating how things currently work with how things could work if the responsible group agrees to change how things work. Something being the way it is right now is not the same as something else being "not possible".
Yes, you absolutely can create a language that has syntax otherwise identical to Python (or at least damn close) which implements a feature like this. No, you cannot just replace Python with it. If the Python ecosystem just accepted that clearly better things were clearly better, and started using them promptly, we wouldn't have https://pypi.org/project/six/ making it onto https://pypistats.org/top (see also https://sethmlarson.dev/winning-a-bet-about-six-the-python-2...).
Nobody is claiming this is a trivial problem to solve, but it's also not an impossible one. Other languages have managed to figure out how to achieve this while still maintaining backwards compatibility.
Note that you will be expected to have familiarized yourself generally with previous failed proposals of this sort, and proactively considered all the reasonably obvious corner cases.
Modules being singletons is not a problem in itself, I think? This could work like having two versions of the same library installed as two modules named something like library_1_23 and library_1_24. In my program I could hypothetically have imports like `import library_1_23 as library` in one file and `import library_1_24 as library` in another file. Both versions would be singletons. Then writing `import library==1.23` could work as syntax sugar for `import library_1_23 as library`.
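A rough sketch of that desugaring idea (library_1_23 and library_1_24 are hypothetical side-by-side installs, and the pinned-import syntax itself does not exist today):

# one_module.py  (library_1_23 / library_1_24 are hypothetical package names)
import library_1_23 as library      # this file is pinned to 1.23

# another_module.py
import library_1_24 as library      # this file is pinned to 1.24

# The proposed `import library==1.23` would then be pure syntax sugar for the
# first form; each mangled module stays an ordinary singleton in sys.modules.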
Of course, having two different versions of a library running in the same program could be a nightmare, so all of that may not be a good idea at all, but maybe not because of module singletons.
import numpy==2.1
And if numpy didn't expose a version number in a standard field (which could be agreed upon in a PEP), it would just throw an ImportError. It wouldn't break any old code, and only packages with that explicit field would support the pinned-version import.
And it wouldn't involve trying to extract and parse versions from older packages with some super spotty heuristics.
It would make new code impossible to use with older versions of Python and older packages, but that's already the case.
Maybe the issue is with module namespacing?
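A minimal sketch of those semantics: import_pinned is a hypothetical helper standing in for the proposed syntax, __version__ stands in for whatever standard field a PEP would bless, and it assumes numpy 2.1.x is installed.

import importlib

def import_pinned(name, required):
    # What `import numpy==2.1` might do under the hood: import normally, then
    # fail loudly if the agreed-upon version field is absent or doesn't match.
    module = importlib.import_module(name)
    declared = getattr(module, "__version__", None)
    if declared is None:
        raise ImportError(f"{name} declares no version; cannot honor the pin")
    if not (declared == required or declared.startswith(required + ".")):
        raise ImportError(f"{name} is {declared}, but {required} was requested")
    return module

numpy = import_pinned("numpy", "2.1")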
Yes, this part actually is as simple as you imagine. But in practical terms it means you can't use the feature until, at best, the next release of NumPy. If you want to say, for example, that you need at least version 2 (breaking changes, after all), well, there are already 18 published releases that meet that requirement but are unable to communicate that in the new syntax. To my understanding this can be fixed with post-releases, but that's a serious burden for maintainers, most projects are just not going to do that sort of thing, and it bloats PyPI, too.
And more importantly, that's only one of many problems that need to be solved. And by far the simplest of them.
However, it isn't trivial. The first problem that comes to mind:
Module a first imports somelib>=1.2.0 and then imports b, and b in turn requires somelib>1.2.1. If both versions are available, will they resolve to the same module, or will I have a mess from combining them?
In fact, many packages already define __version__ at the package level.
https://packaging.python.org/en/latest/discussions/versionin...
Edit: What they are solving with uv happens at the moment of standing up an environment, but you're more concerned with code-level protection, whereas they're more concerned with protecting versioning at environment-setup time.
This only helps for those that do, and it hasn't been any kind of standard the entire time. But more importantly, that helps only the tiniest possible bit with resolving the "import a specific version" syntax. All it solves is letting the file-based import system know whether it found the right folder for the requested (or worse: "a compatible") version of the importable package. It doesn't solve finding the right one if this one is wrong; it doesn't determine how the different versions of the same package are positioned relative to each other in the environment (so that "finding the right one" can work properly); it doesn't solve provisioning the right version. And most importantly, it doesn't solve what happens when there are multiple requests for different versions of the same module at runtime, which incidentally could happen arbitrarily far apart in time, and also the semantics of the code may depend on the same object being used to represent the module in both places.
That sounds absolutely fixable to me; it seems more a matter of not having the will to fix it, out of some kind of traditionalism. I've used Python a lot, but it is stuff like this, maddeningly broken for no good reason at all, that has turned me away from it. So as long as I have any alternative I will avoid Python, because I've seen way too many accidents on account of stuff like this, and many lost nights of debugging only to find out that an easily avoidable issue was, once again, the source of much head-scratching.
Do you know what happens when Python does summon the will to fix obviously broken things? The Python 2->3 migration happens. (Perl 6 didn't manage any better, either.) Now "Python 3 is the brand" and the idea of version 4 can only ever be entertained as a joke.
Not possible? Come on.
Almost everyone already uses one of a small handful of conventional ways to specify it, e.g. an `__version__` attribute. It's long overdue that this be standardized so library versions can reliably be introspected at runtime.
Allowing multiple versions to be installed side-by-side and imported explicitly would be a massive improvement.
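For what it's worth, both mechanisms exist today, just not as a single guaranteed standard. A quick sketch, assuming requests is installed (and note that the distribution name passed to importlib.metadata may differ from the import name):

import importlib.metadata
import requests

print(requests.__version__)                    # the conventional attribute, if the package sets it
print(importlib.metadata.version("requests"))  # read from the installed distribution's metadata instead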
Some situations could be improved by allowing multiple library versions, but this would introduce new headaches elsewhere. I certainly do not want my program to have N copies of numpy, PyTorch, etc. because some intermediate library claims to have a just-so dependency tree.
The charitable interpretation of this proposed feature is that, in the cases it doesn't improve, it would handle things exactly as well as the current situation does.
This feature says nothing about the automatic installation of libraries.
This feature is absolutely not about supporting multiple simultaneous versions of a library at runtime.
In the situation you describe, there would have to be a dependency resolution, just like there is when installing the deps for a program today. It would be good enough for me if "first import wins".
When an installer resolves dependency conflicts, the project code isn't running. The installer is free to discover new constraints on the fly, and to backtrack. It is in effect all being done "statically", in the sense of being ahead of the time that any other system cares about it being complete and correct.
Python `import` statements on the other hand execute during the program's runtime, at arbitrary separation, with other code intervening.
> This feature says nothing about the automatic installation of libraries.
It doesn't have to. The runtime problems still occur.
I guess I'll have to reproduce the basic problem description from memory again. If you have modules A and B in your project that require conflicting versions of C, you need a way to load both at runtime. But the standard import mechanism already hard-codes the assumptions that i) imports are cached in a key-value store; ii) the values are singleton and client code absolutely may rely on this for correctness; iii) "C" is enough information for lookup. And the ecosystem is further built around the assumption that iv) this system is documented and stable and can be interacted with in many clever ways for metaprogramming. Changing any of this would be incredibly disruptive.
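A quick, runnable illustration of assumptions i through iv using only the stdlib:

import sys
import json

print(sys.modules["json"] is json)   # i/iii: the cache in sys.modules is keyed by the bare name "json"

import json as json_again
print(json is json_again)            # ii: every later import yields the same singleton object

import importlib
print(importlib.import_module("json") is json)   # iv: documented machinery exposes the same cache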
> This feature is absolutely not about supporting multiple simultaneous versions of a library at runtime.
You say that, but you aren't the one who proposed it. And https://www.hackerneue.com/item?id=45467350 says explicitly:
> and support having multiple simultaneous versions of any Python library installed.
Which would really be the only reason for the feature. For the cases where a single version of the third-party code satisfies the entire codebase, the existing packaging mechanisms all work fine. (Plus they properly distinguish between import names and distribution names.)
# /// script
# dependencies = [
# "requests<3",
# "rich",
# ]
# ///
import requests
from rich.pretty import pprint
resp = requests.get("https://peps.python.org/api/peps.json")
data = resp.json()
pprint([(k, v["title"]) for k, v in data.items()][:10])

uv venv --seed --python=3.12 && source .venv/bin/activate && pip3 install requests && …
I should be able to do "python foo.py" and everything should just work. foo.py should define what it wants and python should fetch it and provide it to foo. I should be able to do "pyc foo.py; ./foo" and everything should just work, dependencies balled up and statically included like Rust or Go. Even NodeJS can turn an entire project into one file to execute. That's what a modern language should look and work like.
The moment I see "--this --that" just to run the default version of something, you've lost me. This is 2025.
#!/usr/bin/env -S uv run --script
#
# /// script
# requires-python = ">=3.12"
# dependencies = ["httpx"]
# ///
import httpx
print(httpx.get("https://example.com"))
https://docs.astral.sh/uv/guides/scripts/#improving-reproduc...
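Per that guide, if you save this as, say, example.py, then `uv run example.py` picks up the inline metadata, or you can chmod +x it and run it directly thanks to the shebang line.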
There are also projects, py2exe and PyInstaller iirc, and others, that try to get the whole static-binary thing going.
You're trying, imo, to make Python into Golang, and if you want to do that, just use Golang. That seems like a far better use of your time.
I don't want to lose multiple hours debugging why something went wrong because I'm using three versions of numpy and seven of torch at the same time and there was a mixup.