Preferences

Oof. I wish they could support version imports

    import torch==2.6.0+cu124
    import numpy>=1.2.6
and support having multiple simultaneous versions of any Python library installed. End this conda/virtualenv/docker/bazel/[pick your poison] mess

It's been explained many times before why this is not possible: the library doesn't actually have a version number. The distribution of source code on PyPI has a version number, but the name of this is not connected to the name of any module or package you import in the source code. The distribution can validly define zero or more modules (packages are a subset of modules, represented using the same type in the Python type system).
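A concrete illustration of that disconnect, using only the stdlib (`packages_distributions` requires Python 3.10+): the name you install from PyPI and the name you import need not match, and one distribution can provide several import names, or none. The Pillow/PIL pair used in the comment is just the classic example.

    # Sketch: the name you install is not necessarily the name you import.
    # packages_distributions() maps top-level import names to the
    # distribution(s) that provide them.
    from importlib.metadata import packages_distributions

    mapping = packages_distributions()
    # Classic mismatch: installing the "Pillow" distribution gives you the
    # "PIL" import name. There is no 1:1 relationship to pin a version
    # against, which is the point being made above.
    for import_name, dists in sorted(mapping.items())[:5]:
        print(import_name, "->", dists)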

You got three other responses before me all pointing at uv. They are all wrong, because uv did not introduce this functionality to the Python ecosystem. It is a standard defined by https://peps.python.org/pep-0723/, implemented by multiple other tools, notably pipx.

> It's been explained many times before why this is not possible: the library doesn't actually have a version number. The distribution of source code on PyPI has a version number, but the name of this is not connected to the name of any module or package you import in the source code.

You're making the common mistake of conflating how things currently work with how things could work if the responsible group agrees to change how things work. Something being the way it is right now is not the same as something else being "not possible".

No, changing this breaks the world. A huge fraction of PyPI becomes completely invalid overnight, and the rest fails the expected version checks. Not to mention that the language is fundamentally designed around the expectation that modules are singletons. I've written about this at length before but I can't easily find it right now (I have way too many bookmarks and not nearly enough idea how to organize them).

Yes, you absolutely can create a language that has syntax otherwise identical to Python (or at least damn close) which implements a feature like this. No, you cannot just replace Python with it. If the Python ecosystem just accepted that clearly better things were clearly better, and started using them promptly, we wouldn't have https://pypi.org/project/six/ making it onto https://pypistats.org/top (see also https://sethmlarson.dev/winning-a-bet-about-six-the-python-2...).

The hard part is making the change. Adding an escape hatch so older code still works is easy in comparison.

Nobody is claiming this is a trivial problem to solve, but it's also not an impossible problem. Other languages have managed to figure out how to achieve this and still maintain backwards compatibility.

You're welcome to bring a concrete proposal to, e.g., https://discuss.python.org/c/ideas/6 , or ask around the core devs to find a PEP sponsor.

Note that you will be expected to have familiarized yourself generally with previous failed proposals of this sort, and proactively considered all the reasonably obvious corner cases.

> Not to mention that the language is fundamentally designed around the expectation that modules are singletons.

Modules being singletons is not a problem in itself, I think? This could work like having two versions of the same library in two modules named library_1_23 and library_1_24. In my program I could hypothetically have imports like `import library_1_23 as library` in one file, and `import library_1_24 as library` in another file. Both versions would be singletons. Then writing `import library==1.23` could work as syntactic sugar for `import library_1_23 as library`.

Of course, having two different versions of a library running in the same program could be a nightmare, so all of that may not be a good idea at all, but maybe not because of module singletons.
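Mechanically, that sketch already works today if you register the versioned modules yourself. Here is a toy demonstration (the `library_1_23`/`library_1_24` names and the `VERSION` attribute are invented for the example) that fakes the two versioned modules via `sys.modules` instead of real installed packages:

    import sys
    import types

    # Fake two installed versions of the same library, each a distinct
    # singleton module under a version-mangled name.
    for ver in ("1_23", "1_24"):
        mod = types.ModuleType(f"library_{ver}")
        mod.VERSION = ver.replace("_", ".")
        sys.modules[mod.__name__] = mod

    # What file A would do:
    import library_1_23 as library
    print(library.VERSION)  # 1.23

    # What file B would do:
    import library_1_24 as library
    print(library.VERSION)  # 1.24

Each versioned module stays a singleton; the alias just controls which one a given file sees, which is exactly the proposed desugaring.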

I know I'm missing something but wouldn't it be possible to just throw an import error when that happens? Would it even break anything? If I try:

    import numpy==2.1

And let's say numpy didn't expose a version number in a standard (which could be agreed upon in a PEP) field, then it would just throw an import exception. It wouldn't break any old code. And only packages with that explicit field would support the pinned version import.

And it wouldn't involve trying to extract and parse versions from older packages with some super spotty heuristics.

It would make new code impossible to use with older versions of Python and older packages, but that's already the case.

Maybe the issue is with module namespacing?
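The "just throw an import error" idea above can be sketched as a helper function rather than new syntax. Everything here is invented for illustration: `import_pinned`, the `__pinned_version__` field (standing in for whatever field a PEP would standardize), and the `fakelib` stand-in module.

    import importlib
    import sys
    import types

    def import_pinned(name, required):
        """Import name, but fail unless it declares the exact pinned version."""
        mod = importlib.import_module(name)
        found = getattr(mod, "__pinned_version__", None)
        if found is None:
            # Old packages without the field simply don't support pinning.
            raise ImportError(f"{name} declares no __pinned_version__")
        if found != required:
            raise ImportError(f"{name} is version {found}, need {required}")
        return mod

    # Demo with a synthetic module standing in for numpy:
    fake = types.ModuleType("fakelib")
    fake.__pinned_version__ = "2.1"
    sys.modules["fakelib"] = fake

    lib = import_pinned("fakelib", "2.1")   # succeeds
    try:
        import_pinned("fakelib", "2.2")
    except ImportError as exc:
        print(exc)  # fakelib is version 2.1, need 2.2

As the comment says, old code never hits the check, and packages without the field just raise on pinned import.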

> And let's say numpy didn't expose a version number in a standard (which could be agreed upon in a PEP) field, then it would just throw an import exception. It wouldn't break any old code. And only packages with that explicit field would support the pinned version import.

Yes, this part actually is as simple as you imagine. But that means, in practical terms, that you can't use the feature until at best the next release of Numpy. If you want to say, for example, that you need at least version 2 (breaking changes, after all), well, there are already 18 published releases that meet that requirement but are unable to communicate that in the new syntax. This can, to my understanding, be fixed with post releases, but it's a serious burden for maintainers and most projects are just not going to do that sort of thing (it bloats PyPI, too).

And more importantly, that's only one of many problems that need to be solved. And by far the simplest of them.

If versioned imports were added to the language, versioned library support would obviously have to become part of the language as well.

However, it isn't trivial. The first problem that comes to mind:

Module a first imports somelib>=1.2.0 and then imports b, and b in turn requires somelib>1.2.1. If both versions are available, will it be the same module, or will I have a mess from combining them?
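In a toy model (pure Python, invented helper names, none of the real PEP 440 subtleties like pre-releases), the question is whether the two constraint sets intersect on what's installed:

    import operator

    # Map constraint operators to comparisons on parsed version tuples.
    OPS = {">=": operator.ge, ">": operator.gt, "==": operator.eq}

    def parse(v):
        return tuple(int(part) for part in v.split("."))

    def compatible(version, constraints):
        return all(OPS[op](parse(version), parse(bound)) for op, bound in constraints)

    installed = ["1.2.0", "1.2.1", "1.2.5"]
    constraints = [(">=", "1.2.0"),   # module a's requirement
                   (">", "1.2.1")]    # module b's requirement
    winners = [v for v in installed if compatible(v, constraints)]
    print(winners)  # ['1.2.5'] is the only version satisfying both

An installer can compute this intersection up front; the hard part raised in the thread is that the two `import` statements execute at different moments of the program's runtime, when the first choice may already be loaded and in use.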

You could absolutely make this part of the language. The question then becomes how to implement it in a reasonable way. I think every package should have a __version__ attribute you can inspect; then you could have versioned imports.

In fact, many packages already define __version__ at the package level.

https://packaging.python.org/en/latest/discussions/versionin...

Edit: What uv solves happens at the moment of standing up an environment, but you're more concerned about code-level protection, whereas they're more concerned about environment-setup protection for versioning.
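For what it's worth, the standard way to introspect an installed version today is `importlib.metadata` (stdlib since Python 3.8), which reads distribution metadata instead of relying on an ad-hoc `__version__` attribute. A minimal sketch, using pip only as an example of a distribution that may or may not be installed:

    from importlib.metadata import version, PackageNotFoundError

    # version() takes the distribution name (as on PyPI), not the import name.
    try:
        print(version("pip"))
    except PackageNotFoundError:
        print("pip is not installed in this environment")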

> In fact, many packages already define __version__ at the package level.

This only helps for those that do, and it hasn't been any kind of standard the entire time. But more importantly, that helps only the tiniest possible bit with resolving the "import a specific version" syntax. All it solves is letting the file-based import system know whether it found the right folder for the requested (or worse: "a compatible") version of the importable package. It doesn't solve finding the right one if this one is wrong; it doesn't determine how the different versions of the same package are positioned relative to each other in the environment (so that "finding the right one" can work properly); it doesn't solve provisioning the right version. And most importantly, it doesn't solve what happens when there are multiple requests for different versions of the same module at runtime, which incidentally could happen arbitrarily far apart in time, and also the semantics of the code may depend on the same object being used to represent the module in both places.

> It's been explained many times before why this is not possible: the library doesn't actually have a version number.

That sounds like it is absolutely fixable to me, but more of a matter of not having the will to fix it based on some kind of traditionalism. I've used python, a lot. But it is stuff like this that is just maddeningly broken for no good reason at all that has turned me away from it. So as long as I have any alternative I will avoid python because I've seen way too many accidents on account of stuff like this and many lost nights of debugging only to find out that an easily avoidable issue became - once again - the source of much headscratching.

> a matter of not having the will to fix it based on some kind of traditionalism

Do you know what happens when Python does summon the will to fix obviously broken things? The Python 2->3 migration happens. (Perl 6 didn't manage any better, either.) Now "Python 3 is the brand" and the idea of version 4 can only ever be entertained as a joke.

Yes, good point. Compared to how the Erlang community has handled decades of change Python does not exactly deserve the beauty prize. The lack of forethought - not to be confused with a lot of hot air - on some of these decisions is impressive. I think that the ability to track developments in near realtime is in conflict with that though. If you want your language to be everything to everybody then there will be some broken bones along the way.

> It's been explained many times before why this is not possible: the library doesn't actually have a version number.

Not possible? Come on.

Almost everyone already uses one of a small handful of conventional ways to specify it, e.g. the `__version__` attribute. It's long overdue that this be standardized so library versions can reliably be introspected at runtime.

Allowing multiple versions to be installed side-by-side and imported explicitly would be a massive improvement.

I believe the charitable interpretation is that it is not possible without breaking an enormous amount of legacy code. Which does feel close enough to “not possible”.

Some situations could be improved by allowing multiple library versions, but this would introduce new headaches elsewhere. I certainly do not want my program to have N copies of numpy, PyTorch, etc. because some intermediate library claims to have a just-so dependency tree.

What do you do today to resolve a dependency conflict when an intermediate library has a just-so dependency tree?

The charitable interpretation of this proposed feature is that it would handle this case exactly as well as the current situation, if the situation isn't improved by the feature.

This feature says nothing about the automatic installation of libraries.

This feature is absolutely not about supporting multiple simultaneous versions of a library at runtime.

In the situation you describe, there would have to be a dependency resolution, just like there is when installing the deps for a program today. It would be good enough for me if "first import wins".

> What do you do today to resolve a dependency conflict when an intermediate library has a just-so dependency tree?

When an installer resolves dependency conflicts, the project code isn't running. The installer is free to discover new constraints on the fly, and to backtrack. It is in effect all being done "statically", in the sense of being ahead of the time that any other system cares about it being complete and correct.

Python `import` statements on the other hand execute during the program's runtime, at arbitrary separation, with other code intervening.

> This feature says nothing about the automatic installation of libraries.

It doesn't have to. The runtime problems still occur.

I guess I'll have to reproduce the basic problem description from memory again. If you have modules A and B in your project that require conflicting versions of C, you need a way to load both at runtime. But the standard import mechanism already hard-codes the assumptions that i) imports are cached in a key-value store; ii) the values are singletons and client code absolutely may rely on this for correctness; iii) "C" is enough information for lookup. And the ecosystem is further built around the assumption that iv) this system is documented and stable and can be interacted with in many clever ways for metaprogramming. Changing any of this would be incredibly disruptive.
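Assumptions i) and ii) are easy to observe from any Python session (the `MARKER` attribute below is invented for the demonstration; `json` is just a convenient stdlib module):

    import sys
    import json
    import json as json_again

    # i) imports are cached by bare name in sys.modules;
    # ii) every import of that name yields the same singleton object.
    assert json is json_again            # same object, not a copy
    assert sys.modules["json"] is json   # the cache key is just "json"

    # Client code relies on the singleton property: patching the module in
    # one place is visible everywhere. Versioned imports would have to
    # either preserve this behaviour or break it.
    json.MARKER = "patched here"
    import json as another_view
    print(another_view.MARKER)  # patched here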

> This feature is absolutely not about supporting multiple simultaneous versions of a library at runtime.

You say that, but you aren't the one who proposed it. And https://www.hackerneue.com/item?id=45467350 says explicitly:

> and support having multiple simultaneous versions of any Python library installed.

Which would really be the only reason for the feature. For the cases where a single version of the third-party code satisfies the entire codebase, the existing packaging mechanisms all work fine. (Plus they properly distinguish between import names and distribution names.)

You could do that with uv.

  # /// script
  # dependencies = [
  #   "requests<3",
  #   "rich",
  # ]
  # ///
  
  import requests
  from rich.pretty import pprint
  
  resp = requests.get("https://peps.python.org/api/peps.json")
  data = resp.json()
  pprint([(k, v["title"]) for k, v in data.items()][:10])
How would version imports be handled across the codebase? Also, what do you gain with those over PEP 723 – Inline script metadata? https://packaging.python.org/en/latest/specifications/inline...

Oof. This feature request has nothing to do with lazy imports. It's also solved far more cleanly with inline script metadata.

Really, what is the headache with virtual environments? They've been solved. Use uv or Python's built-in venv creator and you're good to go.

uv venv --seed --python=3.12 && source .venv/bin/activate && pip3 install requests && …

That's messy.

I should be able to do "python foo.py" and everything should just work. foo.py should define what it wants and python should fetch it and provide it to foo. I should be able to do "pyc foo.py; ./foo" and everything should just work, dependencies balled up and statically included like Rust or Go. Even NodeJS can turn an entire project into one file to execute. That's what a modern language should look and work like.

The moment I see "--this --that" just to run the default version of something you've lost me. This is 2025.

You mean this?

    #!/usr/bin/env -S uv run --script
    #
    # /// script
    # requires-python = ">=3.12"
    # dependencies = ["httpx"]
    # ///

    import httpx

    print(httpx.get("https://example.com"))

https://docs.astral.sh/uv/guides/scripts/#improving-reproduc...

There are also projects, py2exe and PyInstaller iirc, and others that try to get the whole static-binary thing going.

You're trying, imo, to make Python into Golang, and if you want to do that, just use Golang. That seems like a far better use of your time.

The mess has ended thanks to uv.

NO! I don't want my source code filled with this crap.

I don't want to lose multiple hours debugging why something went wrong because I am using three versions of numpy and seven of torch at the same time and there was a mixup.

uv is good.
