https://pep-previews--4622.org.readthedocs.build/pep-0810/#f...
> Q: Why not use importlib.util.LazyLoader instead?
> A: LazyLoader has significant limitations:
> - Requires verbose setup code for each lazy import.
> - Has ongoing performance overhead on every attribute access.
> - Doesn’t work well with from ... import statements.
> - Less clear and standard than dedicated syntax.
> Has ongoing performance overhead on every attribute access.
I would have expected so, but in my testing LazyLoader seems to replace the lazy proxy with the real module on first attribute access, so there's no ongoing overhead after that. I haven't properly dug into it, though. It appears this point has been removed in the live version (https://peps.python.org/pep-0810).
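For reference, here's the self-replacing behavior demonstrated with the recipe from the importlib docs (the `lazy_import` helper name is mine, not stdlib):

```python
import importlib.util
import sys

def lazy_import(name):
    """Lazily import a module, per the recipe in the importlib docs."""
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)  # does NOT run the module body yet
    return module

mod = lazy_import("difflib")
print(type(mod).__name__)  # a lazy proxy class, not a plain module
mod.SequenceMatcher        # first attribute access runs the real import
print(type(mod).__name__)  # plain module from here on; no further overhead
```

So the per-access cost is a one-time thing: the proxy swaps its own class back to an ordinary module once loading happens.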
> Doesn’t work well with from ... import statements.
Hmm. The PEP doesn't seem to explain how reification works in this case. Per the above, it's a solved problem for plain module imports; I guess for from-imports it could be made to work essentially the same way. Presumably that involves the proxy holding a reference to the namespace where the import occurred, which probably has a lot to do with restricting the syntax to the top level. (Which is the opposite of how we've seen soft keywords used before!)
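To illustrate the LazyLoader side of this limitation: a `from ... import` compiles to an attribute lookup on the module object, which forces the real import on the spot, so the laziness is lost immediately (sketch using the documented recipe; `lazy_import` is my own helper name):

```python
import importlib.util
import sys

def lazy_import(name):
    # Same LazyLoader recipe as in the importlib docs.
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)
    return module

lazy_import("colorsys")
# The from-import does a getattr on the proxy, which triggers the
# full load right here -- exactly what laziness was meant to avoid.
from colorsys import rgb_to_hsv
print(type(sys.modules["colorsys"]).__name__)  # already a plain module
```

PEP 810's dedicated syntax can defer this because the compiler knows about the binding, whereas LazyLoader only sees the attribute access.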
> Requires verbose setup code for each lazy import.
> Less clear and standard than dedicated syntax.
If you want to use it in a fine-grained way, then sure.
Note that this approach is global to the entire process: if you make an import of NumPy lazy this way, then the imports of all its sub-modules become lazy too. That means large parts of NumPy might never be imported if they aren't needed, but the pauses for importing individual modules may be scattered unpredictably across the program's runtime.
Edit: from further experimentation, it appears that if the source does something like `import foo.bar.baz`, then `foo` and `foo.bar` are still eagerly loaded, and only `foo.bar.baz` itself is deferred. This might be part of what the PEP meant by "mostly". But it might also be possible to improve my implementation to fix that.
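For anyone curious, the process-wide version I'm describing can be sketched by rebuilding the default path hook with the source loader wrapped via `LazyLoader.factory` (the loader/suffix pairing is my assumption, modeled on how the default `FileFinder` hook is constructed; extension and bytecode-only modules stay eager):

```python
import importlib
import importlib.machinery as mach
import importlib.util
import sys

# Wrap only source-file loading lazily; C extensions and bytecode-only
# files keep their normal eager loaders.
lazy_source = importlib.util.LazyLoader.factory(mach.SourceFileLoader)
loader_details = [
    (mach.ExtensionFileLoader, mach.EXTENSION_SUFFIXES),
    (lazy_source, mach.SOURCE_SUFFIXES),
    (mach.SourcelessFileLoader, mach.BYTECODE_SUFFIXES),
]

# Install ahead of the default path hook and flush cached finders so the
# new hook also applies to directories that were already scanned.
sys.path_hooks.insert(0, mach.FileFinder.path_hook(*loader_details))
sys.path_importer_cache.clear()
importlib.invalidate_caches()
```

After this runs, every subsequent `import` of a not-yet-loaded pure-Python module on `sys.path` returns a lazy proxy, and its module body only executes on first attribute access, which is where the unpredictable pauses come from.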