That depends very much on what you're doing. For production I'd like to use the OS package manager to install my Python dependencies, and move the responsibility of patching to the OS.
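For example, on a Debian or Ubuntu box that might look like the line below; the package names are illustrative and vary by distro.

    # dependencies come from apt, so security patches arrive with OS updates
    sudo apt install python3-requests python3-yaml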
For workstations, absolutely, go with virtual environments; it's the only way to go. One concern I've seen from some in the machine learning space is that rebuilding a virtual environment, for example when a macOS upgrade replaces the Python version, takes hours. That can be solved with pyenv: the venv is then built from a pyenv-managed interpreter, so an OS upgrade no longer invalidates it, and you can have multiple versions of Python and be free of Anaconda. I primarily use pyenv to be sure that I have the same Python version that ships with the server OS on my laptop.
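In practice that looks something like this; 3.11.9 is just a stand-in for whatever your server OS ships.

    pyenv install 3.11.9    # build and install that specific interpreter
    pyenv local 3.11.9      # pin it for this project (writes .python-version)
    python -m venv .venv    # the venv is now built from the pinned interpreter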
I have been out of the loop with Python for a couple years at this point, but how could this get so bad?
If you are looking into a system you are unfamiliar with, where do you look first?... pip? pipx? Homebrew?... or is it in Anaconda? pyenv?... it must be in the OS package manager... apt? pacman?
Honestly, Maven and NPM look great compared to this mess.
NPM to me is a great example of a package manager that's worse than anything the Python community has come up with, but that's subjective, I think.
Python isn't as bad as people make it out to be. There are some issues you will run into if your project/code base becomes really large, but that's not a problem for most people. The vast majority can just use python -m venv .venv to set up a virtual environment and install into that using pip.
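For reference, that whole baseline is three commands (assuming a requirements.txt and a POSIX shell; on Windows the activate script lives in .venv\Scripts):

    python -m venv .venv               # create the environment
    source .venv/bin/activate          # activate it for this shell
    pip install -r requirements.txt    # installs go into .venv, not the system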
Next step up is when you need specific versions of Python, so you switch to pyenv, which (together with its pyenv-virtualenv plugin) covers mostly the same ground as the built-in venv.
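With the plugin installed that looks like this (the names are illustrative):

    pyenv virtualenv 3.11.9 myproject    # create a venv from a pyenv-managed interpreter
    pyenv local myproject                # activates it whenever you're in this directory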
Then you have the special cases where you need to lock dependencies much harder than pip can, so you use poetry, or you want command-line tools in their own isolated environments, so you use pipx. Those are edge cases to be honest. That's not to say that they aren't solving very real problems, but mostly you don't need them.
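For the record, the two look roughly like this; requests and black are just example packages.

    poetry add requests    # resolves the full tree and pins it in poetry.lock
    pipx install black     # installs the CLI into its own isolated venv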
It would be great if there was one tool that could do it all: lock the Python version, resolve dependencies quickly, and lock them down.
So far Python has opted to split the problem: tools for locking the Python version, tools for creating separate environments, and tools for managing packages. There's a lot of crossover in the first two, but you can pretty much mix and match virtual environment tools and package managers any way you like. I think that's pretty unique.
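For instance, nothing stops you from letting pyenv pick the interpreter while poetry manages the packages. A sketch, assuming both are installed:

    pyenv local 3.11.9                      # pyenv chooses the interpreter
    poetry env use "$(pyenv which python)"  # poetry builds its venv from it
    poetry install                          # poetry handles the dependencies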
It's definitely unique, but the "out of the box" experience is suffering for it in my opinion.
I actually recently had to fix something small in a Python project: pip refused to work because of Homebrew, Homebrew didn't have the dependencies and directed me to pipx, and pipx finally worked. It was a strange experience.
And for the record, NPM mostly has a bad reputation because of its past... nowadays it's perfectly usable and can lock dependencies and the node version "out of the box".
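Concretely: package-lock.json pins the whole dependency tree, and the engines field in package.json declares the node version (npm only hard-enforces that when engine-strict is set).

    npm install    # writes/updates package-lock.json next to package.json
    npm ci         # clean, reproducible install strictly from the lockfile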
> It's definitely unique, but the "out of the box" experience is suffering for it
The tooling that ships with Python could be much better, I'd agree with that. You can go pretty far with venv and pip, but python3 -m venv .venv isn't exactly a great example of user-friendliness at work.
What's your problem with NPM? It just works, it's easy to package for, it handles multiple versions of the same dependency elegantly, and with pnpm it doesn't trash your disk and is really fast.
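The pnpm part is easy to demonstrate: it hard-links packages out of a single content-addressable store instead of copying them into every node_modules.

    npm install -g pnpm    # one way to get pnpm
    pnpm install           # links deps from the shared store, writes pnpm-lock.yaml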
(P)NPM with CJS is the best package/dependency management there currently is.