Ask HN: Why Poetry did not become a mainstream package manager for Python?
80 points by rcshubhadeep on June 19, 2021 | 101 comments
I have recently been introduced to Poetry. I am a pretty experienced Python dev with both back-end and ML in my project basket. However, I have been using pip/requirements.txt/setup.py so far for every single project I have done. But once I tried Poetry, I do not think I will ever go back.

I was wondering: why is it not more popular? What is holding it back? What do you think?



Because venv/pip/requirements is good enough for most cases. Most developers are introduced to these tools early and learn them well enough that they work fine for them.

If there's no real problem with the existing tools, then people will not change their tested and trusted way of working for some fancy new tool, no matter what that new tool promises!

There's an old engineering proverb about that: if it works don't mess with it.


I've had more trouble with pip than any other package manager. Heck, one of my projects on GitHub currently has a broken CI pipeline because some newer version of pip broke deployments to pypi and I haven't investigated why yet.

And plus, it's just super confusing. It's never been clear to me what the difference between pip and pip3 is. I thought pip3 was for python3 packages, but it seems like pip can also be used for python3 as well. But then sometimes it seems like if I use pip I only get the python2 version.

With npm or go get, if I install a package I know exactly where it got installed and how to manually remove it if needed. If I install something with pip, the newly installed package goes into some mysterious directory whose location is never quite certain.

So yes, there are major problems with pip, hence why competitors are showing up like poetry.


> It's never been clear to me what the difference between pip and pip3 is. I thought pip3 was for python3 packages, but it seems like pip can also be used for python3 as well. But then sometimes it seems like if I use pip I only get the python2 version.

It's pretty simple: pip3 will always be for Python 3, pip2 will always be for Python 2. pip is context-dependent; for example, if you are in a Python 2 venv it will use Python 2, and vice versa.

> With npm or go get, if I install a package I know exactly where it got installed and how to manually remove it if needed. If I install something with pip, the newly installed package goes into some mysterious directory whose location is never quite certain.

If you pip install a library, it will almost certainly be installed in pythondir/lib/site-packages, unless you have messed around with your installation.
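One way to sidestep the pip-vs-pip3 confusion entirely (a general sketch, not specific to any distro) is to never call a bare pip at all and instead invoke it as a module of the interpreter you mean:

```shell
# Run pip as a module of an explicit interpreter, so there is no doubt
# which Python it installs into:
python3 -m pip --version                              # the pip bound to this python3
python3 -c 'import sys; print(sys.version_info[0])'   # prints 3

# Inside an activated venv, plain "python" and "pip" both resolve to the
# venv's own copies, so the ambiguity disappears there as well.
```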


Always use pipenv (although somebody else suggested pip-tools as better than pipenv) instead of pip for package installs. You can install pipenv using pip. It works /so much better/ than regular pip. Pipenv resolves the right versions of the dependencies needed for the package you are installing. It is astonishing how much better it works compared to pip.

Personally, I use pipenv (manages package dependencies and sub-dependencies), pyenv (python version management), and pyenv-virtualenv (creates an isolated version of Python such as 3.8) together, over poetry and others. I think this combination gives the most versatility. This even works well with difficult installs of python packages that need to be compiled into executable software, including stuff that has not been updated recently.

A good guide on how to do this is here (see table comparison against poetry at the bottom of the article): https://towardsdatascience.com/python-environment-101-1d68bd...


> It's never been clear to me what the difference between pip and pip3 is.

pip3 is the name of the system installation of the pip executable corresponding to the Python 3.x installation on Linux systems (maybe Mac, too), when installed from system package managers on systems where python3 is the name of the Python 3.x executable.

This is not really a pip issue, but a “how python is packaged to support side-by-side system installations” issue.


Think you might need to downgrade pip (using pip) to a version before the stricter dependency resolver became the default (that happened in pip 20.3).

pip install "pip<20.3"

To check which pip is for which python:

pip --version

Pyenv is pretty great for managing multiple python versions.


I think this is it. I still don’t know what benefits poetry adds over the typical venv + pip + requirements flow, because that flow has worked pretty well for me.

Is there something crucial I’m missing? I feel like package management has largely been solved in python, but I admittedly haven’t done a lot of research into what other workflows are out there.


If you think ‘pip works’ then the programs you are building are too simple. I worked at a place where we did machine learning builds and neither pip nor conda worked reliably. What was insidious was that it almost worked, and people thought they could live with its limitations, which turns every 2-hour job into a 15-hour job. (But they are data scientists… A reliable build is like Bigfoot, cold fusion or Linear A to them.)

Pip’s dependency solving strategy is not correct, especially if you install pip packages one at a time.

A real dependency solver would download the metadata for all matching versions (can be done with two seeks for a wheel, but you've got to run setup.py for an egg…), do an SMT solve, and then install the wheels.

Pip just starts installing packages, hopes for the best, sometimes backtracks, often gets stuck or does the wrong thing.


That's a fair point, most of the applications I've worked on have been relatively simple (or I've just gotten lucky). It makes sense that different issues crop up at different scales, given the complexity of different workflows and package dependencies. I also haven't worked in ML before, so that may be a class of complexity all on its own.


Two issues turn up in ML.

One is that the dependencies are a beast. The risk that it won't find a solution between a number of libraries that are only compatible with certain versions is high.

The other one is that ML projects themselves contain data, often large amounts of data.

For instance a "word2vec" style model might be 1 gigabyte and it might be a part of another model. (Say you turn the words to vectors then put the vectors through a CNN or RNN.) The Python packaging system might be a logically correct place to store this data (a necessary part of the model) but it's a big file that will cause hassles if you do everything right, big hassles if you do things wrong (like compress anaconda packages with slow bzip2, use whatever algorithm Docker uses to superamplify I/O, ...)

It is nice to pack all your "empty" models (ready to train) as python packages and maybe even your "trained models" (some data files to supplement the empty files) but it will take some iteration to make it all go smoothly.


How do you pin all of your dependencies to a specific version with just venv, pip and a requirements.txt? How do you upgrade them later?


Versions are specified within the requirements.txt, if that's what you mean. Upgrading can be done by installing a later version of the package and rewriting the requirements file.
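Concretely, that workflow looks something like this (package names are illustrative, and the install steps need network access):

```shell
# Install what you need, then freeze the exact versions (including
# transitive dependencies) into requirements.txt:
python3 -m venv venv
./venv/bin/pip install flask
./venv/bin/pip freeze > requirements.txt     # every line becomes name==version

# Reproduce the environment later:
./venv/bin/pip install -r requirements.txt

# Upgrade one package, then re-freeze to record the new state:
./venv/bin/pip install --upgrade flask
./venv/bin/pip freeze > requirements.txt
```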

I suspect my use cases haven't been as complicated as some of the others listed in this thread, which may be why I've never felt the need to look into poetry and others.


I know what you mean, but as soon as you have transitive dependencies it doesn't work at all. At that point you can't reproduce the same state of your venv at a later date because some minor version of a dependent package could change and maybe break your build.


Agreed. Similar to why Plan9 did not replace Unix, because Unix is good enough. It does not mean we can't do better, but the leap forward was (is) not big enough.

At least that's my understanding.


My theory on tech acceptance is that it's not enough to be an improvement.

To gain traction and acceptance, the offering must be several times better.

Marketing also counts.

Hence the lingering demise of python2.7: python3 didn't offer sufficient sizzle until ~3.6 or so.


Npm is also good enough for most cases, but that hasn't stopped yarn from getting traction. Maybe Python developers are just more conservative?


They are simply uninterested. Poetry and pipenv both give you the opportunity to separate dev dependencies, keep your venv folder separate from the project, add platform specific packages and easily update all package versions. Pipenv will even load the .env file. After reading all this, your python developer will go "well I still don't know what it does that I can't do with venv/pip".


Yarn got most of its popularity during a time when it was clearly ahead of npm in every way, but saw its share drop when npm caught up. IMO Poetry came a bit too late; it would have been a tremendous hit in, say, 2015, but as the timeline stands the leap forward is simply not enough. (Incidentally I think the same is happening with Yarn v2 as well; very good ideas, but too minor for most users to care.)


It is complicated I think.

For one, there is competition: there is Kenneth Reitz's pipenv and then of course conda's environment.yml. I think pipenv lost the popularity contest, but it somehow led to a situation where people wondered which horse to bet on.

Conda also takes a lot of market share esp. in the machine-learning deployments while also being not a 'perfect' tool.

Pretty sure we will see more poetry in the coming years.

Last but not least, Poetry sometimes also feels like a weird tool, for example when it's dumping colourful stack traces on you in case of (invalid) inputs, etc.

If pip-compile would support dev and production requirements it would still be my way to go I think.


I found pipenv to be far too opinionated (e.g: explicitly not supporting sections other than `dev`, because you're doing it wrong if there are any other sections).

I also found its insistence on doing certain things reduced my flexibility. We thought it was kind of dead for a while when there were no releases for about a year.

That said, I've generally found a rise in opinionated tooling (see: black) and that frustrated me for a while. Then I realised I can just ignore it all and still be productive with my old stack if I know what I'm doing (which I do), while people still blurt their opinions elsewhere.


> I also found its insistence on doing certain things reduced my flexibility. We thought it was kind of dead for a while when there were no releases for about a year.

It certainly was dead in that time


> If pip-compile would support dev and production requirements it would still be my way to go I think.

Can you elaborate on this? I use pip-compile development.in and pip-compile production.in and both use -r base.in for shared dependencies


what I did was

pip-compile production.in

then use the pinned production requirements and a dev.in to produce the dev environment.

Key point being that the dev environment needs to have the very same pinned versions for all packages that are part of the prod environment.
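For reference, pip-tools supports that layering directly: the dev input file can constrain itself to the compiled production pins with -c. File names here are illustrative, and the pip-compile steps assume pip-tools is installed:

```shell
# production.in: the app's direct dependencies
cat > production.in <<'EOF'
flask
EOF

# dev.in: hold shared packages to the pinned prod versions, add dev tools
cat > dev.in <<'EOF'
-c production.txt
pytest
EOF

pip-compile production.in   # writes production.txt with exact pins
pip-compile dev.in          # writes dev.txt; packages shared with prod
                            # resolve to the same pins as production.txt
```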


> If pip-compile would support dev and production requirements it would still be my way to go I think.

It supports as many separate requirements files as you want, if that's what you mean.


It can support what I want but then I am writing shell scripts...


Poetry does need help. I created a thread here: https://github.com/python-poetry/poetry/issues/4160

My primary concern is poetry's development stagnating due to natural causes while many projects rely on it. Switching costs are high: moving from pipenv -> poetry took me days across all my projects.

They're looking for developers, but also help with managing issues on the tracker.

I also raised the issue of them being funded. I think one possible outcome is seeing if the PSF's Fiscal Sponsorees program (https://www.python.org/psf/fiscal-sponsorees/) would be a fit. Waiting to see what their maintainers say.


Poetry is a mainstream package manager. Every company I know uses it.

And more importantly, the concepts it brings will eventually become Python standards (like pyproject.toml or the lock file): https://snarky.ca/what-the-heck-is-pyproject-toml/


I think it was the other way around. PEP 518 was first, then came Poetry.


It’s more complicated than that. Poetry’s predecessor Poet started at about the same time as PEP 518 (independently), but it jumped head-first to pyproject.toml when the community started using that file. A lot of Python packaging is like that: the community is very organic and it’s rarely clear what caused what; similar ideas pop up around the same time and eventually converge.


The author of Poetry has an open Github sponsors account [0]. If you're using Poetry, or rely on it in a commercial context, I'd suggest investing in the reliability and sustainability of your build tools :)

In my startup, I started to make sure that we allocate a small monthly budget to distribute between our open source dependencies, Poetry being one of them. It's always shocking to find how little support these projects are getting - the author of Poetry currently has 11 sponsors (with the majority of those being individuals).

[0] https://github.com/sponsors/sdispater


I do think we're at a cusp where the poetry adoption rate may increase.

I avoided poetry for years because it doesn't have an option to skip the lockfile. We build images, so our dependencies are frozen anyway. (Now that we use dependabot, I'm willing to have a lockfile that's automated.)

I track Python packaging changes carefully, updating from procedural setup.py, declarative setup.cfg, versioning through setuptools-scm and now PEP517 and pyproject.toml.

But even now, Python ships with setuptools, pip and venv, and a system like poetry is not really a necessity.

But if you want convenient virtualenv management, editable installs _and_ PEP517, then you probably need poetry. I haven't found another system that can do all three.


FWIW, I placed my early bet on pipenv, but it had some significant bugs and development had stagnated (seems like it might be picking up again).

When we decided to move away, I looked at Poetry and pip-tools. Poetry seemed too different to me, don't remember specifics, but felt like too much would need to change. All I wanted was the ability to lock package versions.

Enter pip-tools. It was really simple, felt familiar, and just worked. So, that's what we chose and it's what we use to this day.


Not sure what you saw but Poetry works as simply as you described. Just `poetry init`, `poetry add foo`, `poetry shell` or `poetry run ...`, etc. I use it every day.


I'm surprised that pip-tools isn't more popular. For the stuff I do with Python (small CLI tools, casual webdev) it's completely adequate. I can pin down my dependencies, upgrade to a new pinned state and sync my venvs with the packages specified in the generated requirements.txt.

Maybe there are more elegant or modern ways of doing this, but for my personal projects I haven't found a reason to switch to anything else.

Btw, there is also micropipenv[0] now which can consume both poetry and pipenv lockfiles and convert them to pinned down requirements.txts as generated by pip-tools.

[0] https://pypi.org/project/micropipenv/


I think poetry is really gaining traction. Poetry solves, in an easy way, some problems that are cumbersome to replicate with pip (like guarding against supply chain attacks with lock files), and teams are starting to realize that basic use of pip and requirements.txt is no longer enough to manage dependencies. I am starting to see more and more projects adopting poetry, and the uptake looks like the uptake I saw in FastAPI. Side note: I tried using pipenv, but it was just too slow at resolving dependencies. Curious what other people’s experience is on this with pipenv and poetry.


Nothing is holding it back. It is making progress but Python is a big community and poetry is still new. A new way of managing dependencies does not take off overnight (and nor should it! go slow, and break nothing!).


I can speak for myself— venv is good enough for most things and conda works for everything else (pandas/numpy/tensorflow on Mac m1).

For production code I’m more worried about minimizing surprises than elegance, which means it’s really hard to replace something that works. Poetry only hit 1.0 in 12/2019, so maybe I’ll try it if the API remains stable for another few years. Contrast that with JS, where we’re now on webpack 5 I think and there’s really no long-term support available.


Because it’s unclear (to me at least) what problems these tools solve.

Pip apparently isn’t deterministic in its resolutions, but you can achieve determinism the same way Poetry does by creating a frozen requirements.txt if you absolutely must.

I’ve only seen dependency hell in badly designed projects, particularly ones where people use libraries to share code in an organisation (which frequently turns out poorly because maintaining libraries is harder than people think).

Poetry is also really slow. I’ve noticed on larger projects doing ‘poetry add’ takes nearly 10 minutes. This often happens when people use wildcard dependencies; for whatever reason the resolver looks through all possible version combinations, which can cause resolution to take exponentially longer than it should.


Lots of people have a process that works well-enough. There was confusion and bad experience with there being multiple options, e.g. pipenv (which came with its fair share of drama), making people wary.


Because `python3 -m venv venv && ./venv/bin/pip install <packages> && ./venv/bin/pip freeze > requirements.txt` is good enough.


...until you need to use a Conda library, or until a transitive dependency suddenly breaks everything

https://github.com/pypa/pip/issues/9204


Poetry won’t help you either with a transitive dependency conflict. The problem is that Python just doesn’t have the concept of nested dependencies. And it’s too much of an arse to change the module loading code to work with a Node style pattern.

The solution to that problem is probably to do `pip install --ignore-installed` and hope for the best, because without a fundamental change to how Python loads modules no fancy package manager will help you.


It does very explicitly help you with a transitive dependency conflict by outlining exactly what the conflict is and, if possible, resolving it by downgrading a dependency.


pip will show you what conflicts as well.

It's also not that hard to work out anyway.


Only recently, in part because of Poetry itself. And “it’s not hard” is too much - it’s a problem that shouldn’t exist. And doesn’t, with Poetry.

If you’re using pip with some ad-hoc pinning you’re just using a terrible homemade version of poetry and you’re leaving a lot on the table for no real reason.

It’s great. Use it. You won’t want to go back.


I guess Conda has its audience, but I would scoff at using anything absolutely requiring it.


Several years ago when I had to build a custom package manager/build tool to solve this use case, I wish I could've just scoffed :)


None of the packages I use under Linux require Conda.

Is this Jupyter-related, one wonders?


It wasn't Jupyter related, but I can't remember which specific packages it was. It was quite a few years ago


Venv state management adds complexity I personally don't care for.


For a start, Python’s package manager is pip. Poetry is the leading “pip but also more” tool, and has been for a while.


Says who? For a while it was easy_install, and it's not like pip was always around. It wasn't even distributed with Python until recently.


Recently? pip has been distributed with python for years (2.7.9 / 3.4)


I was using Python before then, I experienced the change. Which is why I have no problem imagining another change to another tool like Pipenv or Poetry if they get enough traction.


I never got pip installed with python on ubuntu


That’s because Ubuntu and Debian are (seemingly willfully) wrong and broken in how Python is packaged in their distro package managers. If you install Python from python.org, you get pip.

https://gist.github.com/tiran/2dec9e03c6f901814f6d1e8dad0952...


Can't speak for others, but in my own & work projects I'll use containers (not just for Python but other dependencies). That gives me the isolation I need (e.g. if I need to use a different Python version and/or set of libraries) and pip/requirements is adequate (maybe using pip-tools to keep the requirements.txt up to date). While you can of course set up Poetry in containers, it adds complexity and build time and doesn't give me much on top of what pip+containers provide.


Poetry's lock files are a great help in making sure you can reproducibly build your container.


Can you elaborate? What does docker+poetry get me that docker+pip+requirements.txt doesn’t?


Here's an example on why a lock file is still very important when using Docker:

Let's say it's June 19th and your project exists on GitHub and you have a requirements.txt file with only `Flask==2.0.1` in it.

Now let's fast forward to October 19th and you clone the project and run a docker-compose build. You'll get Flask 2.0.1, but you might get drastically newer versions of Werkzeug, Jinja2, itsdangerous and Click, because all of those dependencies are declared by Flask with the >= operator.

Suddenly you have a Docker image that's much different than what it was in June and the only thing that changed is time. This becomes a problem because now it could potentially mean running different versions in dev, CI and production.

This has bitten me in the past a few times with Werkzeug and also the Vine library with Celery where it pulled in newer versions that had backwards incompatible changes that broke my app. Things worked in the past but didn't work months in the future even when none of my top level dependencies changed.

A lock file fixes this issue and it's still necessary with Docker.

I've solved it in a slightly different way using pip directly by keeping top level deps in requirements.txt, freezing out a requirements-lock.txt file and referencing it with pip3 install using the -c flag. There's example repos at https://github.com/nickjj/docker-flask-example and https://github.com/nickjj/docker-django-example that demonstrate how it's done. It's not a 100% foolproof solution like Poetry, but it works well enough that I haven't had a single issue since I started using this pattern, and it mostly solves the problem.
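A minimal sketch of that pattern (file contents are illustrative; see the linked repos for the real version):

```shell
# requirements.txt holds only the top-level dependencies:
cat > requirements.txt <<'EOF'
flask==2.0.1
EOF

# requirements-lock.txt is a full freeze of a known-good venv
# (normally generated with: pip3 freeze > requirements-lock.txt):
cat > requirements-lock.txt <<'EOF'
flask==2.0.1
werkzeug==2.0.1
jinja2==3.0.1
itsdangerous==2.0.1
click==8.0.1
EOF

# Install the top-level deps while holding every transitive dependency
# to the exact versions recorded in the lock file:
pip3 install -r requirements.txt -c requirements-lock.txt
```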


Why is pip freeze not 100%?


> Why is pip freeze not 100%?

You mean not 100% fool proof?

It is, but with the strategy above a new lock file will get generated if the requirements.txt file is newer than the lock file, so if you change one dependency you might get newer, unexpected dependencies. This is just a limitation of how pip works without building a whole new layer of dependency tracking on top (which I guess is why Poetry and similar tools exist). Fortunately, in practice I'm happy with the pip solution because it's a few-line shell script and hasn't been a problem yet. The important part is having the lock file there for reproducible builds, today and in the future.


The problem poetry.lock solves that requirements.txt doesn't is that if one of your dependencies has an unrestrictive version in its own requirements, you can end up with unexpected upgrades in the dependencies of your dependencies. If you are only ever developing against a prebuilt docker container and that exact container is what gets shipped to prod, then you won't have this issue; but if you have a CI system anywhere along the line that rebuilds the container, you can still be bit by this.


Doesn’t pip freeze > requirements.txt capture exact versions of dependencies of dependencies?


> dependencies you use has an unrestrictive version in their requirements.txt

It's the other guy causing the ruckus.


I pip install foo, and foo depends on bar. I pip freeze > lock.txt. My lock file has foo v1 and bar v1, right? Later bar upgrades to v2. I then try to rebuild the container image from lock.txt. My pip freeze lock file will still keep me on foo v1 and bar v1, even though foo has unpinned dependency on bar and bar has new version.

Is pip freeze not solving this scenario? Or is poetry solving a different scenario?

Not trying to flame war, just not sure I’m grokking.


That's an oddly worded question. Poetry is a "mainstream" packaging, dependency and virtualenv management tool. What makes you think it's not popular?


Tangent: Poetry can have very slow dependency resolution. It doesn't cut to the core of the problem: PyPI doesn't enforce clearly specifying dependencies. Many packages do specify them, e.g. in wheel metadata, but Poetry ignores this and downloads each package to inspect it.

There are some ways around it. For example, build an independent cache of metadata to use instead of Pypi directly.

For discussion on the topic, see this GH issue: https://github.com/python-poetry/poetry/issues/2094#issuecom...

This is now fixed, but for most of Poetry's life it took some tricks to make it work with Python 3, at least on Windows, and the workaround wasn't documented; otherwise it would always assume Python 2. I can't remember the details, but I tried and quit Poetry several times during its early life for this reason.


Poetry strikes me as about 90% of what is needed.

I use it all the time because otherwise I’d have to write my own build system.

I am not sure if poetry’s approach to solving dependencies is really correct (Personally I like downloading the metadata for the projects, SMT solving, choosing a set of wheels and installing them together. It only works for wheels since you can’t extract the dependencies for eggs w/o running setup.py)

Pip is almost right if you install all deps at once but it can’t update wrong versions of old packages incompatible with new ones you add.

Practically poetry handles cases that pip doesn’t.

My beef(s) with poetry are:

   A. It is bad hygiene to copy files by default into your package; they should quit pulling wings off angels and just make a /src dir.
   B. Poetry itself gets corrupted and fails to update.  Delete and reinstall always gets me back on the road but I was anxious about it at first.


I use Pipenv now for applications that I’m not publishing to PyPI. I like the workflow better, and find it to be much less buggy. Pipenv is not really involved in package building, though.

For libraries and OpenSource that I’m building actual packages for, Poetry seems nicer to me than setup.py. I’m trying it out now with one, partly to adopt the newer pyproject.toml thing.

Packaging has always been kind of an ever changing mess in Python vs newer ecosystems, and I don’t see an end to that.

I’d just do what works best for you.

Also, the general opinionated aggressiveness of the maintainer/project is a little off-putting to me. It's a little weird in the community.

Normally projects don’t dog other projects on their page. Often they cite alternatives with maybe a statement on how the philosophies vary.

But that’s just my opinion and experience. I’m sure others differ.


I for one am happy with pip/requirements.txt and venvs, I guess if I wanted to publish packages I would look at a more comprehensive system, but for internal apps and scripts I don't see the benefit. I'd love to know why others prefer Poetry, though.


For me virtualenv+requirements is good enough. The only feature of Poetry that I miss is the ability to create wheels without a setup.py. But I don't create wheels when writing backend code. So I only use Poetry for working with packages.


But you still need a pyproject.toml, which is just a setup.py in a different form.

One gripe with pyproject.toml is that I can’t do editable installs (i.e. `pip install -e path/to/package/`) with it. Highly annoying when trying to patch packages.


You actually can, but it's not documented I guess. I remember finding it when I dug through the source code.

    [tool.poetry.dependencies]
    my-package = { "path" = "path/to/package", develop = true }

This way you can patch my-package and poetry will always install the latest state of it.


But I can’t do it from pip to another poetry package. I have to be using poetry for it.


Ah got it, I see what you mean now. In cases like this I had to build the wheel and install it which is not great indeed.


True. I guess I just prefer pyproject.toml, it is definitely more minimal for my use cases.


Poetry does create wheels without a setup.py. In fact, at my last job we got rid of all the setup.py files we had in favour of poetry.

It cannot replace all setup.py use cases, but for pure python packages it can replace it completely.


I mean I miss that feature of Poetry when working with pip+venv.


Give it time. It's easily the best one of the bunch, but python has tons of legacy.


I used it and disliked it. I can't exactly say why anymore, it was a while ago, but it just felt odd, and confusing.

I did not continue with it. Maybe the documentation was not written "to my tastes". I don't know how "mainstream" it is, but I suppose if its initial taste had been better, I'd have stuck with it. I suspect there are others in my position, and hence maybe it didn't take off as well as it could have?


Is it just me: I used pipenv before knowing poetry. Pipenv mostly fits the bill. Then, everyone was talking about poetry so I gave it a try, but the experience wasn’t that great. It was either doing what pipenv could do or running into issues I found hard to understand. So now, I just use pipenv. And I think pipenv development has just been restarted.

I’m a casual Python user btw.


I like to keep it simple. When I'm not working on a dockerized project I use virtualenvwrapper + pip and they do the job just fine.


Poetry has been usable for how long? Maybe two years? Python packaging has been a dumpster fire for that many decades. The developers who haven't left for languages with more sensible package management have been burned by the latest Python packaging fad many times. It is understandable that they would be willing to let others work out the bugs first.


I like to use it for applications, because of the lock files and easy virtualenvs / all in one experience.

However, I also maintain a number of packages with C extensions or other compiled components, and poetry has no documented way of building packages with binary components; it can only create pure-Python (none-any) wheels.


People think of a package manager as just a package downloader, and for that, yes, pip is more than enough.

The main problem for me is that pipenv and pip don't solve dependency version issues when you want to upgrade one or several packages. Poetry, on the other hand, has never had an issue.


I looked at it a bit, but not long enough to really understand how it's better than modern setuptools w/ a fully declarative setup.cfg. I'm not sure I have whatever problems it's solving.


It's pretty good, but you can get almost all the functionality with venv, make, and pip.

pip install -r requirements.dev, and reference requirements.txt inside it. It will install the dev packages and the main ones.
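i.e. something like this (file names as in the comment, package pins illustrative):

```shell
# requirements.txt: the main dependencies
cat > requirements.txt <<'EOF'
flask==2.0.1
EOF

# requirements.dev: pulls in the main file via -r, then adds dev-only tools
cat > requirements.dev <<'EOF'
-r requirements.txt
pytest==6.2.4
EOF

pip install -r requirements.dev   # installs dev packages and the main ones
```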


A tangent...

Glyph† pointed out that, in Python, as an OOP language, we should objectify all the things, e.g. like what Pathlib does for file/path manipulation. (Pathlib was inspired or extracted from Twisted, IIRC.)

A Pythonic dependency/package manager library would be soooo cooool...

https://en.wikipedia.org/wiki/Glyph_Lefkowitz https://glyph.twistedmatrix.com/


It has around 1k open issues on GitHub. If you want wider adoption, you can help with fixing them.


The Python world is small, and a lot depends on being in the right cliques and working for the right corporations, regardless of the quality (or even quantity!) of output.

Conferences are regulated with always the same speakers who have no incentives to change their slide decks. Advising people about the broken packaging system is also a good source of income.

As someone else has said here, it's an authoritarian club.


There's a bigger problem. Comes down to production issues and the efficiency of using something just because it's cool. I'm a grizzled setuptools guy, for sure. While I think poetry has many great ideas, it currently can't be used in a container-based microservices ecosystem with even one problematic dependency (without monkeypatching or forking crazy things like `requests` in some cases). Proprietary dependencies make it even more painful, but that can be fixed.

My current organization had adopted poetry before I arrived, and I eventually had to decide that we're reversing from poetry, back to pip and explicit usage of setuptools. It was too problematic, and was costing us in production DLL-hell bugs. The solver does not allow you to pin a constraint if it conflicts with the packaging metadata of your dependencies.

The stance of the poetry team is that any discrepancies should be fixed in the upstream dependency. Hey, I love purity and platonic solids too, and that's not a completely crazy design decision, but we're dealing with an NP-hard problem here. That stance requires the cooperation of every single upstream package maintainer, which, obviously, no one can actually operationalize. Here is a very old issue about this:

https://github.com/python-poetry/poetry/issues/697#issuecomm...

There is some movement, and acknowledgment that a tool is useless if you can't take it full-stack for your application. As of writing, the state of the art is to fork and patch things like fastapi and flask. That's nuts. I can make our stack work with the tested tools much more consistently, and there isn't much downside.

https://github.com/python-poetry/poetry/issues/697#issuecomm...

To me, that removes the value of using poetry. `poetry install <x>` is something you do very rarely. Propping up new python environments needs to happen very frequently. The convenience doesn't outweigh the operational costs.

tl;dr: I think poetry is really cool, and it looks like npm to a developer. It doesn't play nice with others, and that's by design. For production use, pip/setuptools solves more problems than people think it does. Sure, constraints.txt is a blunderbuss for a dependency resolver. However, it works. Especially if you are making more than one package for a real-world application. If you are maintaining a standalone library without complex dependencies, sure, go for it :)
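For anyone unfamiliar with the "blunderbuss": pip's constraints file pins versions without requiring anything, so it can override a dependency's metadata where Poetry's solver refuses to. A sketch, with illustrative pins:

```
# constraints.txt -- pins whatever of these happens to get installed,
# but installs nothing by itself; requirements.txt stays loose.
requests==2.25.1
urllib3==1.26.4
```

Applied with `pip install -r requirements.txt -c constraints.txt`.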


Mirrors my experience. Walled-garden "we handle everything for you" tools are good for simple use cases. Once you need to go outside the garden, even a little bit, it becomes more trouble than it's worth. For non-trivial projects, I prefer docker + pip.


Poetry was just too little, too late; containerization is how people are fixing their dependency issues now. I would obviously rather have a Dockerfile that really represents all my dependencies than a good package manager that blames my OS when it can't install something. It's the best horse at the car show.
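The docker + pip approach mentioned above boils down to something like this minimal sketch (image tag, paths, and entry point are illustrative):

```dockerfile
FROM python:3.9-slim
WORKDIR /app

# Copy only the pin file first so the install layer caches
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .
CMD ["python", "-m", "mypkg"]
```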


IMHO the Python steering council is too weak at taking action to improve the whole ecosystem.

They did mess up the Python 2/3 transition, and IMHO they just don't want to deal with such things anymore.

Deprecating stuff is necessary as well as it is necessary to provide an upgrade path (both documentation and tooling) and new stuff.


3.10 is due out this year.

I'd like to see a harmonization of packaging for 3.11.

Of course, the vibe is not strong enough for me to exit the peanut gallery and contribute, so maybe we should be thankful for what we have from the thousands of human-years already invested.


I really don't understand this viewpoint. When RSpec came out in the Ruby world, it did not need a steering committee to deprecate what people were using, nor provide an upgrade path, people flocked to it because it was better. This authoritarian cat-herding so popular in the Python world is just going to drive people away. The forced march to Tox has made me decide to start no more Python projects, for example.


Having trouble wording this in a way that doesn't sound belligerent, so please understand that it isn't meant that way, just a genuine question.

In what way were you being forced to use tox?


I have dozens of projects using pytest, generally run under pytest-runner via

    python3 setup.py -q pytest
All now report

    WARNING: Testing via this command is deprecated and will be removed in a future 
    version. Users looking for a generic test entry point independent of test runner 
    are encouraged to use tox.
I asked a question on SO about non-tox alternatives, and one of the tox devs replied:

    python setup.py provisions dependencies, however, the way
    it does it is considered deprecated and unsupported going
    ahead. Instead, users are encouraged to use tox or ensure
    their own dependencies. There's no way around that unless
    you're willing to fork setuptools, fix the provision issues 
    and use that.


The alternative is to just run

     python -m pytest
in an environment with all dependencies installed, or

    pytest
In an environment with the package itself installed.

Tox automates the process of creating such environments.

Just out of curiosity, what exactly is your problem with tox?
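For reference, the tox setup the deprecation warning is nudging people toward is only a few lines; this is a hedged sketch with an illustrative env list:

```ini
; tox.ini -- tox builds the virtualenv, installs the package and
; its deps, then runs the test command inside it.
[tox]
envlist = py39

[testenv]
deps = pytest
commands = pytest {posargs}
```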


Yeah, that's pretty frustrating. The churn in python dependency management has been immense.

The incurable optimist in me thinks that there is a set of best practices that are emerging across multiple languages, that there is a "right way to do it" that we're converging on. Poetry seems far closer to that correct approach than anything else I've tried in Python and I hope that means it will have some longevity.

The realist in me expects that some time in the next few years there will be another new hotness I'll have to decide whether to migrate all my stuff to.



