
Package curation exists in other languages, too. C++ has Boost, and Haskell has its Haskell Platform. It helps avoid the pitfalls of languages with large standard libraries (where stability guarantees make "batteries included" turn into obsolete and bitrotting "dead batteries").


This is an idea that every ecosystem eventually realises it needs. Once you've got enough versions of enough libraries that A and B both need C but at different versions, you start to need curation, although that need might not become pressing enough to do anything about for a while. But once you've got curation, cryptographically trusting that curation becomes viable in a way that cryptographic trust of the original packages often isn't.

Putting a layer of "distributions" over language ecosystems, in the same way that "distributions" solve the problem of putting enough mutually-compatible library versions together to get Linux to work, is, I think, inevitable.
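The diamond-dependency conflict described above can be sketched in a few lines. This is a toy simulation, not any real package manager: the package names, versions, and the `flat_resolve` helper are all invented for illustration. The point is that a resolver with one shared namespace has no valid answer once A and B disagree about C.

```python
# Hypothetical dependency graph: "app" needs A and B, which in turn
# need incompatible versions of C. All names/versions are made up.
deps = {
    "app": {"A": "1.0", "B": "1.0"},
    "A":   {"C": "1.0"},   # A was written against C 1.x
    "B":   {"C": "2.0"},   # B needs the breaking C 2.x API
}

def flat_resolve(root):
    """Pick exactly one version of each package (one shared namespace),
    the way most ecosystems' resolvers work."""
    chosen = {}
    stack = [root]
    while stack:
        pkg = stack.pop()
        for dep, version in deps.get(pkg, {}).items():
            if dep in chosen and chosen[dep] != version:
                raise RuntimeError(
                    f"conflict: {dep} needed at both {chosen[dep]} and {version}")
            chosen[dep] = version
            stack.append(dep)
    return chosen

try:
    flat_resolve("app")
except RuntimeError as e:
    print(e)  # conflict on C
```

Curation amounts to pre-selecting a set of versions for which `flat_resolve` is known to succeed across the whole collection.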


> Once you've got enough versions of enough libraries that A and B both need C but at different versions, you start to need curation

Node specifically doesn’t have this constraint, though, as A’s C and B’s C can be loaded independently into the same VM, each one hermetic and only visible to its parent. (This is probably half of why Node’s ecosystem became the way it did, now that I think about it; every other ecosystem hits increasing numbers of constraint-resolution conflict problems as dependency hierarchy depth increases, and so limits itself in hierarchy depth to avoid this.)
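Node's escape hatch can be sketched the same way. This is a toy model of npm's nested `node_modules` layout, not real npm code; the graph and the `nested_resolve` helper are invented for illustration. Because each package resolves its dependencies privately, C 1.0 and C 2.0 simply coexist in different subtrees.

```python
# Same hypothetical graph as a flat resolver would choke on:
# A and B both need C, at incompatible versions.
deps = {
    "app": {"A": "1.0", "B": "1.0"},
    "A":   {"C": "1.0"},
    "B":   {"C": "2.0"},
}

def nested_resolve(pkg):
    """Give each package its own private subtree of dependencies,
    roughly the way npm lays out nested node_modules directories."""
    return {dep: (ver, nested_resolve(dep))
            for dep, ver in deps.get(pkg, {}).items()}

tree = nested_resolve("app")
print(tree["A"])  # ('1.0', {'C': ('1.0', {})})
print(tree["B"])  # ('1.0', {'C': ('2.0', {})})
```

No conflict ever arises, because there is no single shared namespace to fight over — which is also why npm trees can get so deep.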


> Node specifically doesn’t have this constraint, though, as A’s C and B’s C can be loaded independently into the same VM, each one hermetic and only visible to its parent.

Rust can do this as well, though most version variation is handled via semver compatibility rules. I think this exact requirement led to some controversy in the Go community at some point, though Go's module system should now allow for it?


It's less of a problem, but it bites as soon as you need to pass a data structure from one C to the other. That's less frequent, but does happen.
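That failure mode is easy to demonstrate in miniature. The sketch below is a simulation, not any real module loader: `load_c_version` is a made-up helper that builds two hermetic copies of a library "C". Even though both copies define a byte-for-byte identical class, a value built by one copy is a foreign type to the other.

```python
import types

def load_c_version(version):
    """Stand-in for loading one hermetic copy of library C,
    the way nested dependency trees load duplicates side by side."""
    mod = types.ModuleType(f"C_{version}")
    exec(
        "class Point:\n"
        "    def __init__(self, x, y):\n"
        "        self.x, self.y = x, y\n",
        mod.__dict__,
    )
    return mod

c1 = load_c_version("1.0")  # A's private copy of C
c2 = load_c_version("2.0")  # B's private copy of C

p = c1.Point(1, 2)
print(isinstance(p, c1.Point))  # True
print(isinstance(p, c2.Point))  # False: same source, different class object
```

So the moment A hands one of its C objects to B, B's copy of C doesn't recognize it — the hermeticity that prevented the version conflict is exactly what breaks the handoff.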


Another aspect, I think, is how many open-source dependencies you actually need for your project. You can probably get by with fewer packages than you expect, which also makes a curated package system more feasible to run.

For example, you can't submit any old Haskell library you put together to Hackage — or at least you couldn't the last time I checked. It had to meet certain minimum standards to be considered.

I honestly don't know where you would start in terms of curating NPM, precisely because of the dependency on dependencies. You'd end up curating half of the ecosystem.


> For example, you can't submit any old Haskell library you put together to Hackage

You can, but you need to figure out what your library is doing and translate it to category theory.


Then you have languages like Go, where the norm is relying on the standard library, and any dependency really has to provide value to justify its inclusion.

My node projects typically have thousands of dependencies, my java and C++ projects have hundreds. My go projects typically have 8-12, and have never once exceeded 40.

Not to stan the Go language itself — I just wish this philosophy were the prevalent one in more languages.


Cool, I'm glad there's precedent for it. I'm not opposed to adding guardrails as the post suggests, but it feels like a band-aid on a much deeper problem.



