Hacker News

Why are small packages bad? Independent functions should be versioned and distributed independently. Otherwise we get several utility packages which are nothing but collections of independent functions.

Widely used packages should form the basis of a standard library that is distributed with the language itself.



Small packages mean more packages, and more packages mean a bigger dependency graph, which:

- makes dependency resolution harder

- means more points of failure

- makes auditing difficult

- makes install time longer

- makes maintenance and upgrades harder

This also causes heterogeneity in the mass of your dependencies:

- it splits the resources for documentation, testing, tutorials, etc.

- it ensures very weak integration between various building blocks, forcing everyone to rebuild glue code again and again

- it makes discovering the proper solution to your problem harder, as you must choose a combination of dependencies instead of one

- it makes contributing to the project harder, especially for junior developers, and increases the cost of onboarding

- eventually, it leads to a culture that pushes "libs versus frameworks" so far that you never get a decent framework for anything. This is why there is no JS equivalent to RoR or Django.

There is, of course, a balance to reach. You don't want a lib that does everything.


The package manager has also evolved to handle lots of dependencies.

In the article, 19000 packages are installed in 40 seconds... My Django project, with 100 times fewer dependencies, takes longer in CI.

npm comes with an audit tool (npm audit).

I have never seen dependency resolution fail (since packages can have private copies), unlike pypi or rubygems.

So some of the downsides to a large dependency tree are mitigated. I'll add one more downside:

- more chances for shenanigans like left-pad to cause issues.

Don't get me wrong, I think 19000 dependencies is fucking nuts. NINETEEN THOUSAND.


> 19000 dependencies

Doesn't this happen because of package manager duplication? I think npm lets packages have their own copies of their dependencies. Since there are lots of small, widely used packages, they get duplicated at numerous points in the dependency graph. The number of files and the installation size explode.
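That duplication is easy to model. A toy sketch (made-up package names and tree; npm's real node_modules layout and deduplication are more involved) of how one small package ends up copied at several points:

```javascript
// Toy model of npm's nested node_modules layout: each dependent can carry
// a private copy of a dependency, so a popular small package ("debug" here,
// in a hypothetical tree) appears many times on disk.
const tree = {
  name: 'app',
  deps: [
    { name: 'a', deps: [{ name: 'debug', deps: [] }] },
    { name: 'b', deps: [{ name: 'debug', deps: [] }] },
    { name: 'c', deps: [{ name: 'a', deps: [{ name: 'debug', deps: [] }] }] },
  ],
};

// Walk the tree and count how many copies of one logical package exist.
function countCopies(node, target) {
  const self = node.name === target ? 1 : 0;
  return self + node.deps.reduce((sum, d) => sum + countCopies(d, target), 0);
}

console.log(countCopies(tree, 'debug')); // prints 3: one logical package, three copies
```

Three dependents, three private copies; scale the tree up and a tiny utility package can easily appear dozens of times.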


A big part is the development toolchain, which would be installed at a system level for most languages.

I started a Vue project last night; npm install --production installs 4 packages. With dev dependencies, I get 2300 packages. ESLint, Babel, webpack, etc. bring in lots of baggage.

BTW, I think 19000 is wrong; on a fresh node_modules I get:

    $ npm install gatsby
    ...
    + gatsby@2.20.18
    added 1773 packages from 733 contributors and audited 23706 packages in 52.317s
    $ du -sh node_modules
    245M node_modules
Not sure where the "audited" number comes from, but it's not the number of installed packages. I get 2737 directories containing a package.json, 1477 of which are unique.

`debug` appears 32 times!


Every separate package adds overhead and another maintainer you've got to put your trust in. I'd rather have a few packages from well-trusted maintainers than a thousand packages from god knows who.

Personally, I've made it a habit not to add anything that requires trivial one-liner garbage packages, which amounts to not installing any dependency that uses anything from Jon Schlinkert (so no Webpack). Unfortunately I'm stuck with gulp right now, but with the next major rewrite I'll also get rid of gulp and its 300 dependencies.


> I'd rather have a few packages from well-trusted maintainers than a thousand packages from god knows who.

Then the problem is that anybody can publish a package, not small packages.

Linux distributions solve this problem by having dedicated maintainers. Users of a distribution trust its maintainers when they use it. Software developers want the complete opposite: language-specific packages, instant and unrestricted package publication, containers, etc. Nobody wants to have to talk to a maintainer in order to get their software included in a distribution. Of course the result ends up being a mess.

What if Node's maintainers decided to distribute a curated set of packages alongside Node itself? The size of each individual package wouldn't really matter.


> What if Node's maintainers decided to distribute a curated set of packages alongside Node itself?

First, front-end has nothing to do with Node. Secondly, let’s say there’s some sort of front-end foundation that decides to provide a curated set of packages. A natural candidate would be create-react-app, which pulls over a thousand dependencies. That’s just one package you want to include. How the hell do you even start to curate such a beast?


> First, front-end has nothing to do with Node.

It was just an example.

> A natural candidate would be create-react-app, which pulls over a thousand dependencies. That’s just one package you want to include. How the hell do you even start to curate such a beast?

I don't know. How did Linux distributions do it? Arch Linux has over 10 thousand packages. Debian has over 50 thousand packages. Maybe people should ask the maintainers.


Point to me one package in Debian that depends on thousands more.

Oh and I was a maintainer for a major package manager back in the day for crying out loud (was responsible for overseeing the general health of the project as well as directly responsible for maybe a couple dozen individual packages). Never seen this kind of madness.


I don't know about Debian but Arch Linux has huge package groups and metapackages.

https://www.archlinux.org/groups/x86_64/kde-applications/

https://www.archlinux.org/groups/x86_64/pro-audio/

The Arch Wiki recommends the use of these huge packages. I don't see any reason why it wouldn't scale to a thousand packages or more.


I'm talking about one actual package that largely runs in a single process (or at least a single process group) with thousands of deps, not a package group that's merely loosely related by category or by using the same framework (hell, we're talking about the framework itself here), where the failure of one no-name package doesn't affect anything else.


Groups are very different from dependencies. And a big meta-package is there for user convenience to get a bunch of stuff. It's almost never going to be depended on; if something needs a package or two from the group it will depend on that directly.


Have you considered using Rollup? I've personally found it delightful.


Already using Rollup as a replacement for Webpack, but it seems to lag behind from time to time. I couldn't use certain new JavaScript features right away since it would throw an error.


> Why are small packages bad?

Small, separately maintained packages are bad in an ecosystem where a final system can incorporate only one version of any given package, because they increase the number of opportunities for version conflicts.

Now, it's true that there are some other concerns which weigh in favor of packages being at the minimum useful size, but those necessarily also weigh in favor of a package management system which allows each package to isolate its upstream dependencies without constraining its downstream users. And not just allows, but facilitates it so that it is the norm, so that dependency conflicts aren't a thing.


I was under the impression that in the npm ecosystem, packages are reused if the version dependencies are compatible, and split if they are not.
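Roughly, yes. A minimal sketch of that rule under simplifying assumptions (caret ranges only, made-up version numbers; npm's actual resolver is far more elaborate):

```javascript
// Does `version` satisfy a caret range like "^1.2.3"?
// Simplified: same major, and at least the base minor/patch (majors > 0 only).
function satisfiesCaret(version, range) {
  const base = range.slice(1).split('.').map(Number);
  const v = version.split('.').map(Number);
  if (v[0] !== base[0]) return false;
  if (v[1] !== base[1]) return v[1] > base[1];
  return v[2] >= base[2];
}

// If one available version satisfies every requested range, all dependents
// share that copy; otherwise each dependent gets its own nested copy.
function resolve(requests, available) {
  const shared = available
    .filter(v => requests.every(r => satisfiesCaret(v, r)))
    .sort()  // lexicographic ordering is fine for these toy versions
    .pop();
  return shared ? { shared } : { split: requests };
}

console.log(resolve(['^1.2.0', '^1.4.0'], ['1.2.9', '1.5.0']));
// compatible ranges: one copy ('1.5.0') is shared
console.log(resolve(['^1.2.0', '^2.0.0'], ['1.5.0', '2.1.0']));
// incompatible majors: each dependent keeps its own nested copy
```

So small packages only get duplicated when their requested ranges can't be reconciled, which with many tiny, independently versioned packages still happens often.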



