
Correct; NPM is not an "open source project" in the sense of a volunteer-first development model. Neither is Linux - over 80% of commits are corporate, and have been for a decade. Neither is Blender anymore - the Blender Development Fund raking in $3M a year calls the shots. Every successful "large" open source project has outgrown the volunteer community.

> Actually in large products these are incredible finds.

In large products they may indeed be incredible finds; but breaking compatibility for even 0.1% of your customers is also an incredible disaster.



> breaking compatibility with just 0.1%

Yes. But in this story nothing like that happened.


But NPM has no proof their dashboard won't light up with corporate customers panicking the moment this goes to production. Maybe some customer's hardcoded integration, where AWS downloads packages, decompresses them with a Lambda, and pushes them to an S3 bucket, can no longer decompress fast enough alongside the other build steps to stay under mandatory timeouts; that's just one stupid example of something that could go wrong. Their IT department then demands that NPM fix it rather than touch a build pipeline that would take weeks to revalidate, and corporate is begging NPM for a fix before Tuesday's marketing blitz.
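(Purely as a sketch of the hypothetical integration above, not anything NPM or AWS actually ships: a Lambda that pulls a tarball from the registry, gunzips it, and drops it in an internal bucket. Every name here is made up; the point is only that the whole job has to finish inside the Lambda's timeout, so anything that slows one step eats into that margin.)

    import { gunzipSync } from "node:zlib";
    import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

    const s3 = new S3Client({});

    // Hypothetical handler: fetch an npm tarball, decompress it, and mirror
    // the raw tar to S3. Bucket name and event shape are placeholders.
    export async function handler(event: { pkg: string; version: string }) {
      const url = `https://registry.npmjs.org/${event.pkg}/-/${event.pkg}-${event.version}.tgz`;
      const res = await fetch(url);
      if (!res.ok) throw new Error(`download failed: ${res.status}`);
      const compressed = Buffer.from(await res.arrayBuffer());

      // The decompression step the comment worries about: if this gets
      // slower, download + gunzip + upload may no longer fit the timeout.
      const tarball = gunzipSync(compressed);

      await s3.send(new PutObjectCommand({
        Bucket: "internal-package-mirror", // hypothetical bucket name
        Key: `${event.pkg}/${event.version}.tar`,
        Body: tarball,
      }));
    }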

Just because it's safe in a lab doesn't guarantee it's safe in production.


That’s an argument against making any change to the packaging system ever. “It might break something somewhere” isn’t an argument; it’s paralysis in the face of change. Improving the edge locality of npm package delivery could speed up npm installs. But speeding up npm installs might expose a race condition in some CI system that implicitly depends on the current timing. Does that mean npm can’t ever make installs faster either?


It is an argument. An age old argument:

"If it ain't broke, don't fix it."


This attitude is how, in an age of gigabit fiber, 4 GB/s drive write speeds, and 8x4 GHz cores with SIMD instructions, it takes 30+ seconds to bundle a handful of JavaScript files.


Disable PRs if this is your policy.


Ok, but why is the burden on him to show that? Are they not interested in improving bandwidth and speed for their users?

The conclusion of this line of reasoning is to never make any change.

If contributions are not welcome, don’t pretend they are and waste my time.

> can no longer decompress fast enough

Already discussed this in another thread. It’s not an issue.



