Correct; NPM is not an "open source project" in the sense of a volunteer-first development model. Neither is Linux: over 80% of commits are corporate, and have been for a decade. Neither is Blender anymore: the Blender Development Fund, raking in $3M a year, calls the shots. Every successful "large" open source project has outgrown its volunteer community.
> Actually in large products these are incredible finds.
In large products, sure, these may be incredible finds; but breaking compatibility with even 0.1% of your customers is also an incredible disaster.
But NPM has no proof their dashboard won't light up with panicking corporate customers the moment this goes to production. To pick one stupid example of something that could go wrong: some customer's hardcoded integration downloads packages, decompresses them with an AWS Lambda, and ships them to an S3 bucket, and now it can no longer decompress fast enough alongside the other build steps to stay under the Lambda's hard timeout. Their IT department is demanding that NPM fix it rather than touch a build pipeline that would take weeks to re-validate, so corporate is begging NPM to have it fixed before Tuesday's marketing blitz.
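For concreteness, here's a minimal sketch of that kind of brittle pipeline: a Node.js Lambda handler that fetches a tarball, gunzips it in memory, and pushes the result to S3. Everything specific here (the bucket name, the package, doing the decompression inside the handler) is hypothetical, but the failure mode is the point: the whole thing has to finish inside the Lambda's fixed timeout, so any slowdown in decompression eats directly into that budget.

```typescript
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { gunzipSync } from "node:zlib";

const s3 = new S3Client({});

// Hypothetical mirror step: fetch a package tarball from the registry,
// decompress it, and push the raw tar to an internal S3 bucket.
export const handler = async (): Promise<void> => {
  const res = await fetch(
    "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz"
  );
  const gzipped = Buffer.from(await res.arrayBuffer());

  // If the registry changes how tarballs are compressed and this step
  // gets slower, the function starts tripping its timeout even though
  // not a line of this code changed.
  const tarball = gunzipSync(gzipped);

  await s3.send(
    new PutObjectCommand({
      Bucket: "internal-package-mirror", // hypothetical bucket name
      Key: "left-pad-1.3.0.tar",
      Body: tarball,
    })
  );
};
```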
Just because it's safe in a lab doesn't guarantee it's safe in production.
That's an argument against ever making any change to the packaging system. "It might break something somewhere" isn't an argument, it's paralysis. Improving the edge locality of npm package delivery could speed up npm installs. But speeding up npm installs might expose a latent race condition in some CI system that implicitly depends on install timing. Does that mean npm can't ever make installs faster either?
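To make that concrete, here's a hypothetical CI step with exactly that kind of latent race: it only "works" because `npm install` happens to take longer than a dev server's startup. Every name in it (dev-server.js, the health endpoint, the port) is made up for illustration; the point is that a faster install breaks the hidden ordering even though npm did nothing wrong.

```typescript
import { spawn } from "node:child_process";
import { once } from "node:events";

// Hypothetical CI step: boot a dev server and run the install in parallel.
async function ciStep(): Promise<void> {
  const server = spawn("node", ["dev-server.js"]); // made-up script
  const install = spawn("npm", ["install"], { stdio: "inherit" });

  // Latent race: nothing here actually waits for the server to be ready.
  // The step passes today only because the install takes longer than
  // server startup. Make installs faster and the fetch below starts failing.
  await once(install, "exit");

  const res = await fetch("http://localhost:3000/health"); // made-up endpoint
  console.log(`health check: ${res.status}`);
  server.kill();
}

ciStep().catch((err) => {
  console.error(err);
  process.exit(1);
});
```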
This attitude is how, in an age of gigabit fiber, 4 GB/s SSD write speeds, and eight 4 GHz cores with SIMD instructions, it still takes 30+ seconds to bundle a handful of JavaScript files.