I wonder how long it will take for browsers to go beyond the cache concept and implement an integrated package repository, so I could upload my manifest plus my 3 KB app.js and tell the browser to download (and store) all the dependencies I need.
It will not only help with performance, but will also stop the absurd tooling madness that front-end has become.
How does that differ from the cache manifest (see link [1])? It's now being replaced by service workers, but storing the dependencies on first load is largely what it does.
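To make the comparison concrete, here's a minimal sketch of the cache-first strategy both mechanisms boil down to: serve from cache, fall back to the network once, store the result. The Cache API is simulated with a plain Map so the sketch is self-contained; in a real service worker you'd use `caches.open()` and `cache.match()` inside a `fetch` event handler, and `fetchFromNetwork` here is a hypothetical stand-in for a real request.

```javascript
const cache = new Map();
let networkRequests = 0;

// Hypothetical stand-in for an actual network fetch.
function fetchFromNetwork(url) {
  networkRequests += 1;
  return `response for ${url}`;
}

function cacheFirst(url) {
  if (cache.has(url)) return cache.get(url); // served from cache
  const response = fetchFromNetwork(url);    // first load: hit the network
  cache.set(url, response);                  // store for every later visit
  return response;
}
```

The key limitation the parent comment points out: this cache is scoped per site, so two sites requesting the same jquery.js each pay the network cost once.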
This still works at the single-website level. A common package manager would help every website that needs the same deps (at least within the same semver range), with the benefit of a download-once, available-to-everyone, truly immutable cache.
Edit: The most common example. Let's say you need jQuery. The browser downloads the package once and then it's ready and available for maybe millions of websites. Just think about the benefit of the saved bandwidth alone.
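A sketch of how such a shared, browser-wide cache could answer a request like "give me jquery ^3.4.0": pick the highest cached version satisfying the range, exactly as npm-style resolvers do. Only caret ranges are handled here, and the cache contents are made up; a real resolver would implement the full semver grammar.

```javascript
// Hypothetical browser-wide cache: package name -> cached versions.
const sharedCache = {
  jquery: ['2.2.4', '3.4.1', '3.6.0'],
};

function parse(v) {
  return v.split('.').map(Number);
}

// True if `version` satisfies `^base` (same major, >= base).
function satisfiesCaret(version, base) {
  const [vM, vm, vp] = parse(version);
  const [bM, bm, bp] = parse(base);
  if (vM !== bM) return false;
  if (vm !== bm) return vm > bm;
  return vp >= bp;
}

function resolve(pkg, caretRange) {
  const base = caretRange.replace(/^\^/, '');
  const matches = (sharedCache[pkg] || []).filter(v => satisfiesCaret(v, base));
  // Highest satisfying version wins.
  matches.sort((a, b) => {
    const [aM, am, ap] = parse(a);
    const [bM, bm, bp] = parse(b);
    return (aM - bM) || (am - bm) || (ap - bp);
  });
  return matches.pop() || null; // null => the browser has to download it
}
```

With the cache above, `resolve('jquery', '^3.4.0')` returns `'3.6.0'` without touching the network, no matter which website asked.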
I can't stop thinking about how wasteful it is to download the same assets again and again and again for every website you visit.
Yep, this is kind of an npm for browsers. But the sheer size of npm already shows how this is hardly feasible: http://www.modulecounts.com/ -- I expect the npm registry to be in the two-to-three-digit gigabyte range. That's quite large compared to the total disk cache of your browser (which also has to hold images, CSS, HTML, etc).
Of course you wouldn't need to download the whole repository as with npm, just the optimized distributable assets: in short, each package's /dist folder.
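CDNs like unpkg already do something close to this: they map a package name and version directly to a built file, so only the dist asset travels over the wire, never the whole package tree. A trivial sketch of that mapping, assuming unpkg's documented `https://unpkg.com/:package@:version/:file` URL scheme:

```javascript
// Build the URL of a single built asset, unpkg-style.
function distUrl(pkg, version, file) {
  return `https://unpkg.com/${pkg}@${version}/${file}`;
}
```

So `distUrl('jquery', '3.6.0', 'dist/jquery.min.js')` points at one minified file, not the repository.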
Since npm is now the predominant way of distributing packages, packages don't usually ship a /dist folder, so individual packages would again need to think about this. This was one of the reasons that Bower is not really used anymore.