Perhaps there’s an alternative way to reduce the bundle size further: code splitting, or the shared-module bundling Webpack performs when multiple entry points rely on the same code (above a minimum size threshold), so that shared code isn’t shipped once per entry point (e.g. per use, platform, or page).
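As a sketch of that Webpack behavior, the `optimization.splitChunks` option controls when shared modules get extracted into a common chunk. The entry names and paths here are hypothetical, and the threshold shown is Webpack 5’s default:

```javascript
// webpack.config.js — minimal sketch; entry names and paths are hypothetical.
module.exports = {
  entry: {
    web: './src/web.js',   // e.g. one entry point per platform or page
    node: './src/node.js',
  },
  optimization: {
    splitChunks: {
      chunks: 'all',   // consider modules shared across all chunk types
      minSize: 20000,  // only split out modules above ~20 kB (Webpack 5 default)
    },
  },
};
```

With this, a module imported by both `web` and `node` (and larger than `minSize`) is emitted once as a shared chunk instead of being duplicated into each bundle.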
That said, the miracle of compression (and Yarn PnP not needing to uncompress) means you could have duplicate code and it won’t cost you much at all.
Another option would be to do the code splitting at the npm-package level, maintaining your own set of packages, but that could be considered an API-breaking change, I suppose, if the goal is to reduce how much is distributed by splitting out platforms.
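One established shape for platform-level splitting, used by tools like esbuild, is a small wrapper package whose `optionalDependencies` point at per-platform packages; each platform package declares `os` and `cpu` fields so npm only installs the one that matches. A sketch, with all package names hypothetical:

```json
{
  "name": "@myorg/tool",
  "version": "1.0.0",
  "optionalDependencies": {
    "@myorg/tool-linux-x64": "1.0.0",
    "@myorg/tool-darwin-arm64": "1.0.0"
  }
}
```

Each platform package then carries something like `"os": ["linux"], "cpu": ["x64"]` in its own package.json, so consumers download only their platform’s binary rather than all of them.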
You could also publish one layer of dependencies yourself, probably, with exact versions still pinned. But that would require maintaining CI build tooling to ensure each dependency is built with no further dependencies of its own, or republishing prebuilt, dependency-free binaries under your own namespace.
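npm has first-class support for the “republish with dependencies baked in” half of this: `bundledDependencies` lists packages that get packed into your tarball at publish time, so consumers receive them without a further resolution step. A hedged sketch, with the package name hypothetical:

```json
{
  "name": "@myorg/deps-layer",
  "version": "1.0.0",
  "dependencies": {
    "left-pad": "1.3.0"
  },
  "bundledDependencies": ["left-pad"]
}
```

Here the exact version is pinned in `dependencies`, and listing it in `bundledDependencies` means the published tarball ships that copy rather than leaving resolution to the installer.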
I am reminded of how, before tree-shaking became commonplace, it was routine to publish libraries as lots of tiny npm packages, sometimes one per function, and import just the functions you needed as their own packages. That was taking it too far, and ECMAScript Modules (ESM) have largely replaced the pattern. For tree-shaking to work well, though, your JS has to actually be modular, with no global state and no side effects at import time; that is why turning on Closure Compiler’s advanced mode, for example, breaks some JS code.
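What “modular, no import-time side effects” means in practice is a module like the following sketch (file name and functions are illustrative): named exports only, no top-level statements, nothing mutated when the file is loaded.

```javascript
// math-utils.mjs — a sketch of a tree-shakable module: named exports only,
// no top-level statements, no global state touched at import time.
// Pairing this with "sideEffects": false in package.json lets bundlers
// safely drop whole files whose exports are never used.

export function clamp(x, lo, hi) {
  // Pure function: output depends only on inputs.
  return Math.min(Math.max(x, lo), hi);
}

export function lerp(a, b, t) {
  // Linear interpolation between a and b.
  return a + (b - a) * t;
}
```

A consumer that imports only `clamp` lets the bundler drop `lerp` entirely; a single top-level `console.log` or global assignment would force the bundler to keep the whole module.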
But we’re deep in the build-optimization weeds now. The big advantage of pre-compiling libraries, though, is that you don’t have to tell others what build toolchain to use; instead you provide standard ES5, ES6, or whatnot, already compiled and ready for use (or, perhaps, ready for further tree-shaking).