Hacker News | bakkoting's comments

This hasn't been true since version 5.4.2, released in 2017.

`npm install` will always use the versions listed in package-lock.json unless your package.json has been edited to require versions that aren't satisfied by package-lock.json.

The only difference with `npm ci` is that `npm ci` fails if the two are out of sync (and it deletes `node_modules` first).


Very few packages published on npm include polyfills, especially packages you'd use when doing local scripting.

I'm sorry, but this is just incorrect. Have you ever heard of ljharb[0]? The NPM ecosystem is rife with polyfills[1]. I don't know how you can draw a distinction about which libraries would be used for "local scripting", as I don't think many library authors make that distinction.

[0] - TC39 member who is self-described as "obsessed with backwards compatibility": https://github.com/ljharb

[1] - Here's one of many articles describing the situation: https://marvinh.dev/blog/speeding-up-javascript-ecosystem-pa...


Yes. I'm on TC39 as well, and I've talked to Jordan about this topic.

It's true that there are a few people who publish packages on npm including polyfills, Jordan among them. But these are a very small fraction of all packages on npm, and none of the compromised packages were polyfills. Also, he cares about backwards compatibility _with old versions of node_; the fact that JavaScript was originally a web language, as the grandparent comment says, is completely irrelevant to the inclusion of those specific polyfills.

Polyfills are just completely irrelevant to this discussion.


Fair enough. Thank you for the clarification, and I apologize for not recognizing your status as a TC39 member.

If you look at the list of compromised packages, very few of them could reasonably be included in a standard library. It's mostly project-specific stuff like `@asyncapi/specs` or `@zapier/zapier-sdk`. The most popular generic one I see is `get-them-args`, a CLI argument parser - functionality Node has had built in as `util.parseArgs` since v16.17.0.

Well, it clearly lacked marketing. I'm pretty sure red text in npm every time that package was installed saying "hey, we have a better way to do this with node alone" would have made a dent in the library's usage, but they didn't do anything of the sort.

I don't think there's literally any conforming implementations of modern ECMAScript by that definition.


If you mean "latest" ECMAScript then that is true. Even the latest gcc or clang doesn't support all features from C++23: https://en.cppreference.com/w/cpp/compiler_support.html.


Google mostly does obey web standards that are set by an industry consortium (WHATWG, W3C, or, in the case of JavaScript, Ecma).

Chrome has the best compliance with standards of any of the big three (see wpt.fyi) - which is not surprising, because they also have the most engineering time dedicated to their browser, and the most people working on standards.

These bodies require buy in from multiple vendors, but generally not unanimity. That said, browsers can and do ship things which haven't been standardized (e.g. WebUSB, which is still only a draft because only Chrome wants to ship it). In a lot of cases this pretty much has to happen pre-standardization, because it is difficult to come up with a good standard from the ivory tower with no contact with actual use. Chrome is unusually good about working in public to develop specifications for such features even when other browsers aren't currently interested in shipping them.

I don't know what problem you think this proposal would solve.


> Chrome is unusually good about working in public to develop specifications for such features even when other browsers aren't currently interested in shipping them.

That is, if there's a promotion, or a company bet, or a need to establish/secure market dominance for one property or another, Chrome dumps a scribble on a napkin, barely engages in any conversation, and ships to production within a few weeks after dumping said scribbles.

Once it's out there, it couldn't care less what other browser vendors say. Dominant market share and an army of developers who never bothered to learn about standards processes will make sure that this is now a standard.


It doesn't require 2FA in general, but it does for people with publish rights for popular packages, which covers most or all of the recent security incidents.

https://github.blog/changelog/2022-11-01-high-impact-package...


> And how many dependencies does Hono have?

Zero.

I'm guessing you're looking at the `devDependencies` in its package.json, but those are only used by the people building the project, not by people merely consuming it.


That doesn't prevent supply chain attacks. Dev dependencies are still software dependencies and add a certain level of risk.


This is needlessly pedantic unless you are writing from an OS, browser, etc. that you wrote entirely by yourself, without using an editor or linter or compiler not written by you, in which case I tip my cap to you.


Only in the sense that any other software on the developers' machines adds a certain level of risk.


This is the approach taken by node's built-in argument parser util.parseArgs.


It also exposes a function which does type stripping (as `import { stripTypeScriptTypes } from 'node:module'`).

This lets you build simple web apps (i.e., those with no frontend dependencies) as pure TypeScript, including the frontend, by stripping the types out from your frontend scripts as you serve them: https://github.com/bakkot/buildless-ts-webapp


The 7.2% number is already adjusted for inflation. Historically the stock market has gotten about 10% nominal return, 6.5-7% real.
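The real return is the nominal return deflated by inflation (a ratio, not a simple subtraction); using the rough historical figures above:

```javascript
// Real return = (1 + nominal) / (1 + inflation) - 1.
// 10% nominal and 3% inflation are rough historical averages, used here for illustration.
const nominal = 0.10;
const inflation = 0.03;
const real = (1 + nominal) / (1 + inflation) - 1;
console.log((real * 100).toFixed(1) + '%'); // 6.8%, inside the 6.5-7% range
```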


Huh! I genuinely didn't know that and this makes me very happy.

