Could we also consider just not connecting critical systems to the internet at large? No reason, for example, for the Jaguar assembly line to depend on an internet connection.
Yep, air-gapped security is a thing. But servicing, patching, and communicating with air-gapped devices still needs to happen, which means connecting to them somehow. Likely the manufacturing plant started connecting everything together to streamline the process and created an attack surface that way.
You can see the appeal of not needing to go through all the issues, complexity, and costs that entails.
This is like saying a Java library readme should start with what the JVM is. It's fine to not know these things, but most of this comes with the territory. Right now it sounds like you're simply not the target audience. The GitHub README includes a link to all of the major bits, so I don't really see the problem.
If you were trying to convince me to build a web app in Java, you wouldn’t need to explain what the JVM is, but you’d need to make a strong argument for why Java is better than the alternatives available in 2025.
So the question is whether the target audience here is “people who want to build full-stack web apps” or “people who are already using the BHVR stack”.
Why not? There are a lot of people who use the 2010s de facto standard JS server stack — Node, Express, Webpack etc. — but don't necessarily have the time or inclination to keep up to date with every new project in this space. It's a lot to follow.
The exclusive gatekeeping messaging doesn't seem very useful. There's probably a much bigger audience for "Hey, instead of starting yet another Node project, why not try this?" rather than preaching to the already converted early adopters.
Vite is a project with 25M weekly NPM downloads that is used by some 9M GitHub repos. It's not an obscure project by any stretch of the imagination. Heck, it's almost as popular as React.
This is amazing. I hope it succeeds. If I had any use for a truck I'd be lining up to buy one. If they make one in a compact sedan or hatchback form factor, I'm in. Heck, even better, a subcompact.
I compared the dimensions of the Slate with my '06 Pontiac Vibe hatchback, and it's only a few inches longer. I suspect the Slate + Fastback kit will be pretty close to a hatchback in size and function.
I feel like once a language is standardized (or reaches 1.0), that's it. You're done. No more changes. You wanna make improvements? Try out some new ideas? Fine, do that in a new language.
I can deal with the footguns if they aren't cheekily mutating over the years. I feel like in C++ especially we barely have the time to come to terms with the unintended consequences of the previous language revision before the next one drops a whole new load of them on us.
> If the size of the new type is larger than the size of the last-written type, the contents of the excess bytes are unspecified (and may be a trap representation). Before C99 TC3 (DR 283) this behavior was undefined, but commonly implemented this way.
> When initializing a union, the initializer list must have only one member, which initializes the first member of the union unless a designated initializer is used (since C99).
So = {0} initializes the first union variant, and bytes outside of that first variant are unspecified. Seems like GCC 15.1 follows the 26-year-old standard correctly. (Not sure how much has changed from C89 here.)
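To make the distinction concrete, here's a minimal sketch of the behavior being discussed (the union and member names are hypothetical, and the comments about GCC only restate the report above, not anything authoritative):

    #include <stdio.h>

    /* Hypothetical union whose first member is smaller than its largest member. */
    union example {
        unsigned char first;  /* 1 byte */
        unsigned long big;    /* typically 8 bytes */
    };

    int main(void) {
        /* = {0} initializes only the first member (first = 0).        */
        /* Bytes of the union beyond sizeof(first) are unspecified per */
        /* the standard; older GCC zeroed them anyway, while GCC 15.1  */
        /* reportedly no longer does.                                  */
        union example e = {0};
        printf("%lu\n", e.big);   /* may show garbage in the high bytes */

        /* To guarantee the whole object is zero, designate the largest */
        /* member explicitly instead of relying on = {0}.               */
        union example z = { .big = 0 };
        printf("%lu\n", z.big);   /* guaranteed 0 */
        return 0;
    }

The difference only shows up when the first member is smaller than the union itself; if the first member is also the largest, = {0} covers all of the union's bytes anyway.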
Most likely C++ would not have happened, while at the same time C and UNIX adoption would never have gotten big enough to be relevant outside Bell Labs.
Which, then again, isn't that big of a deal; the industry would have steered toward other programming languages and operating systems.
Overall that would be a much preferable alternative timeline, assuming security would have been taken more seriously. As it stands, it has taken 45 years since C.A.R. Hoare's Turing Award speech and the Morris worm, and change only came after companies and governments started to feel the monetary pain of their decisions.
I think there are very good reasons why C and UNIX were successful and are still around as foundational technologies. Nor do I think the C or UNIX legacy is the real problem we have with security. Instead, complexity is the problem.
Starting by being available for free with source code tapes, and a commented source code book.
History would certainly have taken a different path had AT&T been allowed to profit from Bell Labs' work, as its later attempts to regain control of UNIX prove.
Unfortunately that seems to be the majority opinion on WG14, which only changed thanks to government and industry pressure.
Being free was important and history could have taken many paths, but this does not explain why it is still important today and has not been replaced despite many alternatives. WG14 consists mostly of industry representatives.
It is important today just like COBOL and Fortran are, with ongoing ISO updates: sunk cost. No one is getting more money out of rewriting their systems just because, unless there are external factors, like government regulations.
Then we have the free beer UNIX clones as well.
Those industry members of WG14 don't seem to have done much in the way of security-related language improvements over the last 50 years.
> I feel like once a language is standardized (or reaches 1.0), that's it. You're done. No more changes. You wanna make improvements? Try out some new ideas? Fine, do that in a new language.
Thank goodness this is not how the software world works overall. I'm not sure you understand the implications of what you ask for.
> if they aren't cheekily mutating over the years
You're complaining about languages mutating, then mention C++, which has added stuff but maintained backwards compatibility over the course of many standards (aside from a few hiccups like auto_ptr, which was also short-lived), with a high aversion to modifying existing stuff.
That does not make sense for anything that exists over decades.
Do you want to still be using Windows NT, or pre-2004-standard C++, or Python 2.0?
We learn more and need to add to things. Some things we designed 30 years ago were a mistake; should we stick with them?
You can't design everything before release for most software. Games you can, or bespoke software for a business, since you can define what it does, but then the business changes.
There's a special place in hell for orgs that do this. Google has been doing the same thing with Android.
IIRC Apple at least has always been fairly clear and consistent with what bits of its software are open and what bits aren't. To my knowledge they haven't been breaking off chunks of Darwin and closing them. (Although if I'm wrong do correct me.)
I wonder, does it just let you draw pictures based on the as-advertised color and resolution limitations, or does it take into account clever programming tricks that can increase the color count (with some limitations)?