
Could we also consider just not connecting critical systems to the internet at large? No reason, for example, for the Jaguar assembly line to depend on an internet connection.


Yep, air-gapped security is a thing. But servicing, patching, and communicating with air-gapped devices still need to happen, which involves connecting to them somehow. Likely a manufacturing plant has started to connect everything together to streamline the process and created an attack surface that way.

You can see the appeal of not needing to go through all the issues, complexity, and cost that entails.


I suppose we could all do what Asahi have been forced to do and go back to using pens, paper and fax machines: https://www.bbc.co.uk/news/articles/cly64g5y744o


Malware can hop through airgaps on USB keys, so that's not enough: https://en.wikipedia.org/wiki/Stuxnet


How else do you expect to move the information around between sites and use it?


I mean, I also don't know Bun and Vite. I've at least seen React. You should probably just explain the whole stack.


This is like saying a Java library readme should start with what the JVM is. It's fine to not know these things, but the majority of this comes with the territory. Right now it sounds like you're simply not the target audience. The GitHub readme includes a link to all of the major bits, so I don't really see the problem.


If you were trying to convince me to build a web app in Java, you wouldn’t need to explain what the JVM is, but you’d need to make a strong argument for why Java is better than the alternatives available in 2025.

So the question is whether the target audience here is “people who want to build full-stack web apps” or “people who are already using the BHVR stack”.


The benefits are very clear to me.

If you build MERN apps, this is a template that replaces Express with Hono, Node with Bun, and Webpack with Vite.

All of which are significantly faster than their counterparts. Hono can be deployed anywhere and has a much smaller bundle size than Express.


These two paragraphs would already be a much more helpful project description than “typesafe fullstack monorepo”.


If you've never heard of Bun or Vite you're clearly not the audience for this.


Why not? There are a lot of people who use the 2010s de facto standard JS server stack — Node, Express, Webpack etc. — but don't necessarily have the time or inclination to keep up to date with every new project in this space. It's a lot to follow.

The exclusive gatekeeping messaging doesn't seem very useful. There's probably a much bigger audience for "Hey, instead of starting yet another Node project, why not try this?" rather than preaching to the already converted early adopters.


Vite is a project with 25M weekly NPM downloads, used by some 9M GitHub repos. It's not an obscure project by any stretch of the imagination. Heck, it's almost as popular as React.

https://npmtrends.com/react-vs-vite-vs-webpack

Imagine someone posting a project that uses React and then someone demanding they explain what React is...


You are selecting for the frontend crowd and making claims based on that. Many of us are not frontend devs; we just want to wrap an API. At least I do.


> Many of us are not frontend devs

That's fine but OP's template is clearly for devs with frontend skills. No?


Heard of, yes; an expert in, no. The parent is right.


Answer's in the article. There is a screen behind the wheel for the speedometer, odometer, etc. The backup cam displays there.


Right, I edited my comment. The OP just posted a pretty misleading title saying there was no screen at all.


This is amazing. I hope it succeeds. If I had any use for a truck I'd be lining up to buy one. If they make one in a compact sedan or hatchback form factor, I'm in. Heck, even better, a subcompact.


I compared the dimensions of the Slate with my '06 Pontiac Vibe hatchback, and it's only a few inches longer. I suspect the Slate + Fastback kit will be pretty close to a hatchback in size and function.


A summary:

> Embrace

Yay, MS loves open source!

> Extend

Wow, VS Code is so useful!

> Extinguish

shocked Pikachu meme


That's optimistic. Sci-fi has taught us that way worse forms of AI are possible.


Worse in the sense of capability, not alignment.


I feel like once a language is standardized (or reaches 1.0), that's it. You're done. No more changes. You wanna make improvements? Try out some new ideas? Fine, do that in a new language.

I can deal with the footguns if they aren't cheekily mutating over the years. I feel like in C++ especially we barely have the time to come to terms with the unintended consequences of the previous language revision before the next one drops a whole new load of them on us.


> If the size of the new type is larger than the size of the last-written type, the contents of the excess bytes are unspecified (and may be a trap representation). Before C99 TC3 (DR 283) this behavior was undefined, but commonly implemented this way.

https://en.cppreference.com/w/c/language/union

> When initializing a union, the initializer list must have only one member, which initializes the first member of the union unless a designated initializer is used (since C99).

https://en.cppreference.com/w/c/language/struct_initializati...

So = {0} initializes the first union variant, and bytes outside of that first variant are unspecified. It seems like GCC 15.1 follows the 26-year-old standard correctly. (Not sure how much has changed from C89 here.)
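
A minimal sketch of the difference, for anyone following along. The union and its members are made up for illustration; the point is just that the first member is smaller than the largest one:

    #include <stdio.h>
    #include <string.h>

    union u {
        unsigned char c;  /* first member, 1 byte */
        unsigned long l;  /* larger member, typically 8 bytes */
    };

    int main(void)
    {
        /* "= {0}" initializes only the first member (c). The excess
         * bytes of the union are unspecified, which is apparently the
         * latitude GCC 15.1 takes advantage of; older releases just
         * happened to zero the whole object. */
        union u x = {0};
        printf("x.l = %lu (not guaranteed to be 0)\n", x.l);

        /* If every byte must be zero, say so explicitly. */
        union u y;
        memset(&y, 0, sizeof y);
        printf("y.l = %lu (guaranteed to be 0)\n", y.l);

        return 0;
    }

Code that relied on = {0} zeroing the whole union would need the memset (or a designated initializer of the largest member) to keep its guarantees.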


Programming languages are products; that is like saying you want to keep using vi 1.0.

Maybe C should have stopped at K&R C from UNIX V6; at least that would have spared the world from having it adopted outside UNIX.


I liked the idea I heard: internet audiences demand progress, but internet audiences hate change.


If C++ had never been invented, that might have been the case.


C++ was invented exactly because Bjarne Stroustrup vowed never again to repeat the downgrade of his development experience from Simula to BCPL.

When faced with writing a distributed systems application at Bell Labs, and having to deal with C, his very first step was to create C with Classes.

Also, had C++ not been invented, or had C become a footnote of history, so what? There would have been other programming languages to choose from.

Let's not put programming languages into some kind of shrine for worship.


Given UNIX, I don't think C would have become a footnote even if C++ had never existed.


Most likely C++ would not have happened, while at the same time C and UNIX adoption would never have gotten big enough to be relevant outside Bell Labs.

Which, then again, isn't that big of a deal; industry would have steered toward other programming languages and operating systems.

Overall that would be a much preferable alternative timeline, assuming security would be taken more seriously. It has taken 45 years since C.A.R. Hoare's Turing Award speech and the Morris worm, and it happened only after companies and governments started to feel the monetary pain of their decisions.


I think there are very good reasons why C and UNIX were successful and are still around as foundational technologies. Nor do I think the C or UNIX legacy is the real problem we have with security. Instead, complexity is the problem.


Starting with being available for free, with source code tapes and a commented source code book.

History would certainly have taken a different path had AT&T been allowed to profit from Bell Labs' work from the start, as their later attempts to regain control of UNIX prove.

Unfortunately that seems to be the majority opinion on WG14, only changed thanks to government and industry pressure.


Being free was important and history could have taken many paths, but this does not explain why it is still important today and has not been replaced despite many alternatives. WG14 consists mostly of industry representatives.


It is important today just like COBOL and Fortran are, with ongoing ISO updates: sunk cost. No one is getting more money out of rewriting their systems just because, unless there are external factors like government regulations.

Then we have the free beer UNIX clones as well.

Those industry members of WG14 don't seem to have done much in the way of security-related language improvements during the last 50 years.


I think this is far from the truth.


I suspect this change was motivated by standards conformance.


The wording from the GCC maintainer was "the standard doesn't require it" when they informed the Linux kernel mailing list.

https://lore.kernel.org/linux-toolchains/Z0hRrrNU3Q+ro2T7@tu...


Reminds me of strict aliasing. Same attitude...

https://www.yodaiken.com/2018/06/07/torvalds-on-aliasing/
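
For reference, a minimal sketch of the strict-aliasing footgun being discussed; the function names are made up, and it assumes sizeof(unsigned) == sizeof(float):

    #include <stdio.h>
    #include <string.h>

    /* Undefined behavior: an unsigned lvalue aliasing a float object.
     * Under strict aliasing, the optimizer may assume the two types
     * never overlap in memory and reorder or drop accesses. */
    unsigned bits_undefined(const float *f)
    {
        return *(const unsigned *)f;
    }

    /* The sanctioned alternative: copy the object representation.
     * Compilers recognize this pattern and emit a single load. */
    unsigned bits_defined(const float *f)
    {
        unsigned u;
        memcpy(&u, f, sizeof u);
        return u;
    }

    int main(void)
    {
        float x = 1.0f;
        printf("%08x\n", bits_defined(&x)); /* 3f800000 on IEEE-754 targets */
        return 0;
    }

The kernel's answer, of course, is -fno-strict-aliasing, which is what the Torvalds rant linked above is about.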


> I feel like once a language is standardized (or reaches 1.0), that's it. You're done. No more changes. You wanna make improvements? Try out some new ideas? Fine, do that in a new language.

Thank goodness this is not how the software world works overall. I'm not sure you understand the implications of what you ask for.

> if they aren't cheekily mutating over the years

You're complaining about languages mutating, then mention C++, which has added features but maintained backward compatibility over the course of many standards (aside from a few hiccups like auto_ptr, which was also short-lived), with a high aversion to modifying existing behavior.


Perl 6 and Python 3 joined the chat


It's careless development. Why think things through in advance when you can fix them later? It works so well for Microsoft, Google, and lately Apple. /s

The release cycle of a piece of software says a lot about its quality. "Move fast and break things" has become the new development process.


That does not make sense for anything that exists over decades.

Do you want to still be using Windows NT, pre-2004-standard C++, or Python 2.0?

We learn more and need to add to things. Some of what we designed 30 years ago was a mistake; should we stick with it?

For most software you can't design everything before release. You can with games, or with bespoke software for a business, since you can define up front what it does, but then the business changes.


There's a special place in hell for orgs that do this. Google has been doing the same thing with Android.

IIRC Apple at least has always been fairly clear and consistent about which bits of its software are open and which bits aren't. To my knowledge they haven't been breaking off chunks of Darwin and closing them. (Although if I'm wrong, do correct me.)


Another roughly contemporary point of comparison, Haiku OS: https://www.haiku-os.org/slideshows/haiku-1/


I wonder, does it just let you draw pictures based on the as-advertised color and resolution limitations, or does it take into account clever programming tricks that can increase the color count (with some limitations)?


Also, does it take other limitations of the platforms into account, such as two colors per 8x8 attribute cell on the Spectrum and similar restrictions on the C64 and others?

