It's incredible how completely unwatchable modern YouTube norms are, to me at least. I feel like YouTubers now aim almost exclusively for the 12-18 demographic. I mean, this person is doing some kind of character or affectation instead of using a normal voice. Everything is some kind of grift or character or PR or persona now, it seems. I understand they do this to get viewers, but it's just depressing how much more content I'd enjoy if the PR gimmicks and lowest-common-denominator tricks were stopped.

I just saw techtips Linus interview Linus Torvalds, and the constant manboying and bad jokes were just embarrassing and badly hurt the interview. I really wish people like this would turn it way, way down. I think we all love some levity and whimsy, but now those gimmicks are bigger and louder than the actual content.


Torvalds didn't hold back either though, so I'm not sure what the complaint is... If you watch some of the WAN Show, you'll see you're not getting some weird persona in that video, just the same guy with a bit of extra energy - which is exactly what you want for presentations / shows / whatever. It was a genuine experience.

To me this sounds like a computer-generated voice, used for obvious privacy reasons on this kind of project. If it bothers you, then maybe work on better voice-synthesis tech! I assume it doesn't sound like the latest generation of synthesis because it was rendered locally, but I could be wrong.

> I just saw techtips Linus interview Linus Torvalds, and the constant manboying and bad jokes were just embarrassing and badly hurt the interview.

If you've been watching LTT for any amount of time, it wouldn't be surprising that that's just LTT Linus' nervous, awkward style; he's just a person. The jokes can be cringe as hell, but I thought the video was great. I don't think most nerds would be any different in front of a camera.


Why not? It's not an open standard. This is the rent-seeking behavior you get from for-profit, capitalist implementations. This is why we push so hard for open standards.

Uh, the HDMI Forum is a non-profit.

That's meaningless, because they delegated licensing to HDMI® Licensing Administrator, Inc. And even if they are somehow a nonprofit: you're also not making any profit when all the money you collect in licensing fees goes to paying the royalties of the various patent holders.

Nobody cares if the mailing list where they discuss the upcoming specs is managed by a non-profit, the broader HDMI ecosystem is still a massive money grab.


Then why do they have all this?

Profit vs. non-profit isn't a big difference. Many non-profits are essentially businesses in practice (money spent and managed, with the non-profit just a conduit to the for-profit companies that de facto own it); they just don't issue stock. A non-profit can act like this, and DOES. Non-profits exist in a capitalist context and inherit those norms. Again, this is why we aim for open standards.

Also, a non-profit is just that; it's not a charity. A charity is an entirely different classification, and even those are regularly used and abused like this.


There is more to being a non-profit than not issuing stock. I suspect a non-profit could technically issue stock, though it is probably not something any would ever try.

A non-profit is a business arrangement where making money isn't the goal. There are many different versions, though: many local clubs are non-profits, and they exist only for the benefit of their members.


tbf AOMedia doesn't really make this call. The Steam Deck, for example, doesn't do AV1 natively. It could, but Valve has so far decided not to implement it. I don't know how many other devices and systems exist that could do AV1 but don't, but to get this level of support we really need to pressure these companies.

Shrug, if I blog about the joys of driving down Route 66 in a '57 Chevy, I really don't have any obligation to give equal time to what it's like in a '57 Packard. It's a Saturn fan site, so it's just going to be Saturn-centric.

Immediately after mentioning the PlayStation port, the article explicitly states (in bold, on a line of its own):

> Grandia is Best on Saturn.

If you specifically blog about how driving down Route 66 in a '57 Chevy is better than driving down it in a '57 Packard, I think you have some responsibility to try to justify your claims. Otherwise it's just trolling.


You could spend a little time singing the praises of your '57 Chevy and what makes it particularly joyful to you.

...which almost always ends in flamewars with people arguing over something that's entirely down to nostalgia and personal preference.

Remember, this guy was chased out of Chicago for trying to cover up the murder of Laquan McDonald by the CPD. Before that, he was famous for being Clinton's fixer in the Gennifer Flowers case. The fact that this man has any political career at all is an incredible indictment of our system.

It's not clear to me that he has any political career beyond his home town.

This will go to SCOTUS, which typically gives the administration preferential treatment. The US's current level of corruption is way too high to assume your scenario.

I mean, why is that a problem? Win95 engineering reflects the hardware of its time, the same way today's software engineering reflects the hardware of our time. There's no ideal here, no "this is correct," etc.; it's all constantly changing.

This is like car guys today bemoaning the loss of the simpler carburetor age, or the car guys before them bemoaning the loss of the Model T's simplicity. It's silly.

There will never be a scenario where you need all this lightweight stuff outside of extreme edge cases, and there's SO MUCH lightweight stuff that it's not even a worry.

Also, it's funny you should mention Win95, because I suspect that reflects your age, but a lot of people here are from the DOS / first Mac / Windows 2.0 age, and for that crowd Win95 was the horrible resource pig and complexity nightmare. The tech press and nerd culture back then were incredibly anti-95 for 'dumbing it all down' and 'being slow,' but now it's seen as the gold standard of 'proper computing.' So it's all relative.

The way I see hardware and tech is that we are forced to ride a train. It makes stops, but it cannot stop for good; it will always go on to the next stop. Wanting to stay at a certain stop doesn't make sense and is in fact counter-productive. I won't go into this here, but Linux on the desktop could have been a bigger contender if the Linux crowd and companies had been willing to break a lot of things and 'start over' to be more competitive with Mac or Windows, which at the time did break a lot of things and did 'start over' to a certain degree.

The various implementations of the Linux desktop always came off as clunky and tied to Unix-culture conventions that don't really fit the desktop model, which wasn't very appealing to a lot of people, and a lot of that was rooted in nostalgia and this sort of idealizing of old interfaces and concepts. I love KDE, but it's definitely not remotely as appealing as the Windows 11 or macOS GUI in polish and ease of use.

In other words, when nostalgia isn't pushed back on, we get worse products. I see so much unquestioned nostalgia in tech spaces that I think it's something that hurts open source projects and even many commercial ones.


I agree with this take. Win95's 4MB minimum / 8MB recommended memory requirement and 20MHz processor are seen as the acceptable place to draw the line, but there were graphical desktops on the market before that running on systems with 128K of RAM and 8MHz processors. Why aren't we calling Win95's requirements ridiculously bloated?

Yep, at the time the Amiga crowd was laughing at the bloat. But now it's suddenly the gold standard of efficiency? I think a lot of people like to be argumentative because they refuse to see that they are engaging in mere nostalgia, not anything factual or logical.

If you can compile the kernel yourself though, there is no reason Win95 should be any smaller than your specifically configured kernel - in fact, it should be much bigger.

However, this is of course easier said than done.


> There will never be a scenario where you need all this lightweight stuff

I think there are many.

Some examples:

* The fastest code is the code you don't run.

Smaller = faster, and we all want faster. Moore's law is over, Dennard scaling isn't affordable any more, smaller feature sizes are getting absurdly difficult and therefore expensive to fab. So if we want our computers to keep getting faster as we've got used to over the last 40-50 years then the only way to keep delivering that will be to start ruthlessly optimising, shrinking, finding more efficient ways to implement what we've got used to.

Smaller systems are better for performance.

* The smaller the code, the less there is to go wrong.

Smaller doesn't just mean faster; it should mean simpler and cleaner too. Less to go wrong. Easier to debug. Wrappers and VMs and bytecodes and runtimes are bad: they make life easier, but they are less efficient and make issues harder to troubleshoot. The Unix philosophy embodies the KISS principle.

So that's performance and troubleshooting. We aren't done.

* The less you run, the smaller the attack surface.

Smaller code and less code mean fewer APIs, fewer interfaces, fewer points of failure. Look at djb's decades-long policy of offering rewards to people who find holes in qmail or djbdns. Look at OpenBSD. We all need better, more secure code. Smaller, simpler systems built from fewer layers mean more security, less attack surface, less to audit.

Higher performance, easier troubleshooting, and better security. That's three reasons.

Practical examples...

The Atom editor spawned an entire class of app: Electron apps, JavaScript on Node bundled with Chromium. Slack, Discord, VS Code: these are apps used by tens to hundreds of millions of people now. Look at how vast they are. Balena Etcher is, what, a nearly 100 MB download to write an image to USB? Native apps like Rufus do it in a few megabytes. Smaller ones like USBImager do it in hundreds of kilobytes. A dd command does it in under 100 bytes.
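
To be concrete about how little that task actually requires, here is a minimal sketch (my own illustration, not taken from any of those tools; the image and device paths are made up) of what "write an image to USB" boils down to, in Python:

    import os
    import sys

    CHUNK = 4 * 1024 * 1024  # copy in 4 MiB blocks, roughly dd's bs=4M

    def write_image(image_path, device_path):
        # Stream the image file onto the raw device, block by block.
        with open(image_path, "rb") as src, open(device_path, "wb") as dst:
            while True:
                block = src.read(CHUNK)
                if not block:
                    break
                dst.write(block)
            dst.flush()
            os.fsync(dst.fileno())  # make sure the bytes actually reach the device

    if __name__ == "__main__":
        # Usage (hypothetical paths): python write_image.py image.iso /dev/sdX
        write_image(sys.argv[1], sys.argv[2])

Everything else those big apps ship - device detection, progress bars, a bundled browser - is chrome around that loop.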

Now some of the people behind Atom wrote Zed.

It's 10% of the size and 10x the speed, in part because it's a native Rust app.

The COSMIC desktop looks like GNOME, works like GNOME Shell, but it's smaller and faster and more customisable because it's native Rust code.

GNOME Shell is Javascript running on an embedded copy of Mozilla's Javascript runtime.

Just as the dotcoms wanted to disintermediate business, removing middlemen and distributors for faster sales, we could use some disintermediation in our software: fewer runtimes, and better, smarter compiled languages, so we can trap more errors and get faster, safer native code.

Smaller, simpler, cleaner, fewer layers, fewer abstractions: these are all good things, and they are desirable.

Dennis Ritchie and Ken Thompson knew this. That's why Research Unix evolved into Plan 9, which puts far more through the filesystem in order to remove whole classes of API. Everything's in a container all the time; the filesystem abstracts the network, the GUI, and more. It has under 10% of the syscalls of Linux, the kernel is about 5MB of source, and yet much of what Kubernetes does is already in there.
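
As a rough illustration of what "the filesystem abstracts the network" means in practice (this is my own sketch of the documented /net interface, not code from Plan 9 itself; Python is used only for readability, and the address is made up), dialing a TCP connection is just file operations:

    # Sketch of Plan 9's file-based networking: no socket() API anywhere.
    def dial_tcp(addr="203.0.113.5!80"):  # hypothetical host!port
        # Opening the clone file allocates a new connection directory and
        # acts as its ctl file; reading it back gives the directory number.
        ctl = open("/net/tcp/clone", "r+b", buffering=0)
        n = ctl.read(32).strip().decode()       # e.g. "4" -> /net/tcp/4/
        # Connections are set up by writing plain-text commands to ctl.
        ctl.write(f"connect {addr}\n".encode())
        # From here on, the data file is the byte stream, like any other file.
        data = open(f"/net/tcp/{n}/data", "r+b")
        return ctl, data   # keep ctl open for the connection's lifetime

Drop a whole family of socket calls, and every tool that already knows how to read and write files can speak to the network.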

Then they went further: replaced C too, made a simpler, safer language, embedded its runtime right into the kernel, made binaries CPU-independent, and turned the entire network-aware OS into a runtime to compete with the JVM, so it could run as a browser plugin as well as a bare-metal OS (that was Inferno, with the Limbo language). Now we have ubiquitous virtualisation, so lean into it: separate the domains. If your user-facing OS only runs in a VM, then it doesn't need a filesystem or hardware drivers, because it won't see hardware, only virtualised facilities, so rip all that stuff out. Your container host doesn't need to have a console or manage disks.

This is what we should be doing. This is what we need to do. Hack away at the code complexity. Don't add functionality, remove it. Simplify it. Enforce standards by putting them in the kernel and removing dozens of overlapping implementations. Make codebases that are smaller and readable by humans.

Leave the vast bloated stuff to commercial companies and proprietary software where nobody gets to read it except LLM bots anyway.


I wonder if it would be possible to have gone directly to Zed, without going through Atom first (likewise, Plan 9 would never have been the first iteration of a Unix-like OS). "Rewrite it in Rust" makes a lot of sense if you have a working system that you want to rewrite, but maybe there's a reason that "rewrite it in Rust" is a meme and "write it in Rust" isn't. If you just want to move fast, put things up on the screen for people to interact with, and figure out how you want your system to work, dynamic languages with bytecode VMs and GC will get you there faster and will enable more people to contribute. Once the idea has matured, you can replace the inefficient implementation with one that is 10% of the size and 10x the speed. Adding lots of features and then pruning out the ones that turn out to be useless may also be easier than guessing the exact right feature set a priori.

The problem with scraping the web for teaching AI is that the web is full of 'little bobby tables' jokes.

Yep, this is just a ploy to create a PMC that actually has no skilled workers in it. You just shove MBAs, nepos, etc. into these roles and have them gobble up some managerial course that's often nothing but: delegate, CYA, and 'manage expectations.'

I don't think we need to go back to the old idea of The Manager who is Above It All and Doesn't Get Their Hands Dirty. At least at the middle levels.



I genuinely can't tell if this is sarcasm? Or do you live somewhere where this is taught?
