Hacker News | chronid's comments

And I want to use hibernation: I don't mind entering my disk encryption passphrase once a day as the price of not risking a laptop with a completely drained battery on Monday morning, thanks to the ~1%/h battery drain of s2idle on my 64GB RAM configuration.

You can use suspend-then-hibernate to accomplish that, and it works well. Unless, of course, the gods of kernel lockdown decide you cannot, for your own good (and it doesn't matter if your disk is fully encrypted, you're not worthy anyway). It's their kernel running on your laptop, after all.
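
For reference, a minimal sketch of how suspend-then-hibernate can be wired up with systemd (the option names are real systemd ones, but the delay value and the lid-switch choice are just illustrative):

  # /etc/systemd/logind.conf -- suspend on lid close, hibernate later
  [Login]
  HandleLidSwitch=suspend-then-hibernate

  # /etc/systemd/sleep.conf -- write the hibernation image after 2h asleep
  [Sleep]
  HibernateDelaySec=7200

You can also trigger it manually with "systemctl suspend-then-hibernate".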


I worked in finance on the other side of the pond. Developers constantly wanted to bring in and use new services, but didn't want any of the responsibility or the work needed to make compliance happy (or even, in that particular company, to shoulder the costs). When other folks and I were brought in to fix the "cloud strategy" it was a complete shitshow, and heads actually rolled when we wrote a tool to assign costs to applications. But we had to start almost from scratch and limit the usable services as we developed strategies and blueprints for each...

The complete, unapologetic refusal of dev and security teams (but also of many infra teams) to take any kind of ownership was horrifying to me.

In the end there's no single solution or strategy; it really goes back to the organization and where your weaknesses and strengths are as an org. If you have a gazillion consultants following the "best practice" of the day and exceptions piled on top of exceptions, you are dead, devops or otherwise. Though if you are the right company you will still make billions regardless of your software practices, so...


They should be funded by the companies using them. Do you believe any of the Fortune 100 would be greatly impacted by funding libxml2? They probably all rely on it, one way or another.

The foundation of the internet gets bigger and bigger every year. I understand the sentiment and the reasoning behind declaring software a "public good", but it won't scale.


> They should be funded by the companies using them. Do you believe any of the Fortune 100 would be greatly impacted by funding libxml2? They probably all rely on it, one way or another.

I agree in theory but it's impractical to achieve due to the coordination effort involved, hence using taxes as a proxy.

> The foundation of the internet gets bigger and bigger every year. I understand the sentiment and the reasoning behind declaring software a "public good", but it won't scale.

For a long time, a lot of foundational development was funded by the government. Of course it can scale; the problem is that most people no longer believe in capable government after 30-40 years of neoliberal tax cuts and utter incompetence (California HSR comes to mind). We used to do great things funded purely by the government, usually via military funding: lasers, radar, microwaves and a lot of RF technology generally; even the Internet itself originated from the military ARPANET. Or the federal highways. And that was just what the Americans did.


Exactly how OpenSSL was (is?) when Heartbleed happened. Sadly it's nothing new; memes about the "unknown OSS passion project" holding up the entire stack are all over the internet.


Isn't this intuitively true?

Building a nuclear power plant incurs a massive setup stage with a lot of unknown unknowns, requiring impressive materials engineering and QC.

Solar is much more "incremental": you can start producing electricity and recouping costs almost immediately.

But a nuclear reactor is an extremely dense power generator, beating a solar plant by orders of magnitude, and I'm not really sure why they are compared this way.
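
Some rough, illustrative numbers (not from any specific plant) make the density gap concrete:

  nuclear: ~1 GW(e) on a site of a few km^2            -> ~250-500 W/m^2
  solar:   ~1 kW/m^2 peak sun x ~20% panel efficiency
           x ~15% capacity factor x spacing/roads      -> ~5-10 W/m^2 of land

That's roughly two orders of magnitude in nuclear's favor on density, even though solar wins on incremental cost.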


Suddenly? That's the level of quality that has been standard in every software project I've seen since I started working in IT.

Enshittification is all around us, and it's unstoppable, because we have deadlines to hit and goals to show the VP we reached. We broke everything and the software is only half working? Come on, that's an issue for the support and ops teams. On to the next beautiful feature we can put on marketing slides!


Sadly you are absolutely right.


I have plenty of hard disagreements about the "user experience improvements" in Linux. "Adding a skin" is not easy, and making the experience somewhat coherent is extremely hard (GNOME is sort of successful, at an extreme cost and with plenty of limitations; KDE is still an incoherent mess with plenty of bad defaults, starting from the base SDDM skin). It's full of things like the missing icon view in the GNOME/GTK file chooser [1], and while it's true that Windows 11 is atrocious, all those little things add up.

I actually recovered a laptop my family was using to launch Firefox by installing Linux on it (the soldered RAM went bad, and Linux is the only OS I could tell to skip the bad blocks via the kernel command line), but I hold no illusions about its level of "user experience". Just look at the comments in this recent thread [2]. And as a power user I am baffled by some of the choices at the kernel level (which I mentioned in that thread) and others closer to the user made by distros (Ubuntu and snaps, name a more iconic duo), or things like Flatpak not being close to ready and still being shoved down users' throats...
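
For the curious, the bad-RAM trick is the kernel's memmap boot parameter; a minimal sketch (the address and size here are made up, you'd find the real ones with memtest):

  # mark 64K starting at 0x36a00000 as reserved so the kernel never uses it
  memmap=64K$0x36a00000
  # note: in GRUB's config the $ must typically be escaped: memmap=64K\$0x36a00000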

I spent years when I was younger submitting bug reports for the papercuts I noticed - some ignored for years, some closed and forgotten forever when a project decided to move on from Bugzilla - and I have no more time or energy to keep doing so. The maintainers, after all, write the code; I'm just a user and get no voice :)

I've been reading about the "year of Linux" for years now; it's a meme for a reason. People who are not "prosumers" will keep using the preinstalled OS even if it's garbage - assuming they buy a laptop or desktop at all - and prosumers will probably keep an OSX or Windows machine close by anyway. Linux is usable as a browser kiosk, sure, but there is still plenty of friction in everything else. Enshittification will continue, and will possibly infect Linux too.

[1] https://www.omglinux.com/gnome-thumbnails-file-picker/

[2] https://news.ycombinator.com/item?id=43945373


  > "Adding a skin" is not easy and making the experience somewhat coherent is extremely hard
I don't mean to imply this is easy. But I do know these efforts have been in the works for quite some time, and they can get more dedication if that's the direction we need to go.

A quick Google search turns up:

  - 3 free Linux distros that look and feel like Windows: https://www.pcworld.com/article/2532994/3-free-linux-distros-that-look-and-feel-like-windows.html

  - 5 Linux Distributions That are Inspired by the Look and Feel of macOS: https://itsfoss.com/macos-like-linux-distros/

  > the soldered RAM went bad, and Linux is the only OS I could tell to skip the bad blocks via the kernel command line
IDK how to tell you this, but for 90% of people this is "throw the machine out, buy a new one". I'm really not sure what the critique is here; running into more problems seems unsurprising given what you described. And you're talking about the kernel.

I don't deny that there are problems with Linux, nor that things need to improve for better mass appeal. But I do think you should look at your own words: they're highly technical, and we should not forget how the comparison goes when discussing Windows or OSX. That's the choice! These "Linux sucks" conversations are not just complaints about Linux; they are also implicit suggestions to use Windows or OSX instead. The context of our conversation is choosing between these systems, not the existence of problems.

I want to be very clear:

  Linux is a dumpster fire.
  This does not mean Windows isn't!
  This does not mean OSX isn't!
The argument I'm making is that this doesn't matter for the general user. Fuck, it generally doesn't matter for the technical user. But there is a good reason technical/power users have a strong bias towards Linux: at least it's a dumpster fire they can fix. It is absurd to argue that we should not encourage people to use Linux, in favor of systems that are user-hostile and destroy all sense of personal privacy!

These arguments become equivalent to: "You don't want to eat that, the chef sneezed in it. Here, eat this cake instead. The chef only took a shit in it."

Idk about you, but given the choice, I'd rather take the sneeze than the shit. I'd (strongly) prefer neither, but frankly that isn't an option now, is it?

And let's be honest, if you want to get more resources to put out more fires, the only way that's going to happen is if there are more users.


The last rewrite I've seen completed (which was justified to a point, as the previous system had some massive issues) took 3 years and burned down practically an entire org that had been healthy-ish and productive before the rewrite (multiple people left, some were managed out including two leads, and the director was ejected after 18-ish months). It's still causing operational pain and does not fully cover all edge cases.

I'm seeing another one now at $current_job, with similar symptoms (though the system being rewritten is far less important): customers of the old system are essentially abandoned to themselves, and marketing and sales are scrambling to retain them.

My anecdotal experience is not so good. Rewriting a tiny component? OK. A full-on rewrite of a big system? I feel it's a bad idea, and the old wisdom holds true.


Spot on. It seems that OP is considering (1) a rewrite that can fit entirely into the mind of some engineer XYZ, and (2) a rewrite that will be led by that same engineer XYZ, through executive empowerment.

I'd guess that in your case (1) did not hold, or maybe (2) did not, or both.

OP's experiment doesn't prove at all that an entire org can rewrite a complex app where (1) and (2) do not hold. Every indication we have is that an org's executive functions perform abysmally at writing (and rewriting) code. Which is exactly the point you are making: there is value in the code, alongside the value in the org, once we get above the level of what conceptually fits into one head.


IMHO, anecdotally, if you attempt a full rewrite under the same organizational conditions that resulted in code bad enough to warrant it...

...you're gonna get bad code again, or, as you say, worse. The impact of the organizational culture dwarfs everything else.


This is what home automation was supposed to be. You shouldn't be looking at it; it should just help you silently and reduce the number of things you have to care about.

Turning lights off, closing shutters when it goes dark, handling temperature and CO2 concentrations, etc.
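
As a sketch of what such a silent rule can look like, here it is in Home Assistant-style YAML (the sun trigger and cover service are standard Home Assistant ones; the entity name is made up):

  # close the shutters at sunset, no dashboard involved
  automation:
    - alias: "Close shutters at dusk"
      trigger:
        - platform: sun
          event: sunset
      action:
        - service: cover.close_cover
          target:
            entity_id: cover.living_room_shutters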

I feel people have a need to look at dashboards, have screens, etc. (maybe some sort of sympathetic reaction to staring at dashboards all day at work?) instead of letting go. Dashboards should only be looked at when something is wrong and the automation is failing.


As a non-blind user, the title expresses my feelings too. And I feel like it's getting worse over time, not better.

From little things all the way to kernel lockdown breaking hibernation on a fully encrypted system, just because you should be happy to have your laptop battery killed by s2idle, or else disable secure boot. Yay, security.
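
If you want to check whether you're in this boat, both of these interfaces should exist on mainstream distros (output here is from a hypothetical machine; lockdown's "integrity" mode is what disables hibernation):

  $ cat /sys/kernel/security/lockdown
  none [integrity] confidentiality
  $ mokutil --sb-state
  SecureBoot enabled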

I can only imagine the pain of all the accessibility issues on top of what I experience.


I agree; all the modern technical additions to operating systems over the years have been, without exception, 2 steps forward, 3 steps back.

It's always, "Oh, well, you can't run two or three monitors any more, but your primary display is higher resolution now!" Except DPI adjustments make that irrelevant, and now my (i)GPU has a higher minimum load.

Or, "Oh, well, we only give you 2 ports now, but they're all <port>!" Great, but those larger bandwidth ports don't offset the fact that I can't plug in as much any more, and USB hubs are not a solution, they're a hack, wildly variable in operation, and some devices are not compatible with them.


Under Linux, you can still use the --dpi flag in xrandr (or the equivalent setting in your xorg.conf). It makes fonts bigger without blurring all the icons.
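
For example (both knobs are real; the value 144 is just illustrative):

  # one-off, for the current session
  xrandr --dpi 144

  # or persistently, in ~/.Xresources
  Xft.dpi: 144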

I prefer it over the scaling approach that replaced it in modern desktop environments (and Wayland). I've been using high-DPI displays exclusively for much longer than Mac OS or Windows have supported them, and the old approach was much better.

There's an argument that you need to blur everything badly (instead of setting a session-wide DPI) if the user is simultaneously using two displays with wildly different DPIs. That user is going to have a bad experience no matter what, so I've never understood that argument.


I noticed the one bright spot in the article is Debian, though even that's broken for me thanks to systemd.

I switched to Devuan, and things are much better, for now. It's unclear how long new software will keep working reliably under X11 without systemd.

Anyway, as a sighted user, my experience almost exactly matches the article, toned down about 10x.

(Concretely, on the systemd side: I hit the same issues with PulseAudio, and the new session stack regularly perma-blanked my screen until I rebooted. I can't reliably share machines with family members because elogind is so bad.)


The GTK developers were talking about dropping X11 support at some point, so X11 folks using GTK apps would probably need some sort of Wayland-to-X11 proxy, or to migrate from Xorg to a multi-protocol display server that supports both X11 and Wayland, like Mir from Canonical/Ubuntu, or Arcan.

https://news.itsfoss.com/gtk-drops-x11/

https://arcan-fe.com/
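
One partial stopgap already exists, as far as I know: Weston can run nested inside an X session (it auto-selects its x11 backend when DISPLAY is set), giving Wayland-only clients a window to live in:

  # from inside a running X session: opens a window hosting a
  # nested Wayland compositor; launch Wayland apps from within it
  weston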


Yeah, I've had a much more peaceful experience since I started running it in VMs, now that modern PCs have good enough virtualization hardware. And I've used plenty of distributions since that summer of 1995, starting with Slackware 2.0.
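
For anyone curious, a minimal sketch of that kind of setup with QEMU/KVM (the flags are real; the disk file name and sizes are made up):

  qemu-system-x86_64 -enable-kvm -cpu host -smp 4 -m 8G \
    -drive file=distro-test.qcow2,if=virtio \
    -nic user,model=virtio-net-pci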

