
“Trash” is too harsh a word, but I do feel that desktop Linux lacks the fit-and-finish of Windows and macOS. It boils down to development resources. The Mac ecosystem has a much smaller number of hardware configurations that macOS needs to be concerned about, and Apple has the resources to test them. Even in the PC ecosystem, with its plethora of hardware configurations, Microsoft is a well-resourced company capable of testing a wide range of configurations. The Linux ecosystem, on the other hand, is much more loosely coordinated among many different parties that may or may not be acting in accord with each other. There are no Apple- or Microsoft-level entities investing hundreds of millions of dollars into desktop Linux development.

In fact, given the circumstances, it’s quite impressive we have desktop Linux at all. Even with its Sisyphean setbacks, it’s come quite a long way compared to 20 years ago when I first started using Linux.


Even for fields that can be done entirely with pen and paper, grad students and postdocs don't work for free, and often the only way a professor can keep up with a research university's publication expectations is to hire grad students and postdocs to contribute to research. Grants help pay for their stipends. Additionally, there are many universities that require their professors to raise grant money as a condition of tenure, since grant money is a significant source of revenue for research universities.

I came to a similar conclusion as a computer scientist working in industry who ended up transitioning to a community college teaching career. I remember being inspired by the stories of Bell Labs and Xerox PARC as a high school student and undergrad, wanting to follow in the footsteps of researchers like Dennis Ritchie. However, the industrial research landscape has changed in the past 15 years. It's very difficult to find truly curiosity-driven places with long timelines and little pressure; industrial researchers these days are pushed to work on projects with more immediate productization prospects. I've seen this firsthand at a few companies. The tenure-track route at a research university requires playing the "publish-or-perish" game, which is its own curb on freedom and comes with its own pressures.

Being a tenure-track professor at a California community college is a happy situation for me. I love teaching, and for roughly 8 months of the year I'm dedicated to teaching. Tenure at my community college is entirely based on teaching and service; I'm not required (or even expected) to publish. I also get roughly 4 months of the year off (three months in the summer, one month in the winter). I spent much of the past summer in Japan collaborating with a professor on research. The only serious downside is not being able to afford a house within a reasonable commute from work, but I had the same problem in industry; not everyone in industry makes FAANG-level salaries. In fact, my compensation is effectively a raise from my previous job when factoring in going from roughly 3 weeks of PTO per year to 4 months off plus 10 days' worth of sick leave; I took a roughly 10% pay cut in exchange for greater freedom and roughly 5-6x the annual time off.

I've learned that being a hobbyist researcher with a stable job that provides summers off is quite a favorable situation, since I don't have to worry about my job security being tied to my publication and fundraising counts. Most of my computer science research can be done on a mid-range laptop with an Internet connection and access to textbooks and academic databases; I don't need equipment that costs five or six figures (though it would be nice to have a GPU cluster....).


Bell Labs was part of AT&T, which had in effect a government-approved monopoly -- nearly all of the US telephone system -- and so was awash in earnings to fund transistors, lasers, information theory, the Fast Fourier Transform, etc. Xerox PARC was part of Xerox, which was likewise "awash in earnings" from photocopying machines and could fund whatever else it wanted, e.g., personal computing.

So, for a high school student, the lesson there was not just to do great science but to join or start a business that is, or soon can be, "awash in money" and then do whatever you want, "great science" included -- e.g., Jim and Marilyn Simons.

In practice, one of the main motivations for a company "awash in money" to pursue research is luster, e.g., AI, quantum computing.

Ah, Lesson 101 in US life and money!


It depends on the progressive, however. Yes, I’m hearing more calls to build from progressives. However, for a long time, from the 1960s until the past few years, there were two drivers of NIMBYism that progressives championed: (1) local control of neighborhoods and (2) environmentalism. The first was a reaction to the urban development plans of the 1950s and 1960s that fundamentally reshaped neighborhoods, often in ways that did not consider the residents of those neighborhoods. For example, San Francisco once had a historic Japanese American and African American district named the Fillmore, with plenty of Victorian homes, but it was largely demolished in the 1960s and replaced with housing projects and a widened Geary Blvd. While I’m still on the subject of San Francisco, there were plans in the 1950s to build a network of freeways criss-crossing the city. This was deeply unpopular.

Unpopular plans to dramatically reshape urban cities led to “freeway revolts” (organized, grassroots opposition to freeway projects, which sometimes succeeded) and increased local input over planning. The second was brought on by environmental crises in the 1960s, such as badly polluted rivers and the famous oil spill near Santa Barbara. California, especially its coastal areas, was quite affected by both drivers of NIMBYism, and this became the dominant way of thinking from the 1970s onward.

Local control over neighborhoods sounds reasonable, but unfortunately it’s led to neighborhoods becoming museum pieces that do not scale upward to meet demand, thus incentivizing urban sprawl. Restricting development has also significantly boosted property values in those areas. However, urban sprawl directly conflicts with environmental goals, since it requires more transportation infrastructure and more energy to move people across longer distances. Thus, we end up with situations where homes get built in far-flung exurbs whose politicians support growth (until the towns get large enough that some residents want to halt growth to “preserve our quality of life,” pushing development to the next-closest area friendly to development), environmentalists block road-widening and other infrastructure improvements in an attempt to discourage the sprawl, and NIMBYs block the construction of denser housing near job centers that could have provided affordable alternatives to exurban housing.

This has been the story of California since the 1970s, and the obscene housing prices and unsustainable mega-commutes are a result of this. Thankfully more people are seeing the consequences of 50 years of broken housing policy, and we’re finally seeing some efforts, even if they’re baby steps, to address this.


Nostalgia is one aspect of retrocomputing for me, but another thing I like about retrocomputing is being able to experience platforms I never got to use during their heydays, either because I wasn’t around when they were available or because I couldn’t afford them. For example, I’m the owner of a NeXT Cube setup, which I’ve had since 2021. I was born in 1989, not too long after the 1988 announcement of the original NeXT computer. I had never heard of NeXT until 2004, when I started learning about Mac OS X and its history.

I also think people can learn a lot from the platforms of the past, even though computers have gotten objectively more capable over the decades. I feel this is especially true in the area of usability. There was a lot of usability research done in the 1980s and 1990s, and Apple and Microsoft published human interface guidelines describing how software written for the classic Mac OS and Windows should behave. Since then, however, consistency has been sidelined in favor of branding and other marketing concerns, at the expense of usability. Using applications designed for Macintosh System 7 or Windows 95 gives people the experience of a time when conforming to UI guidelines was a big deal.

Nostalgia is great and is one reason I retrocompute, but it’s more than that for me.


The sad thing is that Apple used to have people like Bill Atkinson (RIP), Larry Tesler (RIP), Bruce Tognazzini, and Don Norman who cared deeply about usability. What made the Mac special wasn’t its looks, especially in the pre-iMac and pre-Mac OS X days when the computers were mostly beige and the classic Mac OS UI was quite plain-looking (though still good-looking, IMO) compared to Mac OS X and later. Rather, what made the Mac special was its attentiveness to UI.

Somewhere along the line Apple products became luxury goods that were appealing due to their visual design. I am particularly fond of Apple’s early 2000s design, with beautiful hardware such as the iMac G4 and the Power Mac G4 Cube running early Mac OS X. Apple still makes very visually-appealing hardware and software. However, I feel that under Tim Cook Apple has heavily leaned into its visual appeal at the expense of what made Apple great: the emphasis on usability.


Many users of Microsoft Word 5.1 for Macintosh hated Microsoft Word 6; they felt the latter was not Mac-like. I was a kid during this time and the only Mac word processor I used in the classic era was the one included with ClarisWorks, so I don’t have any opinions about Word 5.1 vs 6.

I feel Mac OS X peaked with Snow Leopard, though it wasn’t until Catalina that I started refusing to upgrade my personal Macs, which stayed on Mojave until I retired them from daily driving.

I also remember SimCity (the 2013 successor to SimCity 4) and SimCity Societies not being well-received. Sadly the SimCity franchise is dead; I enjoyed SimCity 2000 and 4 (I never played 3000).

The fourth generation of Pokémon games, in my opinion, was the high-water mark of the franchise’s main series, peaking with HeartGold/SoulSilver.


I’ve also come to this realization as a longtime Mac user. Granted, I’m not old enough to have used the classic Mac OS as a professional, though I did use it in my childhood. However, I was in high school and college during the Jobs era of Mac OS X, and that’s when I started using Macs as my daily drivers from 2006 until 2022 when I retired my 2013 Mac Pro and 2013 MacBook Air and switched back to PCs running Windows.

Before Apple’s success with the iPhone, Apple was essentially the Macintosh company. Its fortunes were tied to the Mac, and Apple seemed to be attuned to the needs of Mac users. In return, there was quite a strong Mac fandom. The Mac was more than just a tool or just a platform. The Mac was a philosophy, and what attracted people to the Mac was the philosophy of the Mac and its ecosystem.

Ever since the iPhone became a major hit, and especially since the passing of Jobs, it became apparent to me that my best interests as a computer user and Apple’s interests as a company no longer align. Apple no longer needed to cater to “the Mac faithful” to survive. In fact, Apple is one of the biggest companies in the world thanks to the iPhone. It also seems that macOS is losing its distinctiveness.

The unfortunate thing, and I think this is where some Mac users get emotional and disappointed, is that there’s nothing else out there in the personal computing landscape that is like the glory days of the Mac. Windows is an inconsistent mess filled with annoyances, and the bazaar of the Linux ecosystem is nothing like the polished cathedral of the Mac. Everything is a step down from older Macs, even modern Macs.

However, while thankfully we can enjoy retrocomputing as a hobby, many of us still need up-to-date platforms to browse the Web and get our work done, so clinging to, say, Snow Leopard is not an option outside of hobbyist activities. Hence why I use Windows. It’s not Snow Leopard, but it gets the job done for now.

The beauty of software, though, is that we don’t have to resign ourselves to accepting whatever Apple, Microsoft, and Google release. Many of us reading this forum have the ability to write our own software. FOSS projects such as Linux have shown that it’s possible for user communities to write software that fits their needs without being concerned about business matters such as market dominance.

So, yes, this is a hard lesson for many longtime Mac users about being loyal to a company; companies change. But this lesson also creates opportunities. I think if enough disaffected longtime Mac users got together and pooled their resources, we could create a community-driven FOSS alternative, one where the evolution of Mac-like personal computing is driven by the community, not by Apple or any other corporation.


> I’ve also come to this realization as a longtime Mac user. Granted, I’m not old enough to have used the classic Mac OS as a professional, though I did use it in my childhood. However, I was in high school and college during the Jobs era of Mac OS X, and that’s when I started using Macs as my daily drivers from 2006 until 2022 when I retired my 2013 Mac Pro and 2013 MacBook Air and switched back to PCs running Windows.

Me too. I liked opinionated software when my opinions aligned with Apple's, but I found I hated it when they didn't. Around Mavericks or Sierra or so, I started getting more and more annoyed with all the things being changed for the worse (in my opinion). Then during the pandemic I had plenty of free time to dig deep into my computing future, and I moved to KDE, heavily customised just the way I love it. It's a breath of fresh air not having someone else make choices about how I should use my computer. Haven't looked back since.

I had already left iOS because it's so locked down and the hardware too expensive for the specs. That already broke a ton of the "just works" of course.


Prior to the iPhone (but within the years Jobs was in charge), Apple was a company whose target demographic was the professional and semi-professional creative market. Once the iPod and iPhone demonstrated huge sales potential, the company abandoned the creative market and became a consumer-oriented company providing means of media consumption.


The day Xcode runs on iPadOS, the classic Mac is gone.

They already got rid of servers and the workstation market, apparently losing those customers to Windows/Linux wasn't seen as something to worry about.


Between Swift Playgrounds and bitrig, that's not as far off as it once was!


Xcode does a little more than those, but yeah, the seeds are there.


> bazaar of the Linux ecosystem is nothing like the polished cathedral of the Mac

Just stick with KDE no matter the distro, and you will have basically no major day-to-day usability problems on Linux.


Why do you favor Windows over Linux then?


There are proprietary software packages that I rely on that are unavailable for Linux, though I regularly use WSL for development.


As someone whose grandparents endured Jim Crow, I largely agree in the sense that social media did not create America’s divides. Many of the divides in American society are very old and are very deep, with no easy fixes.

Unfortunately algorithmic social media is one of the factors adding fuel to the fire, and I believe it’s fair to say that social media has helped increase polarization by recommending content to its viewers purely based on engagement metrics without any regard for the consequences of pushing such content. It is much easier to whip people into a frenzy this way. Additionally, echo chambers make it harder for people to be exposed to other points of view. Combine this with dismal educational outcomes for many Americans (including a lack of critical thinking skills), our two-party system that aggregates diverse political views into just two options, a first-past-the-post election system that forces people to choose “the lesser of two evils,” and growing economic pain, and these factors create conditions that are ripe for strife.


So there wasn’t enough fuel in the fire when marauding Klansmen were hanging Black people?

It was the current President of the US who led the charge that a Black man running for President wasn’t a “real American” and was a secret Muslim trying to bring Sharia law to the US, and close to half of the US was willing to believe it.

https://www.youtube.com/watch?v=WErjPmFulQ0

This was before social media, in the northern suburbs of Atlanta, where I had a house built in 2016. We didn’t have a problem during the seven years we lived there. But do you think they were “polarized” by social media in the 80s?

That’s just like how police brutality didn’t start with the rise of social media; everyone just has cameras and a platform now.


> Unfortunately algorithmic social media is one of the factors adding fuel to the fire

Saying social media fans the flames is like saying ignorance is bliss. Mainstream media (cable news, radio, newspapers, etc.) only gives us one, largely conservative, viewpoint. If you're lucky, you'll get one carefully controlled opposing viewpoint (out of many!). As you say, our choices are usually between evil and not quite as evil.

Anger is not an unreasonable reaction when you realize this: that other viewpoints exist, that the mainstream media and politicians are not acting in anyone's best interest but their own, and that there really are other options (politically, for news, etc.). Social media is good at bringing these things to light.

There are no easy fixes to the divides you're talking about, but failing to confront them and just giving in to the status quo, or worse, continuing down our current reactionary path, is probably the worst way to approach them.


I remember learning Japanese in the early 2000s and the fun of dealing with multiple encodings for the same language: JIS, Shift-JIS, and EUC. As late as 2011 I had to deal with processing a dataset encoded in EUC in Python 2 for a graduate-level machine learning course, where I worked on a project for segmenting Japanese sentences (Japanese is typically written without spaces between words).

UTF-8 made processing Japanese text much easier! No more needing to manually change encoding options in my browser! No more mojibake!
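For anyone who never had to fight this, here is a minimal sketch (in Python 3, not the Python 2 mentioned above; the sample string is purely illustrative, not from the original dataset) of why juggling JIS variants was such a headache:

    # One Japanese string, three legacy encodings plus UTF-8 (all standard Python codecs).
    text = "こんにちは世界"  # "Hello, world"

    for enc in ("iso2022_jp", "shift_jis", "euc_jp", "utf-8"):
        print(enc, text.encode(enc))

    # Mojibake: bytes written as EUC-JP but decoded as Shift-JIS come out as gibberish.
    garbled = text.encode("euc_jp").decode("shift_jis", errors="replace")
    print(garbled)  # unreadable characters instead of こんにちは世界

With everything standardized on UTF-8, that guessing game in the last two lines simply goes away.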


On the other hand, you now have to deal with the issues of Han unification: https://en.wikipedia.org/wiki/Han_unification#Examples_of_la...
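To make the issue concrete, here is a tiny sketch (Python again, purely illustrative): Han unification means Japanese and Chinese text can share a single code point even when the preferred glyph shapes differ, so the plain text alone cannot tell a renderer which national form to draw.

    # U+76F4 ("直") is a single Unicode code point shared by Japanese and Chinese,
    # even though typical Japanese and Chinese fonts draw it differently.
    ja_word = "正直"  # Japanese: "honesty"
    zh_word = "直接"  # Chinese: "direct"
    print(hex(ord(ja_word[1])), hex(ord(zh_word[0])))  # both 0x76f4
    # The underlying text is identical; only font or locale metadata
    # (e.g. an HTML lang attribute) selects the Japanese or Chinese glyph form.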


I live in Japan and I still receive the random email or work document encoded in Shit-JIS. Mojibake is not as common as it once was, but still a problem.


I'm assuming you misspelled Shift-JIS on purpose because you're sick and tired of dealing with it. If that was an accidental misspelling, it was inspired. :-)

