Software quality has seriously declined across the board, from Spotify to Slack to core operating systems like Windows and macOS. I think a major factor is corporate culture, which largely ignores software quality. Another issue is that many engineers lack a solid understanding of CS fundamentals like performance, security, and reliability (perhaps this is why many are not able to solve basic algorithmic questions involving linked lists or binary trees during interviews).
I've seen code written by so-called "senior" engineers that should never have made it past review; had they simply paid attention in their CS 101 courses, it wouldn't exist.
On top of that, as long as poor software quality doesn’t hurt a company's bottom line, why would executives care if their app takes 20 seconds to load?
Consumers have become desensitized to bloat, and regulators remain asleep at the wheel.
There are plenty of us who would love to just sit and fix things all day, but then you get a poor performance review for not shipping new features and find yourself out of a job :)
That's not how Apple works. You'd be given requirements specific to your team and expected to implement them. End of story. You wouldn't be empowered to seek out other teams and fix their stuff (or even necessarily talk to them). It's deliberate and intentional to have very few people with that cross-functional power.
You're right and not right. There were the infrequent occasions when an engineer would be tracking down a problem they were having and end up in another team's framework/code. A Radar would be created, a polite code diff attached and, often, the team would take the patch and roll it into the next build.
I thought this once; it's a disappointing experience. You'll hear all the right things, and 3 years in, realize nothing you do matters to anyone, and that's because all the managers who were so excited about your passion for software quality haven't met with you in 2 years. And then it clicks: they got promoted by knowing the game. Features, resembling the rushed planning deck, delivered yearly. (This is a whole lot easier to whine about after banking the salary for 7 years, of course)
You know, money fucked up Apple. When I started there (1995) no one came to Apple as a "career move". Everyone there was passionate about the machine, the code, the UI.
100%. You nailed it. Very heartening to hear this; I've always been unsure of my own analysis, as it was limited to one corp in one era.
Vastly different circumstances (shitty state school x print design gig => 2008 philosophy dropout waiting tables => built a point of sale app startup, iPhone OS 2 => sold it => 2016 Google).
I had a nigh-religious appreciation for the things I learned from the culture of roughly that era. folklore.org type stuff. Grew up on Gruber. And learning from so many of your cohort on Twitter. It took me from a waiter to an engineer.
I ended up being ashamed to mention stuff I learned from it; people were rarely in touch with the culture as I understood it. Many soulless vampires afoot once the money comes in.
I'll never forget asking a (wonderful!) colleague why they wanted to work on Team X, expecting "I'm really passionate about [form factor] because [use case]" or "Well, given $LOCATION, my options were [Google Cloud | this team | Google Play Books]"...instead it was "well, coming out of $IVY_LEAGUE with $STEM_MAJOR, my best options were finance or programming, and finance seemed worse"
I had far too many out-of-left-field interactions like that. And it poisons the place in many ways that, to me, ultimately damn it to mediocrity.
Honestly, I don't think it's a culture thing or a CS fundamentals thing.
I think it's the fact that software is 100x, or maybe even 1,000x, more complex than it was just 25 years ago.
Apps are built on so many libraries and frameworks and packages that an average application will have 100x the amount of code. And they're all necessary. A typical phone app is 200 MB, when Word 4.0 was less than 1 MB.
But it's not just the sheer number of lines of code that can have bugs. It's the conceptual complexity of modern software. An app has to have an interface that works with the mouse and with touch, large screens and small screens, regular resolution and retina, light mode and dark mode, it works offline and online with sync functionality, it works in 20 different languages, it works with accessibility, it works with a physical keyboard and an on-screen keyboard, over mobile data and over WiFi, it works with cloud storage and local storage, it goes on and on.
There are so many combinations of states to reason about. When I was building software for Win32 back in 1995, you worried about screen sizes and color depths. That was about it.
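A rough back-of-the-envelope sketch of that combinatorial point (the axes and variants below are illustrative, not any real app's feature matrix):

    # Illustrative sketch only: hypothetical axes a modern app has to handle.
    # The number of UI states to reason about is the product of the variant
    # counts along each axis.
    from math import prod

    axes = {
        "input":        ["mouse", "touch"],
        "screen size":  ["large", "small"],
        "resolution":   ["regular", "retina"],
        "appearance":   ["light mode", "dark mode"],
        "connectivity": ["offline", "online"],
        "keyboard":     ["physical", "on-screen"],
        "network":      ["mobile data", "wifi"],
        "storage":      ["cloud", "local"],
    }

    combinations = prod(len(variants) for variants in axes.values())
    print(combinations)  # 2**8 = 256, before counting languages or accessibility modes

Multiply in the 20 languages and you're past 5,000 combinations; the 1995 Win32 version of that table would have had a handful of rows.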
Software's just gotten incredibly complex and there's more to reason about and software quality is just going to suffer. Like, I love Dark Mode at night, but at the same time I can't help but wonder what bugs would have gotten fixed if engineering resources hadn't gone into, and continue to go into, Dark Mode for each app.
> And they're all necessary. A typical phone app is 200 MB, when Word 4.0 was less than 1 MB.
On native platforms, no it’s not.
I know this for a fact because I maintain moderately complex, functional phone apps that have binary sizes that sit below the 30MB mark. I use multiple desktop and mobile apps from other developers that also match this description.
The cause of the bloat there can be attributed to the following things, mostly:
- Apps including gobs of poorly optimized analytics/marketing garbage
- Third party libraries unnecessarily including their own gobs of poorly optimized analytics/marketing garbage
- Apps being wrappers of a web tech stack project built by devs who have zero dependency discipline, resulting in a spiral fractal tree of dependencies that takes up way more space than it needs to
Engineers who care about good engineering are pretty much a thing of the past. Today the game is buffing your resume with as many complex tools as possible, and jumping to the next thing before your pile of complexity crumbles under its own weight.
The reason everything is built on millions of layers is not because it is actually required, but because we have invested a whole lot of time in building frameworks so that mediocre programmers can get fast results.
I would call it a culture issue: we aren't able to separate the places where this is fine (new, interesting apps are great; I want as many as possible) from the places where it's destructive.
I would be happy if none of the ways I interact with an OS had changed since Windows 7, as long as it had gotten faster, more secure, and more resilient.
MacOS had more screen sizes to target in 2011 than the iPad does today; in any case, Apple has always tolerated iPad apps that are blown-up phone apps. Mouse support for iPad apps existed as an accessibility feature before it was deemed a core feature, and even that isn't any kind of technological leap; mouse support has been part of Android for at least 15 years now.
None of this really explains the sloppiness and unfocused nature of Apple software, which had been best-in-class until recently.
Except those iPad apps also have to have a Web app now, and if you don't have a custom MacOS app your iPad app has to look good when run in MacOS. You then have to support all iPhone models. But also maybe Windows and probably Android. 25 years ago you could slap "IBM PC Compatible" on software and basically design for like 5 color depths and maybe a few resolutions.
Update cycles were on the order of a year, not a week (which also means all new features need to be ported to all the platforms above in that timeframe). Not even mentioning the backend infrastructure and maintenance to run and sync all of these when 25 years ago everything was on your local hard drive and maybe a floppy disk or CD-R.
I lean toward "culture" as the problem. Although, allowing for your 100x or 1000x complexity, how much of that complexity is from feature pile-on?
I imagine putting AirPlay in the software stack, just as an example, caused code perturbations all over the place. Sidecar feels like another disruptive feature. Never mind Catalyst, juggling Swift and C binaries, SwiftUI...
This stuff Apple brought upon themselves. I'm sure there will be plenty of opinions, though, as to whether some of these were worth the cost of disruption and ongoing maintenance.
I agree. The frameworks and tools we use are so complicated, but we’re also so tied to the complexity that it’s pretty much an anti-pattern to go outside the framework/toolkit.
I haven’t fully thought this idea out, but I’ve been feeling it recently.
I see these trends as negatively impacting app quality.
"User pain" as a euphemism for "lowest common denominator", apathetic / fear-driven development.
Like playing the Vulcan game of Kal Toh where you remove rods unintelligently and still believe your constructed structure (the app) is fully coherent, when instead it dissolves into uselessness.
I’d like to hear from the HN comments. Does anyone here work for a modern and popular software company (something I might have used recently) and think that the software they make is really and truly bulletproof? Like no backlog of hundreds of unfixed bugs and polish items that won’t stop growing?
I don’t think I have met anyone who works at one of those places yet. I’d like to.
Except none of Apple's stuff is Electron-based, and as far as I am aware Apple salaries are competitive with top companies - so none of your arguments really hold up.
Apple software is still top tier when you start comparing it to Slack, Teams, and all the other non-native friends. Apple Music does not take anywhere close to 1 GB of RAM. There are very few native applications these days because of the cost, and the lower cost comes from the web stack and its lower skill barrier to entry.
MacOS may have bugs, but in general it is well engineered, starting with the Secure Enclave that none of the competitors have, or just the raw performance and battery life that isn't only hardware. I haven't seen a single bug on my Watch for over a year. I guess it depends on what you use.
Most of the bugs we see these days originate from the choices behind the tech stacks. Python and pure JavaScript are still too popular. Every post here with Rust's name on it gets attention because of its promise of some level of stability and a reduced resource footprint.
Dev at Apple seems immensely political and corporate. From the outside it looks very much as if there are points for shipping $new_thing, even if no one out there wants $new_thing.
The whole marketing cycle is based on an endless stream of $new_things that give Tim Cook something to talk about during those slick presentations - presumably so he doesn't spend all his time making prayer gestures and talking slowly.
There doesn't seem to be anyone in charge of overall user experience who can say "Why does Facetime get so confused by different numbers and devices owned by one person? Why does the shared clipboard only work some of the time? Did the Settings app really need a new UI? Why hasn't Finder been updated since forever?"
> Why does the shared clipboard only work some of the time? Did the Settings app really need a new UI? Why hasn't Finder been updated since forever?"
These are like the smallest of small annoyances.
Compare Facetime and Zoom, for example. The issues are in a completely different universe.
Zoom has a new RCE almost every month. They just don't issue CVEs for them because they can be mitigated by a server update.
Web-based apps definitely lose when you compare RAM use, and probably also when you look at the average app installation size. Spotify.app filling half a CD is absolutely bonkers. But these are also the easiest two metrics to measure, and that makes it tempting to look at a huge trend in software quality and reduce it to "Chromium eats RAM".
It is much harder to quantify how many bugs or delays one encounters in a single day of using macOS/iOS compared to earlier versions, or in native apps vs web apps, and so it never happens.