This is what I miss since serviceable components were removed from MacBooks. There was a time when I'd buy the fastest processor with just okay memory and disk, then, the first time I got a twinge of jealousy about the new machines, buy the maximum Corsair memory they'd guarantee would work and a bigger, faster drive. Boom, another 18 months of useful lifetime.
Yes, but that's the slow-boiled frog syndrome. I use my computers for years as well, and whenever I get a new one I think "wow, why didn't I switch sooner, this is so much snappier".
As a counterpoint, I have a 2015 MacBook, a 2015 iMac, and a recent Apple Silicon MacBook. Of course I do Photoshop, Lightroom, Generative AI, etc. on the Apple Silicon system. But I basically don't care which system I browse the web with and, in fact, the iMac is my usual for video calls and a great deal of my web document creation and the like.
I suspect that people with somewhat older Macs (obviously there's some limit) who find their web browsing intolerably slow probably have something else going on with either their install or their network.
I do some local image generation now and then (mostly using Photoshop). Are you happy now? My only point was that any CPU/GPU-intensive applications I run (and really most local applications) I do on my newish computer. But most stuff I run is in a browser.
The relatively little LLM use I do is in a browser and it doesn't matter which computer I'm doing it on.
I’ve been a Mac user since 2003 or so and I can confidently say my machines last 6-7 years as daily drivers then sunset over 2-3 years when I get a new computer. I always go tower, laptop, tower, laptop. They have a nice overlap for a few years that serves me well.
Amateur… I am using a 2009 15" MacBook Pro Unibody, with the SuperDrive swapped for an SSD, another SSD as the main drive, and the RAM boosted to 8 GB. OpenCore Legacy lets it run a relatively recent version of macOS. The only really annoying things are that the webcam doesn't work anymore and one USB port is also dead.
So sad that this kind of shenanigan isn't possible anymore.
Pfah, showoff. My 2005 Thinkpad T42p crawls circles around that thing - slowly. Maxed out to 2GB, Intel 120GB SSD with a PATA->SATA adapter (just fits if you remove some useless bits from the lid) and - what keeps this machine around - a glorious keyboard and 1600x1200 display. It even gets several hours on the battery so what more could you want?
I have one of these, a MacBook Pro 6,2, that I did the same upgrades to. However, I finally decided to retire it when the second replacement battery swelled and Chrome stopped supporting macOS 10.13.
It didn't look like a good candidate for OpenCore Legacy because of the dual video cards, but it feels so gross recycling a perfectly working computer.
I find that a lot of my work is "remote" at this point. I'm doing most things on servers, VMs, and containers on other boxes. The few apps that I do run locally are suffering (the browser being the big offender).
Is most of what you're doing remote? Do you have a decent amount of RAM in that Air?
No, most of the work I do is local, but it's fairly easy stuff: some statistical software, Excel, Word, a browser. And my browser isn't suffering that much, perhaps because I have 8 GB of RAM and I visit simple websites. Using an ad blocker is fundamental, though.
I have an Air from 2011 or 2012 that is out of storage with just the OS installed. I can't update it or install any other software because the most recent update it took maxed out the storage. Low-end Windows laptops (the $150-$300-at-Walmart type) have the same issue: 32 GB of storage, Windows takes 80% of the space, and you can no longer fit a Windows update on it.
I still have the Air with whatever macOS it's stuck on, but as soon as I have a minute I'm going to try to get Linux or BSD on it. I'm still sore at how little use I got out of that machine, and I got it "open box, scratch and dent," so it was around $500 with tax. I got triple the usage out of a 2009-ish Eee PC (netbook).
Controversial counterpoint: Having standardised hardware causes optimisation.
What do I mean?
In game development, people often argue that game consoles hold back PC games. This is true to a point, because more time is spent optimising at the cost of features, but optimising for consoles also means PC players reap the benefit of a decent performance baseline even on low-end hardware.
Right now I am developing a game for PC, and my dev team are happy to set the system requirements at an 11th-generation i7 and a 40-series (4070 or higher) graphics card. Obviously that makes our target demographic very narrow, but from their perspective the game runs, so why would I be upset?
For over a decade memory was so cheap that most people ended up maxing out their systems; the result is that every program is Electron.
Then, for the last ten years, memory started to be constrained, and suddenly a lot of Electron became less shitty (it's still shitty), and memory requirements were something you could tell at least some companies were working to reduce (or at least not increase).
Now we get faster CPUs, the constraint is gone, and since the M-series chips came out I'm certain that software which used to be usable on Intel Macs is becoming slower and slower. Especially the Electron stuff, which seems to perform especially well on M-series chips.