The Sad Saga of the 500 MHz Power Mac G4 (512pixels.net)
113 points by ingve on Nov 9, 2022 | 135 comments


Something I'd like to reminisce on for a second is the absolutely amazing combination of form and function in the G3 and G4 standard tower case, in my opinion. It was a machine that looked sleek and futuristic while also being 'made for work'. They were heavy beasts, but they had the integrated handles. And the hardware layout inside... the side panel mounted the motherboard and folded out with no tools, just a pull on the latch... and the PSU and hard drive cables were all routed to run along the hinge, allowing the fold-out to be so clean and presenting the whole board so perfectly. The closest modern equivalent is something like an HP Z8 series (I now own a Z840 as a workstation), but with GPUs and CPU coolers being so massive, it just cannot be as elegantly packaged as those simpler PPC machines.


"We think we've got the most incredible access story in the business, and you know what it's called? It's called 'a door'." https://youtu.be/XzbFgzgy1mc?t=2546


I had forgotten how awesome the G3 towers look. I've got G4 and G5 towers, but not the G3... guess I'd better keep an eye out for a nice deal on one! My G4s were $5 and free, respectively, so hopefully I can be similarly lucky with a G3, hahah


I still regret selling my B&W G3 (with upgraded 450MHz G4).


No doubt... they were very unique, more so than the G4s (IMO). Still haven't come across a good deal on one, but surely one day I shall! ... hopefully! haha :)


I also have fond memories of the G4 case for this reason - it was a breath of fresh air after so many years of dealing with cumbersome Whiteboxen.


Beige...


Platinum since 1986.


I still have my G4 tower... mostly because it is too heavy to get rid of


Your comment cracked me up!

I still have one too. It was pretty to look at, but probably the worst new Mac I've ever owned. Within a year it was obnoxiously loud, and it was pretty much obsolete after just a couple of years.

In contrast, I bought a new late '09 Mac mini that I just replaced with a used late 2014 Mac mini about 2 months ago. That '09 was a real workhorse for a lot of years.

The '14 only cost me $330 at OWC with 16GB RAM and is running the latest versions of the software I use. The only software I needed to pay to update was BBEdit, and that only cost me around $35. I think I paid around $1,200 for that G4 "Silver Door" tower.


I used to do tech support for a small ISP, and we had a Power Mac G5 in our test lab. I would open it up and show everyone how beautiful it was.


That and the Sun Ultra 45 were very nice designs.


But that was Apple's FU: pay us a premium because we are good at aesthetics.

Those machines sucked -- look at SGI: their cases were stellar in design even if the guts were mediocre RISC and a difficult OS... and they were like $25K for a desktop and $500K for a rack-sized server.

When we built the Presidio complex for Lucas (I was the data center designer), they were throwing out $500,000 SGI racks and even turned a few of them into kegerators... SGI had the BEST physical case designs... (I suppose that's why the tech museum is on the SGI campus)


> But that was Apple's FU

Let's be realistic, the actual "FU" didn't come until they announced the G4 cube and redesigned iMac.


As someone who considers the G4 iMac to be the single most beautiful piece of computing technology - perhaps any technology - I've ever seen (I still keep two operating at home), I just have to say that if it was some sort of mistake, it was the best one they ever made.


If it ever dies here’s a project to consider: https://appleinsider.com/articles/21/10/07/ios-developer-tur...


I was just talking about price per compute... It's fine that they made a piece of art... but that's what you paid for, not a machine. I have tons of machines; two of my current laptops are flagship OMEN gaming machines at $3,000 apiece. Flimsy plastic bullshit.


I had a G4 Cube and it was also very nice, except I think the power supply was on the power cord (a brick), if I recall.


I have a lovely G4 Cube here at home still. The power supply is a strange and large external brick, yes.


Ironically, HP has a machine called the G4 in the Z8 lineup, though it uses a Xeon chip.

Btw, I'm surprised you like the Z8 series. The single-core performance, and even the total compute summed across cores, looks like a terrible value (e.g. when looking at various Xeon benchmarks) - i.e. thousands of dollars for a machine with a chip that appears far less powerful than the latest i7s or Ryzens.

Curious what I’m missing.


My Z840 is a few years old... almost 8 years old, actually. It is a dual Xeon with 32 cores and 256GB of ECC RAM, which is important for the computational fluid dynamics software I use. Brand new, this machine would probably have been $20k... I purchased it for just about $2,000, and that was years ago; they can be had for much less now. These kinds of workstations for CAD, CFD, and FEA engineering work are a great value purchased second-hand compared to buying new.


I have a similar Z840, but with the crappiest video card I could buy, stuffed with 512GB of RAM. It sits in my basement and hosts some backend software that I wrote for the company I own. Amazingly quiet as well. Bought it for peanuts at a specialized store close to my place; they do nothing but refurbish and sell servers and workstations. I bought mine for about 3,000 Canadian pesos. Has worked like a charm for a few years already.


(Multiple) GPUs, lots of cores (some Z8s are dual-proc too), lots of opportunity for NVMe, and loaaads of memory (with ECC) is what I like in Z8 workstations.


The Z8 is a great machine. I still use mine even though I also have an M1 and the latest Intel. That thing is stuffed full of SSDs. It has 2 PCIe x8 slots above the CPUs, without rear panel cutouts, that accept riser boards for M.2 SSDs. The C622 chipset has 2x 10Gb Ethernet controllers built in, so while the 10Gb ports are optional, they don't take up a PCIe slot. The internal air duct and all the cabling are well designed. Compared to the whitebox PC that I built myself, it's incredibly clean and integrated.


True story - a bike shop I worked for actually ran all of our POS software off of G3s for years. We had them stuffed under the counter with those G3 monitors, keyboards, and mice.

The look on people's faces when they came in and realized none of our stuff ran on Microsoft gave our shops street cred in the cycling and mountain-biking communities.


I was still using/building PCs at the time. I waited years hoping someone would build an ATX version of that case, but it never happened for some reason.

Patents, cost, who knows. But I would have bought it.


I had a Pentium III in a G3 Power Mac case. If I recall, the whole I/O area just unscrewed; then you just had to mount standoffs and use a CPU cooler that fit under the PSU.

Today, with a 3D printer, the mod could be done very elegantly.


I always liked the original G3 in particular; the bold blue colour and the huge "G3" on the side really make it stand out. Then they toned it down for the G4 (grey, no big letters), and did weird, not-quite-successful evolutions like the Mirrored Drive Doors model that didn't improve upon the original.


I still have two of the PCI G4 models at work, mothballed but not yet evicted in case I ever need the data on them! (I also have a G3 which did many years' service as an imagesetter RIP!)

I never particularly liked the aesthetics of the tower case but the practicality of the design is nothing short of breathtaking.


Some of the datacenter-oriented servers you can find have similar tool-free parts.


> The closest modern equivalent is something like a HP Z8 series

What about the new Mac Pro?


There is a new Mac Pro?


From 2019?


Yeah. Time flies.


It's still a sexy beast. Looks modern even 23 years later.


Completely subjective. I thought it looked dated at the time.


Strong disagree. I loved Apple’s styles in this era before everything went Snow and Titanium. Those old CRT iMacs are just so great.


Would bet there are still quite a few of these to be found in recording studios across the world.


The iMac G4's non-tower form is probably the best-looking Mac of any, imho. Peak design.


All I remember of this early period was just how much OS9 would crash when using "Professional" applications, making any performance increase in the Mac almost pointless.

I took a "fun" elective course in electronic media my senior year of school. It was Photoshop, video editing on VHS + digital video editing, and electronic music production. All my CS work was on IBM RS6000s or SGI Indy & O2s at that point. I had a homebuilt Pentium desktop machine dual booting Linux & Windows NT that I used for my work and a 486 laptop running Linux that was getting long in the tooth by 1999. I was very much used to computers not crashing by that point, NT was pretty good, Linux and the Sun/IBM/SGI workstations would stay up for a year no problem as security updates requiring reboots weren't coming very quickly yet.

I vividly remember greatly preferring to do all the video editing on the analog VHS editing stations because OS9 was so unbelievably unstable. Maybe the ultra-expensive Avid workstations were less unstable, but I was in a 100-level course and not allowed to use them. We were saving our work after every operation, the Macs were crashing so much.

Of course pretty soon after this OSX started making things right in the world. It's easy to take Mac OS for granted these days!

Heck even an iPhone with iMovie today would run rings around the Avid workstations from 1999!


Back then people would joke that Macintosh was an acronym: Most Applications Crash, If Not The Operating System Hangs


I spent a few years programming for OS8/9 and it was possible to keep things rock solid, but also really easy to fuck things up.

Cooperative multitasking meant that anything going wrong could cause everything to hang. The memory model was only slightly more forgiving, etc.

Which meant some software was fine to run for weeks and other stuff was lucky to make it through an afternoon.


BowelsOfTheMemoryMgr


Classic Mac OS was cursed by its memory management. Decisions that made sense pre-multitasking would later haunt them up through Mac OS 9.[1]

[1]: https://en.wikipedia.org/wiki/Classic_Mac_OS_memory_manageme...


So was W9x, and even W98SE. Install a driver? BSOD. Set something up? BSOD. Sneeze at IE5? BSOD. And so on.


That's not really true. Windows 9x and beyond had real multitasking and virtual memory that worked.


Windows users could run NT4 Workstation which was pretty reliable but Macs had no such option.


True, but DirectX was pretty limited on NT4. And A/V production too, because everything was tailored to W9x.


NT4 didn't allow DMA to video RAM, which is why it was almost useless for video applications and a bunch of earlier emulators and such.


This. Service packs improved things a little (up to 6a), but the damage was done. You could install DX5 unofficially, but that was it.

Meanwhile, Windows 98 supported DirectX versions up to 9.0c.

If anyone had patched it to have at least DX7 support, game compatibility and multimedia on NT4 would have skyrocketed, but it didn't happen.


> All I remember of this early period was just how much OS9 would crash

This just takes me back to working for an Apple reseller and the stream of internal hype around Copland (what was supposed to be OS8) in the mid-90s. It was going to be a microkernel! Memory protection! Pre-emptive multi-tasking! System 7 apps would run in their own microkernel personalities to avoid crashing anything else! You might even be able to run Windows in a separate personality!

Instead it ended with the actual OS8 release which added some theming and 3D effects to System 7.


I used Macs throughout the 90s. System 7 and 8, as they were called.

The worst part of Mac crashes was that they were violent and unpredictable, and would result in an app crash or, more often, a complete system lockup.


We spent hours carefully curating sets of extensions for particular tasks. It helped.


For the uninitiated, System 6-9 had "extensions" and "control panels," each of which could change OS-level behavior. Color display management, screen savers, font managers, TrueType support, Adobe's font support, or a trash can that, when emptied, played Oscar the Grouch's "I love trash" song. You name it, there was an extension for it.

And they frequently stepped on each other's toes, causing the machine to crash.

People had mythologies around which bits would play safely together. Legends that a stable set of extensions existed, never actually discovered.

Losing unsaved work was commonplace.


Casady & Greene published "Conflict Catcher", which would help you figure out working stable sets of extensions, and quickly enable/disable big groups of them. I remember I had a group of extensions that I would reboot into in order to play a particular game (though I don't remember which one) that had high memory requirements, but needed a couple extensions to be able to do network play…


I remember using a 3rd-party control panel that used a kind of bisection method to diagnose bad extensions. It would disable one half of your extensions, reboot, and ask if you could reproduce the problem; you answered yes or no, then (depending on your answer) it would disable the other half (or one quarter), re-enable the first half, and so on, narrowing down until the problematic extension was located.

I don't remember what it was called! Funny that such a bizarre problem could get so bad people developed sophisticated tools to deal with it.

Edit: Ahhh, I think it was Conflict Catcher! Thanks sibling comment!
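
What it was doing is essentially a manual binary search over the extension set. A toy sketch of the idea in shell, assuming a single culprit, with a hypothetical "check" command standing in for the reboot-and-test cycle (real conflicts between pairs of extensions needed fancier handling):

    #!/usr/bin/env bash
    # Toy bisection over a list of extensions. "check" is a hypothetical
    # stand-in for "enable only these extensions, reboot, and see whether
    # the problem reproduces". Assumes exactly one bad extension.
    exts=(ATM QuickTime OpenTransport FileSharing Stickies SoundManager)
    lo=0 hi=${#exts[@]}
    while (( hi - lo > 1 )); do
        mid=$(( (lo + hi) / 2 ))
        if check "${exts[@]:lo:mid-lo}"; then   # reproduces in the first half
            hi=$mid
        else                                    # culprit is in the second half
            lo=$mid
        fi
    done
    echo "suspect extension: ${exts[lo]}"

That's roughly log2(n) reboots instead of n, which mattered a lot when every single test meant restarting the machine.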


It was normal to use Photoshop and have the entire system crash several times a day. The G4s, along with the iMac, had terrible overheating issues, which also caused constant crashes in warmer climates. Running these machines in a room without aircon in the Australian summer was a no-go.


I had a 400MHz model that I hand overclocked to 800MHz by re-soldering the multiplier on the processor. That thing ran perfectly for over 10 years as my main PCI Pro Tools system. 2001-2010. RIP you beautiful grey beast.


That was the era where I remember every radio and audio production studio would start moving systems over to Windows - except, it seemed, there was always one Mac left running Pro Tools, since it could sit quietly in the studio and the licensing was pretty crazy expensive. Plus the audio processing chain on the Mac was a lot better (better latency and easier introspection) than anything on Windows at the time.


> That was the era where I remember every radio and audio production studio would start moving systems over to Windows

This is a myth. I seriously doubt any nationally marketed record was ever tracked and mixed using Pro Tools on Windows; at least, I am unaware of any. When big artists showed interest in Microsoft products, it was not for Windows with Pro Tools, it was for Internet technologies.

Throughout the 90's up to 2000 (the 500MHz G4 "era"), nearly all pro audio was still Mac, but Windows was an alternative in audio by 2001 and it got talked about. Certainly some had built Windows rigs, but they were independent and home-based operations. Few, if any, professional studio enterprises used Windows for tracking and mixing. By 2004, Windows and PCs still offered a slightly cheaper alternative (Digi equipment cost what it cost on either platform), but only new operations and some higher-education communication programs had invested in audio on Windows platforms. No one was replacing and tossing Mac DAWs, ever, because Pro Tools and other audio hardware was expensive and still sounded good; no matter how old it was, it sounded the same as it did when new.

By 2004, Windows still barely had its foot in the door in pro audio, and most pro audio operations were still Mac-centric.[1] By 2011, any competition Windows ever gave Mac in the studio was over, but for some reason the myth that Windows took over audio production from Mac at some point still existed.[2] It was always false.

Certainly, there were Windows PC fanatics who always insisted Mac was overpriced and better rigs could be built on PC, and some of them were engineers who opened their own studios, but in general these were much smaller operations compared to full-size, full-service recording studios. You'd have some little studio with a tracking room and a control room built into a basement or a garage or shed. Any large studio with a full-sized control room and multiple tracking rooms, a couple of smaller control rooms ("Studio B, Studio C") and, you know, a lobby, an artists' lounge, a business office, storage rooms - what you'd consider a "real" professional studio - was always Mac, even if they were still running OS9 and older Digi equipment as late as 2009 and beyond. These systems stayed in place because they kept working, and new ADCs and Pro Tools software licensing was and still is expensive.

It is 2022, and I guarantee you there are full-sized pro studios still using OS9 on G4s with Pro Tools and Digi racks, which will only upgrade to 10-year-old hardware when those things stop working or they can't find replacement ATA hard drives. Mac Pros are crazy expensive, and it is hard for them to get their heads around USB3 and Thunderbolt 2 and not needing PCI anymore, but mostly the reluctance comes from having to reinvest maybe tens of thousands of dollars into Intel-based plugins they already have on PPC, and into new multi-channel digital rack ADC systems. If it were only the Pro Tools licensing, they might have stayed current, but even with bundled plugins and lower-cost hardware, it's hard to abandon things that were already paid for 20 years ago and still work - unless, of course, you're Skywalker Sound.

[1] https://duc.avid.com/showthread.php?t=99826

[2] https://gearspace.com/board/music-computers/609183-most-prof...


When people post pictures of their studios, I still see plenty of G3/G4/G5 towers. Maybe not at the center of the studio, but you definitely see them.


They said except for Pro Tools. The whole point of the comment was to say "except for a Mac running Pro Tools".


Pro Tools had been cross-platform since 1997, with the release of the PCI version of Avid's AudioMedia III card and the Digi 888 interface. By then most other extant DAW software, such as Cubase, had dropped Windows support, while MOTU Digital Performer was Mac-only until about 2015, though I can think of one Windows option that appeared around 1997: Sonic Foundry's Acid, which was Windows-only and reasonably popular.

Regardless, the point here is that there was a choice of OS platform for Pro Tools since 1997, and yet the vast majority of established studios used Macs. The brief window Windows had to infiltrate that space was between 1997 (Mac OS 8, when Apple's flagship OS was already desperate for modernization) and 2005 (Mac OS X 10.4, when the OS was finally both stable and decently performing).

I suspect there were newer small (home) studios at the time that were fully PC, but those engineers would never consider running Macintosh. That's who used Windows... die-hard PC users who subsequently got into audio engineering. But established studios, even those with no actual computers tracking and mixing audio until 2001, almost always went with the Pro Tools interface options running the software on Mac OS 9. Precious few studio installations that already had Macs switched to PC. New studios either invested in PC hardware initially as their first DAWs or they didn't, but I expect most of those that failed as businesses did so for no other reason than that most new businesses fail; it had nothing to do with their computer platform.

So Windows started to appear in pro audio between 1997-2005, but the PC platform just didn't have enough time to ever remotely become dominant. Maybe there were a lot of Windows DAWs out there; they just weren't surrounded by a purpose-built studio generating any revenue to speak of, and among actual ground-up purpose-built studios, Windows installations were a very rare exception. That said, engineers are gear junkies, and even in Mac studios there were bound to be an old PC and a handful of cheap PC laptops for stereo on-location tracking, because they became cheaper and more reliable than a portable digital recorder.


The Windows machines weren't used at all as DAWs, but for everything else - at least in live/broadcast. The automated playback system, the producer comms system, the general on-air PC, etc. are more what I'm talking about.

My point is that the Mac would still be there, doing the main audio production work, but all the auxiliary machines started getting replaced with Windows PCs (usually racked up in a separate room since they had bigger noisier disks and fans).


There's something fun yet practical about holding onto a solid old computer (or car) that's been upgraded well beyond the original specs. Until this year, my main work machine was a 2009 Mac Pro with dual Xeon X5670s (12 cores total), 24GB RAM, an RX 580, flashed to 5,1 firmware and OpenCore'd to run the latest macOS. Yes, there are beefier versions, but this was good enough.

Got an M1 Mac mini and miss a few things. The Mac Pro connectivity is still important. USB-C accessories are never as reliable. Also hate having to use DisplayLink for >2 monitors.


> In order to get one model in the G4 lineup down under $2,000 - and to get it out to customers as soon as possible - Apple placed a 400MHz G4 processor onto a slightly modified version of the blue-and-white G3's logic board and put the board in the new Power Mac G4 case.

One really shitty thing Apple did alongside this was release a firmware update for the actual Blue & White G3 that prevented it from booting with a G4 CPU swapped in - something that worked fine with the original 1.0 firmware: https://web.archive.org/web/20021002071131/http://docs.info....

Luckily the block was easily and quickly undone by the various upgrade manufacturers, but it shouldn't have been there to begin with: https://macintoshgarden.org/apps/powerlogix-g4-firmware-upda...


It happens all the time; for example, AMD B450 boards received PCIe 4.0 compatibility via a firmware update before AMD asked the mainboard manufacturers to pull the feature again.

Or now Nvidia is offering DLSS 3.0 with their new GeForce 4000 cards, although it appears very likely that they could also do it on the older cards. Statements to the contrary seem far-fetched. Nvidia has done this in the past, too (RTX IO comes to mind).


I ordered a G4 450MHz pretty much the day they were announced. After some back and forth with the dealer, I was sent two machines - one with 450MHz as originally announced, and one with the same specs, but only 400MHz as initially shipped. Somehow, my dealer was able to get a few of the initially very rare machines with the original specs, and I was lucky enough to get one. I don't really recall why I had been sent two machines, but I know that I was instructed that this would happen, and I had to look at the specs printed on the box and send back the slower one.


I ordered the 400MHz PCI version (Yikes!) with student pricing on the first day. It was the first computer I owned myself.

I remember getting it with no delay, as well as a t-shirt and two large posters.

I upgraded the memory as soon as I could afford it. I remember using the OS9 RAM disk feature for many programming assignments and getting amazing performance with a nearly silent machine, as it spun down the HD and turned off most fans. It was a great computer, and upgraded to OS X perfectly.


My first Mac was a used 400MHz G4 that I picked up in 2002. At the time it was a spare machine next to my Intel PC running Linux. I put (brand new at the time) Mac OS X 10.1 Puma on it, mainly so I could compare the experience of using the two OSs side by side.

The Mac experience was just so smooth compared to the desktop Linux experience. I've been using Mac exclusively since 2003, and it was all due to that free used G4 tower.


Well... you get what you pay for.


I had one of these machines - I wanna say the 450 MHz one - that I bought in early 2000, c/o the Apple Loan for Students program. The total was $4,000 for everything, including a whopping 17" CRT monitor! Had I paid it off on their schedule, the rig would have cost me $8,000. Way too much computer for a kid who worked part-time as a Perl dev at a web design company making $10/hour while also going to school for art. Now of course my iPhone SE 2020 is far more powerful in every single way.

I do remember being rather annoyed with the speed reduction/price increase. Being a total Apple fanboy, I was deeply entrenched in their fight against, uh, the Man, or Intel, or whatever the marketing told me to be mad at.


"G4 lineup clown"

I assume this was a PDF copy/paste snafu where it should have said "down" not "clown". But it definitely made me do a double-take. (or triple)


The quotes seem to be OCR'd, with 500 becoming 5oo, and punctuation in weird spots too.


I occasionally make that typo along with a few others involving visually similar graphemes.


Back in 2000, at my job, I had a 400 MHz G4 and a dual PIII 500MHz server (running Red Hat 5.2) to work with. Once set up to dual-boot Yellow Dog Linux, the G4 literally ran circles around the dual PIII machine, particularly for anything network-related (the onboard 1 GigE certainly helped).


I remember this fiasco. And then shortly after that, rumors started popping up that Apple was working on porting OSX to x86.


We actually had an Intel build of most of OSX from the beginning. Various parts would go stale and then a motivated engineer would dive in and get it running again on some specific set of audio and video hardware. At some point the existence of the "Marklar" project leaked out, but several of us at Apple had various builds running on commodity PC hardware.


I always figured this had to be the case. OPENSTEP 4.2 ran just fine on x86, why give that up?


Any truth to the 'NRW' rumors, the ones based on the Motorola 88k CPU?


> I remember this fiasco. And then shortly after that, rumors started popping up that Apple was working on porting OSX to x86.

The article concerns the last half of 1999. It turns out Mac OS X development on PPC was shadowed on Intel (after all, NeXTSTEP had run on Intel since 1995), but the rumor didn't appear until nearly 3 years later, in Spring 2002, and it didn't really have any traction; it quickly subsided long before Jobs' announcement confirming the rumor another 3 years after that, in Summer 2005.[1] No one still expected it by then, but at the same time, no one was all that surprised. Business as usual that Apple kept a secret for 3 years, but it's pretty crazy how they somehow suppressed a true rumor for 3 years after it was published.

[1] https://en.wikipedia.org/wiki/MacOS#PowerPC–Intel_transition


Apple really did have to push the (Motorola) G4 way more than they'd have liked. A 1999 G4 was a 400MHz G4 on a 100MHz FSB. By 2003, when they were done with Motorola and onto the IBM G5, they were running a pair of 1.25GHz G4s on a 167MHz FSB...

I'm sure it was possible to get a lot of speed out of workloads that ran out of the on-chip cache, but I'm equally sure that it was easy to wind up constrained by memory bus bandwidth. I tried to convince myself to buy one and switch to OS X, but couldn't get past the architectural limitations. (Which were worse because I'd also have wanted to run Windows under emulation.)


> you won’t be able to play with cool new capabilities like using an AirPort card, internal FireWire devices, two separate USB ports

_Internal FireWire devices_? Did anyone ever actually produce one of these?


It doesn't seem like it:

https://lowendmac.com/tech/internalfw.shtml

I think you could maybe put a storage drive inside the case, or a rail-mount bay. I never heard of anyone using that port.


I recall them coming up in the context of capture cards occasionally. I wasn't on the video production side of the world, so I'm not sure how common they were.


The derisive term “speed dump” evolved out of the term “speed bump” which is what people called it when a vendor shipped a new machine with a higher clocked cpu, but no other changes.


I had a B&W G3, and I remember actually being a little miffed that Apple introduced the G4 version only a few months later. At some point I upgraded the CPU to a G4 anyway (when you could still do that sort of thing on a Mac), so I guess it didn't make a lot of difference.


My PowerPC widescreen PowerBook was the finest product Apple has ever put out.

It was repairable, it had MagSafe, it was beautiful, and it had actual fricken ports.

USB-C pretty much negates the need for ports, except Ethernet, but still…


Apple laptops had an amazing run from 2000-2015, starting with the Titanium PowerBook G4 and ending with the retina MacBook Pro (pre-touchbar).

FYI, MagSafe was introduced with the first Intel MacBook Pros in 2006 (although they had a nearly identical case to the PowerBook G4), so unfortunately no PowerPC notebooks had it.


Maybe it was an Intel then, not a G4… it definitely had MagSafe!


> "The move is in response to Motorola’s delays in reaching volume production of its 500 MHz G4 processor chip, which is now scheduled for availability early next year."

> "A full six months after its original introduction, Apple was finally shipping a 500 MHz Power Mac G4. [...] a lesson in controlling your own technology stack that Apple wouldn’t forget."

I guess this was the start of the PowerPC production (and later, performance) woes that would eventually lead to Apple transitioning to Intel, announced at WWDC 2005.


Still got a 450MHz Sawtooth G4 as a server (with a FireWire RAID). It also broadcasts radio audio from a RADIOshark or a USB flash drive of music, and controls the air conditioner.


We ordered the 500 MHz model when it was first announced. When it didn't come out as originally scheduled, we were offered the choice of continuing to wait or taking a 450 MHz model. We opted to wait. We did eventually get our 500 MHz model, and I used that computer for at least 5 years. It was an awesome machine for the time, even arriving late. It was a great machine for the early versions of Mac OS X and the transition from Mac OS 9 to Mac OS X.


Sad saga ==> High-end CPU delayed six months. :-/


> The Power Mac G4, which features the PowerPC G4 processor with its remarkable Velocity Engine, runs professional applications like Adobe Photoshop over 50 percent faster than 800 MHz Pentium III-based PCs.

Is that any faster than modern versions of Photoshop today?


> Is that any faster than modern versions of Photoshop today?

I think you all may be missing the invisible sarcasm tag here. I am reading this more as a jab at Photoshop for not “feeling” faster on a contemporary multicore GPU-enhanced machine.

Which, honestly, in my distant memory, Photoshop doesn't feel faster today. Oh, I know it objectively is much faster - you can rip through 40+ megapixel images with GPU-powered filters like it's nothing. But whatever hideous custom interface kit Adobe uses makes it feel perceptually pokey.

Like, I just opened the Libraries panel for the first time on this new computer, and there was a noticeable wait as it rendered the "rich tooltip" thing they have. The default New Document interface has lurching slowness as well. The interface just always feels a teeeeny bit behind your actions.


Thank you for your response. Yes, I'm taking a jab at Photoshop's complexity, in the vein of 'what Intel gives, Microsoft takes away'. I carefully considered adding \s, but I'm genuinely curious to know if Photoshop feels faster or slower today (today's computers running today's Photoshop) versus 1999's computers running 1999's Photoshop. It seems about the same?


That Photoshop was probably CS2, IIRC. I'm fairly sure that software bloat has eaten all the speed gains from faster processors. I have an old 1.33 GHz PowerBook which runs 2004 Word as fast as my M1 MacBook runs Office today. If I put in an SSD it might even be faster.


In 2001 I was using a G4 with Mac OS 9.2 for Photoshop while working at a studio; that was Photoshop version 7, which is pre-CS (which started with 8.0). My memory of the workflow at that time was that most system operations were snappy as hell. Sure, there was a bit of load time for very large files, and some operations had a bit of lag, but there really came a point around that time when the speed of work you could do for most operations essentially matched the speed of the machine. Before that era, I remember going to use the cloning tool on a large, high-DPI image and getting a lag as you dragged the brush over a spot... waiting for it to catch up. The G4, especially when we upgraded one with more RAM and OS X (which was also a huge change that happened to coincide with the G4), is when I remember machines getting to the point where I wasn't waiting on lag all the time while working.


I used both a 1GHz iMac G4 and a 2GHz iMac G5 (ALS model) in the 2000s for a lot of Photoshop work and remember them performing similarly. Both were excellent machines for that particular use case and zipped through anything I threw at them in PS, especially the G5, which I put 2GB of RAM in.

The PS versions I used were chiefly 7.0 and CS1, with some CS2 mixed in towards the end of the G5’s lifespan. I miss those versions so much because they sat at nearly a perfect nexus of features, responsiveness, and resource consumption. Such a stark contrast to PS CC.


> If I put in an SSD it might even be faster.

I think most of the retro YouTubers I've seen have done this mod, along with PRAM battery replacements that don't leak. They seem to use M.2 SATA drives with 44-pin adapters like this (not recommending, just showing an example): https://www.amazon.com/44pin-Converter-Adapter-Computer-Acce...


OpenBSD for macppc would run perfectly fine and fast enough on that.

Not for JS-heavy sites, but SeaMonkey with uBlock Origin Legacy would do a great job.

For YouTube, MPlayer and yt-dlp would do the job. SMPlayer + the "open URL" dialog for graphical folks.
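
Something like this pipe usually does it (the format cap, cache size, and URL are just illustrative; on a G4 you'd want to keep the resolution modest):

    # placeholder URL; cap the stream at 480p and pipe it straight into mplayer
    yt-dlp -f 'best[height<=480]' -o - 'https://www.youtube.com/watch?v=...' | mplayer -cache 8192 -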


Do you think the G4 would have enough juice to do software decoding of h264 in real time? (And is ffmpeg optimized enough for a relatively obscure platform like macppc? E.g., does it actually use AltiVec?)


ffmpeg has dedicated code for decoding h264 on PPC, as well as all the support code for rescaling, YUV->RGB conversion, etc. It's probably not going to be able to decode super ultra duper high-bitrate h264, but I would expect it to be able to decode 24fps 720p at standard bitrates without breaking a sweat, and it probably ought to be able to handle 30fps 1080p.

https://github.com/FFmpeg/FFmpeg/tree/master/libavcodec/ppc

https://github.com/FFmpeg/FFmpeg/tree/master/libswscale/ppc
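
If anyone wants to check on real hardware, ffmpeg's -benchmark option with the null muxer gives a quick read on pure decode throughput (the input filename is a placeholder); if the run finishes faster than the clip's duration, real-time playback should be fine:

    # decode only, discard the output, print CPU time used
    ffmpeg -benchmark -i sample-720p24.mkv -f null -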


IDK about ffmpeg, but MPlayer definitely had AltiVec support from the G4 days.

Also, pair it with a good video card with XV/XvMC support.

Also, for the last drip of performance:

    mplayer -lavdopts skiploopfilter=all file.mkv

or

    mplayer -lavdopts skiploopfilter=nonref file.mkv


MPlayer primarily uses libavcodec (the -lavd in your command line refers to it) for decoding, which is part of ffmpeg: https://ffmpeg.org/libavcodec.html

There was a lot of overlap in MPlayer and ffmpeg in the early days. Things like both projects sharing the same infrastructure, MPlayer developers automatically getting commit access to ffmpeg, ffmpeg sources having references to MPlayer, several top developers active in both, etc.

It's to the point that I think of ffmpeg as a spin-off of MPlayer, but perhaps that's not fair. Maybe it's more accurate to say that several prominent MPlayer developers (and infrastructure folks) did double-duty for years, then many drifted over to working on ffmpeg exclusively. I know that's true for several lead folks. I'd have to go through the ancient mailing list archives to sharpen my recollection of events. Perhaps somebody else who was also around at the time recalls a clearer perspective.


IDK, but before the ffmpeg(1) tool everyone used MEncoder (and transcode(1)) to transcode media.

DVD ripping used transcode and/or mencoder.


mplayer uses ffmpeg to decode h264.


> Is that any faster than modern versions of Photoshop today?

I seriously doubt that. Consider that Velocity Engine is another marketing term for AltiVec, PowerPC's SIMD instruction set. The Pentium III only supported SSE, which was a lot more basic in terms of capabilities.

But SSE was extended a lot by Intel with the succeeding processors (SSE2, SSE3, SSSE3, SSE4 (SSE4.1, SSE4.2)), and supplemented twice by a more capable SIMD instruction set (AVX, AVX-512). By the way: SSE was intended to replace the older MMX SIMD instruction set (and to replace the x87 FPU with a modern floating-point instruction set). So, if we consider AVX-512 to be the "sensible" instruction set for SIMD on x86, Intel needed four attempts to get it right. ;-)

So, I really doubt that Velocity Engine/AltiVec can compete with modern x86 SIMD units. But it is very easy to admit that AltiVec's SIMD instruction set design is much more elegant and orthogonal than modern x86 SIMD (say, AVX or AVX-512).


Only certain operations were sped up by the vector extensions. People on PC forums would post their "jobsmarks" showing they beat Apple; the G4 was sort of a disaster. The replies here are ranting about general software bloat, but I'd bet the image-processing parts of Photoshop are still highly optimized.


Adobe has made more things multi-threaded over the years, so probably not. ARM has SVE too, which I assume is similar to the "Velocity Engine".


Velocity Engine was a marketing name for Altivec (which was also a marketing name, just IBM's instead of Apple's). Altivec was roughly comparable to Arm's NEON (32 128-bit registers, with packed 32-bit float operations making up the largest part of the ISA extension), but with a few key differences. Altivec had a much richer set of basic integer operations than NEON (VPERM, you were my best friend), but NEON supported packed 64-bit double-precision floats. In general, as a developer, Altivec was much more fun to work with, and I'd expect that for Photoshop it was probably the more powerful vector ISA as well.

SVE is a successor to NEON that, besides supporting flexible vector sizes, reflects another couple decades of learnings about vector ISA design and a few orders of magnitude more available transistors.


Altivec was Motorola's trademark. IBM worked on it too, and called it VMX, but didn't make chips using it until the 970/G5.

Often lost in the noise is that Apple was the most visible, but not the only, customer for Motorola's chips (witness all the variants that never made it into a Mac). Of course, Jobs liked to act as if others didn't exist and Motorola was Apple's own CPU department. I wish someone with knowledge would describe how or why IBM decided it was a good idea to sell silicon to him. It did prepare them a bit for the Cell partnership with Sony and Toshiba, but there's probably more to the story than that.


Altivec did magic with MPlayer. 720p video on a G4? Totally doable.


Altivec was just fun to work in. Much like writing PPC assembly in general, it felt like a civilized toolbox with a good set of hand tools -- you have your socket wrench of each size (byte, short, int), and you always know what to reach for -- but then it came with just a couple of power tools -- vperm and vsel were basically your electric drill and impact wrench, and every once in a while you'd realize that instead of doing a perfectly good job banging out the result with your hand tools, you could just finish the damn thing right now with the power tools. And when you could, you ended up with code that was shorter, simpler, faster, and easier to write. So rewarding.


GameCube and Wii developers knew lots of tricks for that, sure.


The GameCube/Wii vector instruction set was not Altivec, and not very similar. It used 64-bit vectors (usually packed as two 32-bit floats), so half the size of Altivec, NEON, and most of its contemporaries. These packed vector instructions shared the register file with the floating-point unit (that is, an FP register could contain either a 64-bit float acted on by a normal floating-point instruction, or two 32-bit floats acted on by a vector instruction), rather than having a dedicated separate register file like Altivec. And most of the fun instructions had no equivalent.


500MHz was really slow in '99, wasn't it?

I had my first computer either then or in 2000: 10GB hard drive, 64MB of RAM, and 600MHz. It was the cheapest one they had.

EDIT: Maybe my memory is hazy and it was more like 300MHz.


Clock speeds are not directly comparable as measures of CPU performance between different architectures. Intel in particular focused on increasing clock speed with little regard to performance during that time period.

The G4 had a powerful (for the time) vector processor, which is relevant to media production. It was not slow when used for that purpose with software that took advantage of the vector unit. Apple's marketing cherry-picked benchmarks to claim that the G4 was much faster than Intel chips with higher clock speeds, but the reality was somewhere in the middle.


> Intel in particular focused on increasing clock speed with little regard to performance during that time period.

That wasn't really true in 1999, when Intel was still using the good P6 microarchitecture in the Pentium ///. The NetBurst stupidity didn't start until later.


The G4 at 500MHz was much better than the PIII.


How well did that hold up as the CPU clock rates went up and the bus speed really didn't? I'd always assumed it was an issue, but never did have metrics to bear that out.


Exactly like the M1 now


PPCs ran hot and sucked power. Apple's machines at release are ordinarily the most performant machines available to consumers, but that lead never lasts long. The G4 at each new release was indeed more performant than the current Intel machines, but only for a few months. Remember, the G4 was considered a "supercomputer," and the US government banned its export, which hurt Apple, though they tried to make hay with it in marketing; by the end of 1999 they were desperately trying to get that ban lifted.

There still seems to be some confusion about Apple Silicon and the M1. When the M1 was released, it wasn't the fastest processor. You could actually build a faster Intel machine in 2020, but it would cost a lot more money and use a lot more power. What was and is special about the M1 is how little power it uses for its performance (which is decent but, like all new Apple products, an incremental increase), and how cheap the low-end offering still is. And I'm pretty sure that today you still can't beat the 2020 M1 Mac mini's performance and features on Intel for 700 bucks. What is amazing about the M1 is that it is a lateral move, a platform shift, without sacrificing performance, and it is incrementally faster than the 2018 Intel Mac mini at less than half the price.


But as is often the case with Apple releases, you DO indeed have to sacrifice things. Apple will often use marketing to tell people they don't need features that are standard on other platforms (copy/paste and MMS on the iPhone, more than 8GB of memory on the first M1). So it's hard to call it a lateral move without sacrifice. Granted, Apple always comes around.


The lateral move was with Macs, not iPhones, which were always ARM. And the 2020 M1 Mac mini had two memory configurations, 8GB or 16GB. Obviously the product would not appeal to users who require 64GB of memory, yet most M1 users report the limited RAM doesn't affect performance. The only sacrifice in switching from Intel that I can think of is no longer being able to run x86 VMs at full speed. It's not like there are sweeping sacrifices everywhere; it's more a nitpick or two.


It's good to see some things never change.

The generous interpretation is that Apple knows that their users tend to skew towards "creative" professions like digital audio, video, graphic design etc. and so they pick the benchmarks that those people care about.

You can write the ungenerous interpretation yourself.


It felt much slower than the Dells I would use in the college's computer lab. These things were not performant out of the gate, given that OS 9, well, totally sucked, and OS X itself, released a year+ later, was not heavily optimized, and nothing else ran natively, so you were running everything under the OS 9 environment anyway.

I think people forget the incremental improvements in optimization of the whole stack that were done and punted off as features. There's an old video of Steve Jobs showing off resizing a window in OS X by dragging the corner and marveling that it would do that so fluidly and not tear all over the place like the previous version.


By 2000-2001 OSX was much better than W98 and even WXP for media production.


You may be misremembering the timeline of OS X releases as OS X 10.0 only came out in March of 2001. It took a while for Cocoa apps to mature.


Ah, yes, sorry. 2001-2002 then. XP in 2002 was taking over from W98, and yet OSX felt superior. Ditto with Linux with KDE3, a good video and sound card, and the -rt kernel - 2.4 or 2.6 preferably - with CinePaint and Cinelerra.


It's hard for me to remember, as I didn't do much video back then, and I'm not sure when the Cocoa version of Premiere came out (or when the Final Cuts did either - they seemed far more popular - how things change!). I did a lot of work in PS, and it was certainly better in every way on the G4, but running on OS X was a bit hit/miss as plugins and filters lagged behind. It was really messy for a few years. Not being able to get a fast-ish G4 in a laptop, and never a G5, certainly didn't help. Who would have guessed the jump to Intel would be such a saving grace.

Man, video was so weird back then. Remember needing to go to, like, a special render farm to output your completed Final Cut project? Or local racks of Xserves? Or just a weirdo mishmash local network?


People just ran clusters of Linux boxes as render farms for 3D media, often with CinePaint for cinema work, mixed in with OSX machines.


It wasn't that slow a clock speed at the time of announcement - a smidge behind Intel and AMD, who both had 600+ MHz chips then. But things really heated up, and by March 2000 you could get a 1GHz processor, and of course the P4 was all about clock speed, so things changed rapidly. PPC took a few more years to get to 1GHz.


Tl;dr: Apple announced a .5GHz G4, then for a long time shipped slower ones that were, nonetheless, faster than contemporaneous PIIIs, before finally delivering the announced faster one.



