Well, Apple was a prime Intel client for years until they released the M1, and ARM in the cloud isn't really a thing yet. Ultimately it's all about what will make the most money, and in a datacenter that means x86 with Linux/Unix.
That's a false statement; you'd have to specify which one isn't available on macOS, because there are plenty you can use already. Even FreeCAD runs on macOS.
They're implying that even though Apple is a wealthy consumer hardware company with major spats with Nvidia and Samsung, it doesn't always make economic sense to build the tools they need in-house when they can simply buy them from a rival.
So rather than invest engineering resources in re-imagining the fridge, they can simply buy one from established household-appliance manufacturers like Samsung, Sony, etc.
The point of this thread is that even though Apple makes silly charts about how good their hardware is at ML, they buy the very products those charts say aren't as good.
There is no such hypocrisy if they use Samsung refrigerators.
Training a model requires inference for the forward pass, so even then, for your comment to be relevant you'd need to find a chart where Apple compares inference on quantized models against Nvidia, and no such chart exists.
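To make the forward-pass point concrete, here's a minimal sketch in PyTorch (the toy model, shapes, and hyperparameters are all illustrative, not anyone's actual setup): every training step starts with the exact same forward computation that inference runs, then adds a loss and a backward pass on top.

```python
import torch
import torch.nn as nn

# Toy model and data, purely illustrative.
model = nn.Linear(128, 10)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(32, 128)           # dummy input batch
y = torch.randint(0, 10, (32,))    # dummy labels

logits = model(x)          # forward propagation: identical to inference
loss = loss_fn(logits, y)
loss.backward()            # training adds backprop on top of the forward pass
optimizer.step()
```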
...and doing either of those things with CUDA is impossible on a Mac. Why? Because Apple burned their bridge with Nvidia and threw a temper tantrum, that's why. Now Nvidia can't support macOS even if they wanted to.
That's kinda the point of my original comment. Apple claims to know what's best, but they contradict themselves through their own actions. We wouldn't be in awkward situations like this if Apple didn't staunchly box out competitors and force customers to either follow them or abandon the ecosystem. It's almost vindicating for people like me, who left macOS because of these pointless decisions.
I feel like Apple is only testing the waters with AI right now, but perhaps if they get involved enough they'll spend money on their own compute infrastructure? Nvidia is kind of the king of GPU compute right now, and developing comparable hardware is no small or cheap task, but I think Apple is in a very good position to make it work, if they decide to invest in it. But honestly, as far as corporate feuds go, I feel like companies will happily suck it up if it makes some process cheaper and/or easier.
> But honestly, as far as corporate feuds go, I feel like companies will happily suck it up if it makes some process cheaper and/or easier.
That’s what I think is going on. Apple hated being on the hook for Nvidia’s terrible drivers and chipset/heat problems that ended up causing a ton of warranty repairs.
In this case they’re not a partner, they’re just a normal customer like everyone else. And if Intel comes out with a better AI training card tomorrow Apple can switch over without any worry.
They’re not at the mercy of Nvidia like they were with graphics chips. They’re just choosing (what I assume to be) the best off the shelf hardware for what they need.
Apple silicon is good, but it's designed for portables. Even the Studio and Mac Pro are just laptop chips stitched together. They gotta use heavy-duty gear to do heavy-duty shit. I know they have a soured relationship with Nvidia though, so I'd like to see them bolster the AMD/ROCm ecosystem. Chances are they're working on their own stuff here too. They're sitting on billions of dollars of liquid cash, so I'd imagine they're pouring some of it into serious R&D.
Dependent is a strong word. At the end of the day, all these DL models run on any hardware, and you can easily swap one type of hardware for another, perhaps with some small performance impact. They're commodities, basically.
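As a minimal sketch of that swappability, assuming PyTorch (the toy model and shapes are purely illustrative), the same model code runs unchanged on whichever backend happens to be present:

```python
import torch

# Pick whichever backend is present; the rest of the code is unchanged.
if torch.cuda.is_available():             # Nvidia GPU
    device = "cuda"
elif torch.backends.mps.is_available():   # Apple silicon GPU
    device = "mps"
else:
    device = "cpu"

model = torch.nn.Linear(128, 10).to(device)
x = torch.randn(32, 128, device=device)
out = model(x)   # identical call on any backend
```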
Huh, even Apple isn't capable of escaping the CUDA trap. Funny to see them go from mortal enemies with Nvidia to partially dependent on them...