
I guess they also have Samsung fridges in the offices...


And probably Intel processors and Linux in their datacenters.


Well, Apple was a prime Intel client for years until they released the M1, and ARM in the cloud isn't really a thing yet... Ultimately it's all about what makes the most money, and in a datacenter that means x86 with Linux/Unix.


And Samsung components in iPhones...


And they use CAD software running on Windows (it simply doesn't exist on macOS).


This is a false statement; you'd have to be specific about which one is not available on macOS, since there are plenty you can use already. Even FreeCAD runs on macOS.


I don't get it, does Apple also make fridges now?


They are implying that even though Apple is a wealthy consumer hardware company that has major spats with Nvidia and Samsung, it doesn't always make economic sense to build the tools they might need in-house when they can simply buy them from a rival.

So rather than invest engineering resources in re-imagining the fridge, they can simply buy one from established manufacturers of household appliances like Samsung, Sony, etc.


Not only that, don’t they buy display panels and maybe even storage or RAM chips from Samsung?

Once two giant companies are dealing with each other, it can get really complicated to cut everything off.


Apple doesn't make silly charts saying they make better refrigerators.


Because they don't sell refrigerators.


The point of this thread is that even though Apple makes silly charts saying how good their hardware is at ML, they use products that those same charts say aren't as good.

There is no such hypocrisy if they use Samsung refrigerators.


ML inference and training are not the same task.


This plot is about general GPU performance, not pure inference. https://www.apple.com/newsroom/2022/03/apple-unveils-m1-ultr...

Training a model requires inference for forward propagation, so even then, for your comment to be relevant, you'd need to find a plot where Apple compares inference on quantized models against Nvidia, and no such plot exists.
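
To make the forward-pass point concrete, here's a minimal PyTorch sketch (the toy model and names are just my illustration, not anything from Apple's materials): a training step runs the exact same forward computation an inference call would, and then adds backprop on top.

    import torch
    import torch.nn as nn

    # A toy model; any architecture works the same way.
    model = nn.Linear(10, 1)
    loss_fn = nn.MSELoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    x = torch.randn(32, 10)
    y = torch.randn(32, 1)

    # Inference: just the forward pass, no gradients tracked.
    with torch.no_grad():
        preds = model(x)

    # Training: the SAME forward pass, followed by backprop and an update.
    optimizer.zero_grad()
    preds = model(x)          # forward propagation (i.e., inference)
    loss = loss_fn(preds, y)
    loss.backward()           # backward pass: the extra work training adds
    optimizer.step()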


...and doing either of those things with CUDA is impossible on a Mac. Why? Because Apple burned their bridge with Nvidia and threw a temper tantrum, that's why. Now Nvidia can't support macOS even if they wanted to.

That's kinda the point of my original comment. Apple claims to know what's best, but contradicts itself through its own actions. We wouldn't be in awkward situations like this if Apple didn't staunchly box out competitors and force customers to follow them or abandon the ecosystem. It's almost vindicating for people like me, who left macOS because of these pointless decisions.


No, they don't build compute clusters either.



