Meta alone bought 350,000 H100 GPUs, which cost them $10.5 billion: https://www.pcmag.com/news/zuckerbergs-meta-is-spending-bill...

This kind of AI capital investment seems to have helped them improve the feed recommendations, doubling their market cap over the last few years. In other words, they got their money back many times over! Chances are that they're going to invest this capital into B100 GPUs next year.

Apple is about to revamp Siri with generative AI for hundreds of millions of their customers. I don't know how many GPUs that'll require, but I assume... many.

There's a gold rush, and NVIDIA is the only shovel manufacturer in the world right now.



> Meta alone bought 350,000 H100 GPUs, which cost them $10.5 billion

Right, which means you're still about a trillion dollars short of a trillion dollars; $10.5B is roughly 1% of it. You'd need on the order of a hundred buyers at Meta's scale, and there's not another 100 Metas floating around.

> Apple is about to revamp Siri with generative AI for hundreds of millions of their customers. I don't know how many GPUs that'll require, but I assume... many.

Apple also said they were doing it on their own silicon. Apple in particular is all but guaranteed not to buy from Nvidia at all.

> There's a gold rush, and NVIDIA is the only shovel manufacturer in the world right now.

lol no they aren't. This is literally a post about AMD's AI product even. But Apple and Google both have in-house chips as well.

Nvidia is the big general-purpose player, for sure, but they aren't the only one. And more to the point, exponential growth of the already-largest player for 6 years is still fucking absurd.


The cumulative GDP of the US alone over the next five years is about $135T. Throw in other modern economies that use cloud services like Office 365 and you’re over $200T.

If AI can improve productivity by just 1% then that is $2T more. If it costs $1T in NVIDIA hardware then this is well worth it.
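
(A quick back-of-the-envelope in Python, if you want to check the arithmetic; the per-year GDP figures and the split between the US and other economies are assumptions purely for illustration:)

    # Rough check of the productivity argument above. The per-year figures
    # are illustrative assumptions; only the arithmetic itself is the point.
    us_gdp_per_year = 27e12                  # ~$27T/yr -> ~$135T over five years
    other_cloud_economies_per_year = 14e12   # assumed, to push the total past $200T

    five_year_output = 5 * (us_gdp_per_year + other_cloud_economies_per_year)
    productivity_gain = 0.01                 # the "just 1%" above

    print(f"five-year output: ${five_year_output / 1e12:.0f}T")                       # ~$205T
    print(f"1% of that:       ${five_year_output * productivity_gain / 1e12:.1f}T")   # ~$2.1T vs ~$1T of hardware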


(note to conversation participants - I think jiggawatts might be arguing $50B/qtr x 24 quarters ≈ $1 trillion and kllrnohj is arguing $20 billion doubling every year for 6 years, i.e. $20B x 2^6 ≈ $1.3 trillion - although neither approach seems to be accounting for NPV).
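
(Sketching both readings plus a naive NPV check in Python; the 10% discount rate is an assumption just to show the size of the gap:)

    # The two readings of "a trillion dollars", plus a naive NPV check.
    # The 10% discount rate is an assumption for illustration only.
    def flat_quarterly(revenue_per_qtr=50e9, quarters=24):
        # jiggawatts' reading: $50B/qtr held flat for 6 years
        return revenue_per_qtr * quarters

    def doubling_yearly(start=20e9, doublings=6):
        # kllrnohj's reading: ~$20B doubled every year for 6 years
        return start * 2 ** doublings

    def npv(cashflows, rate=0.10):
        # net present value of yearly cashflows at an assumed discount rate
        return sum(cf / (1 + rate) ** (t + 1) for t, cf in enumerate(cashflows))

    print(flat_quarterly() / 1e12)      # 1.2  ($T, nominal)
    print(doubling_yearly() / 1e12)     # 1.28 ($T, nominal)
    print(npv([200e9] * 6) / 1e12)      # ~0.87 ($T): six flat years of $200B/yr, discounted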

That is assuming Nvidia can capture the value and doesn't get crushed by commodity economics, which I can see happening and I can also see not happening. Their margins are going to be under tremendous pressure. Plus I doubt Meta is going to be cycling all their GPUs quarterly; there is likely to be a rush and then a settling of capital expenses.


Another implicit assumption is that LLMs will be SoTA throughout that period, or that the successor architecture will have an equally insatiable appetite for compute, memory capacity, and memory bandwidth; I'd like to believe that Nvidia is one research paper away from a steep drop in revenue.


Agreed with @roenxi and I’d like to propose a variant of your comment:

All evidence is that “more is better”. Everyone involved professionally is of the mind that scaling up is the key.

However, like you said, just a single invention could cause the AI winds to blow the other way and instantly crash NVIDIA’s stock price.

Something I’ve been thinking about is that current systems rely on global communication, which requires expensive networking and high-bandwidth memory. What if someone invents an algorithm that can be trained on a “Beowulf cluster” of nodes with low communication requirements?

For example, the human brain uses local connectivity between neurons; there is no global update during “training”. If someone could emulate that in code, NVIDIA would be in trouble.
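
(To make the "Beowulf cluster" idea concrete, here's a toy sketch in Python contrasting per-step gradient all-reduce with local SGD / federated averaging, where nodes only sync occasionally; the model, data, and learning rate are made up for illustration:)

    # Toy comparison: standard data parallelism (one global all-reduce per
    # step) vs. local SGD (nodes train independently, sync rarely).
    # Everything here - model, data, learning rate - is illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    true_w = 3.0
    shards = []                      # one data shard per "node"
    for _ in range(8):
        x = rng.normal(size=200)
        y = true_w * x + rng.normal(scale=0.1, size=x.shape)
        shards.append((x, y))

    def grad(w, x, y):
        # gradient of 0.5 * mean((w*x - y)^2) with respect to w
        return float(np.mean((w * x - y) * x))

    def data_parallel_sgd(steps=100, lr=0.05):
        # every step averages gradients across all nodes ("all-reduce")
        w, comm_rounds = 0.0, 0
        for _ in range(steps):
            g = np.mean([grad(w, x, y) for x, y in shards])
            comm_rounds += 1
            w -= lr * g
        return w, comm_rounds

    def local_sgd(rounds=10, local_steps=10, lr=0.05):
        # nodes run `local_steps` on their own shard, then average weights once
        w, comm_rounds = 0.0, 0
        for _ in range(rounds):
            local_ws = []
            for x, y in shards:
                w_i = w
                for _ in range(local_steps):
                    w_i -= lr * grad(w_i, x, y)
                local_ws.append(w_i)
            w = float(np.mean(local_ws))
            comm_rounds += 1
        return w, comm_rounds

    print(data_parallel_sgd())   # ~3.0 after 100 global syncs
    print(local_sgd())           # ~3.0 after only 10 global syncs

Same number of optimisation steps, a tenth of the global synchronisation; whether anything like that scales to frontier-sized models is exactly the open question.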



