
The thought expressed in the title came to my mind when I saw Nvidia described as an "AI company" in the press recently...




An object is what it does. NVIDIA makes most of its money through AI, so that's what it is now to the market.

The hardware is heavily optimized for low precision matrix math, pretty much only used for AI.

Not since the Ozaki scheme appeared. Good high-precision performance from low-precision tensor units has unlocked some very interesting uses of low-fp64-perf GPUs.
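For anyone curious, here is a toy Python/numpy sketch of the principle only (this is not the actual Ozaki algorithm, which slices mantissas much more carefully and targets fp16/int8 tensor cores; the function names and sizes are made up for illustration): feed only low-precision values into the multiplies and accumulate wide, with the hardware's wide accumulator modeled here by an fp64 einsum.

  import numpy as np

  def split_fp32(a):
      hi = a.astype(np.float32)         # top ~24 mantissa bits
      lo = (a - hi).astype(np.float32)  # rounding residual, next ~24 bits
      return hi, lo

  def matmul_from_fp32_slices(a, b):
      # Every multiply sees only fp32 inputs; the fp64 einsum stands in for
      # the exact, wide accumulation a tensor core would do.
      a_hi, a_lo = split_fp32(a)
      b_hi, b_lo = split_fp32(b)
      out = np.zeros((a.shape[0], b.shape[1]))
      for x in (a_hi, a_lo):
          for y in (b_hi, b_lo):
              out += np.einsum("ik,kj->ij", x, y, dtype=np.float64)
      return out

  rng = np.random.default_rng(0)
  a, b = rng.standard_normal((200, 200)), rng.standard_normal((200, 200))
  ref = a @ b
  plain = (a.astype(np.float32) @ b.astype(np.float32)).astype(np.float64)
  print(np.abs(plain - ref).max())                          # fp32-level error
  print(np.abs(matmul_from_fp32_slices(a, b) - ref).max())  # far smaller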

And in graphics rasterization.

Those parts (the tensor cores) aren't used for rasterization.

While raster units are separate from tensor cores, both can be leveraged for image rendering. The simplest example of this is Nvidia's DLSS.

Nvidia is selling hardware. What the buyers are doing with it doesn't change anything about Nvidia.

A company selling knives is not considered a butcher or cook, despite the main uses of knives being just that.


A company that sells knives, and also invests heavily in restaurants, might be considered to be in the restaurant business, however.

Nvidia spends a lot of money investing in downstream AI companies, in what feels like a rather incestuous circle.


I'm not sure if this is what you mean too, but by the same logic it's not a 'graphics company' nor gaming etc. either. 'Chipmaker' as they say, specialising in highly parallel application-specific compute.

But it clearly does, as NVIDIA rolls out hardware and software optimised for deployment as AI compute.

You have a point. Then it's a "compute" company.

Just like it was a crypto company. It's a computational fad chaser.

Next up: quantum. And that will be the end of them.


They're just selling. That's it. If 50 Pittsburgh Steelers fans show up to your bar every Sunday, congrats, you're a Steelers bar now.

I don't think they will fall for the quantum hype. Jensen has publicly stated that quantum computing is at least decades away from being useful.

We were saying the same thing about AI less than one decade ago, of course... and then the Vaswani paper came out. What if it turns out that when it comes to quantum computing, "Used Pinball Machine Parts Are All You Need"?

Indeed, why would they not call themselves NvidAI to begin with? This company has twice already been super lucky to have their products used for the wrong thing (given GPUs were created to accelerate graphics, not mining or inference).

3 times, if you count the physics GPGPU boom that Nvidia rode before cryptocurrencies.

And other than maybe the crypto stuff, luck had nothing to do with it. Nvidia was ready to support these other use cases because, in a very real way, they made them happen. Nvidia hardware is not particularly better for these workloads than its competitors'. The reason they are the $4.6T company is that all the foundational software was built on their hardware. And the reason for that is that JHH invested heavily in supporting the development of that software before anyone else realized there was a market there worth investing in. He made the call to make all future GPUs support CUDA in 2006, before there were heavy users.


I don't think the physics processing units were ever big. This was mostly just offloading some of their physics processes from the CPU to the GPU. It could be seen as a feature of GPUs for games, like ray-tracing acceleration.

That's not what I was referring to. I was talking about NV selling GPGPUs for HPC loads, starting with the Tesla generation. They were mostly used for CFD.

Ah, you're right. Thanks for the correction. But it seems like they have applications far beyond CFD if they are what's put in the biggest supercomputers.

CFD is what 90+% of non-AI supercomputer time is spent on. Whether you are doing aerodynamic simulations for a new car chassis, weather forecasting, or testing nuclear weapons in silico, or any of the other of literally hundreds of interesting applications, the computers basically run the same code just with different data inputs.

They still had a boom of being used for a lot of HPC loads, even non-AI supercomputers, although it was quickly dwarfed by their other markets.

I don't think it's luck. They invested in CUDA long before the AI hype.

They quietly (at first) developed general purpose accelerators for a specific type of parallel compute. It turns out there are more and more applications being discovered for those.

It looks a lot like visionary long term planning to me.

I find myself reaching for Jax more and more where I would have used numpy in the past. The performance difference is insane once you learn how to leverage this style of parallelization.


Are you able to share a bit more, enough to help others doing similar work tell whether this "Jax > numpy" advantage applies to their work (and thus whether they'd be well off learning enough Jax to make use of it themselves)?

This should be a good starting point:

https://docs.jax.dev/en/latest/jax.numpy.html

A lot of this really is a drop-in replacement for numpy that runs insanely fast on the GPU.
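As a rough illustration (the function below is made up for the example, not taken from those docs), typical numpy code often ports by just swapping the import and adding jit:

  import jax
  import jax.numpy as jnp

  # Same numpy-style API; jax.jit compiles the whole thing for the accelerator.
  @jax.jit
  def normalize_rows(x):
      return (x - x.mean(axis=1, keepdims=True)) / (x.std(axis=1, keepdims=True) + 1e-8)

  x = jnp.arange(12.0).reshape(3, 4)
  print(normalize_rows(x))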

That said, you do need to adapt to its constraints somewhat. Some things you can't do inside jitted functions, and some things need to be done differently.

For example, finding the most common value along some dimension in a matrix on the GPU is often best done by sorting along that dimension and taking a cumulative sum, which sort of blew my mind when I first learnt it.
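Here is a minimal sketch of that sort-then-scan idea (my own illustration, using a cumulative max over run-start indices rather than a literal cumulative sum; the array contents are made up):

  import jax
  import jax.numpy as jnp

  @jax.jit
  def mode_along_last_axis(x):
      # Sort so equal values sit next to each other
      s = jnp.sort(x, axis=-1)
      idx = jnp.arange(s.shape[-1])
      # True wherever a new run of equal values begins
      new_run = jnp.concatenate(
          [jnp.ones(s.shape[:-1] + (1,), dtype=bool), s[..., 1:] != s[..., :-1]],
          axis=-1)
      # Index where the current run started, via a cumulative max
      run_start = jax.lax.cummax(jnp.where(new_run, idx, 0), axis=s.ndim - 1)
      run_len = idx - run_start + 1           # length of the run ending here
      best = jnp.argmax(run_len, axis=-1)     # end position of the longest run
      return jnp.take_along_axis(s, best[..., None], axis=-1)[..., 0]

  x = jnp.array([[3, 1, 3, 2, 3], [5, 5, 2, 2, 2]])
  print(mode_along_last_axis(x))  # [3 2]

No Python loops over rows or elements, so the whole thing maps onto the GPU's parallel sort and scan primitives.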


Or that parallel computing is immensely useful in general and that more use cases will be found for it in the future beyond AI.

At some point, maybe it isn’t luck anymore but a general trend towards parallel computing.


  > parallel computing.
Maybe because the acronym PCU would invite too many toilet jokes.

"No, I see the pee" and at least another that I'd rather not express in polite company ))


To be fair, the percentage of their revenue derived from AI-related sales is much higher now than before. Why is that not accurate?

GN did a video a few weeks ago showing a slide from Nvidia's shareholder meeting, which showed that gaming is now a tiny part of Nvidia's revenue.

Basically, almost half of their revenue is pure profit and all of that comes from AI.

While the slide looked a lot nicer, the data is also available on their site https://nvidianews.nvidia.com/news/nvidia-announces-financia...



Just because customers use their hardware for AI does not mean the hardware maker is an AI company.

There's a lot of software involved in GPUs, and NVIDIA's winning strategy has been that the software is great. They maintained a stable ecosystem across most of their consumer and workstation/server stack for many years before crypto, AI, and GPU-focused HPC really blew up. AMD has generally better hardware but poor enough software that "fine wine" is a thing (i.e., the software takes many years post-hardware-launch to actually properly utilize the hardware). For example, they only recently got around to making AI libraries usable on the pre-COVID 5700XT.

NVIDIA basically owns the market because of the stability of the CUDA ecosystem. So, I think it might be fair to call them an AI company, though I definitely wouldn't call them just a hardware maker.


*barely passable software while their competitors literally shit the bed, but I take your point.

As someone who codes in CUDA daily: putting out and maintaining so many different libraries implementing complex multi-stage GPU algorithms efficiently, at many different levels of abstraction, without a ton of edge-case bugs everywhere, alongside maintaining all of the tooling for debugging and profiling, and still shipping regular updates, is quite a bit beyond "barely passable". It's a feat only matched by a handful of other companies.

Literally?

"Literally" as an intensifier predates the United States.

You aren't even "dying on this hill"; people like you are inventing a hill made out of dead bodies.


Literally inventing a hill made out of dead bodies, or figuratively?

https://news.ycombinator.com/item?id=45487334

When more of their revenue comes from AI than graphics, and they're literally removing graphics output from their hardware...


This is similar to evolution: evolution repurposes old systems for newer tasks. The GPU name has stuck, but the hardware has been redeployed for AI.

I mean, AFAIK the consumer GPU portion of their business has always been tiny in comparison to enterprise (except right at the start of the company's history, I believe).

In a way, it's the scientific/AI/etc. enterprise use of Nvidia hardware that enables the sale of consumer GPUs as a side effect (consumer cards are largely byproducts of workstation chip yields - flawed chips can be binned into consumer parts).


No, gaming was historically the majority of NVIDIA's revenue (up until 2023). Only with the recent AI boom did this change.

Source (I am not sure how reliable this is because I got this from ChatGPT, but I remember seeing something similar from other sources): https://www.fool.com/investing/2024/02/12/gaming-was-nvidias....


Nvidia started as a gaming company and gaming was the majority of their business until the last 5-10 years.


