
GPUs are also used to speed up inference (the underlying math is virtually the same as in training). Do you think your ChatGPT queries are running on x86 servers?
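To illustrate the claim that training and inference share "virtually the same" math: both are dominated by the same dense matrix multiplies (GEMMs), which is exactly the workload GPUs accelerate. A minimal sketch with a hypothetical toy layer (the weight matrix `W` and inputs are made-up examples, not anything from a real model):

```python
import numpy as np

# Hypothetical single linear layer: y = x @ W.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))   # "learned" weights
x = rng.standard_normal((1, 4))   # one user query (inference)
X = rng.standard_normal((32, 4))  # a training batch

y_infer = x @ W   # inference: one GEMM per layer
Y_train = X @ W   # training forward pass: the same GEMM, larger batch

print(y_infer.shape)  # (1, 3)
print(Y_train.shape)  # (32, 3)
```

Training adds a backward pass, but that is also built from matrix multiplies over the same tensors, which is why the same accelerator serves both workloads.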


But given Nvidia's profit margins, do you think others won't offer competing chips? Google already has its own, for example.

From that perspective, the notion that Nvidia will own this AI future while others such as AMD and Intel stand by would be silly.

I'm already surprised it took this long. The Nvidia moat might be software, but not anything that warrants these kinds of margins at this scale. There will likely be strong price competition on inference hardware.


> You think your ChatGPT queries are running on x86 servers

What makes you think that? Or are all non-Nvidia GPUs x86?



