
In my eyes, going with AMD is less risky. When you're rooting for the underdog, it has every incentive to help you. This isn't a battle to "win" all of AI or to have one vendor beat the other; I just need to build a nice, profitable business that solves customer needs.

If I go with Nvidia, then I'm just another one of the 500 other companies doing exactly the same thing.

I'm a firm believer that there should not be a single company that controls all of the compute for AI. It would be like having Cisco be the only company that provides routers for the internet.

Additionally, we are not just AMD. We will run any compute that our customers want us to deploy for them. We are the capex/opex for businesses that don't want to put up the millions, or figure out and deal with all the domain-specific details of deploying this level of compute. The only criterion we have is that it is the best in class available today for each accelerator. For example, I wouldn't deploy H100s because they are essentially old tech now.

> Are you not interested in AI/ML customers?

Read these blog posts and tell me why you'd ask that question...

https://chipsandcheese.com/2024/06/25/testing-amds-giant-mi3...

https://www.nscale.com/blog/nscale-benchmarks-amd-mi300x-gpu...



OK, I just looked at the first blog post: “ROCm is nowhere near where it needs to be to truly compete with CUDA.”

That’s all I need to know as an AI/ML customer.


That is fine. Nobody is pretending that the software side is perfect. What we and AMD are looking for are the early adopters willing to bet on a new class of hardware (only available since April) that outperforms previous generations. Which, given the general state of AI today, should be pretty easy to find.



