
I don't really buy the "cost of hardware" portion of the argument, even if everything else seems sound. In 1992, if you wanted 3D graphics, you'd call up SGI and drop $25,000 on an Indigo2. Six years later, you could walk into Circuit City, buy a Voodoo2 for 300 bucks, slap it in the PC you already owned, and call it a day.
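To put rough numbers on that comparison (a back-of-the-envelope sketch using only the prices and dates given above, nothing official):

    import math

    # Figures from the comment: $25,000 SGI workstation (1992)
    # vs. a $300 Voodoo2 (1998).
    old_price, new_price, years = 25_000, 300, 6

    ratio = old_price / new_price        # ~83x cheaper
    halvings = math.log2(ratio)          # ~6.4 halvings over the period
    months_per_halving = years * 12 / halvings

    print(f"{ratio:.0f}x cheaper; cost halved every "
          f"{months_per_halving:.0f} months")  # ~83x, ~11 months

So the 3D-graphics price point was roughly halving every year over that stretch.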

I know we aren't in the '90s. I know the cost of successive process nodes has grown exponentially, even after normalizing for inflation. But still, I'd be wary of betting the farm on AI being eternally confined to giant, expensive, special-purpose hardware.

This stuff is going to get crammed into a little special-purpose chip dangling off your phone's CPU. Either that, or GPU compute will become so commoditized that it'll be a cheap throw-in on any given VPS.



This stuff is going to get crammed through an algorithm that's orders of magnitude more efficient. We have living proof that it does not take 5 GW to make a brain.
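For scale, a quick sketch (the 5 GW figure is from the comment above; the ~20 W brain estimate is the commonly cited ballpark, not a precise measurement):

    # Assumed figures: ~20 W for a human brain vs. the 5 GW cited above.
    brain_watts = 20
    datacenter_watts = 5e9

    ratio = datacenter_watts / brain_watts
    print(f"datacenter/brain power ratio: {ratio:.1e}")  # ~2.5e+08

That's roughly eight orders of magnitude of headroom, at least in principle.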



