One way to interpret this might be that in consumer products, it's easier for incumbents to add AI to improve an already well-marketed product than to build and market one from scratch.



Yeah, and I think it’s also simply that inference with strong models is expensive.

OpenAI is lighting boatloads of money on fire to provide the free version of ChatGPT. Same with Google for its AI search results, and Perplexity, which has also raised a lot. Unless you can raise a billion and find a unique wedge, it’s hard to even be in the game.

You can try to use small, cheap models, but people will notice that free ChatGPT is 10x better.



