
There is a very strong use case for less-powerful but local LLMs, and there's going to be a big expansion in that area in the next couple of years. So big that I'll bet all the major AI players will do everything they can to cripple them.


Not Nvidia; that's their best-case scenario.


Unfortunately, I think you're overestimating how many people care enough about privacy to go through the effort of running LLMs locally and likely buying a GPU.



