Hacker News

I often use LLMs for tasks that require recent data. There's no way I'm running a web crawler on top of my local LLM. For coding it could work in theory, since you don't always need the latest and greatest, but it would still make me anxious.


That’s a perfect use case for MCP, though.

My biggest issue is that the local models I can run on my M1/M4 MBP aren’t smart enough to use tools consistently, and their context windows are too small for iterative use.

The last year has seen a lot of improvement in small models, though (Gemma 3n is fantastic), so hopefully it’s only a matter of time.
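The appeal of MCP here is that the host application, not the model, does the actual crawling or searching: the model only has to emit a structured tool call, which is a much lower bar for a small local model. A minimal sketch of that dispatch loop (the tool name, JSON shape, and stubbed search are illustrative assumptions, not the actual MCP wire protocol):

```python
import json

# Hypothetical tool: in a real setup the MCP host would call a
# search API or crawler here, outside the model entirely.
def web_search(query: str) -> str:
    return f"results for {query!r}"  # stubbed result

TOOLS = {"web_search": web_search}

def run_tool_call(raw: str) -> str:
    """Parse a model-emitted call like
    {"tool": "web_search", "args": {"query": "..."}} and dispatch it."""
    call = json.loads(raw)
    fn = TOOLS[call["tool"]]
    return fn(**call["args"])

# A small local model only needs to produce this JSON reliably;
# fetching recent data is the host's job.
print(run_tool_call('{"tool": "web_search", "args": {"query": "gemma 3n"}}'))
```

The catch, as noted above, is that even emitting this JSON consistently across many iterative turns is where small models still stumble.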



