
Related question: what is everyone using to run a local LLM? I'm using Jan.ai and it's been okay. I also see OpenWebUI mentioned quite often.

LM Studio if you just want an app. OpenWebUI is just a front end; you'd need either llama.cpp or vLLM behind it to serve the model.
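To make the front end / back end split concrete: both llama.cpp's llama-server and vLLM expose an OpenAI-compatible HTTP API, and OpenWebUI (or any client) just points at that endpoint. A rough sketch of talking to such a backend directly from Python; the model path, names, and ports below are placeholders, assuming llama-server's default port 8080 (vLLM typically uses 8000):

    # Rough sketch, not tied to any particular setup. Assumes a local server
    # was started first, e.g. (paths/ports are placeholders):
    #   llama-server -m ./your-model.gguf --port 8080    # llama.cpp
    #   vllm serve your/model-name --port 8000           # vLLM
    from openai import OpenAI

    # Both servers speak the OpenAI chat-completions API, so the official
    # client works if you override the base URL and pass a dummy key.
    client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

    reply = client.chat.completions.create(
        model="local-model",  # llama.cpp ignores this; vLLM expects the served model's name
        messages=[{"role": "user", "content": "Hello from a local setup"}],
    )
    print(reply.choices[0].message.content)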

LM Studio, and sometimes AnythingLLM.

KoboldCPP + SillyTavern has worked the best for me.



