
If you're running Windows anywhere, then you're better off using ollama, LM Studio, or LLamaSharp for coding; these are all cross-platform too.


I found LlamaSharp to be quite unstable, with random crashes in its bundled llama.cpp build.


Pretty cool! What are the steps to use these on mobile? Stoked about using ollama on my iPhone!


> "If running windows"

All of these have web interfaces, actually, and all of them implement the same OpenAI API.

So you can use them locally, and remotely as well if you're able to expose the service by adjusting your router.

Cloudflare will also expose services remotely if you wish: https://developers.cloudflare.com/cloudflare-one/connections...
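For reference, a quick Cloudflare Tunnel can be started with a single `cloudflared` command; this sketch assumes an ollama server on its default port 11434 (LM Studio defaults to 1234):

```shell
# Expose a locally running LLM server through a temporary Cloudflare URL.
# cloudflared prints a randomly assigned *.trycloudflare.com address.
cloudflared tunnel --url http://localhost:11434
```

Note this is the ephemeral "quick tunnel" form; a named tunnel with an account and DNS routing is the persistent option described in the linked docs.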

So you can run any LLM privately with ollama, LM Studio, or LLamaSharp on Windows, Mac, or iPhone; all are open source, customizable, user friendly, and frequently maintained.
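Since these servers speak the same OpenAI-style API, a client only needs to change the base URL. A minimal sketch, assuming ollama's default endpoint at http://localhost:11434/v1 and a hypothetical model name "llama3":

```python
import json

# Assumed local endpoint (ollama default); LM Studio typically
# serves the same API shape at http://localhost:1234/v1.
BASE_URL = "http://localhost:11434/v1"

def chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style /chat/completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

payload = chat_request("llama3", "Write a haiku about routers.")
body = json.dumps(payload)
# POST `body` to f"{BASE_URL}/chat/completions" with urllib.request,
# or point the official openai client at BASE_URL with any api_key.
```

Because the request shape is identical across these servers, swapping backends is just a matter of changing `BASE_URL` and the model name.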



