
Thank you. This is genuinely a valid reason even from a simple consistency perspective.

(edit: I think -- after reading some of the links -- I understand why Ollama comes across as less of a hero. Still, I am giving them some benefit of the doubt, since they made local models very accessible to plebs like me; and maybe I can graduate to no Ollama.)



I think this is the thing: if you can use llama.cpp, you probably shouldn't use Ollama. It's designed for the beginner.


You shouldn't use Ollama as a beginner either. It comes with crazy beginner-hostile defaults out of the box.


Hmm? I would argue against that line of argumentation. It is ridiculously easy to get started and working out of the box. Once the user starts running into the obvious restrictions that follow from the trade-offs in the defaults, they can move on to something more custom. Wouldn't that be the definition of beginner friendly?

I am biased, since I effectively started with Ollama as my main local LLM runner, so take this response for what it is.

Still, you've got me curious: which defaults do you consider hostile? (Not disagreeing; this is pure curiosity.)


> which defaults do you consider hostile

The infamous Ollama context limits, for one.


It is infamous, but does it really stop anyone from exploring? Granted, I am an anecdote, but flawed as it is, I would personally argue that the imposed limits are perfectly fine for someone who is just starting. After all, those can be changed once you get your bearings.
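For anyone who does hit the limit: the context window in Ollama is a per-model parameter (`num_ctx`), and one way to change it is to bake a larger value into a derived model via a Modelfile. A minimal sketch, assuming a `llama3` base model is already pulled (the model name and the 8192 value are just examples):

```
# Modelfile: derive a model with a larger context window.
# num_ctx is the Modelfile parameter controlling context length;
# Ollama's default is much smaller than what many models support.
FROM llama3
PARAMETER num_ctx 8192
```

Then `ollama create llama3-8k -f Modelfile` creates the derived model. The same `num_ctx` option can also be passed per request in the API's `options` object, which avoids creating a new model entry.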





