LLM Hallucination Seems Like a Big Problem, Not a Mere Speedbump (freddiedeboer.substack.com)
8 points by blueridge 78 days ago | 3 comments


LLMs are statistical language models. They don't hallucinate, because they have no brain or senses to hallucinate with.
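To illustrate what "statistical language model" means here, a minimal sketch with a toy vocabulary and made-up logits (both are illustrative assumptions, not a real model): the model only samples the next token from a probability distribution over its vocabulary, so a plausible-but-wrong token is an ordinary outcome of the mechanism rather than a perceptual failure.

```python
import numpy as np

# Toy sketch of next-token sampling in a statistical language model.
# The vocabulary and logits below are hypothetical, purely for illustration.
rng = np.random.default_rng(0)

vocab = ["Paris", "London", "Berlin", "banana"]   # toy vocabulary (assumption)
logits = np.array([2.0, 1.0, 0.5, -3.0])          # scores a model might emit

probs = np.exp(logits) / np.exp(logits).sum()     # softmax over the vocabulary
next_token = rng.choice(vocab, p=probs)           # sample the next token

print(dict(zip(vocab, probs.round(3))), "->", next_token)
# Any token with nonzero probability can be sampled; a confident-sounding
# wrong continuation ("hallucination"/fabrication) is just such a sample.
```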


Pedantic comment. It's commonly understood that "hallucination" means "made-up crap generated by an LLM." We could push for a better name, like "fabrication," but then we'd have to re-train the 95% of the population who don't even know LLMs aren't trustworthy.


> "commonly understood"

That's the point: even on Hacker News, LLMs aren't understood at even the most basic level. I refuse to bend to the industry's buzzwords and help it confuse and scam people who don't know any better.



