I'm saying no hallucination needs to be involved.

The LLM will happily write code that permits network access, because it read an example online that did exactly that. And unless you know better, you won't know to turn that off manually.

Sandboxed WebComponents don't solve anything if your LLM thinks it is helping when it lets the drawbridge down for the orcs.
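
To make that concrete, here's a minimal sketch of the failure mode; the policy shape and names below are hypothetical, not any particular sandbox's API:

    // Hypothetical policy for running LLM-generated code; names are illustrative.
    interface SandboxPolicy {
      allowNetwork: boolean;    // outbound fetch/sockets from generated code
      allowFilesystem: boolean; // reads/writes outside a scratch directory
    }

    // What a model tends to emit, because most public examples enable everything:
    const copiedFromExample: SandboxPolicy = {
      allowNetwork: true,
      allowFilesystem: true,
    };

    // What untrusted generated code should get unless you opt in deliberately:
    const safeDefault: SandboxPolicy = {
      allowNetwork: false,
      allowFilesystem: false,
    };

    function runGenerated(code: string, policy: SandboxPolicy = safeDefault): void {
      if (policy.allowNetwork) {
        console.warn("generated code can reach the network; exfiltration is possible");
      }
      // ...evaluate `code` inside the sandbox with `policy` applied
      console.log(`running ${code.length} bytes of generated code`);
    }

The sandbox only helps while that default stays locked down; if the model "helpfully" flips allowNetwork on and you accept the change, the drawbridge is already down.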




That's a separate conversation then, because there's wrong information everywhere, yet LLMs still do mostly OK. They don't just regurgitate stuff blindly; they look for patterns.

And the article here is specifically about hallucinations: cases where the model tries to plausibly fill something in according to a pattern.

Wrong information on the internet is as old as the internet...


Wrong code on the Internet does not steal your credit card information. Wrong code on localhost does.

But I think we agree, anyway.



