The LLM will happily write code that permits network access, because it read an example online that did that. And, unless you know better, you won't know to manually turn that off.
Sandboxed WebComponents does not solve anything if your LLM thinks it is helping when it lets the drawbridge down for the orcs.
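A minimal sketch of that failure mode, assuming a web-sandbox setup roughly like the one being discussed (the widget name and API host are hypothetical): the LLM reproduces a "working" Content-Security-Policy it saw in an example, and the permissive connect-src quietly allows the sandboxed code to talk to any host on the network.

```typescript
// Hypothetical sketch: the kind of copied-from-an-example policy an LLM might emit.
// It looks locked down, but "connect-src *" permits fetch/XHR/WebSocket to anywhere.
const meta = document.createElement("meta");
meta.httpEquiv = "Content-Security-Policy";
meta.content = "default-src 'self'; script-src 'self'; connect-src *";
document.head.appendChild(meta);

// A tighter policy would pin network access to the origins actually needed,
// e.g. (https://api.example.com is a placeholder):
// meta.content = "default-src 'self'; script-src 'self'; connect-src 'self' https://api.example.com";
```

Unless you already know to look for that one directive, the generated code runs fine and nothing flags that the drawbridge is down.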
That's a separate conversation then, because there's wrong information everywhere, but LLMs still do mostly OK. They don't just regurgitate stuff blindly; they look for patterns.
And the article here is specifically about hallucinations: cases where the model tries to plausibly fill something in according to a pattern.
Wrong information on the internet is as old as the internet...