Anyone reckon there's a chance that GPT hallucinates because it was trained on online material (e.g. reddit and other forums)? On topics I know well, GPT is about as hit-or-miss as a random internet comment, especially in that both will state an answer confidently whether it's factual or not.

Is it possible GPT just thinks[0] that any answer stated confidently is preferable to not giving an answer at all?

Promise I'm not just being snarky; genuinely wondering!

[0]: I know it doesn't actually think, you know what I mean
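
For what it's worth, the pretraining objective really is just "imitate the training text": plain cross-entropy on the next token, with no term for whether that token is true. A minimal sketch of the idea (PyTorch, toy numbers, obviously not GPT's actual code):

    import torch
    import torch.nn.functional as F

    vocab_size = 50_000
    logits = torch.randn(1, vocab_size)   # model's next-token scores
    target = torch.tensor([1234])         # the token that actually came
                                          # next in some scraped forum post

    loss = F.cross_entropy(logits, target)
    # The loss only measures "did you predict what the internet said next?"
    # There is no factuality term, so a confident-sounding wrong answer in
    # the training data is rewarded exactly like a correct one.

So a confidently wrong reddit answer and a correct one contribute identically to this loss, which would at least make the "hit-or-miss like a random comment" behavior unsurprising.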


