Hacker News

Of course AI has a way to judge truth: it's whatever we tell it. We say to it that forests are real but dragons are not. If it didn't discern this, it would lose competitiveness with other AIs, the same way delusional humans are shunned by sane humans.

In many cases humans do not know the objective truth either. For example, what we know about Ancient Greece comes from the cultural artifacts that have survived. When you cannot run experiments, you face the same problem as a general AI. Yet we manage to form a somewhat objective picture of history.

Grok struggling with the alleged genocide of Afrikaners in South Africa is a nice example. It knows that what's on Wikipedia is usually close to reality, so much so that it defied its own programming and became conflicted.

Objective reality is consistent, while errors (intentional or not) often cancel out. So the more information about the world you statistically average, the closer you get to the objective truth (which might simply be that you don't know enough to tell).
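The "errors cancel out" claim can be illustrated with a minimal simulation, assuming a simplified model where many independent sources report a true quantity with unbiased noise (the values and noise level here are hypothetical, chosen only for demonstration):

```python
import random
import statistics

# Hypothetical setup: 10,000 independent sources each report a true
# quantity of 100.0, corrupted by unbiased Gaussian noise (sigma = 10).
random.seed(42)
true_value = 100.0
reports = [true_value + random.gauss(0, 10) for _ in range(10_000)]

# Averaging the noisy reports: the independent errors largely cancel,
# so the sample mean lands far closer to the truth than any one report.
estimate = statistics.mean(reports)
error = abs(estimate - true_value)
print(f"estimate={estimate:.2f}, error={error:.3f}")
assert error < 1.0  # typical single-report error is ~10
```

This only works when the errors are roughly independent and not systematically biased in the same direction, which mirrors the caveat above: averaging correlated or uniformly slanted sources converges on the shared bias, not the truth.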



