
I just asked ChatGPT 4o, “Who is the President of the United States?” It searched the web and provided citations:

> The current president of the United States is Joe Biden. He has been in office since January 20, 2021, and is running for re-election in the upcoming 2024 presidential election ([2024 United States presidential election - Wikipedia](https://en.wikipedia.org/wiki/2024_United_States_presidentia...), [Joe Biden: The President | The White House](https://www.whitehouse.gov/administration/president-biden/)).



I think the problem I am talking about is more likely to come up if you ask a question which isn’t directly about who is the current President, but rather some other issue in which the President just happens to be mentioned.

Suppose hypothetically Trump wins in November, and on 2025-01-20 he is sworn in. On 2025-01-21 I ask the model some question about a policy issue. It finds some articles from mid-2024 responding to the question. Those articles refer to “former President Trump”, and the model’s background knowledge also says he is a former President, so in its answer it calls him the “former President”. Except (in this hypothetical) as of yesterday that’s no longer correct: he’s now the current President.

A reasonably well-informed human, by contrast, would be less likely to make that mistake. If Trump were inaugurated on 2025-01-20, you would expect such a human to have added that fact to their background knowledge / long-term memory by the next day, so on reading an article that calls him “former President”, they would recognize it as an older article from before his return to the White House.
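One partial mitigation for the staleness problem described above is to stamp each retrieved passage with its publication date and the current date when building the prompt, so the model at least has the information needed to notice that “former President” in a mid-2024 article may be outdated. A minimal sketch (the article data and `build_prompt` helper here are entirely hypothetical, not any real retrieval API):

```python
from datetime import date

# Hypothetical retrieved passages: (publication_date, text) pairs.
# Contents are illustrative only.
ARTICLES = [
    (date(2024, 6, 15), "Former President Trump criticized the policy ..."),
    (date(2025, 1, 21), "President Trump signed an order on the policy ..."),
]

def build_prompt(question: str, today: date) -> str:
    """Prefix each retrieved passage with its publication date, newest
    first, and state today's date, so the model can judge staleness."""
    lines = [f"Today's date: {today.isoformat()}", ""]
    for pub_date, text in sorted(ARTICLES, reverse=True):
        lines.append(f"[published {pub_date.isoformat()}] {text}")
    lines.append("")
    lines.append(f"Question: {question}")
    return "\n".join(lines)

prompt = build_prompt("What is the policy's current status?", date(2025, 1, 21))
print(prompt)
```

This doesn’t update the model’s background knowledge, of course; it only gives the model a fighting chance to reason that a 2024-dated article predates the 2025-01-20 inauguration.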



