What I am saying is that today, with ChatGPT 4x, if it detects that its model doesn’t have the current information, it will search the web and summarize the results from different websites.
One example I have was “what’s going on in Atlanta over the next two weeks”.
I had another example I just posted where I asked it “who are Amex’s and Chase’s current transfer partners and which ones do they have in common”.
It searched the web and summarized the information like you would expect.
Suppose I ask it questions like “who is Joe Biden?” and “who is Donald Trump?”. It is going to say “current President elected in 2020” and “former President elected in 2016”. And I imagine it doesn’t need any web search to answer those questions, it already knows.
But is it smart enough to think “if today’s date is 2024-11-06, I need to do a web search to find out the results of yesterday’s Presidential election”?
For the average adult human (even many non-American adults), “who is the current US President” is something they know off the top of their head, not something they have to look up every time. If the model has to re-look up a recently changed fact on every query, it is inevitably going to perform worse at some point: eventually it will fail to do the lookup, or fail to pick the answer up from its search results, and hence get it wrong.
Suppose, hypothetically, Biden unexpectedly dies tomorrow (of natural causes) and Kamala Harris becomes President. Because the model isn’t expecting that, it wouldn’t necessarily think to do the web search needed to find out it had happened.
I think the problem I am talking about is more likely to come up if you ask a question which isn’t directly about who is the current President, but rather some other issue in which the President just happens to be mentioned.
Suppose hypothetically Trump wins in November, and on 2025-01-20 he is sworn in. On 2025-01-21 I ask it some question about a policy issue. It finds some articles from mid-2024 responding to the question. Those articles talk about “former President Trump”, and its background knowledge says he is a former President, so in its answer it calls him the “former President”. Except (in this hypothetical) as of yesterday that’s no longer correct: he’s now the current President.
Whereas a reasonably well-informed human would be less likely to make that mistake: if Trump is inaugurated on 2025-01-20, you’d expect such a human to have added that fact to their background knowledge / long-term memory by the next day, so if they read an article calling him “former President”, they’d know it must be an older article from before his return to the White House.