So? The internet in general is that, as are people sharing things they know. You might as well say humans are jet fuel for disinformation, which they are. You don't need an example to tell people that LLMs use popular knowledge since everybody knows that. But an example of an LLM generating true statements doesn't even support that claim anyway.
>> What I am trying to say is that LLMs have no concept of "truth". They only produce statistically relevant responses to queries submitted to them.
> So?
Okay, this is my last attempt to express myself clearly to you in this thread.
> The internet in general is that, as are people sharing things they know.
"The internet in general" and "people sharing things" is not the topic of this thread. The topic is LLM's and has evolved into whether or not those algorithms in conjunction with their training data sets possess knowledge of "truth", as introduced by yourself previously:
> If you're trying to show that LLMs can be guided into saying false things ...
> LLMs tend to say true things ...
These are examples of anthropomorphization. This is understandable, as most of the posts you have kindly shared in this thread have been focused on people, or on conflating a category of algorithms with them.
What I have consistently said is quoted above:
LLMs have no concept of "truth."
Any interpretation of text they generate as being "true" or "false" is done by a person reading the text, not the algorithms nor the data on which they were trained.
Sounds like you're not trying to say anything if your final attempt is that LLMs have no concept of truth. Books don't have that either. Even humans don't really have it, and most of the time rely on something else, like "everybody knows", or on science, which itself doesn't produce truth.
What I am trying to say is that LLMs have no concept of "truth". They only produce statistically relevant responses to queries submitted to them.
Assuming otherwise is a fallacy.
LLM services are conceptually closer to a "popularity contest" than to the "knowledgeable Q&A session" their vendors purport them to be.
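To make the "popularity contest" point concrete, here is a deliberately tiny sketch: a toy bigram sampler (nothing like a real LLM in scale, but the same in spirit) that picks continuations purely by how often they appeared in its training text. The corpus and token names are made up for illustration; note that no step anywhere consults a notion of truth.

```python
import random
from collections import Counter, defaultdict

# Toy "training corpus": the popular claim appears twice, the rival once.
corpus = "the sky is blue . the sky is green . the sky is blue .".split()

# Count which token follows which (a bigram frequency table).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def sample_next(token, rng=random):
    """Pick the next token in proportion to observed frequency --
    statistical relevance, with no truth predicate anywhere."""
    counts = follows[token]
    tokens = list(counts)
    weights = [counts[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

# "blue" followed "is" twice, "green" once: the popular continuation
# wins more often, regardless of which statement is actually true.
print(follows["is"])
```

Calling `sample_next("is")` repeatedly yields "blue" about two thirds of the time and "green" the rest: popularity in the training data, not truth, decides the output.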