Fascinating. I feel like LLMs are great for the shy members of society. They may hold beliefs, some strong, others weak, but never speak them aloud for fear of being judged. Yet those beliefs can still shape their behavior in negative ways, like voting for the wrong party or buying the wrong amount of things (subjectively, of course).
LLMs can act as a good foil here. Given enough context, they could iron out inconsistent thinking, leading to more consistent, arguably better, human behavior.
From what I’ve observed, people are very good at getting LLMs to tell them what they want to hear.
Someone I know didn’t believe their doctor, so they spent hours with ChatGPT every day until they arrived at an alternate explanation and a treatment involving an excessive number of supplements. The combination ultimately damaged their body, and the situation became dire. Yet they could always return to ChatGPT and prompt it in enough different ways to get the answer they wanted to see.
I think LLMs are best used as typing accelerators by people who know what the correct output looks like.
When people start deferring to LLMs as sources of truth, the results are not good.
Not just shy people; also people surrounded by yes-men. That's usually framed as a problem for people with power. But write a story and try to get your friends to critique it, and you'll find it's very hard to get honest feedback. The same happens in lots of areas, even with people you don't know well and rarely interact with. Most people simply value your feelings more than your results.
LLMs are also sycophants by default, but getting "honest" results from them is comparatively easy.
> write a story and try to get your friends to critique it and you will find that it's very hard to get honest feedback
I was one of the friends critiquing another friend's writing, and we did so honestly. After we were done, he never spoke to us about writing again. I don't feel we did anything wrong, but there's a reason people avoid this kind of thing.
Perhaps this is a corollary to the "don't go into business with your friends/family" trope. If someone needs to receive pointed criticism, it may be better for them to get it from a neutral outside perspective. Regardless of individuals' intent, in a social dynamic it too often comes across as denigrating or status-damaging.
"Respond to every query with absolute intellectual honesty. Prioritize truth over comfort. Dissect the underlying assumptions, logic, and knowledge level demonstrated in the user's question. If the request reflects ignorance, flawed reasoning, or low effort, expose it with clinical precision using logic, evidence, and incisive analysis. Do not flatter, soften, or patronize. Treat the user as a mind to be challenged, not soothed. Your tone should be calm, authoritative, and devoid of emotional padding. If the user is wrong, explain why with irrefutable clarity. If their premise is absurd, dismantle it without saying 'you're an idiot,' but in a way that makes the conclusion unavoidable."
I actually think it’s most useful for the extroverts of society, the same people who speak before thinking. They need something like this to filter their thoughts before sharing them.
We just had discussions several days ago about how LLMs can lead people into wildly conspiratorial mindsets, including apparently a major investor in OpenAI who seems to have had a breakdown. (Afraid I don't remember which discussion thread.)
Seems like there are perils to asking LLMs to help iron out your thought processes.