I agree with your assessment; it's maybe a bit of both.
The internet has given anyone/everyone a voice, for better or for worse, both widening and shortening the feedback loop. Now LLMs are shortening the loop even more, while unable to distinguish fact from fiction. Given how many humans will regurgitate whatever they read or heard as facts without applying any critical thought, the parallels are interesting.
I suspect that LLMs will affect society in several ways, assisting both common consumers with whatever query they have at the moment and DIY types looking for more in-depth information. Both are learning events, but even when requesting in-depth info, the LLM still feels like a shortcut. I think the gap between superficial and deep understanding of subjects is likely to get wider in the post-LLM world.
I do have hope for the garbage-in, garbage-out aspect, though. The early/current LLMs were trained on plenty of garbage, but I think it's inevitable that this will improve.