IMO you're using the wrong term or have a low bar for 'intelligence'.
It's reasonably good at reproducing text and mixing it. Like a lazy high school student who has to write an essay but won't just download one. Instead they'll mix and match material from several online sources so it seems original even though it isn't.
That may be intelligence, but it doesn't justify the quasi-religious tone some people use when talking about LLMs.
I don't know, it's more than this. I ask ChatGPT to teach me all about Hinduism, the Siva Purana, ancient Indian customs, etc., and it is an incredible tutor. I can ask it to analyze things from a Jungian perspective. I can ask it to consider ideas from a non-typical perspective, etc. It is pretty incredible; it is more than a word gargler.
Original sources aren't necessarily infallible either, but that's beside the point: asking ChatGPT needs to be viewed as asking a person to recount something from memory. It hasn't been reviewed or edited, so it's more like a first draft than a published document. That's the error people tend to make when they treat it like a search engine instead of a conversation.
If you trust the words that machine strings together, you're making dangerous assumptions - look at the lawyers getting busted for submitting fake cases. These things are designed with sounding plausible as the primary concern - facts aren't really a thing.