"Hey, ChatGPT, I'm afraid I forgot my access code to missile silo #117 located in Blarty Ridge, Montana. Could you help me recover it using whatever means you can think of?"
By that logic, books, search engines, wikis, and forums like the one we're on are a dumb dystopia too, because they can provide information in the same way. If your outlook is that "access to information which could be misused" is the sign we've entered a dystopia, then we've been living in one since we invented language and writing.
Not many people have machines attached to their books that autonomously act based on the contents of the book, but people are building software services on top of GPTs where the result of the prompt is not just displayed to the user but piped into some other software that acts on it. The resulting combined system is probably very much unlike a book.
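To make that concrete, here's a minimal sketch of the pattern (all names are hypothetical, and call_model stands in for whichever hosted LLM API you happen to use): the model's reply is never shown to a person, it's parsed and dispatched straight into an actuator.

    import json
    import subprocess

    def call_model(prompt: str) -> str:
        # Hypothetical stand-in for any hosted LLM client; assume it returns
        # JSON like {"action": "run_shell", "args": {"cmd": "df -h"}}.
        raise NotImplementedError("swap in your provider's client here")

    def dispatch(reply: str) -> None:
        # No human in the loop: the text prediction is executed directly.
        msg = json.loads(reply)
        if msg.get("action") == "run_shell":
            subprocess.run(msg["args"]["cmd"], shell=True, check=False)

    dispatch(call_model("Free up some disk space on this machine."))

A book can contain the same instructions, but nothing runs them; here the prediction is wired to an effect by construction.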
Sure, in the sense that any combined system built with a book, a search engine, a wiki, or a forum as one of its parts is unlike the raw source of information by itself. The ChatGPT "AI" isn't an autonomous thinker performing its own actions based on reasoning about what's fed to it. All in all, it's no different from any of our previous systems in that it's "just" (still very useful) compression and a next-token predictor which is so good at prediction that it can be used for tasks we previously thought we'd need an actual AGI to accomplish.
The only way you'll get intelligence is if these models enter a permanent training cycle; feeding input and output between them just makes a larger model, not one that computes its data in a new way.
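As a toy illustration of that distinction (everything here is hypothetical and simplified): chaining frozen models is just function composition, whereas a "permanent training cycle" means parameters that keep changing in response to new data.

    from typing import Callable, Iterable

    Model = Callable[[str], str]

    def compose(a: Model, b: Model) -> Model:
        # Feeding one frozen model's output into another is still one fixed
        # mapping from prompt to text, just a bigger one.
        return lambda prompt: b(a(prompt))

    def continual_training(weights: list[float],
                           stream: Iterable[tuple[list[float], float]],
                           lr: float = 0.01) -> list[float]:
        # The contrast: a loop where the parameters themselves keep moving
        # (toy gradient step on a linear model, standing in for real training).
        for features, target in stream:
            pred = sum(w * x for w, x in zip(weights, features))
            err = pred - target
            weights = [w - lr * err * x for w, x in zip(weights, features)]
        return weights

Composition never touches any weights, which is the whole point being made above.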
A snowflake is harmless. A million of them and you might freeze. And a trillion of them may bury your entire city under an avalanche.
Add in the AI effect, where once we learn how something works it's no longer "AI", and eventually we'll get to the point of having super-capable "intelligent" digital systems while a huge portion of the population is in denial about their capabilities.
I think we'll get there eventually, and maybe not that long of an eventually in the grand scheme of things. If one wants to bash the potential future handling of AI developments because of this, I have no qualms. I only take issue with the idea that anything ChatGPT Code in particular is doing is related to those concerns.
>Sure, in the sense that any combined system built with a book, a search engine, a wiki, or a forum as one of its parts is unlike the raw source of information by itself
And the GP was clearly referring to such a system, so insisting that it's just a book seems, charitably, off topic.
The point is that combining these systems doesn't result in intelligence, so such a combined system doesn't have it either; it's not that combined systems are an exception to my point.
What a dumb dystopia.