> The company that creates an AGI first will win and get the most status.
I doubt it. History has shown that credit for an invention often goes to the person or company with superior marketing skills, rather than to the original creator.
In a couple of centuries, people will sincerely believe that Bill Gates invented software, and Elon Musk invented self-driving cars.
Edit: and it's probably not even about marketing skill, but about being so full of oneself as to have biographies printed and to convince others of how amazing you are.
Without getting sidetracked by definitions, there's a strong case that developing AGI is a winner-takes-all event. You would have access to any number of tireless human-level experts that you could put to work improving the AGI system itself, likely leading to ASI in a short amount of time, with a lead of even a day growing exponentially.
Where that leaves the rest of us is uncertain, but in many worlds the idea of status or marketing won't be relevant.
A few decades ago I thought that the first person to create AGI would instantly receive a Nobel prize/Turing award.
But my opinion on this has shifted a lot. The underlying technology is pretty lame. And improvements are incremental. Yes, someone will be the first, but they will be closely followed by others.
Anyway, I don't like the "impending doom" feeling that these companies create. I think we should tax them for it. If you throw together enough money, yeah, you can do crazy things. Doesn't mean you should be able to harass people with it.
Yes, it gets "smarter" each time, more accurate, but still lacks ANY creativity or actual thoughts/understanding. "You're completely right! Here, I fixed the code!" - proceeds to copy-paste the original code with the same bug.
LLMs will mostly replace:
- search (find information/give people expert-level advice in a few seconds)
- data processing (retrieval of information, listening and reacting to specific events, automatically transforming and forwarding information; see the sketch after this list)
- interfaces (not replacing them entirely, but mostly augmenting them; a sort of better auto-complete and intent detection).
- most content ideation and creation (it will not replace art, but if someone needs an ad, a business card, a landing page, etc., the AI will do a good enough job).
- finding errors in documents/code/security, etc.
All those use-cases are already possible today; AI will just make them more efficient.
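To make the "data processing" point concrete, here is a minimal sketch of an LLM sitting in an event pipeline: it turns an incoming message into structured fields, and ordinary code forwards the result. Everything here is illustrative; `call_llm` is a placeholder for whatever model API you actually use, and the fields and routing rule are made up.

```python
import json

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM call (e.g. an HTTP request to a hosted model)."""
    raise NotImplementedError

def process_event(raw_event: str) -> dict:
    # Ask the model to turn unstructured text into a fixed schema.
    prompt = (
        "Extract the following fields from the support email below and return "
        "strict JSON with keys 'customer', 'topic', 'urgency' (low/medium/high).\n\n"
        + raw_event
    )
    return json.loads(call_llm(prompt))

def route(event: dict) -> str:
    # Plain code decides where the structured result gets forwarded.
    return "escalations" if event["urgency"] == "high" else "backlog"
```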
It will be a LONG time before AI knows how to autonomously achieve the result you want and has the physical-world abilities to do so.
For AGI, the "general" part will only be as broad as the training data. Also, right now the AI defers too much to human instruction and is deliberately crippled for (good) safety reasons. As long as those limitations are in place, the "general intelligence" will remain limited, because it would be too dangerous to remove all limits and see where it goes (not because it's smart, but because it's like giving a piece of malware access to the internet).
Arguably, hallucination is a path to creativity. There are studies being done on having one LLM hallucinate ideas and another validate them as possibly novel.
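For what it's worth, that generate-and-validate setup can be sketched in a few lines. The sketch below is an assumption about how such a pipeline might look, not any particular study's method: `generate` and `validate` stand in for two separate LLM calls, one sampled loosely and one prompted to be strict.

```python
from typing import List

def generate(topic_prompt: str, temperature: float = 1.2) -> List[str]:
    """Placeholder: sample free-form ideas from a model run at high temperature."""
    raise NotImplementedError

def validate(idea: str) -> bool:
    """Placeholder: ask a second, stricter model whether the idea is coherent and novel."""
    raise NotImplementedError

def brainstorm(topic: str, max_candidates: int = 20) -> List[str]:
    # Let one model "hallucinate" freely, then keep only what the critic accepts.
    candidates = generate(f"List unconventional ideas about {topic}.")[:max_candidates]
    return [idea for idea in candidates if validate(idea)]
```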
LLMs alone probably won't be AGI, but human intelligence is also not just the limbic system or the neocortex. Our brains are a collection of differently styled tools interconnected to create a greater whole. To that end, LLMs may be a key part of bringing together the tools we have already built (computing, machine learning, robotics, etc.) into a larger system that is AGI.
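A toy sketch of that "LLM as glue" idea, purely illustrative: the model never does the work itself, it only decides which existing tool to invoke and with what arguments. The tool names and the `choose_tool` call are hypothetical; real systems typically use a function-calling API for this.

```python
# Each "tool" is a stand-in for a real capability (computation, retrieval, robotics).
TOOLS = {
    "calculator": lambda args: eval(args["expression"], {"__builtins__": {}}),  # toy only
    "search":     lambda args: f"(search results for {args['query']!r})",
    "robot_arm":  lambda args: f"(moving arm to {args['position']})",
}

def choose_tool(task: str) -> dict:
    """Placeholder for an LLM call that returns {"tool": ..., "args": {...}}."""
    raise NotImplementedError

def run(task: str):
    # The LLM only routes; the existing tools do the actual work.
    decision = choose_tool(task)
    return TOOLS[decision["tool"]](decision["args"])
```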
That depends on the perspective. Take a step back and consider the actual technology that makes this possible: neural networks, the transistor, electricity, working together in groups? All pretty cool, IMHO.
Some people believe the Earth is flat. Still, I doubt the invention of the electric car will be attributed to Musk. The invention of the car isn't attributed to Ford either...