Humans were already on the path to doing this without any help from AI. We already have the potentially world-ending threats of both nuclear war and climate change, and I have yet to be convinced that AI is actually more dangerous than either of those.
We currently hold all the agency, and we have the potential to fix those threats. They're not binary: we can slow or reverse climate impact, and a nuclear war can stay small. Creating AI is a one-way function; once it exists, climate change, nuclear war, biological impact, or survival itself become outcomes of what the AI does. We hand it our agency, for good or ill.
Wait, what? Why is AI unlimited? There are many constraints like the speed of information, calculation, available memory, etc. Where does it cross into the physical world? And at what scale? Is it going to mine iron unnoticed or something? How will it get raw materials to build an army? Firewalls and air gapped systems are all suddenly worthless because AI has some instant and unbounded intelligence? The militaries of the world watch while eating hot dogs?
A lot of things CAN happen, but I'm confused when people state things as if they WILL. If you're that much of an oracle, tell me which stonk to buy so I can go on holiday.
Authoritarian, mendacious and unpredictable. Controls a lot of resources (i.e. space launchers, satellites with unknown capabilities, robotic vehicles, supercomputers, propaganda machines). Considers himself above the government.
When was the last time Musk abducted 15,000+ children and forcibly migrated them? Used the resources of a nation to invade a neighboring country with the aim of conquest? Come on, just admit that you were wrong to put them on the same level in your pyramid of people you hate.
Fortunately Sam Altman, not Musk, is running point at OpenAI. imho Sam is the perfect person for the job. If anyone can manage the risks of something like AGI while also optimizing for the benefits, it's Sam.
It's because he has money, influence, and can plausibly claim to know things about business. More to the point, he has been involved with OpenAI, and his reactions might give an indication of the internal politics there surrounding AI safety.
> More to the point, he has been involved with OpenAI and his reactions might give an indication of the internal politics there surrounding AI safety.
That’s an interesting thought, one I would have given more consideration in the early days of Musk. However, given Musk’s increasingly intense and emotional public outbursts, I’m more inclined to believe his concern is less about AI safety than about his ego being damaged for not being the one leading OpenAI.
Is he making a difference by making inefficient luxury cars? Cars and car-dependent infrastructure are part of the climate change problem, regardless of whether the cars burn fossil fuels.
If anything, he's using his wealth to solve the wrong problems, and has sucked up taxpayer resources to do so.
>When was the last time Musk abducted 15,000+ children and force migrated them?
When was the first time Putin did? As far as I know, it was just last year. Putin is 70 years old now and has been in control of Russia for over 20 years.
In short, Putin wasn't always this bad. He's gotten worse over the years.
Musk is now roughly the same age Putin was when he took power. If he somehow gains control over the resources of a nation like Putin did, he could be far worse than Putin in 20+ years.
The OP wasn't claiming that today's Musk is just as bad as today's Putin; he's just making examples of people with great potential for harm.
Putin led a similar genocidal campaign in Chechnya from day one of his ascent to power. The only reason Chechen children were not abducted is that Chechens are not Russian-passing and Russia had no desire to absorb them.
There's no hard limit on existential threats; we can keep adding more until one blows up and destroys us. Even if AI is less dangerous than nuclear destruction, that's not very comforting.
To call climate change 'world ending' is rather disingenuous, given that the world has been significantly hotter and colder than it is now just in the last 100k years.
It hasn't been this hot in millions of years, and refusing to distinguish between a world-ending event and one that destroys economies, societies, and eventually most life on the planet is disingenuous in itself.
Sure, it seems like a possible scenario, but if it's a great filter it would have to do that every time and never survive to spread to the stars. If it does spread to the stars, it will potentially conquer the galaxy quite quickly.