
He is less concerned that people could create an evil AI if they wanted to, and more concerned that no one could keep an AI from being evil even if we tried.



He expects the bad guy with an AI to be stopped by a good guy with an AI?


No, he expects the AI to kill us all even if it was built by a good guy.

We don't know how much this result improves his outlook, but he previously put our chance of extinction at over 95%: https://pauseai.info/pdoom


These guys and their black hole harvesting dreams always sound way too optimistic to me.

Humanity has a 100% chance of going extinct. Take it or leave it.


It'd be nice if it weren't in the next decade, though.


No, he expects a bad AI to be unstoppable by anybody, including the unwitting guy who runs it.


works for gun control :)


I hope this is sarcasm, because that is hardly a rule!



