
There's no hard limit on existential threats; we can keep adding more until one blows up and destroys us. Even if AI is less dangerous than nuclear destruction, that's not very comforting.


> Even if AI is less dangerous than nuclear destruction

It's not. At least with the nukes there's a chance of resetting civilization.



