
I don't think nuclear weapons are the kind of existential risk that AI doomsters imagine for AI.


Other than those who have called for nuking AI datacenters.


I feel obligated to point out that nobody has argued for nuking datacenters; the most that the most radical AI existential-safety advocates have argued for is "have a ban on advanced training programs, enforced with escalating measures, from economic sanctions and embargoes to, yes, war and bombing datacenters". Not that anybody is optimistic about that idea working.


That presumably demonstrates that they think nuclear war is less dangerous than AI.



