Your first paragraph is exactly how I feel about nuclear weapons, to put it into context. I don’t think the logical conclusion from that viewpoint is that nuclear weapons aren’t that dangerous so we should just move ahead.


The difference is that it is somewhat feasible to control access to the materials necessary to produce nuclear weapons.

It is not remotely feasible to control access to computing devices.


I don't think nuclear weapons are the kind of existential risk that AI doomsters imagine for AI.


Other than those that have called for nuking AI datacenters.


I feel obligated to point out that nobody has argued for nuking datacenters. The most the most radical AI existential-safety advocates have argued for is "a ban on advanced training runs, enforced with escalating measures, from economic sanctions and embargoes up to, yes, war and bombing datacenters." Not that anybody is optimistic about that idea working.


That presumably demonstrates they think nuclear war is less dangerous than AI.


I think it has been empirically demonstrated that lapses in the control and use of nuclear weapons can occur without the destruction of humanity.

(I am not an AI doomer, nor do I believe nuclear weapons are safe or should be less tightly controlled.)



