
Human beings need a target for vengeance and hatred. If someone kills your child, you can hate them. If an automated car kills your child, you don't have anyone to hate. Saying that this is about safety thresholds is a distraction from the true human problem exposed by automated cars:

"Which individual will be held accountable and risk jail time if their car kills someone you love, and how can this individual be identified from the appropriate government registries within 24 hours of a death?"

Until this is clearly defined in law, automated driving will continue to be resisted under any number of plausible justifications, and arguing with those justifications will have little effect.



The owner of the autonomous car is/should be responsible.


It’s less interesting to me who is registered as responsible, or what process is used to select that person. But if no specific, single named individual is registered as personally liable, with no possibility of a corporate liability shield, then public acceptance of self-driving cars will take far longer than it otherwise would.


This seems obviously wrong to me. What is your reasoning here?


Suppose you get framed when a vulnerability in your vehicle is exploited.


If, as the registered person, you were notified about the vulnerability and didn’t patch it, you could be convicted of criminal negligence at minimum, the same as a driver who ignores a recall notice and continues driving.



