Something like the trolley problem is at work here, but you're the one tied up on the tracks.
Suppose the accident rate for regular cars were 1 fatality every 100 million miles driven (roughly the actual rate in the US).
Suppose further that a hypothetical self-driving car has a proven rate of 1 fatality every 1 billion miles (10x better). Except when that fatality happens, it is because the car suddenly incinerates upon otherwise arriving safely at its destination. Something about the advanced AI technology makes this outcome completely random and completely unfixable.
Which do you choose? Drive yourself, 10x more dangerous? Or leave it entirely up to chance, but 10x safer?
The rational choice is to pick the self-driving car. Yet I suspect many people (including me, I admit) would choose to drive themselves.
How far apart do those numbers need to be before most people give up the steering wheel?
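To put rough numbers on the gap, here is a small back-of-the-envelope sketch. The mileage assumptions are mine, purely illustrative (about 13,500 miles a year over 60 years of driving), and fatalities are modeled as independent per-mile events:

```python
# Rough lifetime-risk comparison for the thought experiment above.
# Assumptions (illustrative, not from the original text): ~13,500 miles/year
# driven for 60 years, with each mile carrying an independent fatality risk.

LIFETIME_MILES = 13_500 * 60  # ~810,000 miles over a driving lifetime

def lifetime_fatality_probability(miles_per_fatality: float) -> float:
    """P(at least one fatal outcome) if each mile independently carries
    a 1/miles_per_fatality chance of a fatality."""
    p_per_mile = 1 / miles_per_fatality
    return 1 - (1 - p_per_mile) ** LIFETIME_MILES

human_drive  = lifetime_fatality_probability(100_000_000)    # 1 per 100M miles
self_driving = lifetime_fatality_probability(1_000_000_000)  # 1 per 1B miles

print(f"Drive yourself:   {human_drive:.2%} lifetime chance")   # ~0.81%
print(f"Self-driving car: {self_driving:.2%} lifetime chance")  # ~0.08%
```

Either way the absolute numbers are small, which may be part of why the choice feels more like the trolley problem than an expected-value calculation.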
An example of this effect can already be found in motorcycles. I currently own a BMW motorcycle and a Honda truck. The Honda has all the modern driver aids: radar-assisted automatic braking, lane keeping, and so on. It has many airbags and is statistically about 30x safer per mile than the motorcycle. The truck is far easier to drive. I still ride the motorcycle whenever I can. Why? Because the motorcycle forces me to become more fully human, and the truck turns me into more of a machine. On a motorcycle you smell the hay as you pass a field. You feel the cool air as you ride over the stream. Every tiny bump and crack in the pavement has an effect, and you feel them all. You are not in a car, you are in the world. You must PAY FULL ATTENTION to the here and now or you will get squished. A motorcycle forces you to BE HERE NOW.
Our mental suffering is not because the car is on autopilot. Suffering happens because WE ARE ON AUTOPILOT. So I choose to trade a 30x risk of death for a 30x reduction in mental suffering. Rational? God, I hope not.