Woah! And just as bad, the Tesla didn't even detect that it had run a kid over, so you're also guilty of a hit-and-run. Hitting a kid running out from behind a car is something you could argue a human might have done as well, depending on the circumstances. But a human would not have continued driving as if nothing had happened (well, not unless they're a monster).
Not just that, a human would have stopped for the school bus as required by law. So the errors are (1) not stopping the car for the school bus, (2) running over the kid, and (3) not detecting that it ran over a kid.
The problem is that this software has to be incredibly fault-tolerant, and with so much complexity that's extremely difficult. A 99.9% accuracy rate isn't good enough, because that could mean killing a child for every 1000 school buses I pass. It's why we still have pilots in planes even though computers do far more of the work than they used to.
I have taken lots of rides in Waymos, and they have always been smooth, so it is possible. The problem with Tesla is that its CEO lays down dogmatic rules, such as that humans only need eyes to drive, so his cars will have only cameras, no LiDAR. He needs to accept that robot cars cannot drive the way humans do. Robot cars can drive better than humans, but they have to do it in their own way.
I might be willing to take a robotaxi because if something does happen it's not my fault. Same with a bus or train. But I won't trust FSD on my own car (LIDAR or no LIDAR), except in certain circumstances like highway driving, because if something did happen, I'd be at fault even if it was the FSD that failed.
> I'd be at fault even if it was the FSD that failed
Unless you're driving a Mercedes with Drive Pilot [1], in which case Mercedes accepts liability [2]. Drive Pilot is not FSD yet, but presumably as it acquires more capabilities Mercedes will continue their policy of accepting liability.
> A 99.9% accuracy rate isn't good enough because I might kill a child for every 1000 school buses I pass
That wouldn't take nearly as high a level of accuracy as it sounds. The rule is not to overtake a stopped school bus, so a 99.9% rate would equate to one illegal overtake for every 1000 stopped buses encountered, and not every illegal overtake necessarily puts a child in immediate danger.
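For a rough sense of what a 99.9% per-encounter success rate implies, here's a back-of-the-envelope sketch (assuming each stopped-bus encounter is independent, which is a simplification):

```python
# Back-of-the-envelope: what a 99.9% per-encounter success rate means
# over many stopped-school-bus encounters. Assumes independent encounters.
success_rate = 0.999          # probability of handling one stopped bus correctly
encounters = 1000             # number of stopped buses encountered

expected_failures = encounters * (1 - success_rate)
p_at_least_one = 1 - success_rate ** encounters

print(f"Expected illegal overtakes: {expected_failures:.1f}")     # ~1.0
print(f"P(at least one illegal overtake): {p_at_least_one:.1%}")  # ~63.2%
```

So the failure you'd expect roughly once per 1000 stopped buses is an illegal overtake, not a death; only some fraction of those overtakes would actually endanger a child.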
Not just detect a pedestrian and plan a path through them. Hit a pedestrian and plan a path through them to finish the job.