17 fatalities, 736 crashes: The shocking toll of Tesla’s Autopilot (washingtonpost.com)
59 points by 0xedb on June 10, 2023 | 21 comments


> Former NHTSA senior safety adviser Missy Cummings, a professor at George Mason University’s College of Engineering and Computing, said the surge in Tesla crashes is troubling.

> “Tesla is having more severe — and fatal — crashes than people in a normal data set,” she said in response to the figures analyzed by The Post. One likely cause, she said, is the expanded rollout over the past year and a half of Full Self-Driving, which brings driver-assistance to city and residential streets. “The fact that … anybody and everybody can have it. … Is it reasonable to expect that might be leading to increased accident rates? Sure, absolutely.”

Absolutely terrifying that these beta tests are being conducted on public roads without everyone's consent.


Worrying about it being a “beta” is nonsensical. It’s a hands-on driver-assistance system available to the general public, and its safety should be evaluated accordingly. The word “beta” is meaningless from a safety perspective; it’s essentially just advertising that the system’s ultimate goal is to be driverless.

Also, we don’t use affirmative consent for vehicle regulations, because that’s silly. Instead, you are able to weigh in via the normal mechanism of a democracy.


Beta means it's not yet ready for production, and the question is why this beta-grade software is being used in production (on streets with real people vs. only in development environments), where it time and time again hurts and kills people. There are numerous people on HN saying that driving assistance almost killed them but they reacted in time. I don't see if/how those instances could be reported, so the real failure rates could obviously be a lot higher.

High-beam assistance should be an easy problem to solve compared to full self-driving, yet I turn it off because it's not good enough and blinds oncoming traffic.

IMHO there should be restrictions to only specific areas/roads for testing, or FSD cars should only be allowed at much lower speed limits.


Something like this should be up for a referendum.

We seriously have a lot more major and pressing problems. No one is picking and choosing representatives based on how they feel about beta testing software with the general public.

Anyway, I wouldn’t be surprised if the NHTSA rightfully bans large-scale operation of this software until overwhelming evidence of it being “safer than humans” has been given.

This also includes fully disclosing the source and training material used to the NHTSA.


> Something like this should be up for a referendum.

That would only increase the cost of reaching agreement by necessitating the purchase of adverts before the referendum. It was sold as miraculous. People bought into something they were told was miraculous. They would surely also vote for the miraculous thing to be allowed.


> Absolutely terrifying that these beta tests are being conducted on public roads without everyone's consent.

Tesla is far from the worst offender. I rented a Kia EV6, and the lane-keeping feature repeatedly tried to kill me within about 5 minutes of turning it on. I’ve had the same experience with plenty of other brand-new vehicles. The only difference is that Tesla’s implementation actually works so well that people stop paying attention to the road when they shouldn’t.


Just an anecdote, but I test drove three different kinds of Tesla (S, 3, Y) and the EV6 as well, and I found the lane keeping/cruise control in the EV6 to be substantially better. I think the EV6 has more and/or higher-def cameras, plus radar to detect cars, which Tesla has removed. (Although the S I drove had HW4, it was soon after HW4 was introduced, and it's possible the software had not yet been optimized for it.) For some reason, the Teslas seem to lack object permanence, so cars around you will flicker in and out of existence; that doesn't seem to happen on the Kia. I also learned that Tesla actually has completely different code for the base-level TACC (traffic-aware cruise control) versus the $15k FSD option. I assumed they were mostly the same with features turned off for TACC, but AFAIK they're completely different code.


I despise whoever wrote this article. It is almost completely devoid of any real information. Trying to gain an understanding of the situation based on the information presented here (total crash and fatality numbers all-time and last year, motorcycle numbers too) is impossible.

If you want to talk about this subject, find the following data: crash RATES for humans, Autopilot, and other driver-assistance systems, in total and split over various time periods, normalized by the number of systems in use in each period, and split across categories like highway vs. rural roads, or the types of vehicles involved.

If you cannot find this information, then you know nothing and don't have anything to say. Just make your article about the fact that these numbers are important, and do your best to give an idea of them. State the absolute numbers, but explain why they are meaningless on their own.

Right now this is just embarrassing.


I don’t understand why this has been downvoted. The article is clickbait fear-mongering. It claims 17 deaths in 3 years is a “shocking” toll and focuses on a couple of nasty accidents, yet has no opinion, comment, or issue with the 40,000 people that die _each year_ due to human error (a figure quoted nonchalantly in the article).

Articles like this prove how irrational humans are. Wanting to ban a system of driving that is quite literally orders of magnitude safer than a human driver, because very occasionally it makes a mistake a human wouldn’t make.


Order of magnitudes safer according to whomst, exactly? Whomst is supplying that data? And what exactly is their vested interest in claiming that it's safer?


It isn't actually safer, though. That's a pretty key point. Tesla has used misleading statistics in the past to claim that its driver assist (misleadingly labelled "full self driving") is safer than humans, but this is false. In the situations where Tesla's driver assist can be used, human drivers are much safer than they are on average, so comparing the accident rate directly does not tell you which is safer.
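
To make that concrete, here is a toy calculation in Python. Every rate below is invented purely to illustrate the statistical trap; the real stratified figures are exactly what Tesla does not publish.

    # Hypothetical crashes per million miles, split by road type.
    # All numbers are made up for illustration only.
    human_rate = {"highway": 0.5, "city": 2.0}
    autopilot_highway_rate = 0.6  # system used mostly on highways

    # Humans drive a mix of road types; assume 40% highway, 60% city.
    human_miles_share = {"highway": 0.4, "city": 0.6}

    human_aggregate = sum(
        human_rate[road] * share for road, share in human_miles_share.items()
    )
    print(human_aggregate)         # 1.4 crashes per million miles
    print(autopilot_highway_rate)  # 0.6 crashes per million miles

    # Naive comparison: 0.6 < 1.4, so the system looks ~2x "safer".
    # Stratified comparison: on highways, 0.6 > 0.5 -- humans are safer
    # in the only setting where the two can actually be compared.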


As long as Tesla keeps the real data secret it's reasonable for us to speculate. And it's reasonable to infer there's a reason they won't disclose it.

> It is unclear which of the systems was in use in the fatal crashes: Tesla has asked NHTSA not to disclose that information. In the section of the NHTSA data specifying the software version, Tesla’s incidents read — in all capital letters — “redacted, may contain confidential business information.”


Motorcyclist here. Generally, no one sees me, but I’m especially suspicious of a Tesla behind or to the side of me.


Teslas may not see you, but we do, LispSporks22. You matter! ;)

I wonder how many deaths and crashes it will take for regulators to decide to start creating test frameworks that car manufacturers must implement and pass.

It seems that rather than creating such frameworks as soon as it looks like there could be a problem, regulators wait until some particularly bad, highly publicised event happens to bring such things in.


These numbers don’t really tell you anything useful about whether Tesla is an outlier unless they’re normalized in some way, such as crashes per vehicle equipped with a driver assistance system.
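
As a sketch of what that normalization would look like (the 736-crash figure is from the article; the fleet sizes and comparison brand below are invented for illustration):

    # Hypothetical normalization: crashes per 1,000 equipped vehicles.
    # 736 is the article's Tesla crash count; the fleet sizes and
    # "brand_x" are made up purely to show why raw counts mislead.
    crashes = {"tesla": 736, "brand_x": 200}
    fleet_size = {"tesla": 400_000, "brand_x": 50_000}

    for brand, n_crashes in crashes.items():
        rate = n_crashes / fleet_size[brand] * 1_000
        print(f"{brand}: {rate:.1f} crashes per 1,000 equipped vehicles")
    # tesla:   1.8 per 1,000 vehicles
    # brand_x: 4.0 per 1,000 vehicles -- the smaller raw count turns
    # out to be the worse rate once you normalize by exposure.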


There were 3 million cars sold last year in the US; 1 million had ADAS. Tesla sold half of the cars with ADAS.


What do you consider ADAS? Saying half of the cars with ADAS are Teslas is suspicious as heck. I don't buy it. Where are you getting these numbers from?


Who could have predicted that a vaporware implementation that's not even beta would be the cause of all these fatalities?

Autopilot my ass. Just call it what it is (assisted driving) and set realistic expectations with drivers as to what its capabilities are.


Everyone who uses it is clearly warned that it is an assistive system, and drivers are constantly monitored for attentiveness.


Yeah. We need to make advertising it as autopilot illegal. It's not autopilot. Far from it. Also, claiming X and Y but the driver is responsible in case something happens is shady af.


I really don't think the name matters from a safety perspective. Also, if you want to base it on the definition, you'll find that it's a pretty reasonable analogy to an airplane autopilot. The pilot of an airplane can't just walk away while on autopilot. But I don't think the definition matters much, because I don't believe the name can cause significant safety issues, and there is no evidence to suggest it is causing issues in practice.

Also, your other argument really depends on what "X" and "Y" are, and I think you'll find that they are not really claiming what you think they are claiming.



