> by the lower number of accidents that AVs are involved in per distance travelled compared to human-driven vehicles

Just a note that this number is hard to calculate accurately with an acceptable degree of certainty.

Anyone claiming that AVs are involved in fewer accidents per distance traveled than human drivers is either extrapolating from incomplete data or baking some unreliable assumptions into their statement.

Welch Labs has a good introductory video to this topic: https://youtu.be/yaYER2M8dcs?si=XEB4aWlYf6gnnTqM



Tesla just announced 500 million miles driven by FSD [1]. Per the video, were it fully autonomous they could have a 95% CI on "safer than human" at only 275 million miles [2], but obviously having human supervision ought to remove many of the worst incidents from the dataset. Does anyone know if they publish disengagement data?

[1] https://digitalassets.tesla.com/tesla-contents/image/upload/...

[2] https://youtu.be/yaYER2M8dcs?t=477


This just shows how statistics can mislead. I own a Tesla with FSD and it's extremely unsafe for city driving. Just to quantify, I'd say at its absolute best, about 1 in 8 left turns result in a dangerous error that requires me to retake control of the car. There is no way it even comes close to approaching the safety of a human driver.


I only spent 3/4 of my post adding caveats, geez. Thanks for the firsthand intuition, though.


The caveats are missing the point that FSD is very obviously less safe than a human driver, unless you constrain the data to long stretches of interstate road during the day, with nice weather, clearly marked lane lines, and minimal construction. Even then, my "intuition" tells me human drivers are probably still safer, and under typical driving conditions they very obviously are (at least with Tesla FSD; I don't know about Waymo).


The reason why I spent 3/4 of my post on caveats was because I didn't want people to read my post as claiming that FSD was safe, and instead focus on my real point that the unthinkable numbers from the video aren't actually unthinkable anymore because Tesla has a massive fleet. You're right, though, I could have spent 5/6 of my post on caveats instead. I apologize for my indiscretion.


> my real point that the unthinkable numbers from the video aren't actually unthinkable anymore because Tesla has a massive fleet

Yes, I'm addressing that point directly, specifically the fact that this "unthinkable number" is misleading regardless of the number's magnitude.


FSD's imperfections and supervision do not invalidate their fleet's size and its consequent ability to collect training data and statistics (eventually, deaths per mile statistics). The low fleet size assumption in the presentation is simply toast.

If I had claimed that the 500 million number indicated a certain level of deaths-per-mile safety, that would be invalid -- but I spent 3/4 of my post emphasizing that it did not, even though you keep pretending otherwise.


You could start by comparing highway driving, where I think Tesla actually is quite good.


Tesla's mileage numbers are meaningless because the human has to take over frequently. They claim credit for miles driven, but don't disclose disconnects and near misses.

California companies with real self driving have to count their disconnects and report all accidents, however minor, to DMV. You can read the disconnect reports online.


Do you trust claims and data from Tesla?


Do you think they lied about miles driven in the investor presentation?

Nah, that would be illegal. Their statement leaves plenty of room for dirty laundry though. I'm sure they won't disclose disengagement data unless forced, but they have plenty of legal battles that might force them to disclose. That's why I'm asking around. I'd love to rummage through. Or, better, to read an article from someone else who spent the time.


> Nah, that would be illegal.

Musk has violated many rules regarding investors.


Note that it would need to drive those 275 million miles without incident to be safer than a human.

Which for Tesla's FSD is obviously not the case.

https://www.motortrend.com/news/tesla-fsd-autopilot-crashes-...


Your video and my response were talking about fatal crashes. Humans don't go 100 million miles between crashes.

Has FSD had a fatality? Autopilot (the lane-follower) has had a few, but I don't think I've heard about one on FSD, and if their presentations on occupancy networks are to be believed there is a pretty big distinction between the two.


Isn't "FSD" the thing they're no longer allowed to call self driving because it keeps killing cyclists? Google suggests lots of Tesla+cyclist+dead but with Tesla claiming it's all fine and not their fault, which isn't immediately persuasive.


> Google suggests lots of Tesla+cyclist+dead but with Tesla claiming it's all fine and not their fault, which isn't immediately persuasive.

With human drivers -- are we blaming Tesla for those too?

You do you, but I'm here to learn about FSD. It looks like there was a public incident where FSD lunged at a cyclist. See, that's what I'm interested in, and that's why I asked if anyone knew about disengagement stats.


It appears that the clever trick is to have the automated system make choices that would be commercially unfortunate - such as killing the cyclist - but to hand control back to the human driver just before the event occurs. Thus Tesla are not at fault. I feel ok with blaming Tesla for that, yeah.


Is that real? I've heard it widely repeated but the NHTSA definitions very strongly suggest that this loophole doesn't actually exist:

> https://static.nhtsa.gov/odi/ffdd/sgo-2021-01/SGO-2021-01_Da...

> The Reporting Entity’s report of the highest-level driving automation system engaged at any time during the period 30 seconds immediately prior to the commencement of the crash through the conclusion of the crash. Possible values: ADAS, ADS, “Unknown, see Narrative.”
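
To make the definition concrete, here's a toy check (made-up telemetry and a function name of my own, not anything from NHTSA's tooling) showing why switching the system off a second before impact wouldn't take a crash out of scope:

    from datetime import datetime, timedelta

    # Hypothetical telemetry: intervals when a driver-assistance system was engaged.
    # Per the quoted definition, the crash is reported against the system if it was
    # engaged at ANY point in the 30 seconds before the crash.
    def reportable_automation(engagement_intervals, crash_time):
        window_start = crash_time - timedelta(seconds=30)
        return any(end >= window_start and start <= crash_time
                   for start, end in engagement_intervals)

    crash = datetime(2024, 1, 1, 12, 0, 30)
    # System disengaged one second before impact -- still inside the 30-second window.
    intervals = [(datetime(2024, 1, 1, 11, 55, 0), crash - timedelta(seconds=1))]
    print(reportable_automation(intervals, crash))  # True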


"It appears" according to what?

Stuff people made up is a bad reason to blame a company.


From here[1]:

> The new data set stems from a federal order last summer requiring automakers to report crashes involving driver assistance to assess whether the technology presented safety risks. Tesla‘s vehicles have been found to shut off the advanced driver-assistance system, Autopilot, around one second before impact, according to the regulators.

[1] https://www.washingtonpost.com/technology/2022/06/15/tesla-a...


You also need to cite them using that as a way to attempt to avoid fault.

Especially because the first sentence you quoted strongly suggests they do get counted.


Yeah, their very faux self driving package.


Can someone summarize the video? That was my first thought as well: crash data for humans is clearly underreported. For example, police don't always write reports, or the drivers involved agree to keep it off the books.


The probability of a human driver causing a fatality on any given mile driven is 0.00000109% (1.09 fatalities occur per 100 million miles driven).

Applying some basic statistics, to show at a 95% confidence level that a self-driving system causes fewer fatalities than a human, you would have to drive 275 million autonomous miles flawlessly.

This would require a fleet of 100 vehicles to drive continuously for 12.56 years.

And in practice self-driving vehicles don't drive flawlessly. The best estimate of the number of miles actually needed to validate their safety is around 5 billion autonomously driven miles, and that's assuming they really are safer than a human driver.
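
If it helps, here's the back-of-the-envelope version of those numbers (the 25 mph fleet average is my assumption to make the arithmetic land near the video's figure, not something stated in it):

    import math

    # Baseline: 1.09 fatalities per 100 million human-driven miles.
    p_fatal_per_mile = 1.09 / 100_000_000

    # Zero-event bound ("rule of three" style): fatality-free miles needed
    # before a rate below the human baseline can be claimed at 95% confidence.
    miles_needed = -math.log(0.05) / p_fatal_per_mile
    print(f"{miles_needed / 1e6:.0f} million miles")   # ~275 million

    # Fleet time, assuming a 100-car fleet averaging 25 mph around the clock.
    hours = miles_needed / (100 * 25)
    print(f"{hours / (24 * 365.25):.1f} years")        # ~12.5 years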

Then you get into the comparison itself. In practice AVs don't drive on all the same roads, at the same times, as human drivers. A disproportionate number of accidents happen at night, in adverse weather, and on roads that AVs don't typically drive on.

Then you have to ask whether comparing AVs to all drivers and vehicles is a valid comparison. We know, for instance, that vehicles with automatic braking and lane assist are involved in fewer accidents.

Then of course, if minimizing accidents is really what you care about, there's something easy we could do right now: just mandate that all vehicles have a breathalyzer ignition. We do this for some people who have been convicted of DUI, but doing it for everyone would eliminate a third of fatalities.


> Then of course, if minimizing accidents is really what you care about, there's something easy we could do right now: just mandate that all vehicles have a breathalyzer ignition. We do this for some people who have been convicted of DUI, but doing it for everyone would eliminate a third of fatalities.

In a similar vein, if we put geo-informed speed governors in cars that physically prevented you from exceeding, say, 20% of the speed limit, fatalities would also likely plummet.

But people haaaaate that idea.


I'm fine with it notifying the driver that it thinks you might be speeding, but I don't like the idea of actually limiting the speed of the car. I've used several cars that routinely failed to track the speed limit correctly: driving near but not in a construction zone, driving in an express lane with a faster posted speed than the main highway, school zones. A few months ago I was on an 85 MPH highway and Google Maps suddenly thought I was on the 45 MPH feeder. Add 20%, and that's a 54 MPH max speed. So what, my car would have enforced the roughly 30 MPH drop and slammed on the brakes to get into compliance?

I'd greatly prefer just automatic enforcement of speeding laws rather than doing things to try and prevent people from speeding.


Honestly I would think something like transponders on every freeway would work better than GPS. Regardless, I think everyone in the thread could think of 10 technological ways of making this work. I think the biggest barriers are political, not logistical, and definitely not engineering.


So we spend a ton of money putting transponders and readers in cars, which still have various failure modes, or we just put cameras on the highways and intersections and say "car with tag 123123 went from gate 1 to gate 2 in x minutes, those are y miles apart, its average speed had to be > the speed limit, issue a ticket to 123123".

The toll roads could trivially automatically enforce speed limits. They already precisely know when each car goes through each gantry, they know the distance between each gantry, so they know everyone's average speed.
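
As a sketch of how simple the gantry math is (the distance, limit, and timestamps here are made up, and the function name is mine):

    from datetime import datetime, timedelta

    SPEED_LIMIT_MPH = 70          # hypothetical posted limit
    GANTRY_DISTANCE_MILES = 14.0  # hypothetical known distance between gantries

    def average_speed_violation(tag, t_gate1, t_gate2):
        # Flag a vehicle whose average speed between two gantries exceeds the limit.
        hours = (t_gate2 - t_gate1).total_seconds() / 3600
        return GANTRY_DISTANCE_MILES / hours > SPEED_LIMIT_MPH

    # Tag 123123 covered 14 miles in 10 minutes -> 84 mph average, so issue a ticket.
    t1 = datetime(2024, 1, 1, 12, 0, 0)
    print(average_speed_violation("123123", t1, t1 + timedelta(minutes=10)))  # True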


Mostly because I think it would glitch and get the speeds wrong.

If it were 100% accurate and you couldn't get a speeding ticket while it was active, I'd be all for it.


Yeah, because you need to be able to use your vehicle to escape pursuers and also as a ramming weapon. I assume police would get an exception from this rule, but they don't actually have more of a legal right to use their vehicle as a weapon than anyone else; they're just less likely to have their judgement that the situation was an emergency questioned by the DA. Probably also a Second Amendment violation, but the Supreme Court might be too originalist (and not textualist enough) to buy that argument, as cars did not exist in the decades surrounding the founding.


> prevented you from exceeding, say, 20% of the speed limit

I initially read this as "20% of the speed of light", and thought you were being sarcastic.


Did you mean exceeding the speed limit by 20%?

Because what you actually said is true too, and hints at why "it would be safer" is not a good enough reason to implement something.


I doubt it. Both of these methods are very intrusive.


I sustained traumatic injuries a few months ago when a driver on a suspended learner's permit hit me. The lazy cop issued no tickets for the multiple traffic violations. He couldn't be bothered to show up for the trial, and the lazy prosecutor, who only notified me of the trial three days in advance, went with a bare-minimum wrist slap for the suspension. It's as if it officially never happened.


That's awful. The amount of egregious vehicular violence the US has tolerated is disgusting. Waymo seems like the best bet for making experiences like yours a thing of the past.


And I'm not even sure how reliable the "miles driven" metric is. I mean, I'm sure you can estimate it somehow, but what's the margin of error there?


Odometers are pretty well regulated and insurance companies will often have a good record of the readings over long periods. I'm not sure how the org doing the data collection does it precisely, but pretty accurate data is out there.



