Cops “almost 99.9% sure” Tesla had no one at the wheel before deadly crash (arstechnica.com)
39 points by rurp on April 20, 2021 | 53 comments


I know Tesla cutting its PR department adds to the company's scrappy, edgy vibe, but messaging and basic facts on these kinds of incidents shouldn't be relegated to the CEO's Twitter replies to a rando [0].

Excerpt from Reuters [1]:

> Herman said a tweet by Musk on Monday afternoon, saying that data logs retrieved by the company so far ruled out the use of the Autopilot system, was the first officials had heard from the company.

> “If he is tweeting that out, if he has already pulled the data, he hasn’t told us that,” Herman told Reuters. “We will eagerly wait for that data.”

edit: For some perspective, back in 2018, when a Model X owner died in the Highway 101 crash, Tesla's PR team put out a blog post with some actual detail several days later [2]. Tesla has sold far, far more vehicles since then – never mind becoming a global operation, never mind Boring Co. and SpaceX also growing. It makes no sense for Musk to be deep in the details of a single crash, and seeing him tweet a non-committal phrase like "Data logs recovered so far..." adds virtually nothing to an already confusing situation.

[0] https://twitter.com/elonmusk/status/1384254194975010826

[1] https://www.reuters.com/article/tesla-accident-texas/update-...

[2] https://www.tesla.com/blog/what-we-know-about-last-weeks-acc...


>It makes no sense for Musk to be deep in the details of a single crash

I'd say it makes more sense than a lot of his Twitter usage, actually. Musk deciding to personally involve himself in an incident like this tracks with just about everything I've heard or read about him as a person and a CEO.


If you are the CEO of a company that sells a robot, shouldn’t you be deeply concerned about any time one of your robots is accused of killing people?

Likewise, public perception of Autopilot is very important to Tesla’s marketing and market share.

Tesla’s CEO should be very concerned about both of those issues.


It is amazing how many of these articles don't mention, or barely mention, that Teslas have multiple systems to try to prevent this, which would have needed to be circumvented for this car to be using Autopilot without anyone at the wheel. The systems can obviously be improved, as they are very easy to circumvent, but the mere need to circumvent them is part of the story: it shows the people in the car were aware they were misusing the system and that this was a dangerous idea.
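
For illustration, here's a minimal sketch (in Python, with invented names and thresholds – Tesla's actual interlock logic isn't public) of the kind of layered driver-presence checks being described:

    # Hypothetical driver-presence interlock; function name, inputs, and
    # thresholds are invented for illustration, not taken from Tesla.
    def autopilot_may_remain_engaged(seatbelt_buckled: bool,
                                     driver_door_closed: bool,
                                     seconds_since_wheel_torque: float) -> bool:
        """Disengage as soon as any driver-presence check fails."""
        if not seatbelt_buckled:             # unbuckling disengages immediately
            return False
        if not driver_door_closed:           # opening a door disengages
            return False
        if seconds_since_wheel_torque > 30:  # ignoring the "hands on wheel" nag
            return False                     # eventually forces disengagement
        return True

Each check is trivial to fool in isolation (buckle the belt behind you, hang a weight on the wheel), which is exactly the point: defeating all of them takes deliberate effort.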


> Adding to the confusion, Musk himself has appeared on “60 Minutes” and Bloomberg TV behind the wheel of a Tesla with his hands in the air. He’s been talking about Tesla’s fully autonomous technology as if it’s imminent since 2016. That year, Tesla posted a video showing one of its cars running in autonomous mode through Palo Alto. “The person in the driver’s seat is only there for legal reasons,” the video said.

https://www.latimes.com/business/story/2021-04-19/tesla-on-a...


Advertising... is shitty.

Not just for Tesla; lots of companies make exaggerated claims about their products. Take a look at any advertising around SuperCruise and you’ll see the phrase “Hands Free Driving” about a million times. But fundamentally it has the same limits on self-driving as Tesla. SuperCruise and Autopilot both rely on the driver to intervene. If Musk gets crap for briefly taking his hands off the wheel on TV, how does GM get a free pass on marketing as “Hands Free”?


Not sure how that is relevant to my specific point. There is someone in the driver's seat in that example.


Or bugs.

Any programmer will tell you bugs exist; both the telemetry and the police could technically be reporting the truth.


I am guessing that after hitting the tree the doors got jammed, so the driver got into the back seat to try to force open the back door or reach the trunk. They burned alive.


Yeah. Teslas require power to open the doors. There are mechanical emergency release latches, but the back doors can't be easily opened because the latch is not in an obvious place; it's hidden in the base of the seat. The front doors have a mechanical latch in the door, but I'm not sure how easily the door opens without power.

See the placement of back door emergency release in Model S: https://www.youtube.com/watch?v=01lXcD_Uz74&t=42s

Model 3 doesn't have emergency release latches for back doors.


Latch or not, it's common for a collision to leave a door physically unable to open due to deformation. In fact, in small aircraft (general aviation), you're taught to open the door and jam something between the door and the frame before an emergency landing to avoid being trapped.


This is as likely as pretty much any explanation I’ve heard.

I think Tesla tries to be too clever by half with their doors. I sure hope the rumor that they are eliminating outside handles on the Cybertruck is nonsense.


this does seem like the most plausible explanation. doors aren't guaranteed to work after a collision.


This whole story is suspect; Autopilot requires the seat to be weighted in order to be engaged. Elon also stated that the car didn't have Autopilot enabled or FSD functionality: https://twitter.com/elonmusk/status/1384254194975010826


Has this recently changed? I've certainly seen videos of Autopilot running with no driver in the seat, e.g.:

https://www.youtube.com/watch?v=Z_N5zrFJgII


According to the guy in the video (September 2020), the seat sensor exists but doesn't interact with Autopilot.

The steps he had to take to turn it on and actually exit the vehicle without it turning off automatically were pretty comical though (TL;DW for anyone not watching):

1. Clip his seatbelt behind him, so he could keep it attached after getting out of the seat (undoing your seatbelt turns off autopilot)

2. Start driving at a low speed, turn on autopilot, then bring it to a stop

3. Climb out the window (as the doors didn't seem like they would open while the car was in motion)

4. Use a stick/tripod to tap on the acceleration pedal to resume autopilot driving

And even then, the vehicle stopped like 10 seconds later when autopilot got confused about its surroundings.

Even if the seat needs to be weighted down, this seems like a pretty trivial thing to do (put a backpack or something on the seat) if you're already going out of your way to intentionally trick the car in so many other ways.

Since the car didn't even have autopilot as a feature, it seems way more likely the riders just went with the age-old "prop the pedal down" method that's caused accidents like this since the beginning of cars.


Seems like only one trick is necessary:

1. Strap in seatbelt behind you

2. Start autopilot

3. Remove yourself from the driver seat, into the passenger seat

And now the car is in autopilot with no driver.

TBH Tesla should be doing more about this; it's too easy.


This is not true; people routinely get around this restriction.


Perhaps the driver was in the driver's seat and engaged autopilot and then hopped over into the passenger's seat and that's exactly when it crashed?


The car in question didn't even have Autopilot, so it wasn't a feature that could theoretically have been engaged.

It's like saying they started the internal combustion engine before the crash... which clearly can't be possible, because that vehicle doesn't have an internal combustion engine.


Where does it say it didn’t have autopilot?


In the tweet from Elon in the top-level comment.


The statement from Musk was "Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD.", which I would interpret to mean "Autopilot was purchased but wasn't enabled at the time; FSD wasn't purchased at all", otherwise I'd expect him to have written "this car did not purchase Autopilot and FSD."


Are Autopilot and FSD not the same thing?


Not quite, no.

AIUI Autopilot is the basic "can sometimes handle steering the car through basic tasks" package, and FSD is the additional "we promise this will be full autonomous driving one day if you buy it now" package.

Evidently as of a certain point all new Teslas come with Autopilot for free (but not the FSD license), and you can always buy upgrades after the fact (unless you have a Tesla sold prior to them shipping the current full suite of sensors in every car...)

cf. https://www.tesla.com/support/autopilot


The car could have been hacked to enable Autopilot. There are lots of shady places that will do that for you and save you lots of $$$...


Can you elaborate on that? I've heard this too, but Tesla also claims their software is unhackable...


You drive the car to them in Seattle, fork over $4,300, and they spend about an hour enabling FSD. You never get software updates and can't do warranty stuff afterwards.


There is no such thing as bug-free, un-hackable software! Any claims to the contrary are modern-day snake oil.


Elon could be wrong if it's a hacked car with autopilot enabled without paying.


Do all model years (rolling versions) have seat weight sensors?


It would be great to have some statistics on who's using Autopilot and for how long.

I myself have been driving one for a year now, and the Tesla brakes suddenly so often (at least once per trip), because it thinks it's on some parallel road or an overpass with a different speed limit, that I would never trust any additional feature.

I'm curious if trust builds up over time or gets eroded. Or if this completely depends on the person.


I am pretty sure they enabled it, then the driver changed seats to be cool to pedestrians. Autopilot probably disengaged after some beeps. RIP.


Tesla’s Autopilot is very comparable to an autopilot in an aircraft (read a bit before downvoting). It is something you engage to take load off the vehicle operator, but it does not eliminate the need for the operator.

Autopilot (both in aircraft and in the Tesla) is really good at what it does. But if you start thinking about it as anything other than something which helps reduce the vehicle operators workload, it is dangerous.

Pilots understand this and get pretty extensive training. Some Tesla owners clearly do not. Some of the blame lies with Tesla due to marketing. Perhaps these hybrid machine/human driving systems should require training.


Dunno why the focus is on autopilot exclusively.

You don't need autopilot to be an idiot here... people have crashed by setting cruise control and then messing around, either to show off or tempt fate.


The focus is on Autopilot because people were assuming it was the path of least resistance for stupidity here, not realizing that Autopilot has constraints to try to prevent people from doing crap like this, and because it's an opportunity to dunk on Tesla's self-driving.

(I'm not defending Tesla in general; I think they're a deeply flawed company. I just think it's important to have a confirmed factual basis for dunking on companies.)


Recent and related:

No one was in driver’s seat in fatal Tesla crash - https://news.ycombinator.com/item?id=26866754 - April 2021 (79 comments)

Two people killed in fiery Tesla crash with no one driving - https://news.ycombinator.com/item?id=26852399 - April 2021 (706 comments)


Shouldn't it be the NTSB investigating the black box logs? Tesla has a vested interest in presenting any analysis that mitigates or even eliminates their culpability.


Any reason why the facts on this are coming from Elon's twitter?

For such a large company, it seems strange that the CEO would be the one responsible for public damage control.


If you're talking about any other company, then yes, but Elon is known to be the PR/face person (when he wants to be, which is often) and at times... gets very personal.


Seems like yet another quality he shares with Trump then.


What the hell does “almost 99.9% sure” even mean anyway?


It actually has a pretty clear meaning: 99.9% means that someone ignorant (or lacking integrity) wants to project unrealistic certainty. That's what that phrase communicates.


Yeah, but it's not 99.9%, it's "almost 99.9%". Very weird statement.


I think Musk said he'd get to 99.9999% safety by the end of 2020 (the level he said they needed for the complex intersections required for the Level 5 robotaxi network they were supposed to launch in 2020).


Yeah, it's a Yogi-ism. It's a sign of our times though, isn't it? Expressing absolute certainty is almost sacrilegious these days. "I'm 99.99999...% sure."


It means anywhere from 0% to 99.9% sure.


"Almost sure" does not exist. You are either sure, or not sure.



Sure...


Maybe they're insinuating that the position of the bodies rules out a scenario where the driver was thrown into the passenger or back seat by the impact.


In a frontal collision, unrestrained back-seat passengers get thrown toward the front of the car: the car stops, but the occupants keep their forward momentum. Basic physics!


That the sequence of their surety almost converges to 99.9%?

https://en.m.wikipedia.org/wiki/Almost_convergent_sequence

My grasp of this advanced math slipped away long ago, so I'm still not sure what it means.
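
If it helps, a rough paraphrase of the linked definition (Lorentz's characterization, as I read it, so take it with a grain of salt): a bounded sequence (x_n) is almost convergent to L when its sliding-window averages converge to L uniformly in the window's starting point, i.e.

    \lim_{p \to \infty} \frac{1}{p} \sum_{n=k}^{k+p-1} x_n = L \quad \text{uniformly in } k

So even "almost convergent" surety would still have its long-run average pinned at 99.9%.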



