I know Tesla cutting its PR department adds to the company's scrappy, edgy vibe, but messaging and basic facts on these kinds of incidents shouldn't be relegated to the CEO's Twitter replies to a rando [0].
Excerpt from Reuters [1]
> Herman said a tweet by Musk on Monday afternoon, saying that data logs retrieved by the company so far ruled out the use of the Autopilot system, was the first officials had heard from the company.
> “If he is tweeting that out, if he has already pulled the data, he hasn’t told us that,” Herman told Reuters. “We will eagerly wait for that data.”
edit: For some perspective, back in 2018, when a Model X owner died in the Highway 101 crash, Tesla's PR team put out a blog post with some actual detail several days later [2]. Tesla has sold far, far more vehicles since then – never mind becoming a global operation, never mind Boring Co. and SpaceX also growing. It makes no sense for Musk to be deep in the details of a single crash, and seeing him tweet a non-committal phrase like "Data logs recovered so far..." adds virtually nothing to an already confusing situation.
>It makes no sense for Musk to be deep in the details of a single crash
I'd say it makes more sense than a lot of his Twitter usage, actually. Musk deciding to personally involve himself in an incident like this tracks with just about everything I've heard or read about him as a person and a CEO.
It is amazing how many of these articles don't mention, or barely mention, that Teslas have multiple systems to try to prevent this, which would have needed to be circumvented for this car to be using Autopilot without anyone at the wheel. The systems can obviously be improved, as they are very easy to circumvent, but the mere need to circumvent them is part of the story: it shows the people in the car knew they were misusing the system and that this was a dangerous idea.
> Adding to the confusion, Musk himself has appeared on “60 Minutes” and Bloomberg TV behind the wheel of a Tesla with his hands in the air. He’s been talking about Tesla's fully autonomous technology as if it’s imminent since 2016. That year, Tesla posted a video showing one of its cars running in autonomous mode through Palo Alto. “The person in the driver’s seat is only there for legal reasons,” the video said.
It's not just Tesla; lots of companies make exaggerated claims about their products. Take a look at any advertising around SuperCruise and you’ll see the phrase “Hands Free Driving” about a million times. But fundamentally it has the same limits on self-driving as Tesla. SuperCruise and Autopilot both rely on the driver to intervene. If Musk gets crap for briefly taking his hands off the wheel on TV, how does GM get a free pass on marketing as “Hands Free”?
I am guessing after hitting the tree the doors got jammed, so the driver got in the back seat to try to force open the back door or trunk access. They burned alive.
Yeah. Teslas require power to open the doors. There are mechanical emergency release latches, but the back doors can't be easily opened because the latch is not in an obvious place; it's hidden in the base of the seat. The front doors have a mechanical latch in the door itself, but I'm not sure how easily the door opens without power.
Latch or not, it's common for a collision to leave a door physically unable to open due to deformation. In fact, in small aircraft (general aviation), you're taught to open the door and jam something between the door and the frame before an emergency landing to avoid being trapped.
This is as likely as pretty much any explanation I’ve heard.
I think Tesla tries to be too clever by half with their doors. I sure hope the rumor that they are eliminating outside handles on the Cybertruck is nonsense.
This whole story is suspect: Autopilot requires the driver's seat to be weighted in order to be engaged. Elon also stated that the car didn't have Autopilot enabled or FSD functionality: https://twitter.com/elonmusk/status/1384254194975010826
According to the guy in the video (September 2020) the seat sensor exists but doesn't interact with autopilot.
The steps he had to take to turn it on and actually exit the vehicle without it turning off automatically were pretty comical though (TL;DW for anyone not watching):
1. Clip his seatbelt behind him, so he could keep it attached after getting out of the seat (undoing your seatbelt turns off autopilot)
2. Start driving at a low speed, turn on autopilot, then bring it to a stop
3. Climb out the window (as the doors didn't seem like they would open while the car was in motion)
4. Use a stick/tripod to tap on the acceleration pedal to resume autopilot driving
And even then, the vehicle stopped like 10 seconds later when autopilot got confused about its surroundings.
Even if the seat needs to be weighted down, this seems like a pretty trivial thing to do (put a backpack or something in the seat) if you're already going out of your way to intentionally trick the car in so many other ways.
Since the car didn't even have autopilot as a feature, it seems way more likely the riders just went with the age-old "prop the pedal down" method that's caused accidents like this since the beginning of cars.
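To make the point about why each of those steps was necessary, here is a minimal sketch of how such driver-presence interlocks could be modeled. This is purely illustrative Python under my own assumptions: the names and checks are made up, and this is not Tesla's actual logic, just a way to see how each step in the video maps onto defeating one specific check.

```python
# Hypothetical sketch of driver-presence interlocks, NOT Tesla's real code.
# Each check corresponds to one step the tester in the video had to defeat.
from dataclasses import dataclass

@dataclass
class CabinState:
    seat_occupied: bool      # weight sensor in the driver's seat
    seatbelt_latched: bool   # defeated by clipping the belt behind the seat
    doors_closed: bool       # why he climbed out the window instead
    vehicle_moving: bool     # Autopilot engaged at low speed, then stopped

def autopilot_may_stay_engaged(state: CabinState) -> bool:
    """Return True only while every driver-presence check passes."""
    return (
        state.seat_occupied
        and state.seatbelt_latched
        and state.doors_closed
    )

if __name__ == "__main__":
    # All interlocks fooled at once takes deliberate effort, which is the
    # point the parent comments make about knowing misuse.
    rigged = CabinState(seat_occupied=True,    # e.g. a backpack on the seat
                        seatbelt_latched=True, # belt clipped behind the body
                        doors_closed=True,
                        vehicle_moving=True)
    print(autopilot_may_stay_engaged(rigged))  # True -> engagement persists
```

The takeaway isn't that the checks are strong (they clearly aren't), just that every one of them has to be deliberately bypassed, which undercuts the "it happened by accident" framing.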
The car in question didn't even have autopilot, so it wasn't even a feature on that car that could have theoretically been engaged.
It's like saying they started the internal combustion engine before the crash... which clearly can't be possible, because that vehicle doesn't have an internal combustion engine.
The statement from Musk was "Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD.", which I would interpret to mean "Autopilot was purchased but wasn't enabled at the time; FSD wasn't purchased at all", otherwise I'd expect him to have written "this car did not purchase Autopilot and FSD."
AIUI Autopilot is the basic "can sometimes handle steering the car through basic tasks" package, and FSD is the additional "we promise this will be full autonomous driving one day if you buy it now" package.
Evidently as of a certain point all new Teslas come with Autopilot for free (but not the FSD license), and you can always buy upgrades after the fact (unless you have a Tesla sold prior to them shipping the current full suite of sensors in every car...)
You drive the car to them in Seattle, fork over $4,300, they spend about an hour enabling FSD. You never get software updates and can't do warranty stuff afterwards.
It would be great to have some statistics on who's using Autopilot and for how long.
I've been driving one for a year now, and the Tesla brakes suddenly so often (at least once per trip) because it thinks it's on some parallel road or an overpass with a different speed limit that I would never trust any additional feature.
I'm curious if trust builds up over time or gets eroded. Or if this completely depends on the person.
Tesla’s autopilot is very comparable to autopilot in aircraft (read a bit before downvoting). It is something you engage to take the load off the vehicle operator but does not eliminate the need for the operator.
Autopilot (both in aircraft and in the Tesla) is really good at what it does. But if you start thinking about it as anything other than something that helps reduce the vehicle operator's workload, it is dangerous.
Pilots understand this and get pretty extensive training. Some Tesla owners clearly do not. Some of the blame lies with Tesla due to marketing. Perhaps these hybrid machine/human driving systems should require training.
You don't need autopilot to be an idiot here... people have crashed by setting cruise control and then messing around, either to show off or tempt fate.
The focus is because people were assuming Autopilot was the path of least resistance for stupidity here, not realizing that Autopilot has constraints to try to prevent people from doing crap like this, and because it becomes an opportunity to dunk on Tesla's self-driving.
(I'm not defending Tesla in general; I think they're a deeply flawed company. I just think it's important to have a confirmed factual basis for dunking on companies.)
Shouldn't it be the NTSB investigating the black box logs? Tesla has a vested interest in presenting any analysis that mitigates or even eliminates their culpability.
If you were talking about any other company, then yes, but Elon is known to be the PR/face person (when he wants to be, which is often) and at times... gets very personal.
It actually has a pretty clear meaning: 99.9% means that someone ignorant (or lacking integrity) wants to project unrealistic certainty. That's what that phrase communicates.
I think Musk has said he'll get to 99.9999% safety by the end of 2020 (what he said they needed for the complex intersections required for the Level 5 robotaxi network they were supposed to launch in 2020).
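Just to put that figure in perspective, here's a quick back-of-envelope calculation, assuming (my reading, not anything Musk specified) that "99.9999%" means a per-intersection-traversal success rate, with a fleet volume I made up purely for scale:

```python
# Rough arithmetic on what "six nines" per intersection traversal would imply.
success_rate = 0.999999
failure_rate = 1 - success_rate              # 1e-06

traversals_per_day = 10_000_000              # hypothetical fleet-wide crossings/day
expected_failures_per_day = traversals_per_day * failure_rate
print(expected_failures_per_day)             # 10.0 -> ~10 failures/day even at six nines
```

Which is part of why precise-sounding percentages in tweets don't say much without the denominator attached.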
Yeah, it's a Yogi-ism. It's a sign of our time though, isn't it? Expressing absolute certainty is almost sacrilegious these days. "I'm 99.99999...% sure."
[0] https://twitter.com/elonmusk/status/1384254194975010826
[1] https://www.reuters.com/article/tesla-accident-texas/update-...
[2] https://www.tesla.com/blog/what-we-know-about-last-weeks-acc...