It could have been posted on May 20, 2024, and I don't think it would have any real meaning. Musk has zero credibility on this sort of thing regardless of when it was posted.
Except for the part where their “safety reports” are aggressively and intentionally falsified to push product.
They intentionally deceive by comparing extremely non-comparable numbers while avoiding the necessary, basic adjustments that would harm their story. Among their many lies: they count pyrotechnic (airbag) deployments for their own numbers, but compare against all crashes. The NHTSA investigation [1] into Tesla ADAS crashes points out that publicly available data, which Tesla has easy access to and which any competent statistician can interpret, indicates pyrotechnic deployment occurs in only ~18% of crashes. That single factor alone shows Tesla has failed to disclose a literal ~5x adjustment, let alone the other adjustments that have no doubt been concealed or ignored to avoid learning inconvenient and unmarketable truths.
Their intentional concealment of 5x errors shows their data analysis is entirely faulty and untrustworthy and has been for years.
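To make the size of that adjustment concrete, here's a minimal back-of-the-envelope sketch. Only the ~18% deployment rate comes from the NHTSA data cited above; the crash count is a made-up number purely for illustration:

    # Back-of-the-envelope sketch of the adjustment implied above.
    # Only the ~18% deployment rate comes from the NHTSA data cited in the comment;
    # the crash count below is a hypothetical number for illustration.
    deployment_rate = 0.18            # share of crashes severe enough to fire airbags

    # If Tesla reports N airbag-deployment crashes, the comparable
    # "all crashes" figure is roughly N / 0.18.
    tesla_reported_deployments = 100  # hypothetical count
    tesla_all_crashes_estimate = tesla_reported_deployments / deployment_rate

    print(round(1 / deployment_rate, 1))      # ~5.6x adjustment factor
    print(round(tesla_all_crashes_estimate))  # ~556 crashes on an all-crashes basis

In other words, comparing deployment-only counts against all-crash counts understates the Tesla figure by a factor of roughly five and a half before any other correction is even considered.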
You try telling Elon Musk not to post stuff. He ended up being forced to buy Twitter and pay an SEC fine because he tweets too much. He's going to post whatever he feels like, no matter what you, or anyone at Tesla, or the SEC says.
I wouldn't be surprised if Elon considers it just. Most people consider it just that the owner of a revolutionary technology is held liable for their actions, even if the consequences arrive nearly a decade later.
Regulations for big corporations are more like suggestions these days, so this may even feel "fair enough" to Elon in retrospect.
Or the tweet was bullshit, and Tesla will do what every company does and determine whether it is better to settle or not based on the financial cost/benefit and not the truth or validity of the claim.
> the defective design of the door latch system entrapping him in the vehicle
This was discussed furiously in the HN articles of the time; the latch apparently has a manual actuator, but how to use it is not obvious at all, especially in an emergency.
I also noticed the subtitle is "Settlement Comes After Automaker in April Struck Confidential Accord". That had to be deliberate.
Maybe ideally they should have mixed the two: a light pull does the electronic thing (rolls windows clear of weather stripping, etc), a hard pull does the manual thing.
I test drove a Lexus TX recently, it has a door assist too, but it's real simple: push the handle and the door assist operates, pull the handle (maybe twice if locked) and the door operates like a normal door. Easy and intuitive.
Apparently it's too hard to open a car door yourself anymore. I guess Tesla popularized pushing a button and then the car electrically pushes open the door a bit; but Lexus has it too now. Electric sliding doors in minivans are pretty old too, but this is for the front doors.
Model Y is like 3, except some of them do not have a manual release for rear passengers, and those that do involve another step of pulling up a mat from the bottom of the door pocket:
I was t-boned in a new Model S (with yoke) and there wasn't any release like the manual said. Dunno if the carpet was just covering it and someone skipped that step, but even the folks at the shop couldn't figure out how to open them. Same goes for releasing the rear seats forward: if you don't have power they won't go down, and of course the emergency escape latch in the trunk is basically impossible to get to unless you're already in the trunk area.
Tesla driver assist is just fine, thanks. Tesla surely followed all relevant software best practices, like MISRA, ISO-26262, etc and is in no way liable for poorly designed software that has been enabling fully self-driving vehicles since 2015 as was promised by the CEO.
It’s a little confusing, but this incident is not about self-driving or software (unless the latching system is software). If anything, it’s probably about the latching system, or how vulnerable the car is to catching fire.
We may never know the truth, but I’m not sure what Tesla is at fault for here or why they would settle. The driver allegedly being at twice the legal limit for alcohol is very bad for the plaintiff.
Did you not read the article? They included info about an old case for background but this was about the Apple engineer who in 2018 was killed when his Tesla drove itself off the edge of a freeway and into a barrier at 71 mph.
I did read this article, which is what I was commenting on. “This incident” . The article was not about the previous case even though it was referenced.
This has nothing to do with driver assist as far as I can tell? It was a drunk driver that had her foot on the gas the whole time and made no attempt to brake.
Can't we sue the people who ran the safety tests and wrote the regulations that let a car like that on the road? Or is the onus on Tesla, and this was a freak accident (manufactured wrong) that they should have caught after design?
They have the best self driving of any company in the world. What other car can you buy and send off to work as a robotaxi on your behalf? That is truly incredible and I never hear anyone talking about it.
You never hear anyone talking about it because it's not yet possible to send a Tesla off to work as a robotaxi on your behalf. It will be incredible if/when they're able to do that, assuming they're able to do it without any major incidents. And people will be talking about it endlessly, and rightfully so, once/if it comes to pass. I sure hope it happens because I'd be stoked to ride in one.
Lol. I know Musk eventually did produce FSD (for America), but the unmanned robotaxi thing just seems to have so many fundamental problems that I can't imagine it ever working (with privately owned cars, as opposed to the Waymo model).
... on certain roads, and with good quality road markings, and in certain weather and seasons, and for some, but not most or all, traffic conditions...
Exactly. If he hadn’t been under the influence then the door latch system wouldn’t have had a defective design
>The suit blamed the “propensity of the vehicle to catch fire, as well as the defective design of the door latch system entrapping him in the vehicle.”
Not parent-poster, but I did a bit more digging since then (the linked article is terribly incomplete) and found out that not only did the driver also die, but the passenger was also tested (posthumously?) as being over the legal BAC limit.
> If he hadn’t been under the influence then the door latch system wouldn’t have had a defective design
So... Poe's Law here. I can't tell if this is a sarcastic comment pointing out that a defective design remains defective even if someone is drunk, or whether this is a serious comment implying it wasn't really defective in normal circumstances.
In any case, the person who was trapped-and-died was the passenger, and we don't know if or how-much they were drunk.
The driver survived... or else they're making a lawsuit from beyond the grave.
Correction since I can't edit: Both driver and passenger died and the plaintiff(s) do not include the driver. That said, I blame this on the linked-article being crap: Despite discussing the passenger's death, it never says any other person in the car died, which normally means they survived.
Instead, the driver died on the scene and the passenger later in a hospital [0]. More digging shows driver and passenger were (probably posthumously) tested at 0.21 and 0.17 BAC respectively. [1]
That wasn't what the case was about. It was about the propensity of the car to catch on fire and faulty door latch design that prevented the passenger who survived the impact from getting out.
> Tesla maintains there was nothing wrong with the car. It said the data event recorder showed that Speckman kept her foot on the accelerator pedal before the crash and never attempted to brake.
The family settling could be seen as support for this claim. If it really did accelerate randomly, that seems more like an NHTSA (or whoever) sort of thing than something that could be settled.
This wasn't an unintended acceleration case or a driver assistance case. The case is entirely about the car catching fire and being impossible to escape.
Nah: If anybody was actually holding Musk "ultimately at fault regardless of anything else", we wouldn't be talking about settling a product-safety lawsuit, but instead about a criminal trial for manslaughter.
Since that's not the case--and I don't think anyone has even suggested it needs to be--we can safely infer that there's already a high degree of nuance and splitting of different levels and layers of responsibility going on.
At the end of the day, "operator error"--even drunken operator error--is not enough to automatically negate all safety flaws.
I wonder how many of these suits they will have to settle (and how many people will die or get injured in the process) just to avoid judgments that could be used against them in their other proceedings for falsely advertising their cars as FSD (and charging extra specifically for it).
Given that Tesla has happily gone to all sorts of lengths, and held press conferences to throw drivers under the bus in previous fatal collisions (misleadingly calling out telemetry: "the vehicle was telling the driver to pay attention" - yeah, once, eighteen minutes before that collision)...
... one has to wonder why they're so keen that this one doesn't go to trial.