Presumably this is why the permit is being suspended:
"The AV detected a collision, bringing the vehicle to a stop; then attempted to pull over to avoid causing further road safety issues, pulling the individual forward approximately 20 feet."
First of all, I find it ironic that the human driver is still at large. I'm sure the Cruise car recorded the plate and they passed this to the police.
Second, the corporate speak on the Cruise release is very good. Notice how it says "pulling the individual forward", in other words, the car dragged the person while it attempted to get out of the way. I guess nobody thought about that edge case. And they released a completely irrelevant simulation showing that if the other car had been an AV this wouldn't have happened. Yet, the real issue with the Cruise AV is that after a collision it just blindly tried to pull over and it dragged the person. The same thing would have happened if the person had simply collided with the car on their own.
> I find it ironic that the human driver is still at large
Ironic but completely expected. We have normalized and accepted the awfulness of human drivers to a degree that people just don't comprehend. Humans would be terrible drivers even if you ignored the fact that many of them are chemically impaired, and even if you ignored the fact that they are prone to rage and aggression.
As we consider the safety of self-driving vehicles, there is going to be a disorienting amount of cognitive dissonance as we are forced to confront the awfulness of what we already accepted and have been living with for a century. There were over 42,000 deaths due to motor vehicle accidents in 2022. That means if we created self-driving cars that were twice as safe as humans, they might save 21,000 lives per year and also kill 21,000 people per year. That sounds insane, but it would be a sane way of improving an insane situation.
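The back-of-the-envelope arithmetic above can be sketched out explicitly. This is purely illustrative: the 42,000 figure comes from the comment itself, while the "twice as safe" factor and the assumption that all driving shifts to AVs are hypotheticals.

```python
# Illustrative back-of-the-envelope math (hypothetical safety factor,
# assumes all driving shifts to AVs and all else stays equal).
HUMAN_DEATHS_PER_YEAR = 42_000  # US motor-vehicle deaths in 2022, per the comment

def deaths_with_avs(safety_factor: float) -> float:
    """Expected annual deaths if AVs were `safety_factor` times safer than humans."""
    return HUMAN_DEATHS_PER_YEAR / safety_factor

remaining = deaths_with_avs(2.0)
saved = HUMAN_DEATHS_PER_YEAR - remaining
print(f"AVs would still kill {remaining:.0f}/yr while saving {saved:.0f}/yr")
# -> AVs would still kill 21000/yr while saving 21000/yr
```

The point of the exercise: even a large relative improvement leaves an absolute death toll that sounds outrageous in isolation, which is exactly why per-incident outrage is a poor measuring stick.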
Kibitzing about these individual incidents is a normal and inevitable human way to try to deal with the problem, but we need some way of measuring how deadly self-driving cars are in comparison to how deadly human drivers are. I never see that, so I wonder how we're going to know when we should loosen the reins on them? Are regulators actually doing their jobs, or are they just going to move inexorably forward while occasionally throwing bones to public outrage?
The stakes here are huge, tens of thousands of deaths and horrific injuries per year, and we will unnecessarily kill lots of people if we deploy self-driving technology too fast or too slowly.
The way this will probably work out in practice is regulators choosing the path of least resistance between uninformed public outrage and greed-driven industry pressure. What we should be asking for is a data-driven approach.
If a human being had done that they would have absolutely had the book thrown at them. There was a case where a woman was drunk driving and hit a man, he came up over the hood and into her windshield. He was alive but bleeding. She drove home and parked her car in the garage. He bled out through the night. NO ONE was sympathetic towards that woman.
But if an automated vehicle does it, it just becomes a line item that someone somewhere has to pay. You watch, someone will start offering insurance for that liability.
What we should be doing is making it an existential crisis for these companies. That will never happen, and thus self-driving vehicles will become merely "good enough" and that's it.
> If a human being had done that they would have absolutely had the book thrown at them.
Really? If they had been in a car accident caused by another driver, and took a few seconds to pull over because they were unaware they were dragging a person initially hit by the other driver, they would have been charged?
It's very hard to hold people responsible, and it's telling that you had to cite such a grisly and outrageous example. Unless you're drunk or it's a hit-and-run, you're unlikely to face any charges beyond a traffic violation.
> You watch, someone will start offering insurance for that liability.
Of course? Exactly like we've mandated for human drivers since the 1970s?
All this hand-wringing about things that we've accepted our whole lives with human drivers.
I'll say it again: we need a data-driven approach to figuring out when autonomous vehicles reach the threshold of becoming less deadly than human drivers. If we jump the gun and allow widespread adoption too soon, people will die unnecessarily. If we drag our feet and allow widespread adoption too late, people will die unnecessarily.
> Really? If they had been in a car accident caused by another driver, and took a few seconds to pull over because they were unaware they were dragging a person initially hit by the other driver, they would have been charged?
Unless you believe this person stayed completely silent during this, the answer is yes. The measuring stick the law tends to use for liability like this is the reasonable-person test, and no reasonable person drives 20 feet with someone screaming in pain after a collision with their car.
Are there circumstances in which someone would NOT be charged? Sure; there have been cases of toddlers running out in front of cars where it was determined the driver couldn't possibly have seen them due to their height.
Since you apparently misunderstood, the point about the insurance is that maybe we shouldn't allow companies to insure themselves against killing people. We as a society absolutely went after Ford for choosing not to do recalls because the estimated cost of the recall exceeded the estimated payout for the deaths and injuries. Under no circumstances should we allow companies to make decisions like this.
Sorry but this just comes off as completely out of touch to me. Yes, I agree that there would be a legal case to charge someone in such a scenario. But what actually happens, at least in NYC, is that the cops and prosecutors will bend over backwards to give a driver the benefit of the doubt.
Cruise showed the DMV a video that cut out just before their car dragged (ran over) the pedestrian. Effectively they were attempting a hit-and-run as well, by not showing footage of their car's behaviour that caused the injuries.
This is just a general problem. Companies very often kill people and get fines. The body count before a company starts to get close to "this might kill the company" is absurdly high.
Consider the Dalkon Shield, which had a hospitalization rate of 5 in 1,000 and a death toll easily in the thousands (maybe tens of thousands). [1] The company went bankrupt but not defunct; it is now part of Pfizer. Nobody served any jail time in association with the deaths (and, after it was banned, they started selling the Dalkon Shield in poorer regions).
A serial killer killing 10 people gets the death penalty, yet there exist mining companies that have easily killed 1000s of employees with silicosis (and are still operating today, looking at you Dow [2]).
Yep, this is why I said it should be an existential crisis. Mistakes happen, but if your mistakes can potentially cause bodily harm, you'd better be damned well prepared to show how it was not negligence in any way, or you should get a fine that scares the shit out of you.
But that's the idealist side of me talking, the realist understands that will never happen.
Yeah, the big issue we have is that we reward ignorance. It's in the legal best interest of companies producing physical products to be as oblivious as possible to issues of safety. Legally, the worst outcome for them happens when there's documented evidence of safety problems; it's far better to feign ignorance. "Who could have predicted that opioids were addictive!" "How could we have known that this doctor was overprescribing? We are a big company without the resources to track everything."
It should really be the opposite in an ideal world. A company that has documented "this is what we did to make sure everything was safe" should be less culpable than the company that moves fast and breaks things/people/the planet.
If a company killed someone, I'd want their defense to be a list of everything they did to prevent that person from dying, not a "well shucks, accidents happened, and this was a tragedy that couldn't be avoided!". That's the response that should end a company's existence.
You can punish the manufacturer of any item that creates an excessive danger to the public.
Lawn darts aren't intended to be thrown near people, and don't exist anymore because people used them negligently. Human-driven cars are used negligently and kill 40,000 Americans a year.
“We” have done nothing of the sort. SF is its own trash fire, and they just gave up on enforcing any traffic violations except parking, which brings in revenue. You can look this up in their own stats for this year.
The issue isn't even that Cruise was involved in an incident. They were not at fault for the collision and were only involved after a (human) driver launched the victim into the Cruise vehicle. Cruise had their permit suspended because they essentially tried to hide their involvement/responsibility and withheld video footage from the DMV.
My theory is that we’re just going to read about a never-ending (but possibly increasingly rare) series of edge cases. And in every one, it’ll be dismissed as “oh that was a really rare edge case” yet humans would have been able to make a better choice in the same circumstances.
> The AV detected a collision, bringing the vehicle to a stop; then attempted to pull over to avoid causing further road safety issues, pulling the individual forward approximately 20 feet.
Jesus Christ! Good riddance, I'm glad their permits are revoked. Can you imagine witnessing this and watching an autonomous vehicle slowly crushing and dragging someone underneath? That 20 ft must have felt like forever to witness. Let alone experience.
Really wish there was a way for tech (especially new AI tech) to wholesale abandon the city, or even just move south to the actual Silicon Valley. The city picking winners and losers like this is untenable.
I don't know how to respond to this. If you screw up there needs to be consequences that bite. Suspending their operations is better than some meaningless fine and actually punishes them. If you can't see that, I can't help you.
Note this is suspending operations indefinitely with no pathway back. No "take these steps and we'll consider reinstatement". Basically a death knell. Moreover, the city had requested the DMV do this to Waymo as well (the state DMV just declined).
So yea, this is the city working to regulate driving, like it regulates housing and small business i.e. to the death while favoring incumbents that are far worse.
Cruise operates fine in Austin and Phoenix. They could just move their operations to South San Francisco or further south and be better off.
The city simply does not deserve the presence of tech, and for tech companies, their presence in the city seems counterproductive.
"The DMV has provided Cruise with the steps needed to apply to reinstate its suspended permits, which the DMV will not approve until the company has fulfilled the requirements to the department’s satisfaction."
Reading this, it rather seems like the AV wasn't at fault in the accident itself and handled itself well. However, after coming to a complete stop, while the pedestrian was still on the ground in front of it, it then started again and drove into the pedestrian and continued for another 20 feet pushing the pedestrian along (and presumably under) the car. They say it did this to leave the car in a safe place, but this differs drastically from a normal driver who'd get out and look round the vehicle before moving it anywhere.
I'd say the initial hit probably did a chunk of damage to the pedestrian, but quite likely not fatal unless their head hit the ground. But then being driven over and dragged 20 feet is going to be pretty scary and in all likelihood cause more serious injuries than the initial impact.
> Reading this, it rather seems like the AV wasn't at fault in the accident itself and handled itself well.
Probably because you're reading their press release. Their previous press release was even more convincing: they somehow forgot to mention the part about dragging the pedestrian at the time.
Here they somehow forget to mention that the car had already accelerated to 18 mph while a pedestrian was still crossing the road.
I get your point, but on the other hand, if a pedestrian was crossing against their signal when the light turned green and was already clear of the lane, I think most human drivers would also proceed. I'm in a different country but, both as a driver and as a pedestrian, I witness this exact situation on a daily basis.
I'm not trying to detract from the horrible outcome of this case, but the problem wasn't the car pulling away at a green light when there were no obstacles in front of it, but the way the car decided to move itself to a "safer place" immediately after an accident when it clearly didn't have enough sensors or the AI wasn't trained sufficiently on this kind of situation to be 100% sure that it was safe for it to make the maneuver.
> but this differs drastically from a normal driver who'd get out and look round the vehicle before moving it anywhere
That is an optimistic view of the likely behavior of your hypothetical driver. Hit and runs are commonplace. Dragging struck pedestrians around isn't all that uncommon either. Panicked motorists frequently make very bad decisions.
“The Nissan Sentra then tragically struck and propelled the pedestrian into the path of the AV. The AV biased rightward before braking aggressively, but still made contact with the pedestrian. The AV detected a collision, bringing the vehicle to a stop; then attempted to pull over to avoid causing further road safety issues, pulling the individual forward approximately 20 feet. The driver of the Nissan Sentra fled the scene after the collision.”
"A dark colored Nissan Sentra was stopped in the adjacent lane to the left of the AV. When the light turned green, the Nissan Sentra and the AV entered the intersection. Against a red light, a pedestrian entered the crosswalk on the opposite side of Market Street across from the vehicles, passed completely through the AV’s lane of travel, then stopped mid-crosswalk in front of the Nissan Sentra. The Nissan Sentra then tragically struck and propelled the pedestrian into the path of the AV."
So there was this woman crossing a crosswalk in front of these two cars when she shouldn't have. I guess I don't understand why neither car stopped in front of the crosswalk when they saw this woman in the middle of the crosswalk (illegally).
IMO it was a bit aggressive for Cruise to have seen this woman jaywalking and still crossed the crosswalk after detecting that the woman had just passed the car but still in the middle of the crosswalk.
Incidentally, "jaywalking" will be legal in California in 2024, finally. Hopefully that will be the beginning of the end for those laws across the rest of the country.
(Of course, we're talking about crossing "when safe", not when you'd be walking into the path of a vehicle with the right of way.)
That's like the only time in Texas a pedestrian doesn't have the right of way in a marked or de-facto crosswalk; if there is a cross signal and it's red.
Have you driven in San Francisco or basically any large US city? If cars had to stop like that to give extra space for a jaywalker that had exited the path of travel, someone would rear end the AV, or at the very least the driving experience would be very jarring. If we wanted to legitimize this case then probably large areas of downtown in cities should be redefined to be free-for-all traffic with low speed limits.
I don't know if this was a homeless woman or not, but there have been plenty of times in SF when I've seen some mentally disturbed homeless person in the middle of the street.
In those situations I would 100% stop completely in front of them and/or drive very slowly around them. The difference between a regular jaywalker and someone who is mentally disturbed is the mentally disturbed jaywalker is unpredictable and you have no idea what they'll do next.
The right decision in this case was for both cars to at least slow down if not stop in front of the crosswalk. The Cruise car driving normally just because the walker happened to just pass to its left without slowing down doesn't seem like the best decision.
Slow down to a crawl, definitely, but if you stopped until they got off the street completely, you might never move again in some areas of SF (the accident happened near Market Street).
And it's not a given that doing so would make the situation safer, as other drivers would eventually drive around your car.
Normal drivers stop or proceed slowly through intersections when pedestrians are in the road regardless of whether the person is supposed to be there. With great power comes great responsibility.
Anyone who doesn’t slow down in situations like that is waiting to become the Nissan driver: a person a-okay with committing manslaughter.
And two wrongs don’t make a right. Just because some human drivers are negligent and horrible doesn’t mean we have to accept robot varieties that are also bad.
> IMO it was a bit aggressive for Cruise to have seen this woman jaywalking and still crossed the crosswalk after detecting that the woman had just passed the car but still in the middle of the crosswalk.
Wat? (Most) pedestrians aren't squirrels that hang out in the road running back and forth. You've put forth an unreasonable standard that basically nobody uses. Except for maybe those "special" types of drivers that will slam on their brakes and wave at you when your intended path was clearly behind their car after it had passed, as if you're supposed to be thankful for their putting themselves in your way to indulge in a simulation of altruism.
A good human driver may have slowed down from perceiving the emerging exceptional situation of the oncoming car not reacting to the pedestrian, but I'd say that's a much different framing than what you asserted.
That wasn't what I was referring to. This is in reference to a jaywalker. My point is both cars should have stopped when they saw a jaywalker crossing the street.
It doesn't sound like the Cruise car stopped because it basically saw that the woman had just barely crossed to its left out of its path of travel.
A more cautious driver would have slowed down at the very least or stopped completely.
This sort of nonsense where you take a serious statement and tell me the cutesy "name" of the autonomous vehicle really detracts from the seriousness of the issue. Also, the last paragraph, where they try to talk about how their vehicle wouldn't have made the same mistake as the human driver, glosses over the actual issues the Cruise car had, which a human driver would have known to avoid.
I had the same thought about the name. They also didn't refer to the name again in the rest of the article, so it wasn't even setting up a convenient name to refer to the vehicle and distinguish it from other AVs; it was entirely superfluous.
Ya, and since a Panini is a pressed sandwich, you'd think the PR people would immediately remove that since it's comparable to what the car did to the woman's leg.
"Cruise gives the cars cutesy names" is typical of fast turnaround journalism. The reporter doesn't have the time or resources to investigate the details to determine if this is indicative of a casual culture at Cruise. Instead, they can drop the easily verified vehicle name and let the readers fill-in-the-gaps. It doesn't exactly line up with CA DMV suspending the AV permit, but it adds so-called depth to the story.
I wouldn't expect this offhanded fact in long-form, slower journalism. But I expect this from Reuters.
In their original press release they didn't even mention dragging the pedestrian. Here they've only mentioned it because they were caught red-handed by the DMV.
From my limited understanding, web browsers were improved drastically by the creation of cross-vendor testing (e.g. Acid tests). I wonder if there is a way to do some sort of AV simulator testing that would allow us to test the different manufacturers on data collected from the others.
AVs seem to all have different "mental models" of the external world, so it seems like we've got a lot of work before we can pipe data collected from this Cruise incident into a simulator and test Waymo, Uber, Tesla, etc.
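One way to picture what cross-vendor scenario testing could look like, despite those differing "mental models": a shared scenario log replayed against each vendor's planner, with a common safety assertion checked at every frame. Everything below is a hypothetical sketch — the frame format, the two stand-in policies, and the `replay` harness are all invented for illustration and resemble no vendor's actual system.

```python
# Toy sketch of cross-vendor scenario replay (all names hypothetical).
from dataclasses import dataclass

@dataclass
class Frame:
    t: float                 # seconds since scenario start
    ego_speed: float         # m/s
    pedestrian_in_path: bool # is someone in front of / under the vehicle?

def naive_pullover_policy(frame: Frame) -> str:
    """Stand-in policy that tries to pull over after any stop --
    roughly the failure mode discussed in this thread."""
    return "pull_over" if frame.ego_speed > 0 else "stop"

def cautious_policy(frame: Frame) -> str:
    """Stand-in policy that refuses to move while a pedestrian is in its path."""
    return "stop" if frame.pedestrian_in_path else "proceed"

def replay(scenario: list[Frame], policy) -> bool:
    """True iff the policy never commands motion while a pedestrian is in its path."""
    for frame in scenario:
        if frame.pedestrian_in_path and policy(frame) != "stop":
            return False
    return True

# A three-frame log loosely modeled on the incident: driving, collision stop,
# then moving again with the pedestrian still in the vehicle's path.
scenario = [Frame(0.0, 8.0, False), Frame(1.0, 0.0, True), Frame(2.0, 2.0, True)]
print(replay(scenario, cautious_policy))        # True
print(replay(scenario, naive_pullover_policy))  # False
```

The Acid-test analogy would be agreeing on the `Frame`-level scenario format and the safety assertions, so that one vendor's recorded incident becomes every vendor's regression test.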
https://getcruise.com/news/blog/2023/a-detailed-review-of-th...