I am a massive fan of Mark Rober. Unfortunately he completely f**d up this one.
He tested using Autopilot, not the latest FSD on HW4, which is worlds apart in capabilities.
It is possible that the latest FSD would also crash, but at least that would be a valid test of FSD capabilities.
Testing using Autopilot and calling it "FSD crashes" is a HUGE misrepresentation of facts.
I am hoping Mark will post an update to the video.
If anything is a misrepresentation, it's both the name "Autopilot" and the name "Full Self Driving", for two things that are neither autopiloting nor fully self driving.
I think I already had this discussion on HN. I am not sure why most people think autopilot means hands off the wheel without looking. There literally is no autopilot which does not require constant attention from a human. This is true for planes, ships, and cars. Hell, this is true even for most drones.
FSD is very much correctly named. It does not say you can go to sleep. It just means the vehicle is capable of full self driving, which is true in most conditions and something most cars are not capable of. How would you have named it?
> I am not sure why most people think autopilot is hands of the wheel without looking.
Well, because they named it autopilot.
Autopilot means it pilots... automatically. Automatic pilot. Not Manual Pilot, not with intervention, automatically.
They could have named it "driver assist", "partial-pilot", "advanced cruise control", "automatic lanekeeping" or anything else, but they named it Autopilot. That's Tesla's fault.
Why is it named like this on ships or planes, then? You most certainly need to intervene when other vehicles are approaching. Just because that happens less often in those environments doesn't mean the name autopilot doesn't fit.
If you go back in history, Tesla was one of the first manufacturers to provide something more than just a lane assist. It is fair, in my opinion, to search for a different name that distinguishes you from the competition.
For the same reason ships have starboard and port, and cars have driver's side and passenger's side. The terminology is different, the surrounding culture is different, and ignoring that will result in tragedy.
That's not how words work. They chose that word because it's evocative of exactly what most people think while being just vague enough to give them regulatory deniability.
A few years ago, I sat in an Italian airplane with the cockpit door open. The pilot and copilot spent most of the trip speaking to each other, using their hands to emphasize their words (if you've ever seen two Italians arguing passionately, you'll see what I mean).
As far as I could tell, they only had their hands on the instruments during take off and landing.
Commercial pilots have that point hammered into their heads. Tesla did everything in their power to convince buyers that "driver is there only for legal purposes" and they can chill instead of driving.
It's not unreasonable for people to think that "autopilot" means something that automatically pilots a vehicle. According to a dictionary, automatic means "having the capability of starting, operating, moving, etc., independently". Whether or not that's how actual autopilots in airplanes and ships work is irrelevant; most people aren't pilots or captains. Tesla knew what they were doing when they chose "autopilot" instead of "lane assist" or similar, like their competitors did. It sounds more advanced that way, in line with how their CEO has been promising full autonomous driving "next year" for a decade now.
It's also worth noting that the recent ship collision in the North Sea is thought to have happened because the autopilot was left on without proper human oversight, so even trained professionals in high-stakes environments make that mistake.
Automatic means something different. Here is the definition of what an autopilot is:
The autopilot controls the watercraft or spacecraft without the need for constant manual control by a human. Autopilots do not replace the human operator, rather the autopilot assists the operator in controlling the vehicle or aircraft, allowing the operator to focus on broader aspects of the operation. (Wikipedia)
I do not like Tesla but to this day I don't get the outrage about their autopilot capabilities. It 100% fits the definition of what an autopilot does.
Like I mentioned in the first post, I don't think the definition of what autopilot means in terms of ships or airplanes is relevant. Tesla chose the name because it makes people think the auto part refers to automatic, and then assume the car can drive automatically.
Combine this with how they released a video[0] with the description "The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself." at the same time. This video was about their FSD and not autopilot, but I think it's reasonable to assume most people won't know the difference. It's deceptive on purpose.
The _only_ relevant question is what the average person understands “autopilot” to mean in the context of a car. This isn't a question that can be declared answered by pulling out a dictionary.
Mercedes has gained approval to test Level 4 autonomous driving. Level 4 is considered fully autonomous driving, although the vehicle retains a traditional cockpit, and the driver can request control at any time. If a Level 4 system fails or cannot proceed, it is required to pull the vehicle over and bring it to a complete stop under its own control.
I would argue that it is getting very close to what people think autopilot can do. A car that, under certain circumstances, can drive for you and doesn't kill you if you don't pay constant attention.
The one which needs a leading vehicle to follow below 40 mph on some stretches of freeway? Try looking up videos of it from owners who are not press or Mercedes reps.
> Raising the bar in autonomous driving technology, Mercedes-Benz is the first automobile manufacturer in the US to achieve a Level 3 certification based on a 0-5 scale from the Society of Automotive Engineers (SAE). Under specific conditions, our technology allows drivers to take their hands off the steering wheel, eyes off the road — and take in their surroundings
(Specific conditions being the lead car and speed limits you noted, but that’s not what the person you’re replying to is talking about)
In addition to their Level 3 system, they've been granted permission to test Level 4 systems, not for public use/availability, on a prototype vehicle:
Not sure about other countries, but in Germany L4 is only active on freeways up to certain speeds. The video you linked is not showing this, at least in the parts I skipped through.
This isn't true. The autopilot in a plane does nothing but keep its course. In this regard it does less than cruise control. On some planes it will not even disengage if you forget it is turned on. TCAS will warn you ahead of time that something needs your attention, but this isn't failsafe. You cannot (or really should not) do something else, like reading a book, while the autopilot is engaged.
There is a study which links autopilot to 19 crashes and 175 minor incidents (which have been reported) since 1983. If you take those numbers and put them into perspective against how often planes crash in general, it is not minor. Same goes for ships.
The checkride required to get an IFR rating quite literally requires you to “read a book” (brief the approach plate to the flight inspector) while the autopilot conducts the initial parts of a precision approach.
A 20 year old avionics suite (GTN 450) does much more than cruise control - you input a flight plan including an approach, it will fly the flight plan, capture the approach signals (VOR/localiser/whatever - which is far more complex than “keeping course”) all the way down to approach minimums.
The report you linked is specifically about general aviation - not about commercial (part 121) aviation. GA lags behind by decades (still uses carburettored engines burning leaded fuels).
In any case, it’s talking about error cases for autopilots, not the operational domain for when they should/shouldn’t be used. The key take away was improved training for identifying autopilot malfunctions and not reduce/eliminate the usage of autopilots in certain scenarios.
> You can not (irl should not) do something else
I’m curious, what autopilots have you used and what capabilities did they have?
Why? It is capable of full self driving. That has been proven numerous times. It is, however, not capable of doing this all the time. I ask again: give a better name that describes the function FSD currently offers. It is most certainly not a lane assist.
The name does not matter. By that logic, it would have to say L5 in the terms.
Joking aside... 95% of drives have one or zero manual disengagements. The disengagement does not have to be critical; I would always disengage to park because I am faster. Is this not enough to call something FSD?
If I'm on a road trip with somebody, and they're driving, I expect to be able to sleep in the passenger seat. “Full” implies that level of competence, which includes knowing its own limitations and therefore when to pull over.
I can think of half a dozen terms off the top of my head that would better describe what its doing, but they aren't quite as punchy, and critically, they won't give the impression that the car is capable of functioning without oversight. Being able to obliquely claim-without-claiming is very much the point of the current terminology.
* trip control
* trip assist
* co-pilot
* auto steer and throttle
* partial self-driving (i.e., what it's actually doing!)
* hands-free steering
Why does "full" mean to you that you never have to interact with the car? That would mean L5 driving. They never claimed that. Even the Mercedes system, which is L4, only works under certain conditions.
All the names you have given suggest that it is not capable of managing a complete trip, but it is.
If you do not need to do anything for 95% (often 100%) of the trip, which is true for most FSD drives, does this not qualify as FSD for you? Then we can end this discussion and just agree we have different opinions.
Correct. And indeed, L5 is called “Level 5: Full driving automation”. The only reason the word “full” is even in there is because of previous abuses of the notion of “self-driving” cars that couldn't. “Full-no-we-really-mean-it-this-time self-driving”
It most certainly is. Is it reliable? No, not at all. However, there are many YouTube channels showing certain routes where this car is capable of driving itself from A to B without the need for manual intervention, including parking.
“Reliable” is implicit in “fully self-driving”, otherwise I could call my car self-driving when it manages to stay in its lane after I let go of the steering wheel.
I just checked the current numbers on FSD Tracker. It is at 95% now for the last 30 days. That's a number I would call reliable under supervision. It is way higher than it used to be. It was at 73% when I checked it the last time.
Again. Present a name, which fits and describes the capabilities of FSD. It can under certain circumstances drive a route completely on its own. This is not a lane assist, not (advanced) Cruise Control, not HDA and not Super Cruise.
I think you're ignoring that most car drivers are not versed in the tools and jargon of the air and shipping industries. It really isn't relevant what "autopilot" means in various professional contexts, what matters is what somebody with a high school education (or less) thinks that "autopilot" means.
This is an interesting point. Maybe the problem is that most people don't drive boats or planes, so they are not familiar with the experience in those contexts. I think you're right: from the boat standpoint, "autopilot" means you set a heading and the sails/rudder are adjusted to hold it.
"Full self driving unless abnormal conditions come up in which case the driver is expected to be instantly alert and capable of doing better than FSD"?
Here's what the official Tesla website has to say about FSD vs. Autopilot:
> In addition to the functionality and features of Autopilot and Enhanced Autopilot, Full Self-Driving capability also includes:
> > Traffic and Stop Sign Control (Beta): Identifies stop signs and traffic lights and automatically slows your vehicle to a stop on approach, with your active supervision.
> > Upcoming: Autosteer on city streets
Since I don't see a stop sign, or a traffic light, I cannot imagine how that makes any difference or can in any way be considered a complete f*k up, or how that's a "HUGE misrepresentation of facts". These things, listed here copied verbatim from the website of the manufacturer, are completely irrelevant to what was being tested here. It's like arguing that a crash test is invalid because the crash test dummy had a red shirt instead of a yellow one.
> > Active safety features come standard on all Tesla vehicles made after September 2014 for elevated protection at all times. These features are made possible by our Autopilot hardware and software system [...]
No mention of FSD anywhere in that section. Tesla fanboys, pack it in.
The good old argument about the "latest" version on the "latest" hardware... As Tesla influencers have been saying for the past 5 years 'it will blow your mind'.
In the very first phrase he says "I'm in my Tesla on Autopilot"...
The video title says "self driving car". This is clearly dishonest when they intentionally did not test the feature named "self driving" in the car shown in the video thumbnail, nor disclose the fact that such a feature does exist and is significantly better than what they tested.
This video is a real missed opportunity. I would love to see how FSD would handle this and I hope someone else takes the opportunity to test it. In fact, testing FSD is such a trivially obvious idea that the fact that it's not even mentioned in the video makes me suspicious that they did test it and the results didn't match the narrative they wanted.
To be clear, this is like buying a car that has traffic aware cruise control available as an option, but turning it down, then insisting the TACC is broken and dangerous because it doesn’t work on your car.
The title of the video says “Self Driving Car”, which can and does mislead viewers into thinking it's a test of Tesla's “supervised Full Self Driving” product, since they do not sell other products that use that term. FSD at one point had a significantly more advanced video-to-voxel model as part of the perception stack that possibly could have detected this wall (though I believe their planner is now end to end and only gets video input, so I'd be really interested to see if it fails here)
Why would FSD be any more capable? Like why would anyone expect that? I get how this could happen, but this isn't advanced navigation it's basic collision avoidance of things in front of the car, something I'd expect autopilot to do at a bare minimum.
Like I said, I get how this could happen. But it is wild for a company claiming they've "almost" solved FSD to fail at basic collision avoidance - a core, fundamental capability of FSD - and not to have deployed that capability in the hardware of a car they claim is completely capable of FSD.
No, collision avoidance has to put a greater degree of trust in the actual driver of the vehicle, the human, and only step in if it’s absolutely certain the human is about to hit something they did not intend to hit.
For example, when Tesla detects that you are about to accelerate into a human or a wall, they slow down the acceleration so that you have time to react, but they don’t stop it altogether.
That’s very different from FSD’s decision-making process.
For a while, FSD was a totally different system from Autopilot. FSD was the giant new neural net training approach from end to end for all kinds of driving, while Autopilot was the massive explicitly tuned C++ code base for highway driving from the early days. The goal was to merge them, but I haven't followed it closely lately, so I don't know if they ever did.
You see how this is worse, right? If automatic collision avoidance doesn’t work, why would you expect FSD to do better? (Or more to the point - why would a prospective buyer think so?)
And if collision avoidance doesn’t work better, then why isn’t FSD enabled on all cars — in order to fulfill this basic safety function that should work on all Teslas. Either way you look at it, this isn’t good. Expecting owners to buy an $8K upgrade to get automatic stopping to work is a bit much. Step one - get automatic stopping to work, then we can talk about me spending more money for FSD.
(And yes, I’m still bitter that my radar sensor was disabled).
Why don't you actually "trivially Google it" and see what comes up? Spoiler: it's not Tesla.
Normal, non-luxury cars from virtually every other manufacturer come STANDARD with radar. It's hooked into AEB and cruise control. Even new CIVICS come with radar AEB.
A radar-based system generally offers instantaneous speed and depth information. The best Teslas can do with cameras is guess at distance and then take finite differences to estimate speed - i.e. they make an estimate from an estimate.
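To make the "estimate from an estimate" point concrete, here is a minimal Python sketch (all numbers are illustrative assumptions, not real sensor specs): radar reads closing speed directly from the Doppler shift, while a camera must difference two noisy depth estimates, which divides the depth error by the frame interval and amplifies it.

```python
import random

random.seed(0)

DT = 0.1            # assumed camera frame interval, seconds
TRUE_DIST = 50.0    # metres to the obstacle
TRUE_SPEED = -20.0  # true closing speed, m/s (negative = approaching)
SIGMA = 0.5         # assumed 1-sigma error of a monocular depth estimate, m

# Radar: the Doppler shift yields relative speed in a single measurement.
radar_speed = TRUE_SPEED

# Camera: estimate distance in two consecutive frames, then difference.
d0 = TRUE_DIST + random.gauss(0, SIGMA)
d1 = (TRUE_DIST + TRUE_SPEED * DT) + random.gauss(0, SIGMA)
camera_speed = (d1 - d0) / DT

# The depth errors get divided by DT, so a 0.5 m error per frame can
# contribute up to ~0.5 / 0.1 = 5 m/s of speed error per estimate.
print(f"radar speed estimate:  {radar_speed:+.1f} m/s")
print(f"camera speed estimate: {camera_speed:+.1f} m/s")
```

The design point is just the division by `DT`: the smaller the frame interval, the more any per-frame depth noise blows up when differentiated, which is why differencing estimated depths is inherently noisier than a direct Doppler measurement.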
The entire rest of the self-driving industry has known this forever - it's why Waymo, Cruise, Zoox, et al. (who have actually tackled the long-tail cases of autonomy) not only use radar/LIDAR, but slap multiple of them on their vehicles.
Even a human being cannot reliably brake based solely on visual information.
So your "trivially Googleable" proof is... a tweet from four years ago that doesn't actually say what you think it does and a three minute video from a no-name account with 37 subscribers showing test results in broad daylight only?
If you had trivially Googled it, you'd know that "superior" is a score on the AEB test and does not mean "the best." How can you call me clueless when you clearly don't even understand the measuring system used for AEB testing?
My best guess at the answer is that you own some TSLA and are freaking out at the drop.
He disabled Autopilot right before the crash. Before the crash you can see Autopilot is on with the rainbow road visualization, then it cuts to his reaction, then at the crash it's disabled - https://youtu.be/IQJL3htsDyQ?t=942
Caveat: Autopilot is known to disable itself right before a crash (if it senses one), though. I doubt this is the case, because he would obviously have mentioned it.
He did not. Please stop making proclamations about things you think but do not know. Mark has also released raw footage to negate this line of attack.
He's testing emergency braking.
It's not some pay-to-receive service surely? It should be always on, regardless of Autopilot or FSD.
It comes as a default feature on many modern vehicles.
Are you trying to say that Tesla's FSD emergency braking and Autopilot emergency braking are different?
It's emergency braking. None of that should matter.
And if it does matter, we are dealing with the possibility that Tesla is selling a deliberately worse product at a lower price point, in exchange for risking the lives of drivers and passengers.
There really is no meaningful difference here, because the result SHOULD be the same, regardless of what feature was enabled or disabled.
He didn’t completely f up, it was a valid test, and a misrepresentation of branding, not facts. The connotation of the name autopilot overlaps significantly with FSD. Nobody forced Tesla to name it that rather than enhanced cruise control or something.
Testing a wider release rather than a more exclusive one meh.
Granted it's muuuuch easier in planes because 1) they're already being routed to avoid each other by humans when the route is setup, 2) they continually announce their position with a high degree of accuracy, 3) generally speaking only two planes will be that close to each other at a time in the air so deconflicting doesn't cause huge ripple effects and 4) the sky is really big but also gives them a 3d dimension to play with where streets are 2 dimensional.
I was a little bit imprecise. Because the article was talking about a stationary obstacle, literally a wall, I was still talking about stationary objects when I said that airplanes don't avoid obstacles. The autopilot of an airplane will not avoid any kind of stationary obstacle, not even a mountain. It can certainly be done; several militaries use terrain-following guidance in both airplanes and missiles. But there's nothing about the word "autopilot" that should cause people to think that the car will avoid stationary obstacles. An autopilot follows a route and that's it.
Some aircraft can avoid other aircraft under some circumstances but those aren’t stationary obstacles and unlike random walls that people build, or mountains, they are all tagged with transponders.
Except terrain-following radar also exists for military aircraft. Civilian planes don't have it because it's not something they want or need to do, not because the capability doesn't exist.
If the plane is anywhere near terrain on autopilot things have gone horribly wrong and a person should be flying anyways.
> The autopilot of an airplane will not avoid any kind of stationary obstacle, not even a mountain
Well a core part of the flight is colliding with the ground at the end.
But also the hazard of a stationary object doesn’t exist - crew receive an IFR clearance and fly that route; that route won’t have stationary obstacles.
Aviation’s autopilots aren’t “eyes-off” in VFR so that’s a meaningless comparison.
Sure, there’s a good argument that we should limit self-driving cars to predictable environments. For example, we could dig a bunch of tunnels under our cities and have self-driving cars in them. It’s a controlled environment. If the tunnels connect to buildings underground, then the cars would not need to be used on surface roads.
All the other drive assistance systems out there with proximity radar would have detected the wall. None of them claim a grandiose name like 'autopilot'.
>All the other drive assistance systems out there with proximity radar would have detected the wall
All? Of the 30 cars tested, 7 got "Poor" grades for IIHS's "front crash prevention rating". I spot checked a few of the failing cars and they all supposedly use radar.
> The poor-rated vehicles also struggled in the tests with the passenger car target. Most failed to slow enough in the 37 mph test with the target centered to qualify for additional AEB testing. However, in most trials with the passenger car and semitrailer, they delivered timely forward collision alerts.
Detecting a motorcycle is significantly harder than detecting a semitrailer, which is what this kind of wall would look like to a radar.
While walls like this are a contrived example, I distinctly remember a case from a few years ago where Autopilot failed to detect a semitrailer, drove under it at full speed, and decapitated the driver.
You are mistaken. Automotive radar does not detect stationary walls because it cannot tell the difference between a metal sign near/above the roadway and a wall blocking the roadway. Automotive radars are useful only for moving obstacles.
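The filtering logic this comment describes can be sketched in a few lines of Python. This is a hypothetical, deliberately simplified model (the function name, ego speed, and tolerance are made up): to a forward radar, every ground-stationary object closes at exactly the ego vehicle's speed, so a filter that discards returns with that Doppler signature drops overhead signs and road-blocking walls alike.

```python
EGO_SPEED = 30.0  # assumed ego vehicle speed, m/s

def is_tracked(return_doppler: float, tolerance: float = 1.0) -> bool:
    """Keep only returns that move relative to the ground.

    A ground-stationary object approaches at exactly -EGO_SPEED, so its
    Doppler reading is indistinguishable from an overhead sign, a bridge,
    or roadside clutter. Such returns are discarded.
    """
    ground_stationary = -EGO_SPEED
    return abs(return_doppler - ground_stationary) > tolerance

wall = -EGO_SPEED            # stationary wall across the roadway
overhead_sign = -EGO_SPEED   # stationary sign above the lane
slowing_car = -10.0          # lead car, closing at 10 m/s

print(is_tracked(wall))           # False: filtered out, same as the sign
print(is_tracked(overhead_sign))  # False
print(is_tracked(slowing_car))    # True: tracked as a moving obstacle
```

Real automotive radars use far more elaborate tracking, but the core dilemma is as above: without elevation resolution, the only cheap way to suppress false braking on every overpass and gantry is a filter that also hides a genuinely blocking stationary object.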
Still, watching the video, people will take away that Tesla FSD is not safe (whereas only Autopilot is relatively unsafe). Nothing could be further from the truth, considering I have been using some form of FSD/Autopilot since it started.
One very interesting thing happens when you drive with FSD. Because the FSD in-cabin camera is watching you extremely closely, it is literally like having two drivers driving.
I insist that my wife and kids drive with FSD, because then I know that FSD is forcing them to watch the road. Think about that! Use the latest FSD on HW4 for a while (I do 90% of my drives with FSD) and make an informed decision. IMO, not using FSD is killing/hurting people.