
My Tesla will allow me to engage autopilot in a school zone, obeying the speed-limit adjustment I’d set - but using the non-school-hours speed limit rather than the correct school-zone limit. It would allow me to go 30 mph over the school zone speed.

How can Tesla claim self driving if the car can’t read a sign that says - speed limit 25 mph during school hours, and properly adjust? Humans just look around to determine if school is likely in session by the number of cars in the parking lot during normal school hours, or they know the school calendar.

How does a self driving car make that determination? Query the school district website for the school, identifying their bell schedule and tacking on a buffer ahead and behind? Assume a school schedule that’s M-F? What if it’s a religious school that operates Sun-Thursday? Now the car has to determine which religious sects obey which calendar? Is it different in each country?

Just another example of a massive hurdle self driving cars have……

And another recall that should be issued.



> My Tesla will allow me to engage autopilot in a school zone

https://twitter.com/TaylorOgan/status/1478802681141645322

Just a reminder of how awful Tesla's "self driving" cars are at actually stopping.

Please do NOT rely upon autopilot, fsd, or any other "Tesla-tech". They're incapable of seeing objects like small children in time.

This was a test done at CES on January 6th, 2022.

-------

In contrast, the people who set up the demo were showing a more advanced self-driving car that could actually stop when the child suddenly ran out into the street.

https://twitter.com/luminartech/status/1479223353730756608


When I learned to drive, the only advice my mom gave was: “If you see a ball then slam on the brakes.”

I was told this for months.

A few years later I was driving down a road that had endless cars parked along the sides, so there was no visibility into the yards.

A small ball came from behind a parked car and bounced in front of me. All I heard was my mom's voice. I instantly slammed on the brakes.

Sure enough a kid ran out in front of me chasing the ball.

Car stopped inches from kid. He never even noticed me. Even a moment's hesitation and that kid would have been dead.

I was going a little below the speed limit too, which clearly helped.


This is great, but I think it's always good to be aware of the consequences of stopping quickly (being rear-ended, which might cause its own problems of the same magnitude).


A good point. But the person behind has a metal cage and air bags.

Kid has a t-shirt.


The impact of being rear-ended would probably propel the car forward into the child, which is what I think the previous commenter meant.


You're saying it's better to just hit a child with your own momentum than hit a child with the momentum of the tailgater behind you?


That, and/or possibly killing several other people. I'm not advocating that people run over the kid at all, I just mean: always be aware of people behind you when braking quickly.


I don't know about other people but as a driver I try to be aware of what's behind me at all times. Where I live (Sydney Australia) it's extremely rare to be tailgated on a suburban street — but when I am, I'll drive to the conditions. That means in an area where road incursions are probable (e.g. where there are pedestrians or playgrounds) I will drop my speed below the limit. If someone is being persistent or aggressive, I'll pull over and let them pass.


The drivers behind you are responsible for maintaining a safe following distance so that you can slam on the brakes when a kid runs into the street. It’s not your job to worry about them. Eyes on the road in front of you.


You should worry about what's in front of your car first and what is behind it after. There's a reason we ~always assign blame to the car that does the rear ending.

Don't tailgate, people. You never know when the car in front of you is going to slam on the brakes because a ball, small child, or plastic bag jumps out in front of it.


Avoiding a rear-end collision is almost always the responsibility of the following car, which must maintain a safe distance.


I totally agree with you that this tech needs to get better, but I really want to see an apples-to-apples comparison. I would expect Tesla to also stop if a child was running across the movement path in broad daylight.

The night example looks to be specifically stacked against autopilot. Tesla Vision is notoriously bad at detecting stationary objects and it needs a lot of light to function well. Lidar/radar are significantly better than cameras at detecting straight-ahead obstacles in low-light conditions. I would really like to hear Tesla defend their decision not to use them.

In any case, this testing is great because it lets us know when the autopilot requires extra supervision.


> but I really want to see an apples-to-apples comparison.

EDIT: Luminar's car is in the other lane, and there's also a balloon-child in Luminar's lane. You can see Luminar's car clearly stop in the head-to-head test.

There's also the "advanced" test, where the kid moves out from behind an obstacle here. Luminar's tech does well:

https://twitter.com/PatrickMoorhead/status/14787645152609116...

> I would expect Tesla to also stop if a child was running across the movement path in broad daylight.

Nope.

https://jalopnik.com/this-clip-of-a-tesla-model-3-failing-an...

https://www.latimes.com/business/story/2019-09-03/tesla-was-...

This "tech" can't even see a firetruck in broad daylight. Why do you think it can see a child?

This isn't a one-off freak accident either. "crashing into stopped emergency vehicles with flashing lights in broad daylight" is common enough that NHTSA has opened up an investigation into this rather specific effect: https://static.nhtsa.gov/odi/inv/2021/INOA-PE21020-1893.PDF


The camera alone seems to see a lot:

- 2018: https://youtu.be/_1MHGUC_BzQ

- 2021: https://youtu.be/XfqabC_akV0

Is the car reacting to what it's seeing? Probably not, but I'm not sure if adding a lidar fixes that.


> The camera alone seems to see a lot:

In perfect conditions, on a sunny day.

I'm in Sweden, and the sun shining directly into your eyes from barely above the horizon, while the road is wet or covered with snow and reflects that sun at you, is a regular occurrence during the winter months. I doubt Tesla's camera will be able to see anything.


This is the reason why a single camera alone is not capable of being the sole source of information for a self-driving system. The technology currently available for camera systems does not capture a high enough dynamic range to be able to see details in darkness when the Sun is in frame. You could use multiple cameras all with different sensitivities to light and combine them, but it's going to be very difficult.


I really don't see what's difficult. You don't even need multiple cameras, you can simply use very short exposures and combine short exposure shots into a longer exposure one when needed. Multiple cameras are useful to handle glare though.
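(For the curious, here's a rough sketch of that idea in Python. It's only an illustration of merging aligned short exposures, not anything Tesla actually does, and it assumes the frames are already registered, which a moving car makes much harder.)

    import numpy as np

    def combine_short_exposures(frames):
        """Merge several aligned short exposures of the same scene into one
        cleaner frame - a toy version of 'temporal HDR'.

        frames: list of float32 arrays in linear light, values in [0, 1],
                all captured with the same short exposure time.
        """
        stack = np.stack(frames, axis=0)
        # Averaging N aligned short exposures cuts shot noise by roughly sqrt(N),
        # recovering usable detail in dark regions, while the short exposure time
        # keeps bright regions (e.g. a low sun) from clipping in any single frame.
        return stack.mean(axis=0)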


I think you two just proved my point


Why would it be very difficult? You can split the same light beam after the lens, and send it to two cameras with different diaphragm or sensitivity. You'd then synthesize a perfectly aligned HDR picture.


I think you two just proved my point


It's because Tesla cars are regularly causing "phantom braking" events.

Tesla is trapped between a rock and a hard place. Their "phantom braking" events are causing a lot of dismay to their drivers (https://electrek.co/2021/11/15/tesla-serious-phantom-braking...). But if they reduce phantom-braking, they increase the chance of hitting that child on the road.


Elon claims that the radar was the primary source of phantom braking. He said that matching up a high fidelity sensor (the cameras) with a lower fidelity sensor (the radar) was proving near impossible. I also suspect the supply chain pains massively factored into his decision to remove the radar from all vehicles since roughly late January of last year.


Anyone in the car industry would know this is obviously false, right? Radar-based emergency braking is available and works really well in many cars from 5+ years ago.


I’m not debating the validity. This is literally what Elon said.


Radar was removed in May 2021, which predates the article I quoted by multiple months.

I'm sure Elon was blaming Radar for phantom braking in the April / May time period. We can give a few months for the cars to update to the newest version as well.

But by November 2021, RADAR was no longer a relevant excuse. I think you may be mistaken about when Elon said what and when. You gotta keep the dates in mind.


Respectfully, you’re incorrect on the date of the Tesla vision only hardware release. My wife got a model y in early Feb 2021 and it was in the first batch of Tesla vision vehicles that did not ship with a radar. It was manufactured in January, as that’s when we got the VIN. This is first hand experience, not hearsay. Elon announced it after they’d been shipping those vehicles for a bit. I was both amused and surprised. She was pissed off that Autopilot was nerfed compared to my 2018 model 3 for max speed as they were working out bugs in the Tesla Vision branch of the code.

I also never said a date about when Elon said those things in my comment, but now understand what you mean about post-vision. But the FSD Beta and Autopilot codebases are so different I am not sure I’d compare them for phantom braking (though recent FSD Beta appears to have way less of this occurrence).

But maybe I’m biased. We have two Teslas, one with, and one without a radar. We’ve seen much more phantom braking with my radar equipped model 3. Anecdotally, I find it happening less in the Y. Also, I didn’t click the article originally as Fred is a click diva and generally disliked by the Tesla community for his questionable reporting. Electrek is an EV fan blog, not much else.


https://www.washingtonpost.com/technology/2022/02/02/tesla-p...

WashPo reports a huge spike of federal complaints from Tesla owners starting in Oct 2021, well into the Vision-only era of Tesla technology.

These are some pretty respectable sources. Federal complaints are public.

> “We primarily drove the car on two-lane highways, which is where the issues would show themselves consistently,” he said in an email. “Although my 2017 Model X has phantom braked before, it is very rare, the vision-based system released May 2021 is night and day. We were seeing this behavior every day.”

So we have Electrek, Washington Post, and the official NHTSA Federal registry in agreement over these phantom braking events spiking in October / November timeframe of 2021. I don't think this is an issue you can brush off with anecdotal evidence or anti-website kind of logic.


That’s totally fair. I’m not pretending it isn’t a problem. Phantom braking is scary as hell when you’re on the highway. I misread your comment on the date and think that’s the thing you really focused on, when I didn’t. You’re right. This is a serious problem.


He also says that creating a Hyperloop is "not that hard"


Tesla partnered with Luminar by the way and even tested their LiDAR on a model 3 last year. I guess they weren't impressed though, since they seem to still be all-in on passive optical recognition.


> I guess they weren't impressed though, since they seem to still be all-in on passive optical recognition.

That's one take - the other take is that they have been selling cars on the claim that they are capable of FSD without Lidar, and have been selling FSD as a $5k bolt-on, so swapping to Lidar at this point would be a PR nightmare even if it were a better solution....

That's the cynical view though... (Although I also wouldn't be the one to tell the people that have spent lots of money on Autopilot that they have bought total vaporware - or be the CFO who announces they are retrofitting Lidar sensors). Once you are all-in on 'lidar is shit' it's hard to reverse course, despite rapidly falling costs.


>Once you are all-in on 'lidar is shit' it makes it hard to reverse the trend

It can be done, if there's good cause. Just partner with your lidar oem of choice, get them to do a white paper about how the latest point increase version of hardware or firmware is "revolutionary!" and then claim that your earlier criticisms of lidar have been fully addressed by the groundbreaking new lidar tech.


I've actually been suspecting this will happen once solid state LIDAR technology crosses a certain threshold.

Traditional old school LIDAR units with spinning scan heads are why quite a few self driving cars have the odd bumps and protrusions on them. It's very easy to see someone who wants to make a "cool car" looking at these protrusions, deciding "lidar is shit" and doing everything possible to avoid it. There are some good engineering reasons to avoid traditional lidar units. Meanwhile solid state LIDAR tech has only been on the market for a few years and is still quite expensive compared to traditional LIDAR models, but it's definitely superior for a lot of places people want to be able to use LIDAR or where LIDAR would be an excellent competitor to other technology currently in use such as 3D depth mapping and Time of Flight cameras. I briefly looked into some of this stuff when considering work on an "art game" using VR and various 3D scanning technologies in order to make a "fake" Augmented Reality experience as part of constructing the deliberate aesthetic choices of the project.

Solid state LIDAR will definitely be pushed forward by market demand for wider fields of view, lower costs, and smaller module size. All of which will eventually lead to a situation where it would be stupid not to augment the self-driving technology with it, given the massive benefits and essentially zero downsides.


One way out of the LIDAR PR dead end would be for Tesla:

1.) When solid state LIDAR is ready, re-brand it something like SSL technology (Solid State LIDAR) and put it on new high end Teslas.

2.) Wait for all 'camera only' enabled Teslas with FSD beta to age out of service and upsell the owners on a heavily discounted FSD subscription for their brand new Teslas with SSL.


A third path would be to frame the addition of solid state LiDAR as purely an enhancement to their existing cameras - a camera upgrade rather than a new, separate sensor.


That's straight out of Apple's playbook. I recall how Tim Apple ridiculed OLED displays, until they became impossible to ignore. So I guess it can be done.


FSD is a $12k bolt on.


The public line from Musk for a while has been "LiDAR doesn't work in inclement weather, so L5 autonomous driving can't rely on LiDAR"

Obviously his stated motivation and actual motivation need not be the same.


At least Tesla are consistent - the self driving is dangerously unreliable under any weather condition :)


"Opening an investigation" means nothing before a conclusion is reached.

The accusations could be valid or totally baseless, investigations are opened regardless and specifically to find out validity.


> The accusations could be valid or totally baseless

Read the linked report. All 11 accidents were confirmed to involve:

1. Tesla vehicles.

2. Autopilot / Full Self-Driving engaged.

3. A stopped emergency vehicle with flashing lights or road flares.

These facts are not in dispute. The accusations aren't "baseless", the only question remaining is "how widespread" is this phenomenon.

These 11 accidents have resulted in 1 fatality and 11 injuries.

--------

We are _WAY_ past "validity" of the claims. We're at "lets set up demos at CES to market ourselves using Tesla as a comparison point", because Tesla is provably that unreliable at stopping in these conditions.


Hey, move fast and break things. And call your terrible experimental technology "autopilot".


> The night example looks to be specifically stacked against autopilot.

I would argue so is the real world.


It is exactly the same scenario where Uber self-driving killed a pedestrian crossing the road at night


You mean when the Uber test car detected the person but was programmed to do nothing when an obstruction was detected?


I'm fine doing away with Uber's self-driving as well. Although I think Tesla's is the worst of the lot, I'm not confident in or thrilled by any self-driving tech on public roads in the next decade


The exact situation where "uber self driving" killed a pedestrian was: the driver was literally watching a movie at her job, while she was supposed to be driving a car and training a self driving system.

A driver killed a pedestrian because she wasn't paying attention: https://www.youtube.com/watch?v=hthyTh_fopo


Sure, but this was supposed to be fully autonomous. Nobody is arguing the human didn’t make a mistake. The autonomous system, however, definitely also did.


This may be technically true (I actually don't know what the drivers full intended purpose at the time was) but it doesn't negate some extremely sketchy software practices on a safety critical system, like "action suppression" to avoid nuisance braking.

As in most accidents of this nature, there is a chain of mistakes. It's bad practice to ignore some mistakes simply because we can also point to other failures in the chain of events.


Volvo's emergency braking system, which detected it and would have braked in time, had been restricted by Uber to not be able to take any action.

Uber's system was set in a way that "non identified object in my way" didn't trigger an immediate slow down, but instead a "sleep it off for a second and check again". It saw the impact coming, and instead of notifying the driver it decided to wait a full second first, because it was programmed to do so. Which any programmer can recognize as the "turn it off and on again" idiom that tells us their system was misidentifying lots of things.

What the driver did or did not do once notified doesn't change that. That car would have killed someone, somewhere, sometime, because it was programmed to not avoid it.


Wasn't this not a pedestrian, but a cyclist crossing in a completely inappropriate place? Granted an SDC should still react to this while many humans in the same situation would not.


Pedestrian slowly walking a bike, with basically no reflective clothing, on a dark night. This is exactly how humans kill pedestrians with cars all the time.


Sure, but that's also why that specific car was equipped with a radar-based obstacle detection.....which the company specifically disabled. There's a very good chance that this system would have saved that person's life. Also while yes, humans are crap at this, it's very rare that you'd just plow into someone at full speed without even attempting to slow down or swerve - which is exactly what the car did.


Tesla didn't use LIDAR because it is more expensive [0]. Quoting Musk:

> Anyone relying on LIDAR is doomed. Doomed. Expensive sensors that are unnecessary. It’s like having a whole bunch of expensive appendices... you’ll see.

[0]: https://www.theverge.com/2019/4/24/18512580/elon-musk-tesla-...


Cost is not the only point he was making. The problem you need to solve is not just “Is there something?”, but also “What is it? And where is it going to move?”. LIDAR cannot do that. Or at least if you get LIDAR to do that, then you would have also been able to get it done with a camera, in which case you wouldn’t have needed LIDAR in the first place.

LIDAR certainly is the low-hanging fruit when it comes to the former question though (i.e. what is there in my path right now).


One question I've always had about Tesla's sensor approach: why not use binocular forward facing vision? Seems like it would be a simple and cheap way to get reliable depth maps, which might help performance in the situations which currently challenge the ML. Detecting whether a stationary object (emergency vehicle or child or whatever) is part of the background would be a lot easier with an accurate depth map, or so it seems to me.

Plus using the same cameras would help prevent the issues with sensor fusion of the radar described by Tesla due to the low resolution of the radar.

I know the b-pillar cameras exist, but I don't think their FOV covers the entire forward view, and I don't think they have the same resolution as the main forward cameras (partly due to wide FOV).

I'd love to hear why I'm wrong though.


They use three forward facing cameras, actually. And they do get a 3D representation.

https://mobile.twitter.com/sendmcjak/status/1412607475879137...

https://youtu.be/j0z4FweCy4M?t=3780


Sure, but they're not getting that 3d map from binocular vision. The forward camera sensors are within a few mm of each other and have different focal lengths.

And the tweet thread you linked confirms it's a ML depth map:

> Well, the cars actually have a depth perceiving net inside indeed.

My speculation was that a binocular system might be less prone to error than the current net.


Sure. You're suggesting that Tesla could get depth perception by placing two identical cameras several inches apart from each other, with an overlapping field of view.

I'm just wondering if using cameras that are close to each other, but use different focal lengths, doesn't give the same results.

It seems to me that this is how modern phones are doing background removal: The lenses are very close to each other, very unlike the human eye. But they have different focal lengths, so depth can be estimated based on the diff between the images caused by the different focal lengths.

Also, wouldn't turning a multitude of views into a 3D map require a neural net anyway?

Whether the images differ because of different focal lengths or because of different positions seems to be essentially the same training task. In both cases, the model needs to learn "This difference in those two images means this depth".

I think with the human eye, we do the same thing. That's why some optical illusions work that confuse your perception of which objects are in front and which are in the back.

And those illusions work even though humans actually have an advantage over cheap fixed-focus cameras, in that focusing the lens on the object itself gives an indication of the object's distance. Much like you could use a DSLR as a measuring device by focusing on the object and then checking the distance markers on the lens' focus ring. Tesla doesn't have that advantage. They have to compare two "flat" images.


> I'm just wondering if using cameras that are close to each other, but use different focal lengths, doesn't give the same results

I can see why it might seem that way intuitively, but different focal lengths won't give any additional information about depth, just the potential for more detail. If no other parameters change, an increase in focal length is effectively the same as just cropping in from a wider FOV. Other things like depth of field will only change if e.g. the distance between the subject and camera are changed as well.

The additional depth information provided by binocular vision comes from parallax [0].

> Also, wouldn't turning a multitude of views into a 3D map require a neural net anyway?

Not necessarily, you can just use geometry [1]. Stereo vision algorithms have been around since the 80s or earlier [2]. That said, machine learning also works and is probably much faster. Either way the results should in theory be superior to monocular depth perception through ML, since additional information is being provided.
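(To make the geometry-only approach concrete, here's a minimal sketch of classic stereo triangulation for a rectified camera pair; the focal length and baseline numbers are made up purely for illustration.)

    def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
        """Classic stereo relation: depth = focal_length * baseline / disparity.

        disparity_px: horizontal pixel offset of the same point between the
                      left and right (rectified) images.
        focal_length_px: focal length expressed in pixels.
        baseline_m: distance between the two camera centers, in meters.
        """
        return focal_length_px * baseline_m / disparity_px

    # Illustrative, made-up numbers: 1000 px focal length, 30 cm baseline.
    # A point that shifts 10 px between the views is ~30 m away; a 5 px shift
    # puts it at ~60 m. Note how accuracy degrades as disparity shrinks.
    print(depth_from_disparity(10, 1000, 0.30))  # 30.0
    print(depth_from_disparity(5, 1000, 0.30))   # 60.0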

> It seems to me that this is how modern phones are doing background removal: The lenses are very close to each other, very unlike the human eye. But they have different focal lengths, so depth can be estimated based on the diff between the images caused by the different focal lengths.

Like I said, there isn't any difference when changing focal length other than 'zooming'. There's no further depth information to get, except for a tiny parallax difference I suppose.

Emulation of background blur can certainly be done with just one camera through ML, and I assume this is the standard way of doing things although implementations probably vary. Some phones also use time-of-flight sensors, and Google uses a specialised kind of AF photosite to assist their single sensor -- again, taking advantage of parallax [3]. Unfortunately I don't think the Tesla sensors have any such PDAF pixels.

This is also why portrait modes often get small things wrong, and don't blur certain objects (e.g. hair) properly. Obviously such mistakes are acceptable in a phone camera, less so in an autonomous car.

> And those illusions work even though humans actually have an advantage over cheap fixed-focus cameras, in that focusing the lens on the object itself gives an indication of the object's distance

If you're referring to differences in depth of field when comparing a near vs far focus plane, yeah that information certainly can be used to aid depth perception. Panasonic does this with their DFD (depth-from-defocus) system [4]. As you say though, not practical for Tesla cameras.

[0] https://en.wikipedia.org/wiki/Binocular_disparity [1] https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.36... [2] https://www.ri.cmu.edu/pub_files/pub3/lucas_bruce_d_1981_2/l... [3] https://ai.googleblog.com/2017/10/portrait-mode-on-pixel-2-a... [4] https://www.dpreview.com/articles/0171197083/coming-into-foc...


>different focal lengths won't give any additional information about depth, just the potential for more detail.

This is also why some people will optimize each eye for different focal length when getting laser eye surgery. When your lens is too stiff from age, it won't provide any additional depth perception but will give you more detail at different distances.


Wow. Ok. I did not know that. I thought that there is depth information embedded in the diff between the images taken at different focal lengths.

I'm still wondering. As a photographer, you learn that you always want to use a focal length of 50mm+ for portraits. Otherwise, the face will look distorted. And even a non-photographer can often intuitively tell a professional photo from an iPhone selfie. The wider angle of the iPhone selfie lens changes the geometry of the face. It is very subtle. But if you took both images and overlayed them, you see that there are differences.

But, of course, I'm overlooking something here. Because if you take the same portrait at 50mm and with, say, 20mm, it's not just the focal length of the camera that differs. What also differs is the position of each camera. The 50mm camera will be positioned further away from the subject, whereas the 20mm camera has to be positioned much closer to achieve the same "shot".

So while there are differences in the geometry of the picture, these are there not because of the difference in the lenses being used, but because of the difference in the camera-subject distance.

So now I'm wondering, too, why Tesla decided against stereo vision.

It does seem, though, that they are getting that depth information through other means:

Tesla 3D point cloud: https://www.youtube.com/watch?v=YKtCD7F0Ih4

Tesla 3D depth perception: https://twitter.com/sendmcjak/status/1412607475879137280?s=6...

Tesla 3D scene reconstruction: https://twitter.com/tesla/status/1120815737654767616

Perhaps it helps that the vehicle moves? That is, after all, very close to having the same scene photographed by cameras positioned at different distances. Only that Tesla uses the same camera, but has it moving.

Also, among the front-facing cameras, the two outermost are at least a few centimeters apart. I haven't measured it, but it looks like a distance not unlike between a human's eyes [0]. Maybe that's already enough?

[0] https://www.notateslaapp.com/images/news/2022/camera-housing...


> But, of course, I'm overlooking something here. Because if you take the same portrait at 50mm and with, say, 20mm, it's not just the focal length of the camera that differs. What also differs is the position of each camera. The 50mm camera will be positioned further away from the subject, whereas the 20mm camera has to be positioned much closer to achieve the same "shot".

Yep, totally.

> Perhaps it helps that the vehicle moves? That is, after all, very close to having the same scene photographed by cameras positioned at different distances.

I think you're right, they must be taking advantage of this to get the kind of results they are getting. That point cloud footage is impressive, it's hard to imagine getting that kind of detail and accuracy just from individual 2d stills.

Maybe this also gives some insight into the situations where the system seems to struggle. When moving forward in a straight line, objects in the peripheral will shift noticeably in relative size, position and orientation within the frame, whereas objects directly in front will only change in size, not position or orientation. You can see this effect just by moving your head back and forth.

So it might be that the net has less information to go on when considering objects stationary directly in or slightly adjacent to the vehicles path -- which seems to be one of the scenarios where it makes mistakes in the real world, e.g. with stationary emergency vehicles. I'm just speculating here though.

> Also, among the front-facing cameras, the two outermost are at least a few centimeters apart. I haven't measured it, but it looks like a distance not unlike between a human's eyes [0]. Maybe that's already enough?

Maybe. The distance between the cameras is pretty small from memory, less than in human eyes I would say. It would also only work over a smaller section of the forward view due to the difference in focal length between the cams. I can't help but think that if they really wanted to take advantage of binocular vision, they would have used more optimal hardware. So I guess that implies that the engineers are confident that what they have should be sufficient, one way or another.


> why not use binocular forward facing vision?

Because Tesla have demonstrated that it's unnecessary. The depth information they are getting from the forward-facing camera is exceptional. Their vision stack now produces depth information that is dramatically superior to that from a forward-facing radar.

https://www.youtube.com/watch?v=g6bOwQdCJrc&t=556s

(It's also worth noting that depth information can be validated when the vehicle is in motion, because a camera in motion has the ability to see the scene from multiple angles, just like a binocular configuration. This is how Tesla trains the neural networks to determine depth from the camera data.)


How can it be unnecessary if they are having all these issues? The phantom brake events are no joke.


It makes intuitive sense since you can, say, play video games with one eye closed. Yes, you lose field of view. Yes, you lose some depth perception. But you don't need to touch your fingertips together, and all your ability to make predictive choices and scan for things in your one-eyed field of view remains intact.

In fact, we already have things with remote human pilots.

So increasing the field of view with a single camera should intuitively work as long as the brains of the operation was up to the task.


Also there are plenty of humans who are blind in one eye and they can still drive a car without difficulty.


What I was talking about largely doesn't apply to the Autopilot legacy stack currently deployed to most Tesla cars.

Personally I wish Tesla would spend a couple of months cleaning up their current beta stack and deploying it specifically for AEB. But I don’t know if that’s even feasible without affecting the legacy stack.


> Their vision stack now produces depth information that is dramatically superior to that from a forward-facing radar.

RADAR is lower fidelity though: blocky, slow, and it doesn't handle changes in direction or dimension very well. RADAR isn't as good as humans at depth. The only benefit of RADAR is that it works well in weather/at night and at near range, as it is slower to bounce back than lasers. I assume the manholes and bridges that confuse RADAR are due to the low-fidelity / blocky feedback.

LiDAR is very high fidelity and probably more precise than the pixels. LiDAR is better than humans at depth and at distance. LiDAR isn't as good in weather; neither is computer vision. Great for 30m-200m. Precise depth, dimension, direction and size of an object, whether in motion or stationary.

See the image at the top of this page and overview on it. [1]

> High-end LiDAR sensors can identify the details of a few centimeters at more than 100 meters. For example, Waymo's LiDAR system not only detects pedestrians but it can also tell which direction they’re facing. Thus, the autonomous vehicle can accurately predict where the pedestrian will walk. The high-level of accuracy also allows it to see details such as a cyclist waving to let you pass, two football fields away while driving at full speed with incredible accuracy.

[1] https://qtxasset.com/cdn-cgi/image/w=850,h=478,f=auto,fit=cr...

[2] https://www.fierceelectronics.com/components/lidar-vs-radar


> Because Tesla have demonstrated that it's unnecessary. The depth information they are getting from the forward-facing camera is exceptional.

Sure! Here's a Tesla using its exceptional cameras to decide to drive into a couple of trucks. For some strange reason the wretched human at the wheel disagreed with the faultless Tesla:

https://twitter.com/TaylorOgan/status/1488555256162172928


That was an issue with the path planner, not depth perception, as demonstrated by the visualisation on screen. The challenge of path planning is underrated, and it's not a challenge that gets materially easier with the addition of LIDAR or HD maps. At best it allows you to replace one set of boneheaded errors with another set of boneheaded errors.


No! It was an issue with the trucks! They shouldn't have been in the way in the first place! Don't they know a Tesla is driving through? They mustn't have been able to see it since they lack exceptional cameras.


Apologies, I thought you were being serious.


That's okay. I didn't think you were being serious so that makes us even.


Which raises the question of why it was so easy to demonstrate it failing at CES.


Because the software running in release mode is a much, much older legacy stack. (Do we know if the vehicle being tested was equipped with radar or vision only?)


But AI and ML aren't as good as a human brain, or maybe any brain. I imagine the gap has to be closed with better and more numerous sensors, or by making fundamental leaps in computing technology.


LIDAR works in the dark and in bad weather; it's useful to know there is something in front of you, even if you don't know what it is.


What bad weather are you talking about where LiDAR works well? It notably does not perform well (or really at all) in rain/snow/fog.

You might be thinking of radar, which Tesla is also no longer putting in their cars.


I've never understood their reasoning. It sounds like a Most Interesting Man in the World commercial: "I don't always tackle the hardest AI problems known to mankind, but when I do, I tie one hand behind my back by not fusing data from every possible sensor I can find in the DigiKey catalog."

IR lidar would be pretty useful in rain and fog, I'd think. But I'd rather have all three -- lidar, radar, and visual. Hell, throw in ultrasonic sonar too. That's what Kalman filters are for. Maybe then the system will notice that it's about to ram a fire truck.


The puzzle piece you are missing is that sensor fusion is not an easy problem either. The Tesla perspective is that adding N different sensors into the mix means you now have N*M problems instead of M.


I hope that's not their perspective, because that perspective would be wrong. There are entire subdisciplines of control theory devoted to sensor fusion, and it's not particularly new. Rule 1: More information is better. Rule 2: If the information is unreliable (and what information isn't?), see rule 1.

Some potential improvements are relatively trivial, even without getting into the hardcore linear algebra. If the camera doesn't see an obstacle but both radar and lidar do, that's an opportunity to fail relatively safely (potential false braking) rather than failing in a way that causes horrific crashes.
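(A toy illustration of that fail-safe idea - my own sketch, not anyone's actual stack: require agreement from at least two independent sensors before dismissing a detection, so a camera miss alone can't override radar and lidar.)

    def should_brake(camera_sees, radar_sees, lidar_sees):
        """Conservative voting rule: brake if at least two of the three
        independent sensors report an obstacle ahead.

        The failure mode of this rule is an occasional false braking event
        (annoying), rather than driving into an object that one sensor
        missed (catastrophic).
        """
        votes = sum([camera_sees, radar_sees, lidar_sees])
        return votes >= 2

    # The scenario above: the camera misses the obstacle, but radar and
    # lidar both see it -> the car brakes.
    assert should_brake(False, True, True)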

Bottom line: if you can't do sensor fusion, you literally have no business working on leading-edge AI/ML applications.


LIDAR does not work in bad weather, fog or heavy snow looks like an oncoming wall.


"You'll see" is the perfect Musk sign for "I have no idea what I'm talking about and I'm frankly just interested in a few suckers believing me."


How much more expensive are we talking? Also won't it get cheaper with time, like the batteries?


Every Pro iPhone has one. So it already got pretty cheap by now. Looking at Mercedes' Level 3 Autopilot tech you can also see how well you can integrate the sensors into the front of a car.


Short range VCSEL is very different than the automotive rotary lidar systems.


At the time of comment, a LiDAR rig would cost around $10,000. A few years before that, they were more like $100,000. Presumably the cameras are much cheaper.

I would be willing to bet that production efficiencies will be found that will eventually drive that cost down significantly.


>stacked against autopilot

To be fair, it's not a computer performance benchmark being gamed here. If nighttime is problematic, autopilot shouldn't be running at night. Because if I'm a pedestrian, then the odds are stacked against me in any physical encounter with a vehicle. Fairness in the scenario setup shouldn't really be part of the conversation unless it goes beyond the claims of the manufacturer, i.e., if Tesla had said "autopilot does not function in these conditions and should not be used at those times" and "nighttime" was one of those conditions listed. If Tesla hasn't said that a scenario is outside the scope of AutoPilot, then the scenario is an appropriate test & comparison point.


> if Tesla had said "autopilot does not function in these conditions and should not be used at those times" and "nighttime" was one of those conditions listed. If Tesla hasn't said that a scenario is outside the scope of AutoPilot, then the scenario is an appropriate test & comparison point.

I'd go further and say add "and set the software not to engage this feature at nighttime". Simple disclaimers are not enough when lives are at stake.


I would really like to hear Tesla defend their decision not to use them.

Andrej Karpathy talks some about it in this (it's quite long, but the whole thing is quite interesting):

https://www.youtube.com/watch?v=NSDTZQdo6H8


I'd never trust a "self-driving" car without lidar. It should be a requirement. There's tons of research on how easy it is to completely fool neural nets with images.


The night example looks to be specifically stacked against autopilot.

I don't think so. If the autopilot can't be used at night, I - who live in Norway - just can't use it during the winter as there isn't enough light. I don't even live above the arctic circle and am lucky enough to get 4-5 hours of (somewhat dimmed) daylight during the darkest times.

If it doesn't do night, it is simply a gimmick to lure you into paying for a promise of a car.


Reminds me of an incident that happened to me last December: I was driving my kid to school and I noticed some pedestrians on the sidewalk. The mom was walking and texting and the little boy was dribbling a soccer ball while they walked to the school. Suddenly the soccer ball got onto the road and the kid dove after it... in the middle of the road, inches from my car. I am so grateful to whatever braking system my car had for stopping just in time. I honked, and the mom flipped me the birdie and cussed me out in what I think was Russian.

Kids are stupid and unpredictable, and AI/ML can't work out all the insane ways kids can put themselves in harm's way. No autopilot or FSD can. People should not rely upon them.


I think the main point is to know the limitations of the technology and to deploy it appropriately. For instance, I don't rely on old-school cruise control to stop for small children, either, even though I engage it in school zones.

This isn't limited to "Tesla-tech". The same rules apply to ALL technology.


"I think the main point is to know the limitations of the technology and to deploy it appropriately"

Where does Tesla provide a list of such limitations for its customers? I am sure it would be extensively documented, given that lives are at stake.

Or should I find out those limitations myself, potentially killing a few children in the process?


> Where does Tesla provide a list of such limitations for its customers,

One specific place is first sentence of the FSD Beta welcome email:

"Full Self-Driving is in limited early access Beta and must be used with additional caution. It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road. Do not become complacent."

That's been my experience with it. Right now, the beta doesn't reduce my workload, it increases it. When I want to "just drive", I turn the beta off.

That said, Tesla can and should do more. They need to better frame the capabilities of the system, starting with the silly marketing names.


> It may do the wrong thing at the worst time.

So, basically, I need to somehow predict that FSD will do the wrong thing and react myself, _before_ the worst time, because the worst time is when it's already too late.

Or, in other words, whereas any other car manufacturer has fallbacks for when the driver is not doing what they're supposed to, Tesla treats the driver as the fallback instead. I just don't understand what is this magic that is supposed to allow the driver to predict incorrect AI behavior.


> So, basically, I need to somehow predict that FSD will do the wrong thing and react myself, _before_ the worst time, because the worst time is when it's already too late.

Don't confuse prediction and anticipation. Prediction requires that you know what's going to happen. Anticipation is getting ready for something that might happen. Anticipation is a normal part of defensive driving every day, not prediction.

Let's go back to defensive driving 101: defensive driving allows mistakes to be made. It allows bad things to occur and still recover from them safely. Bad things happen because mistakes are made by humans in the car, by humans outside of the car, and also by the computer in the car. The change here is that the computer is being given much more latitude to make mistakes. It does NOT grant the computer the ability to remove defensive margins from driving.

If you drive (regardless of FSD) with no defensive driving margins, you immediately enter "too late" territory whenever a mistake is made.


Definitely, and that's exactly why I claim this is not even close to FSD, and why I absolutely do not want this in my car.

If I have to be on the wheel and ready to react at any point in time, then I'd rather be the one driving and that's it.

I think regular "old" adaptive cruise control and lane assist are vastly superior to this. I am on the wheel and in charge for 99.9% of the time, as I should be regardless of FSD, and the technology saves me in the 0.1% when I am not.

FSD will never be FSD without a complete redesign of infrastructure, which will not happen in our lifetime.


What does FSD really give you, then? It doesn't reduce the mental toll of staying alert and anticipating the road. It's probably only safe to use on the highway. On the highway, it provides the same automated acceleration/braking you could get from radar-assisted cruise control you can find on any modern car. Like cruise control, it's probably a good idea not to rely on it to avoid large stationary obstacles like a turning freight truck. It does purport to control the steering wheel too, but you can't really trust it not to steer into highway medians either.

They should just rename it to something boring like "camera-assisted cruise control" and remove the beta label so everyone can use it.


That is what Tesla Autopilot is.

FSD is aiming for something a bit higher.


Find them yourself by RTFM maybe?

Tesla puts all the info you need in the owner's manual, just like every other manufacturer with automated systems on their cars.

https://www.tesla.com/ownersmanual/model3/en_us/GUID-8EA7EF1...

There are dozens of warnings throughout the manual explaining limitations and cautions around using the systems.

Every other car I've owned with the same or similar systems has the same warnings littered throughout the manual.


> I think the main point is to know the limitations of the technology and to deploy it appropriately.

Such as, for example, by not calling it "autopilot" or "full self driving"?


It may work somewhat like airplane autopilot, but the environments are not comparable. A plane has nothing to hit but terrain which is easily identified and almost all other obstacles in the air are transmitting their position.

It's entirely deceptive.


In addition, pilots are required to have thousands of hours of training for that specific model airplane. I'm sure the limitations of autopilot come up.

Meanwhile, in most US states, an adult can walk into a DMV, demonstrate the ability to turn on the vehicle and do a 3-point/k turn, and walk out with a license.


And at least in one state, all a kid needs is their parent to tell the DMV they can drive

[0] https://www.caranddriver.com/news/a32329549/georgia-no-drive...


That's not that bad; Belgium used to have no driving licenses for normal cars (everyone could drive) and the accident figures were similar to neighbouring countries.


I'll give you FSD, but autopilot makes sense to me as someone familiar with aviation.


How about "The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself."

Tesla Marketing: 2016


> Tesla Marketing: 2016

For reference, this same marketing video is still up on Tesla's site[1].

[1] https://www.tesla.com/videos/autopilot-self-driving-hardware...


A small misunderstanding - "legal reasons" are that it is the person in the driver's seat who is legally liable for damage done while the car was driving itself, not Tesla.


Sounds like trash, but then that's not relevant to what I said.


The important thing here is that for over half-a-decade, Tesla has been lying to its customers about its capabilities.

When in actuality, Tesla will reliably crash into pedestrians and stationary firetrucks. To the point where people at other companies are confident to make live-demos of this at electronic shows.

---------

Calling it "autopilot" or "fsd" isn't the problem. The problem is that Tesla actively lies about its capabilities to the public and its customers. It doesn't matter "how" they lie or exactly what weasel words they use. The issue is that they're liars.

We can tell them to change their name or change their marketing strategy. But as long as they're liars, they'll just find a new set of weasel words to continue their lies.


Does autopilot make sense? Aviation autopilot seems to be many orders of magnitude more reliable than Tesla's autopilot.

In fact, autopilot in aviation contexts is regularly used when human pilots are worse, such as landing at airports that regularly experience fog & low visibility conditions. As in, autopilot is the fallback for humans, not the other way around.

Heck, aviation autopilot is now available for use in emergency landings ( https://www.avweb.com/aviation-news/garmin-autoland-wins-202... ).

Compared to Tesla autopilot, these are seemingly two vastly unrelated situations.


Surely autopilot is an easier problem to solve compared to self-driving cars? Air traffic is controlled, road traffic is chaotic. Aerial vehicles move through what's essentially empty space with pretty much no obstacles, cars must navigate imperfect always changing urban mazes full of people whose actions are unpredictable.


And there is ground infrastructure in place for autolanding, aiding localization, etc. Much different from roads.


Is the average purchaser of autopilot familiar with aviation and the technical capabilities of an autopilot in that context?


I’m not familiar with aviation and the only reason I’m aware that airplane autopilot is actually not a self-flying system is because of Tesla and their weasel excuses for their reckless marketing.


Does it? What’s the expected response time on disengagement for a plane?


FSD? Friendship drive?


People intuitively understand the capabilities of cruise control. Can the same be said of FSD?


Given the crashes with cruise control in bad weather, I think the level of understanding is likely fairly similar.


I am not seeing the relation between cruise control and crashes in bad weather?

If I bought something that says it can drive itself, then I expect I do not need to pay attention to the road because it can drive itself. Just like if my friend can drive themselves and I am a passenger, I can trust them to handle paying attention to the road.

To go out of your way and call something "full" self driving only indicates that I should have zero qualms about trusting that I do not need to pay attention to the road.


I'm guessing the 'bad weather' comment is referring to the common belief[1], possibly exaggerated[2], that cruise control can be dangerous and cause crashes when the road is slippery. Not sure what's changed with newer traction control systems. I'd have to believe this has gotten even less likely but I don't know; my cars are too old to even have ABS.

One of the anecdotes in the Jalopnik article mentions that the vehicle is a Town Car, which is significant because those are rear wheel drive and handle very differently from most cars on the road in slick conditions. I would certainly expect more issues with older RWD cars and trucks because they tend to fishtail and spin if the rear wheels are given power without traction.

[1] https://www.snopes.com/fact-check/wild-when-wet/ [2] https://jalopnik.com/lets-debunk-the-idea-that-its-not-safe-...


Why would you ever engage cruise control in school zones?


I personally have a tendency to match the speed of the cars around me. IMNHO, most cars speed through school zones. I use cruise control as a tool to prevent me from accidentally matching the speed of the cars around me and breaking the school zone speed limit.


To avoid speeding. It can be hard to avoid accidentally speeding by 1-2 mph and enforcement is sometimes zero tolerance.


That's crazy, I've never seen anybody get a ticket for 1-2mph over the limit. Problems with that: cops would be wasting resources because 1-2mph over the limit isn't significantly dangerous. Also, the radar guns can't be easily calibrated to that level of accuracy.

If our local cops did this, I'd just make an online post about it so everybody knew the cops were doing it and then it would stop.

My experience (based on a few tickets and observing many cops) is that they don't really care unless you're about 10+ MPH over the limit and also doing unsafe things. That's not to say they don't snag people just driving 5MPH over the limit, but it's not a core activity unless the department is using tickets as a revenue source or trying to make some sort of weird point.


I was in a driver's education class (I’d rear-ended someone, and was trying to keep points off my license), and we went around the room explaining what law we broke to wind up in the class. One attendee was there for 1 MPH over in a school zone. Was it probably racial profiling? Probably. But I now stick to exactly 0 MPH over in school zones, and have routinely seen police monitoring speed while dropping my kids off at school. There appears to be zero tolerance even for the most politically connected soccer mom.


Acknowledged. If I received such a ticket I would sign the form the officer gave me and then immediately protest (appeal) the ticket and go to court. In particular, giving a ticket for 1mph over the limit doesn't make sense because the marginal danger (IE, how much more dangerous it is) to drive 1 mph over the limit is tiny, it's 10-15 mph that is dangerous. The police actually have to make a case justifying the value of spending the time of stopping you.

I just looked into the details. In my state, CA, 1-15mph over the limit is specially treated, with one point that eventually gets cleared.

I'm amused because (as I mentioned) I live in a school zone and I just drove home at about 5mph, because the streets were so crowded that anything faster would have been impossible. A cop could not have parked in any location near my house because every spot was taken, and all sightlines were blocked by SUVs or buses.


> it's not a core activity unless the department is using tickets as a revenue source or trying to make some sort of weird point.

Or the officer is racist. I know we’re veering into very off topic discussion here but your experience and resulting list does miss a key component for an experience often described as “driving while black”. 1mph over the speed limit would absolutely see you get pulled over.


> If our local cops did this, I'd just make an online post about it so everybody knew the cops were doing it and then it would stop.

Either you have a very unusual constabulary, or this is wildly optimistic.


The local police were running a traffic light camera scam. Once it was made public, it was immediately stopped. https://padailypost.com/2019/07/12/city-ends-red-light-camer...


That’s a situation where the cameras were wrong and a court would have likely forced a change.

The scenario under discussion is a case where the police are within their rights. A simple blog post would never force any change in most municipalities, much less immediately.


I forgot to mention that cars don't really have high accuracy speedometers; they could be 10% off which could easily cause a conscientious driver to speed. What's the point of giving somebody a ticket for driving 26mph when their speedometer says 24? That's, like, just silly.


I’m not disagreeing with the premise that the police are being overzealous. I just find it hard to imagine a world where someone complaining about it online is guaranteed to result in changes.


In some locations the speed enforcement is autonomous.

>My experience...

Will of course be much different from someone who lives in a different locale or is a different ethnicity and/or social class than you.

>If our local cops did this, I'd just make an online post about it so everybody knew the cops were doing it and then it would stop.

Yeah, ok.

I've met a lot of people with inflated egos but believing you can dictate local law enforcement policy with your internet posts is on a whole nother level.


Autonomous? You mean, like a system that takes photos and sends you a bill? Yes, most such things were removed in our town after it turned out they were set wrong (sending tickets to people who didn't break the law).

Our city manager reads Patch, Reddit and other things for our town and occasionally engages with the community around policy. This is absolutely something where, if you wrote a careful post on Reddit asking "Hey, are our cops doing the right thing stopping people going 1mph over near a school instead of stopping <whatever>?", there would be an argument, a few people would say 1mph over is 1mph over the law, but really, the outcome would be that the ticket appeal would be approved and the cops in my town would be told not to do that.

Pointing out to somebody who says "in my experience" that others would have a different experience is pointless. I know that. If cops are giving people tickets for ethnicity (or even deadheads driving through georgia, which used to happen) that's an entirely different problem from pointless enforcement.


That is a non-issue. If your speed variance is 2mph, then drive 4mph under the speed limit.


With a 25mph limit and no significant traffic, that would be actively miserable.


I live on a road with two schools zones about a mile apart. I have had people pass me in the morning in the school zone! People do.not.care.

EDIT: Fixed "ppl" to "people".


This is a pet peeve of mine, but why use 'ppl' when you spell out every other word, and then spell out people in the end?

Edit: Yes, ppl bugs me and there is no rational reason why. Emphasis on 'pet peeve'.


Because typing on phones can be annoying and ppl is quicker than people. :)

Or maybe they were texting and using FSD ;)


That's not really valid in the age of Swype-style keyboards. It's less effort to do one swipe than multiple taps to hit individual letters for txt-speak.

Also, if you text and drive I sincerely hope you hit a tree or something else solid that doesn't hurt the road traffic and pedestrians around you - alleged FSD or no, we don't have level 5 fully-autonomous cars yet, and so not paying attention to the road is just as bad as drunk or drugged driving.


Sure is valid. Not everyone likes the Swype style keyboards.

The FSD part was a joke. :)


Fixed it for you.


To make sure you aren't speeding?


I've never had trouble maintaining 1-2 mph under the limit. I just use less gas and look at the speedometer at times.


I agree with you. With a little bit of driving experience, you have a natural sense for what the safe speed is on a road, and that speed is almost always the speed limit, in my experience.

On a busy road with lots of pedestrians crossing, I naturally want to go much slower than I would on the same road if there were no other pedestrians or traffic. "School zones" just codify that into law - when you expect lots of kids to be crossing a road, the speed limit of the road should be lower.

The issue, for me at least, is the ambiguity. When is the school zone in effect? This creates a cognitive load. The road was clearly meant for 45 mph travel, because that is the normal speed limit. So if I let my "autopilot" brain take over, I will probably go over the 25 mph school zone limit.

It's a special case. So when I see a school zone, I unconditionally set the cruise control to the school zone speed limit. This frees my brain from any cognitive load about whether school is in session. It also guarantees that I am not influenced by the guy tailgating me.

The ability to set the speed of your car exactly, without monitoring, is really useful.


That's definitive proof that it still doesn't work reliably, let alone the system confusing the moon with a traffic light [0]. It shows that it is even worse at night.

I have to say that FSD is so confused, you might as well call it 'Fool's Self Driving' at this point.

Oh dear.

[0] https://twitter.com/jordanteslatech/status/14184133078625853...


Puts them in good company. Not the first system to get the wrong ideas about the moon.

https://en.wikipedia.org/wiki/Ballistic_Missile_Early_Warnin...


While I 1000% agree the current Tesla FSD beta is in serious need of work, comparing it to unreleased specialized hardware in trials set up by the makers of said specialized hardware is a little disingenuous.


But I'm not comparing it against the technology. I'm simply pointing out that Tesla __regularly__ crashes into balloon-children, to the point where a competitor literally used Tesla as a marketing-mule to show off how much progress they made.

--------------

This entire marketing scheme that Luminar did only works because they 100% trust that the Tesla will run into that balloon child on a regular basis (!!!!). This is literally a live demonstration in a major convention center in front of a live audience.

I don't have any idea how rigged or unfair Luminar's test is in favor of Luminar's hardware. I don't even trust that Luminar's hardware works (I don't know if they're faking their tech either).

But what I do trust, is for that Tesla to plow into that balloon over-and-over again on a reliable basis.

That's how far away Tesla is from "full self driving", or whatever the heck they want to call it.


This is nonetheless comparing a product currently on sale with a development platform not currently on sale. Surely a fairer test would be to compare a Luminar tech development rig with a Tesla tech development rig.

Luminar doesn't appear to have access to Tesla's under-development technology beta which is, as YouTube videos of the FSD beta clearly show, markedly superior to the technology currently deployed in most Tesla cars. I can't say whether the FSD beta would reliably stop for balloon-children, but from the warts-and-all videos being posted on YouTube, it seems highly likely that it would.


> Surely a fairer test

What?

The "test", as far as I'm concerned, is whether or not Tesla will kill children on our streets. And given this test, it proves that Tesla's emergency braking capabilities are awful.

I don't care about Luminar tech at all, frankly. If Luminar becomes available to buy one day, maybe I'll care then.

---------

At the _ROOT_ of this discussion is a poster who openly admits to using "Autopilot" in school-zones. Given that Tesla will _NOT_ stop for balloon-children until it's way too late (the braking action was _AFTER_ the balloon-child was smacked), I have to immediately reprimand the original poster. This is a critical safety issue that people must understand: "Autopilot" and "FSD" are NOT ready to be used in school zones, or other locations where children might run out into the streets randomly.

This has nothing to do with tech-companies swinging their tech around at a show. This has everything to do with stopping the next child from getting smacked by an ignorant driver.


By that logic, the fairer test is for an independent arbiter to compare Tesla's AEB with other cars on sale, modelling a variety of common collision scenarios. All that's been proven here is that a motivated competitor was able to identify one scenario where their development tech works and Tesla's does not.

Let me be clear: I'm not saying that Tesla's current release version of AEB is anywhere near good enough. Clearly there is substantial room for improvement. If I had any influence over Tesla's priorities, I would make them package up the subset of FSD Beta necessary to perform AEB and get that deployed to the entire fleet as a matter of urgency.

------

As for whether a driver chooses to use any L1 or L2 driver assistance features in an inappropriate context, that is always the responsibility of the licensed driver. Engaging these systems is always a manual task performed by the driver. They are not on by default.

If there is a legitimate concern here, perhaps the correct response is to institute a blanket ban on the use of any assistance technologies (e.g. adaptive cruise, lane centring) while in school zones.


Though that scenario is generally well handled by the emergency braking systems found in the standard safety suite of regular automakers.

https://www.iihs.org/news/detail/performance-of-pedestrian-c...


This is such a slimy fake test it makes me side with Elon. The cardboard cutout doesn't behave anything like a child. It behaves like a bit of cloth rubbish which I would hope the car would run straight through.


By a school that "cloth rubbish" could be accompanied by a child chasing it (see comment near this one), so one would hope it would at least slow down...


This isn't an honest test. Think through the reality and then mimic that - but the reality isn't a child standing still in the middle of the road in the middle of the night.

Also, Tesla still requires you to pay attention - which is relying on it, but they tell you NOT to rely on it 100% - so in this demo the driver is at fault for not watching ahead of them and braking. So your claim that they're awful at stopping is pretty disconnected.


Even if you are paying attention, wouldn’t there be an inherent delay between passively paying attention and taking over a self-driving car? I don’t have any evidence but it seems inevitable. Furthermore, by design, even attentive, well-intentioned drivers are more likely to be distracted or underestimate risks if the car is doing the driving 99% of the time. The ‘well technically it wouldn’t have murdered that clearly visible child if you were paying attention’ defence is both specious and obviates a lot of the value of Tesla’s autonomous driving tech.

I won’t argue there aren’t some inherent benefits of even basic driver aids, but if you have to pay 100% attention 100% of the time, the tech loses its lustre. This is a problem largely of Tesla’s own making IMO - their marketing and Elon’s public statements paint a much, much rosier picture and people buy into it.


I can't be the only one a bit freaked out by the laughter at the end of that first video?


You can carefully construct a case where it wouldn’t work, obviously. Even humans have blind spots, and are prone to visual hallucinations. Stop spreading FUD. I have AP and I know it works reasonably well - as long as I have my eyes on the road and hands on the wheel, it is safe, while reducing a great deal of stress.


Absolutely. My 2021 Audi has traffic sign recognition: it recognizes school zone signs and flags them as such on the console and heads-up display. It also recognizes the flashing light indicating that the school zone is "active".

But yet, Tesla, the "not-dinosaur", screws this up completely?

Oof.

AND, if you have adaptive cruise, it will absolutely recognize the discrepancy between your speed and the school zone limit, and will decelerate the car to that speed.


Many school zones do not have flashing lights.

Isn't the correct answer today that humans should override in these situations?

Frankly, as a human I also find "during restricted hours" signs frustrating. How do I know which hours those are?


> Frankly, as a human I also find "during restricted hours" signs frustrating. How do I know which hours those are?

As a human being, you weigh the risks and make a choice. Which has the worse outcome — getting to your destination 18 seconds later, or killing a child?


Then just respect those limitations at all times. The school zone will last a whole 200m, you can live with driving 20 in there all the time instead of wondering if you can go full speed in a school zone.


A zone near me has two different sign indicator systems. They appear to go off at different intervals. In both cases I don't see humans around. I would not be surprised if the time the crossing guards and children were present was a completely different, third time.


I still don't know what is meant by "when children are present". On the sidewalk? In the playground? In the classroom?


You act like this is somehow unknowable.

Did you try to google for it? Did you look up the actual laws where you are?

It isn't hard to find the answer. If you're in California, then Vehicle Code 22352 defines it. The same will be true for every other place in the world.


That's a fair point. I grew up in Washington, where I believe it was not defined at all. It was a bit of a hobby of mine to look up vehicular RCWs when I began driving and the Internet became available.

But it's also possible it's something that puzzled me pre-Internet, and I never bothered to look up post-Internet. There are plenty of those as well.


I asked a CHP officer once and they said "In or near the crosswalks, including sidewalks near crosswalks." Note that police officers are not necessarily up on all of the intricacies of the law though.


It is frustrating that it isn't clearly delineated, but you should be fine basing your guess on some reasonable assumptions: schools are busiest during drop off and pickup, with kids out and about near the road, so there's a very good chance those times are included in the "restricted hours", possibly the hours in between as well. Public and secular schools are typically not open on weekends, so any weekend hour is very likely not restricted. Schools are usually closed during the summer months and for a time during winter break.

This can be tricky because schools can still have very different schedules and you may not know them if you don't have kids in school, but again context can help: is the parking lot completely empty? No one around? School's probably not in session, and the restricted hours probably don't apply.

If you were ticketed for speeding in a school zone during morning drop off, I think you'd have a hard time arguing you didn't know it was a restricted time. Maybe during lunch you could make a case, I guess it would depend on the judge you got.
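If you wanted to turn that guesswork into something mechanical, it might look roughly like this sketch; the hour windows, summer months, and the parking-lot check are my own illustrative assumptions, not any jurisdiction's actual rules:

    from datetime import datetime

    # Rough sketch of the guesswork above; every threshold is an illustrative
    # assumption, not any jurisdiction's actual rule.
    SUMMER_MONTHS = {6, 7, 8}

    def restricted_hours_likely(now: datetime, parking_lot_empty: bool) -> bool:
        if now.weekday() >= 5:           # weekend: public schools usually closed
            return False
        if now.month in SUMMER_MONTHS:   # summer break
            return False
        if parking_lot_empty:            # context clue: school probably not in session
            return False
        # Drop-off (~7-9am), pick-up (~2-4pm), and, to be safe, the hours in between.
        return 7 <= now.hour < 16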


>arguing you didn't know it was a restricted time

What difference does that make? If it is, you get a fine. Just like any other road rule.


In the US at least you can typically contest a ticket. If you feel it was unjust or unfair you can make your case before a judge, doesn’t mean you get it overturned.

If the sign didn’t state hours and it wasn’t clear it was a restricted time, maybe you could make a case.


I'm sure I saw cars with an option to read speed limit signs about 10 years back. Really boggles the mind that Tesla have gotten away with calling their cars "Full Self Driving".


> I'm sure I saw cars with an option to read speed limit signs about 10 years back.

To be fair, those systems were just best-effort. I'm pretty sure Teslas can handle far harder sign situations than those.

The problem is the edge cases, and while Tesla may fare better, the older assistant systems did explicitly warn you that it was best effort.


How does the adaptive cruise control know whether or not school is in session and children are on their way to/from school? Or does it slow you down even if it's Christmas morning? If your car forces you to go 20mph below the speed limit that seems like a major safety issue


An easy solution is to always slow down in a school zone.

If you want/need to drive faster in the school zone, you simply step on the accelerator and deactivate self-driving. You can turn it on again when through the school zone.


I think this is very much car-dependent, and probably even user-configurable.

For example, the cruise control on my 2021 Mazda 3 doesn't enforce school zone speed limits (or any speed limits), although it recognises them and will flash on the HUD if you're exceeding the limit. Since it's not enforced, I can just ignore it if it has misidentified a speed sign.

Although, school zone signs around my area actually include a bit of genius design: they are hinged in the middle [1], so during school holidays, the signs are covered up by closing the hinge.

Finally, at least in suburban areas in Australia, a school zone covers a couple hundred metres at most. So even if you or your car gets it wrong, driving 20km/h under the speed limit is a very temporary inconvenience for drivers behind you (unlike, for example, driving 20km/h under the speed limit for 50 kilometres on a single-lane highway).

[1] https://www.canberratimes.com.au/images/transform/v1/crop/fr...


A truly great AI will monitor Instagram post frequency to get to the bottom of this


Interestingly, Tesla’s earliest autopilot software (made by Mobileye) could read and respond to speed limit signs, but Mobileye patented that ability, and so when Tesla switched to their in-house software, they lost it.

Seems insane to me that you can patent reading a speed limit sign, since reading signs is what signs are for and is necessary to obey the law, but there we go… “with a computer” seems sufficient grounds to make something patentable.


I was wondering what happened. I knew they used to be able to actually read the signs, but now it's all a database that can be quite wrong. I think the DB is nice to have, since signs can be few and far between, but I would really like to see it go back to reading signs.

I don't understand how that could possibly stand up as a patent. It shouldn't pass the obviousness test. Reading a sign is a super obvious thing to do. But you would still have to spend millions fighting it in court. Which is insane.


I don't follow this discussion. My Tesla reads speed signs fine, and always has?


What model do you have? The 3 definitely doesn't read speed limit signs. Or at least doesn't use the information if it does read them. It will drive right past a sign that says 25MPH and still say the limit is 35MPH. Or drive past a 45MPH and say it's 35MPH. Things like that. And there is no way to even report it being wrong, that I have been able to find.


Current Tesla AP can read speed limit signs too.


Heh. I wonder what might happen to a person with bionic eyes.

After a lawsuit, signs are blacked out in their vision.


Technically, to black them out, you'd need to recognize them first - which would violate that patent.


Ha, subscription fee to pay for the license to the patent troll.

Then people will install modded bionic eyes firmware downloaded from Ukrainian websites[1]. The future will be like Cyberpunk 2077, except the hacks are just to bypass paywall DRM.

[1] https://www.digitaltrends.com/cars/john-deere-tractor-hacks-...


There are no truly self driving cars, and probably won't be. When a traffic cop or school teacher holds up a hand to stop cars or say "turn left NOW", what training data exists?

There should be a much stronger involvement from cities / states legislating a ban on any kind of self-driving in these areas. (self-braking -- sure!) It will have to wait for a child to die, unfortunately.

I'd like to see a self-lane-keeping lane on interstates with a 120 mph speed limit and concrete barriers. If we can have "Zero emissions" cars incentivized with access to HOV lanes, why not cars that can do a good job at lane-keeping and merge-scheduling in areas where there are exactly zero other distractions?


This reminds me of when I was driving in rural Texas a few months back and came across an apparently very recent accident involving a jack-knifed truck. The firefighters had just arrived on scene. While slowly and carefully driving around the accident, a firefighter got out, seemed quite upset with me, and seemed to be yelling something at me as loudly as he could, but I couldn't make out a word. I still have no idea if I did something wrong. Driving is 99.9% routine and boring and that .1% is ambiguous and quite potentially life threatening. I share the skepticism of self driving cars.


I had someone circle his flashlight at me as if to signal me to proceed. When I did, he casually walked in front of me. He wasn't actually paying attention and was just flinging his light around to direct traffic without thinking about it.

Yeah, the .1% can be really ambiguous and dangerous.


Cars are made to be completely quiet inside, which means there are limited ways to communicate with those inside and also that horns are obnoxiously loud.


I agree with you. Real self driving is impossible, IMO. Current "AI" tech will never get us there. And even if we cracked real AGI, I don't see a reason to expect the computer intelligence to be a better driver than a human. AGI does not mean the absence of emotions, distractions, or miscalculations.

We as a society should be realistic about the advantages and limitations of self driving technology. On a highway with well marked lanes and no construction, pedestrians, etc, self driving is awesome. That is the use case that should be optimized and encouraged by states. Everything else should realistically be banned.


> And even if we cracked real AGI, I don't see a reason to expect the computer intelligence to be a better driver than a human.

You can't be serious. Humans are notoriously bad in situations where nothing happens 99.9% of the time but requires quick reaction 0.1% of the time.


You're assuming an AGI would have all the characteristics of a machine algorithm and enough intelligence to do exactly what a human would/should (or better) in all driving situations.

That's a big ask, and is a huge superset of AGI.


Humans were evolved for vastly different things than driving. The space of all intelligences is huge.


And yet there is no evidence that self-driving systems are any safer than human drivers, or that they'll ever even be as safe as human drivers, let alone safer than them.


What do you mean "no evidence"? The companies testing self-driving cars now have hundreds of millions of miles driven and I haven't seen any reliable report showing they have higher kill rates than human drivers.

It would seem entirely possible that large amounts of highway driving can be handled by self-driving cars with slightly lower rates of accidents (tired and drunk people kill a lot of people!)


Why would a computer intelligence be any less prone to distraction?


> Why would a computer intelligence be any less prone to distraction?

Just wiggle the mouse when you're in a construction zone, so the computer's processing speeds up. Just like on the desktop!


Easy - we can define the computer's objective function. Why would it get distracted by things it doesn't care about?

Meanwhile, humans are checking Instagram and Pornhub while driving around.


Well we haven't invented AGI yet, and so we don't even know if it will be possible to control them with an objective function. So your opinion is entirely speculative and not based on any science.


Who is talking about AGI? No one.


Well, the comment you replied to was talking about AGI.


It's still vaporware for now, but next year GM is supposedly going to start selling a system that will allow true hands-off self driving on highways. It will be interesting to see if they really deliver. The claims seem to be well beyond what Tesla currently sells.

https://www.caranddriver.com/news/a37886786/general-motors-u...


> General Motors is adding another tier to its hands-free driving technology with its new Ultra Cruise system that it claims will work in 95 percent of driving situations.

This seems to be exactly what I mean. They admit that their system can't handle the 5% of edge cases, and market it appropriately. This is not a full self driving car.


> It will have to wait for a child to die, unfortunately.

Even then. The US has a pretty high tolerance for that.


Unless the death is being exploited in order to remove constitutionally protected liberties, the US has pretty close to zero tolerance for the death of minors.


> When a traffic cop or school teacher holds up a hand to stop cars or say "turn left NOW", what training data exists?

This seems like one of the scenarios (multiple but limited in number) explicitly listed in traffic laws which might be scarce in randomly sampled driving data, but which is trivial to manufacture artificially in as large quantities as you need. Go to a local movie producer for their supply of uniform costumes and other varied clothing, and the actors/extras to play the role of a traffic cop, and you can get quite a lot of training data in a single week on a budget that's trivial compared to most other self-driving car experiments.


If a cop is directing traffic, you should obey them. But what if they’re next to a construction worker, a cowboy, an Indian chief, a biker, and a sailor?


You're basically asking the question of what is the minimum sufficient level of general intelligence required to allow self-driving cars to go forward.

I instead have a different criterion. If somebody could show that their self-driving cars can drive 250 million miles without killing a person (pedestrian, driver, or passenger), across a fleet and a range of conditions, then that's good enough for me (currently, people drive 100 million miles before somebody gets killed).

I figure 2.5X more safe than the average human would lead to an enormous savings in lives and one might even make the argument, at that point, that there was a moral imperative to disallow people driving!

(BTW I live next to an elementary school and people drive past exceeding the posted limit all the time. I struggle to move my car safely through all the distractions. The one thing that helps the most is the radar which hits the brakes if it thinks I'm gonna back into a person or car.)


The roads self-driving cars drive on at this point are heavily self-selected as well. I want to see a few thousand self-driving cars interact on the streets of Delhi, or see one go through a snowy mountain pass with barely any signs.

When people make these safety stat comparisons nowadays, it often seems to be ignored how much more ambiguous the conditions are under which humans still drive safely.


How about Nigeria, lol!

I'd love to see FSD handle this... https://www.youtube.com/watch?v=umBS-HEUImM


I do not expect FSD to handle that. Nor would I consider it an explicit design goal.

However, I was just driving in my neighborhood in northern CA and saw conditions very much like that (I live next to a school and I was driving right when school let out).


I don't think either the Delhi Street or Snowy Mountain Pass are really required to deploy self-driving cars to 65% of the drivers in the world.


snowy mountain pass maybe not but your average traffic in the world is closer to the cities of India, China, Indonesia or Brazil than a grid system in the suburbs of Phoenix Arizona. Keep in mind the US and Europe account for 10% of the world's population, and I wouldn't even have a lot of confidence in self-driving in your average traffic situation in Rome

Tbh I'm not even sure if self-driving as it is covers 65% of the American population. Has anyone ever tried to use a self-driving car on a rural American road?


The counterpoint is that places and culture can change. I’m sure there was a huge kerfuffle when cars started sharing roads with horse-drawn carriages in London, and I’m sure some people thought cars would never have a place, but here we are. If self-driving cars hit the mainstream in North America and Europe, I guarantee seemingly unfit cities will adjust because truly autonomous vehicles have sooooo many benefits.


Yes, people ride Teslas in some sort of automated mode on rural roads.

I see your point about the larger driving population, but self driving will roll out in wealthy countries with new car models first, and I think solving the problem there will predate a large solution, for many different reasons.

All my comments about self driving are basically limited to urban and semi-urban wealthy countries since those are the groups subsidizing the research now.


I'm not a fan of Tesla personally but it is worth mentioning that "autopilot" and "self driving" are not the same. Autopilot is, and has always been, cruise control on steroids. Full self driving hasn't reached the consumer market. To expect your Tesla to be that is lying to yourself.


Meanwhile, Tesla seems to be using them nearly interchangeably in its marketing.

https://www.tesla.com/support/full-self-driving-computer

Tesla attempts to bury the lead by saying drivers shouldn't use these features without being "fully attentive" but uses names like "Full Self-Driving" all over their marketing material.

https://www.tesla.com/support/autopilot


Elon has also publicized cases where users filmed themselves not being attentive.

https://nypost.com/2019/05/10/elon-musk-weighs-in-on-porn-fi...


Tesla itself has marketing material that's even worse[1].

[1] https://www.tesla.com/videos/autopilot-self-driving-hardware...


> To expect your Tesla to be that is lying to yourself.

No, it's more simple: Tesla has been lying to you. Musk has been claiming full self driving "next year" for nine years:

https://jalopnik.com/elon-musk-promises-full-self-driving-ne...

Five years ago Tesla claimed that all cars being produced had the "hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver".

That was a lie too:

https://www.tesla.com/blog/all-tesla-cars-being-produced-now...


Doesn't Tesla have beta software the driver can turn on with the name "Full Self Driving"? And isn't it intended to be used, "beta tested", on public roads?


Apparently they came out and said that "Full Self Driving™" isn't supposed to be understood as full self driving.


Ah, the Wyngz of the AI industry


It's also worth pointing out that what feature set you get with "Autopilot", and even how well those features operate, varies a lot depending on what country you are in. In the USA, it's gone quite a bit beyond just cruise control on steroids, and will now follow _any_ road no matter how curvy so long as there is a painted center line. In other words, you can turn it on outside schools, in complex urban neighborhoods etc and it will just keep following the road until you disable it or the painted line stops.

In many European countries, the system is restricted to a much more traditional lane assist and does not follow tight bends like this. I have a Model 3 with the standard fit "Autopilot" (not FSD) in US and have tried same model in the UK.

If you follow Bjorn Nyland on YouTube (popular electric car YouTuber), he also recently discovered how much more advanced "Autopilot" is on the American cars when he drove a US import in Thailand and compared it to his previous Norwegian Tesla, despite the feature being labelled the same.

This all makes having any kind of reasonable discussion on the internet about it really, really hard.


My car's cruise control turns on even without a painted centre line.


To complicate matters further, conventional Traffic Aware Cruise is technically the third driver assist mode Tesla offer in their cars in addition to "AP"/"FSD", this of course also works without the painted line.


I think Elon was the one doing the lying.


Are the signs not delimited by day of week, time and month where you are? E.g. 7:30-18:30 Mon - Fri Sep - Jun

Where I live, police can and will ticket you for speeding during those times, regardless of whether there are students around or whether school is even in session.


In my state the school zone speed limit is only in effect if it's a school day and there are children around. If all the kids are inside it's the normal speed limit. I guess Tesla could just use the school zone limit all the time during school times, but people would be annoyed.


At least in Georgia, there's at least one yellow light with a speed limit sign attached. During active school hours, the yellow light will flash on and off.

On my daily commute, there's two overhead flashing yellow lights. Previously, my Model Y would begin to brake and then speed back up, thinking it's a normal traffic light (prior to FSD). With FSD, it's at least smart enough to know not to brake; but it certainly doesn't read the speed limit from the sign, as it normally would.


Which state has "when children are around" as part of the law? Feels ripe for abuse and kind of unlikely....


Massachusetts: https://mutcd.fhwa.dot.gov/htm/2009r1r2/part7/part7b.htm#sec...

Can be times of day, "when flashing", or "when children are present". Time of day isn't great because of irregular after school activities, events, delayed starts because of snow, etc.



I live near a road that is divided and normally a 50, unless children are present. It'd be nice if they had flashing lights.


New York is one also, even though the signs don't say it.

https://www.dot.ny.gov/about-nysdot/faq/posting-speed-limit-...


This doesn’t say when children are present - it says school days and maybe specific times. No?


meant to reply one level higher in the thread.


California signs all say "When children are present". But I do think that's not a terribly difficult determination for a machine to make.


The idea of present might be a difficult one. I'm not up to speed on any court cases that have determined what present means, but it's likely more nuanced than a simple "can Tesla identify a child in sight". A human would be more likely to play it safe and just obey regardless.


My understanding is "present" means "school is in session", regardless of whether any children are visible.


Illinois https://www.ilga.gov/legislation/ilcs/fulltext.asp?DocName=0...

> On a school day when school children are present and so close thereto that a potential hazard exists because of the close proximity of the motorized traffic

this lawyer https://www.jolietlaw.com/will-county-attorneys/understandin.... claims that means the limit is not in effect when kids are inside


I don't know the actual law, but California has school zone speed limit signs that say "when children are present."


This doesn't have to be state law, it can be municipal law. All it takes is city council putting up a sign that says "15 MPH when children are present".


>Are the signs not delimited by day of week, time and month where you are?

Depends on the locale. A lot of places just say "Speed Limit is X When Children Present", or "Speed Limit is Y when Lights [on the sign] are Flashing".


Yeah I've seen many such signs and it seems like the clear best approach. People should drive carefully around crowds of children and normally otherwise, regardless of some schedule posted on a given school's website.


Where I live, the school zone speed limits are only in effect when school is actually in session, and the yellow lights are flashing to let you know that it is (but we are also talking about the difference between 15MPH and 25MPH). Sounds like a plum income stream for your city's police department.


Where I live, the signs will state either a time of enforcement of school zones or "When Children Present". Generally, elementary schools will have school zones active during all hours that kids are on site. Middle Schools and high schools tend to be "When Children Present".


I believe signaling has a lot to do with the law in my state; since I'm living in a big city, the signaling is always present. And like I said before, the normal speed limit is 25MPH, so it's not like normal traffic is too fast; it just goes down to 15MPH during school-in and school-out times (there aren't kids hanging around outside the schools otherwise, it is an urban environment).


> Are the signs not delimited by day of week, time and month where you are? E.g. 7:30-18:30 Mon - Fri Sep - Jun

In my local area the signs say either "during school hours", "while children are present" or they say nothing at all and you just have to know what hours and days the lower speed limit is enforced.


Yeah, it’s a racket; they can hit you with double fines at any time they feel like it.


The answer is for the car to always drive according to the slowest speed allowed.

So, it's always 25mph in that zone. Changing it based on "school hours" is a bad idea anyways.


In Australia, it's not just little side roads that run by school entrances that have this rule; the school zone thing applies even on fairly major free-flowing roads. Two examples I can think of are

https://www.google.com/maps/@-37.9293496,145.0051951,3a,59y,...

https://www.google.com/maps/@-33.7710688,151.0985503,3a,75y,...

As the usual speed limit is quite a bit higher at 70km/h, driving at 40km/h (25mph) outside these times would make you, at best, a rather annoying obstacle to surrounding traffic.


This isn't really safer.

Driving behind a driver not keeping up with traffic and braking erratically is a traffic hazard (happened to me a few days ago with a human driver).


It's generally a bad idea to fit safety regulations around the safety limitations of the item they're regulating. We set speeds, sometimes based on time of day or presence of children. Humans handle this just fine.

If your car can't, then the car needs to be fixed or its "self-driving" functionality entirely disabled. Changing speed limit laws to compensate for these limitations is entirely the wrong direction.


My point is that FSD needs to be at least as capable as a human to follow speed limit signs.

The problem generalises - it's also unacceptable for FSD not to keep up with traffic on a freeway or randomly hit the brakes to avoid spurious hazards, for the same reasons.


I remember the story on here a while ago about a self-driving car that got rear-ended because it stopped in the merging lane of an empty highway rather than accelerating like any human would...


If I go 5 mph slower than the speed limit I have had people pass me on the right; this seems less safe…


Even worse, there are school signs that say 15mph when children are present. Kids could be behind cars, bushes, people, etc. That’s a very hard task to deal with.


I mean if the kids are in the bushes I'm not sure a human would be able to figure that out either. It's been said before: self driving cars don't have to be perfect, just better than humans. And humans are super flawed.


The US has about 100 million miles driven per traffic fatality.

Humans are flawed but human drivers are way safer than the human detractors would have you think.


Human drivers "can" be way safer; they aren't always. There's likely some balance where overall, self driving is statistically safer than some group of suboptimal drivers. It remains to be seen but there's always hope.


Unless the unsafe parts of self driving only apply to previously unsafe drivers it will still struggle to take off.

Not every human driver has the same risk, but every self driving car will. (Or it will be based on which car you are in rather than how safe you are.) In other words, relatively safe human drivers could actually see their risk levels go up in a self driving car, even if it is statistically safer than all human drivers.


You can't make that assumption when Tesla FSD won't even engage except in the "easy" situations.


Could you clarify what interactions that figure includes? I.e. is it fatalities for people inside motorized vehicles, or does it include something like a car-bicycle accident?


Motor vehicle fatality statistics in the USA include pedestrians and bicyclists struck by a car.


Well, for PR purposes, they might have to be substantially better than humans, or the backlash to incidents might be too great.


> That’s a very hard task to deal with.

Not really though, you could just assume there are kids and do 15mph. Not everything is a hard to solve machine learning challenge.


So many ifs. Look, even once you're past the school zone, kids play around. In rural areas, this is very common. Yeah, you could go at the speed limit, but people are just careful or make a judgment call.


Tesla doesn't claim that the car is responsible for setting the correct speed limit -- they actually claim the opposite. From the Owner's Manual:

> Warning

> Do not rely on Traffic-Aware Cruise Control or Speed Assist to determine an accurate or appropriate cruising speed. It is the driver's responsibility to cruise at a safe speed based on road conditions and applicable speed limits.


A problem with that is the car sometimes decides to change the set speed based on the signs, and it's not always obvious when you're in traffic until the car decides to take off. Alternatively it will slam the brakes on if it misreads a sign and you have your speed set high. They need an option for a dumber cruise control that ignores speed limits.


Of course the fine print covers their asses, but this is not what you’d think watching the marketing demos and YouTube videos around fsd capabilities.


That's in the manual; the fan marketing essentially says it reads speed limit signs so you don't miss any of the action while your passenger plays Cuphead (I think though eventually they removed the ability for the passenger to play while you drive).


We should mail a copy of the owners manual to Elon Musk then because just a few days ago he, for about the fourth time, announced that he would be shocked if "Tesla does not achieve Full Self-Driving that is safer than human drivers this year (and 5 years ahead of everyone)".

Maybe we have different definitions of FSD and being safer than a human but to me that includes obeying the speed limits and stop signs

https://electrek.co/2022/01/31/elon-musk-tesla-full-self-dri...


The reality is that life is full of edge cases. For full autonomous driving we probably need full AGI.


I think what we'll end up seeing are the developments of sensors and protocols that are invisible to people, but can be sensed by cars nearby. Traffic lights and road signs will have sensors that emit electronic messages identifying what they are and what rules they are communicating that self-driving vehicles should obey. Other cars will start to have them as well. We may even see small barriers constructed between roads and sidewalks that exist only to discourage pedestrians from entering the road to make automated driving even safer. Your vehicle will be rented from a third party. The owner will legally have to have it registered with the city so that the interior cameras can conduct facial recognition to identify the drivers and passengers. Law enforcement can remotely disable a car and could also send signals to get all other cars to pull over to ease the arrests of carjackers and other criminals. Possibly coming to a small town near you.
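A purely speculative sketch of what one of those roadside broadcasts might look like (every field name here is invented for illustration, not taken from any real V2X spec):

    from dataclasses import dataclass

    # Hypothetical message a "smart" school zone sign might broadcast to nearby
    # vehicles; the fields are invented for illustration.
    @dataclass
    class ZoneBeacon:
        zone_type: str        # e.g. "school_zone" or "construction_zone"
        speed_limit_mph: int  # reduced limit while the zone is active
        active: bool          # whether the reduced limit applies right now
        expires_at: int       # unix timestamp after which the car must re-check

    def effective_limit(beacon: ZoneBeacon, default_limit_mph: int) -> int:
        # Trust the beacon's reduced limit only while it claims to be active.
        return beacon.speed_limit_mph if beacon.active else default_limit_mph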


Humans have general intelligence and are pretty bad drivers. Why would a computer intelligence be any better? If anything, a computer intelligence would be more likely to get bored or distracted if its "brain" is running faster than ours.


Humans are annoying drivers, but they aren’t particularly bad. I’m surprised it is 1 fatality per 100,000,000 miles, that’s not bad…


I agree that it's a miracle (or perhaps, a testament to modern engineering) that fatalities are so low. But I don't think fatalities is a great measurement.

I've been rear ended twice, and I've seen 4 rear endings happen. Nobody was hurt in any of these accidents, but it was definitely a pain in the ass for everyone involved.


It seems to be one of these AI-complete problems. https://en.wikipedia.org/wiki/AI-complete


I do not understand how cars can really be self driving, IMHO. For example, say I'm going down a residential street and there are a few children playing with a football on the side of the road ahead of me. Well, in that situation I'm thinking "one of them could kick the ball into the road and run out", so I'll preemptively slow down and be on "alert" just in case that happens.

If the worst happens and a child runs out, even though I may have a slower reaction speed than the fancy car, I was already going slower and in the back of my mind I was prepared.

I don't see how a car can do that kind of reasoning.

Another example I can think of, at traffic lights with a pedestrian crossing, my driving instructor taught me to look out for the lit up "Wait" text, if it's on, I can slow down and preempt that it might change before I pass.


This is the exact example I use too. No self-driving till AI has a proper world model, so maybe not ever. Another case: I drive and see an empty paper bag blown onto the road by the wind. I know it is empty because of the way it flew, but "autopilot" may mistake it for a rock and suddenly stop or swerve. Though mistaking a rock for a paper bag and driving on would be much worse.


OR maybe the Tesla is so good at driving it doesn’t need to slow down. Maybe it should actually speed up to give people less time to get in front of it.


*taps forehead*


The simplest thing would be to just get rid of the temporary speed limit and set it to 25mph at all times. 25mph is fast enough.


Another example of a walkable "neighborhood" (with sports, after-school and community events, and community use of facilities 6-7 days a week) whose safety we ignore in favor of cars getting wherever else they are going faster.


Yeah I’m a bit surprised to see the problem phrased as “car needs to understand complicated signs to deal with 55mph road having variable speed limit to 25mph during school times” rather than “city needs to figure out how to not have a road with highway speeds going right next to a school”.

I don’t want to make a more general argument about changing things to be easier for self driving cars. Merely that higher speeds should be for roads between places rather than the kind of streets used to access ordinary buildings like schools.


It's standard car-centric thinking. If you start looking into sustainable transport or reducing car usage etc you'll find most of the resistance comes from people unable to imagine a life without a car.


>"25mph is fast enough."

No, it absolutely isn't.


I think you’re missing the point. But maybe you aren’t missing the point and just don’t want to explain your speed-limit philosophy?


The GP said "The simplest thing would be to just get rid of the temporary speed limit and set it to 25mph at all times. 25mph is fast enough."

The reason why the speed limit drops to 25 mph during certain times around schools is because that is when children and adolescents are traveling to or from the campus grounds. It makes sense that you would want the speed limit reduced during a period of increased chaos. But it doesn't make the same sense to keep that reduction when the school is out of session, or when classes are taking place and the attendees are almost entirely contained within the campus.

And, in real terms, no one likes dropping down to 25 mph on a road that is normally 45 mph at all times. Like it or not, most drivers are not as anti-car as the folks on HN and they absolutely will complain to the city about the low speed limit or just ignore it. At which point, they will go back to conditional 25 mph limits.


To be clear, I think the problem is having a fast road right by a school and making a world where the road next to the school is not fast really means putting the fast road somewhere that isn’t full of turnings so drivers can go quickly and without interruptions all day.

The road outside my school when I was young was a 30 but too narrow and bendy for people to drive that fast. When I was older at a larger school in a bigger town it was also a 30 I think with no speed limit changes through the day. The school was on the edge of the town and as you left the urban area it became a little wider and a 40. Outside my place of work in a big city, I’m not even sure what the speed limit is but it would certainly be unwise to attempt more than 20, but that doesn’t matter because the road is for accessing local buildings not crossing the city.

I (or my parents) didn’t choose this arrangement because of some car hatred, that’s just the way it happened to be.

One question is what happens if you reverse the status quo bias. Suppose there’s a 25mph street outside a school and 3 proposals:

1. Do nothing

2. Build a fast 55mph bypass road for the traffic that wants to go from roughly one side of the school to roughly the other

3. Widen the street next to the school and increase its speed to 55mph, but only outside of school hours. Keep the turnings for the school and other buildings on the street.

Perhaps there’s some yawning cultural gap between us but option 3 seems pretty terrible to me as it looks like it does little to help the traffic or the school. But maybe it would be the only feasible thing to do and building a new road wouldn’t be an option. When I look at my schools I described above, the first went for option 2 (well the bypass was more for the slow narrow bendy streets than the school) long before I attended the school, and the second went for option 1.


But we're talking about self-driving cars. There won't be any drivers. When I'm on the train I don't complain about the speed limit because I know fuck all about what speed is safe on that line. Same thing.

Also there shouldn't be a fast road right next to a school. It's insane that such a thing exists.


It is when you aren’t paying attention:)


> speed limit 25 mph during school hours

I've seen many signs that say 25 mph when children are present. That becomes an interesting judgment call: if children are walking down the street or on the sidewalk, it's clearly triggered. But what if children are playing in a fenced in part of the school? How would a computer know that those kids are not able to easily climb over the fence, and are therefore not 'present' quite in the same way as if they were on the sidewalk?


FYI, children playing in a fenced part of a school qualifies as children present.


Citation?

I've talked about this with my lawyer friends (I am a lawyer also), and we've never been able to figure out exactly what this phrase means. Surely it isn't meant to be interpreted literally, as this would imply that the lower speed only applies when multiple children (not a single child) are present.


It's not meant to be interpreted like a computer program, it's meant to be interpreted like law. I'm pretty confident that a judge could reasonably conclude there's an implied "any" quantifier for children, with 1 child satisfying the criteria. Regardless, here's some caselaw that said it only means when children are arriving and leaving:

https://scholar.google.com/scholar_case?case=173803621968999...

Indeed, here in Michigan most of these signs have hours that they apply, sometimes it's like 7:00 am-4:00pm, other times they split it into a morning and afternoon session...


In my pre-law days, I tried to argue to an Illinois county judge that it meant kids actually outside, not just kids being inside the school that day. Got the ticket anyway.


Wow, the judge considered kids to be present when there were no kids around? That seems pretty outrageous, if the sign said "present".

That doesn't seem to comport with how people actually drive when students are in school but not visible. It also seems questionable to require people to know whether kids are in a school at any given time. You can be fairly sure they're there Mon-Fri during most non-summer weeks of the year, but there are also kids at the school after school for many hours, practicing sports/drama/science bowl.

I'm pretty sure I'd get honked at if I drove 25 MPH past a school in a 35 MPH zone at 5pm on a weekday, if there were no kids around.


I think the judge just got it wrong, and may have just been pissed that a 19-year-old was fighting a ticket in his court.

Seems like an absurd reading of that language to me.


> How does a self driving car make that determination? Query the school district website for the school, identifying their bell schedule and tacking on a buffer ahead and behind? Assume a school schedule that’s M-F? What if it’s a religious school that operates Sun-Thursday? Now the car has to determine which religious sects obey which calendar? Is it different in each country?

> Just another example of a massive hurdle self driving cars have……

This isn't just a problem for self-driving cars, school zone laws are ambiguous, even for human drivers... and the rules can differ by locality. This is a problem that states and localities need to figure out.

https://mynorthwest.com/1543253/washington-driving-school-zo...
https://www.reviewjournal.com/local/education/more-school-zo...
https://patch.com/missouri/fenton-highridge/school-zone-spee...


The rule was created to lower the risk of kids getting hit by a car.

So let's focus on achieving the end goal, and update whatever laws are necessary along the way, rather than taking the existing laws as gospel.

Here it could be signaling to the car where the school or construction zone is so that, in addition to lowering its speed, it also biases its perception of kids or construction workers more towards false positives than false negatives.
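As a rough sketch of what that bias could mean in practice (the threshold numbers are made up for illustration):

    # Inside a school or construction zone, accept far weaker evidence that a
    # detection is a pedestrian, trading false positives for fewer false negatives.
    DEFAULT_THRESHOLD = 0.7
    ZONE_THRESHOLD = 0.3   # illustrative values only

    def treat_as_pedestrian(detection_confidence: float, in_protected_zone: bool) -> bool:
        threshold = ZONE_THRESHOLD if in_protected_zone else DEFAULT_THRESHOLD
        return detection_confidence >= threshold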


>How can Tesla claim self driving

Autopilot and self driving are two distinct things.


Are you saying that Tesla shouldn't claim to have a "Full Self Driving" feature if it's still not autonomous? Because they do.

"Will the FSD computer make my car fully autonomous?

Not yet. All Tesla cars require active driver supervision and are not autonomous. "

https://www.tesla.com/support/full-self-driving-computer


Not to the average, non-technical person.


I consider myself a pretty technical person and I don't know the difference. Are they not synonyms? Auto=self, pilot=driving


Everyone knows airplane autopilot still needs a human pilot. Everyone.


And yet a couple years ago, they claimed Autopilot was full self-driving.


Tesla have two very distinct cruise/steering product grades, one is called Autopilot and another called Full Self-Driving. It seems unlikely to me that Tesla would claim that their first product was their second product.


This has nothing to do with the article really, but stupid signs will always be a problem as long as they exist. In Austria, there are signs on the highway that allow electric cars to obey a higher speed limit (130 km/h instead of 100 km/h; https://autorevue.at/ratgeber/ig-l-immissionsschutzgesetz-lu... ) provided they have (optional, but now standard) green number plates. My car doesn't know that it doesn't have the green plates, so it cannot know what limit to use. Also, it can't look up the paragraph quoted on the sign to read the current version of the law, presumably.


Does that make it a stupid sign, though?


Yes, because it doesn't contain the necessary information to interpret it and on top of that it's hard to read at highway speeds and thus dangerous.


Are you not running FSD beta? That's the subject at hand here. And FWIW: my car sees and honors lit school zone signs in my neighborhood.

> How does a self driving car make that determination?

How does a human driver make that determination? They don't. Those signs don't work, they never have and never will, and they certainly won't in the age of autonomy. Frankly if you *did* want to come up with a solution to this problem, I'd bet on it working better as a data management problem (i.e. do exactly what you said and then geocode the results for the car to interpret) than as a signage problem (i.e. trust people to honor rules that they'll never understand).
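Something like this rough sketch is what I mean by treating it as data management -- the zones become geometry in the map data the car already carries, rather than signs to interpret (coordinates and names are invented for illustration):

    # Given the car's position, check whether it falls inside any pre-mapped
    # school zone; real map data would use polygons, this uses bounding boxes.
    SCHOOL_ZONES = {
        "lincoln_elementary": {"lat": (37.4210, 37.4235), "lon": (-122.0860, -122.0830)},
    }

    def zones_containing(lat, lon):
        for name, box in SCHOOL_ZONES.items():
            if box["lat"][0] <= lat <= box["lat"][1] and box["lon"][0] <= lon <= box["lon"][1]:
                yield name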


They don't. This is why stocks like TSLA should NOT be subject to quarterly earnings reports; they should be subject to quarterly safety reports instead, and earnings should not be public for a long period of time.


Well, they are far safer than the other guys, so the same should be applied to all the manufacturers then.


Far safer than who?

Tesla leads the world in terms of self-driving fatalities. It's not even close; the total deaths involving self-driving vehicles from every other automaker in the world combined is less than 1/10th the number of Tesla self-driving fatalities.

Note that I include autopilot in the meaning of "self-driving" here because Tesla does in its marketing.


In terms of total deaths per mile driven, all factors considered, Tesla is very close to the top in terms of safety.

Yeah, there are idiots misusing autopilot, but there are a couple of orders of magnitude more people who die due to drowsy driving in cars without driver assistance features.


When has Tesla ever described Autopilot as being Full Self Driving? Regardless what you think of these names, they are nonetheless distinct product names under the Tesla umbrella. Conflating them seems akin to Ford mistakenly describing an Ecosport as being an F150.


So 12 for tesla and how many for other companies? Where is your data?

https://www.tesladeaths.com/


So what? If the total deaths per mile for Tesla is lower then it’s better for everyone.


Well, yes.

We have an economy run by a corpus of idiot shareholders who neither use the product nor are affected by the product, and are free to crash its value at will, and will crash it when earnings doesn't "beat" expectations, so the company is forced to prioritize the entirely wrong things.

If a company wants to "accelerate the transition to clean energy", quarterly earnings is NOT the thing to be prioritizing. Earnings are important for business sustainability, but on much longer cycles than quarterly.


Can they handle construction zones where lanes split all over the place and speeds change? I'm guessing no. Based on the constant scuff marks along the guardrails of the 5 freeway, and pretty much every other freeway in California, I'm assuming that's a hard challenge even for human drivers.


All they need to do is recognize a construction zone a minute ahead of time and get the human to take over. This would allow level 4 through construction zones. Level 4 is self driving "in the easy parts", giving humans enough time to take over in the "hard parts" (level 5 is everywhere, and as you note a much harder problem). Note though that you can't just stop driving; you need to allow time for the human to figure out what is going on, and how to properly hand over to humans is itself a hard problem.
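The lead-time part is at least easy to reason about; a rough sketch (the 60-second figure is just the "minute" above, not any real spec):

    # Distance at which the system must flag the upcoming construction zone so
    # the human gets roughly a minute to take over at the current speed.
    def warning_distance_m(speed_m_s: float, lead_time_s: float = 60.0) -> float:
        return speed_m_s * lead_time_s

    # e.g. at 30 m/s (~67 mph) the zone must be flagged about 1.8 km ahead
    print(warning_distance_m(30.0))  # 1800.0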


Right, but they don’t do that, do they? Their failure mode is beeping at you to take over at the last possible second.


There are a number of other ways where they don't meet level 4. Level 4 is fully hands off in specific situations, and Tesla is never hands off.


I could see construction zones being required to set up self-driving parking buffers for the cars to 'fail' into safely.


For what it’s worth, the one thing my Tesla has been consistently good at is picking the correct lines out of a jumble of nonsense on the road.

I’ve used AP heavily in highway construction zones, at night with bad visibility, in inclement weather, and in combinations of the above. AP does better than I can manually at picking the correct set of road marks to follow, even in cases where the construction has left partial, incorrect marks underneath or conflicting with the right ones.

AP has a ton of other issues of varying levels of severity, but if you’re asking “can I trust it in a construction zone”, I’d say yes based on my usage.


It’s actually remarkably good at this now. Source: Me driving with FSD


No, they fail spectacularly at ad-hoc road work lane setups or setups where the "official" lanes are temporarily blacked out. I think that was the cause of the Tesla smashing into a freeway median a couple years ago.


> I think that was the cause of the Tesla smashing into a freeway median a couple years ago.

If it happened a couple of years ago you’re not talking about FSD.


To be fair, I wouldn't know that some religious schools run Sun-Thurs unless it was spelled out on a sign too. Guess a thorough understanding of all types of religious schools and their operating days of the week should be compulsory for getting a driver's license.


>Just another example of a massive hurdle self driving cars have……

These are my thoughts as well: the promises of AVs may have grossly underestimated the problem space. Solving the problem of perception does not equate to solving the problems of driving.


That, among other things, is pretty much why fully autonomous cars == artificial general intelligence; and that's also why it won't happen this year, nor the next one, nor in the next 10. It will happen someday, for sure, just not soon.


I don't think that's correct.

Having the ability to reason about what you see can be helpful but it's not necessary.

Whatever the most developed version of the Tesla FSD software is, it already recognizes that people (children) can hide behind objects.

If you look at the way FSD development is done, you'd notice that they changed from hand-coding behavior to designing AI around particular tasks. This is something they started doing quite recently.

Inferring speed limits from sign data is one problem, but you could extend this to classifying driving conditions from the image data of all cameras and estimating a reasonable speed limit from that, forgoing sign detection entirely. Sure, the car can get it wrong, but if you err on the side of caution (which you do), the car will pick the slowest option based on conditions, and you won't care because you aren't driving anyway.
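Roughly, the "err on the side of caution" part could look like this; the names are invented for illustration and are not Tesla's actual pipeline:

    from typing import Optional

    def choose_speed_limit(sign_mph: Optional[int],
                           map_mph: Optional[int],
                           conditions_mph: Optional[int],
                           fallback_mph: int = 25) -> int:
        """Combine independent estimates of the limit and keep the most
        conservative one; any source may be missing (None)."""
        candidates = [v for v in (sign_mph, map_mph, conditions_mph) if v is not None]
        return min(candidates) if candidates else fallback_mph

    # Sign reader and map both say 45, but the vision-based condition classifier
    # thinks 25 is reasonable (e.g. children near the road): the car obeys 25.
    print(choose_speed_limit(45, 45, 25))  # 25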

As horrible as this may sound, ultimately the system will prove itself when it is involved in fewer accidents, and at that point it is a benefit to all. If this doesn't happen then Tesla is wrong, but it sure looks to me like they are moving along this path nicely.

Driving a car is a simple problem, at least compared to something like general intelligence. You only need to know whether it is safe to drive in a particular direction, and sure, this gets complex as you go into the details, but it isn't general-intelligence hard. We don't even know what intelligence is, but we do have something that looks a lot like cars that can drive themselves.

I'm a believer.


One way to address this problem is to designate which sections of road are school zoned and which are not. Then, include information about the school zone schedule. I can't imagine this information hasn't been digitized in some way yet.
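If it has been digitized, I'd guess it looks something like a GeoJSON feature; the property names below are made up, not an existing standard:

    import json

    # A guess at what a digitized school-zone record could look like.
    school_zone = {
        "type": "Feature",
        "geometry": {
            "type": "LineString",  # the zoned stretch of road
            "coordinates": [[-122.4194, 37.7749], [-122.4180, 37.7755]],
        },
        "properties": {
            "zone_type": "school",
            "reduced_limit_mph": 25,
            "schedule": {
                "days": ["Mon", "Tue", "Wed", "Thu", "Fri"],
                "windows": [["07:30", "09:00"], ["14:30", "16:00"]],
            },
        },
    }
    print(json.dumps(school_zone, indent=2))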


Even if it is.. how does it get updated? Or verified? Or protected? Who's at fault if the information is inaccurate?


It's an easily solved problem: the state legislature writes a law mandating a statewide database, and schools are required to enter their information.

Like some other commenters pointed out, ad-hoc situations like police directing traffic and one-direction-at-a-time utility work are a much bigger concern.


I know some municipalities publish this information on their city websites. Granted, I do not know much about municipalities and GIS, but I imagine it is possible this is in some format that can be made available to map data services.


The mapping data that Tesla uses does know about school zones, as well as other time or season based speed zones, and even weather-dependent speed zones. They all get rendered on screen, with the currently-active zone shown.


As a former Tesla owner... I love the car, but I honestly don't believe full self driving is worth the effort, nor necessary.

Otherwise Tesla has been great for my portfolio and the couple years of zipping around were great.


Impossible... None of the people mine has hit have ever filed a complaint?

[Sorry in advance]


Couldn’t there just be some Google Maps / Waze API that jurisdictions can enter speed limit information into for different days and times of the week, and just have the car query that?
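Something like the sketch below is what I'm imagining; the endpoint and fields are entirely made up, not a real Google or Waze API:

    import requests

    # Hypothetical endpoint a jurisdiction could publish; not a real service.
    URL = "https://speed-limits.example.gov/v1/zones"

    def active_limit(lat: float, lon: float, iso_timestamp: str) -> int:
        """Ask the (made-up) service which limit applies at this location and time."""
        resp = requests.get(URL,
                            params={"lat": lat, "lon": lon, "at": iso_timestamp},
                            timeout=2)
        resp.raise_for_status()
        return resp.json()["limit_mph"]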


Judging by just how incorrect the information in Google Maps is, I'd say that's a terrible idea. It will be fed incorrect data and never updated.


As far as I know, these signs always have either flashing lights when active or a printed schedule. I don't think it would be enforceable against human drivers otherwise.


Very very far from "always". Many say "when kids are present" or "on school days". Some rely on external signs to indicate whether the zone is active or not, and some localities within the US have completely unique signs.

And this is just the US. Imagine how much variety there's gonna be outside of it.


Okay, so after checking the MUTCD, the "when children are present" wording is allowed, along with signs that list specific times or apply when lights are flashing.

One state defines it as any of the below:

> (1) School children are occupying or walking within the marked crosswalk.

> (2) School children are waiting at the curb or on the shoulder of the roadway and are about to cross the roadway by way of the marked crosswalk.

> (3) School children are present or walking along the roadway, either on the adjacent sidewalk or, in the absence of sidewalks, on the shoulder within the posted school speed limit zone extending 300 feet, or other distance established by regulation, in either direction from the marked crosswalk.

So this would be a bit difficult to implement. It is probably best handled by defaulting to slowing down and letting the driver override if they determine no children are present.
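A simplified sketch of that default, with the inputs invented for illustration:

    def school_zone_limit(children_detected: bool,
                          driver_says_clear: bool,
                          reduced_mph: int = 25,
                          normal_mph: int = 45) -> int:
        """Default to the reduced limit; use the normal limit only when no
        children are detected AND the driver has explicitly confirmed."""
        if children_detected or not driver_says_clear:
            return reduced_mph
        return normal_mph

    print(school_zone_limit(children_detected=False, driver_says_clear=False))  # 25
    print(school_zone_limit(children_detected=False, driver_says_clear=True))   # 45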


"And this is just the US. Imagine how much variety there's gonna be outside of it."

Less than you think. America is an outlier among developed nations for how non-standardised their road signs are.


In addition to knowing the school hours, a self-driving car has to read and understand the sign - in every language.


That's not the biggest problem: All drivers have to obey hand signals from policemen and other people who are authorized to direct traffic. Which means your car has to know what police officers and state troopers and highway patrol and mounties look like where you happen to be located right now, and not be confused by some dude dressed up as one. Oh, and it has to understand the hand signals as well.

Good luck little car.


I don't think it's reasonable to expect either car or human to determine the distinction between "person dressed as cop" and "sworn police officer acting in accordance with his duties".

Probably shouldn't anyway. It's not unusual for civilians at the site of some unexpected hazard to warn traffic.


Alright then, follow me little car into my bodyshop so I can shut you down and strip you for parts.


"How can Tesla claim self driving if the car can’t read a sign that says - speed limit 25 mph during school hours, and properly adjust?"

Self driving will always be dangerous unless overall traffic infrastructure is updated.

Can you imagine a train where 100% of the onus of auto-braking falls on the train itself, with zero input from sensors and towers outside the train?


> Self driving will always be dangerous unless overall traffic infrastructure is updated.

I don't see how people can propose this kind of thing with a straight face, when we live in a world where we can't even afford to replace the paint on the road when it gets worn away.

Yeah, sure, governments everywhere will be lining up to pay billions of dollars for putting up and maintaining new infrastructure to provide us with some low ROI shiny toys. And I have a bridge on a blockchain to sell you.


Also, this kind of investment could be far better directed at making cities (especially in the US) far less dependent on cars in general.


We would need to re-think resource allocation. Fewer deaths mean lower healthcare expenditure and fewer disability claims. Improved road infrastructure doesn't need to be that expensive.


> Fewer deaths means smaller healthcare expenditure

Well, actually more deaths would mean smaller healthcare expenditure. The dead can't get sick anymore.


Yes? That is pretty easy to imagine actually. It would be much less efficient than the current system with a central train controller, but it is definitely imaginable.

In the case of Tesla massively failing to drive safely in a school area: If you cannot operate safely, don't allow autopilot to engage at all.


Yes! I would rather spend resources on standardizing "smart" traffic control infrastructure, where vehicles and the road/street constantly communicate with each other, even if it were just for augmenting drivers' awareness: for example, in-car warnings about an abrupt stop ahead, a train approaching a level crossing, the positions of nearby vehicles, traffic light cycles, actual speed limits, etc. Training "AI" to make sense of (sometimes barely) human-readable signs and clues is a waste of time in my opinion. Maybe just for helping with low-speed obstacle avoidance...
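As a toy example, a roadside unit might broadcast something like the message below; the format is invented for illustration (real V2X standards such as SAE J2735 look different):

    import json

    # Invented broadcast from a school-zone roadside unit.
    roadside_broadcast = json.dumps({
        "msg_type": "speed_zone",
        "zone_id": "school-17",
        "limit_mph": 25,
        "active": True,  # the roadside unit, not the car, decides this
        "valid_until": "2022-01-10T09:00:00-05:00",
    })

    def limit_from_broadcast(raw: str, current_limit: int) -> int:
        """Honor the broadcast limit while the zone is active, else keep the current one."""
        msg = json.loads(raw)
        if msg.get("msg_type") == "speed_zone" and msg.get("active"):
            return min(current_limit, msg["limit_mph"])
        return current_limit

    print(limit_from_broadcast(roadside_broadcast, 45))  # 25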


Yes. Such a train wouldn't be allowed to drive.

Logical extrapolation of that point left as an exercise for the reader.


I think the central thesis of OP is that the current infrastructure is built for humans, and we seem to do OK with it. So, if anything, these kinds of issues are an indictment of the self-driving tech that was boosted to insane hype around 2017-2018. Now we're getting to a phase called the "Trough of Disillusionment" in the Gartner terminology. If we require the rest of the infrastructure to be rebuilt for self-driving tech, then that is an irrefutable admission of failure.


Reading speed limit signs is the easy part compared to figuring out how to handle all the kids walking to school on the sidewalk next to the road (any kid might get pushed into traffic, or decide to start crossing right there...). Even figuring out school hours is easy. Of course it isn't just kids; I've had to handle a bear on the road in front of me before.


> Can you imagine a train where 100% of the onus of auto-braking falls on the train itself, with zero input from sensors and towers outside the train?

I'm not hugely familiar with trains, but as I understand it, trains in the general case have a much worse braking-distance-to-visibility ratio than cars do.

Roads are generally designed to be safely navigable in good conditions when their occupants are obeying the speed limit, without external sensors. Rail lines are designed to be safely navigable only with the aid of external sensors. That's why trains can take blind corners at speed.
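Back-of-the-envelope, using d = v^2 / (2a) with illustrative deceleration values (not figures for any specific vehicle):

    def braking_distance_m(speed_kmh: float, decel_ms2: float) -> float:
        """Stopping distance d = v^2 / (2a), ignoring reaction time."""
        v = speed_kmh / 3.6  # km/h -> m/s
        return v * v / (2 * decel_ms2)

    print(round(braking_distance_m(100, 7.0)))  # car at 100 km/h: ~55 m
    print(round(braking_distance_m(130, 0.7)))  # train at 130 km/h: ~930 m

A car can usually stop within the distance its driver can see; a train generally cannot, which is why the track ahead has to be cleared and signalled externally.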


> Self driving will always be dangerous unless overall traffic infrastructure is updated.

I prefer to have the computers work for people, rather than the other way around.


Many years ago I was surprised to find out how traffic-detecting intersection lights worked.

I assumed computer vision had been solved and they had reliable enough detection of cars, so you could just plop a few cameras in the intersection to detect waiting cars and start the transition when safe.

Nope! Instead, large numbers of intersections are dug up and inductive wire loops are installed to detect cars. These are very sensitive (though they can miss cyclists) and super-simple; a microcontroller can reliably detect a car as a pulse on a line.
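From the controller's point of view the logic really is about this simple; the sketch below just debounces a single digital line (the sample values are made up):

    def car_waiting(samples: list[int], threshold: int = 5) -> bool:
        """The loop detector drives one digital line high while a vehicle sits
        over the loop; require a run of consecutive high samples to ignore noise."""
        run = 0
        for s in samples:
            run = run + 1 if s else 0
            if run >= threshold:
                return True
        return False

    print(car_waiting([0, 0, 1, 1, 1, 1, 1, 1, 0]))  # True: a car is over the loop
    print(car_waiting([0, 1, 0, 1, 0, 0]))           # False: just noise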


Not specific to Tesla, but I guess the answer is in your question. Most likely all the current self-driving efforts across the board will fail until we have smart roads. A road that can help a vehicle self-drive reduces the complexity of self driving. And of course that might mean these roads will have to be free of human drivers and pedestrians.


Tesla's self driving is actually a gimmick, and far from being a serious piece of technology.


Is it really an issue to just always have the car obey the lower speed limit?


Yes.


The worst part is that people are paying $12k for "FSD".


If I see a sign like that near a "church school" or whatever, I likely won't know it's 25 mph on a Sunday.

If there are crossing guards or whatever, I'm sure I will stop; I'm not sure a Tesla will.


Can't we tag children with a 5mph transponder tag?


Seems like a good way for child predators to find prey.


Not if the kids run 6 mph.


slow clap


It’s not self driving yet, that’s how.


Er, so?

My 20 year old Civic will allow me to drive 70mph in a 20. Why shouldn't it? It's my car.



