Tesla's autopilot, on the road

My Nissan Leaf Forum

LeftieBiker said:
I'm not sure that replacing inattentive drivers with reckless cars is a good solution. ;)

Well, in the example I linked, the AEB on the car saved the little girl's life. She was able to get up and cry about it, versus being crushed by the inattentive driver (not sure why the driver was going that fast in a school zone). Hitting stopped emergency vehicles is still its weak point, but it's already saving lives. So I'm ALMOST ready to say that I'd prefer the reckless cars over the inattentive drivers ... ALMOST!
 
Virtually all new cars in the US come with AEB, and I don't know of anyone who thinks that's a bad thing, as it acts as a backup to the driver rather than attempting (or claiming to be able) to replace the driver either part- or full-time.

As it is, M-B is the only manufacturer offering a certified Level 3 system anywhere in the US as of now, and only under very limited conditions (slower-speed stop-and-go on appropriate sections of freeway):
On suitable freeway sections and where there is high traffic density, DRIVE PILOT can offer to take over the dynamic driving task, up to the speed of 40 mph . . . Once conditions are suitable, the system indicates availability on the control buttons. When the driver activates DRIVE PILOT, the system controls the speed and distance, and effortlessly guides the vehicle within its lane. The route profile, events occurring on the route and traffic signs are correspondingly taken into consideration. The system also reacts to unexpected traffic situations and handles them independently, e.g., by evasive maneuvers within the lane or by braking maneuvers.

LiDAR sensor and redundant systems
The top priority for Mercedes-Benz when introducing such a system is safety, which includes high demands on operational reliability. DRIVE PILOT builds on the surround sensors of the Driving Assistance Package and comprises additional sensors that Mercedes-Benz considers indispensable for safe conditionally automated driving. These include LiDAR, as well as a camera in the rear window and microphones for detecting emergency vehicles, as well as a road wetness sensor in the wheel well. A vehicle equipped with the optional DRIVE PILOT system also has redundant steering and braking actuators and a redundant on-board electrical system, so that it remains maneuverable even if one of these systems fails and a safe handover to the driver can be ensured.

If the driver fails to take back control even after increasingly urgent prompting and expiration of the takeover time (e.g., due to a severe health problem), the system brakes the vehicle to a standstill in a controlled manner while engaging the hazard warning lights. Once the vehicle has come to a standstill, the Mercedes-Benz emergency call system is activated and the doors are unlocked to make the interior accessible for first responders.
https://group.mercedes-benz.com/inn...3: the automated,to intervene by the vehicle.

It seems unlikely this system would issue 150 warnings without deciding the driver was not paying attention.
 
We had a weird event today when ProPilot actively tried to steer us out of our lane. I've never seen it act that way before. We turned it off after fighting with it for 30 seconds, but odd nonetheless.
 
ABG:
NHTSA head says Tesla Autopilot investigation outcome could be announced soon

Acting administrator hints at the outcome in only the broadest of terms
https://www.autoblog.com/2023/08/24...nvestigation-outcome-could-be-announced-soon/

The National Highway Traffic Safety Administration (NHTSA) will resolve its two-year investigation into Tesla Autopilot and could make a public announcement soon, the agency's acting head told Reuters.

"We'll get to a resolution (of the Tesla probe)," Acting NHTSA Administrator Ann Carlson told Reuters in an interview at the agency's headquarters.

Speaking broadly of advanced driver assistance systems, she said, "It's really important that drivers pay attention. It's also really important that driver monitoring systems take into account that humans over-trust technology."

She declined to discuss how the Tesla investigation might be resolved, but added "hopefully you'll hear something relatively soon." Tesla did not immediately respond to a request for comment.

The agency is investigating the performance of Autopilot after identifying more than a dozen crashes in which Tesla vehicles struck stopped emergency vehicles. It is also investigating whether Tesla vehicles adequately ensure drivers are paying attention when using the driver assistance system.

In June 2022, NHTSA upgraded the probe into 830,000 Tesla vehicles it first opened in August 2021 to an engineering analysis — a required step before it could potentially demand a recall. Last month, NHTSA sought updated responses and current data from Tesla in the probe. . . .

NHTSA has said previously that evidence raised questions about the effectiveness of Tesla's alert strategy, which seeks to compel driver attention.

The agency said in 2022 nine of 11 vehicles in prior crashes exhibited no driver engagement, or visual or chime alerts, until the last minute preceding a collision, while four showed no visual or chime alerts at all during the final Autopilot use cycle.

In addition to over-trusting technology, humans who aren't mentally and/or physically engaged in a task for an extended period will let their attention wander and/or zone out. That's why, to take one example, interstate designers started adding otherwise unneeded curves to highways they'd previously made straight (there's a 70-mile section of I-80 in western Iowa without a single curve): with essentially no need to react to any change in the situation for long stretches, drivers slipped into a state that the driver-ed films I had to watch in my teens described, in a doom-laden voice, as "HIGHWAY HYPNOSIS".

Another example I remember reading came from some of the earliest human factors research, much of which was first carried out in the Second World War. In his history of Operational Research in the RAF's Coastal Command ("O.R. in World War 2 - Operational Research against the U-Boat*", written in 1946 but not publicly released until 1973; Operational Research is the British term for what's known as Operations Research in the U.S.), C.H. Waddington mentions that, from observing radar operators, they established a maximum time an observer could monitor the radar screen before someone else would replace him. I forget the exact length of time, but it was somewhere in the 15-30 minute range. These anti-submarine flights lasted hours, and most of the time nothing was detected; most crews never saw a U-boat in their entire flying careers. All the radar observer had to look at was the radar 'beam' trace endlessly circling the CRT for hours on end, so it's no surprise that people would get distracted or mentally disengage without even realizing they were doing so, and miss actual targets. Just imagine one person staring at the test pattern on a TV (I may be dating myself) for hours a day, over many days across a year or more, in the hope that it will change just once in some small way, and consider how likely it is that you'd notice when it did.


*On the off chance that anyone's interested: https://www.amazon.com/Operational-Research-World-Histories-science/dp/023615463X

Also see: https://en.wikipedia.org/wiki/Operations_research
 
GRA said:
ABG:
NHTSA head says Tesla Autopilot investigation outcome could be announced soon
https://www.autoblog.com/2023/08/24...nvestigation-outcome-could-be-announced-soon/ . . .

Why not wait until the NHTSA report?! This is just one writer's speculation on what the report will be about.
 
Oils4AsphaultOnly said:
Why not wait until the NHTSA report?! This is just one writer's speculation on what the report will be about.

Sure, I'll wait for the specifics in the report, but that doesn't change the behavior of human beings, which has been studied extensively. We know how most humans behave in such situations, which is why the NTSB has been so insistent on the need to require high-quality driver attention-monitoring and warnings, followed rapidly by the car coming safely to a stop if they were ignored. Tesla didn't want to bother with that.

That's aside from the inability of all current L2 ADAS systems to reliably detect stopped emergency vehicles even with their lights flashing and sirens on. I'll be interested to see whether M-B's L3 system is more capable in that situation, but M-B insists that capability requires LiDAR among other sensors, and Tesla doesn't want to bother with it. I can only hope that NHTSA is finally getting serious about ADAS and public safety; it's long overdue.
 
Tesla ordered by auto regulators to provide data on ‘Elon mode’ Autopilot configuration
https://www.cnbc.com/2023/08/30/tesla-ordered-by-nhtsa-to-provide-data-on-elon-mode-for-autopilot.html
 
https://www.reddit.com/r/TeslaModel3/comments/17axdqd/dangerous_issue_when_wind_surfing_on_pas_side/ has a video of this:
"Dangerous issue when wind surfing on pas side with FSD enabled M3

M3 will aggressively swerve left if the front passenger sticks their hand out the window. The car registers the hand as a person and will try to "avoid" them. The first time this happened was on a main road traveling about 45 mph, and the car went into oncoming traffic, which needless to say was terrifying. We tested this again with the latest update and the issue still exists."
 
Virginia sheriff’s office says Tesla was running on Autopilot moments before tractor-trailer crash
https://www.cnbc.com/2023/12/13/virginia-sheriffs-office-says-tesla-was-on-autopilot-before-crash.html
 
I heard that and have to wonder why the engineer (now sadly dead) isn't at fault, since he didn't have his hands on the steering wheel. I'm assuming that if he had, he could have avoided the crash the self-driving software caused, because it would have let him override its "desires".
 
I heard that and have to wonder why the engineer (now sadly dead) isn't at fault, since he didn't have his hands on the steering wheel. I'm assuming that if he had, he could have avoided the crash the self-driving software caused, because it would have let him override its "desires".
I agree. And in two previous similar cases that went to trial, the jury agreed as well. Tesla's problem is that, despite warnings in the manual, the naming of the systems - Autopilot and Full Self-Driving - and Musk's comments about how good they are have many people expecting too much of the systems and ignoring the warnings.

In my experience the systems are very good, but like human drivers they're not perfect. I've had my Tesla save my butt a couple of times, and many times I've felt the need to take control. The systems are much, much better today than what was available at the time of this incident. Still, one does have to pay attention, not play games on one's phone or iPad like this person was doing. You can't 'set it and forget it'.

The curious thing about this case is why Tesla decided to settle given its previous victories in court. It's sure to invite ever more lawsuits.
 