Tesla's autopilot, on the road

jjeff said:
I know very little of Tesla's autopilot, but I can't believe the car didn't have some sort of feature that would turn it off if there was no weight in the driver's seat?
Yes, there is.

I've probably driven over 40,000 miles using AutoPilot. Like I said, if there was no one in the driver's seat while AutoPilot was engaged, then these people needed to defeat multiple mechanisms that detect whether a driver is present and holding the steering wheel.
 
jjeff said:
I know very little of Tesla's autopilot, but I can't believe the car didn't have some sort of feature that would turn it off if there was no weight in the driver's seat? And I also can't imagine how anyone could move from the driver's seat to any other seat, especially when the vehicle was in motion. Heck, I have a hard time just getting out of the driver's seat of a Tesla, let alone moving to the passenger seat! Ahh, to be young (and dumb!) again :roll:
I must say when I first read this I was wondering if it was some new club akin to the mile-high club, the 70 mph club :? but if one person was in the front and one in the back, that wouldn't make sense...


By multiple mechanisms, jlv meant (see the sketch after this list):
- pre-buckling the seat belt under you, because you planned on leaving the seat in the first place
- leaving a weight on the seat to trick the seat sensors
- starting from a part of the street that had lane markings (because AP won't engage where there aren't clear lane markings, even though it'll stay engaged when the markings disappear)
- and keeping the accelerator down to override the surface-street speed restriction while you're sitting in the passenger seat.
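
To put that list in concrete terms, here's a minimal sketch of the kind of layered check involved (illustrative Python with made-up signal names; Tesla's actual implementation isn't public):

```python
# Illustrative sketch only -- hypothetical signal names, NOT Tesla's code.
# The point of the list above: each signal is trivial to spoof on its own,
# but keeping AP engaged means defeating all of them at the same time.

def driver_present(seat_occupied: bool, belt_buckled: bool,
                   wheel_torque_seen: bool) -> bool:
    # Seat weight, a buckled belt, and periodic steering-wheel torque
    # all have to check out for AP to keep running.
    return seat_occupied and belt_buckled and wheel_torque_seen

def autopilot_may_engage(lane_markings_clear: bool, present: bool) -> bool:
    # Engaging is stricter than staying engaged: AP needs clear lane
    # markings to turn on, even though it can stay on through gaps.
    return lane_markings_clear and present
```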

Frankly, it's only the news media (and those who have an ax to grind) who think this is autopilot-related. Anyone who knows how A/P works automatically assumed it was a drunk driver who fled the scene.
 
You forgot the weight on the steering wheel to apply torque.


But this car wasn't on AP. AP won't go at "a high rate of speed" when not on a divided highway; it is limited to no more than 5 mph over the speed limit.
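
Roughly speaking, that restriction behaves like this (an illustrative Python sketch; the exact numbers are firmware-dependent and the divided-highway cap below is a pure placeholder):

```python
# Illustrative only -- NOT Tesla firmware logic. The divided-highway cap
# below is a pure placeholder; this post only claims the 5 mph limit.

def autosteer_speed_cap_mph(posted_limit_mph: float,
                            divided_highway: bool) -> float:
    if divided_highway:
        return 90.0  # placeholder; actual highway caps have varied by firmware
    # Off divided highways, Autosteer is limited to the posted limit
    # plus 5 mph -- hence no "high rate of speed" on a surface street.
    return posted_limit_mph + 5.0
```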
 
jlv said:
You forgot the weight on the steering wheel to apply torque.


But this car wasn't on AP. AP won't go at "a high rate of speed" when not on a divided highway; it is limited to no more than 5 mph over the speed limit.

I assumed that since they were keeping the pedal pressed, they'd hold the steering wheel as well. *shrug*

It's a completely absurd amount of effort just to make autopilot work in the way that the media described.

People are dumb. And no amount of "protection" will prevent people from doing dumb things. Otherwise, why isn't Chevy being sued for "allowing" this truck driver to hit a telephone pole?!?!
https://www.youtube.com/watch?v=ICAWV74IljE&ab_channel=Nigeman
 
IMO... Tesla should stop calling it "autopilot," which implies that it's autonomous. People don't read warnings and fine print, and when they buy new tech they feel compelled to wring it out for all the money they just spent on it, without regard for safety warnings.
In the end it's the driver's fault. They activated the feature. BUT, calling it autopilot is just asking for trouble.
Nissan calls it ProPilot. Does that even sound like it's autonomous?
 
EVDean said:
IMO... Tesla should stop calling it "autopilot," which implies that it's autonomous. People don't read warnings and fine print, and when they buy new tech they feel compelled to wring it out for all the money they just spent on it, without regard for safety warnings.
In the end it's the driver's fault. They activated the feature. BUT, calling it autopilot is just asking for trouble.
Nissan calls it ProPilot. Does that even sound like it's autonomous?

I agree entirely. The name should clearly convey its purpose as only a safety backup for the human driver. And they need to STFU about "full self driving" until it exists and has been certified as such.
 
Nubo said:
EVDean said:
IMO... Tesla should stop calling it "autopilot," which implies that it's autonomous.

And they need to STFU about "full self driving" until it exists and has been certified as such.
+1
+1

"AutoPilot" is the (terrible) marketing name. When you use the car, the user interface uses the terms Traffic-Aware Cruise Control (TACC) and Autosteer for the two parts of AP. FSD is an even worse marketing name.
 
jlv said:
You forgot the weight on the steering wheel to apply torque.


But this car wasn't on AP. AP won't go at "a high rate of speed" when not on a divided highway; it is limited to no more than 5 mph over the speed limit.


I thought they'd removed that limitation after imposing it for a while. And of course, Tesla allowed A/P speeds more than 5 mph over the limit on highways that are divided but not limited-access (i.e., without grade-separated crossings), which is outside the ODD (operational design domain) and was a factor in two fatal accidents three years apart involving underrunning fully visible crossing semis in Florida. Preventing A/P use in such cases was one of the recommendations the NTSB made to both Tesla and NHTSA, first in 2017 and again subsequently, and which so far has been ignored by both.
 
CR Engineers Show a Tesla Will Drive With No One in the Driver's Seat
After a fatal crash in Texas, we demonstrated how easy it is to defeat Autopilot’s driver monitoring
https://www.consumerreports.org/autonomous-driving/cr-engineers-show-tesla-will-drive-with-no-one-in-drivers-seat/
 
A simple weight, eh? Well, I still put it on the operator; they are clearly going out of their way to make the car do something it is not designed to do.
Doing something similar to the GM system does sadly seem necessary; one has to make it as difficult as possible to bypass. If they were only risking their own lives I might be OK with the Darwin Awards, but they risk others' lives and damage to other people's property.

As for others' comments about wording: I agree that changing the name from autopilot may indeed be necessary, since it creates a false impression for some people. Even the Nissan Japan commercials for ProPilot bother me when Kimtaku takes his hands off the wheel. Yes, you can, to grab a drink or something, but even that can create a false impression in someone.
 
GRA said:
SalisburySam said:
As things become more and more idiot-proof, the world makes better idiots.


Yet, if a manufacturer simply ignores taking steps to prevent entirely foreseeable idiotic behavior when it's in their power to do so, they shouldn't escape their own responsibility.

I agree...a bit. I think the problem is in the phrase "entirely foreseeable." After the fact, many things are viewed as "oh, they should've seen that coming," but no one does, regardless of how possible or logical. And there are those pesky unintended consequences, also possible, logical, and likely unforeseen. Maybe a manufactured device enabled a certain act of stupidity, or didn't disable its possibility, and maybe there's a culpability. I think it's a tiny one. Stupidity has its own rewards and punishments; for me, without knowing all the details yet, I can't see much actual Tesla liability, although I'm sure they'll be sued to the heavens and the press will feast for hours until the next thing happens.
 
SalisburySam said:
GRA said:
SalisburySam said:
As things become more and more idiot-proof, the world makes better idiots.


Yet, if a manufacturer simply ignores taking steps to prevent entirely foreseeable idiotic behavior when it's in their power to do so, they shouldn't escape their own responsibility.

I agree...a bit. I think the problem is in the phrase "entirely foreseeable." After the fact, many things are viewed as "oh, they should've seen that coming," but no one does, regardless of how possible or logical. And there are those pesky unintended consequences, also possible, logical, and likely unforeseen. Maybe a manufactured device enabled a certain act of stupidity, or didn't disable its possibility, and maybe there's a culpability. I think it's a tiny one. Stupidity has its own rewards and punishments; for me, without knowing all the details yet, I can't see much actual Tesla liability, although I'm sure they'll be sued to the heavens and the press will feast for hours until the next thing happens.


As I've mentioned, problems due to the lack of A/P driver-engagement monitoring, as well as allowing any ADAS or ADS to operate outside its ODD, were foreseen, and Tesla and the other manufacturers, as well as NHTSA, were specifically warned about these issues by the NTSB four years ago after the Williston crash, with the warnings repeated after subsequent fatal crashes.

Please read the NTSB comment letter on the NHTSA notice of proposed rulemaking. Search for "NTSB NHTSA 2020-0106-0617"; it should be the second result, a PDF from the NTSB dated Feb. 21, 2021. It's fifteen pages total and well worth the read, but the parts specifically addressing those two issues are on pages 6-9, starting at "Risk Mitigation Pertaining to Monitoring Driver Engagement" and continuing through "Risk Mitigation Pertaining to Operational Design Domain", plus the full list of NTSB recommendations on pages 13-15.

It also discusses AV testing on public roads and the need for mandatory reporting of specified data by all manufacturers at regular intervals, so that manufacturers can't cherry-pick the data to show their systems at their best; i.e., independent outside verification is needed.
 
SalisburySam said:
I agree...a bit. I think the problem is in the phrase "entirely foreseeable." ... Stupidity has its own rewards and punishments; for me, without knowing all the details yet, I can't see much actual Tesla liability, although I'm sure they'll be sued to the heavens and the press will feast for hours until the next thing happens.

Exactly. Deliberate misuse isn't something that can be safeguarded against; otherwise, drunk drivers would've been able to sue the car manufacturers. You can't prevent drivers from deliberately circumventing your safety measures; otherwise, cruise control would've been banned a LONG time ago!
 
Some more info, via ABG:
NTSB: Tesla owner started trip in driver's seat before fatal crash

The preliminary report offers no explanation as to the cause

https://www.autoblog.com/2021/05/10/tesla-texas-crash-ntsb-report/


Home security camera footage shows that the owner of a Tesla got into the driver's seat of the car shortly before a deadly crash in suburban Houston, according to a government report Monday.

But the preliminary report on the crash that killed two men doesn't explain the mystery of why authorities found no one behind the wheel of the car, which burst into flames after crashing about 550 feet (170 meters) from the owner's home. Nor does it conclusively say whether Tesla's “Autopilot” partially automated driver-assist system was operating at the time of the crash, although it appears unlikely.

The National Transportation Safety Board said it's still investigating all aspects of the crash. An onboard data storage device in the console, however, was destroyed by fire. A computer that records air bag and seat belt status as well as speed and acceleration was damaged and is being examined at an NTSB lab.

The NTSB said it tested a different Tesla vehicle on the same road, and the Autopilot driver-assist system could not be fully used. Investigators could not get the system's automated steering system to work, but were able to use Traffic Aware Cruise Control.

Autopilot needs both the cruise control and the automatic steering to function. Traffic Aware Cruise Control can keep the car a safe distance from vehicles in front of it, while autosteer keeps it in its own lane.

“The NTSB continues to collect data to analyze the crash dynamics, postmortem toxicology test results, seat belt use, occupant egress and electric vehicle fires,” the agency said in its report. “All aspects of the crash remain under investigation as the NTSB determines the probable cause. . . .”


Curiouser and curiouser. Direct link to report: https://www.ntsb.gov/news/press-releases/Pages/NR20210510.aspx
 
GRA said:
Some more info, via ABG:
NTSB: Tesla owner started trip in driver's seat before fatal crash
...
Curiouser and curiouser. Direct link to report: https://www.ntsb.gov/news/press-releases/Pages/NR20210510.aspx

There's nothing curious about it. The driver lost control and crashed within 550 ft of his driveway. The doors were jammed shut from the crash (which happens - remember Paul Walker?), and the driver tried to flee through the backseat, but didn't make it (that's why he was in the backseat). We've long known that autopilot wasn't a contributing factor and this report was released to settle the FUD.

The constable at the scene should never have claimed that no one was driving the car. He should've stopped with "no one was in the driver's seat" and not claimed anything as fact beyond that.
 
Oils4AsphaultOnly said:
GRA said:
Curiouser and curiouser. Direct link to report: https://www.ntsb.gov/news/press-releases/Pages/NR20210510.aspx

There's nothing curious about it. The driver lost control and crashed within 550 ft of his driveway. The doors were jammed shut from the crash (which happens - remember Paul Walker?), and the driver tried to flee through the backseat, but didn't make it (that's why he was in the backseat). We've long known that autopilot wasn't a contributing factor and this report was released to settle the FUD.

The constable at the scene should never have claimed that no one was driving the car. He should've stopped with "no one was in the driver's seat" and not claimed anything as fact beyond that.


You know for a fact that the driver tried to flee through the backseat? Shouldn't you stop with "no one was in the driver's seat" and not claim anything as fact beyond that?
 
GRA said:
Oils4AsphaultOnly said:
There's nothing curious about it. The driver lost control and crashed within 550 ft of his driveway. The doors were jammed shut from the crash (which happens - remember Paul Walker?), and the driver tried to flee through the backseat, but didn't make it (that's why he was in the backseat). We've long known that autopilot wasn't a contributing factor and this report was released to settle the FUD.

The constable at the scene should never have claimed that no one was driving the car. He should've stopped with "no one was in the driver's seat" and not claimed anything as fact beyond that.


You know for a fact that the driver tried to flee through the backseat? Shouldn't you stop with "no one was in the driver's seat" and not claim anything as fact beyond that?

My original hypothesis (posted last week) was that the driver fled the scene. Now that they have conclusive evidence that the driver died at the scene, many suppositions can be laid to rest:
- The driver was found dead in the back seat.
- The driver was seen on local security cameras entering the driver's seat and driving away.
- The car had 550 ft to accelerate to impact speed (see the rough check below).

Indeed, an alien or ghost could've intervened and moved the driver, or maybe the recoil from impact could've ejected the driver into the back seat, but Occam's razor rules out the first possibility, and Newton's first law rules out the second (in a frontal crash, an unrestrained occupant keeps moving forward toward the point of impact, not backward into the rear seat). If you don't start with the assumption that Tesla's autopilot is at fault, then the truth is much simpler to grasp.
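
As a back-of-envelope check on that 550 ft point (my own rough numbers, assuming a quick EV holding constant hard acceleration from rest, which if anything overstates the speed):

```python
import math

# Back-of-envelope only, NOT crash-reconstruction data. Constant full
# acceleration from rest overstates real-world speed, so treat this as
# an upper bound on what 550 ft allows.
distance_m = 550 * 0.3048          # 550 ft ~= 168 m
accel_ms2 = (60 * 0.44704) / 4.0   # assume ~0-60 mph in 4 s => ~6.7 m/s^2

v_ms = math.sqrt(2 * accel_ms2 * distance_m)  # v^2 = 2*a*d from rest
print(f"~{v_ms:.0f} m/s, i.e. ~{v_ms / 0.44704:.0f} mph")  # ~47 m/s, ~106 mph
```

In other words, 550 ft is more than enough road for a violent impact, so no mystery there.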
 
Oils4AsphaultOnly said:
...Indeed, an alien or ghost could've intervened and moved the driver, or maybe the recoil from impact could've ejected the driver into the back seat, but Occam's razor rules out the first possibility, and Newton's first law rules out the second. If you don't start with the assumption that Tesla's autopilot is at fault, then the truth is much simpler to grasp.

The hard part is putting myself into the mindset of someone who would attempt to abuse the car's systems in this way. It's hard to label it as anything short of insanity. And so we look for a more reasonable explanation, but sometimes people are just crazy.
 