Tesla's autopilot, on the road

GRA said:
And finally we have confirmation of the details. Meanwhile: https://www.mynissanleaf.com/viewtopic.php?f=10&t=20321&start=2330#p611804

GRA said:
ABG:
Buttigieg defends safety agency appointment after Musk claims she's 'biased'
'He's welcome to call me if he's concerned'

https://www.autoblog.com/2021/10/23/pete-buttigieg-eelon-musk-nhtsa-appointment-twitter/


. . . Musk has taken umbrage at the appointment of Duke University engineering and computer science professor Missy Cummings as a safety adviser at the National Highway Traffic Safety Administration (NHTSA). “Objectively, her track record is extremely biased against Tesla,” he said Tuesday.

Cummings, who directs the Humans and Autonomy Laboratory at Duke, responded that she was “happy to sit down and talk” to Musk anytime.

Cummings has also been a frequent Twitter user, often airing her concerns over Tesla’s driver assistance technology and the company’s methods for rolling it out on the social media platform. In September, she sent a series of tweets criticizing Tesla’s rollout of the “safety score” as a way of allowing drivers access to its “Full Self-Driving” beta program.

But her criticisms of Tesla go back much further. Two years ago, Cummings said (also on Twitter) that Autopilot, Tesla’s advanced driver assistance system, “easily causes mode confusion, is unreliable and unsafe.” She added that NHTSA should require the automaker to turn it off.

Cummings’ nomination to the safety advisory role at NHTSA could suggest a more conservative stance on advanced driver assistance systems (ADAS) and Tesla at the agency in the future.

Of course, NHTSA and Tesla are no strangers. In August, the agency opened a safety probe into Autopilot, after discovering 12 incidents in which Tesla vehicles crashed into parked emergency vehicles. Regulators also investigated an incident involving a fatal crash in 2017 and 25 further crashes involving Tesla’s ADAS since that time.

As late as August, in response to a tweet asking Cummings whether she thinks FSD could ever achieve full autonomy, she said: “my prediction is never.” But that doesn’t mean she necessarily thinks LiDAR — light detection and ranging, a sensing technology that measures distance with pulses of laser light — is the answer either. Instead, she suggested that full self-driving will not be possible without “a complete rethink of reasoning under uncertainty” that can only be brought about through advances in deep learning. . . .


Right, because there's no conflict of interest in having someone who serves on the board of a LiDAR company appointed as a safety adviser to a regulatory agency? (/s) She feels LiDAR is the only way to get safe autonomous driving, and I understand that you sympathize with her, but humans have been driving successfully for years without LiDAR. It is wrong to __assume__ that your way is the only correct one, and that everyone else has to follow it.

Nor is it a conflict of interest that she subscribes to a group so twisted in their hatred of TSLA that they actively maintain a blocklist of anyone who holds dissenting opinions and have been working for years toward the company's bankruptcy? (/s) Look, I can understand not liking a company, but actively blocking out people who think differently from you is an invitation to confirmation bias, and that's what they have, which is why they've been wrong for so many years.

She is prejudiced against Tesla, and that's a fact. Her association with TSLAQ is such a big deal that she's deleted her Twitter account and all evidence of the kind of delusional people she's been cavorting with.
 
IEVS:
Tesla Submits Partial Response In NHTSA Probe But It's Confidential
The EV maker is also under fire from NTSB for not complying with safety recommendations made in 2017.

https://insideevs.com/news/543409/tesla-submits-partial-response-nhtsa/


Re the latter part:
NTSB "deeply concerned" in letter to Tesla

The NHTSA is not the only organization asking Tesla to review its driver assistance tech. In an October 25 letter, the National Transportation Safety Board (NTSB) said it is “deeply concerned” at Tesla’s failure to respond to the safety recommendations issued in 2017 related to its Autopilot driver-assist feature.

Back then, the organization had called on Tesla and other manufacturers to limit the use of advanced driver-assist systems to roadways for which they were designed; the NTSB also recommended that automakers develop more effective ways to ensure drivers were paying attention. Tesla was the only automaker not to respond, according to the NTSB.

“If you are serious about putting safety front and center in Tesla vehicle design, I invite you to complete action on the safety recommendations we issued to you four years ago.”

NTSB Chairwoman Jennifer Homendy

In the letter, the NTSB adds that subsequent accidents in which Autopilot was involved clearly showed the cars’ “potential for misuse requires a system design change to ensure safety.”

The NTSB has no regulatory authority; it relies instead on its powers of persuasion with government and industry.
 
I've been driving with FSD Beta for about a week. Despite issues I would expect in a beta version, the car under FSD Beta control is, if anything, so safety-oriented that other drivers sometimes get annoyed. I'm constantly vigilant when using it and will override it when I think it's being too slow to move through an intersection or pass a stopped vehicle. And of course, sometimes it just does boneheaded things, like changing lanes for 2 secs and then changing back with no cars around. ??!!
I have a standard route that I engage it on daily; lots of unprotected lefts and narrow roads with parked cars on both sides. Sun, rain and fog.
So far, it's quite remarkable, but a work in progress for sure.
 
ABG:
Tesla’s in-dash video games can be played even while driving

Obviously unsafe


https://www.autoblog.com/2021/12/07/tesla-in-dash-video-games-play-while-driving/


Many Tesla vehicles allow drivers to play a selection of games on the infotainment system while the car is in motion, according to a report by The New York Times. The company rolled out an update in the summer that reportedly let drivers play solitaire, jet fighter game "Sky Force Reloaded" and strategy title "The Battle of Polytopia: Moonrise" while on the road.

The touchscreen is said to display a warning before a game of solitaire starts. “Solitaire is a game for everyone, but playing while the car is in motion is only for passengers," the message reads, according to the Times. That indicates Tesla knows the game is playable while the car's moving.

Although players have to acknowledge that they're a passenger, the driver can tap that button and play the game. Even if a passenger is playing something, it's possible that a driver will divert their attention to the screen to see what's happening anyway.

The National Highway Traffic Safety Administration says 3,142 people died in crashes involving distracted drivers in the U.S. in 2019. A 2017 study suggested that many infotainment features absorbed drivers' attention too long for them to be safe. Researchers at the University of Utah found that, when drivers used voice-based and touchscreen systems, they "took their hands, eyes and mind off the road for more than 24 seconds to complete tasks."

Drivers are supposed to keep their hands on the steering wheel when Tesla's Autopilot is engaged, but a recent study suggested drivers become less attentive when the mode is active. In August, the NHTSA said it was investigating Autopilot following a number of crashes with parked first responder vehicles. Those resulted in one death and 17 injuries.

Other automakers lock many touchscreen and infotainment features when the car is in motion. Stellantis (fka Fiat Chrysler), for instance, lets drivers and passengers watch a DVD on the dashboard screen in some vehicles, though only when the car's parked. NHTSA guidelines urge automakers to ensure that cars with infotainment devices prevent drivers from carrying out "inherently distracting secondary tasks while driving.”

Tesla has added a number of games to its infotainment system over the past few years. Until a few months ago, they were only playable while the car was parked. The Times says Tesla and CEO Elon Musk didn't respond to requests for comment — the company no longer has a PR department. Engadget has contacted NHTSA for comment.


I had thought this was explicitly prohibited, not just an NHTSA 'guideline'. It certainly needs to be illegal, and if this article is factually accurate, the fact that Tesla allows it is just another example of their contempt for the safety of the non-Tesla public.
 
GRA said:
ABG:
Tesla’s in-dash video games can be played even while driving

Obviously unsafe


https://www.autoblog.com/2021/12/07/tesla-in-dash-video-games-play-while-driving/




I had thought this was explicitly prohibited, not just an NHTSA 'guideline'. It certainly needs to be illegal, and if this article is factually accurate, the fact that Tesla allows it is just another example of their contempt for the safety of the non-Tesla public.

That's an issue of perspective. The games are allowed to be played in much the same way as you're allowed to drive into oncoming traffic. If the driver actively bypasses that warning, then, much like buckling the seatbelt behind you to defeat the seatbelt alert, it's the driver's fault.

Permitting the games to be played by the passengers is for the passenger's entertainment and doesn't interfere with any of the driver info (including navigation, which is diverted to the left side of the screen). And game playing while the car is in motion can get pretty annoying for the passenger, since every time the driver puts the car into gear, the same warning pops up and forces the game to stop. It's really up to the driver to take ownership of their own actions.

And plenty of drivers currently text while driving and we don't blame the phone companies for being a distraction, right?!

Edit: Like with many things, you can't "accidentally" turn it on. It requires intentional action. Much like if the passenger was watching a movie on their phone, if the driver chooses to be distracted by what the passenger is doing, that's on the driver.
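To make the gating behavior described above concrete, here's a minimal sketch of an in-motion lockout with a passenger acknowledgment. It's purely illustrative; the class and method names are hypothetical, and this is not Tesla's actual implementation, just the logic the warning screen implies.

```python
# Illustrative sketch only; NOT Tesla's actual code.
# Models the behavior described above: games are blocked while the car is in
# gear unless the user confirms they are a passenger, and shifting into gear
# again re-arms the warning and pauses any running game.

from dataclasses import dataclass

@dataclass
class InfotainmentGameGate:
    in_gear: bool = False                 # hypothetical vehicle-state flag
    passenger_acknowledged: bool = False  # "I am a passenger" confirmation

    def shift_into_gear(self) -> None:
        """Putting the car into gear pauses the game and re-arms the warning."""
        self.in_gear = True
        self.passenger_acknowledged = False  # must be acknowledged again

    def acknowledge_passenger(self) -> None:
        """User taps the 'I am a passenger' button on the warning screen."""
        self.passenger_acknowledged = True

    def game_allowed(self) -> bool:
        """Game runs when parked, or in motion only after acknowledgment."""
        return (not self.in_gear) or self.passenger_acknowledged


gate = InfotainmentGameGate()
gate.shift_into_gear()
assert not gate.game_allowed()   # warning shown, game paused
gate.acknowledge_passenger()
assert gate.game_allowed()       # nothing verifies who tapped the button
```

The catch, of course, is that last line: the system has no way of knowing whether the driver or a passenger tapped the acknowledgment, which is exactly the point of contention.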
 
Oils4AsphaultOnly said:
GRA said:
ABG:
Tesla’s in-dash video games can be played even while driving

Obviously unsafe


https://www.autoblog.com/2021/12/07/tesla-in-dash-video-games-play-while-driving/




I had thought this was explicitly prohibited, not just an NHTSA 'guideline'. It certainly needs to be illegal, and if this article is factually accurate, the fact that Tesla allows it is just another example of their contempt for the safety of the non-Tesla public.

That's an issue of perspective. The games are allowed to be played in much the same way as you're allowed to drive into oncoming traffic. If the driver actively bypasses that warning, then, much like buckling the seatbelt behind you to defeat the seatbelt alert, it's the driver's fault.

Permitting the games to be played by the passengers is for the passenger's entertainment and doesn't interfere with any of the driver info (including navigation, which is diverted to the left side of the screen). And game playing while the car is in motion can get pretty annoying for the passenger, since every time the driver puts the car into gear, the same warning pops up and forces the game to stop. It's really up to the driver to take ownership of their own actions.

And plenty of drivers currently text while driving and we don't blame the phone companies for being a distraction, right?!

Edit: Like with many things, you can't "accidentally" turn it on. It requires intentional action. Much like if the passenger was watching a movie on their phone, if the driver chooses to be distracted by what the passenger is doing, that's on the driver.


Actually, I do blame the phone companies for allowing their phones to be used while driving, along with the drivers, since both are putting my life at risk by their decisions. As I've mentioned, close calls with drivers using phones have been happening at ever-shorter intervals, now down to about one every 10 days or less while I'm walking or biking, when I have to avoid being injured or killed by one of these idiots. Which is why, even if I could still hear it, I put my phone away when I get into a car, and if I'm ever riding with anyone who uses one while moving, I'll ask that they pull over and let me out. I've even seen someone texting while riding their bike :roll: Clearly a favorite for a Darwin Award.

Your seat belt analogy is flawed. After all, if someone dies from not wearing their seat belt (the first fatal accident I happened upon was one such), they don't kill me too if I'm in another car. (I suppose in a head-on crash they could be catapulted through their windshield and then through mine. Not sure if that's ever happened).

This is totally unacceptable; we know there's no shortage of Homer Simpsons out there: https://images.app.goo.gl/85zZL1Q9M7gwRRu18, but companies don't have to enable their irresponsibility, especially when they have the power to prevent it. We've made using a phone while driving illegal, but that obviously isn't an adequate deterrent, and it doesn't matter whether it's hands-free or not - we know from human factors studies that carrying on a conversation or any other interactive mental activity is far more distracting than purely passive listening. This is the height of irresponsibility by Tesla. I had thought it was hard for them to surpass their past efforts in that vein. Turns out I was wrong.
 
GRA said:
Oils4AsphaultOnly said:
GRA said:
ABG:


https://www.autoblog.com/2021/12/07/tesla-in-dash-video-games-play-while-driving/





I had though this was explicitly prohibited, not just an NHTSA 'guideline'. It certainly needs to be illegal, and if in fact this article is factually accurate, that Tesla allows this is just another example of their contempt for the safety of members of the non-Tesla public.

That's an issue of perspective. The games are allowed to be played in much the same way as you're allowed to drive into oncoming traffic. If the driver actively bypasses that warning, then much like clicking the seatbelt behind you to defeat the seatbelt alert, it's the driver's fault.

Permitting the games to be played by the passengers is for the passenger's entertainment and doesn't interfere on any of the driver info (including navigation, which is diverted to the left-side of the screen). And game playing while the car is in motion can get pretty annoying for the passenger, since every time the driver puts the car into gear, the same warning pops up and forces the game to stop. It's really up to the driver to take ownership of their own actions.

And plenty of drivers currently text while driving and we don't blame the phone companies for being a distraction, right?!

Edit: Like with many things, you can't "accidentally" turn it on. It requires intentional action. Much like if the passenger was watching a movie on their phone, if the driver chooses to be distracted by what the passenger is doing, that's on the driver.


Actually, I do blame the phone companies for allowing their phones to be used while driving, along with the drivers, since both are putting my life at risk by their decisions. As I've mentioned, close calls with drivers using phones have been happening at ever-shorter intervals, now down to about one every 10 days or less while I'm walking or biking, when I have to avoid being injured or killed by one of these idiots. Which is why, even if I could still hear it, I put my phone away when I get into a car, and if I'm ever riding with anyone who uses one while moving, I'll ask that they pull over and let me out. I've even seen someone texting while riding their bike :roll: Clearly a favorite for a Darwin Award.

Your seat belt analogy is flawed. After all, if someone dies from not wearing their seat belt (the first fatal accident I happened upon was one such), they don't kill me too if I'm in another car. (I suppose in a head-on crash they could be catapulted through their windshield and then through mine. Not sure if that's ever happened).

This is totally unacceptable; we know there's no shortage of Homer Simpsons out there: https://images.app.goo.gl/85zZL1Q9M7gwRRu18, but companies don't have to enable their irresponsibility, especially when they have the power to prevent it. We've made using a phone while driving illegal, but that obviously isn't an adequate deterrent, and it doesn't matter whether it's hands-free or not - we know from human factors studies that carrying on a conversation or any other interactive mental activity is far more distracting than purely passive listening. This is the height of irresponsibility by Tesla. I had thought it was hard for them to surpass their past efforts in that vein. Turns out I was wrong.

The more you try to save the idiots from themselves, the more idiotic they become ... cue texting bicyclist. You've already done what you could by being vigilant yourself.
 
Just posting this here to point out how the self-selection process of FSD Beta produces a better set of QC testers than any simulated environment: https://electrek.co/2022/01/17/elon-musk-claims-no-crash-tesla-full-self-driving-beta/

Others are worried that they would become the unsuspecting victim of someone creating YouTube videos of themselves asleep at the wheel of a car in FSD, when the reality (so far) is far from it. Despite the many YouTube videos showing where FSD fails to handle a situation well, there haven't been any accidents attributed to it (the claim filed with the NHTSA hasn't been substantiated). This shows that the current crop of beta testers (numbering in the thousands), knowing that they're responsible for their use of FSD, have been more diligent with it than some dedicated QC employees, and can produce millions of miles of actual test data faster than a dedicated program.
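As a rough back-of-envelope illustration of that last point (the tester count and daily mileage below are assumptions for the sake of the arithmetic, not figures from the article):

```python
# Back-of-envelope only; the inputs are assumptions, not reported figures.
testers = 10_000        # assumed size of the FSD Beta pool ("thousands")
miles_per_day = 25      # assumed average daily miles per tester

daily_fleet_miles = testers * miles_per_day
days_to_million = 1_000_000 / daily_fleet_miles

print(f"Fleet miles per day: {daily_fleet_miles:,}")          # 250,000
print(f"Days to accumulate 1M miles: {days_to_million:.0f}")  # 4
```

Under those assumptions the fleet logs a million real-world miles in about four days, a rate a small in-house QC team couldn't approach.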

My position has always been that we need actual L5 autonomous systems ASAP, consequences being an acceptable price to pay, because delaying such a system means more deaths and destruction. Others think the consequences aren't worth it. Let's not repeat these old back-and-forth arguments and just focus on its progress (or lack thereof) for now.
 
Driver in Fatal Tesla Autopilot Crash Charged With Felony Manslaughter in US First
The driver-assist system was enabled when the Tesla struck another vehicle and killed two people in 2019.
https://www.thedrive.com/news/43919/driver-in-fatal-tesla-autopilot-crash-charged-with-felony-manslaughter-in-us-first
 
cwerdna said:
Driver in Fatal Tesla Autopilot Crash Charged With Felony Manslaughter in US First
The driver-assist system was enabled when the Tesla struck another vehicle and killed two people in 2019.
https://www.thedrive.com/news/43919/driver-in-fatal-tesla-autopilot-crash-charged-with-felony-manslaughter-in-us-first

This accident occurred at the southbound 110 freeway off-ramp near where it intersects the westbound 91 freeway at its end point and transitions to a surface street, Artesia Blvd, at the Vermont Ave intersection. It's highly likely that the Autopilot system didn't see/anticipate a freeway end point or a stoplight-controlled intersection at the 91 freeway end point, resulting in a deadly collision at the intersection.
 
If I understand this right, the vehicle was on AutoPilot and not FSD. AutoPilot is similar in capability to our ProPilot, which means it gives you lane and speed control only. It would by design not know about traffic lights or stop signs, and hence Tesla would not be at fault in this tragic accident. It does appear to be a driver-inattention/negligence type of situation, but of course more will be revealed as the case progresses.
 
OldManCan said:
If I understand this right, the vehicle was on AutoPilot and not FSD. AutoPilot is similar in capability to our ProPilot, which means it gives you lane and speed control only.
Even though Tesla now bundles EAP under FSD when you buy the car, most Tesla cars on the road do not have actual FSD enabled yet.

And in my recent road tests of various EVs, I found that the driver-assistance features (adaptive cruise and lane keeping) on the Mach-E and the ID.4 would both allow me to operate the vehicle on side roads. The same was true when I drove a LEAF with ProPilot1 (back in 2019). Nearly everything in this thread about AutoPilot now applies to all those cars.

Except the name. "AutoPilot" is the absolute worst marketing name ever.

Interesting that inside the car, the interface calls out the two systems separately: TACC and Autosteer.
 
lorenfb said:
It's highly likely that the Autopilot system didn't see/anticipate a freeway end point or a stoplight-controlled intersection
This accident happened before Autopilot got the ability (via an OTA update) to stop at traffic lights and stop signs (and that ability is only for cars that had FSD purchased).

But it doesn't matter: the driver ran a red light. The driver is 100% responsible for not paying attention.
 
OldManCan said:
If I understand this right, the vehicle was on AutoPilot and not FSD. AutoPilot is similar in capability to our ProPilot which means it will give you lane and speed control only. It would by design not know about traffic lights or stop signs and hence Tesla would not be at fault on this tragic accident. It does appear to be a driver attention deficit, negligence type situation but of course more will be revealed as the case progresses.

That's an incorrect comparison!

FSD, which uses some of the same hardware and software as Autopilot, currently exists as an evolving collection of features that can assist the driver with parking, changing lanes on the highway, making turns, and coming to a complete halt at traffic lights and stop signs. Some owners have access to unfinished FSD beta software, which can control even more vehicle functions on public roads, even though Musk said that some versions of the software were “not great.” One version of FSD beta has already been subject to a product recall.

https://www.consumerreports.org/automotive-industry/elon-musk-tesla-self-driving-and-dangers-of-wishful-thinking-a8114459525/

Based on the above, Tesla may be considered a co-defendant in the SoCal red-light accident involving the use of Autopilot, as implied in the accident report:

https://www.thedrive.com/news/43919/driver-in-fatal-tesla-autopilot-crash-charged-with-felony-manslaughter-in-us-first

The court will most likely have to determine the state of the Autopilot firmware at the time of the accident and its contribution to the crash.
 
I don't think the vehicle in question had FSD. I might be wrong on this. I thought it was on AutoPilot only, running loose toward the intersection...
 
OldManCan said:
I don't think the vehicle in question had FSD. I might be wrong on this. I thought it was on AutoPilot only, running loose toward the intersection...

It's not about FSD. Read about the functions provided by AutoPilot in the link provided, i.e. its functions are not as limited as Nissan's and other OEMs'.
The discovery process will allow the court, not us here on MNL, to determine the extent of Tesla's liability with regard to AutoPilot.
 
lorenfb said:
OldManCan said:
I don't think the vehicle in question had FSD. I might be wrong on this. I thought it was on AutoPilot only running loose towards the intersection...

It's not about FSD. Read about the functions provided by AutoPilot in the link provided, i.e. its functions are not as limited as Nissan's and other OEMs'.
The discovery process will allow the court, not us here on MNL, to determine the extent of Tesla's liability with regard to AutoPilot.

Those functions weren't available at the time of the accident, so no. At that time, only highway lane keeping and speed control were available to the general public. Many of the current FSD features were part of the Enhanced Autopilot package at that time, and none of them could handle stoplights until 2020.
 
Oils4AsphaultOnly said:
Those functions weren't available at the time of the accident, so no. At that time, only highway lane keeping and speed control were available to the general public. Many of the current FSD features were part of the Enhanced Autopilot package at that time, and none of them could handle stoplights until 2020.

Precisely! As lorenfb suggests, this is for the court to study and confirm, of course, but it should be a clear-cut situation.
 
lorenfb said:
Read about the functions provided by AutoPilot in the link provided, i.e. its functions are not as limited as Nissan's and other OEMs'.
When I test drove a LEAF with ProPilot, I was able (even encouraged by the dealership's LEAF expert) to engage ProPilot on the 4-lane road in front of the dealership. It's 2 lanes in each direction with a double yellow line down the middle, and traffic lights. I didn't keep it on until I got close to a traffic light, but I didn't see anything stopping me from going through one.

In my more recent test drives of the ID.4 (after my accident), the lane keeping could be engaged on a side road with one lane in each direction, separated by a double yellow line, with traffic lights and stop signs. I didn't try to keep it on near a stop sign.

What do these cars do at a stop sign or traffic light?
 