Tesla's autopilot, on the road

OldManCan said:
Oils4AsphaultOnly said:
Adverse conditions aren't impossible to navigate with vision alone. Millions of humans do it annually. Some conditions are just too treacherous for even human drivers, and the cars should pull over and wait for the storm to pass, as should the humans.

But yes, if battery energy density improves quickly enough, autonomous flying vehicles will arrive before autonomous cars do.

Agreed on the need to pull over but we all know it doesn't happen as much as it should.

Let's put aside the road-infrastructure cost point for a second: why would we not want to supplement vision with other sensing capabilities (e.g., LIDAR or future tech not yet here)? If we have the ability to add beyond-human capabilities to make the experience safer, I'd be all for it.

Now we're getting into the nitty-gritty details. Autonomous tech that addresses drunk and inattentive drivers would save over 30k lives annually, and that problem can be solved with human-level vision alone, because it's strictly about removing human stupidity from the equation. No super-human capabilities required.

Autonomous tech that brings super-human capabilities to adverse road conditions where no human "should" venture would probably save only a handful of lives annually, if even that much, because only stupid people (or ones so drunk that they're no different) would choose to drive where they can't see.

Regulations only serve to guide economic development (quit polluting to save a few bucks), not solve technical problems.
 
I respect your views and see that we will keep going back & forth on this. Best that we both sit back and wait to see how it all shapes up in the years ahead. Let's go back to the topic - Tesla's autopilot, on the road. Sorry for instigating this side chat about the future and my thoughts.
 
California DMV accuses Tesla of deceptive practices in marketing Autopilot and Full Self-Driving options
https://www.cnbc.com/2022/08/05/california-dmv-says-tesla-fsd-autopilot-marketing-deceptive.html
KEY POINTS
In a pair of July 28 filings with California’s Office of Administrative Hearings, an official and lawyers for the DMV wrote that Tesla’s “Autopilot” and “Full Self-Driving” marketing suggest the cars are capable of operating autonomously, when in fact they can’t.
In a worst-case scenario, the company could temporarily lose the licenses which allow it to operate as a vehicle manufacturer and auto dealer in California.
Tesla has fifteen days to respond to the accusations before the administrative court; otherwise the DMV will issue a default decision.
 
https://electrek.co/2022/08/22/tesla-autopilot-prevent-40-crashes-per-day-wrong-pedal-errors/

Not sure if this claim includes instances where the driver realizes the mistake and is already changing pedals before AEB kicks in. It's a good example of how error-prone human drivers are though.
 
This is hilarious...

https://insideevs.com/news/606383/tesla-fsd-beta-tester-complains-elon-responds/?fbclid=IwAR0JpE4F6JahHUNyhFkwwPGiur_ghxbR2a1GQO3q2O_1uy-O29y5QWTU0C8
 
To continue from
cwerdna said:
Learned of https://elonmusk.today/ the other day. Has lots of "great" stuff/material from him like https://elonmusk.today/#robotaxi-network and https://elonmusk.today/#autonomous-cross-country-trip. :lol:
From https://www.zdnet.com/article/elon-musk-says-that-the-self-driving-tesla-could-be-ready-by-the-end-of-the-year/
"The two technologies I am focused on, trying to ideally get done before the end of the year, are getting our Starship into orbit ... and then having Tesla cars to be able to do self-driving," Musk said, according to the Reuters report.

Since Musk first announced the idea in 2016, there have been new release dates nearly every year, all of which fell short. He even promised an autonomous Tesla trip from NYC to LA, which slipped to 2017, then 2018, and has yet to happen. So it's worth taking Musk's new timeline with a grain of salt.
 
ABG:
NHTSA: 10 more fatal crashes linked to Tesla's automated driving tech

An 11th crash was mistakenly linked to a Ford pickup truck that didn't have self-driving tech


The National Highway Traffic Safety Administration released new data indicating that 10 people were killed in the United States in crashes involving vehicles that were using automated driving systems. The crashes all took place over a four-month period between mid-May and September of this year.

Each of the 10 deaths involved vehicles made by Tesla, though it is unclear from the National Highway Traffic Safety Administration's data whether the technology itself was at fault or whether driver error might have been responsible. An 11th fatal crash involving a Ford pickup truck appears in the data, but it was later found that Ford had reported the incident prematurely and that the pickup was not actually equipped with the automaker's partially automated driving tech.

The 10 deaths included four crashes involving motorcycles that occurred during the spring and summer. Two fatalities happened in Florida and one each in California and Utah. Safety advocates note that the deaths of motorcyclists in crashes involving Tesla vehicles using automated driver-assist systems such as Autopilot have been increasing.

Tesla alone has more than 830,000 vehicles on U.S. roads with the systems. The agency is requiring auto and tech companies to report all crashes involving self-driving vehicles as well as autos with driver assist systems that can take over some driving tasks from people.

The new fatal crashes are documented in a database that NHTSA is building in an effort to broadly assess the safety of automated driving systems, which, led by Tesla, have been growing in use. Previous figures from an earlier period, released in June, showed that six people died in crashes involving the automated systems and five were seriously hurt. Of those deaths, five occurred in Teslas and one in a Ford. In each case, the database says that advanced driver-assist systems were in use at the time of the crash.

Michael Brooks, executive director of the nonprofit Center for Auto Safety, said he is baffled by NHTSA's continued investigations and by what he called a general lack of action since problems with Autopilot began surfacing back in 2016.

“I think there's a pretty clear pattern of bad behavior on the part of Tesla when it comes to obeying the edicts of the (federal) safety act, and NHTSA is just sitting there,” he said. “How many more deaths do we need to see of motorcyclists?”

Brooks noted that the Tesla crashes are victimizing more people who are not in the Tesla vehicles.

“You're seeing innocent people who had no choice in the matter being killed or injured,” he said.

A message was left Tuesday seeking a response from NHTSA.

Tesla’s crash number may appear elevated because it uses telematics to monitor its vehicles and obtain real-time crash reports. Other automakers lack such capability, so their crash reports may emerge more slowly or may not be reported at all, NHTSA has said.

NHTSA has been investigating Autopilot since August of last year after a string of crashes since 2018 in which Teslas collided with emergency vehicles parked along roadways with flashing lights on. That investigation moved a step closer to a recall in June, when it was upgraded to what is called an engineering analysis.

In documents, the agency raised questions about the system, finding that the technology was being used in areas where its capabilities are limited and that many drivers weren’t taking steps to avoid crashes despite warnings from the vehicle.

NHTSA also reported that it has documented 16 crashes in which vehicles with automated systems in use hit emergency vehicles and trucks that were displaying warning signs, causing 15 injuries and one death.

The National Transportation Safety Board, which also has investigated some of the Tesla crashes dating to 2016, has recommended that NHTSA and Tesla limit Autopilot’s use to areas where it can safely operate. The NTSB also recommended that NHTSA require Tesla to improve its systems to ensure that drivers are paying attention. NHTSA has yet to act on the recommendations. (The NTSB can make only recommendations to other federal agencies.)

Messages were left Tuesday seeking comment from Tesla. At the company's artificial intelligence day in September, CEO Elon Musk asserted that, based on the rate of crashes and total miles driven, Tesla's automated systems were safer than human drivers — a notion that some safety experts dispute. . . .

In addition to Autopilot, Tesla sells “Full Self-Driving” systems, though it says the vehicles cannot drive themselves and that motorists must be ready to intervene at all times.

The number of deaths involving automated vehicles is small compared with the overall number of traffic deaths in the U.S. Nearly 43,000 people were killed on U.S. roads last year, the highest number in 16 years, after Americans returned to the roads as the pandemic eased. Authorities blamed reckless behavior such as speeding and driving while impaired by drugs or alcohol for much of the increase.
 
ABG:
Tesla opens 'Full Self-Driving' beta testing to anyone in North America

Adding a lot more testers to a situation the NTSB calls 'the Wild West on our roads'

https://www.autoblog.com/2022/11/27/tesla-full-self-driving/


Tesla is making its controversial driver-assistance system available to customers previously deemed not safe enough behind the wheel to test it out.


Boy, that's just what we need: even less-safe drivers failing to monitor their cars while using the misnamed FSD.
 
Tesla driver blames San Francisco Bay Bridge crash on 'Full Self-Driving'
The Thanksgiving accident was one of several that snarled traffic for holiday travelers
https://www.autoblog.com/2022/12/22/san-fran-bridge-crash-tesla-fsd/

Tesla under investigation by NHTSA for two more crashes that may have involved Autopilot or FSD
https://www.cnbc.com/2022/12/22/nhtsa-initiates-two-more-tesla-crash-investigations.html - it includes the above crash.
 
Made this case earlier, and glad to see Tesla seems to agree...

https://electrek.co/2022/12/06/tesla-radar-car-next-month-self-driving-suite-concerns/
 
https://techstory.in/tesla-banned-from-calling-its-software-fsd-by-california/

https://www.pcmag.com/news/california-bans-tesla-from-calling-software-full-self-driving

California has banned Tesla from calling its software Full Self-Driving, as legislators said the name could lead a 'reasonable person' to believe the feature allows a car to become fully autonomous.
 
cwerdna said:
Tesla driver blames San Francisco Bay Bridge crash on 'Full Self-Driving'
The Thanksgiving accident was one of several that snarled traffic for holiday travelers
https://www.autoblog.com/2022/12/22/san-fran-bridge-crash-tesla-fsd/

Tesla under investigation by NHTSA for two more crashes that may have involved Autopilot or FSD
https://www.cnbc.com/2022/12/22/nhtsa-initiates-two-more-tesla-crash-investigations.html - it includes the above crash.
https://twitter.com/kenklippenstein/status/1612848872061128704 has a tweet w/the video of the accident. Tweet says, in part:
"I obtained surveillance footage of the self-driving Tesla that abruptly stopped on the Bay Bridge, resulting in an eight-vehicle crash that injured 9 people including a 2 yr old child just hours after Musk announced the self-driving feature."

Points to story at https://theintercept.com/2023/01/10/tesla-crash-footage-autopilot/ which I haven't read yet. I did watch the video though.
 
Although the failures are spectacular (the Bay Bridge incident is a prime example: "why wasn't the driver paying attention to what his car was doing?"), the averages are improving: https://insideevs.com/news/630650/tesla-safety-report-2022q3-autopilot-improved/
=====
According to Tesla, the number of miles driven per one accident registered, when using Autopilot technology, is consistently increasing, which indicates that cars are getting safer to use.
...
In Q3, the company noted one accident per 6.26 million miles driven - a 13 percent increase year-over-year. The record result was noted in Q1, at 6.57 million miles driven.
...
"It's important to note that the results are comparable only within a particular category (Autopilot or without Autopilot), not between the categories as the input data might be widely different (like simple highway driving or complex city driving). In other words, we can only see whether the active safety systems are improving over time (and it's also only a rough comparison), but we can't compare Autopilot to non-Autopilot driving."
...
"We collect the amount of miles traveled by each vehicle with Autopilot active or in manual driving, based on available data we receive from the fleet, and do so without identifying specific vehicles to protect privacy. We also receive a crash alert anytime a crash is reported to us from the fleet, which may include data about whether Autopilot was active at the time of impact. To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed. (Our crash statistics are not based on sample data sets or estimates.) In practice, this correlates to nearly any crash at about 12 mph (20 kph) or above, depending on the crash forces generated. We do not differentiate based on the type of crash or fault (For example, more than 35% of all Autopilot crashes occur when the Tesla vehicle is rear-ended by another vehicle). In this way, we are confident that the statistics we share unquestionably show the benefits of Autopilot."
=====
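
To make the counting rule in that quote concrete, here's a rough Python sketch of it as I read it. The record fields are hypothetical (Tesla hasn't published actual code); only the 5-second window, the restraint-deployment threshold, and the roughly 12 mph figure come from the quote:

# Sketch of Tesla's stated crash-counting rule, per the quote above.
# Field names are hypothetical; Tesla has not published its implementation.

def counts_as_autopilot_crash(crash: dict) -> bool:
    # Only crashes severe enough to deploy an airbag or other active
    # restraint are counted at all (per Tesla, roughly 12 mph and above).
    if not crash["restraint_deployed"]:
        return False
    if crash["autopilot_active_at_impact"]:
        return True
    # Conservative rule: the crash still counts against Autopilot if it
    # was deactivated within the 5 seconds before impact.
    off_time = crash.get("autopilot_off_time")  # None if never engaged
    return off_time is not None and crash["impact_time"] - off_time <= 5.0

def miles_per_accident(total_autopilot_miles: float, crashes: list) -> float:
    # The headline metric: total Autopilot miles over counted crashes.
    counted = sum(counts_as_autopilot_crash(c) for c in crashes)
    return total_autopilot_miles / counted  # Q3 2022 headline: 6.26 million

The 5-second window is what keeps "Autopilot disengaged right before impact" cases from slipping out of the count.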

So in summary, the Autopilot system is improving. But from the data, no concrete conclusions can be drawn about whether driving with Autopilot is safer than driving without it.
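
To illustrate why, here's a toy Python example (all rates invented entirely) showing how the road mix alone can produce a big gap between the two categories even when the underlying driving is equally safe:

# Toy example (all rates invented) of why Autopilot vs. manual per-mile
# crash rates aren't directly comparable: Autopilot miles skew toward
# easy highway driving, manual miles toward harder city driving.

highway_rate = 1 / 5_000_000  # crashes per highway mile (invented)
city_rate = 1 / 1_000_000     # crashes per city mile (invented)

# Suppose Autopilot runs 90% on highways, while manual driving is only 30%.
ap_rate = 0.9 * highway_rate + 0.1 * city_rate
manual_rate = 0.3 * highway_rate + 0.7 * city_rate

print(f"Autopilot: one crash per {1 / ap_rate:,.0f} miles")      # ~3.6 million
print(f"Manual:    one crash per {1 / manual_rate:,.0f} miles")  # ~1.3 million

Same per-road-type crash rates in both cases, yet Autopilot looks nearly 3x safer purely because of where it gets used. That's the apples-to-oranges problem Tesla's own caveat is pointing at.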
 
https://youtu.be/Nvvhmc837Tw

This poster has some good video footage of A/P working as intended. He also cited one interesting statistic: A/P preventing 40 pedal mis-applications daily.

We're not at FSD yet, but the rate of improvement is getting us there.
 
cwerdna said:
^^^
BTW, you are aware of the misleading video from Tesla from Oct 2016, right?

https://www.tesla.com/blog/all-tesla-cars-being-produced-now-have-full-self-driving-hardware

Even if you don't agree with the assertions at https://dailykanban.com/2017/02/ca-dmv-report-sheds-new-light-misleading-tesla-autonomous-drive-video/, the delays did happen (https://insideevs.com/news/329266/teslas-new-product-announcement-pushed-back-to-wednesday-needs-refinement-says-musk/), and you can see Tesla's CA autonomous-testing disengagements for yourself at https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/disengagement_report_2016, besides what they reported for all the years at https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/testing. It sure fooled my college classmate: he got the impression that Tesla was much further ahead than they were, and we were both computer science majors who have worked in software for most of our professional careers.

The 2018 reports weren't as nicely formatted for Waymo (for example), so you can look at https://thelastdriverlicenseholder.com/2019/02/13/update-disengagement-reports-2018-final-results/ for rankings.
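
For what it's worth, the ranking in that post boils down to miles per disengagement, which is trivial to compute yourself (placeholder numbers below, not the actual report figures):

# Miles per disengagement, the metric behind the rankings linked above.
# The figures here are placeholders, not actual CA DMV report numbers.

reports = {
    "Company A": {"miles": 1_200_000, "disengagements": 110},
    "Company B": {"miles": 550, "disengagements": 180},
}

ranked = sorted(reports.items(),
                key=lambda kv: kv[1]["miles"] / kv[1]["disengagements"],
                reverse=True)
for name, r in ranked:
    print(f"{name}: {r['miles'] / r['disengagements']:,.1f} miles per disengagement")

Higher is better, though the raw mileage matters too: a fleet that logged only a few hundred test miles can't produce a meaningful number on this metric.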

This is on top of all his missed dates for a cross-country autonomous drive. I haven't verified all the dates at https://www.inverse.com/article/55141-elon-musk-autonomous-driving, but they look right. Example below:
Oct. 2016: Musk announces all cars will ship with enough cameras to enable FSD, and that the company will complete a cross country trip “all the way from L.A. to New York” by the end of 2017.
The DMV pulled the above reports, but you can still see the 2016 reports via https://web.archive.org/web/20180201185440/https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/disengagement_report_2016 and Tesla's at https://web.archive.org/web/20180811140504/https://www.dmv.ca.gov/portal/wcm/connect/f1873c87-4f21-4beb-b665-050cada6db7a/Tesla_disengage_report_2016.pdf?MOD=AJPERES.

Finally, news about that misleading 2016 video has been coming out recently; IIRC the NYT brought it up in a documentary within the past 2 years.

Tesla staged 2016 self-driving demo, says senior Autopilot engineer
https://arstechnica.com/cars/2023/01/tesla-staged-2016-self-driving-demo-says-senior-autopilot-engineer/

Tesla engineer testifies that 2016 video promoting self-driving was faked
https://techcrunch.com/2023/01/17/tesla-engineer-testifies-that-2016-video-promoting-self-driving-was-faked/

Tesla staged Autopilot demo video, says director of software
https://www.theverge.com/2023/1/17/23559294/tesla-autopilot-2016-video-pre-mapped-traffic-lights
 
"May cause crashes"

https://www.cnbc.com/2023/02/16/tesla-recalls-362758-vehicles-says-full-self-driving-beta-software-may-cause-crashes.html
 
Finally watched https://www.youtube.com/watch?v=V2u3dcH2VGM (only 6 minutes long) as it was embedded in https://www.carscoops.com/2023/08/new-footage-shows-tesla-on-autopilot-crashing-into-police-car-after-alerting-driver-150-times.

Article begins with "New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times
Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired".
 
Tesla Model 3 Owner Ignores Warnings, Drives Car Into Floodwater While Using FSD Beta
https://www.roadandtrack.com/news/a44869591/tesla-model-3-drives-into-flood-fsd/
 
cwerdna said:
Tesla Model 3 Owner Ignores Warnings, Drives Car Into Floodwater While Using FSD Beta
https://www.roadandtrack.com/news/a44869591/tesla-model-3-drives-into-flood-fsd/

Driver was an idiot.

This is what we're hoping FSD will eventually solve (AEB did reduce the severity of the accident, but I'd rather have no accident at all):
https://twitter.com/Teslasaveslives/status/1692916195194724494
 