Tesla's autopilot, on the road

I could imagine Tesla being a tempting target for the lawsuit crowd. They have money, Elon Musk could be portrayed as an arrogant, elitist CEO full of hubris, Tesla drivers are generally well off and some of them seem to be idiots, etc. etc. If a car on FSD ever kills someone it could get nasty quickly.
 
If some knucklehead riding in the back seat of his Tesla ever causes an accident with me, God help him.

No forgiveness for that kind of willful irresponsibility.
 
goldbrick said:
I could imagine Tesla being a tempting target for the lawsuit crowd. They have money, Elon Musk could be portrayed as an arrogant, elitist CEO full of hubris, Tesla drivers are generally well off and some of them seem to be idiots, etc. etc. If a car on FSD ever kills someone it could get nasty quickly.


Allegedly, a car on A/P has already killed a non-passenger, according to a front-page article in the NYT today, although if you want to read the whole thing it's behind a paywall:
'It Happened So Fast': Inside a Fatal Tesla Autopilot Accident

https://www.nytimes.com/2021/08/17/business/tesla-autopilot-accident.html

A quick skim while waiting in line at the grocery store says this happened in Florida in 2019. The Tesla driver claims he dropped his phone while on a call, bent down to get it, and took his eyes off the road; A/P failed to stop and crashed into a stopped car, killing the other driver (I think); not an occupant of the Tesla, in any case. I believe the article also said the road was closed, but I had to move on as my turn at the cashier arrived, so I didn't get any more details.

This is the first I'm hearing about this particular accident; AFAIA it's the first fatal one in the U.S. not involving a Tesla occupant, and far more important because of that. It can be argued that occupants of a Tesla who are injured or die in an A/P- or FSD-controlled crash have given implied consent to the risk. That argument can't be made here.

I'm trying to find out more; the article mentioned both the Tesla owner's and the deceased's name, but I've forgotten the latter.

[EDIT] Found some more detail on a law firm's website: https://zimmermanfrachtman.com/blog...ility-tesla-on-autopilot-during-deadly-crash/

The Tesla hit a stopped pickup which then hit a couple of pedestrians, killing one and injuring the other.
 
Senators urge FTC to investigate Tesla's Autopilot and self-driving claims
Sens. Markey and Blumenthal accused the company of 'misleading advertising and marketing.'
https://www.engadget.com/senators-ftc-tesla-autopilot-full-self-driving-192943873.html
 
^^^ Yeah, the FTC's failure to date to stop Tesla's false claims and misleading names ranks right up there with the NHTSA's failure to do their job re Tesla and other L2 ADAS systems. Tesla's behavior has been the most egregiously irresponsible.
 
Elon Musk says Tesla’s latest beta self-driving software is ‘not great’
https://www.cnbc.com/2021/08/23/elon-musk-says-tesla-fsd-beta-9point2-software-is-not-great.html
 
cwerdna said:
Elon Musk says Tesla’s latest beta self-driving software is ‘not great’
https://www.cnbc.com/2021/08/23/elon-musk-says-tesla-fsd-beta-9point2-software-is-not-great.html

This type of article is so predictable, even the TSLA bulls expect it (skip to the 7:20 mark): https://www.youtube.com/watch?v=B3WC5flvqEw&ab_channel=TeslaDaily

FSD v9.2 might be "not great", but it still beats the pants off of what else is available out there.
 
Until it fails and someone is badly injured or dies. Calling something that can't reliably distinguish a stopped vehicle from open road "Full Self Driving" is criminal.
 
LeftieBiker said:
Until it fails and someone is badly injured or dies. Calling something that can't reliably distinguish a stopped vehicle from open road "Full Self Driving" is criminal.

Just in case it wasn't clear: FSD Beta is NOT in the hands of the general public yet. I'm in the advanced software access pool, have paid for FSD already, and do NOT have access to FSD Beta. I only have access to the FSD feature set: navigate-on-autopilot, summon, autopark, and stop-sign control. The general public hasn't been given access to FSD yet. All deaths so far have been due to the misuse of Autopilot.

All the talk about general AI is about exactly this kind of problem: being able to distinguish stopped vehicles from shadows and sky.
 
I'm sure many have heard about one of Toyota's autonomous vehicles hitting a pedestrian during the Paralympics (https://finance.yahoo.com/news/toyota-halts-self-driving-e-125310719.html).

But did you manage to read to this part:
"The vehicle had stopped at a T junction and was about to turn under manual control of the operator, who was using the vehicle's joystick control, when the vehicle hit the athlete going at around 1 or 2 kilometres an hour, Toyoda said."

That's not evidence that autonomous vehicles aren't ready (if anything, that part is on Toyota's engineers); it's evidence that human beings are way too error-prone.
 
This was actually mentioned on local news even though it happened on the other side of the country.

A Tesla Model 3 hit a parked police car in Orlando, driver said she was ‘in Autopilot’
https://www.cnbc.com/2021/08/28/tesla-model-3-hit-a-parked-police-car-in-orlando-driver-said-she-was-in-autopilot.html

SEE: Tesla on autopilot crashes into FHP cruiser on I-4
https://www.wftv.com/news/local/tesla-autopilot-crashes-into-fhp-cruiser-i-4/ZAQDPK43TFCXXATCTSBNKQ46OM/
 
While we don't have FSD, we have the "regular" Autopilot. I can say that so far it has saved me from an accident at least twice. Once, a vehicle 3 or 4 cars in front of me slowed down and caused emergency braking by the other cars ahead, and while I was paying attention, as you always should be, the car started braking a split second before I realized what was unfolding in front of me. I don't think I would have hit the car in front of me, but I'm quite sure I would have been rear-ended; because the car started to slow down sooner, it wasn't a slam-on-the-brakes situation.

The second time, I would have been hit by someone driving 100 mph and swerving across lanes; he would have totally side-swiped us, but the Tesla slammed on the brakes hard and he just missed the front right corner of the car. I never noticed him coming, as he was in the far right of four lanes, cutting across all four to the far left lane.

So while I agree that FSD and Autopilot aren't perfect, if you are driving they are an extra set of eyes. Just looking at the data for vehicle crashes, including all the ones in the press, Teslas are still significantly less likely to be in an accident with Autopilot engaged than without.

Also, 7 of the 11 vehicle crashes that NHTSA is looking into were between midnight and 4 am and also involved drivers receiving tickets for driving under the influence. I wonder if they would be alive today had they been driving a "regular" car. Again, I'm not suggesting anyone drive under the influence. And there might be something to the idea that those drivers thought it would be fine to let the car do its thing, which it obviously is not.
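
For anyone who wants to see what that kind of "less likely with Autopilot" comparison actually computes, here's a back-of-the-envelope sketch. The numbers below are invented for illustration, NOT Tesla's published figures, and the caveat at the end is the usual objection to raw comparisons like this.

```python
# Back-of-the-envelope crash-rate comparison. All numbers are
# invented for illustration; they are NOT Tesla's published figures.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize a raw crash count by miles driven."""
    return crashes / (miles / 1_000_000)

# Hypothetical: 400 crashes over 2 billion Autopilot miles vs.
# 3,000 crashes over 4 billion non-Autopilot miles.
ap_rate = crashes_per_million_miles(400, 2_000_000_000)
no_ap_rate = crashes_per_million_miles(3_000, 4_000_000_000)

print(f"With Autopilot:    {ap_rate:.2f} crashes per million miles")
print(f"Without Autopilot: {no_ap_rate:.2f} crashes per million miles")

# Caveat: Autopilot is mostly engaged on highways, which have fewer
# crashes per mile than city streets to begin with, so a raw
# comparison like this can overstate Autopilot's benefit.
```

Even with made-up numbers, the point is that "less likely" only means something once you normalize by miles driven, and ideally by road type too.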
 
I totally agree. I see it as another improvement in vehicle safety, be it smart cruise or whatever. I don't think of it as a replacement for me in any way, shape or form; it's just another driving aid.

After recently driving one way from Green Bay through Milwaukee to Chicago in a conventional car, and back right away in the Tesla, I felt so much less stressed and fatigued in the Tesla. I can't put a price on how much that helps me when traveling in stop & go traffic, but everyone is different.
 
cwerdna said:
This was actually mentioned on local news even though it happened on the other side of the country.

A Tesla Model 3 hit a parked police car in Orlando, driver said she was ‘in Autopilot’
https://www.cnbc.com/2021/08/28/tesla-model-3-hit-a-parked-police-car-in-orlando-driver-said-she-was-in-autopilot.html

SEE: Tesla on autopilot crashes into FHP cruiser on I-4
https://www.wftv.com/news/local/tesla-autopilot-crashes-into-fhp-cruiser-i-4/ZAQDPK43TFCXXATCTSBNKQ46OM/

Stopped vehicles in the middle of the road are exactly the situations that users of Autopilot (or ANY manufacturer's ADAS system) are supposed to be on the lookout for. Anyone getting into such an accident was clearly NOT monitoring the road.
 