Tesla's autopilot, on the road

My Nissan Leaf Forum

Oils4AsphaultOnly said:
GRA said:
ABG:
Consumer Reports calls Tesla Summon a 'glitchy' 'science experiment'
Consumers are participating 'in a kind of science experiment'
https://www.autoblog.com/2019/10/09/tesla-summon-glitchy-consumer-reports/

Although I'm not a fan of Dana Hull, at least she gets the point in this article: https://www.bloomberg.com/news/features/2019-10-09/tesla-s-autopilot-could-save-the-lives-of-millions-but-it-will-kill-some-people-first


I thought that was a good, well-balanced article, and it essentially comes down to an ethical and societal question. Is it acceptable for a company rather than the society at large to decide to test a product that can potentially kill a member of the public who hasn't given their consent to be part of the test? Companies like Waymo and individuals like me say it's not, and Tesla says it is.
 
And these are the situations where you'd be glad you had Smart Summon, even if you leave it unused 99.9% of the time: https://www.reddit.com/r/teslamotors/comments/djatmv/i_freed_my_tesla_from_a_locked_garage_with_smart/
 
Traffic cones are visualized in the latest autopilot update (not on my cars yet) :
 
Tesla in 'Auto-Pilot' Hits Back of Police Car on I-95 in Norwalk: State Police
https://www.nbcconnecticut.com/news/local/Tesla-in-Auto-Pilot-Hits-Back-of-Police-Car-on-I-95-in-Norwalk-State-Police-565925101.html

https://www.facebook.com/connecticutstatepolice/posts/2559517627610906 has a post from the CT state police on the above.
 
The Tesla driver said the vehicle was on Autopilot and he was checking on his dog in the back seat prior to the crash. The driver was issued a misdemeanor summons for reckless driving; no one involved was seriously injured.
Pretty clear this driver should lose their license for a while.
 
C&D:
NHTSA Investigating Indiana Crash Where Tesla Model 3 Hit Fire Truck
The driver rear-ended a fire truck in late December, and as in other cases, Tesla's Autopilot is suspected (but not confirmed) to be involved.
https://www.caranddriver.com/news/a30474504/nhtsa-indiana-tesla-model-3-crash-fire-truck/


The National Highway Traffic Safety Administration (NHTSA) is now taking up two fatal Tesla crashes potentially involving Autopilot, both of which coincidentally occurred on the same day, December 29. The first, in Gardena, California, involved a Model S running through a stoplight and crashing into the side of a Honda Civic, killing both occupants in that car. This latest investigation, first brought to light by Reuters earlier this week, involves a married couple in a Model 3 who collided into the rear of a fire truck on Interstate 70 in Cloverdale, Indiana, just west of Indianapolis. According to the Indianapolis Star, Derrick Monet, 25, hit the truck—which had its lights flashing while parked in the left lane—at a high rate of speed. His wife, Jenna Monet, 23, was in the front passenger seat. Both were taken to a local hospital where she was pronounced dead. The State Police report says the driver "failed to observe the emergency vehicle positioned in the passing lane of I-70, running into the back of the parked fire truck."


The one in Indiana with the fire truck sounds like another case of AP/AEB not recognizing a stopped vehicle as a legitimate target. I think this is the second fatality in such a crash (the first was in China). The one in Gardena could be human error or AP. If it was AP, this would be the first fatal AP accident to kill a non-occupant of the Tesla. If it proves to be so, expect lots of news coverage and political/government regulatory response.
 
cwerdna said:
Tesla Model 3 Crash: Narrow UK Roads Are Not Autopilot Friendly
https://insideevs.com/news/393351/video-uk-tesla-autopilot-crash/

That idiot driver deserved that for several reasons. It's not a narrow road; it's a road with cars intruding into the lanes and a driver not using AP correctly. More FUD for the masses. AP is NOT FSD, and any driver who did not take over long before that got what they deserved. What an idiot.
 
EVDRIVER said:
cwerdna said:
Tesla Model 3 Crash: Narrow UK Roads Are Not Autopilot Friendly
https://insideevs.com/news/393351/video-uk-tesla-autopilot-crash/

That idiot driver deserved that for several reasons. It's not a narrow road; it's a road with cars intruding into the lanes and a driver not using AP correctly. More FUD for the masses. AP is NOT FSD, and any driver who did not take over long before that got what they deserved. What an idiot.

EVdriver, +1. Total FUD just because it’s a Tesla. Was the driver a 12-year-old?
 
News flash: fire trucks fall from the sky and you don't have time to react. I never look at the road when driving, even for huge objects far in front of me. I wish there were a feature on AP that drove all these people off a cliff so they're no longer on the road.
 
EVDRIVER said:
... AP is NOT FSD ...

True. Which is why it's tragic that Tesla insists on calling it "Autopilot". The general public are not pilots and do not understand the limitations, human factors, and responsibilities involved with the aircraft systems from which they appropriated the name.

Personally, until such time as full-time autonomy is approved, I don't think these systems should be allowed to have steering inputs at all except in case where an emergency situation has been detected and the driver is not responding appropriately. And even then the driver should be able to override the device's inputs.
 
Nubo said:
EVDRIVER said:
... AP is NOT FSD ...

True. Which is why it's tragic that Tesla insists on calling it "Autopilot". The general public are not pilots and do not understand the limitations, human factors, and responsibilities involved with the aircraft systems from which they appropriated the name.

Personally, until such time as full-time autonomy is approved, I don't think these systems should be allowed to have steering inputs at all except in case where an emergency situation has been detected and the driver is not responding appropriately. And even then the driver should be able to override the device's inputs.

No, the general public ignores common sense. It does not matter what it is called; they would know the limitations if they read.
 
EVDRIVER said:
Nubo said:
EVDRIVER said:
... AP is NOT FSD ...

True. Which is why it's tragic that Tesla insists on calling it "Autopilot". The general public are not pilots and do not understand the limitations, human factors, and responsibilities involved with the aircraft systems from which they appropriated the name.

Personally, until such time as full-time autonomy is approved, I don't think these systems should be allowed to have steering inputs at all except in case where an emergency situation has been detected and the driver is not responding appropriately. And even then the driver should be able to override the device's inputs.

No, the general public ignores common sense. It does not matter what it is called; they would know the limitations if they read.

Safety-critical systems must take human factors into account. Even such highly proficient and trained people as professional pilots can (and do) fall victim to design-induced failure scenarios when human factors are not thoughtfully considered.
 
Nubo said:
EVDRIVER said:
Nubo said:
True. Which is why it's tragic that Tesla insists on calling it "Autopilot". The general public are not pilots and do not understand the limitations, human factors, and responsibilities involved with the aircraft systems from which they appropriated the name.

Personally, until such time as full-time autonomy is approved, I don't think these systems should be allowed to have steering inputs at all except in case where an emergency situation has been detected and the driver is not responding appropriately. And even then the driver should be able to override the device's inputs.

No, the general public ignores common sense. It does not matter what it is called; they would know the limitations if they read.

Safety-critical systems must take human factors into account. Even such highly proficient and trained people as professional pilots can (and do) fall victim to design-induced failure scenarios when human factors are not thoughtfully considered.

+1000.
 