Tesla's autopilot, on the road


edatoakrun

Maybe we should give Tesla's autopilot semi-autonomous feature its own thread?

And stop posting on other threads, including:

Autonomous driving LEAF, and the implications for BEVs.

http://www.mynissanleaf.com/viewtopic.php?f=12&t=10233&start=210

The LA Times reports on the legal implications of the first Autopilot fatality, disclosed last week:

Tesla's 'autopilot mode' puts it at risk for liability in crashes

By rolling out self-driving technology to consumers more aggressively than its competitors, Tesla Motors secured a spot in the forefront of a coming industry.

But that strategy could expose the company to a risk it has sought to avoid: liability in crashes.

Tesla in 2015 activated its autopilot mode, which automates steering, braking and lane switching. Tesla asserts the technology doesn’t shift blame for accidents from the driver to the company.

But Google, Zoox and other firms seeking to develop autonomous driving software say it’s dangerous to expect people in the driver’s seat to exercise any responsibility. Drivers get lulled into acting like passengers after a few minutes of the car doing most of the work, the companies say, so relying on them to suddenly brake when their cars fail to spot a hazard isn’t a safe bet...

Such a concern could undermine Tesla, whose autopilot feature is central to a fatal-accident investigation launched last week by federal regulators...

If the accident happened because the software was inadequate (because it couldn’t spot the white vehicle on a light backdrop) and proper testing would have found the flaw, Tesla could be on the hook, said Jon Tisdale, a general partner in Gilbert, Kelly, Crowley & Jennett’s Los Angeles office...

Brown’s family has said through attorneys that they hope lessons from his crash “will trigger further innovation which enhances the safety of everyone on the roadways.”

A decision on whether to file a lawsuit isn’t likely until the federal inquiry is completed, and the family’s focus remains on mourning, the attorneys said.
http://www.latimes.com/business/technology/la-fi-tn-tesla-liabilty-20160705-snap-story.html
 
Tesla specifically stated, long before this, that this type of accident was possible. They also make sure people acknowledge that they are using a beta version of the software. If you want to go to sleep, fine. But if you die, it's your own fault.

I hope Americans don't kill this new tech by over-litigating personal responsibility. Sometimes it is not someone else's fault, no matter how much you want it to be.
 
2k1Toaster said:
Tesla specifically stated, long before this, that this type of accident was possible. They also make sure people acknowledge that they are using a beta version of the software. If you want to go to sleep, fine. But if you die, it's your own fault.

I hope Americans don't kill this new tech by over-litigating personal responsibility. Sometimes it is not someone else's fault, no matter how much you want it to be.
I hope companies don't oversell technical capabilities before they're ready, so that people don't put themselves at risk by being lulled into believing the technology is more capable than it is. Doing so risks stalling progress on those capabilities, because people will become scared of them.
 
dm33 said:
2k1Toaster said:
Tesla specifically stated, long before this, that this type of accident was possible. They also make sure people acknowledge that they are using a beta version of the software. If you want to go to sleep, fine. But if you die, it's your own fault.

I hope Americans don't kill this new tech by over-litigating personal responsibility. Sometimes it is not someone else's fault, no matter how much you want it to be.
I hope companies don't oversell technical capabilities before they're ready, so that people don't put themselves at risk by being lulled into believing the technology is more capable than it is. Doing so risks stalling progress on those capabilities, because people will become scared of them.

They didn't oversell anything. They clearly say that you, as the driver, must drive: hands on the wheel at all times, ready to take over at all times. How much clearer can they be?

We need to stop protecting stupid.
 

Feds investigate Autopilot usage in second recent Tesla crash


U.S. auto-safety regulators are scrutinizing the Pennsylvania crash of a Tesla Motors Inc. Model X to determine whether the sport-utility vehicle’s Autopilot system was in use, days after starting a formal probe of the Silicon Valley company’s technology.

The National Highway Traffic Safety Administration is “collecting information” from the electric-car maker, state police and the driver of the Model X about the July 1 crash, the agency said in a statement Wednesday. Regulators are attempting to “determine whether automated functions were in use at the time of the crash,” the agency said.

Tesla said it doesn’t currently believe Autopilot was being used at the time of the crash. The Palo Alto, Calif., auto maker said it hasn’t been able to reach the driver and hasn’t received data about the state of the vehicle’s controls before the collision, possibly because of a damaged antenna.

The Detroit Free Press reported that the Tesla SUV hit a guardrail and then crashed into a concrete median before rolling on its roof on the Pennsylvania Turnpike more than 100 miles east of Pittsburgh. The driver and passenger survived the crash, the newspaper reported...
http://www.marketwatch.com/story/feds-investigate-autopilot-usage-in-second-recent-tesla-crash-2016-07-06
 
Motor Trend recently did a comparison of a number of these driver-assist packages.
Quite detailed, with lots of quantifiable data; a pretty good read.

http://www.motortrend.com/news/testing-semi-autonomous-cars-tesla-cadillac-hyundai-mercedes/
 
Currently, the biggest "problem" with Tesla's AutoPilot, as evidenced by that Motor Trend comparison, is that it's so much better than competing manufacturers' systems. The other OEMs will have the same "problem" as their tech improves.

The issue is that many AutoPilot users disengage themselves from driving, thus leaving themselves at the mercy of a system that's still far from perfect. Sure, AutoPilot isn't "supposed" to be used this way, but unfortunately we know that it is.

Arguably, though, this is not much of an issue if the use of AutoPilot results in lower accident and fatality rates than not using it. I haven't seen sufficient data to come to a conclusion here. It seems that while AutoPilot is, in some key areas, qualitatively less capable than a human driver, the system never becomes drowsy, distracted, or fatigued. It's also improving with time.
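To illustrate why the data is insufficient: with only one fatality on record, the per-mile rate is statistically almost unconstrained. A minimal sketch, assuming the roughly 130 million Autopilot miles and one-fatality-per-94-million-miles US average that Tesla cited publicly at the time (the exact-Poisson interval is just one reasonable way to frame the uncertainty):

```python
from scipy.stats import chi2

def poisson_ci(k: int, alpha: float = 0.05) -> tuple[float, float]:
    """Exact (Garwood) confidence interval for a Poisson count k."""
    lo = 0.0 if k == 0 else chi2.ppf(alpha / 2, 2 * k) / 2
    hi = chi2.ppf(1 - alpha / 2, 2 * k + 2) / 2
    return lo, hi

AUTOPILOT_MILES = 130e6       # miles driven on Autopilot, per Tesla's statement
US_MILES_PER_FATALITY = 94e6  # US average, per the same statement

lo, hi = poisson_ci(1)  # one fatality observed
print(f"observed: 1 fatality per {AUTOPILOT_MILES:,.0f} miles")
print(f"95% CI:   1 per {AUTOPILOT_MILES / hi:,.0f} "
      f"to 1 per {AUTOPILOT_MILES / lo:,.0f} miles")
# -> roughly 1 per 23 million to 1 per 5.1 billion miles: an interval so
#    wide it easily straddles the ~1 per 94 million US average, so one
#    event can't show Autopilot is safer (or more dangerous) than a human.
```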

Given how quickly this technology is evolving, I hope that our federal regulators maintain a light touch, at least for now.
 
abasile said:
Currently, the biggest "problem" with Tesla's AutoPilot, as evidenced by that Motor Trend comparison, is that it's so much better than competing manufacturers' systems. The other OEMs will have the same "problem" as their tech improves.

The issue is that many AutoPilot users disengage themselves from driving, thus leaving themselves at the mercy of a system that's still far from perfect. Sure, AutoPilot isn't "supposed" to be used this way, but unfortunately we know that it is.
This LA Times article puts it very well:
LA Times said:
The problem with level 2, critics say, is that it’s just autonomous enough to give drivers the false sense that the vehicle can drive itself, which can lead to careless behavior.
 
Less-than-complimentary report on how TSLA is handling its PR response to questions of autopilot safety:

Elon Musk Twitter rant a 'case study' in how not to handle a crisis, experts say

Tesla’s behaviour in the aftermath of news that a driver died while using the car’s autopilot feature was criticised by crisis management experts as ‘error-filled’

Joshua Brown was killed in Florida in May after neither he nor the car’s autopilot system detected a truck crossing the highway ahead of him...

Jonathan Bernstein, president of the consulting firm Bernstein Crisis Management, said Musk’s behaviour was a perfect case study in the wrong way to handle this sort of crisis.

“What a CEO should do when there’s a death associated with one of his company’s products is respond, first and foremost, with compassion, and then with words that express competence and confidence,” Bernstein said.

“Musk seems to prefer angry defensiveness.”

“Quoting statistics that explain why the death isn’t so bad in the big picture has been proven time and time again to be quite ineffective in influencing public opinion,” he added...

According to Bernstein, resorting – as Musk did – to statistics to try to put an accident or malfunction which resulted in a death into a wider context, however well-meaning, was ill-judged. “I haven’t seen anybody foolish enough to try the statistics approach in a long time,” he said.

Asked what advice he would give to Musk, Bernstein said that he should “take a step back, take a deep breath, and practice delivering a message that communicates compassion, confidence, and competence”.

“And if you can’t do that, keep your mouth shut,” he added.
https://www.theguardian.com/technology/2016/jul/07/tesla-elon-musk-autopilot-death-crisis-management?CMP=share_btn_tw
 
A Green Car Reports article, which I generally agree with. I just wish we could trust all drivers to behave in a similar fashion to the article's author (Joshua Brown obviously didn't):
Tesla Autopilot crash: what one Model S owner has to say
http://www.greencarreports.com/news/1104892_tesla-autopilot-crash-what-one-model-s-owner-has-to-say

Personally, while I'd want any new car I buy to have AEB (automatic emergency braking), until far better autonomy is available I wouldn't want automatic lane keeping, and I've even become a bit less enthusiastic about ACC (adaptive cruise control). I use conventional cruise control regularly, but I know that if I stop paying attention I'll rear-end anyone I overtake in the same lane, and that knowledge (along with needing to steer) keeps me mentally engaged so that I can tweak the speed or cancel cruise as necessary.

If I had ACC, my attention might wander a bit, as I'd expect the car not to plow into the person in front of me, since it had correctly prevented that from happening the previous 1,000 times. But what if this instance is the exception to the rule, and I have to be the one to recognize the situation and take over at short notice? If the car allows a great enough following distance to be selected, I'd probably have enough time to get my head fully back into the game, assess the situation and act appropriately; but if it doesn't, I may be eating someone's bumper. That's the insidious problem with systems that work in 'most' situations, 'most' of the time.
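To put rough numbers on that "enough time" question, here's a minimal sketch. It assumes the classic worst case, where the car being tracked swerves away to reveal a stopped vehicle at the current headway distance, and a dry-pavement braking limit of about 0.8 g; both are illustrative assumptions, not measurements of any real ACC system:

```python
# If the ACC target swerves to reveal a stopped vehicle, the obstacle sits
# roughly (speed * time_gap) metres ahead. Stopping needs a reaction
# distance v*t_react plus a braking distance v^2/(2a), so the spare
# reaction time is t_react = time_gap - v/(2a).

MPH_TO_MS = 0.44704
G = 9.81

def spare_reaction_time(speed_mph: float, gap_s: float,
                        decel_g: float = 0.8) -> float:
    """Seconds left to notice the hazard and reach full braking before a
    collision becomes unavoidable (0.8 g is near a dry-road maximum)."""
    v = speed_mph * MPH_TO_MS
    return gap_s - v / (2 * decel_g * G)

for gap in (1.0, 2.0, 3.0):
    t = spare_reaction_time(70, gap)
    verdict = f"~{t:.1f}s to react" if t > 0 else "a stop is no longer possible"
    print(f"70 mph with a {gap:.0f}s gap: {verdict}")
# 1s gap: no stop possible; 2s gap: ~0.0s; 3s gap: ~1.0s -- and one second
# is far less than a disengaged driver needs to reassess and act.
```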
 
Two stories from Automotive News lay out the significance of the NTSB's Autopilot investigation.

Odd, isn't it, that while many seem inherently fearful of self-driving cars before they experience them, the existential question once drivers do experience autonomy is how to convince those same drivers that four-out-of-five-times reliability is not good enough to turn their lives over to their autopilots...

NTSB to scrutinize driver automation with probe of Tesla crash

WASHINGTON -- For years, U.S. investigators have been calling for more automation on motor vehicles, such as sensors that slam on the brakes to prevent a crash.

At the same time, the National Transportation Safety Board, in its probes of transportation mishaps, has warned that such devices may also have a downside: the technology can confuse operators if it's poorly designed, or lead to complacency that breeds its own hazards.

Now, for the first time in a highway accident, those two potentially contradictory themes will be put to the test as the NTSB opens an investigation into a fatal accident involving a Tesla Motors Inc. sedan that was driving with a feature called Autopilot enabled...
http://www.autonews.com/article/20160708/OEM11/307089859/ntsb-to-scrutinize-driver-automation-with-probe-of-tesla-crash

First real-world look at how drivers interact with autonomous technology

...NHTSA's inquiry is likely to focus on the ability of Autopilot's cameras and sensors to detect and react to road hazards. Even if the system performed as it was designed to, experts say, the agency could find that it poses an unreasonable safety risk as deployed.

Tesla rolled out its Autopilot system in October through a software update, with key features still in the beta, or late testing, stage...

Tesla warned: "The driver is still responsible for, and ultimately in control of, the car."

The extent to which drivers heed that warning is unclear. Missy Cummings, an engineering professor and human-factors expert at Duke University, says humans tend to show "automation bias," a trust that automated systems can handle all situations when they work 80 percent of the time.

The Model S crash that killed driver Joshua Brown, a former Navy SEAL and Tesla fan, was "not an isolated incident," Cummings said, citing examples of other reported crashes involving Tesla cars that had Autopilot functions activated...
http://www.autonews.com/article/20160709/OEM11/307119915/first-real-world-look-at-how-drivers-interact-with-autonomous
 
They seem to have already made up their minds...

Consumer Watchdog Calls On Elon Musk To Disable Tesla’s Autopilot; Says Company Must Pledge To Be Liable For Self-Driving Failures If Feature Returns


7/8/2016
http://www.consumerwatchdog.org/newsrelease/consumer-watchdog-calls-elon-musk-disable-tesla%E2%80%99s-autopilot-says-company-must-pledge-be-

July 7, 2016

Mr. Elon Musk
Chairman, Product Architect and CEO
Tesla Motors Inc.
3500 Deer Creek Rd.
Palo Alto, CA 94304

Dear Mr. Musk:

We are writing to express Consumer Watchdog’s concerns about Tesla Motors’ woefully inadequate response to the tragic May 7 fatal crash in Florida of a Tesla Model S being controlled by the autopilot feature, and the emerging pattern of blaming victims involved in the crashes while using the feature. Our first concern is your inexplicable delay in announcing the crash. You made a public acknowledgement on June 30, only after the National Highway Traffic Safety Administration announced it was investigating. Such a delay, when your customers continued to drive on public highways relying on autopilot and believing it to be safe, is inexcusable...

Tesla is rushing self-driving technologies to the highways prematurely. However, as the crashes demonstrate, autopilot isn’t safe, and you should disable it immediately. If autopilot can ultimately be shown to meet safety standards and is then redeployed, you must pledge to be liable if anything goes wrong when the self-driving system is engaged.

Sincerely,

Jamie Court, President
Carmen Balber, Executive Director
John M. Simpson, Privacy Project Director
http://www.consumerwatchdog.org/resources/ltrmusk070716.pdf
 
evnow said:
Similar features by other companies work even worse - and aren't in Beta.
Does that include the hands-off feature that allows people to doze off? That makes for a far more dangerous, and far more useful, system.
 
edatoakrun said:
Odd, isn't it, that while many seem inherently fearful of self-driving cars before they experience them, the existential question once drivers do experience autonomy is how to convince those same drivers that four-out-of-five-times reliability is not good enough to turn their lives over to their autopilots...
We (us humans) already turn over our lives to autopilots....

Traffic lights
Elevators
Fire alarm systems
Smoke alarms
An airliner with a pilot still has "automation" between the pilot and the physical parts

I can think of lots more. How about a list?
 
jeffthewalker said:
edatoakrun said:
Odd, isn't it, that while many seem inherently fearful of self-driving cars before they experience them, the existential question once drivers do experience autonomy is how to convince those same drivers that four-out-of-five-times reliability is not good enough to turn their lives over to their autopilots...

We (us humans) already turn over our lives to autopilots....

Traffic lights
Elevators
Fire alarm systems
Smoke alarms
An airliner with a pilot still has "automation" between the pilot and the physical parts

I can think of lots more. How about a list?
The first four systems are orders of magnitude less complex than a level 3 autonomous vehicle, let alone a level 4 or 5.

(http://www.techrepublic.com/article/autonomous-driving-levels-0-to-5-understanding-the-differences/ and http://www.nhtsa.gov/About+NHTSA/Press+Releases/U.S.+Department+of+Transportation+Releases+Policy+on+Automated+Vehicle+Development explain the levels.)

A plane's autopilot also has FAR less to deal with: planes are much further apart than cars, there are no physical lane barriers, and it doesn't need to worry about pedestrians, traffic lights, traffic signs, etc. There's also an air traffic control system with standards, and controllers and pilots undergo extensive training; a pilot's license is much harder to get than a driver's license.

Computer vision is a very difficult problem (most folks with a computer science background know this), and there are all sorts of other challenges, e.g. interpreting traffic signals, and recognizing pedestrians, bicyclists, and gestures by police (including recognizing that they're police vs. a kook on the road) that might contradict traffic lights. In addition, you need reliable distance measurements.
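To illustrate just the distance-measurement piece: the textbook stereo-camera depth relation shows the depth error growing with the square of range, so a one-pixel matching error that's harmless at 10 m is enormous at 100 m. The focal length and baseline below are made-up but plausible values:

```python
# Stereo depth: Z = f * B / d, with focal length f (pixels), camera
# baseline B (metres), and disparity d (pixels). Differentiating gives
# dZ ~= (Z**2 / (f * B)) * dd, so depth error grows with the square of range.

F_PX = 1000.0      # focal length in pixels (assumed)
BASELINE_M = 0.3   # spacing between the two cameras (assumed)

def depth_error(z_m: float, disparity_err_px: float = 1.0) -> float:
    """Approximate range error caused by a disparity-matching error."""
    return (z_m ** 2) / (F_PX * BASELINE_M) * disparity_err_px

for z in (10, 50, 100):
    true_disparity = F_PX * BASELINE_M / z  # pixels of offset between views
    print(f"at {z:3d} m: disparity is only {true_disparity:4.1f} px, and a "
          f"1-px match error means ~{depth_error(z):.1f} m of range error")
# at 10 m: ~0.3 m; at 50 m: ~8.3 m; at 100 m: ~33 m -- part of why reliable
# distance measurement at highway ranges is hard for cameras alone.
```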

See the IEEE videos at http://www.mynissanleaf.com/viewtopic.php?f=12&t=6338 or maybe the video in my 2nd post there.

As I posted elsewhere, with a few edits here:

There are so many questions and implications as cars become more autonomous, if not totally. Examples:
- who is liable in the event of an accident, or when traffic laws are broken?
- will drivers be able to react quickly enough and take over if the system screws up or doesn't know what to do?
- if there's a no-win situation where someone has to die, who should the car kill? The driver or the others? What if the only way to save a busload of people is for the autonomous vehicle to send the car off a cliff, killing the driver and its passengers?
- will drivers and others around them trust the technology enough?
- will driving skills atrophy so much that drivers will be unable to drive properly without aids, or to take over when needed?
- how will autonomous vehicles inform other vehicles and pedestrians as to their intent? The Nissan IDS Concept video at ~2:25 https://www.youtube.com/watch?v=h-TLo86K7Ck shows examples of indicators and signage.
- should drivers be able to take over control at all or should there be no steering wheels or pedals or should they be disabled/retracted when in auto-pilot mode?
 
Not really an argument for or against, but some of you may recall a lawsuit about a decade ago: an elderly couple bought an RV, and set out on a vacation trip. After getting on the freeway, they set the cruise control, and...both went into the back for a cup of coffee. The RV crashed, of course, and they sued, arguing that the salesman hadn't adequately explained how the cruise control system worked, and what its limitations were.
 
cwerdna said:
jeffthewalker said:
edatoakrun said:
Odd, isn't it, that while many seem inherently fearful of self-driving cars before they experience them, the existential question once drivers do experience autonomy is how to convince those same drivers that four-out-of-five-times reliability is not good enough to turn their lives over to their autopilots...

We (us humans) already turn over our lives to autopilots....
The first four systems are orders of magnitude less complex than a level 3 autonomous vehicle, let alone a level 4 or 5.

I agree 100% with every point you make.

I quoted edatoakrun's "how do you convince the same drivers..." and gave a few simple examples of cases where humans already accept many levels of automation.

I understand tech complexity.

Pilot's license
Amateur radio license
Engineer (retired)
Microprocessor control programmer (still going)
Lay psychologist :)
 
LeftieBiker said:
Not really an argument for or against, but some of you may recall a lawsuit about a decade ago: an elderly couple bought an RV, and set out on a vacation trip. After getting on the freeway, they set the cruise control, and...both went into the back for a cup of coffee. The RV crashed, of course, and they sued, arguing that the salesman hadn't adequately explained how the cruise control system worked, and what its limitations were.
That one has sure made the rounds! http://www.snopes.com/autos/techno/cruise.asp
 