Tesla's autopilot, on the road

baustin said:
Which will likely disable the AutoPilot functions...
In a car-share vehicle that may be true (assuming you can write code to do that). But in a privately-owned car, I doubt they could get away with that. Even if they can, that just provides an opportunity for competitors who offer camera-free cabins. Most people simply won't care, and will ignore them just as they ignore security cameras now. A business opportunity awaits, to serve those that do care.
 
GRA said:
baustin said:
Which will likely disable the AutoPilot functions...
In a car-share vehicle that may be true (assuming you can write code to do that). But in a privately-owned car, I doubt they could get away with that. Even if they can, that just provides an opportunity for competitors who offer camera-free cabins. Most people simply won't care, and will ignore them just as they ignore security cameras now. A business opportunity awaits, to serve those that do care.

If Tesla is going to install a camera to monitor the driver while AutoPilot is active, then I would imagine that they would code it to either disable or limit the AutoPilot functions if the camera gets no image. If they are not going to do that, then what is the point of installing a camera to monitor the driver?
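
If they do wire it up that way, the gating logic itself would be trivial; something like this sketch (entirely hypothetical on my part; the names and the disable-vs-limit policy are my inventions, not anything Tesla has announced):

Code:
# Hypothetical sketch of camera-gated AutoPilot availability.
# Nothing here comes from Tesla; it just illustrates the idea above.

from enum import Enum, auto

class ApMode(Enum):
    FULL = auto()      # all driver-assist features available
    LIMITED = auto()   # e.g. reduced max speed, no auto lane changes
    DISABLED = auto()  # manual driving only

def autopilot_mode(camera_has_image: bool, driver_visible: bool) -> ApMode:
    """Pick an AutoPilot mode based on the cabin camera's output."""
    if not camera_has_image:
        # Lens covered, disconnected, or failed: no monitoring, no AP.
        return ApMode.DISABLED
    if not driver_visible:
        # Camera works but sees no driver; degrade rather than shut down.
        return ApMode.LIMITED
    return ApMode.FULL

The code is the easy part; the hard parts are choosing the policy and making the "driver visible" check resistant to spoofing.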
 
baustin said:
GRA said:
baustin said:
Which will likely disable the AutoPilot functions...
In a car-share vehicle that may be true (assuming you can write code to do that). But in a privately-owned car, I doubt they could get away with that. Even if they can, that just provides an opportunity for competitors who offer camera-free cabins. Most people simply won't care, and will ignore them just as they ignore security cameras now. A business opportunity awaits, to serve those that do care.

If Tesla is going to install a camera to monitor the driver while AutoPilot is active, then I would imagine that they would code it to either disable or limit the AutoPilot functions if the camera gets no image. If they are not going to do that, then what is the point of installing a camera to monitor the driver?
What's to stop someone from taping a photo (of Elon Musk or J.B. Straubel, if they wish) in front of the camera lens? :D I can see plenty of ways to game such a system, but maybe they can make it foolproof. To me though, the whole issue simply goes away if the cars are fully autonomous and the company assumes responsibility for any accidents the car causes while it's driving itself, which is what the standard should be. Then, who cares what the occupants are doing inside the car? Monitoring cameras used as above serve solely as a means of shifting the blame when a car is semi-autonomous. Monitoring a driver for signs of drowsiness is another matter, but there are other indicators of that which don't require a camera.
 
Reading through these last few posts, my attention is drawn to the little "EYE" on the top of my laptop. How many of you are concerned with someone remotely watching you through that camera? How many of you have covered that EYE to SPY? As soon as I get a new laptop (I had to get a replacement this past November), I place one of those little circular band-aids over the lens. It both protects the lens and protects me from any remote viewing. jmho
 
Graffi said:
Reading through these last few posts, my attention is drawn to the little "EYE" on the top of my laptop. How many of you are concerned with someone remotely watching you through that camera? How many of you have covered that EYE to SPY? As soon as I get a new laptop (I had to get a replacement this past November), I place one of those little circular band-aids over the lens. It both protects the lens and protects me from any remote viewing. jmho
Yup.
 
Graffi said:
Reading through these last few posts, my attention is drawn to the little "EYE" on the top of my laptop. How many of you are concerned with someone remotely watching you through that camera? How many of you have covered that EYE to SPY? As soon as I get a new laptop (I had to get a replacement this past November), I place one of those little circular band-aids over the lens. It both protects the lens and protects me from any remote viewing. jmho

As I see it, this fear of remote viewing is fueled by paranoia and ignorance. Many cameras have a light to show when they are active. In the only reported case I am aware of where this actually happened, the remote viewer loaded the software before he handed out the laptops to the recipients. Yes, it could happen, but the odds of it are extremely low.
 
baustin said:
Graffi said:
Reading through these last few posts, my attention is drawn to the little "EYE" on the top of my laptop. How many of you are concerned with someone remotely watching you through that camera? How many of you have covered that EYE to SPY? As soon as I get a new laptop (I had to get a replacement this past November), I place one of those little circular band-aids over the lens. It both protects the lens and protects me from any remote viewing. jmho

As I see it, this fear of remote viewing is fueled by paranoia and ignorance. Many cameras have a light to show when they are active. In the only reported case I am aware of where this actually happened, the remote viewer loaded the software before he handed out the laptops to the recipients. Yes, it could happen, but the odds of it are extremely low.
Sure, much of it is driven by paranoia, but as the old saying goes, just because you're paranoid doesn't mean they aren't out to get you! :lol:

Re indicator lights, the people who worry about such things will point out that if they can hack your computer remotely to turn the camera on, they can probably disable the indicator light as well. See the "Autonomous vehicles" thread for Tesla's concerns about hacking the AI.
 
Please keep on-topic.

The camera used to monitor driver behavior in a semi-autonomous vehicle is not analogous to the one on a computer.

Blocking the vehicle interior camera in the 3 would presumably result in either disabling some or all of the driver-assist functions or completely shutting down the car.

TSLA has not yet announced which option it will employ to discourage such behavior.
 
edatoakrun said:
Please keep on-topic.

The camera used to monitor driver behavior in a semi-autonomous vehicle is not analogous to the one on a computer.
They both represent a loss of privacy to those who are worried about it, employing the exact same technology to do so; only the resulting actions are different, so it seems very much on-topic.

edatoakrun said:
Blocking the vehicle interior camera in the 3 would presumably result in either disabling some or all of the driver-assist functions or completely shutting down the car.

TSLA has not yet announced which option it will employ to discourage such behavior.
Alternatively, they could leave it up to the owner/driver as to whether or not they want the camera activated, with possible effects on their insurance rates. Or mandatory camera usage may simply be outlawed, if enough of the public cares. I doubt enough would, but it's possible.
 
edatoakrun said:
Please keep on-topic.

The camera used to monitor driver behavior in a semi-autonomous vehicle is not analogous to the one on a computer.

Blocking the vehicle interior camera in the 3 would presumably result in either disabling some or all of the driver-assist functions or completely shutting down the car.

TSLA has not yet announced which option it will employ to discourage such behavior.
Yet airline pilots have been successful in keeping cameras out of the cockpit - their workplace - because of the loss of privacy it represents.

Sorry, but driver-facing cameras are not needed to create autonomous vehicles.
 
WSJ has seemingly overcome the barrier blocking accurate reporting on AP and most other Tesla subjects.

Source:

Tesla’s Push to Build a Self-Driving Car Sparked Dissent Among Its Engineers

Elon Musk’s ambitious goals for Autopilot technology have prompted safety warnings and resignations
PALO ALTO, Calif.—Tesla Inc. Chief Executive Elon Musk jolted the automotive world last year when he announced the company’s new vehicles would come with a hardware upgrade that would eventually allow them to drive themselves.

He also jolted his own engineering ranks.

Members of the company’s Autopilot team hadn’t yet designed a product they believed would safely and reliably control a car without human intervention, according to people familiar with the matter.

In a meeting after the October announcement, someone asked Autopilot director Sterling Anderson how Tesla could brand the product “Full Self-Driving,” several employees recall. “This was Elon’s decision,” they said he responded. Two months later, Mr. Anderson resigned.

In the race to develop autonomous vehicles, few companies have moved faster than Tesla, an electric-car pioneer that this year surpassed General Motors Co. as the nation’s most-valuable auto maker.

Behind the scenes, the Autopilot team has clashed over deadlines and design and marketing decisions, according to more than a dozen people who worked on the project and documents reviewed by The Wall Street Journal. In recent months, the team has lost at least 10 engineers and four top managers—including Mr. Anderson’s successor, who lasted less than six months before leaving in June.

Tesla said the vehicle hardware unveiled in October will enable “full self-driving in almost all circumstances, at what we believe will be a probability of safety at least twice as good as the average human driver.” The self-driving feature is subject to software development and regulatory approval, and “it is not possible to know exactly when each element of the functionality described” will be available, Tesla noted...

Weeks before the October 2015 release of Autopilot, an engineer who had worked on safety features warned Tesla that the product wasn’t ready, according to a resignation letter circulated to other employees and reviewed by the Journal.

Autopilot’s development was based on “reckless decision making that has potentially put customer lives at risk,” the engineer, Evan Nakano, wrote.

Tesla declined to comment specifically on Mr. Nakano...
https://www.wsj.com/articles/teslas-push-to-build-a-self-driving-car-sparks-dissent-among-its-engineers-1503593742?mg=prod/accounts-wsj
 
edatoakrun said:
WSJ has seemingly overcome the barrier blocking accurate reporting on AP and most other Tesla subjects.
The problem that I have with this article is that it's essentially trying to show that Elon Musk and Tesla Motors have been acting in bad faith, trying to push a product before it's ready and endangering the public. The current AP features, though, have been rolled out in what I'd consider a sufficiently conservative manner. Of course, not everyone agrees, but the other automakers are seeking to develop similar systems.

The biggest issue I see is that "full self driving" (FSD) appears to be far from deployment, and Tesla has been overly optimistic as to how long it's going to take. In the meantime, they're happily taking money from people who are eager to pre-pay for FSD, and they've used FSD to market their vehicles. I don't at all believe that they're acting in bad faith, but rather that they've made the all-too-common mistake among engineers of failing to fully appreciate the complex details that need to be addressed. Remember, Elon is a physicist and engineer himself - he's not simply some corporate executive blindly pushing forward.

Essentially, Elon Musk feels very confident that FSD is do-able in the near term, and other key people have understandably disagreed. Elon says that he is personally quite involved in AP efforts. I don't doubt that Elon desperately wants to achieve FSD. At this very moment, he's probably pushing his AP team to make significant personal sacrifices and "achieve the impossible". While FSD is not going to get done as quickly as hoped for, I wouldn't be too quick to count Elon out.

If I were in my 20s with no kids, I'd probably want to get a software job at Tesla and work on cutting edge stuff like AP/FSD. But for someone who's more established in life and desires a healthy work/life balance, a job under Elon's watchful eye could be a hard sell.
 
abasile said:
The problem that I have with this article is that it's essentially trying to show that Elon Musk and Tesla Motors have been acting in bad faith, trying to push a product before it's ready and endangering the public.

Go read the threads regarding AP on Tesla Motors Club and you will be left with the same impression.
 
Joe6pack said:
The problem that I have with this article is that it's essentially trying to show that Elon Musk and Tesla Motors have been acting in bad faith, trying to push a product before it's ready and endangering the public.

Go read the threads regarding AP on Tesla Motors Club and you will be left with the same impression.
This one, for example:

...This morning when approaching a bridge overpass traveling at 65mph the car very quickly decelerated to 45mph, throwing everything in the passenger seat into the floorboard BUT MORE IMPORTANTLY causing everyone behind her to start braking hard and almost caused a major accident. My wife would not have been part of it but our car would have been the cause of it...

My point in this post is to offer a bit of acknowledgement to those that have had similar experiences and say "I hear'ya" AND to ask publicly WHERE IS OUR SS. Either SilkySmooth from months ago or our SomethingSpecial mentioned over 2 weeks ago. As much as I HATE to say it, Tesla either needs to turn off TACC and EAP until they have something safer or put out an update to make it an order of magnitude safer...
https://teslamotorsclub.com/tmc/threads/so-silkysmooth-never-happened-now-waithing-on-somethingspecial.96373/
 
abasile said:
I don't at all believe that they're acting in bad faith, but rather that they've made the all-too-common mistake among engineers of failing to fully appreciate the complex details that need to be addressed.

Right, the FSD problem is not deterministic in nature; i.e., its solution is inherently probabilistic, requiring the use of AI.
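
A toy example of what that means in practice (my own illustration, nothing to do with Tesla's actual stack): every control decision ends up being a deterministic rule applied to a probabilistic estimate, so some error rate in one direction or the other is unavoidable.

Code:
# Toy illustration (mine, not Tesla's): perception can only emit a
# probability that an obstacle is present, so planning reduces to
# thresholding that estimate, and the threshold trades off error types.

BRAKE_THRESHOLD = 0.3  # assumed value, purely for illustration

def plan(obstacle_probability: float) -> str:
    """Deterministic rule over a probabilistic input.

    Set the threshold too low and you get phantom braking (like the
    overpass slow-downs quoted above); set it too high and you risk
    failing to brake for a real obstacle. No setting gives zero risk.
    """
    return "brake" if obstacle_probability >= BRAKE_THRESHOLD else "continue"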

abasile said:
Remember, Elon is a physicist and engineer himself - he's not simply some corporate executive blindly pushing forward.

Actually, he just has two undergraduate degrees, one in physics and the other in economics.

abasile said:
Essentially, Elon Musk feels very confident that FSD is do-able in the near term, and other key people have understandably disagreed. Elon says that he is personally quite involved in AP efforts. I don't doubt that Elon desperately wants to achieve FSD. At this very moment, he's probably pushing his AP team to make significant personal sacrifices and "achieve the impossible". While FSD is not going to get done as quickly as hoped for, I wouldn't be too quick to count Elon out.

But expecting Elon to achieve the same success with FSD that he has had with SpaceX is unrealistic; the FSD problem is more than just hiring the best PhDs, as it largely was with rocket technology/science at SpaceX.
 
NTSB press release on the Joshua Brown crash: https://www.ntsb.gov/news/press-releases/Pages/PR20170912.aspx

The National Transportation Safety Board determined Tuesday that a truck driver’s failure to yield the right of way and a car driver’s inattention due to overreliance on vehicle automation are the probable cause of the fatal May 7, 2016, crash near Williston, Florida.

The NTSB also determined the operational design of the Tesla’s vehicle automation permitted the car driver’s overreliance on the automation, noting its design allowed prolonged disengagement from the driving task and enabled the driver to use it in ways inconsistent with manufacturer guidance and warnings.

As a result of its investigation the NTSB issued seven new safety recommendations and reiterated two previously issued safety recommendations.

“While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles,” said NTSB Chairman Robert L. Sumwalt III. “Smart people around the world are hard at work to automate driving, but systems available to consumers today, like Tesla’s ‘Autopilot’ system, are designed to assist drivers with specific tasks in limited environments. These systems require the driver to pay attention all the time and to be able to take over immediately when something goes wrong. System safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened,” said Sumwalt.

Findings in the NTSB’s report include:

  • The Tesla’s automated vehicle control system was not designed to, and could not, identify the truck crossing the Tesla’s path or recognize the impending crash. Therefore, the system did not slow the car, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate.

  • The Tesla driver’s pattern of use of the Autopilot system indicated an over-reliance on the automation and a lack of understanding of the system limitations.

  • If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains.

  • The way in which the Tesla “Autopilot” system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement.*

  • Tesla made design changes to its “Autopilot” system following the crash. The change reduced the period of time before the “Autopilot” system issues a warning/alert when the driver’s hands are off the steering wheel. The change also added a preferred road constraint to the alert timing sequence.

  • Fatigue, highway design and mechanical system failures were not factors in the crash. There was no evidence indicating the truck driver was distracted by cell phone use. While evidence revealed the Tesla driver was not attentive to the driving task, investigators could not determine from available evidence the reason for his inattention.

  • Although the results of post-crash drug testing established that the truck driver had used marijuana before the crash, his level of impairment, if any, at the time of the crash could not be determined from the available evidence.

The NTSB issued a total of seven safety recommendations based upon its findings, with one recommendation issued to the US Department of Transportation, three to the National Highway Traffic Safety Administration, two to the manufacturers of vehicles equipped with Level 2 vehicle automation systems, and one each to the Alliance of Automobile Manufacturers and Global Automakers.

The safety recommendations address the need for: event data to be captured and available in standard formats on new vehicles equipped with automated vehicle control systems; manufacturers to incorporate system safeguards to limit the use of automated control systems to conditions for which they are designed and for there to be a method to verify those safeguards; development of applications to more effectively sense a driver’s level of engagement and alert when engagement is lacking; and for manufacturers to report incidents, crashes, and exposure numbers involving vehicles equipped with automated vehicle control systems.

The board reiterated two safety recommendations issued to the National Highway Traffic Safety Administration in 2013, dealing with minimum performance standards for connected vehicle technology for all highway vehicles and the need to require installation of the technology, once developed, on all newly manufactured highway vehicles.

The abstract of the NTSB’s final report, that includes the findings, probable cause and safety recommendations is available online at https://go.usa.gov/xRMFc. The final report will be publicly released in the next several days. The docket for this investigation is available at https://go.usa.gov/xNvaE.

* The abstract goes into more detail on this point:

6. Because driving is an inherently visual task and a driver may touch the steering wheel without visually assessing the roadway, traffic conditions, or vehicle control system performance, monitoring steering wheel torque provides a poor surrogate means of determining the automated vehicle driver’s degree of engagement with the driving task.

Recommendations included in the full report:
RECOMMENDATIONS

New Recommendations

As a result of its investigation, the National Transportation Safety Board makes the following new safety recommendations:

To the US Department of Transportation:

  1. Define the data parameters needed to understand the automated vehicle control systems involved in a crash. The parameters must reflect the vehicle’s control status and the frequency and duration of control actions to adequately characterize driver and vehicle performance before and during a crash.

To the National Highway Traffic Safety Administration:

  2. Develop a method to verify that manufacturers of vehicles equipped with Level 2 vehicle automation systems incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.

  3. Use the data parameters defined by the US Department of Transportation in response to Safety Recommendation [1] as a benchmark for new vehicles equipped with automated vehicle control systems so that they capture data that reflect the vehicle’s control status and the frequency and duration of control actions needed to adequately characterize driver and vehicle performance before and during a crash; the captured data should be readily available to, at a minimum, National Transportation Safety Board investigators and National Highway Traffic Safety Administration regulators.

  4. Define a standard format for reporting automated vehicle control systems data, and require manufacturers of vehicles equipped with automated vehicle control systems to report incidents, crashes, and vehicle miles operated with such systems enabled.

To manufacturers of vehicles equipped with Level 2 vehicle automation systems (Audi of America, BMW of North America, Infiniti USA, Mercedes-Benz USA, Tesla Inc., and Volvo Car USA):

  5. Incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.

  6. Develop applications to more effectively sense the driver’s level of engagement and alert the driver when engagement is lacking while automated vehicle control systems are in use.

To the Alliance of Automobile Manufacturers and to Global Automakers:

  7. Notify your members of the importance of incorporating system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.

Reiterated Recommendations

As a result of its investigation, the National Transportation Safety Board reiterates the following safety recommendations:

To the National Highway Traffic Safety Administration:

  • Develop minimum performance standards for connected vehicle technology for all highway vehicles. (H-13-30)

  • Once minimum performance standards for connected vehicle technology are developed, require this technology to be installed on all newly manufactured highway vehicles. (H-13-31)

I'd also recommend having a look at slides #46-51 in the PowerPoint presentation: https://www.ntsb.gov/news/events/Documents/2017-HWY16FH018-BMG-presentations.pdf
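
Finding 6 is the key one, in my view. Torque sensing only tells the system that a hand applied force recently, so about all a manufacturer can do with it is run a hands-off timer with escalating alerts, roughly like the generic pattern below (my own sketch of that pattern, not Tesla's implementation; the timings are invented for illustration):

Code:
# Generic sketch of torque-based hands-off escalation (my illustration
# of the pattern the NTSB criticizes, not Tesla's actual code).
# All timing values are invented for illustration.

def hands_off_action(seconds_since_torque: float) -> str:
    """Escalate based solely on time since steering-wheel torque.

    Note what this signal cannot capture: whether the driver is
    watching the road. A hand resting on the wheel resets the timer
    even if the driver's eyes are elsewhere, which is exactly the
    NTSB's "poor surrogate" point.
    """
    if seconds_since_torque >= 60.0:
        return "disengage"      # require takeover, slow the car
    if seconds_since_torque >= 45.0:
        return "audible_alert"
    if seconds_since_torque >= 30.0:
        return "visual_alert"
    return "none"

Note that Tesla's post-crash change described in the findings (shortening the alert interval and adding a road-type constraint) amounts to tightening the parameters of this pattern, not changing the sensing modality itself.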
 
We just got back from a 900 mi round trip (MA=>VA and back) where over 650 miles were done on Autopilot. While AP2 isn't perfect, and I personally think FSD is a long way away, it certainly makes highway driving less tedious. Why only 650? My wife is more of a control freak and at times needs to be reminded to turn on the AP. 99% of my time at the wheel was on AP.

AP2 has come a long way in the last 6 months. Back on the earlier versions (up until May or so, when they lifted the max speed from 60 to 85 mph) we would occasionally see the sudden slow-down at an overpass as was quoted above. That hasn't happened since then at all.
 