DanCar
Posts: 1025
Joined: Sat Apr 24, 2010 12:00 am
Delivery Date: 10 Mar 2013
Location: SF Bay area, 94043

Re: Tesla's autopilot, on the road

Thu May 23, 2019 4:38 am

From my informal polling at work, most people don't think much about NoA, but they are more concerned (pissed off is a more accurate description) about sudden, unexpected, and unexplained braking. It's still an issue in the latest release. Hopefully it's something easy to fix.
2013 Leaf SL leased 3/10/2013
https://twitter.com/DanielCardena

Oils4AsphaultOnly
Posts: 669
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Thu May 23, 2019 2:18 pm

cwerdna wrote:
GRA wrote:I'm a bit pushed for time this week, so I'll have to put off my reply for a while. I'll get to it when I can. In the meantime, you can ponder this while hopefully reconsidering your answer to my question re what constitutes adequate pre-public deployment development testing:
Tesla's Navigate on Autopilot Requires Significant Driver Intervention
CR finds that latest version of Tesla's automatic lane-changing feature is far less competent than a human driver
For those who don't follow along on TMC (I take hiatuses sometimes for months at a time), CR's findings (https://www.consumerreports.org/autonom ... ervention/) aren't really new. There are tons of threads and posts on TMC about how poorly NoA works, in general. It's been going on ever since the feature was rolled out.

And, IIRC, there are definitely some software releases that are worse than others from an AP and NoA POV. Latest isn't always the greatest.
Since GRA won't post this, I thought I'd do you both a favor and post it: https://insideevs.com/news/351110/consu ... pilot/amp/

Turns out, CR actually approves of NoA. They just don't like the version that does the lane-changes for you. Go figure. I'll still make my own judgements based upon actual use of the product, though, instead of hearsay.
:: Model 3 LR :: acquired 9 May '18
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
100% Zero transportation emissions (except when I walk) and loving it!

SalisburySam
Gold Member
Posts: 327
Joined: Thu Sep 27, 2012 11:01 am
Delivery Date: 24 Feb 2012
Leaf Number: 018156
Location: Salisbury, NC

Re: Tesla's autopilot, on the road

Fri May 24, 2019 6:09 am

There might be some confusion on lane changing, so let me interject my experience with changing lanes in my Model 3. My 10-month-old long-range, rear-wheel-drive car has what at the time were called the Enhanced Auto Pilot (EAP) and Full Self-Driving (FSD) options. The former has several features such as Traffic-Aware Cruise Control (TACC), Summon (to go straight into and out of a parking spot), Self Parking (to automatically parallel and perpendicular park), and AutoSteer (to remain in lane). The FSD option is vaporware at this point, but allegedly will include a no-cost hardware update at some point, plus the features it enables. The definition and content of these options have changed since my purchase, so YMMV. The current definition of FSD offers the Navigate On AutoPilot feature, IIRC.

That said, the NoA feature works pretty well for me, and has gotten much, much better since it first appeared in a firmware update sent to my car a few months ago. Today on the latest firmware (received version 2019.16.2 yesterday), the Lane Change feature works as follows:

1- You can leave AutoSteer off and change lanes manually, as with any vehicle, and with or without cruise control in use.
2- AutoSteer can be on and you can trigger a lane change by activating the turn signal in the direction you wish to go. The car does the rest including turning off the turn signal after completing the lane change.
3- AutoSteer can be on and you’re using the confirmed Lane Change. This means the car will identify a Lane Change opportunity, ask you to confirm with the turn signal, and the car does the rest. Note that this and Option 2 are NOT mutually exclusive.
4- AutoSteer can be on and you’ve chosen to use the no-confirmation Lane Change option. This means the car will identify a Lane Change opportunity and if your hands can be sensed by the car as physically steering (applying some torque to the wheel), the car will do the rest including turning on the turn signals to initiate and turning them off when the lane change is complete. Note that this and Option 2 are NOT mutually exclusive.

All of the above work, and work quite well in my case. And note that all are optional...you decide, you choose, and you can change your mind at any time. I like Option 3 above as it is the most intuitive, at least for me. To me, Option 4 is not exactly “no confirmation” as you have to apply a tiny amount of steering torque...another form of confirmation. But it’s there, it works, and it’s nice to have choices.
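
Purely as illustration (this is just my guess at how the modes differ, not anything from Tesla's actual firmware, and every name below is made up), the four options boil down to a small decision table. In Python:

from enum import Enum, auto

class LaneChangeMode(Enum):
    MANUAL = auto()            # Option 1: AutoSteer off, driver does everything
    SIGNAL_TRIGGERED = auto()  # Option 2: driver's turn signal starts the change
    CONFIRMED = auto()         # Option 3: car proposes, driver confirms with signal
    UNCONFIRMED = auto()       # Option 4: car proposes, wheel torque is enough

def car_may_change_lanes(mode, autosteer_on, signal_on, wheel_torque_nm):
    """Return True if the car itself may execute the lane change.
    Hypothetical logic matching the four options above; the 0.1 Nm
    torque threshold is an invented placeholder, not a Tesla spec."""
    if not autosteer_on or mode is LaneChangeMode.MANUAL:
        return False  # Option 1: you change lanes yourself
    if mode in (LaneChangeMode.SIGNAL_TRIGGERED, LaneChangeMode.CONFIRMED):
        return signal_on  # the turn signal triggers/confirms the change
    if mode is LaneChangeMode.UNCONFIRMED:
        return wheel_torque_nm > 0.1  # hands-on torque acts as the "confirmation"
    return False

Which makes the point concrete: Option 4 still has a confirmation, it's just torque instead of a signal stalk.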
Nissan 2012 LEAF SL, 13,500 miles, 9 bars, 70.4% SOH, 46.19 Ahr

Tesla Model 3: Long Range Rear Wheel Drive | Extended AutoPilot | Full Self-Driving
Delivered: July, 2018 | 10,500 miles
Get 1000 miles free Supercharging: https://ts.la/john70942

jlv
Moderator
Posts: 1049
Joined: Thu Apr 24, 2014 6:08 pm
Delivery Date: 30 Apr 2014
Leaf Number: 424487
Location: Massachusetts

Re: Tesla's autopilot, on the road

Fri May 24, 2019 8:10 am

SalisburySam wrote:All of the above work, and work quite well in my case. And note that all are optional...you decide, you choose, and you can change your mind at any time. I like Option 3 above as it is the most intuitive, at least for me. To me, Option 4 is not exactly “no confirmation” as you have to apply a tiny amount of steering torque...another form of confirmation. But it’s there, it works, and it’s nice to have choices.
+1

Excellent description and summary. Had CR not been trying to be negative on Tesla, they might have published something as informative as your single message here.

Last weekend I took a 500-mile round trip across VT to NY and back. I turned on no-confirmation lane changes ("option 4") for the first time and used it for the whole trip. In the light traffic on I-89 in VT it seemed to do reasonably well. I doubt very much that the current iteration would be tolerable on a highway with more than medium traffic volume.
LEAF '13 SL+Prem (mfg 12/13, leased 4/14, bought 5/17, sold 11/18) 34K mi, AHr 58, SOH 87%
Tesla S 75D (3/17)
Tesla X 100D (12/18)
80K 100% BEV miles since '14
ICE free since '18

GRA
Posts: 10687
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Tue May 28, 2019 5:48 pm

Back from the holiday weekend; I hope everyone enjoyed it.
Oils4AsphaultOnly wrote:
GRA wrote:
Oils4AsphaultOnly wrote: There are at least 2 other reasons why Tesla could have good data and still not be able to share it. Access to the raw data might involve a significant breach of network security protocols. And/or extraction/reproduction of that data is too expensive (includes staff resources) to justify an effort that might not satisfy all the critics.
If other companies are able to do it, then Tesla can (and should) be held to the same standard.
Really? And which other companies have released their driver-assistance accident data? We're talking non-FARS here, right?
Oh, none of the other ADAS companies currently, because none of them other than Tesla are claiming that they're currently safer than humans. It's the full AV companies that are required to provide their data, and anyone making such safety claims about an ADAS should also be required to do so. Which is just what CR has been calling on them to do, most recently on April 22:
Consumer Reports: Tesla Must Prove Safety Before Claiming “Self-Driving” Ability
https://advocacy.consumerreports.org/pr ... g-ability/
. . . David Friedman, Vice President of Advocacy for Consumer Reports, said, “Technology has the potential to shape future transportation to be safer, less expensive, and more accessible. Yet, safety must always come first. Today’s driver assistance technologies have helped deliver on safety, but the marketplace is full of bold claims about self-driving capabilities that overpromise and underdeliver. For instance, Tesla’s current driver-assist system, ‘Autopilot,’ is no substitute for a human driver. It can’t dependably navigate common road situations on its own, and fails to keep the driver engaged exactly when it is needed most.

“We’ve heard promises of self-driving vehicles being just around the corner from Tesla before. Claims about the company’s driving automation systems and safety are not backed up by the data, and it seems today’s presentations had more to do with investors than consumers’ safety. We agree that Tesla, and every other car company, has a moral imperative to make transportation safer, and all companies should embrace the most important principle: preventing harm and saving lives.

But instead of treating the public like guinea pigs, Tesla must clearly demonstrate a driving automation system that is substantially safer than what is available today, based on rigorous evidence that is transparently shared with regulators and consumers, and validated by independent third-parties. In the meantime, the company should focus on making sure that proven crash avoidance technologies on Tesla vehicles, such as automatic emergency braking with pedestrian detection, are as effective as possible.”
H'mm, those criticisms sound familiar.
Oils4AsphaultOnly wrote:
GRA wrote:
Oils4AsphaultOnly wrote:I have no misconceptions that people won't become complacent again, but the trade-off is that during these years, there have been many lives saved. Yes, I know you want the data to prove this (and you know I only have a count of how many videos and arrests of people asleep/drunk at the wheel). I just want to spell out what I'm claiming and how your studies don't refute it. Also, the chilling effect that you're so worried about (but hasn't happened) is why I'm pointing out that A/P is ADAS. People are _correctly_ assigning responsibility to the drivers and not A/P.
If the ADAS is guaranteed to be abused by normal humans, and it is, and the manufacturer doesn't take steps to ensure that it can't be, then responsibility is shared. Seeing as how Teslas know which road they're on and the speed limit, a single line of code could have prevented both Brown's and Brenner's deaths, or at least eliminated A/P's role in those deaths, by preventing A/P from being used in a situation where Tesla knows that AEB doesn't work, namely dealing with cross traffic. Something like this is all it would take:

If ROADTYPE = "Freeway" then A/P = "OK" ELSE A/P = "Not OK"

This is what Cadillac does with Supercruise, so Tesla should take one of the programmers they have spending far more time writing cutesy Easter Eggs and put them to work making a simple change that will save lives.
You know nothing about programming if you think it's really that simple. And you know even less about neural nets if you expect the programmers to interject their code like that. If Cadillac's engineers really wrote code like that, then they'll never get to any level of self-driving. Considering that they at least made it to level 2+, there's still hope for them.
I make no pretense of claiming to be a programmer - the last time I did any programming was when I taught myself BASIC (on a Commodore PET, 8K of RAM and program/data storage on a cassette tape) back in the late '70s - but you cannot deny that an instruction such as the above, limiting A/P use to limited-access freeways with no cross traffic, would have prevented both Brown's and Brenner's A/P-enabled deaths, in circumstances where Tesla knew it wouldn't work. Cadillac actually goes further than just limiting Supercruise usage to limited-access freeways (and having an eye-tracking camera to ensure drivers are looking at the road), as they only allow its usage on those portions of freeways for which they have high-def digital LIDAR maps. The latter, on U.S. 101 at the Hwy 85 interchange, would presumably have prevented Huang's death at the hands of A/P.
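
Spelling out my one-liner above a little (and to be clear, this is only a sketch of the gating idea, with invented function and flag names - nobody outside Tesla or GM has seen their actual code), in Python it might look like:

# Hypothetical geofence gate for an ADAS: only allow engagement on
# limited-access freeways covered by high-definition maps, which is
# essentially the Supercruise policy described above.

LIMITED_ACCESS_ROADS = {"freeway", "motorway"}

def autopilot_allowed(road_type, has_hd_map, has_cross_traffic):
    """Invented example; a real system would derive these flags from
    map data and perception rather than take them as trusted inputs."""
    return (road_type in LIMITED_ACCESS_ROADS
            and has_hd_map
            and not has_cross_traffic)

# A divided highway with cross traffic and no HD-map coverage - the
# situation in the Brown and Brenner crashes - would be gated out:
assert autopilot_allowed("divided_highway", False, True) is False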

Here's CR's comparison test of the various ADAS systems: https://www.cadillac.com/content/dam/ca ... rticle.pdf
Oils4AsphaultOnly wrote:
GRA wrote:
Oils4AsphaultOnly wrote: So what does the FARS database show? I don't have a desktop available to analyze the compressed 14MB csv file.
Demographics, # of occupants, driver/pax/motorcyclist, non-motorist, conditions, speeds, type of accident etc.
Not what I mean, but I guess you're not able to analyze the data yourself, huh? Depending on other people to draw the conclusions? I'll dig into it later and see if at least make/model info is buried in there.
I am not a professional statistician nor do I play one on TV, so of course I'm not going to analyze the data myself. Neither is Elon, but that hasn't stopped him from making claims based on erroneous methodology.
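
For anyone who does want to dig into the FARS file (as Oils4AsphaultOnly says he might), a first pass in pandas could look like the sketch below. I'll stress that the file and column names (ST_CASE, MAKE, MOD_YEAR) are assumptions from the FARS documentation, and the Tesla make code is a placeholder to be looked up in the codebook, not a fact:

import pandas as pd

# Load the FARS vehicle-level file (name assumed; pandas will
# transparently decompress a .zip/.gz path if given one instead).
vehicles = pd.read_csv("vehicle.csv", low_memory=False)

TESLA_MAKE_CODE = 0  # placeholder - look up the real code in the FARS codebook

# Count distinct fatal crashes involving that make, by model year.
tesla = vehicles[vehicles["MAKE"] == TESLA_MAKE_CODE]
print(tesla.groupby("MOD_YEAR")["ST_CASE"].nunique())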
Last edited by GRA on Tue May 28, 2019 6:17 pm, edited 1 time in total.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA
Posts: 10687
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Tue May 28, 2019 6:13 pm

Oils4AsphaultOnly wrote:
cwerdna wrote:
GRA wrote:I'm a bit pushed for time this week, so I'll have to put off my reply for a while. I'll get to it when I can. In the meantime, you can ponder this while hopefully reconsidering your answer to my question re what constitutes adequate pre-public deployment development testing:
For those who don't follow along on TMC (I take hiatuses sometimes for months at a time), CR's findings (https://www.consumerreports.org/autonom ... ervention/) aren't really new. There are tons of threads and posts on TMC about how poorly NoA works, in general. It's been going on ever since the feature was rolled out.

And, IIRC, there are definitely some software releases that are worse than others from an AP and NoA POV. Latest isn't always the greatest.
Since GRA won't post this, I thought I'd do you both a favor and post it: https://insideevs.com/news/351110/consu ... pilot/amp/

Turns out, CR actually approves of NoA. They just don't like the version that does the lane-changes for you. Go figure. I'll still make my own judgements based upon actual use of the product, though, instead of hearsay.
If I'd seen the article (and IEVS' headline had accurately reflected CR's views), I'd have been happy to post it, but as I posted the link to and directly quoted from CR's own release, why bother to filter it through a particular forum? Your claim that CR approves of NoA is without foundation. From an article on GCR dated May 24th:
. . . It's worth noting that when the organization first tested Navigate on Autopilot last November on its Model S, it found the same problems, among others, and Musk later said the company would update the system to return to its original lane after a pass. The update Consumer Reports received didn't seem to include that ability. . . .
https://www2.greencarreports.com/news/1 ... orts-finds

The article goes on to quote David Friedman about Tesla's pre-public release testing regimen:
Consumer Reports' vice president of advocacy, David Friedman, is the former acting administrator of NHTSA under the Obama administration. He said, "Tesla is showing what not to do on the path toward self-driving cars: release increasingly automated driving systems that aren’t vetted properly. Before selling these systems, automakers should be required to give the public validated evidence of that system’s safety—backed by rigorous simulations, track testing, and the use of safety drivers in real-world conditions."
IOW, pretty much what the FAA failed to ensure Boeing did adequately in the case of the 737 Max; the difference being that in an airliner accident people die by the hundreds, while for cars the toll per individual accident is much smaller but the number of accidents is far greater.

For an example of exactly the opposite approach to development testing compared to Tesla's, and one which I obviously believe is necessary, see the following article. BTW, in a previous post you stated that there hadn't been any backlash owing to self-driving car accidents. I meant to reply at the time but got distracted. In fact, as noted below, there was a major backlash after the Herzberg death, and crashes in which self-driving vehicles kill non-occupants are the ones I'm worried will set back the development and deployment of AVs. The general public is far more worried about being put at risk by self-driving cars that they aren't in. Anyone riding in one has volunteered to act as a crash-test dummy for the company, so people aren't as concerned about those deaths as they are when an AV kills a non-occupant, potentially themselves:
May 19, 2019, 10:00am
Hand Gestures And Horses: Waymo’s Self-Driving Service Learns To Woo The Public
https://www.forbes.com/sites/alanohnsma ... 6c74e11124
. . . “We've always had a very conservative approach to making both our users and the public feel safe with this technology and what it is we're doing here," a Waymo spokeswoman said. . . .

The cautious expansion, partly driven by a public that is both intrigued by the prospect of a digital chauffeur and easily spooked by self-driving blunders, contrasts with some of the more ambitious roadmaps projected by Waymo rivals. Tesla CEO Elon Musk claims his electric car company can deliver a self-driving future as early as next year–assuming his new computer and self-driving software, perfected by utilizing camera and sensor data collected in “shadow mode” from hundreds of thousands of Teslas on the road, work as promised. General Motors’ Cruise unit has its own aggressive market plans, and Uber and Lyft are also pursuing their own self-driving car technology as a way to keep the cost of rides low. Getting the cars to navigate on their own is just the start; getting the public comfortable with both driving alongside and in a car without a driver is another big challenge.

Numerous surveys of public sentiment, such as one released in March by AAA, show a high level of worry over driverless vehicles, especially in the wake of a deadly 2018 accident in which a self-driving Uber test vehicle struck and killed a pedestrian in Tempe and fatal crashes involving Tesla drivers who may have overestimated the capabilities of the carmaker’s Autopilot software. A new Capgemini study finds a majority of people surveyed globally are awaiting the technology with “anticipation,” plus “a degree of uncertainty and concern,” says Markus Winkler, Capgemini’s head of automotive research. . . .

MIT’s Reimer thinks it could take decades of refinement until robots truly change how most people get around. For now, winning public acceptance for the technology means finding ways to validate how well it performs in the real world. In other words, a steady ground game is better than a Hail Mary pass for autonomous technology.

“The view for companies like Waymo is `we have to be able to show functional safety. Otherwise, we can't protect our decisions in a court of law, where this will all end up long term,’” he said. “Elon is working mostly on the deep neural net side where a good chunk of it is a black box. Documenting, defending that in court is going to be tough.”
Last edited by GRA on Tue May 28, 2019 6:58 pm, edited 2 times in total.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA
Posts: 10687
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Tue May 28, 2019 6:27 pm

jlv wrote:
SalisburySam wrote:All of the above work, and work quite well in my case. And note that all are optional...you decide, you choose, and you can change your mind at any time. I like Option 3 above as it is the most intuitive, at least for me. To me, Option 4 is not exactly “no confirmation” as you have to apply a tiny amount of steering torque...another form of confirmation. But it’s there, it works, and it’s nice to have choices.
+1

Excellent description and summary. Had CR not been trying to be negative on Tesla, they might have posted something equally as informative as your single message here.
I'm curious about where you get the idea that CR is biased about Tesla. CR has never accepted any ads, buys all the products they test anonymously at retail, and tests them all to the same standards. Were they biased against Tesla when they gave the Model S the highest test rating of any car they'd ever tested? I've seen people claim that CR is biased against this or that vehicle because of the frequency-of-repair ratings and their recommending or not recommending a car based on them, but that claim just shows that the person making it is unaware of how those F-O-R ratings are arrived at: they are based solely on customer surveys, and CR simply compares the percentages to the norms and applies the appropriate rating, but only if they've got a large enough sample size to be significant. Otherwise, no rating.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA
Posts: 10687
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Tue May 28, 2019 7:55 pm

International Business Times:
Tesla Autopilot Safety Issues Continue As EV Slams Into Another Car
https://www.ibtimes.com/tesla-autopilot ... ar-2795153

A stopped car on the highway in the travel lane, with another car swerving into and then out of the lane - so a known problem, but one we'll see occur increasingly often as the number of Teslas on the road increases. From that article there was also this, which I hadn't heard about, but which we can expect to see more and more of if Tesla doesn't dial it back:
. . . In fact, Tesla recently agreed on a $13 million settlement with a former employee who was struck by the Model S while working. . . .
A more complete analysis of the accident in Norway is available in the original Forbes article, in which the Tesla owner credits A/P with saving his life (which may or may not be true, as the article's author points out):
May 26, 2019, 11:28am
Tesla On Autopilot Slams Into Stalled Car On Highway, Expect More Of This
https://www.forbes.com/sites/lanceeliot ... c07bdc4fe5
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

lorenfb
Posts: 2240
Joined: Tue Dec 17, 2013 10:53 pm
Delivery Date: 22 Nov 2013
Leaf Number: 416635
Location: SoCal

Re: Tesla's autopilot, on the road

Tue May 28, 2019 9:10 pm

GRA wrote:
If ROADTYPE = "Freeway" then A/P = "OK" ELSE A/P = "Not OK"

This is what Cadillac does with Supercruise, so Tesla should take one of the programmers they have spending far more time writing cutesy Easter Eggs and put them to work making a simple change that will save lives.
You make a valid point: it would be very easy to control when AP fully functions, and to let road type, traffic, speed limits, and other conditions limit which AP modes of operation are available. Tesla obviously knows AP's present limitations.
#1 Leaf SL MY 9/13: 74K miles, 48 Ahrs, 5.2 miles/kWh (average), Hx=70, SOH=78, L2 - 100% > 1000, temp < 95F, (DOD) > 20 Ahrs
#2 Leaf SL MY 12/18: 4.5K miles, 115 Ahrs, 5.5 miles/kWh (average), Hx=98, SOH=99, DOD > 20%, temp < 105F

Oils4AsphaultOnly
Posts: 669
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Tue May 28, 2019 10:29 pm

GRA wrote:Back from the holiday weekend; I hope everyone enjoyed it.
Oils4AsphaultOnly wrote:
GRA wrote: If other companies are able to do it, then Tesla can (and should) be held to the same standard.
Really? And which other companies have released their driver-assistance accident data? We're talking non-FARS here, right?
Oh, none of the other ADAS companies currently, ...
Thank you. I think it's obvious that the point has been made right here.

The rest of your post pertains to the standards for self-driving, and we both know that Tesla never claimed A/P to be self-driving - hyperbolic claims aside.

Your position on coding for edge cases is very much a case of the "knowing just enough to be dangerous" school of thought. That assumption of how things _should_ be done [based on a lack of expertise] can be far more dangerous than what Uber did. I'll address this point in more detail in response to your other post.
GRA wrote: ... because none of them other than Tesla are claiming that they're currently safer than humans. It's the full AV companies that are required to provide their data, and anyone making such safety claims about an ADAS should also be required to do so. Which is just what CR has been calling on them to do, most recently on April 22:
Consumer Reports: Tesla Must Prove Safety Before Claiming “Self-Driving” Ability
https://advocacy.consumerreports.org/pr ... g-ability/
. . . David Friedman, Vice President of Advocacy for Consumer Reports, said, “Technology has the potential to shape future transportation to be safer, less expensive, and more accessible. Yet, safety must always come first. Today’s driver assistance technologies have helped deliver on safety, but the marketplace is full of bold claims about self-driving capabilities that overpromise and underdeliver. For instance, Tesla’s current driver-assist system, ‘Autopilot,’ is no substitute for a human driver. It can’t dependably navigate common road situations on its own, and fails to keep the driver engaged exactly when it is needed most.

“We’ve heard promises of self-driving vehicles being just around the corner from Tesla before. Claims about the company’s driving automation systems and safety are not backed up by the data, and it seems today’s presentations had more to do with investors than consumers’ safety. We agree that Tesla, and every other car company, has a moral imperative to make transportation safer, and all companies should embrace the most important principle: preventing harm and saving lives.

But instead of treating the public like guinea pigs, Tesla must clearly demonstrate a driving automation system that is substantially safer than what is available today, based on rigorous evidence that is transparently shared with regulators and consumers, and validated by independent third-parties. In the meantime, the company should focus on making sure that proven crash avoidance technologies on Tesla vehicles, such as automatic emergency braking with pedestrian detection, are as effective as possible.”
H'mm, those criticisms sound familiar.
Oils4AsphaultOnly wrote:
GRA wrote:
If the ADAS is guaranteed to be abused by normal humans, and it is, and the manufacturer doesn't take steps to ensure that it can't be, then responsibility is shared. Seeing as how Teslas know which road they're on and the speed limit, a single line of code could have prevented both Brown's and Brenner's deaths, or at least eliminated A/P's role in those deaths, by preventing A/P from being used in a situation where Tesla knows that AEB doesn't work, namely dealing with cross traffic. Something like this is all it would take:

If ROADTYPE = "Freeway" then A/P = "OK" ELSE A/P = "Not OK"

This is what Cadillac does with Supercruise, so Tesla should take one of the programmers they have spending far more time writing cutesy Easter Eggs and put them to work making a simple change that will save lives.
You know nothing about programming if you think it's really that simple. And you know even less about neural nets if you expect the programmers to interject their code like that. If Cadillac's engineers really wrote code like that, then they'll never get to any level of self-driving. Considering that they at least made it to level 2+, there's still hope for them.
I make no pretense of claiming to be a programmer - the last time I did any programming was when I taught myself BASIC (on a Commodore PET, 8K of RAM and program/data storage on a cassette tape) back in the late '70s - but you cannot deny that an instruction such as the above, limiting A/P use to limited-access freeways with no cross traffic, would have prevented both Brown's and Brenner's A/P-enabled deaths, in circumstances where Tesla knew it wouldn't work. Cadillac actually goes further than just limiting Supercruise usage to limited-access freeways (and having an eye-tracking camera to ensure drivers are looking at the road), as they only allow its usage on those portions of freeways for which they have high-def digital LIDAR maps. The latter, on U.S. 101 at the Hwy 85 interchange, would presumably have prevented Huang's death at the hands of A/P.

Here's CR's comparison test of the various ADAS systems: https://www.cadillac.com/content/dam/ca ... rticle.pdf
Oils4AsphaultOnly wrote:
GRA wrote: Demographics, # of occupants, driver/pax/motorcyclist, non-motorist, conditions, speeds, type of accident etc.
Not what I mean, but I guess you're not able to analyze the data yourself, huh? Depending on other people to draw the conclusions? I'll dig into it later and see if at least make/model info is buried in there.
I am not a professional statistician nor do I play one on TV, so of course I'm not going to analyze the data myself. Neither is Elon, but that hasn't stopped him from making claims based on erroneous methodology.
:: Model 3 LR :: acquired 9 May '18
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
100% Zero transportation emissions (except when I walk) and loving it!
