GRA
Posts: 13009
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Mon May 17, 2021 10:19 pm

Both L.A. Times:
DMV probing whether Tesla violates state regulations with self-driving claims
https://www-latimes-com.cdn.ampproject. ... lot-safety

. . . Asked for detail, DMV spokesperson Anita Gore said via email, “The DMV cannot comment on the pending review.” She did list the penalties that might be applied if a company is found to have violated DMV regulations that prohibit misleading advertising concerning automated vehicles.

In small print, Tesla says on its website that full self-driving “does not make the car autonomous” and that “active supervision” is required by the driver. But social media are rife with videos showing drivers, mostly young men, overcoming Tesla’s easily defeated driver-monitoring system to crawl into the back seat and let the Tesla “drive itself” down public highways. . . .

Although a driver is legally responsible for such misbehavior, the fine print in Tesla advertising provides a weak defense against deceptive marketing allegations, said Bryant Walker Smith, a leading expert on automated vehicle law at the University of South Carolina. He cites the Lanham Act, the federal law that governs trademarks.

If the DMV finds Tesla is misleading customers, potential penalties include suspension or revocation of DMV autonomous vehicle deployment permits and manufacture and dealer licenses, the DMV spokesperson said. She added that “a vehicle operating on public roads using autonomous technology without first obtaining a permit can be removed from the public roadway by a police officer.”

Although the National Highway Traffic Safety Administration has no authority to regulate vehicle advertising, the DMV’s own rules allow it to sanction manufacturers that advertise a vehicle as autonomous when it is not. The Federal Trade Commission also regulates such advertising; an FTC spokesperson declined to comment. A request to interview the DMV’s director, Steve Gordon, was declined. Tesla lacks a media relations department.

In July 2020, a Munich court ruled that Tesla had been misleading consumers about the capabilities of its autonomous systems and ordered the company’s German subsidiary to stop using phrases such as “full potential for autonomous driving” on its website and in advertising materials. . . .

“Tesla seems to be asking for legal trouble on many fronts,” law professor Smith said. “From the FTC and its state counterparts for deceptive marketing. From the California DMV for, potentially, crossing into the realm of autonomous vehicle testing without state approval, from competitors with driver assistance systems, competitors with actual automated driving systems, ordinary consumers, and future crash victims who could sue under state or federal law.”

Tesla is facing hundreds of lawsuits. At least several deaths have been connected with use or misuse of Autopilot. NHTSA has more than 20 investigations open on Tesla, though how long they’ll take to be resolved, NHTSA won’t say. . . .

Tesla driver in fatal California crash had posted videos of himself in vehicle
https://www-latimes-com.cdn.ampproject. ... in-vehicle

The driver of a Tesla involved in a fatal crash that California highway authorities said may have been operating on Autopilot posted social media videos of himself riding in the vehicle without his hands on the wheel or foot on the pedal.

The May 5 crash in Fontana, 50 miles east of Los Angeles, is also under investigation by the National Highway Traffic Safety Administration. The probe is the 29th crash involving a Tesla that the federal agency has investigated.

In the Fontana crash, a 35-year-old man identified as Steven Michael Hendrickson was killed when his Tesla Model 3 struck an overturned semi on a freeway about 2:30 a.m. . . .

Another man was seriously injured when the electric vehicle hit him as he was helping the semi’s driver out of the wreck.

The California Highway Patrol announced Thursday that its preliminary investigation had determined that the Tesla’s partially automated driving system called Autopilot “was engaged” before the crash. The agency said it was commenting on the Fontana crash because of the “high level of interest” about Tesla crashes and because it was “an opportunity to remind the public that driving is a complex task that requires a driver’s full attention.”

However, on Friday, the agency walked back its previous declaration.

“To clarify,” a new CHP statement said, “There has not been a final determination made as to what driving mode the Tesla was in or if it was a contributing factor to the crash. . . .”

Tesla, which has disbanded its public relations department, did not respond Friday to an email seeking comment. The company says in its owner’s manuals and on its website that both Autopilot and “Full Self-Driving” are not fully autonomous and that drivers must pay attention and be ready to intervene at any time.

Autopilot at times has had trouble dealing with stationary objects and traffic crossing in front of Teslas.

[Desc. of FL, Mountain View fatal crashes]

Tesla’s system, which uses cameras, radar and short-range sonar, also has trouble handling stopped emergency vehicles. Teslas have struck several firetrucks and police vehicles that were stopped on freeways with their flashing emergency lights on.

After the Florida and California fatal crashes, the National Transportation Safety Board recommended that Tesla develop a stronger system to ensure drivers are paying attention, and that it limit use of Autopilot to highways where it can work effectively. Neither Tesla nor the safety agency took action. . . .

NHTSA, which has authority to regulate automated driving systems and seek recalls if necessary, seems to have developed a renewed interest in the systems since President Biden took office.

[Snore . . . Riinng . . . Snore . . . Riiinng . . . snore . . Riinngg . . . Snort! Grabs phone] National Highway Traffic Safety Administration! Uh, what? Yes sir, wide awake and on the job. We'll get right on that! Yes, sir, goodbye, sir. [Phone is hung up. . . . . . . snore].


While we're still waiting on confirmation, it's likely a non-occupant was injured by a Tesla driving itself, which collided with a semi-dumptruck lying on its side on the freeway. Do you suppose that the regulatory agencies will finally do their jobs?
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

GRA
Posts: 13009
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Mon May 17, 2021 10:21 pm

While we're still waiting on confirmation, it's likely that a non-occupant was injured by a Tesla driving itself, which collided with a semi-dumptruck lying on its side on the freeway. If it proves to be another A/P-enabled accident, do you suppose that the regulatory agencies will finally do their jobs? I have my doubts.

IEVS:
NHTSA Misses Opportunity To Say There's No Autonomous Car Yet

It created a series of videos with Jason Fenske but did not address that matter.
https://insideevs.com/news/507889/nhtsa ... icles-yet/

. . . Autopilot and FSD unite many of these features in a single system. Yet, as Tesla itself told regulators and writes in its manuals, neither is more than a Level 2 driving aid. Trying to prove they are more, or believing that is the case, is what has led to many of the crashes involving the system so far. Apart from the two new cases that have emerged, NHTSA was already investigating 18. The agency currently has 25 SCIs (Special Crash Investigations) ongoing.

If it is confirmed that Autopilot or FSD was active in the Texas and Fontana crashes, 20 of the 25 cases the agency is investigating will be related to the driver-assistance aids Tesla offers. It would have been a good idea to mention that in any of these videos and stress what they really are.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

Oils4AsphaultOnly
Posts: 844
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Mon May 17, 2021 11:06 pm

GRA wrote:
Mon May 17, 2021 10:19 pm
While we're still waiting on confirmation, it's likely a non-occupant was injured by a Tesla driving itself, which collided with a semi-dumptruck lying on its side on the freeway. Do you suppose that the regulatory agencies will finally do their jobs?
Drivers falling asleep at the wheel at 3am also get into car accidents. Why is it automatically assumed that the car was on Autopilot?

Unlike the Texas accident, a wrecked vehicle in the middle of the highway is indeed one of the failure points of Autopilot, so I'm not saying it wasn't on; I'm just questioning why that's the default assumption. Have we not learned our lesson from last month?
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
:: Model 3 LR (Turo) :: acquired 9 May '18
:: Model Y LR AWD (wife's) :: acquired 30 Dec '20
100% Zero transportation emissions (except when I walk) and loving it!

GRA
Posts: 13009
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Mon May 17, 2021 11:23 pm

Oils4AsphaultOnly wrote:
Mon May 17, 2021 11:06 pm
GRA wrote:
Mon May 17, 2021 10:19 pm


While we're still waiting on confirmation, it's likely a non-occupant was injured by a Tesla driving itself, which collided with a semi-dumptruck lying on its side on the freeway. Do you suppose that the regulatory agencies will finally do their jobs?
Drivers falling asleep at the wheel at 3am also get into car accidents. Why is it automatically assumed that the car was on Autopilot?

It's not automatically assumed, it's a matter of probability given the driver's own statements and behavior in videos, along with his membership in a Tesla club - it all points to him being a serious fanboy who thought A/P & FSD were terrific:
"What would I do without my full self-driving Tesla after a long day at work," said a message on one. "Coming home from LA after work, thank god, self-drive," said a comment on another video, adding, "Best car ever!"

I left room for there being another possible explanation, but the above quotes, IMO, strongly tilt the probability towards A/P & FSD controlling the car as they indicate his mindset, added to the fact he hit a truck lying on its side across several lanes.

Oils4AsphaultOnly wrote:
Mon May 17, 2021 11:06 pm
Unlike the Texas accident, a wrecked vehicle in the middle of the highway is indeed one of the failure points of autopilot, so I'm not saying that it wasn't on, just questioning why that's the default assumption? Have we not learned our lesson yet from last month?
See above.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

cwerdna
Posts: 12121
Joined: Fri Jun 03, 2011 4:31 pm
Delivery Date: 28 Jul 2013
Location: SF Bay Area, CA

Re: Tesla's autopilot, on the road

Tue May 18, 2021 12:44 pm

Tesla on 'autopilot' crashes into deputy's vehicle in Washington state
There were no injuries, but there was significant damage to the patrol vehicle, the sheriff's office said.
https://www.nbcnews.com/news/us-news/te ... e-n1267716
https://www.facebook.com/SnoCoSheriff/p ... 2652934095

'19 Bolt Premier
'13 Leaf SV w/premium (former)
'13 Leaf SV w/QC + LED & premium (lease over)

Please don't PM me with Leaf questions. Just post in the topic that seems most appropriate.

Oils4AsphaultOnly
Posts: 844
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Tue May 18, 2021 4:19 pm

GRA wrote:
Mon May 17, 2021 11:23 pm
It's not automatically assumed, it's a matter of probability given the driver's own statements and behavior in videos, along with his membership in a Tesla club - it all points to him being a serious fanboy who thought A/P & FSD were terrific:
"What would I do without my full self-driving Tesla after a long day at work," said a message on one. "Coming home from LA after work, thank god, self-drive," said a comment on another video, adding, "Best car ever!"

I left room for there being another possible explanation, but the above quotes, IMO, strongly tilt the probability towards A/P & FSD controlling the car as they indicate his mindset, added to the fact he hit a truck lying on its side across several lanes.

None of the articles referenced FSD, and there's no indication that he was part of the beta-test pool, so the fact that he would call it "self-drive" indicates that he didn't treat Autopilot as the driving aid it should've been. But I agree that he probably became over-reliant on Autopilot to drive him when he shouldn't have. 3am sounds like either a drunk or a sleepy driver, neither a condition for getting behind the wheel. The sooner we get FSD, the safer we'll all be; until then, we're sharing the road with drivers who make poor decisions.
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
:: Model 3 LR (Turo) :: acquired 9 May '18
:: Model Y LR AWD (wife's) :: acquired 30 Dec '20
100% Zero transportation emissions (except when I walk) and loving it!

GRA
Posts: 13009
Joined: Mon Sep 19, 2011 1:49 pm
Location: East side of San Francisco Bay

Re: Tesla's autopilot, on the road

Tue May 18, 2021 4:52 pm

Oils4AsphaultOnly wrote:
Tue May 18, 2021 4:19 pm
None of the articles referenced FSD, and there's no indication that he was part of the Beta-test pool, so the fact that he would call it "self-drive" indicates that he didn't treat autopilot as a driving aid as it should've been. But I agree that he probably became over-reliant on autopilot to drive him when he shouldn't have. 3am sounds like either drunk or sleepy driver, neither conditions for getting behind the wheel. The sooner we get FSD, the safer we would all be, otherwise, we're sharing the road with drivers who make poor decisions.

Sure they did:
What would I do without my full self-driving Tesla after a long day at work?
Here's another article probably referencing the same quote on FSD, from Reuters:
Tesla crash victim lauded 'full self-driving' in videos on TikTok
https://www.reuters.com/business/autos- ... 21-05-16/


It remains to be seen if his car had it or not, but he certainly thought he could treat it that way. Not that "FSD" is anything of the sort; it's still Level 2. Once we get true Level 4 or Level 5 ADS, then yes, the roads will be safer. As it is, we have L2 systems that make extra dumb mistakes that alert human drivers wouldn't. As the whole point of ADS/DAS is to make the roads safer by avoiding human errors, they shouldn't be allowed to replace those errors with machine errors, particularly when the systems are allowed to operate in conditions they're known to be unable to cope with.
Guy [I have lots of experience designing/selling off-grid AE systems, some using EVs but don't own one. Local trips are by foot, bike and/or rapid transit].

The 'best' is the enemy of 'good enough'. Copper shot, not Silver bullets.

LeftieBiker
Moderator
Posts: 17105
Joined: Wed May 22, 2013 3:17 am
Delivery Date: 30 Apr 2018
Location: Upstate New York, US

Re: Tesla's autopilot, on the road

Tue May 18, 2021 6:53 pm

I'm waiting for the (class action?) lawsuit against Musk for naming it and describing it as "full self driving"...
Brilliant Silver 2021 Leaf SV40 W/ Pro Pilot & Protection
2009 Vectrix VX-1 W/18 Leaf modules, & 2 lithium E-bicycles.
BAFX OBDII Dongle
PLEASE don't PM me with Leaf questions. Just post in the topic that seems most appropriate.

Oils4AsphaultOnly
Posts: 844
Joined: Sat Oct 10, 2015 4:09 pm
Delivery Date: 20 Nov 2016
Leaf Number: 313890
Location: Arcadia, CA

Re: Tesla's autopilot, on the road

Tue May 18, 2021 7:00 pm

GRA wrote:
Tue May 18, 2021 4:52 pm
It remains to be seen if his car had it or not, but he certainly thought he could treat it that way. Not that "FSD" is anything of the sort, it's still Level 2. Once we get true Level 4 or Level 5 ADS, then yes, the roads will be safer. As it is, we have L2 that makes extra dumb mistakes that alert human drivers wouldn't. As the whole point of ADS/DAS is to make the roads safer by avoiding human errors, they shouldn't be allowed to replace them with machine errors, particularly if the systems are allowed to operate in conditions which it's known they are unable to cope with.
Autopilot != FSD.

Once you get that, then you'll understand why this is all a bunch of hot air.

Edit: FSD isn't Level 2 driver assistance, Autopilot is. But that doesn't change the fact that he thought he had something he didn't, and he died for it. How many more cases of "sudden unintended acceleration" occurred before people realized that the human behind the wheel was at fault? I'm afraid that's what it's going to take here.

The reason I'm pushing back so hard is that, when used correctly, Autopilot is a HUGE driving benefit that people who don't use it will NEVER understand. And the side effect of all this Autopilot misunderstanding is that Tesla will be forced to disable Autopilot for ALL. It'll be a case of the Zen Magnets ban (https://en.wikipedia.org/wiki/Neodymium_magnet_toys): a few people being stupid means no one else gets to enjoy them. Consumer overprotection at its finest, except this isn't a toy ban that will eventually be rescinded.
:: Leaf S30 :: build date: Sep '16 :: purchased: Nov '16
:: Model 3 LR (Turo) :: acquired 9 May '18
:: Model Y LR AWD (wife's) :: acquired 30 Dec '20
100% Zero transportation emissions (except when I walk) and loving it!

jlv
Moderator
Posts: 1723
Joined: Thu Apr 24, 2014 6:08 pm
Delivery Date: 30 Apr 2014
Leaf Number: 424487
Location: Massachusetts

Re: Tesla's autopilot, on the road

Tue May 18, 2021 8:12 pm

Oils4AsphaultOnly wrote:
Tue May 18, 2021 7:00 pm
Autopilot != FSD.
You know it, I know it, and anyone here who isn't trolling knows it.

But there are far too many newer Tesla owners who don't know it. They bought the FSD option and errantly conclude that their car can fully drive itself when it only has AP. The number of posts in Tesla FB groups from people saying things like "I love my FSD" is sad.
ICE free since '18 / 100K+ 100% BEV miles since '14
LEAF '13 SL (mfg 12/13, leased 4/14, bought 5/17, sold 11/18, 34K mi, AHr 58, SOH 87%)
Tesla S 75D (3/17, 49K mi)
Tesla X 100D (12/18, 30K mi)
8.9kW Solar PV and 2x Powerwall
