Tesla's autopilot, on the road

My Nissan Leaf Forum

Oils4AsphaultOnly said:
GRA said:
Oils4AsphaultOnly said:
There's nothing curious about it. The driver lost control and crashed within 550 ft of his driveway. The doors were jammed shut from the crash (which happens - remember Paul Walker?), and the driver tried to flee through the backseat, but didn't make it (that's why he was in the backseat). We've long known that autopilot wasn't a contributing factor and this report was released to settle the FUD.

The constable at the scene should never have claimed that no one was driving the car. He should've stopped with "no one was in the driver's seat" and not claimed anything as fact beyond that.


You know for a fact that the driver tried to flee through the backseat? Shouldn't you stop with "no one was in the driver's seat" and not claim anything as fact beyond that?

My original hypothesis (posted last week) was that the driver fled the scene. Now that they have conclusive evidence that the driver died at the scene, many suppositions can be laid to rest:
- The driver was found dead in the back seat.
- The driver was seen entering the driver's seat and driving away by local security cameras.
- The car had 550 ft to accelerate to impact speed.

Indeed, an alien or a ghost could've intervened and moved the driver, or maybe the recoil from the impact could've ejected the driver into the back seat, but Occam's razor rules out the first possibility and Newton's first law rules out the second. If you don't start with the assumption that Tesla's autopilot is at fault, then the truth is much simpler to grasp.


An alternative explanation is that the car stopped and the driver moved from the front to the back before the drive resumed. We don't yet know what happened during the entire drive. The NTSB issued this today:

UPDATE 5/12/21: An NTSB spokesperson clarified the agency's findings about whether the driver was seated at the wheel at the time of the crash with this statement to Car and Driver: "The NTSB has made no conclusions about the operation of the crash vehicle—we are only stating the facts as we know them at this early stage of our investigation and all we know at this point, with regard to vehicle operation, is that Autosteer was not available on the section of road where the crash happened when we tested the exemplar vehicle. Autopilot would not engage because Autosteer was not available. We have not made any conclusion about the operation of the crash vehicle during the crash sequence. That remains under investigation."

Hopefully some other info will show up to answer this question.

Meanwhile, a local idiot was arrested after being videoed from another car riding in the back seat of his Tesla on I-80 with no one at the wheel. He's an arrogant young prick and says he plans to continue doing it. In fact, in an interview after he got home he said that, after spending the night in jail, he got home the same way. See the video (unfortunately, you have to wait for the ad to play first): https://www.ktvu.com/news/safety-re...ial-investigation-into-tesla-autopilot-system

Seeing as how Tesla can do over-the-air updates, I wonder if they will disable A/P in this moron's car. The alternative, if/when he hits and injures or kills someone, is that any attempt by Tesla to claim that this was unforeseeable goes right out the window. And this and the CR test show how trivially easy it continues to be to fool A/P into controlling the car with no one at the wheel.
 
'I'm very rich': Back seat Tesla rider pulls same stunt, but in new car after jail release
https://www.ktvu.com/news/back-seat-tesla-rider-pulls-same-stunt-but-in-new-car-after-jail-release
 
cwerdna said:
'I'm very rich': Back seat Tesla rider pulls same stunt, but in new car after jail release
https://www.ktvu.com/news/back-seat-tesla-rider-pulls-same-stunt-but-in-new-car-after-jail-release

Suspend his license for 10 years. Maybe by then FSD will be a reality (though I have my doubts).
 
GRA said:
Oils4AsphaultOnly said:
GRA said:
You know for a fact that the driver tried to flee through the backseat? Shouldn't you stop with "no one was in the driver's seat" and not claim anything as fact beyond that?

My original hypothesis (posted last week) was that the driver fled the scene. Now that they have conclusive evidence that the driver died at the scene, many suppositions can be laid to rest:
- The driver was found dead in the back seat.
- The driver was seen entering the driver's seat and driving away by local security cameras.
- The car had 550 ft to accelerate to impact speed.

Indeed, an alien or a ghost could've intervened and moved the driver, or maybe the recoil from the impact could've ejected the driver into the back seat, but Occam's razor rules out the first possibility and Newton's first law rules out the second. If you don't start with the assumption that Tesla's autopilot is at fault, then the truth is much simpler to grasp.


An alternative explanation is that the car stopped and the driver moved from the front to the back before the drive resumed. We don't yet know what happened during the entire drive. The NTSB issued this today:

UPDATE 5/12/21: An NTSB spokesperson clarified the agency's findings about whether the driver was seated at the wheel at the time of the crash with this statement to Car and Driver: "The NTSB has made no conclusions about the operation of the crash vehicle—we are only stating the facts as we know them at this early stage of our investigation and all we know at this point, with regard to vehicle operation, is that Autosteer was not available on the section of road where the crash happened when we tested the exemplar vehicle. Autopilot would not engage because Autosteer was not available. We have not made any conclusion about the operation of the crash vehicle during the crash sequence. That remains under investigation."

Hopefully some other info will show up to answer this question.

Except that TACC won't take you past the speed limit (25 mph on unmarked residential streets). So no, the driver had to have had his foot on the accelerator to get up to crash speed.
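For a rough sense of scale, here's a back-of-envelope check, assuming a standing start and constant acceleration, with the acceleration values picked purely for illustration (not taken from any crash data):

# Back-of-envelope: speed reachable from rest over 550 ft at constant acceleration.
# The acceleration values are illustrative assumptions, not measurements from the crash.
FT_TO_M = 0.3048
distance_m = 550 * FT_TO_M  # ~168 m

def speed_after(d_m, accel_ms2):
    # v^2 = 2*a*d from a standing start; convert m/s to mph
    v_ms = (2 * accel_ms2 * d_m) ** 0.5
    return v_ms * 2.23694

for a in (2.5, 5.0, 9.0):  # gentle, brisk, and near-full-throttle for a quick EV
    print(f"a = {a} m/s^2 -> ~{speed_after(distance_m, a):.0f} mph over 550 ft")

Even the gentle case comes out around 65 mph, and the harder cases land in the 90-120 mph range, so reaching a violent impact speed within 550 ft only requires the driver to keep his foot in it.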
 
cwerdna said:
'I'm very rich': Back seat Tesla rider pulls same stunt, but in new car after jail release
https://www.ktvu.com/news/back-seat-tesla-rider-pulls-same-stunt-but-in-new-car-after-jail-release

You just can't prevent human stupidity. The stupidest thing we can do is to try to prevent human stupidity. That's why strollers come with warning labels about "taking the baby out before folding the stroller". As a parent, I'm appalled at the waste of time and material spent to affix these warning labels (there are half a dozen others on that stroller), which add a significant amount of cost to the stroller for no additional benefit (those stupid enough to make that mistake generally wouldn't read the warning labels anyway).

Humans make mistakes, fine. But deliberately circumventing safety systems isn't "a mistake". There needs to be a line drawn where the "driver" is ultimately responsible for their own actions.
 
cwerdna said:
'I'm very rich': Back seat Tesla rider pulls same stunt, but in new car after jail release
https://www.ktvu.com/news/back-seat-tesla-rider-pulls-same-stunt-but-in-new-car-after-jail-release

He said "Teslas don't crash." LOL, he's inside his own bubble.
 
LeftieBiker said:
Those labels add little cost. And that Tesla A-hole should have his license yanked.

Multiplied across the hundreds of millions printed annually (those labels are international), those little pieces of plasticized paper and glue add up to a pretty significant cost. They're much less significant than disposable plastic-ware, but producing useless things is still a waste of resources.
 
Both L.A. Times:
DMV probing whether Tesla violates state regulations with self-driving claims

https://www-latimes-com.cdn.ampproj...-17/dmv-tesla-california-fsd-autopilot-safety


. . . Asked for detail, DMV spokesperson Anita Gore said via email, “The DMV cannot comment on the pending review.” She did list the penalties that might be applied if a company is found to have violated DMV regulations that prohibit misleading advertising concerning automated vehicles.

In small print, Tesla says on its website that full self-driving “does not make the car autonomous” and that “active supervision” is required by the driver. But social media are rife with videos showing drivers, mostly young men, overcoming Tesla’s easily defeated driver-monitoring system to crawl into the back seat and let the Tesla “drive itself” down public highways. . . .

Although a driver is legally responsible for such misbehavior, the fine print in Tesla advertising provides a weak defense against deceptive marketing allegations, said Bryant Walker Smith, a leading expert on automated vehicle law at the University of South Carolina. He cites the Lanham Act, the federal law that governs trademarks.

If the DMV finds Tesla is misleading customers, potential penalties include suspension or revocation of DMV autonomous vehicle deployment permits and manufacture and dealer licenses, the DMV spokesperson said. She added that “a vehicle operating on public roads using autonomous technology without first obtaining a permit can be removed from the public roadway by a police officer.”

Although the National Highway Traffic Safety Administration has no authority to regulate vehicle advertising, the DMV’s own rules allow it to sanction manufacturers that advertise a vehicle as autonomous when it is not. The Federal Trade Commission also regulates such advertising; an FTC spokesperson declined to comment. A request to interview the DMV’s director, Steve Gordon, was declined. Tesla lacks a media relations department.

In July 2020, a Munich court ruled that Tesla had been misleading consumers about the capabilities of its autonomous systems and ordered the company’s German subsidiary to stop using phrases such as “full potential for autonomous driving” on its website and in advertising materials. . . .

“Tesla seems to be asking for legal trouble on many fronts,” law professor Smith said. “From the FTC and its state counterparts for deceptive marketing. From the California DMV for, potentially, crossing into the realm of autonomous vehicle testing without state approval, from competitors with driver assistance systems, competitors with actual automated driving systems, ordinary consumers, and future crash victims who could sue under state or federal law.”

Tesla is facing hundreds of lawsuits. At least several deaths have been connected with use or misuse of Autopilot. NHTSA has more than 20 investigations open on Tesla, though how long they’ll take to be resolved, NHTSA won’t say. . . .



Tesla driver in fatal California crash had posted videos of himself in vehicle

https://www-latimes-com.cdn.ampproj...a-crash-had-post-videos-of-himself-in-vehicle


The driver of a Tesla involved in a fatal crash that California highway authorities said may have been operating on Autopilot posted social media videos of himself riding in the vehicle without his hands on the wheel or foot on the pedal.

The May 5 crash in Fontana, 50 miles east of Los Angeles, is also under investigation by the National Highway Traffic Safety Administration. It is the 29th case involving a Tesla that the federal agency has probed.

In the Fontana crash, a 35-year-old man identified as Steven Michael Hendrickson was killed when his Tesla Model 3 struck an overturned semi on a freeway about 2:30 a.m. . . .

Another man was seriously injured when the electric vehicle hit him as he was helping the semi’s driver out of the wreck.

The California Highway Patrol announced Thursday that its preliminary investigation had determined that the Tesla’s partially automated driving system called Autopilot “was engaged” before the crash. The agency said it was commenting on the Fontana crash because of the “high level of interest” about Tesla crashes and because it was “an opportunity to remind the public that driving is a complex task that requires a driver’s full attention.”

However on Friday, the agency walked back its previous declaration.

“To clarify,” a new CHP statement said, “There has not been a final determination made as to what driving mode the Tesla was in or if it was a contributing factor to the crash. . . .”

Tesla, which has disbanded its public relations department, did not respond Friday to an email seeking comment. The company says in its owner’s manuals and on its website that both Autopilot and “Full Self-Driving” are not fully autonomous and that drivers must pay attention and be ready to intervene at any time.

Autopilot at times has had trouble dealing with stationary objects and traffic crossing in front of Teslas.

(Description of the Florida and Mountain View fatal crashes omitted)

Tesla’s system, which uses cameras, radar and short-range sonar, also has trouble handling stopped emergency vehicles. Teslas have struck several firetrucks and police vehicles that were stopped on freeways with their flashing emergency lights on.

After the Florida and California fatal crashes, the National Transportation Safety Board recommended that Tesla develop a stronger system to ensure drivers are paying attention, and that it limit use of Autopilot to highways where it can work effectively. Neither Tesla nor the safety agency took action. . . .

NHTSA, which has authority to regulate automated driving systems and seek recalls if necessary, seems to have developed a renewed interest in the systems since President Biden took office.


(Snore . . . Riinng . . . Snore . . . Riiinng . . . snore . . Riinngg . . . Snort! Grabs phone) National Highway Traffic Safety Administration! Uh, what? Yes sir, wide awake and on the job! We'll get right on that! Yes, sir, will do! Goodbye, sir. (Phone is hung up. . . . . . . snore . . . snore)


While we're still waiting on confirmation, it's likely that a non-occupant was injured by a Tesla driving itself, which collided with a semi-dumptruck lying on its side on the freeway. If it proves to be another A/P-enabled accident, do you suppose that the regulatory agencies will finally do their jobs? I have my doubts.

IEVS:
NHTSA Misses Opportunity To Say There's No Autonomous Car Yet

It created a series of videos with Jason Fenske but did not address that matter.

https://insideevs.com/news/507889/nhtsa-no-autonomous-vehicles-yet/


. . . Autopilot and FSD unite many of these features in a single system. Yet, as Tesla itself told regulators and writes in its manuals, none of them is more than a Level 2 driving aid. Trying to prove they are or believing that is the case is what led to many crashes so far involving the system. Apart from the two new cases that have emerged, NHTSA was already investigating 18. The agency currently has 25 SCIs (Special Crash Investigations) ongoing.

If it is confirmed that Autopilot or FSD was active in the Texas and Fontana crashes, 20 of the 25 cases the agency is investigating will be related to the driver-assistance aids Tesla offers. It would have been a good idea to mention that in these videos and stress what these systems really are.
 
GRA said:
While we're still waiting on confirmation, it's likely a non-occupant was injured by a Tesla driving itself, which collided with a semi-dumptruck lying on its side on the freeway. Do you suppose that the regulatory agencies will finally do their jobs?

Drivers falling asleep at the wheel at 3am also get into car accidents. Why is it automatically assumed that the car was in autopilot?

Unlike the Texas accident, a wrecked vehicle in the middle of the highway is indeed one of autopilot's known failure points, so I'm not saying it wasn't on; I'm just questioning why that's the default assumption. Have we not yet learned our lesson from last month?
 
Oils4AsphaultOnly said:
GRA said:
While we're still waiting on confirmation, it's likely a non-occupant was injured by a Tesla driving itself, which collided with a semi-dumptruck lying on its side on the freeway. Do you suppose that the regulatory agencies will finally do their jobs?

Drivers falling asleep at the wheel at 3am also get into car accidents. Why is it automatically assumed that the car was in autopilot?


It's not automatically assumed; it's a matter of probability, given the driver's own statements and behavior in videos, along with his membership in a Tesla club. It all points to him being a serious fanboy who thought A/P & FSD were terrific:
"What would I do without my full self-driving Tesla after a long day at work," said a message on one. "Coming home from LA after work, thank god, self-drive," said a comment on another video, adding, "Best car ever!"


I left room for another possible explanation, but the above quotes, IMO, strongly tilt the probability towards A/P & FSD controlling the car, since they indicate his mindset, on top of the fact that he hit a truck lying on its side across several lanes.


Oils4AsphaultOnly said:
Unlike the Texas accident, a wrecked vehicle in the middle of the highway is indeed one of autopilot's known failure points, so I'm not saying it wasn't on; I'm just questioning why that's the default assumption. Have we not yet learned our lesson from last month?

See above.
 
Tesla on 'autopilot' crashes into deputy's vehicle in Washington state
There were no injuries, but there was significant damage to the patrol vehicle, the sheriff's office said.
https://www.nbcnews.com/news/us-news/tesla-autopilot-crashes-deputy-s-vehicle-washington-state-n1267716
https://www.facebook.com/SnoCoSheriff/posts/5604732652934095
 
GRA said:
Oils4AsphaultOnly said:
GRA said:
While we're still waiting on confirmation, it's likely a non-occupant was injured by a Tesla driving itself, which collided with a semi-dumptruck lying on its side on the freeway. Do you suppose that the regulatory agencies will finally do their jobs?

Drivers falling asleep at the wheel at 3am also get into car accidents. Why is it automatically assumed that the car was in autopilot?


It's not automatically assumed; it's a matter of probability, given the driver's own statements and behavior in videos, along with his membership in a Tesla club. It all points to him being a serious fanboy who thought A/P & FSD were terrific:
"What would I do without my full self-driving Tesla after a long day at work," said a message on one. "Coming home from LA after work, thank god, self-drive," said a comment on another video, adding, "Best car ever!"


I left room for another possible explanation, but the above quotes, IMO, strongly tilt the probability towards A/P & FSD controlling the car, since they indicate his mindset, on top of the fact that he hit a truck lying on its side across several lanes.


Oils4AsphaultOnly said:
Unlike the Texas accident, a wrecked vehicle in the middle of the highway is indeed one of autopilot's known failure points, so I'm not saying it wasn't on; I'm just questioning why that's the default assumption. Have we not yet learned our lesson from last month?

See above.

None of the articles referenced FSD, and there's no indication that he was part of the Beta-test pool, so the fact that he would call it "self-drive" indicates that he didn't treat autopilot as a driving aid, which is how it should be treated. But I agree that he probably became over-reliant on autopilot to drive him when he shouldn't have. 3am sounds like either a drunk or a sleepy driver, neither of which is a condition for getting behind the wheel. The sooner we get FSD, the safer we'll all be; otherwise, we're sharing the road with drivers who make poor decisions.
 
Oils4AsphaultOnly said:
GRA said:
Oils4AsphaultOnly said:
Drivers falling asleep at the wheel at 3am also get into car accidents. Why is it automatically assumed that the car was in autopilot?


It's not automatically assumed; it's a matter of probability, given the driver's own statements and behavior in videos, along with his membership in a Tesla club. It all points to him being a serious fanboy who thought A/P & FSD were terrific:
"What would I do without my full self-driving Tesla after a long day at work," said a message on one. "Coming home from LA after work, thank god, self-drive," said a comment on another video, adding, "Best car ever!"


I left room for another possible explanation, but the above quotes, IMO, strongly tilt the probability towards A/P & FSD controlling the car, since they indicate his mindset, on top of the fact that he hit a truck lying on its side across several lanes.

None of the articles referenced FSD, and there's no indication that he was part of the Beta-test pool, so the fact that he would call it "self-drive" indicates that he didn't treat autopilot as a driving aid, which is how it should be treated. But I agree that he probably became over-reliant on autopilot to drive him when he shouldn't have. 3am sounds like either a drunk or a sleepy driver, neither of which is a condition for getting behind the wheel. The sooner we get FSD, the safer we'll all be; otherwise, we're sharing the road with drivers who make poor decisions.


Sure they did:
What would I do without my full self-driving Tesla after a long day at work?
Here's another article probably referencing the same quote on FSD, from Reuters:
Tesla crash victim lauded 'full self-driving' in videos on Tiktok
https://www.reuters.com/business/au...-full-self-driving-videos-tiktok-2021-05-16/


It remains to be seen whether his car had it or not, but he certainly thought he could treat it that way. Not that "FSD" is anything of the sort; it's still Level 2. Once we get true Level 4 or Level 5 ADS, then yes, the roads will be safer. As it is, we have L2 systems that make extra dumb mistakes that alert human drivers wouldn't. Since the whole point of ADS/DAS is to make the roads safer by avoiding human errors, they shouldn't be allowed to replace human errors with machine errors, particularly when the systems are allowed to operate in conditions they're known to be unable to cope with.
 
GRA said:
Oils4AsphaultOnly said:
GRA said:
It's not automatically assumed; it's a matter of probability, given the driver's own statements and behavior in videos, along with his membership in a Tesla club. It all points to him being a serious fanboy who thought A/P & FSD were terrific:


I left room for another possible explanation, but the above quotes, IMO, strongly tilt the probability towards A/P & FSD controlling the car, since they indicate his mindset, on top of the fact that he hit a truck lying on its side across several lanes.

None of the articles referenced FSD, and there's no indication that he was part of the Beta-test pool, so the fact that he would call it "self-drive" indicates that he didn't treat autopilot as a driving aid, which is how it should be treated. But I agree that he probably became over-reliant on autopilot to drive him when he shouldn't have. 3am sounds like either a drunk or a sleepy driver, neither of which is a condition for getting behind the wheel. The sooner we get FSD, the safer we'll all be; otherwise, we're sharing the road with drivers who make poor decisions.


Sure they did:
What would I do without my full self-driving Tesla after a long day at work?
Here's another article probably referencing the same quote on FSD, from Reuters:
Tesla crash victim lauded 'full self-driving' in videos on Tiktok
https://www.reuters.com/business/au...-full-self-driving-videos-tiktok-2021-05-16/


It remains to be seen whether his car had it or not, but he certainly thought he could treat it that way. Not that "FSD" is anything of the sort; it's still Level 2. Once we get true Level 4 or Level 5 ADS, then yes, the roads will be safer. As it is, we have L2 systems that make extra dumb mistakes that alert human drivers wouldn't. Since the whole point of ADS/DAS is to make the roads safer by avoiding human errors, they shouldn't be allowed to replace human errors with machine errors, particularly when the systems are allowed to operate in conditions they're known to be unable to cope with.

Autopilot != FSD.

Once you get that, you'll understand why this is all a bunch of hot air.

Edit: FSD isn't Level 2 driver assistance; autopilot is. But that doesn't change the fact that he thought he had something he didn't, and he died for it. How many more cases of "sudden unintended acceleration" occurred before people realized that the human behind the wheel was at fault? I'm afraid that's what it's going to take here.

The reason I'm pushing back so hard is that, when used correctly, autopilot is a HUGE driving benefit that people who don't use it will NEVER understand. And the side effect of all this autopilot misunderstanding is that Tesla will be forced to disable autopilot for ALL. It'll be a case of the Zen Magnets ban (https://en.wikipedia.org/wiki/Neodymium_magnet_toys): a few people being stupid means no one else can enjoy them. Consumer overprotection at its finest, except it's not about toys that'll be rescinded.
 
Oils4AsphaultOnly said:
Autopilot != FSD.
You know it, I know it, and anyone here who isn't trolling knows it.

But there are far too many newer Tesla owners who don't know it. They bought FSD and erroneously conclude that their car already has FSD when it only has AP. The number of posts in Tesla FB groups from people saying things like "I love my FSD" is sad.
 
This posting is for GRA and anyone else who feels that Tesla is going about autonomous vehicle development the wrong way.

Waymo is seen by most in the media (though not the tech community, who vehemently disagree) as the leader in autonomous tech. Yet, last week, their robotaxi blocked traffic and wouldn't let the service technician in, because the car didn't know how to navigate some cones that hadn't been pre-mapped. In a city that they had been developing and testing in for years, to stumble on something so simple means they're a VERY LONG way from any form of true Level 5 deployment. The fact that their code depends on HD mapping of any area they would service means that their solution will also never scale, which means it'll be too costly to compete with personal vehicle ownership.

Human drivers account for nearly all of the miles driven on the roads today, and they also account for the most accidents and cause the most deaths. On a per-mile basis, it's roughly 90 million miles per death, and that hasn't changed much in recent years. That's why flying is safer. The idiot human driver is the last barrier to overcome.
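As a quick sanity check on that per-mile figure, using rough pre-pandemic US annual totals of about 36,000 road deaths and about 3.2 trillion vehicle-miles traveled (ballpark inputs, not exact statistics):

# Rough check of the miles-per-death figure from approximate US annual totals.
# Both inputs are ballpark pre-pandemic values, not exact statistics.
annual_deaths = 36_000      # approx. US road deaths per year
annual_vmt = 3.2e12         # approx. US vehicle-miles traveled per year

miles_per_death = annual_vmt / annual_deaths
deaths_per_100m_miles = annual_deaths / (annual_vmt / 1e8)

print(f"~{miles_per_death / 1e6:.0f} million miles per death")      # ~89 million
print(f"~{deaths_per_100m_miles:.1f} deaths per 100 million miles")  # ~1.1

That lands right around the roughly 90 million miles per death cited above.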
 