GRA wrote:
Oils4AsphaultOnly wrote:
cwerdna wrote:
For those who don't follow along on TMC (I take hiatuses sometimes for months at a time), CR's findings (https://www.consumerreports.org/autonom ... ervention/) aren't really new. There are tons of threads and posts on TMC about how poorly NoA works, in general. It's been going on ever since the feature was rolled out.
And, IIRC, there are definitely some software releases that are worse than others from an AP and NoA POV. Latest isn't always the greatest.
Since GRA won't post this, I thought I'd do you both a favor and post it: https://insideevs.com/news/351110/consu ... pilot/amp/
Turns out, CR actually approves of NoA. They just don't like the version that does the lane-changes for you. Go figure. I'll still make my own judgements based upon the actual use of the product though, instead of hearsay.
If I'd seen the article (and IEVS' headline had accurately reflected CR's views), I'd have been happy to post it, but as I posted the link to and directly quoted from CR's own release, why bother to filter it through a particular forum? Your claim that CR approves of NoA is without foundation.
You're right. I'll eat crow here. "CR was fine with it otherwise", is not the same as "approving".
GRA wrote:
From an article on GCR dated May 24th: https://www2.greencarreports.com/news/1123236_tesla-navigate-on-autopilot-drives-itself-poorly-consumer-reports-finds

. . . It's worth noting that when the organization first tested Navigate on Autopilot last November on its Model S, it found the same problems, among others, and Musk later said the company would update the system to return to its original lane after a pass. The update Consumer Reports received didn't seem to include that ability. . . .
The article goes on to quote David Friedman about Tesla's pre-public release testing regimen:

Consumer Reports' vice president of advocacy, David Friedman, is the former acting administrator of NHTSA under the Obama administration. He said, "Tesla is showing what not to do on the path toward self-driving cars: release increasingly automated driving systems that aren’t vetted properly. Before selling these systems, automakers should be required to give the public validated evidence of that system’s safety—backed by rigorous simulations, track testing, and the use of safety drivers in real-world conditions."
IOW, pretty much what the FAA failed to ensure Boeing did adequately in the case of the 737 MAX. The difference is that in an airliner accident people die by the hundreds, while in car accidents the toll per incident is much smaller but the number of accidents is far greater.
Ummm, no. Boeing screwed up on their UI design and pilot training. The software behaved exactly as it was programmed to do. This is a usability design issue. The only thing they have in common with Tesla's A/P is the word "autopilot".
GRA wrote:
For an example of exactly the opposite approach to development testing compared to Tesla's, and one which I obviously believe is necessary, see the following article. BTW, in a previous post you stated that there hadn't been any backlash owing to self-driving car accidents. I meant to reply at the time, but got distracted. In fact, as noted below, there was a major backlash after the Herzberg death, and crashes in which self-driving vehicles kill non-occupants are the ones that I worry will set back the development and deployment of AVs. The general public is far more worried about being put at risk by self-driving cars that they aren't in. Anyone who's riding in one has volunteered to act as a crash-test dummy for the company, so people aren't as concerned about those deaths as they are when an AV kills a non-occupant, potentially themselves:

https://www.forbes.com/sites/alanohnsman/2019/05/19/waymo-six-month-checkup-headway-on-hand-gestures-and-cops-on-slow-ride-to-self-driving-future/#3e6c74e11124

May 19, 2019, 10:00am
Hand Gestures And Horses: Waymo’s Self-Driving Service Learns To Woo The Public. . . “We've always had a very conservative approach to making both our users and the public feel safe with this technology and what it is we're doing here," a Waymo spokeswoman said. . . .
The cautious expansion, partly driven by a public that is both intrigued by the prospect of a digital chauffeur and easily spooked by self-driving blunders, contrasts with some of the more ambitious roadmaps projected by Waymo rivals. Tesla CEO Elon Musk claims his electric car company can deliver a self-driving future as early as next year–assuming his new computer and self-driving software, perfected using camera and sensor data collected in “shadow mode” from hundreds of thousands of Teslas on the road, work as promised. General Motors’ Cruise unit has its own aggressive market plans, and Uber and Lyft are also pursuing their own self-driving car technology as a way to keep the cost of rides low. Getting the cars to navigate on their own is just the start; getting the public comfortable both driving alongside and riding in a car without a driver is another big challenge.
Numerous surveys of public sentiment, such as one released in March by AAA, show a high level of worry over driverless vehicles, especially in the wake of a deadly 2018 accident in which a self-driving Uber test vehicle struck and killed a pedestrian in Tempe and fatal crashes involving Tesla drivers who may have overestimated the capabilities of the carmaker’s Autopilot software. A new Capgemini study finds a majority of people surveyed globally are awaiting the technology with “anticipation,” plus “a degree of uncertainty and concern,” says Markus Winkler, Capgemini’s head of automotive research. . . .
MIT’s Reimer thinks it could take decades of refinement until robots truly change how most people get around. For now, winning public acceptance for the technology means finding ways to validate how well it performs in the real world. In other words, a steady ground game is better than a Hail Mary pass for autonomous technology.
“The view for companies like Waymo is ‘we have to be able to show functional safety. Otherwise, we can't protect our decisions in a court of law, where this will all end up long term,’” he said. “Elon is working mostly on the deep neural net side, where a good chunk of it is a black box. Documenting, defending that in court is going to be tough.”
Waymo has been developing self-driving tech for almost a decade, and their cars still get into accidents and cause road rage among other drivers. At the rate they're going, they'll never have a self-driving solution that works outside the test area.
One thing that people still seem to misunderstand, and I suspect you do too, is the claim that Tesla's FSD will be "feature-complete" by the end of the year. "Feature-complete" is a software development term indicating that the functional capabilities have been programmed in, but the product isn't release-ready yet. Usually at this point, under an Agile development cycle, the product is released as an alpha, and bugs are noted and fixed in the next iteration (iterations usually ship weekly, or even daily). After certain milestones have been reached, it's considered a beta, and after that RC1 (release candidate 1).
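To make the jargon concrete, here's a minimal sketch of that release progression as a gated state machine. This is purely illustrative (the stage names come from standard release practice, not from any published Tesla process, and the `promote` gate is my own invention):

```python
from enum import Enum

class Stage(Enum):
    FEATURE_COMPLETE = 0   # all functionality coded, but not release-ready
    ALPHA = 1              # early-access testing, rapid bug-fix iterations
    BETA = 2               # milestones reached, wider testing
    RELEASE_CANDIDATE = 3  # RC1: ship unless a blocking bug appears
    RELEASED = 4           # generally available to the public

def promote(stage: Stage, milestones_met: bool, open_blockers: int) -> Stage:
    """Advance one stage only when the quality gates pass."""
    if stage is Stage.RELEASED or not milestones_met or open_blockers > 0:
        return stage  # stay put until the gates clear
    return Stage(stage.value + 1)

# Feature-complete code with open bugs does not advance -- that's the point:
# "feature-complete" is the start of the stabilization cycle, not the end.
assert promote(Stage.FEATURE_COMPLETE, milestones_met=True, open_blockers=3) is Stage.FEATURE_COMPLETE
assert promote(Stage.BETA, milestones_met=True, open_blockers=0) is Stage.RELEASE_CANDIDATE
```

The takeaway: "feature-complete by year's end" is a claim about entering the alpha stage, not about public availability.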
Under this development cycle, you'll see news about FSD being tested on the roads, or in the cars of people who have signed up for the early access program. That isn't the same as public availability of FSD! You might hate it, but there's no substitute for real-world testing.