https://www.greencarcongress.com/2021/0 ... ation.html
Plus completes driverless Level 4 semi truck highway demonstration
. . . The driverless semi truck was operated using Plus’s Level 4 autonomous driving technology, without a safety driver, teleoperator, or any other form of human intervention.
This represents a significant milestone for the autonomous trucking industry and for Plus, which demonstrated its first driverless Level 4 heavy truck operation at the Qingdao port in April 2018.
The driverless Level 4 truck demonstration was completed on the Wufengshan highway in the Yangtze Delta, China’s largest economic center. The demonstration was conducted under a special permit on the newly built highway; Plus is the first company to be granted such a permit in China.
Plus expects to launch pilot operations of a fully driverless truck for use in a dedicated environment in 2022.
Plus is also applying the Level 4 technology used in the driverless demo to deploy a commercial driver-in product for semi trucks called PlusDrive. PlusDrive can either be factory-installed as a standard configuration on newly built trucks or retrofitted to existing trucks, to help make long-haul trucking safer, more efficient, more comfortable, and better for the environment.
Customers of PlusDrive include some of the world’s largest fleets. The first customer delivery of PlusDrive started in February 2021 and mass production of the FAW J7 L3 truck powered by PlusDrive is expected to start in the third quarter of 2021.
https://www.caranddriver.com/news/a3726 ... ms-tested/
It's Not Just Tesla: All Other Driver-Assist Systems Work without Drivers, Too
Driver-assistance systems have become commonplace, and our testing found none of them can sniff out drivers aggressively misusing them
. . . While one brand's tech achieves the same end as another's, that's not to say all systems behave the same way. They don't. To demonstrate this, we performed four tests on 17 vehicles, one from most major manufacturers, to determine how much each expects the driver to remain engaged and attentive.
First, with adaptive cruise control set to 60 mph and lane centering active, we unbuckled the driver's seatbelt. Some, such as the Subaru, immediately canceled all driver aids. Others, such as the Teslas and Cadillac, went even further, braking to a halt. But the majority of the vehicles tested did nothing in this scenario.
For the second test, again with the cruise set to 60 and lane centering active, we took our hands off the wheel to see how much time passed before (a) the vehicle threw a warning and (b) the system shut down. The most conservative bunch—the Cadillac, Ford, Volvo, Toyota, and Lexus—called it quits within 21 seconds, while the Hyundai tracked for 91 seconds, covering 1.5 miles sans hands.
Then we repeated the second test, only this time we tried to trick each vehicle into thinking we had our hands on the wheel by draping a 2.5-pound ankle weight over one of the spokes. That fooled the vast majority of today's systems, which watch for torque at the steering wheel as a proxy for driver engagement. But the ones that rely on touch, such as the BMW's and Mercedes's, couldn't be gamed by the weight. Before you ask, we also tried putting tape around those wheels and, when that didn't work, a zip tie. No dice.
We had to adjust the parameters of our third test for GM's Super Cruise. Currently the only system to allow hands-free driving for extended periods, Super Cruise relies on an infrared camera pointed at the driver to determine whether sufficient attention is being paid to the road. (Ford will launch a similar system called BlueCruise later this year.) Since our hands were already off the wheel, we tested the Caddy's capability by taking our eyes off the road. It shut us down, but even today's most sophisticated system isn't foolproof. We tricked it with a pair of gag glasses emblazoned with eyeballs.
Finally, we went for the full monty, getting out of the driver's seat and letting the car fend for itself [see Safety First, below]. Every last vehicle let us. Most of them, when saddled with a weight on the helm, would mind the steering and speed for as long as you dare. Riding lawn mowers can detect a missing driver. Why can't cars?
While some vehicles cancel driver aids when the seatbelt is unlatched, a determined fool can simply buckle the belt over an empty driver's seat; no vehicle can tell the difference. The BMW, Mercedes, and Cadillac fared best against driver misuse, with the Germans having a touch-sensitive steering-wheel sensor and Cadillac's Super Cruise remaining active for only 18 seconds with no one behind the wheel—unless, of course, you put a mannequin wearing eyeball glasses in the seat.
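The torque-based monitoring these tests exploited amounts to a simple escalation timer: no steering torque for a while triggers a warning, then a shutdown, and any torque (including a weight draped over a spoke) resets the clock. The sketch below illustrates that logic; the threshold values and function names are illustrative assumptions, not figures from any production system.

```python
# Hypothetical sketch of torque-based driver-engagement monitoring.
# All thresholds are illustrative assumptions, not production values.
TORQUE_MIN_NM = 0.3   # minimum steering torque counted as "hands on"
WARN_S = 15           # seconds hands-off before a warning is shown
SHUTDOWN_S = 21       # seconds hands-off before the system disengages

def monitor(torque_samples, hz=1):
    """Return the system state for each torque reading in the stream."""
    hands_off = 0.0
    states = []
    for tq in torque_samples:
        if abs(tq) >= TORQUE_MIN_NM:
            hands_off = 0.0           # any torque resets the timer
        else:
            hands_off += 1.0 / hz
        if hands_off >= SHUTDOWN_S:
            states.append("disengaged")
        elif hands_off >= WARN_S:
            states.append("warning")
        else:
            states.append("active")
    return states
```

Because a 2.5-pound ankle weight on a wheel spoke produces a constant torque, a torque-only check like this never times out, which is exactly the loophole the third test exposed; touch-sensitive rims and driver-facing cameras close it by measuring the driver rather than the wheel.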
As these systems continue to gain capabilities, we suspect drivers will become increasingly emboldened to take risks. Automakers should close these loopholes to head off future idiocy.
https://www.autoblog.com/2021/08/17/nht ... stigation/
Does Tesla Autopilot investigation signal a new get-tough NHTSA?
'It’s terrific that the regulators are finally realizing how serious this is'
The National Highway Traffic Safety Administration’s investigation into the involvement of Tesla Autopilot in car accidents signals a more activist approach to regulating auto safety — particularly new technologies — by the Biden administration.
NHTSA, which announced the Tesla probe Monday, has opened 26 probes into various auto and highway safety issues so far this year — more than in all of 2020 or 2019, according to its website. At that rate, NHTSA would launch about 66 percent more investigations than the 25 it began last year.
Since President Joe Biden was inaugurated in January, NHTSA also has stepped up its pace of investigations into Tesla crashes. Nine probes into Tesla accidents have been opened since March; the most recent occurred in San Diego last month.
“Biden has picked leadership that is independent-minded and safety conscious,” said Jamie Court, president of Consumer Watchdog, a Los Angeles-based public advocacy group.
NHTSA said it launched the Tesla probe, which covers an estimated 765,000 vehicles from the 2014 model year onward, after 11 cars using Autopilot collided with fire trucks, police cruisers or other vehicles at crash scenes. The incidents resulted in 17 injuries and one fatality.
Monday’s announcement, combined with an order in June requiring car manufacturers including Tesla to report crashes involving automated-driving technology, shows that NHTSA is becoming more aggressive on the issue generally.
“Taken together, that order and this particular enforcement action could be the beginnings of a more active safety enforcement agenda for NHTSA,” said Paul Hemmersbaugh, who served as the agency’s general counsel under former President Barack Obama and now heads transportation work at the law firm DLA Piper LLP. . . .
Virtually all U.S. automakers are offering a variety of advanced driver-assistance systems, technologies that can help motorists park, stay in their lane or avoid obstacles. These systems help the driver with a combination of sensors, cameras and radar, but completely autonomous vehicles still aren’t commercially available.
Given the developments, it’s important for regulators to get a handle on the emerging driver-assistance technologies, especially as carmakers push toward fully driverless vehicles, said Jake Fisher, director of auto testing at Consumer Reports.
“In a way that we have not seen in 50 or even 100 years, the control of the vehicle is fundamentally changing, and it’s very serious,” Fisher said. “The technology has advanced so quickly, it’s really left regulators scrambling to try to keep up. It’s terrific that the regulators are finally realizing how serious this is."
NHTSA didn’t immediately respond to a request for comment on whether the Tesla probe signals a more aggressive approach to auto safety regulation. Tesla and Eric Williams, the company’s associate general counsel for regulatory issues, also didn’t respond.
Lawmakers who have pushed for more stringent auto regulation cheered the agency’s decision to open the Tesla probe. “NHTSA is rightly investigating Tesla’s Autopilot after a series of concerning crashes,” Sens. Richard Blumenthal and Ed Markey, both Democrats, said in a statement. “This probe must be swift, thorough, and transparent to ensure driver and public safety.”
Biden has nominated many regulators who have charted a more activist course, such as Federal Trade Commission Chairwoman Lina Khan, who has re-emphasized antitrust enforcement at the agency.
NHTSA hasn’t had a U.S. Senate-confirmed chief since 2017. Since Biden took office in January, the agency has been led by Steven Cliff, a former deputy executive officer at the California Air Resources Board, which regulates auto emissions in the Golden State.
The new chairwoman of the National Transportation Safety Board, which investigates highway safety issues but has no regulatory authority, cheered the NHTSA announcement. Jennifer Homendy, who was sworn in on Friday, called it “a positive step forward” and said she was “encouraged by NHTSA’s leadership taking action in ensuring advanced driving assistance systems function safely."
Homendy challenged Tesla in a series of tweets Dec. 29 while she was an NTSB board member, over the company’s marketing of what it calls “Full Self-Driving” capability in its cars. . . .
Ben Shneiderman, a computer science professor at the University of Maryland who studies human-computer interactions, said it’s too soon to tell if the Tesla probe is a sign of a more aggressive regulatory approach. But he said it’s long overdue.
“We can’t just allow these companies to go do these things,” he said. The social structure and governance around artificial intelligence “has to be broader than Google claiming ‘trust us, we’re really careful.’ Same thing for Tesla.”
Tesla’s marketing of its driver-assistance systems as Autopilot has led to drivers believing they don’t need to pay attention, said David Harkey, president of the Insurance Institute for Highway Safety, which conducts safety testing and represents the insurance industry.
“We’ve been calling for this more aggressive approach to step in and provide guidance to manufacturers so they’re not leading us down a path where consumers misunderstand what these systems can do,” he said. “There has to be good driver monitoring.”
https://www.autoblog.com/2021/08/18/aur ... fety-tool/
Aurora releases tool to gauge safety of self-driving systems
Provides methodology, metrics for gauging progress from development to deployment
At Google, Urmson and the now-imprisoned Anthony Levandowski were at odds over how fast to push the bounds of safety. Levandowski’s approach, after he left Google/Waymo for Uber, led to the death of a pedestrian struck by an Uber vehicle in Arizona. Urmson believed a much more careful expansion, involving far more testing and validation, was needed.

Aurora, the Silicon Valley self-driving startup founded by former Tesla, Uber and Google executives, has released what it says is the industry’s first tool for evaluating whether and when autonomous trucks and cars are safe to deploy on public roads without a human behind the wheel.
“We think this is the only way you can get to a safe, commercializable product,” said co-founder and CEO Chris Urmson of Aurora’s new Safety Case Framework.
Aurora, working with partners PACCAR and Volvo Group, aims to put its self-driving system in commercial service in heavy-duty trucks in late 2023.
The release of the safety tool, which provides a methodology and metrics for gauging progress from development to deployment, comes days after the U.S. National Highway Traffic Safety Administration (NHTSA) opened an investigation of Tesla’s Autopilot driving assistance feature following a series of crashes involving Tesla models and emergency vehicles.
Urmson said the latest NHTSA investigation of Tesla “had no bearing” on Aurora’s decision to release its framework, which he described as a "structured approach" to testing and validating the safety of self-driving systems. It includes four levels of claims connected with the safe development, testing and evaluation of Aurora's self-driving systems, which require supporting evidence.
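Safety cases of this kind are conventionally structured as a tree of claims, where each claim is either backed by direct evidence or refined into sub-claims that must all hold. The sketch below shows that generic structure (in the spirit of Goal Structuring Notation); it is an illustrative assumption about how such frameworks are organized, not the contents of Aurora's actual Safety Case Framework, which are not detailed here.

```python
# Generic sketch of a safety-case claim tree: leaf claims need direct
# evidence; parent claims hold only if every sub-claim holds.
# Claim texts and evidence strings below are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class Claim:
    text: str
    evidence: list = field(default_factory=list)   # supporting artifacts
    subclaims: list = field(default_factory=list)  # refining claims

    def supported(self):
        """A parent holds if all sub-claims hold; a leaf holds if it has evidence."""
        if self.subclaims:
            return all(c.supported() for c in self.subclaims)
        return bool(self.evidence)

top = Claim(
    "The self-driving system is acceptably safe for public-road operation",
    subclaims=[
        Claim("Development followed the defined safety process",
              evidence=["process audit report"]),
        Claim("On-road testing met its exit criteria"),  # no evidence yet
    ],
)
```

With the second sub-claim still lacking evidence, `top.supported()` is `False`; attaching a road-test report to it flips the whole case to supported, which is the sense in which each level of claims "requires supporting evidence" before deployment.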
Urmson has had an ongoing dialogue with the U.S. safety agency dating back to when he ran Google’s self-driving car program, which since has been renamed Waymo.
Aurora also has had related discussions with such professional organizations as the Society of Automotive Engineers and the Institute of Electrical and Electronics Engineers to “look at different standards and approaches to safety. . . .”