How safe are automated driving systems? Are some safer than others?
Seven years after Tesla started selling cars equipped with so-called Autopilot, safety authorities are still unable to answer these fundamental and vital questions.
But they took a step in that direction Wednesday with the National Highway Traffic Safety Administration’s first report on accidents involving advanced driver assistance systems.
The numbers are telling: Tesla accounts for roughly 70% of all reported accidents involving “Level 2” driving systems, which combine adaptive cruise control and automatic lane keeping and may include more advanced features such as automatic lane changes. That figure certainly provides ammunition for critics who say Elon Musk’s company has taken a reckless approach to deploying unproven technology.
However, far more detail and context are needed before regulators can say definitively whether such systems outperform human drivers, or one another.
“The data may raise more questions than it answers,” NHTSA chief Steven Cliff told reporters.
In June 2021, the agency asked automakers to report serious accidents involving Level 2 systems. Figures reported Wednesday reflect crashes that occurred from that time through May 15 this year.
Of all the crashes involving all vehicles during that period, automakers reported that 392 involved advanced driver assistance systems.
Of these, 273 were reported by Tesla, 90 by Honda and 10 by Subaru; others reported serious accidents in the single digits.
“This data provides limited insight into hundreds of accidents,” said Bryant Walker Smith, a professor specializing in automated vehicle law at the University of South Carolina School of Law. “But over the same period, there have been literally millions of other crashes.”
But no one should conclude that Level 2 systems are safer than cars driven solely by humans, he said. They might be; they might not. The NHTSA data is far too limited to support such conclusions, he said.
The data does not include the number of automated systems each company has on the road or the total miles traveled with Level 2 systems engaged. NHTSA did not comment on how thorough each company’s reporting procedures might be. The agency plans monthly reports.
Crashes that were prevented by automated systems “obviously go unreported to the extent that they didn’t occur,” Smith said. A deep dive into the cause of the reported accidents – the role played by the system, the driver, the system’s driver monitoring system and other conditions on the roadway – would help safety authorities draw firm conclusions, he said.
“What NHTSA provided was a ‘fruit bowl’ of data with many caveats that made it difficult for the public and experts alike to understand what was being reported,” Jennifer Homendy, chair of the National Transportation Safety Board, said in a statement. “Independent analysis of the data is key to identifying safety deficiencies and potential remedial actions.”
Last year’s order to report accident data marked NHTSA’s first attempt to fill a deep knowledge gap about the real-world safety implications of automated vehicle technology on public roads.
Every vehicle manufacturer’s automated system could be safer than human drivers. Or less safe. Data rich enough to draw firm conclusions is scarce. Accident data collection systems in the US are decades old, inconsistent, still paper-based in many police departments, and totally inadequate to determine the role automated systems play in preventing or causing accidents.
“One would have hoped that NHTSA would ‘do the work’ to make the numbers they publish in summaries truly comparable,” said Alain Kornhauser, director of the driverless car program at Princeton University, in an email.
In addition to collecting accident data, NHTSA is investigating why Tesla’s cars collided with emergency vehicles parked on the side of the road, often with emergency lights flashing.
The investigation was initially prompted by 11 accidents resulting in 17 injuries and one fatality, including three accidents in Southern California. The number of such accidents has since risen to 16. The technology in about 830,000 cars – all Tesla vehicles sold in the US from 2014 to 2022 – will be examined.
As part of that investigation, regulators will examine the performance of Tesla’s automatic emergency braking systems. As The Times reported last year, Tesla drivers are far more likely to report emergency braking problems than drivers of other brands.
The emergency-vehicle investigation grew more serious earlier this month when NHTSA upgraded it to an engineering analysis, or EA. That designation means investigators will take a closer look at the technical design and performance of Autopilot. Once an investigation reaches the EA stage, a recall becomes more likely.
Meanwhile, the California Department of Motor Vehicles continues to investigate whether Tesla is deceptively marketing its Full Self-Driving feature, a $12,000 option. Experts in the field overwhelmingly state that the system is nowhere near capable of safely driving itself.
However, the DMV review is more than a year old and the DMV will not say when it might be complete.
State lawmakers are increasingly concerned about the DMV’s seemingly lax approach to Tesla. In December, California Senate Transportation Committee Chairwoman Lena Gonzalez asked the DMV to provide accident and safety information to the committee. The DMV said it would look into it, and is still looking.
The DMV appears to allow Tesla to test self-driving cars on public roads without requiring the company to report accidents or system failures, as is required of competitors such as Waymo, Cruise, Argo and Zoox. DMV head Steve Gordon has declined all media requests to discuss the issue since May 2021.