If the Feds Don't Act, Expect More Autonomous Car Accidents
The death of Elaine Herzberg after a collision with an Uber self-driving car could hinder autonomous vehicle development, but it might also force much-needed federal regulation.
By Doug Newcomb
This story originally appeared on PCMag
The tragic accident in Tempe, Ariz., last Sunday evening that killed Elaine Herzberg as she walked her bike across the road should never have happened. But the same could be said of the 197 pedestrians struck by cars and killed in Arizona in 2016, and of the more than 40,000 people who died in car accidents in the U.S. during the same period.
Herzberg's death caused a stir because she was struck by one of the self-driving Volvos Uber was testing in the Phoenix area. Because self-driving tech is new and not completely trusted, this one roadway death out of thousands every year has been intensively covered by the media and scrutinized by traffic safety experts, and it has prompted considerable hand-wringing among companies that are heavily investing in autonomous technology. It should spur questions about the technology -- and government regulators into action.
A video of the accident, captured from inside and outside the vehicle and released on Wednesday, clearly shows that neither the car's sensors nor the human safety driver behind the wheel saw Herzberg. The car did not slow down or swerve to avoid hitting her. Even though the Volvo XC90 comes equipped with a range of sensors and driver-assist systems -- including pedestrian and cyclist detection with automatic braking -- and the Uber vehicle was outfitted with more sophisticated sensors such as lidar, the vehicle inexplicably never detected Herzberg as she crossed a darkened road.
Her death demonstrates that self-driving tech in its current form isn't ready for public roads. But it also shows why we desperately need federal regulation to protect both people and autonomous vehicle innovation.
Autonomous tech testers flock to Phoenix
Companies like Uber, Lyft and Waymo have flocked to Phoenix because of Arizona's lax laws on autonomous vehicle testing. That prompted California to relax its regulations last month, although the state requires self-driving vehicles tested on public roads to have safeguards like remote operation.
States like Florida, which has favorable weather, and Michigan, home to the U.S. car industry, have also adopted a laissez-faire approach to autonomous vehicle testing in hopes of attracting or retaining automaker and tech company dollars.
Meanwhile, self-driving car advocates have been waiting on Washington to give guidance on autonomous regulation. The Obama administration and former Transportation Secretary Anthony Foxx laid the groundwork in January 2016 with a series of voluntary safety guidelines and by offering $4 billion in federal funding to foster the testing and development of self-driving technology.
After power changed hands in D.C. a year later, the federal government and Congress delayed picking up the baton for several months due to partisan bickering and lobbying by special interests. Now, self-driving car policy has effectively stalled on Capitol Hill.
Transportation Secretary Elaine Chao has said that the USDOT won't stifle innovation, but she hasn't shown leadership on self-driving cars. More than a year into the Trump administration, the National Highway Traffic Safety Administration (NHTSA), the division of the USDOT that supervises motor vehicle regulation and was tasked under the Obama guidelines with overseeing autonomous vehicle policy, still doesn't have an administrator. Overseas, China and Germany have made development of the technology a political priority.
This will continue to leave states to figure it out for themselves -- and, in the current Wild West environment, it will likely lead to more accidents like the one that killed Elaine Herzberg. Collateral damage will also occur to self-driving technology and semi-autonomous driver assistance systems, since the public will likely view them with even more suspicion after this mishap.
There's little doubt that self-driving cars will ultimately save lives, and they will need to be safely tested on public roads -- but only with sufficient federal regulation and oversight of the technology. Until then, more people will continue to die on U.S. roads every day. And that, too, is a tragedy.