NHTSA chief questions safety level of self-driving cars – USA TODAY
NOVI, Mich. — The government’s top safety regulator says more flexible rules and new research sources are needed to determine whether autonomous vehicles will be safe enough for all American roadways.
“Today everybody expects a regulation comes out and that’s what it is forever. That will not work,” Mark Rosekind, head of the National Highway Traffic Safety Administration, said at the annual Telematics Update conference in Novi, Mich.
Rosekind added that NHTSA will release guidelines next month that could set the near-term rules of the road in autonomous vehicle research.
But he warned those will change.
“We need new safety metrics,” Rosekind said. “We also are going to have to broaden our view on the data sources for what those metrics might be. We have laboratory work. We have simulations and real world data.”
In short, the industry and regulators don’t know everything they don’t know about the safety of the most advanced autonomous technologies.
Rosekind, who may leave his position when the next administration takes over in January, acknowledged that the technology will change faster than regulators’ ability to make new rules.
Most vehicles that are pushing full autonomy, or Level 4, are limited to proving grounds, specially designed test environments such as Mcity in Ann Arbor or other well-guarded settings.
Further, being able to convey people over a short distance does not mean the autonomy systems will operate safely on public roads for 12,000 to 20,000 miles per year.
Automakers continue to introduce semi-autonomous, or Level 2, features such as adaptive cruise control, lane departure alert and forward collision avoidance that clearly enhance safety.
But the path to full autonomy gets tricky when sensors, software, 3D maps and algorithms create the possibility of a driver relinquishing control.
Tesla Motors has introduced a feature called Autopilot that can do some of that. But earlier this week, one of its Model X SUVs equipped with Autopilot crashed into a commercial building in Irvine, Calif. The driver alleged that the car accelerated on its own from a parking space. Tesla said the vehicle’s logs showed that the Autopilot feature had not been activated.
The incident illustrates how difficult it is to determine how safe is safe enough.
Also speaking at the Novi conference were James Fackler, assistant administrator in the Michigan Secretary of State’s office, and Jude Hurin, head of the Nevada Department of Motor Vehicles. The two state officials are already exploring whether autonomous vehicles should change the standards for who can get a driver’s license.
They must carry out those roles even as Michigan, Nevada and many other states are trying to attract funding for testing facilities such as Mcity and the American Center for Mobility under construction at Willow Run.
Last week at the Mackinac Policy Conference, Gov. Rick Snyder announced a branding campaign called Planet M to promote Michigan as the world’s center for innovative transportation and mobility research.
In one of Google’s most widely seen videos promoting its Google car, a blind man, Steve Mahan, is guided from home to a fast-food restaurant, a level of independence that to him seems miraculous.
“Michigan is taking the approach of ‘Let’s start slow,’ ” Fackler said. “If there is someone behind the wheel, let’s make sure they are not technically unable to operate it. Some people outside the regulatory community say ‘Well, you’re standing in the way of future technology.’ But I want to make sure that if something does happen to this car that the person who is there is ready to take over.”