The driverless debate: Even in cars that are only semi-autonomous, drivers say they’ll text, eat and read – Los Angeles Times
Until recently, there was no question about who was responsible for an automobile’s operation: the driver. One hundred percent.
Going forward, as driverless cars hit the roads en masse, that distinction will fade. When the robot cars get here, you’ll be able to eat, text and sleep, but you won’t drive, because there won’t be a steering wheel or brake pedal. Your only role: telling the car where to go.
Yet there’s a lot of ground between the old-school cars that people are used to and the driverless experience promised in the next few years. Semi-autonomous cars with features such as self-parking, adaptive cruise control and automatic lane change are fast becoming a sizable portion of the auto industry. Most new cars today are offered with some kind of driver-assist technology package; the more expensive the car, the more advanced the feature set.
Though these cars will do much of their own driving, the human behind the wheel will continue to be responsible for the vehicle’s operation, as stipulated in the contract signed at purchase.
That’s a legal responsibility. In practice, dividing driving duties between human and car is raising uncomfortable questions about who, truly, is in charge, and what it means for auto safety.
“There’s something we used to call split responsibility,” said Hod Lipson, director of Columbia University’s Creative Machines Lab. “If you give the same responsibility to two people, they each will feel safe to drop the ball. Nobody has to be 100%, and that’s a dangerous thing.”
It’s an issue that Tesla has wrestled with ever since the May death of a Model S driver using Autopilot, the company’s popular driver-assist feature. Autopilot users are instructed to keep their hands on the wheel and to stay alert, but many — lulled by a false sense of security — have ignored those warnings. Tesla on Wednesday started rolling out improvements to the software that it says will make the feature safer.
Automakers say most customers don’t know yet what to make of driverless cars, but many want new cars equipped with technologies that can take over some aspects of driving. The companies are happy to oblige: More excitement brings more people into the showroom, and more options mean higher revenues and profits.
Supporters, including federal transportation officials, believe these cars will prove safer, too, though there’s plenty of statistical analysis yet to be done.
No matter what, though, there will still be spectacular crashes, and the more often humans let their attention drift, the more crashes and bad publicity there will be.
But that’s the new reality for the world’s roadways. From now until driverless cars are widespread, new vehicles will be something in between: part regular car, part robot, with the robot increasingly picking up the driving duties.
It could be this way for a while. Raj Nair, Ford Motor Co.’s chief technology officer, estimates that only 20% of new vehicle sales in 2030 will be completely driverless cars.
How people will drive during this transition is unclear. More than a third of respondents to a recent State Farm survey said that if a semi-autonomous car took over part of the driving duties, they’d eat, read, text, take pictures and access the Internet while driving.
The pace of evolution in driver-assist technology varies among automakers. Tesla, General Motors and Mercedes-Benz are taking an aggressive approach.
Tesla already is on the road with its Autopilot feature, the most advanced semi-autonomous system currently available. Mercedes and Audi offer semi-autonomous features that go well beyond adaptive cruise control. The 2017 Cadillac CT6 will have an Autopilot-like set of features called SuperCruise, in which the car will steer, change lanes and pass other vehicles, all with little driver effort. The same goes for Mercedes’ E-Class.
Then there are more exotic Tesla fighters in the works. Faraday Future, based in Southern California, is developing high-powered cars loaded with automation, though it has yet to detail what those plans are.
In carmaker lingo, there are six levels that describe a vehicle’s driverless capability, from zero to five.
Level 0 is no driver-assist technology at all. Level 1 covers old-fashioned stuff like traditional cruise control. At Level 2, where most driver-assist technologies stand now, the driver is expected to pay full attention. With Level 3, the car drives itself most of the time but can hand control back to the human, who must be ready to take over. Level 4 is driverless on most roads, and Level 5 is driverless anywhere.
Ford Motor Co. plans driverless cars by 2021 but will skip Level 3. Google, an early leader in autonomous vehicle technology, and Volvo, which has made safety central to its marketing, also say they plan to skip Level 3 and go straight to fully autonomous.
The levels are not rules or regulations, only loose descriptions.
“From a technical perspective, there are really only two levels,” said Jonas Nilsson, an autonomous-driving executive at Volvo Car Group. “Whether the driver is responsible or not.”
Hanging over the issue is the May Tesla crash. Cruising on Autopilot, the car ran under a truck, and the car’s driver was killed. Tesla said it analyzed Autopilot data and concluded the driver was not in sufficient control of the vehicle. The truck driver told police he heard a Harry Potter movie soundtrack playing in the crumpled car after the crash.
The incident prompted a debate on auto safety. Elon Musk, Tesla’s chief executive, said it was the first known fatality in more than 130 million miles during which Autopilot was activated. Among all vehicles in the U.S., there is a fatality every 94 million miles; worldwide, it’s every 60 million miles. (In China, a Tesla driver was killed in a crash in January, but it’s unclear whether Autopilot was on.)
Statisticians can argue about whether the numbers are comparable, given Tesla’s short track record. The company has 100,000 cars on the road equipped with Autopilot.
Musk has pledged to push ahead with autonomous features. This month he said it “would be morally wrong to withhold functionality that improves safety simply in order to avoid criticism or fear of being embroiled in lawsuits.”
Any subsequent tragedies involving driver-assist technologies are likely to draw intense media attention.