It’s Monday morning, you’re late for work, and as you merge onto the freeway you see it: the sea of red brake lights. It’s going to be a slow, frustrating trip—for all the suckers who have to drive their own cars. You click yours into autonomous mode and spend the slog getting ahead on work emails, or even catching up on sleep.
Yes, the day you become a co-driver is fast approaching. But as cars master how to see, understand, and navigate the world, researchers are shifting their attention to another subject: you. Paradoxical as it may seem, the more control the car has, the more it needs to know about the person sitting behind the wheel—whether they’re paying attention, their mood, even their health.
“We are making tremendous progress in instrumenting vehicles to know everything that’s happening around them, but there are just not enough sensors looking at the driver inside the car,” says Anuj Pradhan, who studies human factors at the University of Michigan’s Transportation Research Institute.
Used to be, if you stopped paying attention while driving, you’d just crash. And in 20 or 50 years, when cars are 100 percent autonomous, whatever you’re up to won’t matter, because you’ll have zero responsibility. Today’s technology sits between those points: The robots are doing some of the work. Tesla already sells cars that drive themselves on the highway, as long as the human monitors the system, ready to take over at any moment. Next year, Audi plans to introduce a more capable system, where the driver is demoted from supervisor to understudy, necessary only when things go to pot.
A lot of the players in this business hate that idea (a bunch are avoiding that kind of system) because people are godawful backups. They’re prone to dozing off, zoning out, goofing around. But if you want an autonomous car that can roam beyond a constrained geographical zone, or that can stay on the road in less than ideal weather conditions—and you want it this decade—you’re gonna need some human help.
Distractions Can Be a Good Thing
So researchers and engineers in the autonomous space are focusing more and more attention on the human. One surprise: Smartphones can help. “Being in an autonomous car is incredibly boring, and we have a lot of people who fall asleep,” says Wendy Ju, who studies self-driving cars at Stanford’s Center for Design Research.
Distractions like texting and tweeting, super dangerous in a regular car, can be useful in a self-driving one, engaging the human’s brain. Demanding a human take the wheel is way harder when that person’s sleeping. “These are things that keep you awake,” Ju says. “They’re actually good.”
Great—as long as the car knows what the human’s up to, and whether they’re able to take control of the car if needed. Basic driver monitoring systems have been around for more than a decade, mostly aimed at combating drowsy driving. In 2003, Volvo introduced its Intelligent Driver Information System, which monitors steering wheel and pedal inputs, and whether the turn signal is on. That’s enough to guess that the driver’s in the middle of a high-stress overtaking maneuver, and that it’s better to automatically decline that incoming call. Some BMW models will pop up an icon of a steaming cup of coffee if steering inputs start wandering and it looks like the driver could be nodding off. Toyota has used a camera to watch the driver’s eyelids.
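To get a feel for how a system can infer drowsiness from steering alone, here’s a toy sketch of the general idea: a drowsy driver tends to let the car drift, then over-correct, which shows up as unusually high variance in recent steering angles. None of this is Volvo’s or BMW’s actual algorithm; the window size and threshold here are made-up illustrations.

```python
from collections import deque


class SteeringMonitor:
    """Toy drowsiness heuristic: flag the driver when steering inputs
    start 'wandering' (drift followed by sharp over-corrections).
    Window size and threshold are illustrative, not from any vendor."""

    def __init__(self, window=50, threshold=4.0):
        self.angles = deque(maxlen=window)  # recent steering angles, in degrees
        self.threshold = threshold          # variance above this triggers a warning

    def update(self, angle):
        """Record the latest steering-angle sample."""
        self.angles.append(angle)

    def variance(self):
        """Population variance of the recent steering angles."""
        n = len(self.angles)
        if n < 2:
            return 0.0
        mean = sum(self.angles) / n
        return sum((a - mean) ** 2 for a in self.angles) / n

    def driver_may_be_drowsy(self):
        # Small, steady corrections -> low variance -> attentive.
        # Drift-and-jerk corrections -> high variance -> coffee-cup icon time.
        return self.variance() > self.threshold
```

In a real car this signal would be fused with pedal inputs, turn-signal state, lane position, and cameras before the system dared interrupt anyone; a lone variance threshold would throw false alarms on every twisty road.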
This new challenge invites a more Orwellian approach. Australian company Seeing Machines says its gaze tracking technology will allow cars to act as co-drivers, because they’ll know what the driver has and hasn’t seen. Industry supplier Pioneer wants to monitor the driver’s heart rate. Harman is working on tech that measures pupil dilation, aiming to understand cognitive load.
Your Car Will Pick up on Your Moods
This richness of information is likely crucial for the semi-autonomous car, but it could also inform how truly driverless systems work. Think about when you’re a passenger in the front seat of a regular car, and the driver is speeding, or changing lanes erratically. You may tense up, frown, tug at your seatbelt—communicating you’d prefer to slow down, please, without having to say it.
Even if you’re happy with the driving style, you may point out things the driver missed, or suggest a different route. These could all be possible with an autonomous car too, with the right sensors pointed at the person in what was the driver’s seat.
This goes both ways. The car’s computer can better calibrate how it talks to the driver, like with a louder warning if the driver is obviously distracted. It could also offer to engage autonomous systems if the road conditions look good and the driver looks tired.
Having a camera pointed at your face raises obvious privacy concerns, but Ju says it’s unlikely all that data will be collected and kept. “It would be very expensive from a bandwidth perspective to transmit video of what you’re doing in the car.” And if that changes, it may just be the price you pay for improved safety. Until your autonomous car can cope without a human altogether, and you become, at long last, irrelevant.