Driverless Cars Need Ears as Well as Eyes – WIRED
You need just two eyes and two ears to drive. Those remarkable sensors provide all the info you need to, say, know that a fire engine is coming up fast behind you, and that you should get out of the way. Autonomous vehicles need a whole lot more than that. They use half a dozen cameras to see everything around them, radars to know how far away it all is, and at least one lidar laser scanner to map the world. Yet even that may not be enough.
To understand why, think about that fire engine. Your ears hear it approaching from behind, and your brain compares the sound reaching each ear to determine where it is, where it's headed, and how fast it's moving. Hearing plays an essential role in how you navigate the world, and, so far, most autonomous cars can't hear. Engineers at the outfits developing robocars are trying to figure out how to give them that skill, and any other human traits they'll need to hit the roads.
“Since the technology is relatively new, we still don’t have all the answers as to what is best,” says Jeff Miller who studies driverless vehicle systems at USC.
Waymo, which is testing a fleet of autonomous minivans in the Phoenix area, has developed microphones that let its robocars hear sounds twice as far away as its previous sensors could, while also letting them discern where a sound is coming from.
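How does a pair of microphones tell a car where a siren is? One common technique is to measure the tiny difference in when the sound reaches each microphone, much as your brain does with your ears. The sketch below illustrates that idea with cross-correlation; it is a toy example, not a description of Waymo's actual hardware or software, and all names in it are invented.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C


def estimate_bearing(left, right, mic_spacing, sample_rate):
    """Estimate a sound source's bearing from a two-microphone recording.

    Uses the time difference of arrival (TDOA): cross-correlate the two
    channels to find the lag at which they best align, then convert that
    lag to an angle. Returns degrees; positive means the source is on
    the side whose microphone heard the sound first (here, the right).
    """
    # Lag (in samples) at which the left channel best matches the right.
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    dt = lag / sample_rate  # time difference of arrival, in seconds

    # Far-field geometry: dt = (d * sin(theta)) / c, so
    # theta = arcsin(c * dt / d). Clamp for numerical safety.
    sin_theta = np.clip(SPEED_OF_SOUND * dt / mic_spacing, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))
```

Real systems use more microphones and more robust correlation methods, but the principle is the same: small timing differences encode direction.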
It recently spent a day testing the system with emergency vehicles from the Chandler, Arizona, police and fire departments. Police cars, ambulances, fire trucks, and even unmarked cop cars chased, passed, and led the Waymo vans through the day and into the night. Sensors aboard the vans recorded vast quantities of data that will help create a database of all the sounds emergency vehicles make, so in the future, Waymo's driverless cars will know how to respond. If it's a fire truck coming up behind, the car will pull over. If the car is waiting at a green light and an ambulance is approaching from the left, it will yield.
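Once a car has classified a siren and worked out its direction, the remaining step is a decision rule like the ones the article describes: pull over for a vehicle coming up from behind, yield to one approaching from the left at a green light. The function and thresholds below are invented purely to make that logic concrete.

```python
def respond_to_siren(bearing_deg: float, light_state: str) -> str:
    """Pick a maneuver given a detected emergency siren.

    bearing_deg: direction of the siren relative to the car's heading,
    in degrees (0 = dead ahead, +/-180 = directly behind,
    negative values = to the left).
    light_state: state of the traffic light the car is facing,
    e.g. "green", "red", or "none".
    """
    coming_from_behind = abs(bearing_deg) > 135
    coming_from_left = -135 <= bearing_deg < -45

    if coming_from_behind:
        return "pull_over"            # let it pass
    if light_state == "green" and coming_from_left:
        return "yield"                # stay out of the intersection
    return "proceed_with_caution"
```

A production planner would weigh far more context (lane geometry, other traffic, whether pulling over is even possible), but the database Waymo is building feeds exactly this kind of classify-then-respond pipeline.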
This information expands the enormous dataset Waymo has amassed in the eight years it has been working on autonomous tech. The more info and experience these artificially intelligent systems have, the better decisions they'll make out on the road.
Beyond hearing, robocars also must develop some semblance of the other types of communication you take for granted. Think about what happens when you approach a four-way stop. With a nod, a smile, or an extended middle finger, you communicate things like “No, you go ahead” or “Hey! I’m coming through.” A wave to pedestrians invites them to cross.
Robocars obviously can't do this, so they'll rely upon vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) radio communications to communicate with other vehicles and with the road itself. Apple just filed a patent for wireless vehicle-to-vehicle communication, and Cadillac's latest CTS sedan can talk to other Cadillacs to warn them, for example, that it's slamming on the brakes. The National Highway Traffic Safety Administration has proposed making a similar system mandatory in all cars by 2020. "V2V is going to be bigger within the next few years," says Miller.
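What does a V2V broadcast actually contain? Systems like Cadillac's send out a small packet of position, speed, and status data many times a second, loosely standardized as the SAE J2735 Basic Safety Message. The sketch below is a simplified, invented version of such a message; real BSMs use a compact binary (ASN.1) encoding rather than JSON, and the field names here are illustrative only.

```python
from dataclasses import dataclass, asdict
import json


@dataclass
class BasicSafetyMessage:
    """A toy V2V broadcast: who I am, where I am, what I'm doing."""
    vehicle_id: str
    latitude: float
    longitude: float
    speed_mps: float
    hard_braking: bool


def encode(msg: BasicSafetyMessage) -> bytes:
    # Serialize for radio broadcast (JSON stands in for ASN.1 here).
    return json.dumps(asdict(msg)).encode("utf-8")


def decode(raw: bytes) -> BasicSafetyMessage:
    return BasicSafetyMessage(**json.loads(raw.decode("utf-8")))


def brake_warning(msg: BasicSafetyMessage) -> bool:
    # A receiving car flags a hard-braking vehicle ahead so its planner
    # (or driver) can react before the brake lights even register.
    return msg.hard_braking
```

The point of the standard format is that a Cadillac, a Waymo van, and a traffic signal can all parse the same packet without knowing anything else about each other.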
The fact that you can safely control a speeding vehicle with so few inputs speaks to the awesome power of the human brain. But you're still a lousy driver. Human error causes 90 percent of the 1.25 million traffic fatalities recorded worldwide each year. Computers have the potential to be so much better than you that one day they may cut that rate to something approaching zero. But only once they can fully sense and understand the world they're driving through.