Blue roads and glowing signs—how this startup’s tech lets cars see the world – Ars Technica

Posted: Monday, October 09, 2017

In the past couple of years, a number of intersecting trends in the automotive and technology worlds have come to be grouped together under the label "mobility." This is not a reference to an IoT-enabled version of those scooters you see people riding at the grocery store but is instead a catch-all covering electric vehicles, self-driving vehicles, and ride-hailing services—either on their own or packaged together. It's shorthand for a vision of the future where traffic jams and traffic deaths are a thing of the past, as are carbon emissions and maybe even car ownership. Some of that stuff is still decades away from widespread deployment, and a lot of infrastructure—both physical and digital—needs to be built to get us there. One company with a particularly fresh approach to doing that is a startup called Civil Maps.

Remember when a road trip was unthinkable without one of these?

A car needs to be able to do several things in order to be fully autonomous. First, it has to know exactly where it is, where it’s supposed to go, and the route it needs to take. It ought to know its location to within a few centimeters, because no one likes it when you drive on the wrong side of the road or park on a sidewalk. So we need very accurate maps, ones much more precise than the trusty road atlas or the turn-by-turn directions we now get from the likes of Google and Apple. What’s more, the entire road network—which amounts to more than 2.5 million miles of paved roads in the US—can’t just be mapped once or even once a month. The initial base map has to be updated constantly to reflect potholes and road closures and all the other obstacles that a vehicle might encounter.
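To put that centimeter requirement in perspective, here is a quick back-of-the-envelope sketch (not from Civil Maps—just standard great-circle arithmetic) showing how small a slice of a degree of latitude corresponds to the precision an HD map must encode, versus the several meters of error typical of consumer GPS:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84-style coordinates."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# A change of one millionth of a degree of latitude is only about 11 cm --
# roughly the map precision a self-driving car needs, and far below the
# ~3-5 m error of an ordinary phone GPS fix.
step = haversine_m(37.0, -122.0, 37.000001, -122.0)
print(f"{step * 100:.1f} cm")  # prints "11.1 cm"
```

The coordinates here are arbitrary; the point is simply that lane-keeping accuracy lives in the sixth decimal place of a latitude, which is why survey-grade base maps and constant updates matter.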

Next, the car has to be able to perceive its environment. That ability will require each car to carry an array of sensors, the data from which will be fused together. We're more risk-averse about trusting our lives to machines than we are about trusting them to other humans, so plenty of redundancy is warranted. And fusing the input from a mix of sensor types—lidar, radar, optical cameras, and so on—should give the car a better picture of the world around it than we can get from our eyes and ears.
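The statistical payoff of that redundancy can be seen in the simplest form of sensor fusion, inverse-variance weighting: combining independent readings always yields an estimate with lower variance than the best single sensor. This is a generic textbook sketch, not Civil Maps' pipeline, and the sensor readings and variances below are made up for illustration:

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of independent measurements.

    Each estimate is a (value_m, variance_m2) pair; the fused variance
    1 / sum(1/var_i) is smaller than any individual variance, which is
    why adding even a noisy sensor still helps."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total

# Hypothetical range readings to the same obstacle (meters, variance m^2):
lidar = (24.9, 0.01)    # precise
radar = (25.4, 0.25)    # coarser
camera = (24.5, 1.00)   # coarsest
dist, var = fuse([lidar, radar, camera])
print(f"{dist:.2f} m, variance {var:.4f}")  # variance beats lidar's 0.01
```

Real systems use far more sophisticated machinery (Kalman filters and their nonlinear cousins), but the principle is the same: each modality's weaknesses are averaged down by the others' strengths.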
