Google’s got a hot new ride. The company has a new Street View car with updated cameras and—surprisingly—a set of Lidar (Light Detection and Ranging) sensors! Google doesn’t have anything up officially about this, but Wired has the scoop on the new vehicles.
The new cars are all Hyundai Elantra GTs with a sweet paint job (ok, probably a vinyl wrap) featuring—what else?—Street View imagery of beautiful vistas. The cars mostly end up being blue and brown, which is a lot more subtle than the old Google Maps-themed Street View cars and their big green doors. Still, it’s hard to miss the giant camera on top.
The camera system upgrade—the first in eight years—greatly improves the image quality while simplifying the rig. In the main ball, Google is down from 15 cameras to seven, making the whole package a lot smaller. These 20MP cameras are aimed all around the car, and the pictures they take are stitched together into a spherical image for Google Maps. There’s more to the cars than just the ball, though: there’s also a pair of “HD” cameras that face directly left and right. These are dedicated to reading street signs, business names, and even posted store hours; those images are funneled to Google’s cloud computers for visual processing.
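Google hasn’t published how its stitching pipeline works, but the standard way to build a spherical panorama is an equirectangular projection: every viewing direction from the camera rig maps to one pixel of a wide rectangular image. Here’s a minimal sketch of that mapping (the coordinate frame and image size are illustrative assumptions, not Google’s actual pipeline):

```python
import math

def direction_to_equirect(x, y, z, width, height):
    """Map a 3D view direction to pixel coordinates in an equirectangular
    panorama. Assumed frame: x = forward, y = left, z = up."""
    lon = math.atan2(y, x)                 # horizontal angle, -pi..pi
    lat = math.atan2(z, math.hypot(x, y))  # vertical angle, -pi/2..pi/2
    u = (lon + math.pi) / (2 * math.pi) * width
    v = (math.pi / 2 - lat) / math.pi * height
    return u, v

# looking straight ahead lands in the middle of the panorama
print(direction_to_equirect(1, 0, 0, 8192, 4096))  # → (4096.0, 2048.0)
```

Each of the seven cameras covers a wedge of these directions; stitching is largely a matter of deciding, for every output pixel, which camera’s photo to sample (and blending where wedges overlap).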
The end result of the new cameras will be prettier Street View shots, with higher resolution, better colors, and fewer stitching errors. The better images should also result in more data for Google’s various visual feature-detection algorithms.
Wired’s report focuses almost entirely on the new cameras, but I think the most interesting additions are the two Lidar pucks that hang just below the camera ball. These are the ubiquitous Velodyne VLP-16 “Puck” sensors, allowing the car to “see” in 3D in 360 degrees. These $8,000 Lidar sensors are most commonly used in autonomous car prototypes, so to see them on a Street View car is unexpected. Don’t expect the Street View cars to start driving themselves anytime soon—as Google Street View’s Technical Program Manager Steve Silverman says in Wired’s video, the Lidar sensors “are used to position us in the world.”
A Lidar sensor is just a 3D depth sensor. It only “positions you in the world” if you save a 3D map of the world and compare your current 3D measurements against that saved map. So are Google’s Street View cars now creating a massive 3D map of everywhere they drive? It sure seems like it.
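That “compare your measurements to a saved map” step is usually called scan matching, and the classic algorithm is ICP (iterative closest point). We don’t know what Google actually runs, but here’s a toy, translation-only version of the idea: repeatedly pair each scan point with its nearest map point and shift the scan by the average offset until the two clouds line up. The offset that lines them up *is* your position.

```python
import numpy as np

def match_scan(scan, world_map, iters=20):
    """Estimate the sensor's position by aligning a lidar scan to a saved
    3D map (a translation-only ICP sketch; real systems solve for rotation
    too and use spatial indexes instead of brute-force nearest neighbors)."""
    offset = np.zeros(3)
    for _ in range(iters):
        shifted = scan + offset
        # brute-force nearest map point for each scan point
        d = np.linalg.norm(shifted[:, None, :] - world_map[None, :, :], axis=2)
        nearest = world_map[d.argmin(axis=1)]
        offset += (nearest - shifted).mean(axis=0)
    return offset

# toy example: the "scan" is the map as seen from a known offset
rng = np.random.default_rng(0)
world_map = rng.uniform(-10, 10, size=(200, 3))
true_pose = np.array([0.5, -0.3, 0.2])
scan = world_map - true_pose
print(match_scan(scan, world_map))  # recovers the true offset
```

The catch, of course, is that this only works if you already have the 3D map—which is exactly why a fleet of cars driving every road with Lidar pucks is interesting.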
The positioning of the Lidar sensors is worth mentioning too. The pucks come in a few configurations, but at most they have a ±15° vertical field of view. Given the skinny field of view, it’s typical in self-driving cars for sensors to be mounted horizontally, so they can perceive traffic all around the car. In the Street View cars, the pucks are both mounted at about a 45° angle, which means they can’t see very far in front of or behind the car—the FoV barely clears the car body. We can conclude, then, that the Lidar sensors are mostly looking at everything to the left and right of the car.
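A little trigonometry shows what that 45° tilt buys you. The vertical band of wall a ±15° sensor sweeps is set by the tangent of the top and bottom beam angles (the 10-meter distance below is just an illustrative number, not anything from Wired’s report):

```python
import math

def wall_band(tilt_deg, half_fov_deg, dist_m):
    """Vertical extent (in meters) of the lidar sweep on a wall dist_m meters
    to the side, for a sensor tilted tilt_deg above horizontal with a
    vertical field of view of +/- half_fov_deg."""
    lo = math.radians(tilt_deg - half_fov_deg)
    hi = math.radians(tilt_deg + half_fov_deg)
    return dist_m * (math.tan(hi) - math.tan(lo))

# illustrative: a storefront 10 m to the side of the car
print(round(wall_band(0, 15, 10), 1))   # horizontal mount: ~5.4 m band at sensor height
print(round(wall_band(45, 15, 10), 1))  # 45° tilt: ~11.5 m band, higher up the wall
```

At the same distance, the tilted mount sweeps roughly twice the vertical extent, and the band sits well above the sensor—right where building facades are.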
The normal horizontal mounting would only get a sliver of buildings when you drive by, but with this 45° angle, Google should be getting more vertical height on the buildings—probably enough for a full top-to-bottom building scan—and just more detail in general. It’s still a small sliver of data, but that works fine since the car is moving forward. The new cars, then, are like giant 3D flatbed scanners, blasting a wave of Lidar across every building they pass. And remember: there are two of them, so Google captures twice the resolution.
The Google Maps team is one of the most data-hungry groups in the world, and now they’re driving a big, mobile 3D scanner up and down the roads. What they do with it is anyone’s guess. They could use the data to build better 3D building models for Google Maps. They could provide the data to self-driving car companies, which often need Lidar scans of areas in order to autonomously navigate them. In fact, Google’s setup looks identical to Here’s “True Collection Vehicle,” which is purpose-built for creating self-driving car maps. Here’s vehicle has four cameras on top and a Lidar sensor mounted at a 45° angle!
Google is certainly making a much more precise map than GPS could make alone. GPS is accurate to a few meters, while a Velodyne sensor can position you within a few centimeters. If all you want to do is locate your Street View pictures on a map though, it sure seems like overkill.