How To Make Autonomous Cars See Better – Forbes
Autonomous cars need to see everything all the time. They need to understand driving conditions in any kind of weather and in every possible scenario, from country roads to city streets.
To do this, autonomous cars use sensors as their eyes on the road. The current leading method is lidar (light detection and ranging), which works much like radar but with light. A sensor sends short pulses of invisible laser light and times how long the reflection takes to return, which tells the car how far away each object or person is. Google and Ford are using lidar. The other method is a camera-based model built around vision, much like a human driver's. It uses multiple cameras around the car to capture the scene, and software trained on human driving behavior reconstructs a color 3D world from the 2D images.
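The time-of-flight idea behind lidar reduces to a simple formula: distance equals the speed of light times the round-trip time, divided by two. A minimal sketch (the function name and the 200-nanosecond example pulse are illustrative, not from any vendor's API):

```python
# Hypothetical illustration of the lidar time-of-flight principle:
# distance = (speed of light x round-trip time) / 2. The division by
# two accounts for the pulse traveling out and back.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def lidar_distance_m(round_trip_time_s: float) -> float:
    """Distance to a reflecting object, from the pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2

# A pulse that returns after 200 nanoseconds reflects off something
# roughly 30 meters away.
print(round(lidar_distance_m(200e-9), 1))  # → 30.0
```

The nanosecond timescale is why lidar hardware needs very precise timing electronics: a 7-nanosecond error already shifts the measured distance by about a meter.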
AImotive, based in Budapest, raised $10.5 million in funding in 2016 for its artificial intelligence engine for advanced driver assistance systems (ADAS), a camera-based solution. It takes data from all of the car's sensors, segments objects, and estimates the car's distance from objects and people using neural networks. The goal is to make autonomous driving behave more like a human driver.
And now a startup in Israel, AdaSky, has jumped into the autonomous car vision arena with a far infrared (FIR) perception solution called Viper. AdaSky isn't the first company to make FIR and thermal sensors, but AdaSky CEO Avi Katz believes it is the first to apply them to autonomous vehicles.
According to Katz, FIR technology has been used for decades in other industries, which he says makes it a mature and proven concept. The company says it is leveraging that experience to offer a solution for autonomous vehicles that complements the sensing technology already being adopted by OEMs, such as lidar, radar, and cameras, adding another layer of vision and intelligence. The claim is that better classification, identification, and detection of objects and vehicle surroundings, at both near and far range, will make autonomous cars safer.
The company says autonomous cars equipped with FIR thermal cameras can understand the road and their surroundings in any conditions. Viper comprises a thermal camera and machine vision algorithms that can be added to any autonomous vehicle, which should theoretically allow the car to see better and analyze its surroundings. The solution passively collects FIR signals by detecting thermal energy, such as body heat, radiating from objects and people. Algorithms process the signals received by the camera to provide object detection and scene analysis. This gives the vehicle the ability to detect pedestrians more precisely, to within a few meters, leaving more distance in which to react while driving.
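The key difference from lidar is that FIR is passive: nothing is emitted, the sensor only reads incoming thermal radiation. A toy sketch of that idea, assuming a frame is simply a grid of apparent temperatures and that pixels near human body heat flag candidate pedestrians (the threshold and values here are illustrative and not AdaSky's actual algorithm):

```python
# Toy sketch of passive FIR sensing: treat a thermal frame as a grid of
# apparent temperatures (degrees C) and flag pixels warm enough to
# suggest a human thermal signature. Real systems run far more
# sophisticated machine vision on this data; this only shows why body
# heat stands out against a cold scene with no emitted signal at all.
from typing import List, Tuple

BODY_HEAT_MIN_C = 30.0  # assumed lower bound for a human signature

def warm_pixels(frame: List[List[float]]) -> List[Tuple[int, int]]:
    """Return (row, col) coordinates whose reading suggests body heat."""
    return [
        (r, c)
        for r, row in enumerate(frame)
        for c, temp in enumerate(row)
        if temp >= BODY_HEAT_MIN_C
    ]

# Cold night scene (5 C background) with one warm region in the middle.
frame = [
    [5.0, 5.0, 5.0, 5.0],
    [5.0, 33.0, 34.0, 5.0],
    [5.0, 32.0, 33.0, 5.0],
]
print(warm_pixels(frame))  # → [(1, 1), (1, 2), (2, 1), (2, 2)]
```

Because the contrast comes from the pedestrian's own heat, the same warm region shows up in darkness, fog, or glare, which is the advantage AdaSky is pitching over visible-light cameras.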
Katz says they founded the company with the vision of advancing the autonomous vehicle market by using a perception solution to increase the safety and performance of the self-driving car.
“To date, autonomous vehicles are using LIDAR, radar and visual cameras for sensing and FIR is a relatively new concept for this market,” said Katz. “LIDAR is absolutely a necessary sensor for autonomous vehicles to have. We complement LIDAR sensors as well as standard cameras and radars to improve perception ability of the vehicle. Each of the sensors above has its own unique contribution to the perception capabilities of the ADAS system and the Autonomous Vehicle, and eventually, it will all be used together and the outputs will be fused together to enable ultimate 24/7 driving.”