Ford remains wary of Tesla-like autonomous driving features – Computerworld
At a time when Tesla has already rolled out advanced autonomous driving features in its cars, Ford is proceeding more cautiously because it believes the industry is not ready to hand over such features to consumers.
On Tuesday, Ford announced that by 2021 it plans to offer a fully self-driving (autonomous) vehicle for multi-passenger shuttles and ride-hailing services such as Uber. The vehicle will be manufactured with no steering wheel, no gas or brake pedal. In other words, no driver necessary.
Well before 2021, Ford will be testing prototypes of those fully autonomous vehicles on U.S. roadways, according to Randy Visintainer, director of Autonomous Vehicles at Ford. The automaker hopes a fleet of self-driving shuttles and ride-hailing vehicles can tap into a market that includes the elderly, infirm or young who are unable to drive.
Ford isn’t alone. Volvo Cars just penned a deal with Uber, the world’s leading ride-sharing company, to develop a generation of autonomous driving cars. Uber also announced its first fleet of semi-autonomous Volvo XC90 SUVs will hit the streets of Pittsburgh this year.
Volvo XC90 SUVs will be outfitted with dozens of sensors that use cameras, lasers, radar and GPS receivers, according to a report by Bloomberg. A “handful of vehicles” have so far been delivered to Uber, with 100 expected by the end of the year. Earlier this year, Uber and Volvo signed a $300 million deal to develop a road-ready, fully autonomous car by 2021.
In announcing the autonomous fleet plans, Ford executives declined to talk about possible ride-sharing partnerships. They also made it clear that fully autonomous technology would not be coming to consumer vehicles for at least “several” years after 2021 because it doesn’t make “economic sense.”
Advanced self-driving features too advanced for consumers
The problem with offering near-fully autonomous driving features to consumers is there’s currently no way to ensure drivers will be engaged enough to retake control of a vehicle when its advanced driver assistance system (ADAS) gets in a jam, Visintainer said.
“The concern is if you needed to get them back into the loop quickly for some reason, how can you be sure they’re ready to be brought back into the loop,” Visintainer said.
An ADAS that drives the car, then hands control back when it gets in a bind, is virtually as complicated to roll out as a fully autonomous system, he added.
“So when you look at the expense of putting all that in, is that something someone would be willing to buy for personal use? It’s questionable… other than in a high-end vehicle,” Visintainer said. “So for us, the economics make the most sense in a ride hailing, ride sharing mobility suite.”
The Society of Automotive Engineers (SAE) International, a U.S.-based industry standards organization, has established six autonomous driving categories: Level 0 represents no automation, while Level 5 is a fully autonomous vehicle.
It is SAE Level 3 that has become a sticky wicket for the auto industry, including Ford. Labeled by SAE as “conditional automation,” Level 3 allows the system to handle all aspects of dynamic driving, such as automated turning, lane keeping and adaptive cruise control, but with the expectation that the driver will retake control of the vehicle if prompted to do so by the ADAS.
The problem for Tesla has been that while its Autopilot ADAS offered some Level 3 automation, there was no way to force a driver to retake control of the vehicle; that has resulted in several documented accidents — one of them fatal.
The problem, in several high-profile examples, hasn’t necessarily been that Tesla’s Autopilot ADAS isn’t performing as promised, but that drivers place too much confidence in it and take their hands off the steering wheel and their attention from the road.
Tesla has repeatedly stated in blog posts that drivers “must maintain control and responsibility” for their vehicle while using Autopilot. Still, videos posted by Tesla owners ignoring that policy are readily available on the Internet.
Consumer Reports, whose past reviews showered Tesla’s all-electric vehicles with praise, has called on the carmaker to disable its semi-autonomous driving system in light of the accidents and rethink the technology. Tesla has so far declined.
In June, Gil Pratt, CEO of the Toyota Research Institute, told reporters and analysts that the company will focus on ADAS enabled by machine learning, allowing it to improve its own abilities over time.
Ford is taking a similar tack, Visintainer said. Its autonomous vehicles will use both a “mediated perception” of their surroundings through high-definition maps and “direct perception” through cameras and machine learning algorithms that can identify stationary or moving objects around them.
The company has announced four technology partnerships to enable its self-driving technology, including investing $75 million in Velodyne, a Silicon Valley-based leader in light detection and ranging (LiDAR) sensors. The sensors are able to paint a 3D image of the surroundings that can create a map for the car’s ADAS, as well as continually update that map over time.
Ford also signed an exclusive licensing agreement with Nirenberg Neuroscience, a machine vision company founded by neuroscientist Dr. Sheila Nirenberg. The partnership will offer Ford machine vision for its autonomous vehicle virtual driver system.
Unlike LiDAR, which bounces a laser off objects to determine distances much as sonar uses sound, Nirenberg Neuroscience’s technology perceives objects in the same way as a human eye — by detecting the natural light that reflects off them. Nirenberg’s technology then uses the same neural code the eye uses to transmit visual information to the brain, but transmits it to a vehicle’s virtual driving system.
Nirenberg’s machine vision platform can be used for navigation, object recognition, facial recognition and other functions, the company said.
Nirenberg’s technology is also being used by Dr. Nirenberg to develop a device for restoring sight to patients with degenerative diseases of the retina.
The technology Ford is gaining through its new partnerships will not only be used for its future fully autonomous fleet, but also to improve ADAS in personally owned vehicles.
Consumer vehicles will continue to advance
Ford will focus on improving SAE levels 1 and 2 in its current fleet, Visintainer said.
“There’s some opportunity to share sensors and share learning across the two chains. We have an equal focus on both fronts,” he said.
For example, parallel parking assist (which Ford introduced in 2009) currently requires a driver to control both shifting and braking as the car parks itself. In the future, Ford’s ADAS will take over all parallel parking functions.
Another feature that will continue to evolve is the Ford Pro Trailer Backup Assist, which was introduced this year in F-150 models. The Backup Assist works by letting a driver steer the trailer with a control knob while the truck steers its wheels and limits vehicle speed.
Ford also plans to roll out Traffic Jam Assist, which will enable cars to automatically keep pace with traffic flow by combining sensors from its active park assist, automatic lane-keeping and adaptive cruise control technologies.
“It still requires the driver to be engaged, but it helps take the stress away,” Visintainer said.