The machines are taking over. Like it or not, self-driving cars are on their way to our roads. But they face their fair share of speed bumps along the way. Any new technology has problems to solve before we can all let go of the wheel, and one of them is getting the robots to handle bad weather.
Automaker Ford and tech giant Google are both working to overcome some of the current limitations of self-driving cars in snow, rain, and other severe weather. Like human drivers, autonomous vehicles sometimes have trouble “seeing” in low-visibility situations and adapting quickly to a loss of traction. To date, many of the self-driving cars being tested have struggled in this area: their sensors can be blocked by snow, ice, or torrential downpours, and their ability to “read” road signs and lane markings is impaired when those are covered by snow.
At the 2016 Detroit Auto Show, Ford announced that it is working with the University of Michigan to develop a solution based on high-resolution 3-D digital maps that include data about road markings, signs, geography, topography, and landmarks. With this more detailed information, Ford says its autonomous Fusion Hybrid sedans are able to navigate effectively in poor weather conditions. Ford’s high-fidelity 3-D maps include the exact positions of curbs and lane lines, trees and signs, along with local speed limits and other relevant rules. The more a car knows about an area, the more it can focus its sensors and computing power on detecting temporary obstacles—like people and other vehicles—in real time.
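To make the idea concrete, here is a toy sketch of what one tile of such a prior map might hold, and how prior (static) features let the car treat everything else its sensors see as a potential moving obstacle. The schema and field names are invented for illustration; Ford has not published its actual map format.

```python
from dataclasses import dataclass

# Hypothetical schema for one tile of a high-resolution prior map.
# All coordinates are (x, y) points in meters in a shared map frame.
@dataclass
class MapTile:
    lane_lines: list       # polylines of surveyed lane-line points
    curbs: list            # polylines of curb edges
    signs: list            # (x, y, sign_type) for every surveyed sign
    speed_limit_kph: float # local speed limit for this stretch of road

def static_feature_count(tile: MapTile) -> int:
    """Count the features already known from the map; anything the sensors
    detect beyond these can be treated as a possible dynamic obstacle."""
    return len(tile.lane_lines) + len(tile.curbs) + len(tile.signs)

tile = MapTile(
    lane_lines=[[(0.0, 0.0), (50.0, 0.0)]],
    curbs=[[(0.0, 3.5), (50.0, 3.5)]],
    signs=[(45.0, 4.0, "stop")],
    speed_limit_kph=50.0,
)
print(static_feature_count(tile))  # 3
```

The design point is simply that the static world is stored once, offline, so the car’s real-time perception budget goes to the things that move.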
“It’s one thing for a car to drive itself in perfect weather. It’s quite another to do the same thing when its sensors cannot sense the road through snow, or when visibility is limited by falling precipitation,” explained Jim McBride, Ford’s technical leader for autonomous vehicles, in a statement. “In Ford’s home state of Michigan, we know weather isn’t always perfect. That’s why we’re conducting testing — for the roughly 70 percent of U.S. residents who live in snowy regions.”
Those maps have another advantage: The car can use them to figure out, within a centimeter, where it is at any given moment. Say the car can’t see the lane lines, but it can see a nearby stop sign, which is on the map. Its LIDAR scanner tells it exactly how far it is from the sign. Then, it’s a quick jump to knowing how far it is from the lane lines. This doesn’t mean all the problems with autonomous driving in bad weather are solved. Falling rain and snow can interfere with LIDAR and cameras, and driving safely requires more than knowing where you are on a map—you also need to be able to see those temporary obstacles. You know, like other people. But still, it’s nice to see one more challenge resolved as we move closer to the day when “driving” is something you save for the golf course.
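The deduction described above can be sketched in a few lines. This is a simplified 2-D illustration, not Ford’s actual algorithm: the landmark coordinates are invented, the car’s heading is assumed known (say, from an inertial unit), and a real system would fuse many landmarks, not one.

```python
import math

# Hypothetical map data, in meters, in a shared map frame.
STOP_SIGN = (105.0, 42.0)  # surveyed position of a stop sign
LANE_LINE_X = 100.0        # a lane line running north-south at x = 100 m

def locate_car(range_m, bearing_rad, heading_rad):
    """Infer the car's map position from a LIDAR return off a known landmark.

    range_m:     measured distance to the stop sign
    bearing_rad: angle to the sign relative to the car's heading
    heading_rad: car's heading in the map frame (assumed known)
    """
    # Direction from car to sign, expressed in the map frame.
    angle = heading_rad + bearing_rad
    car_x = STOP_SIGN[0] - range_m * math.cos(angle)
    car_y = STOP_SIGN[1] - range_m * math.sin(angle)
    return car_x, car_y

def lateral_offset(car_x):
    """Distance from the car to the (possibly snow-covered) lane line."""
    return abs(car_x - LANE_LINE_X)

# Example: the sign is 10 m away, dead ahead, with the car heading due
# east (0 radians) in the map frame.
x, y = locate_car(10.0, 0.0, 0.0)
print(round(x, 2), round(y, 2))     # 95.0 42.0
print(round(lateral_offset(x), 2))  # 5.0
```

Once the car knows its own map coordinates, the position of every mapped lane line follows for free—even when the lines themselves are invisible under snow.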
Ford says it tested this ability in real snow last month at Mcity, the University of Michigan’s simulated town built for testing self-driving vehicles. The idea of self-locating by deduction may not be unique to Ford, but the automaker is the first to publicly demonstrate that it can use its maps to navigate on snow-covered roads.
Meanwhile, amid a multi-year drought in California, Google’s self-driving vehicles have now been confronted with a significant amount of rain around their Mountain View test area.
As Google says in its latest Self-Driving Car Monthly Report: “Driving in rain makes many human drivers nervous due to reduced visibility, and some of our sensors — particularly the cameras and lasers — have to deal with similar issues.”
The various types of sensors and cameras may not actually get nervous, but some do experience reduced effectiveness when they become blocked by heavy rain, fog and clouds of exhaust from other vehicles.
To help address the problem, Google has equipped the sensor domes on top of its self-driving cars — Lexus RX 450h SUVs and its own prototypes — with little windshield wipers. These, says the tech company, “ensure our sensors have the best view possible.”
The autonomous cars now in development use a variety of sensors to read the world around them. Radar and LIDAR do most of the work looking for other cars, pedestrians, and other obstacles, while cameras typically read street signs and lane markers. However, an even bigger problem comes with winter: if snow covers a sign or lane marker, there’s no way for the car to see it.