
Winter Road Conditions Are The Next Hurdle For Driverless Cars

All it takes is an incredibly detailed map of every street.

Video: courtesy of Ford

Self-driving cars are great, until the weather kicks up. Robot drivers might be better than humans on a sunny day, but their robotic brains still lag far behind ours when it comes to precipitation. The problem isn't the slippery surface; traction control, a long-established technology, does a decent job with that. The problem is visual. With snow blanketing the road, covering lane markings, and sticking to street signs, the car has trouble orienting itself. Its camera eyes just won't work.

Ford's solution is to do what a human does on an oft-traveled route: rely on a mixture of memory and guesswork. And because a computer has a far better memory than a human, the car can lean on that memory much harder.

Its test cars rely on super-accurate maps: 3-D models of the roads that record everything from the positions of curbs to lane markers to the locations and contents of traffic signs. By matching these maps against what the car's own sensors (LiDAR and cameras) can see, the computer brain pieces together the layout of the road. Say the road is covered with snow, but the car can still spot a road sign with its cameras. No problem. By comparing the landmarks it sees against the map, the car can deduce its position down to the half inch, so it knows exactly where the lanes should be. The result is a car that's better in the snow than you are.
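To make the idea concrete, here is a minimal sketch of landmark-based localization. It is not Ford's actual algorithm, and every name and coordinate in it is a made-up illustration: the car estimates its own position by comparing where its sensors see known landmarks (relative to itself) with where the prior map says those landmarks sit in absolute coordinates.

```python
# Hypothetical sketch of map-based localization, not Ford's real code.
from statistics import mean

# Prior map: absolute (x, y) positions of landmarks, in meters.
LANDMARK_MAP = {
    "stop_sign_41": (120.0, 45.5),
    "curb_corner_7": (118.2, 52.1),
    "signpost_9": (131.7, 48.3),
}

def localize(observations):
    """Estimate the car's absolute position.

    observations: landmark id -> (dx, dy), the landmark's position
    relative to the car as measured by LiDAR/cameras. For each
    landmark, car_position = map_position - relative_offset;
    averaging over several landmarks smooths out sensor noise.
    """
    xs, ys = [], []
    for lm_id, (dx, dy) in observations.items():
        mx, my = LANDMARK_MAP[lm_id]
        xs.append(mx - dx)
        ys.append(my - dy)
    return mean(xs), mean(ys)

# Even with lane markings invisible under snow, a couple of visible
# landmarks pin down the car's position; lane geometry is then read
# straight from the map rather than from the road surface.
position = localize({
    "stop_sign_41": (12.1, -3.4),
    "signpost_9": (23.8, -0.6),
})
print(position)  # ~(107.9, 48.9)
```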

Snow isn't the only kind of weather that messes with autonomous cars. Rain can also screw things up. In a video from Hyundai's self-driving car competition last year, the car has to be stopped by humans several times: it drives the wrong way down a one-way street, hits a barrier while parking, and fails to detect pedestrians.

At least the rain-activated windshield wipers keep working.

The problem with rain is the same as with snow: visibility, although for different reasons. The LiDAR that driverless cars rely on needs a clear line of sight. It bounces laser pulses off the surroundings and measures the light that comes back, like a radar built from light. Rain and fog scatter those pulses, effectively blinding the car. Once you know what's going on, it's surprising how good a job the Hyundai test car did, despite the weather.
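The principle is simple enough to show in a few lines. The sketch below illustrates the time-of-flight idea behind LiDAR ranging, plus a crude, purely illustrative model of why rain degrades it; the attenuation numbers are assumptions, not measurements from any real sensor.

```python
# Toy illustration of LiDAR ranging; numbers are illustrative only.
C = 299_792_458.0  # speed of light, m/s

def distance_from_echo(round_trip_seconds):
    """Range to the target: the pulse travels out and back,
    so the one-way distance is half the round trip."""
    return C * round_trip_seconds / 2

# A pulse that returns after ~133 nanoseconds hit something ~20 m away.
print(distance_from_echo(133e-9))  # ~19.9 m

# Rain and fog scatter the beam, so the echo comes back weaker. A
# sensor that rejects returns below its noise floor effectively goes
# blind as attenuation grows.
def echo_detected(signal_strength, attenuation, noise_floor=0.2):
    return signal_strength * (1 - attenuation) > noise_floor

print(echo_detected(1.0, attenuation=0.1))  # clear day: True
print(echo_detected(1.0, attenuation=0.9))  # heavy rain/fog: False
```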

Ford's mapping system would help in any low-visibility conditions, but it doesn't help with the most important part of driverless-car control: detecting pedestrians and other vehicles. Radar, which is far less bothered by rain and fog, helps with that, and it's why an autonomous car carries so many sensors.
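One common way to exploit that redundancy is to fuse detections rather than trust any single sensor. The sketch below shows the idea under stated assumptions; the reliability weights and threshold are invented for illustration and don't describe any production system.

```python
# Hypothetical sensor-fusion sketch: weight each sensor's detection
# by how trustworthy it is in the current weather.
SENSOR_RELIABILITY = {
    # condition -> {sensor: weight}; all values are made up.
    "clear": {"camera": 0.9, "lidar": 0.9, "radar": 0.8},
    "rain":  {"camera": 0.5, "lidar": 0.4, "radar": 0.8},
    "snow":  {"camera": 0.3, "lidar": 0.3, "radar": 0.7},
}

def pedestrian_confidence(detections, condition):
    """Weighted vote over per-sensor detection scores (0.0 to 1.0)."""
    weights = SENSOR_RELIABILITY[condition]
    total = sum(weights.values())
    return sum(weights[s] * score for s, score in detections.items()) / total

# In heavy rain the camera and LiDAR scores drop, but radar still
# carries the detection past a decision threshold.
conf = pedestrian_confidence(
    {"camera": 0.2, "lidar": 0.3, "radar": 0.9}, condition="rain")
print(conf > 0.4)  # True: brake for the pedestrian
```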

Weather isn't the only thing that can catch a driverless car out. Potholes go undetected unless they're marked with traffic cones, a cop waving down a car looks just like any other pedestrian by the side of the road, and a traffic light with the sun directly behind it can stop the car from reading the color. That's not to say human drivers don't suffer from similar limitations.

Ironing out these problems, and mapping the roads down to the last centimeter (as Google and Ford are doing), are the keys to making cars that can tackle the same conditions human drivers handle today. Happily, intensive mapping pays off most in urban areas, which is exactly where we'll see the biggest benefit from ditching obsolete human drivers.
