Any fully self-driving, or autonomous, car of the future must be more human than any car before it, not less. With the need to have its own eyes, ears, Spidey-sense, and a brain to take your place as the driver, the fully autonomous car of your future will be the automotive you, just without the hair, skin, craving for chocolate, or pollen allergies.
Let’s first understand that fully self-driving cars must do everything we humans do without even thinking about it. That means relying on the same senses you have, particularly vision and the ability to react to input from vision. They must perceive, process, and execute based on incoming data.
We must also recognize that it is one thing for a fully self-driving car to get you from New York to Boston on a clear, sunny day in June, but quite another to do the same thing on a snowy day in January.
Weather is just one of many significant hurdles on the path to full autonomy, and engineers will need to solve for it before putting truly autonomous vehicles on the road in significant numbers.
Seeing Through Weather, Both Airborne and Built Up on Your Car
Rain, snow, and ice change how a self-driving vehicle perceives the street it’s driving on. Cameras cannot see through fog or heavy snow. Lidar units measure distance with small laser beams that bounce off objects and return to the unit, much like a bat’s echolocation, but those beams reflect off whatever they hit first. When the weather is nasty, that’s snow, sleet, freezing rain, and sometimes very heavy rain.
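The math behind lidar ranging is simple time-of-flight: half the pulse’s round-trip travel time multiplied by the speed of light. A minimal sketch (the numbers are illustrative, not any vendor’s specs) also shows why an early return from a snowflake reads as a nearby obstacle:

```python
# Time-of-flight ranging: a lidar pulse travels out to a target and back,
# so distance = (speed of light * round-trip time) / 2.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def lidar_distance_m(round_trip_s: float) -> float:
    """Distance to whatever reflected the pulse first, in meters."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

# A car about 45 m ahead returns the pulse in roughly 300 nanoseconds...
car_return = lidar_distance_m(300e-9)   # ~45 m
# ...but a snowflake 1.5 m from the sensor returns it in about 10 ns,
# and the unit reports the snowflake, not the road beyond it.
snow_return = lidar_distance_m(10e-9)   # ~1.5 m
```

The unit has no way to know the first return came from falling snow rather than a solid object; that disambiguation has to happen downstream in software.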
But those are all off-board problems: interruptions that keep those systems from seeing into the surroundings even when their beams and cameras are operating.
There’s another, more basic consideration, and one closer to you as the inactive driver being autonomously driven: the build-up of snow, ice, mud, and crud on cameras and sensors that prevents them from working. In order to see, you must first open your eyelids. Caked-on ice becomes an autonomous car camera’s inoperative eyelid.
Meanwhile, GPS (the Global Positioning System, as used for navigation in human-driven cars) cannot serve as a backup for a self-driving car’s obscured vision: it does not provide the accuracy or resolution needed to act as stand-alone navigation for self-driven cars – as borne out by collisions involving various “Bird Box” imitators. GPS also cannot distinguish between stationary objects and those in motion.
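The accuracy gap is easy to quantify. Consumer-grade GPS is typically accurate to within a few meters under open sky, while a standard US highway lane is about 3.7 meters (12 feet) wide. A back-of-the-envelope sketch, using rough representative figures rather than measured data:

```python
# Why GPS alone can't keep a car in its lane: typical consumer GPS error
# is on the order of a few meters (roughly 5 m is a common working figure),
# while a standard US interstate lane is about 3.7 m (12 ft) wide.
GPS_ERROR_M = 5.0
LANE_WIDTH_M = 3.7

# If the position error exceeds half a lane width, the car cannot even be
# sure which lane it occupies, let alone where the lane edges are.
cannot_resolve_lane = GPS_ERROR_M > LANE_WIDTH_M / 2
print(cannot_resolve_lane)  # True
```

Lane-level driving needs centimeter-scale localization, which is why autonomous prototypes lean on lidar, cameras, and high-definition maps rather than GPS alone.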
Some companies are working on more sophisticated ground-scanning radar systems that penetrate the weather and better detect other cars and their positions. However, even if this radar helps in bad weather, it adds more complexity to an already busy in-car network. Plus, it will require a continually updated, pre-existing map of the surroundings, and road construction and real-world detours could require software updates or simply give it fits.
Yet another wrinkle in this overall picture is the weather-obscured road sign. Some cars today have traffic sign recognition capabilities to help inform the onboard driver-assistance systems, and this capability will transfer to fully self-driving cars.
But if snow, ice, or slush covers important road signs to the degree that they’re illegible (or they’re simply missing in action), that will be another information stream removed from the autonomous car’s decision-making process. This is not an issue for today’s cars that have limited Level 2 or even Level 3 self-steering and self-braking capability, but it could be for fully self-driving cars.
Lastly, how a self-driving car copes with poor weather might be influenced by region. There will be a level of interaction between self-driving cars and the infrastructure surrounding them, like traffic alerts and emergency warning systems. Locations benefiting from this kind of information infrastructure will naturally be a friendlier environment for autonomous vehicles, but without major investments in urban, suburban, and rural areas alike, performance will prove uneven.
One bit of good news here: Most, if not all, carmakers have already moved the optical equipment used on cars with driver-assistance systems (like lane-keeping assist and adaptive cruise control) behind the upper windshield. This lets those cameras and sensors take advantage of the wipers, washers, and defrosters to solve that visibility issue.
A Self-Driving Scenario That Must Be Solved
Let’s say it’s the year 2027 and you’ve programmed your fully self-driving car to take you from Sacramento all the way to Portland, Oregon, in December – a 580-mile drive, perfect for reading a book or watching a movie on your iPad while your car does the driving. You could start off in sunny, dry, 65-degree Sacramento weather, yet slowly ascend into a blizzard blasting the Siskiyou Summit just over the border in Oregon, at 4,310 feet.
In this scenario, the car might observe all speed limits. But when the weather changes, either sophisticated roadside technology must inform it, or, if that doesn’t exist or your car cannot receive it, it must recognize the changing conditions for itself and compensate in terms of how it’s driving.
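The decision the car faces can be sketched as a simple policy: start from the posted limit and derate it for the worst condition the sensors (or roadside alerts, where they exist) report. This is an illustrative toy, not any automaker’s actual control logic; the condition names and derate factors are invented:

```python
# Toy speed policy: derate the posted limit by the most severe observed
# condition. Condition names and factors are hypothetical illustrations.
DERATE = {
    "clear": 1.0,
    "rain": 0.85,
    "heavy_rain": 0.70,
    "snow": 0.55,
    "blizzard": 0.0,  # past some threshold, the only safe speed is "pull over"
}

def target_speed_mph(posted_limit: float, conditions: list[str]) -> float:
    """Pick the most conservative derate among all reported conditions."""
    if not conditions:
        return posted_limit
    factor = min(DERATE.get(c, 1.0) for c in conditions)
    return posted_limit * factor

print(target_speed_mph(65, ["clear"]))         # 65.0
print(target_speed_mph(65, ["rain", "snow"]))  # 35.75
```

The hard part, of course, is not this arithmetic but reliably classifying the conditions in the first place, which is exactly what degraded cameras and lidar make difficult.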
Some automakers have already stated that in the worst weather cases, fully self-driving cars will be programmed to pull off the road until the poor weather subsides. But as any person who has driven in a worst-weather situation knows, pulling over might not be possible. And the last thing you’d want is for your autonomous car to simply quit working, stranding you in your lane in a blizzard. At Siskiyou Summit.
Imagine the uproar and outrage if thousands of passengers in autonomous vehicles were stranded in the middle of nowhere during a raging snowstorm, shivering from the cold, the cars getting covered with drifts, the only option apart from waiting for rescue to exit the vehicle and risk exposure to Mother Nature.
When you think about self-driving cars, you might envision a utopian future free of the task of driving. But before that can ever happen, automakers need to solve for bad weather. That could take a while, and it won’t be inexpensive.