Do Pedestrian-Assist Systems Discriminate?


If you’re 60 or older, you might remember eagerly opening a new box of Crayolas, sniffing their waxy goodness, and reaching for the “Flesh” crayon to draw a self-portrait, only to be disappointed when its pale peach color looked nothing like your own skin. Crayola renamed its allegedly flesh-colored crayon Peach in 1962, but that was just one of many awakenings to a pervasive problem.

At the risk of dating myself, there are pictures of me from the 1980s with big hair and a big grin radiating across a chalky face, thanks to the limited selection of foundation colors that never quite matched my skin tone. Today, I can find foundation in every shade under the sun. Baby steps, society, baby steps.

Now, as the world gravitates toward a self-driving future like Woody, Buzz, and the gang on a conveyor belt of doom, the companies that create artificial intelligence (AI) software for autonomous vehicles aren’t, apparently, as intelligent as the makeup industry.

According to a new study out of the Georgia Institute of Technology titled “Predictive Inequity in Object Detection,” driver assistance and collision avoidance systems may be more accurate at detecting pedestrians with lighter skin tones than those with darker ones. The study found that detection was 5% less precise for test groups composed of subjects with darker skin, even after accounting for variables such as time of day and weather conditions.

That means that if you’re relying on your car’s automatic emergency braking system to stop the car when a pedestrian steps out onto the road, there’s a measurably greater chance that it will fail to see a dark-skinned walker than a light-skinned one. Given that more and more cars come with active safety systems like automatic emergency braking, and that drivers increasingly count on them to prevent a tragedy like colliding with a pedestrian, this is an alarming finding.

Now imagine if you’re fully reliant on these systems during your commute.

And you don’t have any control over the vehicle.

But First, Read the Fine Print

Pedestrian detection system: No, self-driving cars aren’t racist, but the technology that will power them still needs a lot of work to perfect. (Flir)

Obviously, our utopian future of even more screen time (like we need that), in which we hand the task of commuting over to AI, is a long way off.

And it’s important to keep in mind that Georgia Tech’s findings are not based on the actual autonomous vehicles currently undergoing testing by automakers. That’s because the companies working on these systems are notoriously guarded about test results from their autonomous vehicle programs, and are stingy about sharing any data with competitors or the public. Not a great way to engender trust, but I suppose they have to protect their trade secrets.

Rather, the study is based on existing object-detection models of the kind that underpin semi-autonomous driving technologies, along with datasets available to the public. The researchers used the Fitzpatrick scale, which categorizes people according to skin tone, and measured how precisely the software detects pedestrians in each group.
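To give a rough sense of what that kind of per-group measurement looks like, here is a minimal sketch in Python. It is not the researchers’ actual code; the detector output, the annotations, and the “LS”/“DS” group labels are hypothetical stand-ins for lighter- and darker-skin categories on the Fitzpatrick scale.

```python
# Minimal sketch: measure how often a detector finds pedestrians, split by
# skin-tone group. The data below is invented purely for illustration.
from collections import defaultdict

# Each ground-truth pedestrian carries a Fitzpatrick-style group label:
# "LS" (lighter skin) or "DS" (darker skin).
ground_truth = [
    {"id": 1, "group": "LS"},
    {"id": 2, "group": "LS"},
    {"id": 3, "group": "DS"},
    {"id": 4, "group": "DS"},
]

# IDs of the pedestrians some hypothetical detector actually found.
detected_ids = {1, 2, 3}

def detection_rate_by_group(annotations, detections):
    """Fraction of annotated pedestrians the detector found, per group."""
    found = defaultdict(int)
    total = defaultdict(int)
    for person in annotations:
        total[person["group"]] += 1
        if person["id"] in detections:
            found[person["group"]] += 1
    return {group: found[group] / total[group] for group in total}

print(detection_rate_by_group(ground_truth, detected_ids))
# e.g. {'LS': 1.0, 'DS': 0.5}
```

A gap between those two rates is, in essence, the kind of disparity the paper calls predictive inequity.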

Now, you may think that machines would not be subject to bias. Remember, though, that the people programming the software may harbor some prejudice, or may simply fail to take the full range of human diversity into account. As with any artificial intelligence, these programs “learn” from sample sets, and the engineers who assemble those sets may not have thought to include a wide range of skin tones.
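As a small illustration of why the makeup of a sample set matters, here is another Python sketch. The “LS”/“DS” labels and the 80/20 split are invented for this example, not taken from any real training set.

```python
# Hypothetical audit of a training set's skin-tone composition.
# "LS" and "DS" stand in for lighter- and darker-skin categories; the
# 80/20 split below is made up purely to illustrate imbalance.
from collections import Counter

training_labels = ["LS"] * 8 + ["DS"] * 2  # labels for 10 training images

counts = Counter(training_labels)
total = sum(counts.values())
for group, n in counts.items():
    print(f"{group}: {n} examples ({n / total:.0%} of the training set)")

# A detector trained on a set like this simply sees far fewer darker-skinned
# pedestrians, so it has less to "learn" from for that group.
```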

What’s important at this point is for car companies to discover flaws in their systems, and try to minimize them as if lives depended on it. Because they do.

When it comes to automotive safety, there’s more at stake than an embarrassing picture from the past. And while no technology is perfect, findings like the ones Georgia Tech claims to have made only sow seeds of mistrust among consumers, jeopardizing wide-scale acceptance of new, useful safety features.

For now, everyone needs to know that we are decades away from true Level 5 autonomous vehicles. There are no vehicles that will let you sit back, relax, and check Instagram for the 12th time that day. Driving remains an activity that requires hyper-vigilance.


About the Author

  • Liz Kim has written about automobiles, both as a journalist and as a marketer, for 20 years. She enjoys giving advice about them to friends and family who want to make the most of their hard-earned dollars, and incorporates her experience as a mother and savvy consumer in everything she writes.
