Teslas Can Be Fooled Into Speeding


Researchers at McAfee discovered it takes only two inches of plastic tape on a speed limit sign to cause a Tesla to go zooming past the legal limit.

  • McAfee Advanced Threat Research wanted to see if it could trick vehicles into misreading road signs.
  • Tests centered around stop signs and speed limit signs.
  • One of the tests caused the camera system in a Tesla Model S and Model X to misread a 35-mph speed limit sign and accelerate toward 85 mph.

The altered sign McAfee used in its tests (bottom right-hand corner) looks remarkably normal. (Photo: McAfee/YouTube)

Self-driving virus

Research by McAfee Advanced Threat Research, the company's security team focused on spotting and preventing potential hacking threats, has uncovered a new vulnerability in Teslas. The California-based tech company is probably best known for antivirus software developed for laptops and home computers. But when it comes to automotive technology and sensor arrays, particularly hacks that could affect the driving behavior of your car or SUV, McAfee found it took surprisingly little subterfuge and skill to produce some dramatic results.

In this case, the test subjects included a 2016 Tesla Model S sedan and Model X crossover, both fitted with a MobilEye EyeQ3 camera array mounted in their windshields. MobilEye is one of the leading suppliers of camera-based sensors, with tens of millions of units in use in vehicles from brands like BMW, Audi, Cadillac, Nissan, Volvo, and Tesla.

A few inches of black tape on a speed limit sign could be all that’s needed for your vehicle to zoom 50 miles per hour above the legal limit. (Video: YouTube)

What a fool believes

McAfee initially looked into what it would take to cause a car’s road-sign detection system to misread a stop sign. But since this form of error has no direct influence on a car’s actual driving behavior – meaning, a misunderstood stop sign would not cause the vehicle to automatically accelerate or brake on its own – the research team looked for a hack that would directly influence a vehicle’s mechanical systems.

A more direct avenue was discovered with Traffic-Aware Cruise Control (TACC) and Speed Assist (SA) in the 2016 Model S and Model X. Ideally, these systems read a road sign and automatically adjust the vehicle's speed to match. Yet over 18 months of testing, McAfee's research team discovered it could fool the system with as little as 2 inches of strategically placed black tape on a speed limit sign. In repeated tests with TACC and SA engaged and the modified sign ahead, the Tesla read the speed limit as 85 mph, 50 mph higher than the sign truly indicated.
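The underlying mechanism can be sketched with a toy model: a classifier that matches what it sees against stored digit patterns can be pushed over a decision boundary by a change far too small for a human driver to notice. The bitmaps and nearest-template reader below are hypothetical illustrations for intuition only, not MobilEye's actual algorithm (which is a trained neural network operating on camera images):

```python
# Toy sketch (not McAfee's actual method): a nearest-template digit reader,
# showing how a tiny "tape" patch can flip a 3 into an 8.
# All bitmaps and names here are hypothetical.

THREE = [
    "###",
    "..#",
    "###",
    "..#",
    "###",
]
EIGHT = [
    "###",
    "#.#",
    "###",
    "#.#",
    "###",
]
TEMPLATES = {"3": THREE, "8": EIGHT}

def distance(a, b):
    """Count mismatched pixels between two equally sized bitmaps."""
    return sum(ca != cb for ra, rb in zip(a, b) for ca, cb in zip(ra, rb))

def classify(bitmap):
    """Return the digit whose template is closest to the bitmap."""
    return min(TEMPLATES, key=lambda d: distance(TEMPLATES[d], bitmap))

# An unaltered 3 reads correctly.
assert classify(THREE) == "3"

# "Tape" two pixels onto the left side of the 3's strokes...
taped = list(THREE)
taped[1] = "#.#"
taped[3] = "#.#"

# ...and the reader now sees an 8, though only 2 of 15 pixels changed.
print(classify(taped))  # → 8
```

The point of the sketch is proportion: only about 13 percent of the pixels change, which mirrors how a couple of inches of tape on a full-size sign can move a vision system's answer from 35 to 85 while leaving the sign perfectly legible to people.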

Diagram showing Audi Pre Sense 360 sensors. As cars become more aware of their driving environment, the chances for errors (and the potential for direct hacks) similarly increase. (Photo: Audi)

The good news

While this is a scary miscalculation, McAfee's conclusions contain some good news: the attack requires someone to physically alter a road sign, more modern MobilEye camera sensors (EyeQ4 and newer) are better at avoiding this error, and newer Tesla vehicles no longer use MobilEye sensors at all.

“We did get access to a 2020 vehicle implementing the latest version of the MobilEye camera and were pleased to see it did not appear to be susceptible to this attack vector or misclassification, though our testing was very limited,” wrote Steve Povolny, head of McAfee Advanced Threat Research, in a blog post detailing the experiment. “We’re thrilled to see that MobilEye appears to have embraced the community of researchers working to solve this issue and are working to improve the resilience of their product. Still, it will be quite some time before the latest MobilEye camera platform is widely deployed.”

WHY THIS MATTERS

There are enough bad drivers on the road already. The last thing anyone needs is computers and car sensors that misinterpret signs and go careening down a road. While the conditions for this error are quite narrow, this level of research helps keep auto companies one step ahead of hackers and other potential threats to driving safety.


About the Author

can be reached at nkurczewski@yahoo.com