Researchers show how to trick a Tesla into speeding using a piece of tape

midian182

Why it matters: While many modern vehicles contain incredibly advanced technology, it’s not infallible. The MobilEye EyeQ3 camera system installed on certain Tesla models, for example, can be tricked into accelerating the car up to 50 mph over the speed limit using a simple piece of tape.

McAfee researchers Steve Povolny and Shivangee Trivedi fooled the system by placing a sliver of black tape over part of a 35 mph sign to extend the middle line of the ‘3.’ The cameras in Tesla’s 2016 Model S and Model X read the altered sign as 85 mph, causing the cruise control to accelerate the car toward a speed 50 mph over the limit.

The MobilEye cameras can read speed limit signs and adjust the speed of the autonomous driving system accordingly. McAfee disclosed the research to Tesla and MobilEye last year, but the automaker said it would not be fixing hardware problems on that generation of vehicles. MobilEye told MIT Tech Review that the sign could have been misread by a human, and said the camera hadn't been designed specifically for fully autonomous driving.

Only early Tesla models equipped with Tesla hardware pack 1 and the MobilEye EyeQ3 camera were fooled by the trick. Newer Tesla vehicles use proprietary cameras, and MobilEye’s more recent camera systems were not susceptible.

“The newest models of Tesla vehicles do not implement MobilEye technology any longer and do not currently appear to support traffic sign recognition at all,” said Povolny.

The researchers said the number of 2016 Teslas on the roads meant their discovery was still a concern. “The vulnerable version of the camera continues to account for a sizeable installation base among Tesla vehicles,” added Povolny. “We are not trying to spread fear and say that if you drive this car, it will accelerate through a barrier, or to sensationalize it. The reason we are doing this research is we're really trying to raise awareness for both consumers and vendors of the types of flaws that are possible.”

Tesla’s Autopilot feature was found to have been activated during several fatal crashes, leading to Tesla reiterating that despite the somewhat misleading name, Autopilot is not meant to make the car fully autonomous, and drivers must keep their hands on the wheel at all times.

Main image credit: Angelus_Svetlana via Shutterstock


 
Changing a road sign like that is a serious crime that can easily lead to casualties. I do not see how this is Tesla's fault in any way.

Those McAfee researchers should know better.

As for the autonomous driving, I wouldn't use any of those today, the technology is still in its infancy across the industry, and not worth the risk.
 
"... and drivers must keep their hands on the wheel at all times" and brakes?

Changing a road sign like that is a serious crime that can easily lead to casualties. I do not see how this is Tesla's fault in any way.
I think it's more the sudden acceleration that's the problem. Most humans wouldn't fall for the trick and would be shocked that the car accelerates.

Frankly I'm disappointed that Tesla thinks a 4-year-old car is too old to be recalled.
 
This autopilot feature is just gonna make people worse drivers. I'd like a Tesla car with just the basic, essential features like radio and AC. It'd have none of the stuff I don't like that messes with how I want to drive.
 
Somebody please call the cops...

This has to be why the Starlink constellation is needed, because there are always foul-intentioned humans who want to get everyone killed for their own benefit.
 
Software is a moving target, so you can't simply say 'Teslas' and have that statement mean anything, even if it's qualified to indicate older vehicles. To make a meaningful statement you really need to talk hardware and software versions. I'm laughing a bit that 4-year-old software imperfections are making it into news.
 
While this is interesting, it is a clickbait beat-up. Why? Because the MobilEye systems with the limited number of cameras, fitted from 2012 to mid-2016, were all upgraded in 2016 to the latest hardware designed for Full Self-Driving (FSD). Your title plays on fears of FSD while never even hinting these are older, non-FSD-capable cars.

I had a 2014 Model for 3 months and there were some signs it struggled to read, e.g. LED speed signs, but it never did any crazy acceleration.

I also doubt that any of the cars actually tried to accelerate to 85 mph; I think it's more likely that the 85 showed up on the screen and was then corrected by the next speed sign. Am I right? Because even back in 2014, when AP was first downloaded to my car, its ability to hold the lane improved with the learning from EVERY CAR. Even a 70 km/h sweeping turn near my place that the car took uncomfortably fast in the first days was being taken below the speed limit after about a week. So I simply do not believe that one anomalous reading from a few cars had them accelerating to dangerous speeds... I'm calling BS till you show me proof.

Your misleading clickbait title is sending massive warning signals.
 
I'm laughing a bit that 4-year-old software imperfections are making it into news.
Well, this is not software run on a PC that sits in someone's house. It is in vehicles that are operated on public roads.

Tesla's reaction tells you something about their vehicles' expected life span really.
 
While this is interesting, it is a clickbait beat-up. Why? Because the MobilEye systems with the limited number of cameras, fitted from 2012 to mid-2016, were all upgraded in 2016 to the latest hardware designed for Full Self-Driving (FSD). Your title plays on fears of FSD while never even hinting these are older, non-FSD-capable cars.

I had a 2014 Model for 3 months and there were some signs it struggled to read, e.g. LED speed signs, but it never did any crazy acceleration.

I also doubt that any of the cars actually tried to accelerate to 85 mph; I think it's more likely that the 85 showed up on the screen and was then corrected by the next speed sign. Am I right? Because even back in 2014, when AP was first downloaded to my car, its ability to hold the lane improved with the learning from EVERY CAR. Even a 70 km/h sweeping turn near my place that the car took uncomfortably fast in the first days was being taken below the speed limit after about a week. So I simply do not believe that one anomalous reading from a few cars had them accelerating to dangerous speeds... I'm calling BS till you show me proof.

Your misleading clickbait title is sending massive warning signals.
A whole week? Until the next speed sign? That's an eternity to recognize something a human would instantly recognize. Most humans would even recognize a 35 mph sign altered to look like an 85 mph sign, if not from the sign itself, then from the common sense of knowing that the speed limit in the area is not 85 mph.

IMO, Musk/Tesla should be paying you to own and beta test his car.
 
Similarly, using tape to turn a 30 into an 80 is an old trick for converting a 30 mph zone into an 80 mph zone. It works on humans too.
 