Researchers show how hacked billboards could force Tesla's Autopilot into a collision

midian182

Why it matters: As technology advances, the safety of driver assistance systems continues to improve—but they're not infallible. Security researchers have shown how hijacked internet-connected billboards could trick systems such as Tesla's Autopilot into braking suddenly, stopping, or swerving.

As reported by Wired, researchers at Israel's Ben Gurion University of the Negev have been experimenting with "phantom" images that can confuse self-driving/driver-assistance tech. It involves creating split-second light projections so the systems "see" something that isn't really there, such as a stop sign.

Previous examples of this technique projected the image of a person onto a road, as well as road signs onto a tree. The projections only appeared for a few milliseconds but were enough to convince advanced driver-assistance systems (ADAS) that they were real.

The new research builds on the same method, but instead of projecting an image, it uses an internet-connected billboard. By hacking into one of these digital signs—which, judging by those that have been hacked to play porn, isn't impossible—perpetrators can inject a few frames that could cause a car to crash, leaving little evidence and without the driver understanding what happened.

The second-most-recent version of Tesla's Autopilot and a Mobileye 630 system were tested. In the case of the former, a phantom stop sign appearing for just 0.42 seconds tricked the vehicle; with the latter, it took 1/8th of a second.
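
To put those exposure times in context: at a typical 30 fps camera frame rate, 0.42 seconds spans roughly a dozen frames. The sketch below is a simplified, hypothetical detector (not Autopilot's or Mobileye's actual logic, and the frame rate and streak threshold are assumptions) showing how a detection rule that fires after a short streak of consecutive frames can be satisfied by a phantom that appears only briefly.

```python
# Illustrative sketch only -- not any vendor's real detection pipeline.
# Shows why a "phantom" sign persisting for a fraction of a second can
# still satisfy a detector that triggers after a short streak of
# consecutive per-frame detections.

FRAME_RATE_HZ = 30    # assumed camera frame rate
TRIGGER_STREAK = 10   # assumed consecutive-frame threshold (~0.33 s)

def should_react(detections_per_frame):
    """Return True once a sign is seen in TRIGGER_STREAK consecutive frames."""
    streak = 0
    for seen in detections_per_frame:
        streak = streak + 1 if seen else 0
        if streak >= TRIGGER_STREAK:
            return True
    return False

# A phantom injected into a billboard feed for 0.42 s covers ~12 frames
phantom_frames = int(0.42 * FRAME_RATE_HZ)
frames = [False] * 30 + [True] * phantom_frames + [False] * 30

print(should_react(frames))  # True: the brief phantom clears the threshold
```

The real systems are far more sophisticated, but the underlying point is the same: a detector with a short persistence window can be triggered by an image a human driver would barely register.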

Using billboards instead of projections would reach a larger number of vehicles and potentially cause more carnage.

The research is due to be presented at the ACM Computer and Communications Security conference this November.

Several early versions of self-driving/driver-assistance technologies were susceptible to hacks. In 2015, we saw examples of how Lidar could be tricked into seeing phantom objects using a laser pointer and a Raspberry Pi. Researchers also demonstrated how to make an older Tesla break speed limits using a piece of tape.


 
Autopilot is supposed to be an assistant to driving. Not a substitute.

People always need to be paying attention to the road conditions.

That said: I'm surprised that speed limits aren't simultaneously GPS-verified to prevent this from happening.
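
For illustration, here is a minimal sketch of the kind of GPS/map cross-check described above, with both lookups stubbed out as hypothetical placeholders (neither function reflects any real vehicle's API):

```python
# Hypothetical sketch of cross-checking a camera-read speed limit against
# map data for the current GPS position. All values and helpers are stubs.

def speed_limit_from_map(lat: float, lon: float) -> int:
    """Stub: posted limit for this road segment from map data."""
    return 65  # assumed value for the example

def speed_limit_from_camera(frame) -> int:
    """Stub: speed limit read from a sign by the vision system."""
    return 35  # e.g. a phantom or tampered sign

def plausible_limit(camera_limit: int, map_limit: int, tolerance: int = 10) -> bool:
    """Accept the camera reading only if it roughly agrees with map data."""
    return abs(camera_limit - map_limit) <= tolerance

camera = speed_limit_from_camera(frame=None)
mapped = speed_limit_from_map(37.77, -122.42)
if not plausible_limit(camera, mapped):
    print(f"Camera-read limit {camera} mph disagrees with map data ({mapped} mph); ignoring it")
```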
 
Even if there are powerful algorithms and machine learning inside the car's software, there is still a long way to go before the car can effectively be left to drive on its own. For now, one has to pay attention to the driving and be ready in the critical moments.
 
Autopilot is supposed to be an assistant to driving. Not a substitute.

People always need to be paying attention to the road conditions.

That said: I'm surprised that speed limits aren't simultaneously GPS-verified to prevent this from happening.
Even if you are paying attention, I am not so sure how well a human could recover from sudden braking at highway speed. There will first be the confusion of "what?" as you suddenly start to brake hard, then the actual reaction time once you realize something is wrong. Finally, what exactly do you do to recover? It's the brake. Do you step on the gas and hope Autopilot releases the brake in time so that the vehicle behind you doesn't hit you?

A swerve should be relatively easy to correct. Braking may not be.
 
Is it really newsworthy to report finding bugs in an old version of software? Would you report on Windows bugs that only existed years ago? Tesla hasn't used Mobileye in years.

Software is a moving target, which means what was true yesterday is not necessarily true today. To report on software responsibly, you must report version numbers, release dates, and whether the public is subject to any of these findings... use your head.
 
This will never work in the real world. What, someone will run along the highway with a high-power projector in hand to TRY to trick Tesla cars? LOL
Not to mention that Autopilot does not allow autonomous operation for more than a few seconds. The person running with the projector would have to have very good reflexes to find cars whose drivers aren't paying attention.
The whole thing is as stupid as the people who "developed" it...
 
Is it really newsworthy to report finding bugs in an old version of software? Would you report on Windows bugs that only existed years ago? Tesla hasn't used Mobileye in years.
The clickbait headline is even worse than that, as this research does not show how to "force Teslas into a collision", but rather simply to cause them to avoid an obstacle that doesn't really exist. But since such obstacles could exist, the software's reaction to a phantom stop sign or pedestrian should be no different than that of a real one, and should not result in a collision.
 
Even if you are paying attention, I am not so sure how well a human could recover from sudden braking at highway speed. There will first be the confusion of "what?" as you suddenly start to brake hard, then the actual reaction time once you realize something is wrong. Finally, what exactly do you do to recover? It's the brake. Do you step on the gas and hope Autopilot releases the brake in time so that the vehicle behind you doesn't hit you?

A swerve should be relatively easy to correct. Braking may not be.

You forget that a Tesla has rear sensors. The rear ultrasonic sensors are also part of a Tesla's collision-avoidance strategy. If a Tesla senses that somebody is speeding up behind it, it will accelerate to get away from the pursuing vehicle. There's even a video on YouTube showing a Tesla avoiding a rear-end collision.
 
You forget that a Tesla has rear sensors. The rear ultrasonic sensors are also part of a Tesla's collision-avoidance strategy. If a Tesla senses that somebody is speeding up behind it, it will accelerate to get away from the pursuing vehicle. There's even a video on YouTube showing a Tesla avoiding a rear-end collision.
Ok, and what does it do when the front sensors say 'stop' and the rear sensors say 'go'?
 
If the computer verified the speed limit, it would prevent a hacked billboard from sending the car into an unintended acceleration.
I think you've misunderstood what exactly is happening here. Read the underlying linked article. These phantom images are in no manner causing a vehicle to exceed speed limits, but rather to brake or swerve to avoid an obstacle. Furthermore, when you say "GPS-verified" speed limits, you seem to be wrongly presuming the vehicle itself is being hacked, rather than devices external to it.
 
Even if there are powerful algorithms and machine learning inside the car's software, there is still a long way to go before the car can effectively be left to drive on its own. For now, one has to pay attention to the driving and be ready in the critical moments.
I fully agree. I also think that using the technology in this sort of "half way" stage is a bad idea. There have already been several accidents caused by "hard of thinking" drivers assuming they are in a fully automated car, so once they're on the road they stop paying attention and disappear into smartphone mode.
 