Why it matters: As technology advances, the safety of driver assistance systems continues to improve---but they're not infallible. Security researchers have shown how hijacked internet-connected billboards could trick systems such as Tesla's Autopilot into braking suddenly, stopping, or swerving.

As reported by Wired, researchers at Israel's Ben-Gurion University of the Negev have been experimenting with "phantom" images that can confuse self-driving/driver-assistance tech. The technique involves creating split-second light projections so the systems "see" something that isn't really there, such as a stop sign.

Previous demonstrations of the technique projected the image of a person onto a road and road signs onto a tree. The projections appeared for only a few milliseconds but were enough to convince advanced driver-assistance systems (ADAS) that they were real.

The new research builds on the same method, but instead of projecting an image, it uses an internet-connected billboard. By hacking into one of these digital signs (which, judging by those that have been hacked to play porn, isn't impossible), perpetrators can inject a few frames that could cause a car to crash, leaving little evidence behind and giving the driver no idea of what happened.

The researchers tested the second-most-recent version of Tesla's Autopilot and a Mobileye 630 system. A phantom stop sign displayed for just 0.42 seconds was enough to trick the Tesla; the Mobileye system needed only an eighth of a second.
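To relate those exposure times to the "few frames" an attacker would need to inject, here is a rough back-of-the-envelope sketch. The 60 Hz refresh rate is an assumed, illustrative figure, not one taken from the research:

```python
# Back-of-the-envelope: how many billboard frames roughly correspond to the
# phantom exposure times reported for each system.
# The 60 Hz refresh rate is an assumption for illustration only.

REFRESH_HZ = 60  # hypothetical billboard refresh rate

thresholds = {
    "Tesla Autopilot (second-most-recent version)": 0.42,  # seconds a phantom stop sign was shown
    "Mobileye 630": 0.125,                                 # an eighth of a second
}

for system, seconds in thresholds.items():
    frames = round(seconds * REFRESH_HZ)
    print(f"{system}: {seconds} s is roughly {frames} frames at {REFRESH_HZ} Hz")
```

Under that assumption, both flashes last only a fraction of a second on screen, short enough that a passing driver could easily miss them while the cameras still register the phantom sign.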

Using billboards instead of projections would let an attack reach a larger number of vehicles and potentially cause more carnage.

The research is due to be presented at the ACM Conference on Computer and Communications Security (CCS) this November.

Several early versions of self-driving/driver-assistance technologies were susceptible to hacks. In 2015, we saw examples of how Lidar could be tricked into seeing phantom objects using a laser pointer and a Raspberry Pi. Researchers also demonstrated how a piece of tape on a speed limit sign could fool an older Tesla into exceeding the limit.