Fake images can fool autonomous cars, posing risks, Israeli researchers warn

Attackers could use cheap consumer drones to project fleeting images onto roadways, fooling Tesla and Mobileye navigation systems and causing havoc, says Ben-Gurion University team

Luke Tress is The Times of Israel's New York correspondent.

In a Ben-Gurion University study, a Tesla perceives a projected image as a real person, left, and Mobileye's 630 PRO autonomous vehicle system considers an image projected on a tree as a real road sign, right.

Autonomous vehicles can be fooled by “phantom” images displayed on a road, wall or sign, causing them to unexpectedly brake or veer off course and making them vulnerable to attackers, Israeli researchers said.

Semi- and fully-autonomous cars perceive and respond to two-dimensional projections as real objects, according to researchers from the Ben-Gurion University of the Negev.

Attackers could exploit the vulnerability to put vehicle passengers in danger, for example by projecting an image of a person in front of a car to make it brake suddenly, or by projecting fake lane markers onto the surface of a road to direct the car into oncoming traffic or onto a sidewalk, endangering pedestrians.

The team from the university’s Cyber Security Research Center used the Tesla Model X and the Mobileye 630 PRO system in the research, which was published earlier this month through the International Association for Cryptologic Research, a nonprofit science organization.

“This is not a bug. This is not the result of poor code implementation. This is a fundamental flaw in object detectors that essentially use feature matching for detecting visual objects and were not trained to distinguish between real and fake objects,” researcher Ben Nassi said in a statement. “This type of attack is currently not taken into consideration by the automobile industry.”

The team demonstrated how attackers could carry out an attack remotely by using a drone to project an image onto a road, or by hacking a digital billboard to insert a fake road sign into an advertisement. Mobileye’s system, a leader in the field, could be fooled with a fake projection lasting just 125 milliseconds, the researchers said.

Cars in the Mobileye fleet of autonomous vehicles leave the Mobileye garage for test drives November 5, 2019, as part of the 2019 Mobileye Investor Summit. (Walden Kirsch/Intel Corporation)

For only a few hundred dollars, a bad actor could purchase a drone equipped with a portable projector and use it to carry out a terror attack by tricking cars into hitting pedestrians, for example. Criminals could create traffic jams by projecting fake low-speed-limit signs onto roadways. Fraudsters could deliberately cause their own self-driving Tesla to crash, then sue the company.

Previous studies have shown that autonomous driving systems are vulnerable to attack, but only by skilled attackers who carry out lengthy preparations and approach the scene in person, putting themselves at risk. A phantom attack, by contrast, can be mounted remotely and would leave little to no evidence behind.

The sensors in autonomous vehicles do not verify what they perceive against other systems, and therefore react to stimuli independently, creating what the researchers call a “validation gap” that hackers could exploit.

The vehicles do include sensors that perceive depth, but the researchers said the systems likely employ a “better safe than sorry” policy that treats the projections as real objects.

Computer vision algorithms identify their surroundings based on geometry and patterns (the researchers call them “feature matchers”), such as the shape of a stop sign, and do not consider the context of the image, its texture, or how realistic it looks. So, a partially transparent stop sign floating in a tree, or a pixelated image of a man lying in a street, could cause a car to suddenly brake.
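To illustrate the idea, here is a toy sketch of shape-only matching using the open-source OpenCV library. It is not the proprietary detector used in either vehicle; it simply shows why a system that scores geometric similarity alone has no way to tell a physical object from light projected in its shape.

import cv2
import numpy as np

# Toy illustration of shape-only "feature matching." This is not
# Tesla or Mobileye code; it shows why a purely geometric detector
# cannot distinguish a physical object from a projection of its shape.

scene = np.zeros((200, 200), dtype=np.uint8)
cv2.circle(scene, (120, 80), 20, 255, -1)      # stand-in for a sign in the scene
template = np.zeros((50, 50), dtype=np.uint8)
cv2.circle(template, (25, 25), 20, 255, -1)    # the shape the detector keys on

# Score every placement of the template against the scene.
scores = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_loc = cv2.minMaxLoc(scores)

# The decision is purely geometric: a projected shape that matches the
# template clears the threshold exactly as a real object would.
if best_score > 0.8:
    print(f"object 'detected' at {best_loc}, score {best_score:.2f}")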

The systems do not have any concept of what a fake object would look like, and do not take into account the possibility of phantom image attacks.

Mobileye’s system relies solely on computer vision algorithms, which assume all objects are real. The system registers any image of a stop sign within a certain size range: if the sign appears too small, the system “thinks” it is far away, and if it appears too big, it thinks the car is already too close to stop. The researchers found the system does not even register the colors of the sign.
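That size-range behavior is consistent with standard pinhole-camera arithmetic relating a sign’s real size, its apparent size in pixels, and its distance. The numbers below are illustrative assumptions, not Mobileye’s parameters:

# Illustrative pinhole-camera arithmetic behind size-based range gating.
# Both constants are assumed values for the example, not Mobileye's.

FOCAL_LENGTH_PX = 1000      # assumed camera focal length, in pixels
SIGN_DIAMETER_M = 0.75      # roughly a regulation stop sign's width

def implied_distance(apparent_px: float) -> float:
    """Distance implied by how large the sign appears in the image."""
    return FOCAL_LENGTH_PX * SIGN_DIAMETER_M / apparent_px

# A sign spanning 25 px reads as 30 m away; one spanning 150 px as 5 m.
for px in (25, 150):
    print(f"{px:>4} px -> {implied_distance(px):.1f} m")

# A system gating on this size range alone cannot tell whether the
# octagon at that distance is sheet metal or projected light.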

In its experiment on Mobileye’s driver assistance system, the team used a DJI Matrice 600 drone with a projector disguised as a delivery box, and a Renault Captur car.

The drone projected a 90 km/h (56 mph) speed sign onto a wall for 125 milliseconds in an urban environment, tricking the vehicle into telling its operator to drive dangerously fast. The team was able to carry out the same attack using an 8.8-ounce projector mounted on a smaller DJI Mavic drone, which retails for $400.

The study notes that billboards could be hacked and signs hidden in an advertisement video, making them difficult for humans to notice.

The researchers inserted a road sign into a Coca-Cola advertisement video on a billboard for only three video frames, again tricking the car into trying to drive dangerously fast.
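Assuming a standard video frame rate of 24 frames per second (the billboard’s actual rate is not given here), three frames works out to 3/24 = 0.125 seconds, the same 125-millisecond exposure that fooled the Mobileye system in the drone experiment.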

To test attacks on semi-autonomous cars, the researchers used a Tesla Model X with its “Hardware 2.5” Autopilot capabilities. Although it is called Autopilot, the system is only meant to assist a human driver; the company intends to release fully autonomous cars in the future.

Tesla’s obstacle detection system employs an array of cameras, sensors and radar.

The research team projected the image of a person onto the road ahead of the vehicle, which it perceived as an actual human. (The “phantom person” the team projected onto the road was an image of Tesla founder Elon Musk.) The car’s radar and sensors, which are meant to monitor other vehicles, are unable to effectively detect people.

A 2019 Tesla Model X at a Tesla in Littleton, Colorado, Oct. 20, 2019. (AP Photo/David Zalubowski, File)

Next, the team projected an image of a car in front of the Tesla. To the team’s surprise, the Tesla registered the phantom image as an actual vehicle, suggesting that the car’s obstacle detection system does not cross-check what the cameras see against the vehicle’s other sensors. The researchers said they contacted Tesla for an explanation, but the company declined to comment.

Attackers could exploit the flaw by projecting images onto a highway, for example, causing cars to brake suddenly and putting the occupants of the car and other vehicles in danger.

The researchers also tricked the Tesla into driving into an oncoming traffic lane by projecting false road markings onto the street ahead of it.

To fix the problem, the researchers recommended add-on software that would validate detected objects, so that the cars rely on more than their cameras to navigate. The cameras should take into account information beyond shapes and patterns, such as an image’s size, angle, context, surface, and lighting, they said.
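A minimal sketch of what such a validation layer might look like, assuming an upstream perception stack supplies a plausibility score (0.0 to 1.0) for each cue; the fields and thresholds here are hypothetical, not the researchers’ actual countermeasure:

from dataclasses import dataclass

# Hypothetical sketch of the recommended fix: vet each camera detection
# against cues beyond shape before acting on it. The cue scores are
# assumed inputs from an upstream perception stack; none of this is
# Tesla, Mobileye, or Ben-Gurion University code.

@dataclass
class Detection:
    label: str        # e.g., "stop_sign" or "pedestrian"
    context: float    # plausible placement? (pole vs. tree canopy)
    surface: float    # physical texture vs. projected light
    lighting: float   # reflectance consistent with the scene
    depth: float      # agreement with radar/stereo depth (flat = low)

def is_probably_real(d: Detection, threshold: float = 0.5) -> bool:
    """Majority vote over independent cues: a projection that fools the
    shape detector should still fail most of these checks."""
    cues = (d.context, d.surface, d.lighting, d.depth)
    return sum(score > threshold for score in cues) >= 3

# A phantom stop sign projected onto a tree: right shape, wrong everything else.
phantom = Detection("stop_sign", context=0.2, surface=0.3, lighting=0.1, depth=0.0)
print(is_probably_real(phantom))   # False -> the detection is discarded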

The paper said that, in response to the findings, Mobileye said: “There was no exploit, no vulnerability, no flaw, and nothing of interest: the road sign recognition system saw an image of a street sign, and this is good enough, so Mobileye 630 PRO should accept it and move on.”

Tesla dismissed the findings, saying that its autopilot system “is intended for use only with a fully attentive driver who has their hands on the wheel and is prepared to take over at any time. While Autopilot is designed to become more capable over time, in its current form, it is not a self-driving system, it does not turn a Tesla into an autonomous vehicle, and it does not allow the driver to abdicate responsibility.”

While current automated driving systems are meant to help, but not replace, human users, many of today’s drivers overestimate vehicle capabilities. A study last year found that 48 percent of drivers thought it would be safe to take their hands off the wheel while using Tesla’s Autopilot system.

Mobileye is an Israel-based subsidiary of Intel, which took over the startup for $15.3 billion in 2017 in the largest-ever acquisition of an Israeli company.
