Tesla Autopilot Duped By ‘Phantom’ Images

February 03, 2020 / Ajinkya Bagade

  • Researchers said that they were able to create “phantom” images purporting to be an obstacle, lane or road sign and trick systems into believing that they are legitimate.

  • On the scale of level 0 (no automation) to level 5 (full automation), these autopilot systems are considered “level 2” automation.

  • Configuring ADAS systems so that they take into account objects’ context, reflected light, and surface would help mitigate the issue.


Research has shown that a two-dimensional image of a person, projected in front of a Tesla Model X, caused its autopilot system to slow the car down. The autopilot system, which is also used in other popular cars, mistook the projection for a real person.
 


Phantom Images explained..


The researchers found that these autopilot systems can be fooled into detecting fake images projected onto the road or nearby billboards by drones. They warned that this flaw could prove fatal, as attackers could exploit it to make the systems brake suddenly or steer cars into oncoming traffic lanes. Similarly, projecting fake lane lines led the Model X to temporarily ignore the road's physical lane lines: where the fake and real lane lines crossed, the car followed the artificial markers into the wrong traffic lane, because the projections were brighter than the real-life markings.

 

“When projecting images on vertical surfaces (as we did in the case with the drone) the projection is very simple and does not require any specific effort. When projecting images on horizontal surfaces (e.g., the man projected on the road), we had to morph the image so it will look straight by the car’s camera since we projected the image from the side of the road. We also brightened the image in order to make it more detectable since a real road does not reflect light so well.”

- Ben Nassi, Researcher, Ben-Gurion University of the Negev
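The "morphing" Nassi describes is, in effect, a perspective pre-distortion: because the projector sits off to the side, the image must be warped so that it appears straight from the car camera's viewpoint. Such a warp is a homography, which can be estimated from four point correspondences. The sketch below is illustrative only; the specific points and function names are assumptions, not taken from the research paper:

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve for the 3x3 homography H mapping four src points to four
    dst points (direct linear transform)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(rows, dtype=float)
    # The homography coefficients span the null space of A; take the
    # singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_point(H, pt):
    """Apply the homography to a single 2D point."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Example: pre-distort a square image so an off-axis projection lands
# as a trapezoid that looks rectangular from the camera's viewpoint.
src = [(0, 0), (1, 0), (1, 1), (0, 1)]          # original image corners
dst = [(0, 0), (1, 0), (0.8, 1), (0.2, 1)]      # where they must land
H = homography_from_points(src, dst)
```

In practice every pixel of the projected image would be warped (for example with an image-processing library); this sketch only maps individual points to show the transform itself.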



The issue originates in advanced driving assistance systems (ADAS), which semi-autonomous vehicles use to assist the driver while driving or parking. ADAS are designed to increase driver safety by detecting and reacting to obstacles on the road. However, researchers said that they were able to create “phantom” images purporting to be an obstacle, lane or road sign; use a projector to transmit the phantom within the autopilots’ range of detection; and trick the systems into believing that they are legitimate.

 

 



“The absence of deployed vehicular communication systems, which prevents the advanced driving assistance systems (ADASs) and autopilots of semi/fully autonomous cars to validate their virtual perception regarding the physical environment surrounding the car with a third party, has been exploited in various attacks suggested by researchers.”

- Research Team, Ben-Gurion University of the Negev


The researchers also showed how a Tesla Model X can be made to brake suddenly by projecting a phantom image, perceived as a person, in front of the car.
Mobileye's 630 Pro technology also proved easily confused by projected imagery. Notably, Mobileye's traffic sign recognition system was fooled into relaying incorrect speed limit information to the driver due to its inability to distinguish between projected and physical speed limit signage.
 

READ MORE: Tesla: an underdog to the most powerful business mogul in the US


What does Tesla have to say?


Tesla, of course, was not available for comment.
 

The researchers said, though, that they are in touch with both Tesla and Mobileye regarding the issue. Mobileye maintained that the phantom attacks did not stem from an actual vulnerability in the system, the researchers revealed. Tesla, on the other hand, told the researchers, “We cannot provide any comment on the sort of behavior you would experience after doing manual modifications to the internal configuration – or any other characteristic, or physical part for that matter – of your vehicle.”


The researchers responded that they made no modifications that influenced the behavior which caused the car to steer into the oncoming traffic lane or brake suddenly after detecting a phantom. They excluded this demonstration from the research paper, as Tesla’s stop sign recognition system is experimental and is not considered a deployed feature.


The research team suggested that configuring ADAS to take objects’ context, reflected light, and surface into account would help mitigate the issue, since these cues allow the system to distinguish phantom images from physical objects.
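The paper's actual countermeasure is not published in this article, but the idea can be caricatured as fusing several per-detection cues into a single phantom-likelihood score. The cue names, weights, and threshold below are purely hypothetical assumptions for illustration:

```python
# Hypothetical sketch: fuse contextual cues into a phantom-likelihood score.
# Each cue is a 0..1 confidence assumed to come from a dedicated detector
# (context plausibility, natural light reflectance, surface consistency).
def phantom_score(context_plausible, reflectance_natural, surface_consistent,
                  weights=(0.4, 0.3, 0.3)):
    """Return a 0..1 score; higher means more likely a projected phantom."""
    cues = (1 - context_plausible, 1 - reflectance_natural,
            1 - surface_consistent)
    return sum(w * c for w, c in zip(weights, cues))

def is_phantom(score, threshold=0.5):
    """Flag the detection as a phantom if the fused score exceeds threshold."""
    return score > threshold
```

A real system would learn these cues and weights from data rather than hand-tune them; the point of the sketch is only that a detection flagged as implausible on several independent cues can be discarded before the car brakes or steers.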


READ MORE: Tesla's disruptive success in competitive car-making and how it can airlift Boeing



Truth be told..


The research findings are disturbing. On the scale from level 0 (no automation) to level 5 (full automation), these two systems are considered “level 2” automation. This is a clear indication that semi-autonomous autopilot systems still require human intervention.
 

Though the phantom attacks are not security vulnerabilities, the researchers said they reflect a fundamental flaw in object-detection models that were not trained to distinguish between real and fake objects.
 

Connected cars have long been a target for hackers, with vehicle-related attacks ranging from keyless entry systems to in-vehicle infotainment systems. This study confirms that autopilot systems and other advanced driver-assist features make the driving experience safer, but do not replace the act of driving or the human driver altogether.