Tesla’s controversial driver-assist system, Autopilot, can be gamed with consumer hardware and two-dimensional image projections, according to researchers in Israel who put the system to the test. Researchers at Ben-Gurion University of the Negev were able to trick self-driving cars, including a Tesla Model X, into braking and taking evasive action to avoid “depthless phantom objects.”
Cause for Serious Concern
Two-dimensional image projections tricked Tesla’s Autopilot system into thinking a person was actually standing on the road. To the human eye, it is clear that the image is a flat projection and poses no physical threat. But the car perceives it differently. These phantom image projections can trick the system into thinking that several objects are on the road ahead.
Researchers were even able to trick the system’s speed limit warning feature with a phantom road sign. By projecting new road markings onto the tarmac, they were also able to make the Autopilot system brake suddenly and deviate from its lane.
What is seriously worrisome is that phantom image attacks can be carried out remotely, using drones or even hacked video billboards. In some cases, the images can appear and disappear faster than the human eye can detect, yet the high-speed sensors used in these driver-assist systems are still able to perceive them.
Tech Has a Long Way to Go
This research sends another resounding warning about the inadequacy of autopilot systems and other driverless technology. There are several kinks that need to be ironed out. Despite the sophisticated technology that has already gone into these vehicles, they are still not ready for primetime. Given that Tesla Model X vehicles remain vulnerable, and considering how many of these vehicles are out on the roadways, one can only imagine the extent of the damage that could be done.
As auto defect attorneys, we are extremely concerned about the safety of autonomous and semi-autonomous vehicle technology. Tech companies and automakers appear eager to get these vehicles out on the road, but they have in no way been diligently tested. There are still several glitches that need to be fixed. The public should not be used as guinea pigs in the testing of this technology. Public safety should always come first.