New research shows that “phantom” images, such as road signs flashed on roadside billboards, could confuse semi-automated driver-assist systems such as Tesla’s Autopilot. According to a report in Tech Times, hackers could even hijack billboards on main roads, potentially leading to accidents or total loss of vehicle control. Wired recently reported that researchers were able to trick Tesla’s Autopilot by flashing a few frames of a stop sign for less than half a second on an internet-connected billboard.
Serious Safety Concerns
The researchers call these “phantom” images: split-second flashes of light or imagery along the road that could confuse self-driving or semi-automated vehicles about when to stop, go, or turn. Tesla and its CEO, Elon Musk, have come under increasing scrutiny after a number of crashes involving the company’s vehicles. So, how do these phantom images affect Tesla’s Autopilot?
Researchers offer this scenario: imagine a Tesla with Autopilot engaged driving in the middle of the night on a main road lined with flashing billboards. Some of these large ads feature warnings to drivers, including road signs. Once the vehicle spots such a sign, it may become confused about what to do next. If the system gets confused, the driver could lose control of the car, which could lead to traffic jams or even accidents.
Tesla states on its website that the Autopilot feature should be used only with a fully attentive driver at the wheel. But a recent study by researchers at the Massachusetts Institute of Technology (MIT) found that Tesla drivers are more distracted when they use Autopilot. The findings support calls for the automaker to take more steps to keep drivers attentive, for their own safety.
Auto Product Liability Issues
This is just one of many indications that autonomous and semi-autonomous vehicles need significant improvement. Much of this technology simply isn’t ready for prime time. Yet millions of these vehicles are already on our roadways, and their drivers have been lulled into a false sense of security, believing that Autopilot (as the name suggests) will do the job.
If you or a loved one has been injured in a Tesla crash involving Autopilot, you may be able to seek compensation for your injuries, damages, and losses. Don’t hesitate to get in touch with an experienced auto defect lawyer to learn more about pursuing your legal rights.
Source: https://www.techtimes.com/articles/253366/20201015/teslas-autopilot-crash-due-phantom-flickering-lights.htm