Is NHTSA Getting Ready to Formally Investigate Tesla's Autopilot?
Mashable reports that a team of Chinese white-hat hackers has just proved that Autopilot is not Tesla's only problem. The hackers, who expose vulnerabilities with good rather than evil intentions, released a blog post as well as a video demonstrating their ability to hack and control unmodified Tesla vehicles in both parking and driving modes. Sure, they had fun adjusting seats and activating windshield wipers.
But it got a lot scarier. Not only was the group able to remotely unlock the cars without a key fob, but they were also able to remotely activate the brakes, bringing a car to a sudden stop. While a number of these demos were performed close to the car, the hackers were able to apply the brakes from 12 miles away. The hackers, good guys that they are, alerted Tesla to the security vulnerabilities and are even working with the automaker to remedy them.
Cause for Serious Concern
Even though no real-world incidents involving hacked Teslas have been reported yet, it is incredibly worrisome for Tesla owners to know that someone miles away can control their cars. If a company as tech-savvy as Tesla can produce vehicles this vulnerable to hacking, what security concerns do the other vehicles on our roadways face?
The U.S. National Highway Traffic Safety Administration (NHTSA) just this week introduced unprecedented guidelines for autonomous cars. While the guidelines address things like the public's trust in autonomous vehicles and technological advancements, they do not adequately address the concerns over hacking and the potential criminal acts and serious personal injuries that could result.
We Just May Not Be Ready
As auto product liability attorneys, we see this as adding an extra layer of concern to autonomous and semi-autonomous vehicles. If government regulators perceive a lack of public trust in this technology, that's no surprise. How could consumers trust a vehicle that cannot tell the difference between a bright sky and the white side of a turning big rig? That's exactly what happened in a fatal Florida crash, where a Tesla on Autopilot struck a turning truck.
We are simply not ready for these vehicles yet, and automakers should stop using consumers as lab rats for tomorrow's ground-breaking technology. None of these vehicles should be on our roads unless they have been tested and proven reasonably safe. There's nothing "reasonable" about what we've heard, read, or seen so far.