Tesla Autopilot System Found Probably at Fault in 2018 Fatal California Crash
Tesla's Autopilot driver-assistance system and a driver who relied on it too heavily while playing a video game on his smartphone are likely to blame for a fatal 2018 crash in California, the National Transportation Safety Board (NTSB) determined this week. According to a report in The New York Times, the NTSB criticized several institutions for failing to do more to prevent the crash, including the National Highway Traffic Safety Administration (NHTSA) for its hands-off approach to regulating automated-vehicle technology.
What Came Out of the Investigation
NTSB Chairman Robert Sumwalt said his agency urges Tesla to continue improving Autopilot technology and NHTSA to fulfill its oversight responsibility and ensure that corrective action is taken when necessary. "It is time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars," Sumwalt said.
The board adopted a number of staff findings and recommendations from an investigation into the crash that had been ongoing for over six months. It found that Autopilot failed to keep the vehicle in its lane, that its collision-avoidance software failed to detect a highway barrier, and that the driver, Wei Huang, was likely distracted by a game on his phone. The NTSB also determined that Huang would have survived the crash had the California Department of Transportation repaired the barrier he hit, which was designed to absorb some of the impact of a crash but had been damaged in a previous collision.
Autopilot Misleads Consumers
NHTSA has said all crashes caused by distracted driving, including those in which driver-assist systems were in use, are a cause for major concern. While Tesla has said drivers should remain vigilant and keep their hands on the wheel even when Autopilot is engaged, the feature's name misleads drivers into thinking they can do whatever they please when the vehicle is on Autopilot. It gives the appearance of autonomous operation when the system does not work that way, and that gap between name and capability is dangerous.
German regulators reportedly asked Tesla as early as 2016 to stop using the term Autopilot, arguing that it suggests the technology is more advanced than it actually is. The Center for Auto Safety, an advocacy group, has also protested, saying the name Autopilot suggests "full self-driving" and that Tesla is intentionally misleading consumers about what its technology can do.
It is no secret that drivers misuse Autopilot. The evidence is all over the Internet, where drivers can be seen eating, sleeping, or reading a book while Autopilot is engaged. A survey by the Insurance Institute for Highway Safety found that 48% of drivers believe it is safe to take their hands off the wheel while using Autopilot. Tesla needs to address this before more lives are lost. The feature's marketing misleads consumers and poses a grave danger to everyone on our roadways.