The National Transportation Safety Board (NTSB) determined this week that Tesla’s Autopilot driver-assistance system and a driver who relied too heavily on it while playing a video game on his smartphone are likely to blame for a fatal 2018 crash in California. According to a report in The New York Times, the NTSB criticized several institutions for failing to do more to prevent the crash, including the National Highway Traffic Safety Administration (NHTSA) for its hands-off approach to regulating automated-vehicle technology.
What Came Out of the Investigation
NTSB Chairman Robert Sumwalt said his agency urges Tesla to continue improving Autopilot technology and urges NHTSA to fulfill its oversight responsibility by ensuring that corrective action is taken when necessary. “It is time to stop enabling drivers in any partially automated vehicle to pretend they have driverless cars,” Sumwalt said.
The board adopted a number of staff findings and recommendations from an investigation into the crash that had been ongoing for over six months. It found that Autopilot failed to keep the vehicle in its lane, that its collision-avoidance software failed to detect a highway barrier, and that the driver, Wei Huang, was likely distracted by a game on his phone. The NTSB also determined that Huang would have survived the crash had the California Department of Transportation repaired the barrier he hit, which was designed to absorb some of the impact of a collision but had been damaged in a previous crash.
Autopilot Misleads Consumers
NHTSA has said all crashes caused by distracted driving, including those in which driver-assist systems were in use, are a cause for major concern. While Tesla has said drivers should remain vigilant and keep their hands on the wheel even when Autopilot is engaged, the feature’s name leads drivers to believe they can do whatever they please once the vehicle is on Autopilot. It gives the appearance of autonomous operation when the system does not actually work that way, and that gap between perception and capability has long been dangerous.
German regulators reportedly asked Tesla to stop using the term Autopilot as early as 2016, arguing that it suggests the technology is more advanced than it is. The Center for Auto Safety, an advocacy group, also protested, saying the name Autopilot suggests “full self-driving” and that Tesla is intentionally misleading consumers about what its technology can do.
It is no secret that drivers misuse Autopilot. The Internet is replete with evidence of drivers engaging Autopilot and then eating, sleeping, or reading a book. A survey by the Insurance Institute for Highway Safety found that 48% of drivers believe it is safe to take their hands off the wheel while using Autopilot. Tesla needs to fix this before more lives are lost; the status quo misleads consumers and endangers everyone on our roadways.
Source: https://www.nytimes.com/2020/02/25/business/tesla-autopilot-ntsb.html