Is NHTSA Getting Ready to Formally Investigate Tesla's Autopilot?
Tesla Motors, and particularly its Autopilot feature, has come under renewed scrutiny after reports from China that a fatal crash there may have occurred while the vehicle's automated driver-assist system was in operation. According to a report in The New York Times, the crash took place on January 20, 2016 and killed Gao Yaning, 23, when the Tesla Model S he was driving slammed into a road sweeper on a highway near Handan, about 300 miles south of Beijing.
The Chinese government’s news channel CCTV broadcast a report that included in-car video looking through the windshield as the car travelled in the left lane at highway speed just before striking a parked or slow-moving orange truck. The video, recorded by a camera mounted on the rearview mirror, captured no images, sounds or jolts suggesting that either the driver or the car applied the brakes before the collision. A police officer is heard saying in the CCTV report that the car made no attempt to slow or stop, but instead rammed straight into the sweeper.
Shrouded in Secrecy?
Tesla said in a statement that it has not been able to determine whether Autopilot was active in this particular incident. It has also declined to say when it learned of the fatality in China, or whether it reported the crash to U.S. safety officials, who are investigating a fatal May 7 accident in Florida in which Autopilot was engaged. So far, the Florida crash is the only confirmed fatality involving a Tesla with Autopilot turned on.
Even in the Florida case, Tesla did not publicly disclose the incident or its details until weeks afterward. The accident in China occurred back in January. It is hard to imagine this is the first Tesla is hearing of it, and it is truly disturbing that the company has not said a word about the incident.
Tesla Should Be Accountable
A Tesla spokeswoman explained that because of the damage caused by the collision with the sweeper, the car was physically incapable of transmitting data to Tesla’s servers, leaving the company no way of knowing whether Autopilot was engaged at the time. Last week, Tesla CEO Elon Musk outlined changes to Autopilot that, he said, could have prevented the fatal crash in Florida. In August, the automaker removed a Chinese term for “self-driving” from its China website. That move came after a driver in Beijing was injured in a Tesla crash while Autopilot was engaged.
It is apparent that Tesla played up the Autopilot feature in its marketing, but took a big step back after the crashes, not just downplaying the feature but warning that drivers should not take their hands off the wheel even when Autopilot is activated. Then why call it Autopilot? And why put technology on the road before it is ready for primetime? Auto defect attorney Brian Chase has said automakers have no right to use consumers as guinea pigs to test new technology. Musk says the new Autopilot would have prevented the Florida crash. Why was a vehicle that could not tell the difference between a bright sky and the side of a turning big rig put on the road in the first place?