Federal Investigators Say Design Defect in Tesla’s Autopilot Contributed to Culver City Crash
A design defect in Tesla’s Autopilot was a critical factor in a crash that involved a Model S slamming into a fire truck along the 405 Freeway in Culver City in January 2018. According to a news report, the National Transportation Safety Board (NTSB) also determined that the driver was overly reliant on the Autopilot feature and that the Autopilot’s design allowed him to disengage from driving.
How the Crash Happened
The NTSB’s report on this incident raises important questions about the effectiveness and safety of Tesla’s semi-autonomous Autopilot feature, which was engaged but failed to brake in the Culver City crash, as well as in three other crashes since 2016 in which the drivers were killed. No one was injured in the Culver City crash, which involved a 2014 Tesla Model S traveling 31 mph at the time of impact. The crash occurred after a larger vehicle ahead of the Tesla moved out of its lane, and the Tesla struck a fire truck that had been parked with its emergency lights flashing while firefighters handled a separate crash.
Defective and Dangerous System
The report says Tesla’s automatic emergency braking did not activate, and the driver, a 47-year-old man traveling from his Woodland Hills home to Los Angeles, did not brake either. The driver’s hands were not detected on the wheel in the moments leading up to the crash, although cell phone data showed he was not using his phone to talk or text in the minutes before impact.
This finding by the NTSB is another black mark against the Autopilot system, which has been involved in three fatal crashes: two in Florida, and one in California in March 2018, when Autopilot accelerated just before a Model X SUV crashed into a freeway barrier, killing its driver.
Our auto defect lawyers have been saying for a long time that Tesla’s Autopilot system misleads drivers into believing they have a fully automated system in their hands when it is, at best, semi-autonomous. Based on Tesla’s marketing, drivers believe the car can “take care of itself” on Autopilot. But that’s not true. In fact, Tesla has known for years that its system allows drivers not to pay attention, and yet the automaker hasn’t taken the problem seriously. Autopilot is nothing more than a driver-assist feature, and Tesla’s failure to acknowledge that fact in an emphatic way continues to put drivers of its vehicles in danger.