The National Highway Traffic Safety Administration (NHTSA) has opened an investigation this week into Tesla’s “Full Self-Driving” (FSD) system after the electric car maker reported four crashes, including one that killed a pedestrian.
According to an Associated Press report, investigators are examining the FSD technology’s ability to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”
The probe covers about 2.4 million Tesla vehicles between the 2016 and 2024 model years. The crashes reportedly occurred after Tesla vehicles entered areas of low visibility, including sun glare, fog, and airborne dust.
Tesla’s Plans for Unsupervised Autonomous Driving
Tesla has been actively pursuing its plans to increase the number of autonomous vehicles on the road. In October 2024, the company held an event in Hollywood to unveil a fully autonomous robotaxi without a steering wheel or pedals.
Tesla CEO Elon Musk has said that the company plans to have fully autonomous vehicles running without human drivers as early as next year, with robotaxis available in 2026.
Musk said during that Oct. 11 event that this would move vehicles “from supervised Full Self-Driving” to unsupervised. He said passengers could even fall asleep in the vehicle and wake up at their destination.
The Tesla boss called his robotaxi vision the “glorious future.” Tesla also expects to make the FSD technology available on its popular Model 3 and Model Y vehicles in Texas and California next year.
However, the impact of NHTSA’s probe on Tesla’s ambitious plans remains unclear. The agency is already investigating the reported crashes, including the fatal one, linked to possible FSD limitations.
The watchdog has also said it will investigate whether any other similar crashes involving FSD have happened in low-visibility conditions.
Officials have stated that they will seek information from Tesla regarding whether any updates affected the system’s performance under those conditions. Specifically, documents posted by NHTSA said the review will evaluate the timing, purpose, and capabilities of the updates and the company’s assessment of their safety impact.
NHTSA must approve any robotaxi that operates without pedals or a steering wheel, and those plans are unlikely to move forward while the investigation remains open.
Tesla’s FSD Criticized for Possible Safety Gaps
Tesla’s FSD has been criticized for relying solely on cameras to detect hazards and for lacking the sensors needed for fully autonomous driving. Other companies working on self-driving vehicles use radar, laser sensors, and cameras to improve visibility in the dark and in inclement weather.
Tesla has previously issued recalls of FSD after NHTSA investigated its Autopilot system, examining crashes involving emergency vehicles. That probe found 467 crashes involving the less sophisticated Autopilot, resulting in 14 deaths and 54 injuries. While Autopilot is essentially an advanced version of cruise control, Musk has touted FSD as capable of operating without human intervention.
This week’s new investigation is uncharted territory for NHTSA, which previously viewed Tesla’s systems as only assisting drivers rather than driving themselves.
This new probe is significant because it looks at FSD’s capabilities, not whether drivers were focused during the crash.
Safety advocates say the prior investigation of Autopilot failed to examine why the Tesla vehicles that crashed into emergency vehicles did not see or stop for them. In those investigations, the focus was on the driver, not the car.
Our auto defect attorneys will be watching this investigation closely, focusing on whether FSD can properly detect dangers in real-life scenarios.
Liability for Defective Autonomous Vehicles
As autonomous vehicles (or AVs) become more widespread, it is natural that questions about liability for accidents involving them grow louder. The legal framework governing this area intersects with both product liability and the evolving laws and regulations governing these types of vehicles.
Product liability traditionally imposes liability on manufacturers for defects arising from the creation, production, or marketing of goods. Manufacturers may be subject to strict liability, negligence, or breach of warranty claims in the context of autonomous vehicles.
Under strict liability, a manufacturer can be held responsible when a defective vehicle injures someone, even if the manufacturer was not negligent. Strict liability can apply to an autonomous vehicle when a flaw arises during production: defective materials, subpar assembly, or a breakdown in quality control can all make the manufacturer liable for a resulting accident.
The complicated interaction between hardware and software in autonomous vehicles also adds new levels of defect liability. Manufacturers produce complex software systems that control the driving functions of physical vehicles.
If an accident is caused by a software or sensor malfunction in the car, that too may be considered a manufacturing defect, particularly if the software was installed or integrated incorrectly. This adds a whole new layer to car accident lawsuits.
Furthermore, questions about who is responsible for such defects arise in the context of supply chains because autonomous vehicles are made of parts from multiple companies.
If component suppliers manufacture their products defectively, they may also share liability. Determining fault can become complex and could involve multiple parties, including software developers, component manufacturers, and vehicle assemblers.
Contact an Auto Defect Lawyer
With autonomous vehicle technology still in its infancy, courts and legislatures will continue to adapt and refine the laws governing liability, particularly regarding the unique aspects of how these vehicles are manufactured.
If you or a loved one sustains injuries in a crash involving an autonomous or semi-autonomous vehicle, it is important to contact an experienced product defect lawyer who understands the nuances of the laws and liability issues involving such vehicles and the technology behind them.
Source: https://www.cbsnews.com/news/tesla-fsd-self-driving-autopilot-elon-musk/