A group of law enforcement officers in Texas is suing Tesla after a Model X with Autopilot engaged crashed into five police officers.
According to a report in The Verge, the crash took place on Feb. 27 in Splendora, a small town in Montgomery County in eastern Texas, and left several officers injured.
According to the lawsuit, the Model X SUV crashed into several police officers while they were engaged in a traffic stop on the Eastex Freeway in Texas.
What the Lawsuit States
The plaintiffs claim that “design and manufacturing defects known to Tesla” are responsible for the crash as well as “Tesla’s unwillingness to admit or correct such defects.” The lawsuit argues that the Autopilot system failed to detect the officers’ cars or to function in any way to avoid or warn of the hazard or subsequent crash.
The plaintiffs also say in the lawsuit that this was not an “isolated instance,” citing at least a dozen other crashes involving Tesla vehicles with Autopilot engaged. The National Highway Traffic Safety Administration (NHTSA) is investigating 12 crashes in which Tesla vehicles with Autopilot features crashed into stationary emergency vehicles, resulting in 17 injuries and one fatality.
The officers’ lawsuit points to several tweets by Tesla CEO Elon Musk commenting on crashes involving Autopilot, or on incidents of Tesla owners misusing the system, as evidence that the company is aware of these defects but has failed to fix them or recall the affected vehicles.
The officers, who said in the lawsuit that they have suffered permanent disabilities, are also suing a local restaurant owner, alleging that the restaurant served the driver of the Model X too much alcohol prior to the crash.
Misleading to Drivers
Our auto defect lawyers, along with many consumer safety advocates, have criticized the electric vehicle maker for calling its semi-autonomous feature “Autopilot,” because the name could lull drivers into a false sense of security, leading them to believe the system is fully autonomous when it is not.
Tesla has since warned that drivers should remain attentive even when the vehicle is in Autopilot mode and should be prepared to take over at a moment’s notice. However, that message does not always reach everyone.
It is important for Tesla to put safeguards in place to prevent misuse of Autopilot. Until these glitches and design flaws are ironed out, the unfortunate reality is that we will likely continue to see incidents like this one that cause serious injuries and harm.