Cruise has pulled its fleet of self-driving cars after a pedestrian accident that caused critical injuries. The accident happened in October 2023, when a Cruise automated vehicle ran over a woman in the roadway and dragged her for 20 feet. Cruise had received permits for its driverless technology just two months earlier.
California regulators said they were confident in self-driving technology when they granted Cruise’s self-driving cars permission to operate a robotaxi service in San Francisco. But The Washington Post reports the permit came two months before the serious accident. The approval Cruise received was crucial to the self-driving car industry because it expanded one of the largest test cases for such technology.
However, the pedestrian accident on Oct. 2 and the manner in which Cruise initially misrepresented facts about that crash have officials rethinking the approval they handed out. Two days after the DMV suspended Cruise’s driverless permits following that pedestrian accident, the company said it would suspend all driverless operations nationwide to review its processes and rebuild public trust.
A Misrepresentation of Vehicle Safety?
According to the Post’s report, the accident happened on the night of Oct. 2 at a busy intersection in San Francisco. A human-driven car initially rammed into the female pedestrian as she stepped into the roadway. She then rolled onto the windshield of that car before she was flung into the path of the Cruise driverless car. The human-driven car fled the scene. In the initial video that Cruise showed to media outlets, the Cruise vehicle appeared to brake immediately upon impact with the woman, after which the video ended.
Cruise spokespersons said they had no additional footage to share and that the driverless vehicle “braked aggressively to minimize the impact.” However, it later came to light that Cruise did not present the whole story. First responders at the scene noted a trail of blood from the point of impact to where the driverless vehicle stopped, about 20 feet away. The Cruise vehicle had dragged the woman, pinned underneath, for that entire distance, an action that may have worsened her injuries.
In its move to suspend Cruise’s driverless permits, the DMV said the company’s vehicles are “not safe for the public’s operation” and determined that Cruise misrepresented information “related to the safety of the autonomous technology.” The National Highway Traffic Safety Administration (NHTSA) has also opened an investigation into Cruise after reports that its vehicles may not have exercised appropriate caution around pedestrians in the roadway.
Who Bears Responsibility When There Is No Driver?
In the San Francisco case, it was a human driver who initially struck the pedestrian, and the Cruise autonomous vehicle then dragged the victim for 20 feet. However, some officials argue that the California Public Utilities Commission, which granted the company expanded permits despite being aware of issues with the technology, also bears responsibility for the crash. They point to “a check and balance” that completely failed in this scenario.
Liability in a crash caused by a driverless car is a complex and evolving legal issue that raises numerous questions about responsibility and accountability. Here are some parties that may bear financial responsibility for collisions involving driverless vehicles:
Vehicle manufacturers: In many instances, the manufacturer may be liable for crashes involving driverless cars. If the crash results from a technical malfunction or a flaw in the autonomous vehicle’s hardware or software, the manufacturer could be deemed responsible. Manufacturers have a duty to ensure their products, including autonomous vehicle technology, are safe and free of defects.
Software developers: Liability may also extend to developers and programmers responsible for creating the autonomous-driving software. If a software bug or coding error caused the crash, the developers could face accountability for the damages.
Vehicle owner or operator: The person who owns or operates the driverless car may still bear some responsibility, especially if they failed to properly maintain the vehicle or ignored warnings or updates related to the autonomous driving system. For example, if a vehicle owner fails to install crucial software updates that would have prevented a crash, they may be considered partially liable.
Third parties: Other drivers, pedestrians, or entities that interact with the driverless car may also be involved in a crash. For example, if a human driver’s reckless behavior causes a collision with an autonomous vehicle, the human driver might be at fault.
Governmental entities: Liability rules can vary significantly from one jurisdiction to another. Governments not only establish the legal framework for autonomous vehicles but also grant permits to test new technology on public roadways, giving these bodies a significant role in determining which vehicles are allowed on the streets.
The Evolving Landscape of Self-Driving Cars
As the technology and legal landscape continue to evolve, it is essential to keep in mind that liability determinations can be highly dependent on the specific circumstances of each case. Legal systems are likely to adapt to the emerging challenges posed by autonomous vehicles, potentially leading to the establishment of new laws and precedents.
New legislation, court rulings, and industry standards emerge on a regular basis. If you have been injured in a crash involving a driverless car or semi-automated vehicle, it would be in your best interest to contact an experienced California auto defect lawyer with a track record of handling these types of cases.