Cruise has pulled its fleet of self-driving cars after a pedestrian accident that caused critical injuries.
The accident happened in October 2023, when a Cruise automated vehicle rolled over a woman in the roadway and dragged her along the road for 20 feet. Cruise had received permits for its driverless technology just two months earlier.
California regulators had said they were confident about self-driving technology when they gave Cruise’s self-driving cars permission to operate a robotaxi service in San Francisco. But The Washington Post reports the permit came two months before the serious accident. The approval Cruise received was crucial for the self-driving car industry because it expanded one of the biggest test cases in the world for such technology.
However, the pedestrian accident on Oct. 2 and the manner in which Cruise initially misrepresented facts about that crash have officials rethinking the approval they handed out. Two days after the DMV suspended Cruise’s driverless permits following the pedestrian accident, the company said it would suspend all driverless operations in the country to examine its processes and earn back public trust.
A Misrepresentation of Vehicle Safety?
According to the Post’s report, the accident happened on the night of Oct. 2 at a busy intersection in San Francisco. A human-driven car initially rammed into the female pedestrian as she stepped into the roadway. She rolled onto the windshield of that car before she was flung into the path of the Cruise driverless car. The human-driven car fled the scene. In the initial video that Cruise showed media outlets, the Cruise vehicle appeared to brake as soon as it made impact with the woman, after which the video ended.
Cruise spokespersons said they had no additional footage to share and that the driverless vehicle “braked aggressively to minimize the impact.” However, it later came to light that Cruise did not present the whole story. First responders at the scene noted a trail of blood from the point of impact with the woman to where the driverless vehicle stopped – about 20 feet away. The Cruise vehicle had continued moving with the woman pinned underneath, dragging her for about 20 feet, an action that may have worsened her injuries.
In their move to revoke Cruise’s driverless permits, the DMV said the company’s vehicles are “not safe for the public’s operation” and determined that Cruise misrepresented information “related to the safety of the autonomous technology.” The National Highway Traffic Safety Administration (NHTSA) has also opened an investigation into Cruise after reports that its vehicles may not have exercised appropriate caution around pedestrians in the roadway.
Who Bears Responsibility When There Is No Driver?
In the San Francisco case, it was a human driver who initially struck the pedestrian, and the Cruise autonomous vehicle then dragged the victim for 20 feet. However, some officials argue that the California Public Utilities Commission, which granted the company expanded permits despite being aware of issues with the technology, also bears responsibility for the crash. They point to “a check and balance” that completely failed in this scenario.
Liability in a crash caused by a driverless car is a complex and evolving legal issue that raises numerous questions about responsibility and accountability. Here are some of the parties that may be held financially responsible for collisions caused by driverless vehicles:
Vehicle manufacturers: In many cases, the manufacturer of the driverless car may be held liable. If the crash is a result of a technical malfunction or a flaw in the autonomous vehicle’s hardware or software, the manufacturer could be deemed responsible. Manufacturers have a duty to ensure their products are safe and free from defects, and this includes autonomous vehicle technology.
Software developers: Liability may also extend to the developers and programmers responsible for creating the autonomous driving software. If it is determined that the crash occurred due to a software bug or coding error, the developers could be held accountable for the damages.
Vehicle owner or operator: The person who owns or operates the driverless car may still bear some responsibility, especially if they failed to properly maintain the vehicle or ignored warnings or updates related to the autonomous driving system. For example, if a vehicle owner fails to install crucial software updates that would have prevented a crash, they may be considered partially liable.
Third parties: Other drivers, pedestrians, or entities that interact with the driverless car may also play a role in a crash. For example, if a human driver’s reckless behavior causes a collision with an autonomous vehicle, the human driver might be at fault.
Governmental entities: Liability rules can vary significantly from one jurisdiction to another. Governments not only establish the legal framework for autonomous vehicles but also decide which companies receive permits to test the technology on public roadways, giving these bodies a significant role in the chain of accountability.
The Evolving Landscape of Self-Driving Cars
As the technology and legal landscape continue to evolve, it is essential to keep in mind that liability determinations can be highly dependent on the specific circumstances of each case. Moreover, legal systems are likely to adapt to the emerging challenges posed by autonomous vehicles, and new laws and precedents may be established.
New legislation, court rulings, and industry standards emerge on a regular basis. If you have been injured in an accident involving a driverless car or semi-automated vehicle, it would be in your best interest to contact an experienced California auto defect lawyer who has knowledge and experience handling these types of cases.