Officials have confirmed the first fatality in a crash involving a self-driving vehicle: a Tesla Model S whose driver was using Autopilot. According to an ABC News report, the victim was identified as 40-year-old Joshua D. Brown of Canton, Ohio, the owner of a technology company and a former Navy SEAL. Officials say the fatal crash occurred May 7 in Williston, Florida, when a tractor-trailer made a left turn in front of the Tesla at an intersection on a divided highway.
Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brakes were never applied, according to a statement from the U.S. National Highway Traffic Safety Administration (NHTSA). The Model S went under the trailer, and the impact sheared off the vehicle’s roof. Tesla says Autopilot is disabled by default and that drivers should be prepared to take over at any time. NHTSA is investigating the design and performance of Tesla’s system.
Our heartfelt condolences go out to Brown’s family for their tragic loss. His obituary describes him as a member of the Navy SEALs for 11 years who left the service in 2008.
Are Autonomous Vehicles Safe?
The question we need to ask after this tragedy is whether autonomous vehicles are safe. The short answer is an emphatic “NO,” says auto defect attorney Brian Chase. Self-driving cars are an uncertain frontier not just for regulators and insurers, who have set rules based on driver behavior, but also for motorists, who are used to having control of their vehicles at all times.
These technologies combine sophisticated sensors, cameras and computer software to guide vehicles and control speed and braking without driver intervention. They may hold the promise of safer roadways, but as this incident clearly shows, the technology is not ready for use on our nation’s roadways.
“Should we, you and I, be the human guinea pigs that do the real-world testing of this technology?” asks Chase. “Of course not, and this tragic accident is a case in point. This technology is not ready for our highways, period!”
Still a Lot of Uncertainty
Autonomous vehicles are an uncertain frontier for regulators and insurers, who have, for more than a century, set rules based on the behavior of human motorists. The technology also raises many more questions about liability in the event of a crash and about broader use on public roads. Chase says consumers should see red flags here. Autonomous vehicle technology is touted as the next big thing, but now, after this horrendous fatal accident, we hear things like: “Autopilot is only an assist feature, and drivers should be prepared to take over at any given time.”
“Before this accident, we were told how great this technology is,” Chase says. “After this accident, all we hear is C.Y.A. I’ve been doing automotive defect cases for over 20 years, and I have seen first-hand how the auto industry and its suppliers routinely place profit over safety. I have no reason to believe we won’t uncover some of that over the next decade with regard to autonomous driving cars.”
That’s not to say we’ll never have autonomous vehicles, Chase added.
“This may be really great technology someday, but that day has not even remotely arrived,” he said. “This needs much more testing, but not on our nation’s highways with us as the guinea pigs.”