4. Viral Video Shows Tesla Driver Asleep at the Wheel on LA Freeway with Vehicle on Autopilot
Even as the National Highway Traffic Safety Administration (NHTSA) has stepped up its investigation of a fatal crash involving a Tesla Model S, the company continues to defend the self-driving or Autopilot technology as safe when properly used. According to a report in The New York Times, NHTSA this week released a detailed set of questions for the automaker about its automated driving system, in particular, the emergency braking function.
Tesla’s Autopilot system has been under scrutiny since the May 7 Florida crash that killed driver Joshua Brown, whose Model S was on Autopilot when it struck a big rig turning across its path. Officials said the Autopilot feature was engaged at the time, but neither the automatic braking system nor Brown applied the brakes before the car hit the big rig at 65 mph. Tesla has said that in this particular accident, the Autopilot system failed to detect the white truck against a bright sky.
Turning the Blame on Drivers
Tesla seems to be employing a strategy automakers have used for decades: pointing the finger at drivers. When Tesla put this self-driving technology in the 70,000 of its cars now on the road, it was touted as the gateway to the future. But after the May 7 tragedy, Tesla’s narrative has shifted. The company now says its technology works just fine and that it has no plans to disable the feature. Instead, it is suggesting that drivers may be to blame for misusing Autopilot.
In an interview with the Times, a Tesla executive said the system is safe as designed, but that consumers need to realize that failing to use Autopilot properly could make the difference between life and death. The representative told the Times that drivers, even when using the feature, must be aware of road conditions at all times and be able to take control of the car at a moment’s notice, even though Autopilot’s self-steering and speed controls can operate for up to three minutes without any driver involvement.
Using Consumers as Guinea Pigs
Tesla is also under fire for introducing Autopilot in “beta” mode, meaning the technology is still under development and has not been fully tested. Some engineers researching self-driving cars have concluded that automated systems that depend on the driver suddenly resuming command cannot be made fully safe.
It is not OK for Tesla to continue using consumers as human guinea pigs to test a product that is still in development. We need answers to a number of questions. What are the brands and models of vehicles sold in the U.S. with the Autopilot system? How safe is it? How many times has the automated system warned drivers to put their hands back on the wheel? In how many incidents were the automatic brakes activated?
In the end, there are too many questions, too few answers, and an automaker pointing the finger at drivers. As auto defect attorneys who represent seriously injured victims and families that have lost loved ones to defective automobiles and parts, we’ve heard that argument before, way too many times.