A driver monitoring safety feature meant to help Tesla drivers keep their eyes on the road while the driver-assist Autopilot feature is engaged performed poorly in tests, according to Consumer Reports.
Earlier this year, the automaker announced that it had activated in-car driver monitoring cameras known as “cabin cameras” to detect driver inattentiveness when Autopilot is in use. This feature is meant to alert drivers when they need to pay more attention. But initial reports suggest it may not be as effective as hoped.
Consumer Reports said it tested Tesla's cabin cameras in a Model S and a Model Y and found the system inadequate. Its researchers found that drivers could still use Autopilot while looking away from the road or using their phones.
They also found that Autopilot remained active even when the vehicle's cabin camera was obscured, rather than locking the driver out of the system. Researchers found they could even use Tesla's Full Self-Driving (FSD) beta software with the camera blocked.
Tesla’s Inadequate Driver Monitoring System
Drivers often pay less attention to the road when a vehicle is automating some driving tasks. When drivers pay less attention, they may have trouble reacting in time during an emergency if they need to take back control of the vehicle, said Kelly Funkhouser, manager for vehicle technology at Consumer Reports. She says an “adequate driver monitoring system” should detect driver inattentiveness and alert the driver.
When this is insufficient, the driver monitoring system should escalate its warnings to get the driver's attention, Funkhouser added. If all of that fails, the system should ideally bring the vehicle to a stop as safely as possible. Consumer Reports tested other driver-assist systems and found that only General Motors' Super Cruise intervenes when the camera determines the driver is not paying attention to the road.
Consumer Reports researchers say that while Tesla's decision to add camera-based driver monitoring seemed like a good idea, it does not do enough to keep drivers engaged. Simply having hands on the wheel does not mean the driver is paying attention to the road, they say; to ensure attention, the system should prevent the driver from using active driving assistance if he or she stops watching the road.
The Illusion of Safety
This Consumer Reports study raises further concerns about Tesla’s Autopilot and FSD driver-assist systems and how they may mislead motorists into believing that they can fully count on these systems. Both Autopilot and FSD, with their names and branding, suggest that they are fully autonomous. However, that is not true.
Drivers need to stay alert and keep their attention on the road at all times. These systems lull motorists into a false sense of security, which is alarming, and the addition of driver monitoring cameras that fail to intervene only reinforces that false sense of security.
If you or a loved one has been injured in a Tesla Autopilot or FSD crash, please contact an experienced auto defect lawyer to obtain more information about pursuing your legal rights.