Tesla vehicles driving in Autopilot mode appear to be having a problem with one particular road in Yosemite Valley, leading to a series of accidents.
According to a news report on SFGATE, a Reddit user claims that a Yosemite park ranger has reported numerous Tesla accidents at the same spot in the valley.
The report shows the potential dangers of the Autopilot self-driving system when it malfunctions or does not read the road correctly.
Drivers Experience Glitches
In the Reddit post last month, the driver of a Tesla Model X recounted driving on Autopilot at a spot in Yosemite where the road comes to a fork. The driver said they had their hands on the wheel and eyes on the road, but that “the vehicle just wanted to keep going straight and crashed into a boulder.” The airbag did not deploy, but the vehicle was left undrivable, the post said. The spot in question appears to be where Northside Drive and El Capitan Drive meet in the middle of the valley.
The driver also said they spoke with a ranger who told them three other Tesla accidents had occurred in the exact same spot. A few days later, a tow truck driver sent the Reddit user pictures of a fifth accident, this one involving a Model S. The user added that the crash thankfully happened in a 25-mph zone, which limited the damage. But even lower-speed accidents can be extremely dangerous.
Another Tesla owner, Willie Morris, told SFGATE that he has noticed some areas while driving in nearby Mariposa where his Tesla’s Autopilot gets “glitchy.” He noted that Tesla’s “Full Self-Driving” feature is technically in beta testing, which means drivers using it are still supposed to be fully engaged in driving. While the Reddit user claims they were paying attention to the road, Morris says he can see how Autopilot could struggle to navigate the park, since the system relies on markings on the road.
The Trouble with Autopilot
While park officials have not officially corroborated this account from Yosemite, it is in line with many other reports of Tesla vehicles failing to recognize real-world traffic scenarios. The automaker has deflected blame by pointing to its own instructions that drivers must remain attentive at all times and not count on Autopilot to navigate perfectly. A number of Tesla crashes, particularly those on Autopilot that resulted in deaths, are currently being investigated by federal officials.
However, calling a feature “Autopilot” can be extremely misleading. As we’ve often seen, it gives drivers a false sense of security that the vehicle can drive itself, when that is hardly the case with Tesla. As auto defect lawyers, we believe Tesla should stop confusing drivers and be truthful about what Autopilot can and cannot do. Without such clarity, we are sadly going to see more accidents involving Tesla vehicles on Autopilot.