A Fork in the Road in Yosemite is Proving Dangerous for Tesla Autopilot

Tesla vehicles driving in Autopilot mode appear to have a problem with one particular road in Yosemite Valley, leading to a series of accidents. According to a news report on SFGATE, a Reddit user claims that a Yosemite park ranger has reported numerous Tesla accidents at the same spot in the valley. The report highlights the potential dangers of the Autopilot system when it malfunctions or misreads the road.

Drivers Experience Glitches

In a Reddit post last month, a Tesla Model X driver recounted driving on Autopilot at a spot in Yosemite where the road comes to a fork. The driver said they had their hands on the wheel and eyes on the road, but that “the vehicle just wanted to keep going straight and crashed into a boulder.” According to the post, the airbag did not deploy, but the vehicle was left undrivable. The spot in question appears to be where Northside Drive and El Capitan Drive meet in the middle of the valley.

The driver also said they spoke with a ranger who told them three other Tesla accidents had occurred in the exact same spot. A few days later, a tow truck driver sent the Reddit user pictures of a fifth accident, this one involving a Model S. The user noted that the area is a 25-mph zone, which limited the damage, but even lower-speed accidents can be hazardous.

Another Tesla owner, Willie Morris, told SFGATE that he has noticed some areas while driving in nearby Mariposa where his Tesla’s Autopilot gets “glitchy.” He noted that Tesla’s “Full Self-Driving” feature is technically in beta testing, which means drivers using it are still supposed to remain fully engaged in driving.

While the Reddit user claims they were paying attention to the road, Morris says he can see how the park’s roads could be challenging for Autopilot to navigate, since the system relies on markings on the road.

The Trouble with Autopilot

While park officials have not officially corroborated this account from Yosemite, it aligns with many other reports of Tesla vehicles not recognizing real-world traffic scenarios.

The automaker has defended itself by pointing to its own instructions that drivers must always remain attentive and not count on Autopilot to navigate perfectly. Federal officials are currently investigating a number of Tesla crashes, particularly those involving Autopilot that resulted in fatalities.

However, calling a feature “Autopilot” can be highly misleading. As we’ve often seen, it gives drivers a false sense of security that the vehicle can drive itself, when that is hardly the case with Tesla. As seasoned auto defect lawyers, we believe that Tesla should stop confusing drivers and be truthful about what Autopilot can and can’t do. Without such clarity, we will sadly see more accidents involving Tesla vehicles on Autopilot.
