Frightening Video Shows Problems with Tesla’s ‘Full Self-Driving System’

Tesla

A recently posted YouTube video shows a Tesla Model Y, reportedly running Tesla's "Full Self-Driving" Beta, veering toward an oncoming car before the driver grabbed the wheel to disengage the system.

According to Jalopnik.com, the video, which was removed from YouTube but later reposted on Twitter, also shows the vehicle leaving the road.

Potentially Serious Accident Involving Self-Driving System

The video begins with the Model Y rounding a long, gentle curve on a two-lane road. As another vehicle approaches the Tesla in the opposing lane, the Model Y straightens its steering, sending the electric crossover on a path toward a head-on collision. The driver reacts quickly, retaking control from the Full Self-Driving (FSD) Beta and pulling the Model Y back to the right.

But, according to Jalopnik, the force needed to disengage the FSD and avoid the head-on collision caused the driver to overcorrect and lose control. The crossover dropped into a small ditch, launched out, and bounced over the terrain before coming to a stop near a home.

The description of the YouTube post stated that the Model Y's frame and suspension sustained significant damage. Jalopnik notes that while the driver's overcorrection took the vehicle off the road, the accident would not have happened had the FSD kept the vehicle within its lane.

There have already been a number of problems with Tesla's Full Self-Driving, or FSD, Beta, which was released on Oct. 23. Last month, the automaker recalled 12,000 vehicles running FSD Beta software version 10.3 because the software could trigger false collision warnings and unnecessary automatic emergency braking. The company released an over-the-air fix two days later and said nearly all of the affected cars had been updated as of Oct. 29.

Public Safety Should Always Come First

Our auto defect lawyers have been concerned about Tesla's Autopilot, which is under investigation by the National Highway Traffic Safety Administration (NHTSA), particularly with regard to crashes involving emergency vehicles. Now, its so-called Full Self-Driving system, which is not fully autonomous despite what the name suggests, is causing problems.

Like Autopilot, FSD is only a driver-assist system, but its name misleads consumers into thinking they can engage it and stop paying attention. That is not true. As this frightening video shows, had the driver not taken over and corrected the vehicle's course, the driver of the other car could have been seriously injured or even killed.

We are not against new technology or innovation, but it should never come at the price of public safety. Any technology on our roadways should be fully vetted before release. It is unacceptable for a company to use consumers as test subjects for its latest technology.

If you or a loved one has been injured as the result of a Tesla Autopilot or FSD crash, please contact an experienced auto defect lawyer to obtain more information about pursuing your legal rights.

Source: https://jalopnik.com/tesla-full-self-driving-beta-causes-accident-with-model-1848201350
