TikTok Star Posts Video of Himself Dozing Off in Tesla with Autopilot Engaged
A TikTok star used Tesla’s Autopilot in a dangerous way and posted a video of the stunt, further normalizing abuse of the feature. According to a report on Electrek.com, worst of all, the man’s mother helped him film the video of her son sleeping while the car was in motion with Autopilot engaged.
Reckless and Dangerous Behavior
Johnathon Cook, whose videos have drawn over 1 million hits on the video-streaming app TikTok, posted a video of himself pretending to fall asleep in a Tesla Model 3 on Autopilot and even climbing into the back seat while traveling at highway speed. In most jurisdictions, this would amount to dangerous or reckless driving, a criminal offense. It’s not clear whether police have filed charges yet, but if they wanted to, they would have ready evidence: Cook filmed a “behind-the-scenes” companion video for YouTube showing, in detail, his dangerous use of Tesla’s Autopilot. He demonstrates how he got around the safeguards Tesla built to prevent exactly this kind of misuse. For example, he placed a weight on the steering wheel to trick the torque sensor and buckled the seatbelt behind his back so he could leave the driver’s seat while the vehicle was in motion.
This is yet another incident exposing flaws in Tesla’s Autopilot. While Tesla tells drivers to keep their hands on the wheel, the name “Autopilot” has done enough damage over the years, confusing or misleading drivers into thinking it is an autonomous driving system when it is only a driver-assist feature.
Misuse of the Autopilot feature has been rampant. A number of drivers have been filmed asleep at the wheel with Autopilot engaged on Los Angeles freeways, and there have been several publicity stunts showing people performing reckless acts while Autopilot was engaged. In one example that went viral, a porn website posted a video of Autopilot misuse.
Cook’s video also shows that the safeguards Tesla has put in place to prevent Autopilot abuse don’t really work. The company needs to take additional steps to improve driver monitoring, yet it has focused instead on developing full self-driving capability.
Our auto defect lawyers have consistently maintained that vehicles equipped with autonomous and semi-autonomous technologies should not be put on our roadways until they are absolutely ready. Tesla’s Autopilot is an example of how a lack of due diligence on this front can be extremely dangerous.