A hot potato: While critics have repeatedly called the safety of Tesla's Full Self-Driving (FSD) software into question, a hacker has discovered a hidden version of the feature that could potentially be more dangerous. However, it is unclear how widely available the secret FSD mode is.
An undisclosed version of Tesla's FSD software has recently emerged. Its intended use remains unknown, but it appears to have fewer restrictions than what is available to ordinary Tesla owners.
The normal FSD mode monitors drivers to ensure that they maintain focus on the road and alerts them if it thinks they aren't sufficiently attentive. According to user @greentheonly, who frequently reports on Tesla FSD development, the secret "Elon Mode" has no such guardrails, letting users do whatever they want as the self-driving car completely takes over.
It isn't clear which Tesla models can access Elon Mode, and the hacker hasn't divulged how to engage it. They only explained that the process is a challenging hack unavailable through the official Tesla Toolbox maintenance software and that they performed it on a vehicle manufactured in 2020. The jailbreak may only be possible on special company-owned vehicles, implying that it is still in an early internal testing phase.
"…because of FSD foolishness). So I was more tolerant towards the constant flow of cars passing me on the right and merging in front of me. It also helped that I did not need to watch for the dreaded nag. Overall I spent a bunch of time thinking about it and came up with this: pic.twitter.com/TaPQgClRa9" – green (@greentheonly), June 17, 2023
After testing Elon Mode for nearly 600 miles in public traffic, @greentheonly came away generally optimistic. The setting drives somewhat slowly, its braking is less intense than in normal FSD, and it can sometimes struggle with lane changes. The biggest remaining issue concerns obstacles like construction work, road debris, and potholes. The hacker posted a lengthy video (masthead) showing the feature operating on a busy highway mostly without issue, although they twice had to take control to pass other vehicles.
While Tesla drivers may yearn for an FSD mode that doesn't constantly nag them, the feature would probably face at least as much opposition and criticism as the existing option. In February, the US National Highway Traffic Safety Administration advised Tesla to recall over 300,000 vehicles due to FSD software flaws. Last month, an employee leaked a massive trove of company documents exposing thousands of customer safety complaints.
Self-driving cars from other manufacturers have drawn similar criticism. According to California traffic authorities, vehicles from a General Motors subsidiary have obstructed traffic and triggered false emergency calls over occupants who appeared to be asleep.