What just happened? In the wake of a recent fatal crash involving a Tesla Model S, Consumer Reports (CR) says it has demonstrated the ease with which the vehicle’s Autopilot safeguards can be defeated. It raises the question: are more safeguards needed, or should some of the onus fall on the consumer?
Engineers with the organization tested their 2020 Tesla Model Y on a half-mile closed test track. One engineer sat in the rear, and another in the driver’s seat on top of the buckled seat belt (Autopilot will apparently disengage if the seat belt isn’t buckled while the vehicle is in motion).
The engineer in the driver’s seat engaged Autopilot, then used the speed dial on the steering wheel to set the speed to zero, bringing the vehicle to a stop. Next, he placed a small weighted chain on the steering wheel to simulate the weight of a driver’s hand before sliding over into the front passenger seat.
Using the speed dial, CR engineers were able to get the vehicle to drive up and down their test track, “repeatedly, never noting that no one was in the driver’s seat, never noting that there was no one touching the steering wheel, never noting there was no weight on the seat.”
“It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient,” said Jake Fisher, CR’s senior director of auto testing, who conducted the test.
The fatal crash in question, which local police and federal agencies are still investigating, took place on April 17, 2021, in Spring, Texas, and involved a 2019 Tesla Model S. According to preliminary reports, one person was found in a rear passenger seat and another in the front passenger seat. Police at the time said they didn’t believe anyone was behind the wheel when the crash occurred.
“Data logs recovered so far show Autopilot was not enabled &amp; this car did not purchase FSD. Moreover, standard Autopilot would require lane lines to turn on, which this street did not have. Your research as a private individual is better than professionals @WSJ!” — Elon Musk (@elonmusk) April 19, 2021
In a message on Twitter on April 19, Tesla CEO Elon Musk said data logs recovered up to that point show Autopilot was not enabled and the vehicle did not have Full Self-Driving (FSD), a feature intended for use by a fully attentive driver who has their hands on the wheel and is prepared to take over at any moment. The driving aid, Tesla notes on its website, does not make the vehicle autonomous.
Image credit: Photosite