Tesla’s Autopilot feature has come under scrutiny following three fatal accidents that occurred while it was engaged. According to a report in the Wall Street Journal, the company’s engineers wanted to include additional safeguards in the system, but executives, including CEO Elon Musk, rejected them over concerns that they were too expensive, would prove ineffective, and would annoy drivers.
The report claims that eye-tracking technology was one of the options considered to ensure drivers were watching the road; another was to add more sensors to the steering wheel. But fears that the former might not work reliably for drivers of different heights, along with concerns over camera and sensor costs, led to the plans being abandoned, according to the Journal’s sources. "It came down to cost, and Elon was confident we wouldn't need it," said one person.
Musk tweeted that while eye tracking had been on the table, it was rejected for being ineffective rather than for its cost. The CEO added that Tesla vehicles are the safest on the road, roughly four times better than average.
This is false. Eyetracking rejected for being ineffective, not for cost. WSJ fails to mention that Tesla is safest car on road, which would make article ridiculous. Approx 4X better than avg.— Elon Musk (@elonmusk) 14 May 2018
A driver in China is thought to be the first person killed while using Autopilot, back in January 2016. In June of that same year, Joshua Brown became the first person in the US to die while using the feature. More recently, Tesla Model X driver Walter Huang died when his vehicle hit a crash barrier while Autopilot was engaged. The incident, which took place in March this year, is being investigated by the NTSB, though Tesla says its own internal investigation showed Huang kept his hands off the wheel despite repeated warnings.
Autopilot alerts drivers when it detects they’re not holding the wheel. If the warnings are ignored for a minute, the feature is disabled for the rest of the drive. But some drivers still treat the system as if it were fully autonomous. Last month, a UK Tesla owner was banned from driving for 18 months and fined roughly $2,470 after engaging the feature and leaving the driver’s seat while on a highway.
A Tesla spokesperson gave the following statement:
Everyone at Tesla is not only encouraged, but expected, to provide criticism and feedback to ensure that we're creating the best, safest cars on the road. This is especially true on the Autopilot team, where we make decisions based on what will improve safety and provide the best customer experience, not for any other reason.
Ensuring that drivers stay engaged and alert when using Autopilot is extremely important. That is why we designed the system to deliver an escalating series of visual and audio warnings reminding the driver to place their hands on the wheel. That's also why we've taken so many steps to improve this process over time, including an update that prevents a driver from re-engaging Autopilot if they ignore repeated warnings. We've explored many technologies and opted for the combination of a hands-on-wheel torsion sensor with visual and audio alerts, and we will of course continue to evaluate new technologies as we evolve the Tesla fleet over time.
Tesla Autopilot does not prevent all accidents -- such a standard would be impossible -- but it makes them much less likely to occur. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.