What just happened? Investigators examining a fatal crash in which the driver of a Tesla had Autopilot engaged say he was playing a smartphone game at the time. The National Transportation Safety Board (NTSB) found that 38-year-old Apple software engineer Walter Huang made no attempt to stop the vehicle before it hit the crash barrier in the March 2018 incident.

A month after the crash, Tesla said its own internal investigation showed Huang kept his hands off the Model X’s wheel despite the vehicle’s repeated warnings to retake control. The company reiterated that it is "extremely clear" that Autopilot requires a driver’s full attention at all times.

NTSB investigators said another factor was that the crash attenuator in front of the barrier was damaged and had not been repaired by California’s transportation department, Caltrans. If it had been replaced, Huang likely would have survived, said the agency.

“If you own a car with partial automation, you do not own a self-driving car. So don’t pretend that you do,” said the NTSB chairman, Robert Sumwalt. “This means that when driving in the supposed self-driving mode you can’t sleep. You can’t read a book. You can’t watch a movie or TV show. You can’t text. And you can’t play video games. Yet that’s precisely what we found that this driver was doing.”

The NTSB has been analyzing four different Tesla accidents and found similarities among them. It criticized the DOT’s National Highway Traffic Safety Administration for ignoring its requests to come up with rules for partial automation tech such as Autopilot.

Sumwalt added that Tesla had yet to respond to safety recommendations that had been sent to the company 881 days ago.

“In this crash we saw an over-reliance on technology, we saw distraction, we saw a lack of policy prohibiting cell phone use while driving, and we saw infrastructure failures, which, when combined, led to this tragic loss,” Sumwalt said. “We urge Tesla to continue to work on improving their Autopilot technology and for NHTSA to fulfill its oversight responsibility to ensure that corrective action is taken where necessary. It’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars.”

Tesla insists that drivers using Autopilot crash less often than those driving manually. It says the technology is not a full self-driving system and that users need to keep their hands on the wheel when it is activated, ready to take over at any moment. But there are numerous reports of drivers ignoring this advice.