Tesla confirmed on Wednesday that one of its Model S electric vehicles was involved in a minor accident in China's capital of Beijing last week. The vehicle, belonging to 33-year-old programmer Luo Zhen, was in Autopilot mode when the crash took place.

Zhen, who happened to be filming his commute with a dashboard camera, said his car sideswiped another vehicle that had pulled over beside the median but was still partially in the left lane. Rather than take the wheel and steer around the road hazard, Zhen let Autopilot handle the situation.

The minor accident ripped the side mirror off the stalled Volkswagen and scratched the sides of both vehicles. Fortunately, nobody was injured.

In his first international interview (with Reuters), Zhen said the salespeople at the local Tesla dealership gave him the impression that the vehicle was self-driving, not merely driver-assisted. He contends that Tesla uses "immature technology" as a sales and promotional tactic, then doesn't take responsibility for the safety of the feature.

After analyzing the data, a Tesla spokesperson told the publication that the driver, whose hands were not detected on the steering wheel, did not steer to avoid the parked car. As clearly communicated to the driver in the vehicle, the rep continued, autosteer is an assist feature that requires the driver to keep their hands on the wheel at all times, maintain control of and responsibility for the vehicle, and be prepared to take over at any moment.

Even if the dealership had labeled the feature as self-driving, the in-vehicle system states otherwise, as the spokesperson points out.

If accidents like this continue to happen, it's clear that something will have to change. On one hand, it's easy to argue that Tesla perhaps shouldn't be beta testing semi-autonomous features at scale. Then again, this is perhaps the best way to do it, as it allows the software behind the features to mature at a faster rate.

Ultimately, I believe Tesla simply has too much faith in the public's ability to follow directions. Sure, it's a neat feature to show off to friends and family, but giving up total control of the vehicle at this stage is just stupid. Had the drivers involved in the multiple accidents reported recently actually followed the directions, they would have had their hands on the wheel and been prepared to take control when approaching a questionable situation.

Sometimes, you have to protect the public from itself.

Tesla would be well served to make it absolutely crystal clear how its Autopilot feature works. Its vehicles have massive 17-inch touchscreen displays: put them to good use! Make drivers watch a video explaining exactly what the feature is capable of and what's required of them when using it, then have them digitally sign off on it.