TechSpot

Can Tesla protect itself from drivers that don't follow directions?

By Shawn Knight
Aug 10, 2016
  1. Tesla confirmed on Wednesday that one of its Model S electric vehicles was involved in a minor accident in China’s capital of Beijing last week. The vehicle, belonging to 33-year-old programmer Luo Zhen, was in Autopilot mode when the crash took place.

    Luo, who happened to be filming his commute with a dashboard camera, said his car sideswiped another vehicle that had pulled over toward the median (but was still partially in the left lane). Rather than take the wheel and steer around the road hazard, Luo let Autopilot handle the situation.

    The minor accident ripped the side mirror off the stalled Volkswagen and scratched the sides of both vehicles. Fortunately, nobody was injured.

    In his first international interview (with Reuters), Luo said the salespeople at the local Tesla dealership gave him the impression that the vehicle had self-driving capabilities, not assisted driving. He contends that Tesla uses “immature technology” as a sales and promotional tactic, then doesn’t take responsibility for the safety of the feature.

    After analyzing the data, a Tesla spokesperson told the publication that the driver, whose hands were not detected on the steering wheel, did not steer to avoid the parked car. As clearly communicated to the driver in the vehicle, the rep continued, Autosteer is an assist feature that requires the driver to keep their hands on the wheel at all times, to maintain control of and responsibility for the vehicle, and to be prepared to take over at any moment.

    Even if the dealership had labeled the feature as self-driving, the in-vehicle system states otherwise, as the spokesperson pointed out.

    If accidents like this keep happening, something will clearly have to change. On one hand, it’s easy to argue that Tesla shouldn’t be beta testing semi-autonomous features at scale. Then again, this may be the best way to do it, as real-world use allows the software behind the features to mature at a faster rate.

    Ultimately, I believe Tesla simply has too much faith in the public’s ability to follow directions. Sure, it’s a neat feature to show off to friends and family, but giving up total control of the vehicle at this stage is just stupid. Had the drivers involved in the recent string of reported accidents actually followed the directions, they would have had their hands on the wheel and been prepared to take control when approaching a questionable situation.

    Sometimes, you have to protect the public from itself.

    Tesla would be well served to make it absolutely crystal clear how its Autopilot feature works. Its vehicles have massive 17-inch touchscreen displays – put them to good use! Make drivers watch a video explaining exactly what the feature can do and what is required of them when using it, then have them digitally sign off on it.


     
  2. Uncle Al

    Uncle Al TS Evangelist Posts: 1,684   +790

    Yeah .... blame the salesman! But, honestly, where do those salesmen get all their big ideas? .....
     
  3. Cycloid Torus

    Cycloid Torus TS Evangelist Posts: 1,667   +312

    Read the manual? Now, why would I want to do that?
     
  4. Evernessince

    Evernessince TS Evangelist Posts: 1,201   +596

    So this guy knowingly let his car crash into another based on a purported sales pitch? No, this guy is just another person looking for blood in the water and he put another person's life at risk to do so.
     
  5. Kenrick

    Kenrick TS Booster Posts: 190   +89

    Give the buyer a contract to sign before handing over the key. It should be a single sentence printed in 40-point font:

    "I promise I will use my God given common sense when I am driving and will not blame anyone for my own negligence."

    signed: Not-a-Ahole
     
  6. MrCommunistGen

    MrCommunistGen TS Rookie

    I work at a software company (in QA) and I feel like 50%+ of my workload is making sure that the UI is blatantly clear even if someone hasn't read the instructions, and that things fail gracefully even if the ID10T does the *wrong* thing.

    This is a difficult task even when all we're doing is installing and configuring a relatively simple piece of software for a given environment. I can only imagine how hard it is to dummy-proof the software in a car.

    Note to Tesla: never underestimate how poorly people follow, ignore, or outright refuse to read instructions.
     
  7. treetops

    treetops TS Evangelist Posts: 1,953   +162

    Can you blame the manufacturer if you take your hands off the wheel while in cruise control? No. Same principle.
     
