Tesla Autopilot is steering towards lane dividers again

mongeese

Staff
Cutting corners: Tesla Autopilot is possibly the most advanced self-driving system available to the public, but Tesla’s ambition has come with considerable risk. After a Model X steered into a lane divider last year, causing a fatal accident, Tesla ‘fixed’ the problem with a subsequent software update, but the issue is back.

Reddit user Beastpilot (BP) drives down a freeway in Seattle as part of his afternoon commute. Last year, only days after a fatal accident in similar circumstances, he noticed that his own Model X was steering towards the lane divider separating the freeway from a carpool lane that veers off to the left. Speaking to Ars Technica, he described the car as acting as if the divider were an empty lane.

In light of the then-recent tragedy, he notified Tesla immediately. They didn’t respond, but after several weeks, a new update rolled out and the issue stopped. Come October last year, however, BP’s Model X began steering towards the lane divider again. Once again he notified Tesla to no avail, but once again an update rolled out a few weeks later and the issue disappeared.

Continuing to enjoy the Tesla experience despite the Autopilot issues, BP picked up a Model 3. It didn’t have any troubles until the 2019.5.15 update rolled out earlier this month. You can see the issue returning quite clearly in BP’s Reddit post below: the car follows the lane rightwards until it suddenly veers to the left and into the lane divider.

"It's BACK! After 6 months of working fine, 2019.5.15 drives at barriers again" – from r/teslamotors

This is exactly what happened to Walter Huang last year, except that when his car shifted left into the lane divider, his hands weren’t on the wheel and the car hit a concrete barrier head-on at 70 mph. By then, Teslas had passed that exact stretch of road 85,000 times on Autopilot, which had probably lulled Huang into a false sense of security. But just because the system works the first 100,000 times doesn’t mean a software update can’t introduce a fatal flaw.

One way of making sure old errors aren’t built back into the system – remember, the car’s behavior is largely learned by a neural network rather than hand-coded – is to re-test every reported error against every update using simulations. A 3D model of an intersection or stretch of road is presented to the software, which must decide what to do and where to go. If it collides with a virtual car or leaves its lane, it’s back to the drawing board. If it passes every one of these simulations without a hitch, the update gets pushed to the cars.
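In code, that amounts to a regression suite of driving scenarios: every reported failure becomes a test case that each new build must pass before release. Here is a minimal sketch of the idea, assuming a hypothetical planner interface and pass/fail rule; the Scenario type, plan_lane_offset() and the scenario list are illustrative stand-ins, not Tesla’s actual tooling.

```python
# Illustrative scenario-regression sketch (hypothetical, not Tesla's tooling):
# every previously reported failure is kept as a scenario and re-run against
# each candidate software build before it ships.

from dataclasses import dataclass

@dataclass
class Scenario:
    name: str                 # where the failure was reported
    gore_starts_at_m: float   # distance at which the lane divider (gore) begins
    lane_width_m: float = 3.7

def plan_lane_offset(build: str, distance_m: float, scenario: Scenario) -> float:
    """Stand-in for the driving policy: returns lateral offset from lane centre.
    A real test would run the candidate build inside a full 3D simulation."""
    if build == "2019.5.15" and distance_m > scenario.gore_starts_at_m:
        return -2.5   # the buggy build drifts left, toward the divider
    return 0.0        # healthy builds hold the lane centre

def passes(build: str, scenario: Scenario) -> bool:
    # Fail the build if the planned path ever leaves the lane.
    for d in range(0, 200, 5):
        if abs(plan_lane_offset(build, d, scenario)) > scenario.lane_width_m / 2:
            return False
    return True

# Re-run every reported-error scenario for every update before it ships.
reported_errors = [Scenario("Seattle carpool gore", gore_starts_at_m=120.0)]
for build in ("2018.42", "2019.5.15"):
    ok = all(passes(build, s) for s in reported_errors)
    print(build, "ship" if ok else "back to the drawing board")
```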

For most companies, running simulations like this is straightforward, because most self-driving cars rely on Lidar and 3D maps as their real-time information source. Teslas aren’t equipped with Lidar and use only cameras and radar, so they don’t navigate from 3D maps. That means a 3D map can’t be fed to the car directly; it has to be translated into synthetic camera footage and radar readings first. Tesla won’t say how regularly it puts in the effort to do this, if at all.

Tesla’s plan is to use its billions of miles’ worth of footage, radar readings and drivers’ decisions to train a neural network so well informed it must be reliable. Or at least that’s the marketing pitch. In reality, a Tesla uploads at most a few hundred megabytes of data per day, so only a fraction of those miles ever reaches the company’s training servers.

Tesla’s solution to this technical problem is to shift the burden onto the consumer, who, according to the terms of service, must always keep their hands on the wheel and their mind on the traffic. Unfortunately, that runs directly against human nature, and drivers do zone out. With Autopilot on, you can’t trust the car, and you can’t trust the driver. Perhaps it’s better to leave Autopilot off for the time being.


 
""Tesla’s solution to their technical problem is to shift the issue onto the consumer"".
Shouldn't the Tesla makers and engineers be the first testers? Like in the old days, when an engineer who built a bridge stood under it, or was the first person to walk over it?
 
I'll have a self-driving car someday, but it sure won't be anything Tesla or "the *****" have had a hand in...
 
Shouldn't the Tesla makers and engineers be the first testers? Like in the old days, when an engineer who built a bridge stood under it, or was the first person to walk over it?

The last time I checked, they had a backlog of testers and weren't able to produce enough test units.
 
Have Tesla spend billions testing on their own tracks and courses… why they are allowed to drive on our public roads and use our citizens as test subjects/fodder is beyond "We the People..."
 
Can you disable all the BS assists in this car and drive it yourself?

No. Five minutes after you sit inside the car, you are dead. Read the rest of the comments here and you will get a clearer picture of this evil car. You cannot do anything to save yourself from it.
 
The other clip got REALLY close to the barrier, which is really sad... I think Tesla's Autopilot is still... meh... not yet, not yet. I think they use their customers as test bunnies for Autopilot, though not in a totally bad way. I still like the point Tesla is making about autonomous driving, it's just not ready yet.
 
The best part of the article is that his old car had suicidal tendencies, but no biggie, he bought another one. Wow, times have changed. At one point people would shame an entire car company for something as simple as styling, and now they'll give one the benefit of the doubt even though it can outright kill you.

A car shouldn't need "software updates", FFS. We finally got to a point where vehicles are extremely reliable, and here comes Tesla and its dreams of software updates to dash that, leaving your car chasing the dragon and sending you to the great beyond.
 