Tesla Cybertruck crashes into pole while using latest Full Self-Driving software

"Jonathan Challinger, a Florida-based software developer who works for Kraus Hamdani Aerospace, posted a photo of his Cybertruck looking a lot worse than the pole it collided with."

It appears the c-truck is made of cheap metal...!
 
Musk oversells everything with hype and empty promises, like calling its driver-assist system "Autopilot" or "Full Self-Driving". I'd suggest, as has become more and more evident over the years, that Elon has no firm grounding in science or technology. He seems to think that if he says "make it so" it will happen, like he's some godlike being with unlimited powers.

And when he's successful? It's usually because of the hard work and innovations of other people. He's an investor, not an entrepreneur, who buys enough control of promising start-ups to demand he be made CEO. He then does everything he can to convince people he's the reason behind the company's success. Any ideas he comes up with himself are often impractical or impossible, like going to Mars. We will get there eventually, but with current tech it's a one-way suicide mission IMHO.

Musk, like his pumpkin friend, is a spoilt rich kid playing in his sandbox with no regard for those he hurts...
 
Assuming you can drive in that lane, it does seem like a road planning error, sticking a telegraph pole in the middle of the lane.

If the vehicle was just doing the 15mph shown on the sign then why was the vehicle totalled?
 
The man said he ignored numerous warnings the truck gave him while in FSD mode that it needed user input. The vehicle knew it needed the driver to take over and the driver didn't, so how is that the vehicle's fault?
Irrespective of that, it was in FSD mode and drove into a pole. That's a catastrophic failure of FSD. If it had disengaged FSD, it would have come to a halt according to the info above.

Just to add, seeing lane lines when driving at night can be hard for humans, let alone automation. The fact that it missed a pole and drove into it shows the tech is not fit for production. That's inexcusable. There is no small quantity of these vehicles around. Most humans in such conditions do not drive straight into poles. They just don't.

Computers shouldn't make this mistake AT ALL. They don't fatigue. They don't get distracted. They don't drink. This is an algorithmic failure, meaning it is likely a repeatable failure.

This is why FSD shouldn't be tested in production. It's clearly not ready yet.
 
test-in-production-meme-2.jpg
 
The man said he ignored numerous warnings the truck gave him while in FSD mode that it needed user input. The vehicle knew it needed the driver to take over and the driver didn't, so how is that the vehicle's fault?
Or lawmakers could put an end to the ever-"evolving" issue. Negligence on the government's part for allowing flawed, early-access technology to roam the streets (aren't bad drivers themselves enough?), and negligence on the vehicle owner's part for not doing his due diligence as a driver. I saved everyone some time; here is the definition of negligence: "failure to take proper care in doing something." Sorry to poke at you, but don't defend negligence, it won't bode well long term. Fact: this technology is far from perfect. The dirtier and more aged the sensors get, the less accurate and more prone to error they will become, no matter how many software updates they get. How many bus drivers, truck drivers, and drivers of large vehicles have gone past this same pole every day for, say, the last 25 years and avoided such an event? Again, negligence; don't defend it. This dude essentially said "I'm an *****, and an awful driver, but thanks Elon! Your car saved my life! It nailed that pole, and the airbags deployed." There is much to be said about the Tesla cult followers.
 
The man said he ignored numerous warnings the truck gave him while in FSD mode that it needed user input. The vehicle knew it needed the driver to take over and the driver didn't, so how is that the vehicle's fault?
How do we know that "the vehicle knew it needed the driver to take over"? The article says the driver blames themselves for not paying attention, and it also discusses the owner's manual's description of how FSD will alert the driver, but it doesn't say in this article that the car actually alerted the driver. Did I miss something, or is that stated in an article somewhere else?
 
>ignores safety warnings
>gets hurt

Who cares what the marketing department calls it. The system had failsafes in place, they worked as intended, the user ignored them, and his car got wrecked.

Also, the DOT never approved Autopilot for full self-driving. This is the same as texting while driving and then being surprised when you get into an accident.
Failsafe, eh. The last failsafe, as described in the article, is for the vehicle to stop completely. It didn't, and before you defend the marketing team any further, imagine it was not a pole but your mother crossing the road.
It should never have been called autopilot or self driving, let alone full. Just be honest, call it driver-assist mode where all risk is borne by the driver, and be done with it.
 
Failsafe, eh. The last failsafe, as described in the article, is for the vehicle to stop completely. It didn't, and before you defend the marketing team any further, imagine it was not a pole but your mother crossing the road.
It should never have been called autopilot or self driving, let alone full. Just be honest, call it driver-assist mode where all risk is borne by the driver, and be done with it.
It seems like the driver acknowledged it was user error. I'm not "defending" the marketing department. I'm a big proponent of "just ignore whatever the marketing department has to say altogether, they're all a bunch of liars".

And the thing about it pulling over? It only does that when it CAN pull over. But let me summarize what happened in terms maybe people can understand.

>car screams at driver to take over because it knows it no longer has the ability to self drive
>driver ignores it
>car crashes as a result

If I get drunk and kill someone in a car accident, do I blame the alcohol? For some reason everyone seems to want to turn the tables on Tesla and Musk because hating them has been the popular thing to do lately. But let's keep in mind, turning on Autopilot is no small feat.

People need to stop shifting the blame for what happened because they're mad at Musk about something else. If you take the shield off an angle grinder and you get hit in the face when the blade explodes, that's on you for not knowing how to use something properly. Workman's comp won't pay for that. You don't get to sue the manufacturer because YOU took the shield off and got hurt.

The dude was using a piece of equipment (a vehicle) improperly, it got damaged, and he could have gotten hurt. What's so confusing here?
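To put the escalation being described in rough terms, here is a minimal sketch of that warn-then-stop behaviour as I understand it from the article. It is NOT Tesla's actual code; the state names, the three-warning limit, and the pull-over check are all made up for illustration:

```python
# Toy model of a supervised-driving escalation: warn the driver, then stop.
# Everything here (names, thresholds) is hypothetical, not Tesla's implementation.
from enum import Enum, auto

class State(Enum):
    DRIVING = auto()   # system active, driver supervising
    WARNING = auto()   # system nagging the driver to take over
    STOPPING = auto()  # system giving up and bringing the car to a halt

def escalate(driver_attentive, warnings_issued, can_pull_over, max_warnings=3):
    """Return the next state and the action the vehicle would take."""
    if driver_attentive:
        return State.DRIVING, "continue under driver supervision"
    if warnings_issued < max_warnings:
        return State.WARNING, "alert the driver to take over"
    if can_pull_over:
        return State.STOPPING, "pull over and stop"
    return State.STOPPING, "slow to a stop in the current lane with hazards on"

print(escalate(driver_attentive=False, warnings_issued=3, can_pull_over=False))
# -> (State.STOPPING, 'slow to a stop in the current lane with hazards on')
```

The point of the can_pull_over check is exactly what I said above: it only pulls over when it can, otherwise the fallback is to stop where it is.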
 
How do we know that "the vehicle knew it needed the driver to take over"? The article says the driver blames themselves for not paying attention, and it also discusses the owner's manual's description of how FSD will alert the driver, but it doesn't say in this article that the car actually alerted the driver. Did I miss something, or is that stated in an article somewhere else?
This article didn't say anything about it, but I read an article on Jalopnik about how the driver said he ignored the warnings. To be fair, this story is being spun so many ways for so many reasons right now that I don't know what to believe and what not to. I do, however, know that Tesla Autopilot often asks the driver to take over more often than necessary, and making devices to 'trick' the car into thinking you have your hands on the wheel is commonplace.
 
Or lawmakers could put an end to the ever-"evolving" issue. Negligence on the government's part for allowing flawed, early-access technology to roam the streets (aren't bad drivers themselves enough?), and negligence on the vehicle owner's part for not doing his due diligence as a driver. I saved everyone some time; here is the definition of negligence: "failure to take proper care in doing something." Sorry to poke at you, but don't defend negligence, it won't bode well long term. Fact: this technology is far from perfect. The dirtier and more aged the sensors get, the less accurate and more prone to error they will become, no matter how many software updates they get. How many bus drivers, truck drivers, and drivers of large vehicles have gone past this same pole every day for, say, the last 25 years and avoided such an event? Again, negligence; don't defend it. This dude essentially said "I'm an *****, and an awful driver, but thanks Elon! Your car saved my life! It nailed that pole, and the airbags deployed." There is much to be said about the Tesla cult followers.
People who ignore the safety warnings in crashes involving self-driving features often get charged with reckless driving. This person should 100% be charged with reckless driving. However, I believe this falls under a "driver assistance feature", since the DOT hasn't approved Autopilot for full self-driving. But nearly all vehicles these days have some sort of "driver assistance", whether it be keeping you in your lane, auto braking, or speed warnings.

I drive 40,000 miles a year, and I'm REALLY BIG on stricter driving laws and making it easier to lose your license for doing stupid stuff. If this guy wants to use this feature, I have no problem with it. If he uses it in ways not specified by the manufacturer and it causes an accident, he should 100% be charged for that. The difference here is that in other cases of people doing the same thing and blaming the car for the accident, everyone says the driver is at fault. Everyone here seems to be saying, nah, the driver did nothing wrong and it's the car's fault.

It never ceases to amaze me how people are willing to contradict themselves whenever it is convenient for them.
 
The amount of Tesla drivers not paying attention is simply appalling. More often than not, the majority of the ones I encounter on my daily commute are simply too distracted and should not operate a vehicle.
I find this to be true with every single make and model of car.
 
"Jonathan Challinger, a Florida-based software developer who works for Kraus Hamdani Aerospace, posted a photo of his Cybertruck looking a lot worse than the pole it collided with."

It appears the c-truck is made of cheap metal...!
A safe vehicle is *supposed to* do that.
 
The man said he ignored numerous warnings the truck gave him while in FSD mode that it needed user input. The vehicle knew it needed the driver to take over and the driver didn't, so how is that the vehicle's fault?

The warning was likely just the standard FSD reminder that the driver needs to keep his hands on the wheel and eyes on the road. The real issue is Tesla continuing to call it and market it as Full Self-Driving and Autopilot when it is clearly not. Tesla cars do not meet the definition of FSD or autopilot in any layman's sense or common use of the words.

People pay $8k+ for FSD because they think the car can drive by itself and they can relax...but in reality they should still be actively driving.
 
I find this to be true with every single make and model of car.
In my daily observations, the proportion of distracted Tesla drivers is higher than for other brands. I see most Tesla drivers paying more attention to their phones than to the road. This used to be true for people driving larger SUVs/pickup trucks/vans, especially commercial ones, and it still is, but to a significantly lesser degree than Tesla drivers. In my experience only 1 in 3 or 4 Tesla drivers pays close attention to the road.

I work for a large Tier 1 automotive company; we've had a fully loaded Model 3 on test for a while now. Its FSD is eerily good, except when it isn't, and therein lies the problem: exception handling. It will give people a false sense of safety. We had a few close calls with technicians glancing more at the additional screens monitoring the data collection, so we decided to forego those screens on road tests and only keep them on when we run the car on the dyno.

Take any or all of this as you wish.

Cheers!
 
It's called Full Self Driving mode.
Does it do that? No, it sells the words.
It fully self drove into a pole.
I would expect a PoleStar to do that.

Change the name or stop letting people test it on real roads.
fully self driving = only the vehicle is in control. the vehicle driving into the pole without human interaction = fully self driving. no clue how that's hard to understand.

the phrase "self driving" is stupid anyway because a vehicle isn't a "self" so...yea. Humans can also fully "self" drive into poles too. just sounds like you have a hate boner for the musk man.
 
The man said he ignored numerous warnings the truck gave him while in FSD mode that it needed user input. The vehicle knew it needed the driver to take over and the driver didn't, so how is that the vehicle's fault?
I do understand what you're saying and you're right, the driver should pay attention to the road at all times. The vehicle's automation, however, also failed to detect an obstacle and stop, as it is supposed to do.
 
It seems like the driver acknowledged it was user error. I'm not "defending" the marketing department. I'm a big proponent of "just ignore whatever the marketing department has to say altogether, they're all a bunch of liars".

And the thing about it pulling over? It only does that when it CAN pull over. But let me summarize what happened in terms maybe people can understand.

>car screams at driver to take over because it knows it no longer has the ability to self drive
>driver ignores it
>car crashes as a result

If I get drunk and kill someone in a car accident, do I blame the alcohol? For some reason everyone seems to want to turn the tables on Tesla and Musk because hating them has been the popular thing to do lately. But let's keep in mind, turning on Autopilot is no small feat.

People need to stop shifting the blame for what happened because they're mad at Musk about something else. If you take the shield off an angle grinder and you get hit in the face when the blade explodes, that's on you for not knowing how to use something properly. Workman's comp won't pay for that. You don't get to sue the manufacturer because YOU took the shield off and got hurt.

The dude was using a piece of equipment (a vehicle) improperly, it got damaged, and he could have gotten hurt. What's so confusing here?
Mate, you are ignoring the documentation on the feature. FSD was NOT disabled, as confirmed by the car not warning and coming to a stop. That's a fact. FSD catastrophically did not see a massive solid object in its path, which they have ACKNOWLEDGED is a flaw of the vision-only sensor methodology. There's nothing remarkable about any of this, which makes any attempt to sell such a system as "FSD" fraudulent, unsafe and unfit for purpose.

This is NOT a new position for me. As a control engineer, I've said from day dot that this is cowboy, test-in-production, catastrophically unsafe junk. Musk wants to refine it with real-world production data rather than do it in a safe manner.

Also, FSD is being sold into the US market, which has an enormous number of people with sub-sixth-grade literacy. You want to blame dumb people? That's a huge share of your population, and you knew it before selling such a shonky product. It's not rocket science. This is reality.
 
A software developer can afford a Cybertruck? As for safety, I'd like to see whether the structure around the right front footwell held up or crumbled.
 
I do understand what you're saying and you're right, the driver should pay attention to the road at all times. The vehicle's automation, however, also failed to detect an obstacle and stop, as it is supposed to do.
If it never failed, there would be no requirement for the driver to pay attention. The system works well if the driver is attentive. It is not currently designed to operate any other way. An inattentive driver is equivalent to having a disabled camera; the driver is a necessary part of the system at present.
 
FSD catastrophically did not see a massive solid object in its path, which they have ACKNOWLEDGED is a flaw of the vision-only sensor methodology.
The cameras, of course, *did* see the object (if the driver can see it, of course the camera can see it). The software, not the cameras, did not recognize the object. This has exactly zero to do with "vision only" limitations.
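To illustrate that difference, here is a toy example (the function, labels, scores and threshold are all invented, nothing from Tesla's stack): the camera delivers the whole frame, but the planner only reacts to detections the perception model scores above some confidence threshold.

```python
# Toy illustration of "the camera saw it, the software didn't recognize it".
# All names, labels and numbers are hypothetical.

def obstacles_to_avoid(detections, confidence_threshold=0.6):
    """Keep only the detections the planner will actually act on."""
    return [d for d in detections if d["score"] >= confidence_threshold]

frame_detections = [
    {"label": "lane_line", "score": 0.92},
    {"label": "pole", "score": 0.31},  # in the image, but scored too low to act on
]

print(obstacles_to_avoid(frame_detections))
# -> [{'label': 'lane_line', 'score': 0.92}]  (the pole is effectively invisible to the planner)
```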
 
At least it's catching up to the Ford Pinto in safety, for all the doubters out there.

https://www.msn.com/en-gb/cars/news...s-than-infamous-ford-pinto-report/ar-AA1yLTDx

Really, the Cybertruck is a complete sop to Musk. All Tesla needed to do was make a great EV that looked like an F-150 through F-350.

Maybe he knew no F-350 owner would want one, just city folk.

Make America Great Again - remember the Ford Pinto? Full metal alchemy, that one.
Remember all the great 1980s cars, all 10,000 versions you could have? They were all gems, sparkling as they lumbered around corners at speed on soft suspension - well, sparkling if you like your rear sparking as it scraped along the road.

Remember why the lemon law came in? And now Musk has Xcreted "RIP" to the federal consumer protection dept!

At least you're getting cheap eggs; shame your one hour of minimum pay can't afford them anyway, and 38 million Americans are living under the poverty line.

Probably why this happened: Elon, the Senators and your King are working out a way to help those 38 million and reduce inequality, to stop people rejoicing over what Luigi did - i.e. Musk has no time to keep the promise he's made every year since 2014 of a full self-driving car. Well, maybe next year, and with extra eggs.
 