Super Bowl ad shows alleged FSD-enabled Tesla running over child-size mannequins, hitting...

midian182

A hot potato: There were plenty of Super Bowl commercials yesterday, including one involving a Tesla. But this wasn't a promotion from the electric vehicle company; it was a 30-second criticism filled with simulated child deaths from an organization dedicated to banning the automaker's Full Self-Driving (FSD) software.

The Super Bowl ad, aired in Washington, DC, Austin, Tallahassee, Albany, Atlanta, and Sacramento, was the work of The Dawn Project, an organization calling for software in critical computer-controlled systems to be replaced with unhackable alternatives that never fail.

The ad shows a Tesla Model 3 with FSD engaged (according to The Dawn Project) running down a child-size mannequin dressed in children's clothing on a crosswalk, swerving into oncoming traffic, hitting a stroller, driving past stopped school buses, ignoring 'do not enter' signs, and traveling on the wrong side of the road. The voiceover then claims Tesla is endangering the public with its "deceptive marketing" and "woefully inept engineering."

CEO Elon Musk didn't seem concerned by the commercial, tweeting that it would likely increase public awareness that Teslas can drive themselves (with supervision). He also responded with a laughing emoji to a Tesla fan account claiming the ad was fake.

This isn't the first time the Dawn Project has taken out a public notice slamming Tesla. Back in January 2022, it paid for a full-page ad in the New York Times claiming FSD software malfunctions and commits critical driving errors every 8 minutes. The ad demanded FSD be removed from public roads until it has "1,000 times fewer critical malfunctions," and offered $10,000 to the first person who could name "another commercial product from a Fortune 500 company that has a critical malfunction every 8 minutes."

The following August, The Dawn Project released a video that, again, showed four child-size mannequins wearing children's clothes being hit by Tesla vehicles. Tesla sent a cease-and-desist letter on August 11 to Dan O'Dowd, founder of The Dawn Project, demanding the video be taken down. It also asked O'Dowd to issue a public retraction and send all material from the video to Tesla within 24 hours of receiving the letter. O'Dowd responded in a blog post that called Musk "just another crybaby hiding behind his lawyer's skirt."

O'Dowd is also the CEO of Green Hills Software, which develops operating systems and programming tools for embedded systems. Its real-time OS and other safety software are used in BMW's iX vehicles and other cars from various automakers.

Detractors have called The Dawn Project's testing methodology in these ads into question, while others note that one of Green Hills Software's customers is Intel-owned Mobileye, which makes chips for self-driving software.

It's been a mixed year so far for Tesla. While its director of software admitted that the famous 2016 Autopilot demo video was staged, and Apple co-founder Steve Wozniak slammed the company and Musk over claims they robbed him and his family, Tesla's Autopilot system was exonerated after NTSB investigators found it was not engaged at the time of a fatal crash in 2021.


 
"The Dawn Project, an organization calling for software in critical computer-controlled systems to be replaced with unhackable alternatives that never fail." Hilarious! I demand that all bad things on the internet be replaced with rainbows and unicorns, because that's just as easy to do as creating a computer system that is unhackable and never fails!
 
Dan O'Dowd also ran a single-issue Senate campaign in California with the sole purpose of banning Tesla's FSD software from roads permanently. Considering his company indirectly competes with them, that's what you call a conflict of interest: https://electrek.co/2022/04/19/tesl...l-ad-campaign-billionaire-strange-senate-run/

That said, I'm not saying Tesla's FSD software is worth the money or great or anything, but I do believe they're trying to take important precautions when putting it into the hands of customers. There's a driver-facing camera to ensure attention while it's in use, the expectation that the driver stays responsible and may need to take over at any time (it is beta software) is well communicated, and Tesla delays updates for months to keep serious flaws from reaching customers. It's not great right now, but some risks are going to be necessary to get something like this developed. I think of it as behind-the-wheel driver's training, but for AI.

Most competitors in autonomous driving pre-map city roadways by scanning them with LiDAR, and their vehicles depend on that data. That's not a viable long-term path to self-driving because of the significant operating costs involved: any time roadways change, whether from construction or anything else that makes reality diverge from the point-in-time scan, the maps must be updated to keep the autonomous system functioning.

Regardless of who wins the autonomous driving race, I'll be excited for the free time gained on any commute I make and be more willing to live at a distance from work.
 
When a person walks, their shape is constantly changing because the legs and arms move, and the neural network has been trained to recognize these small changes in shape. Pushing a small rigid box shaped like a human in front of the car is not the same thing. The car can't slam to a full stop for every piece of paper, box, or bird in the way; nobody wants to buy such a system because it's dangerous for the passengers. Imagine you're on the highway and someone throws a cardboard box: the car brakes hard and stops for no reason, and all the cars behind crash into you. If they want a credible and reliable test, the mannequin not only has to look like a human, it has to move like one (rough sketch of what I mean below).

All the other micro-violations they show are momentary and minor. All drivers occasionally cut across oncoming traffic, pass a bus when there are no pedestrians, or ignore arbitrarily placed signs when they know the road is passable. The central issue is throwing a disguised box in front of the car and blaming the system for not stopping for it.
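To make the trade-off concrete, here's a toy Python sketch. This is not Tesla's actual code; every name and threshold here is made up. It just shows why a system tuned not to panic-brake for debris could sail past a rigid mannequin that it would stop for if the thing actually moved like a person:

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    pedestrian_confidence: float  # classifier score, 0.0-1.0 (made up)
    speed_mps: float              # the object's own measured motion
    gait_score: float             # how "walk-like" its frame-to-frame motion is

def should_emergency_brake(obj: TrackedObject) -> bool:
    # Clear pedestrian: brake no matter what.
    if obj.pedestrian_confidence > 0.9:
        return True
    # Ambiguous shape (box? bag? mannequin?): only brake if it also
    # moves like a person; otherwise treat it as debris and don't panic-stop.
    if obj.pedestrian_confidence > 0.5 and (
        obj.speed_mps > 0.3 or obj.gait_score > 0.7
    ):
        return True
    return False

# Rigid mannequin slid into the road: ambiguous shape, no gait.
print(should_emergency_brake(TrackedObject(0.6, 0.0, 0.1)))  # False
# Walking child: same shape score, but human-like motion.
print(should_emergency_brake(TrackedObject(0.6, 1.2, 0.8)))  # True
```

Tune it to always brake for the ambiguous case and you get the highway cardboard-box pileup; tune it the other way and you get the ad's mannequin footage. That's the dilemma the test glosses over.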
 
It's beta and requires the driver to be 100% aware and in control at all times. None of the things shown here would happen with an attentive driver who is watching the road and holding the wheel (as required). It's a ridiculous ad which could just as easily have targeted old-school cruise control for plowing into anything in its path.
 
I've used the Tesla software in a friend's car and can say with 100% certainty that the "BlueCruise" in my F-150 Lightning is far superior at monitoring the driver. Two cameras and two infrared cameras watch you, and if your hands come off the steering wheel on anything other than a straight road, you're immediately warned that you have 3 seconds to put them back or it disengages. It also disengages if you stop looking ahead.
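Roughly, the escalation behaves like the toy Python loop below. To be clear, this is purely illustrative and not Ford's actual implementation; the states, names, and the 3-second grace period just restate the behavior I observed:

```python
import time
from enum import Enum, auto

class State(Enum):
    ENGAGED = auto()
    WARNING = auto()      # hands-off warning is on screen
    DISENGAGED = auto()

HANDS_OFF_GRACE_S = 3.0   # the 3-second grace I observed; illustrative only

def step(state, warned_at, hands_on, eyes_on_road, road_is_straight, now):
    """One tick of a toy driver-monitoring loop; returns (state, warned_at)."""
    if state is State.DISENGAGED:
        return state, warned_at
    if not eyes_on_road:                      # look away -> immediate disengage
        return State.DISENGAGED, warned_at
    if hands_on or road_is_straight:          # all good -> clear any warning
        return State.ENGAGED, None
    if warned_at is None:                     # hands off on a curve -> warn
        return State.WARNING, now
    if now - warned_at >= HANDS_OFF_GRACE_S:  # grace period expired
        return State.DISENGAGED, warned_at
    return State.WARNING, warned_at

# Hands come off the wheel on a curve while the driver watches the road:
s, t0 = step(State.ENGAGED, None, False, True, False, time.monotonic())
print(s)  # State.WARNING -- the 3-second countdown has started
```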
 
It's beta and requires the driver to be 100% aware and in control at all times. None of the things shown here would happen with an attentive driver who is watching the road and holding the wheel (as required). It's a ridiculous ad which could just as easily have targeted old-school cruise control for plowing into anything in its path.
Frankly a feature in beta should NOT be on public roads, in the hands of customers, period.
I kind of hope we never have truly self-driving cars, because the minute we do, they'll suddenly be mandatory.
Given there is this thing called physics, so long as snow, rain, and dirt exist, purely self-driving cars will never happen. Railroads are not fully automated, and if you can't fully automate a thing that travels on static paths, fully controlled from a central location, with set schedules, how the hell are you going to automate something as unpredictable as cars?
 
Many evolving technologies are dangerous when not used properly. It's not about making them foolproof, it's about making everyone clear on the fact that they're dangerous, and who should be allowed to use them and under what circumstances.

I'm glad we're working on developing this tech, but I do agree Tesla's marketing of it seems dangerous. Sure, they put the required disclaimers in small print, but all while screaming FULL SELF DRIVING in the main copy. I have smart friends with Teslas who are intellectually aware of the facts but who behave in practice based on their emotional desire to believe it's real today.

That said, we will one day reach a stage where autonomous driving is still not perfect but is superior to humans, who are also not perfect. This is particularly true when talking about specific humans. My first wish is that habitual falling-down-drunk drivers would just stop driving, but too many don't and won't, so theirs is a case where Tesla's solution may already improve the safety of anyone sharing a road with them.
 
Self-driving is going to produce its share of accidents, and people killed, along the way. That's just the harsh truth. Cars in the '60s and '70s had their metal fuel tank right behind the bumper, and nobody in their right mind imagined that in a rear crash the tank might rupture, ignite, and burn the passengers alive.

Or metal interior parts that would impale drivers or passengers in a collision. These are all things people only learned about later.

There will be more things like this; the software needs to adapt and improve until, at some point, we have a car reliable enough to be fully capable of self-driving. The self-driving we have today is better suited to highways and similar roads than to conditions that still require human intervention.
 
Evaluating FSD (Tesla, Mercedes, etc.) should fall to an independent institution, not a group that may have a hidden conflict of interest. I really doubt FSD is *that* bad, as I've seen beta versions drive much better than that. But IMHO, FSD should *never* operate unless a human is at the wheel.

If FSD arrives, humans will slowly lose their jobs: no taxi drivers, robots in bars, restaurants, and stores, robots cleaning, etc. Basically, only human "engineers" will be left, and with AI even they can eventually be replaced. Where is the limit?
 
I personally don't like the idea that FSD is allowed to be sold and operated by untrained drivers. However, I am not sure I really blame Tesla for doing it. IF it is legal, then it is the best way to get massive amounts of testing data to refine FSD, but at the risk of hurting or killing someone. That is a pretty difficult moral dilemma, IMO, as someone who builds autonomous robots for a living.

What I find unacceptable is Musk's response. He is a snarky a-hole of a "used car salesman" who is more interested in "likes" than in being moral or professional. If the video is bogus, he should have backed that up with data to prove them wrong. Instead, he made a shameless self-promotion and then proceeded to reply like an immature teenager. No other company in the world would respond like that. I don't understand why Tesla keeps this guy as the face of the company. Maybe they don't have a choice, but they will come to regret it.
 
Unless I'm able to jump in the back seat and pass out on my way to work, the tech is useless.

What's the point of sitting in the driver's seat babysitting the car? If anything happens, the lag before a person notices and takes control in a meaningful way will still lead to somebody getting run over. It's just a parlor trick at this point, and not worth it if the owners of a "self-driving" car can't let the damn thing drive itself.

If the lynchpin of a system like this is the human element, then it's a clearly broken system, especially when you're setting that human up to be bored and inattentive. People already do a list of insane things other than driving while driving; putting them in smart cars and having faith that they'll monitor the system is a fool's errand.
 
Frankly a feature in beta should NOT be on public roads, in the hands of customers, period.

Even if it weren't called "beta," it is still a Level 2 system, and would work fine if operated as a Level 2 system (driver always involved). It only fails when the driver fails to do their part. Same as cruise control.
 
I agree; if it's not perfect, it shouldn't be allowed.

I don't care whether a person is capable of the same mistakes or not.

I want to know who is at fault, and that I don't also have to worry about machine error on top of human liability.
Perfection can never be achieved, so it cannot be the target. I understand humans want to be able to assign blame and fear things they don't control, but these systems are getting very good. On balance, fewer people will die if we embrace this technology. Arguing against it because there's no one to blame when it fails is an argument for the many more who will die if we don't use it.

Is it ready? Not in my opinion, but it will be.

 
I get that this is an anti-Tesla commercial, but it only makes me want a self-driving car more. It's very impressive that Tesla got this far with self-driving tech, and it's only going to get better.

Honestly, it is my wildest dream to be able to sleep in the car during my morning commute.
Maybe in a few years that will be a reality.
 
It's beta and requires the driver to be 100% aware and in control at all times. None of the things shown here would happen with an attentive driver who is watching the road and holding the wheel (as required). It's a ridiculous ad which could just as easily have targeted old-school cruise control for plowing into anything in its path.
As a matter of fact, it's WORSE than driving unassisted to have a semi-functional system like Tesla's 'helping out'.
There was a video with Louis Rossmann in which a psychologist stated that because Autopilot makes so many mistakes and the driver constantly has to intervene, your behavioral patterns change (toward accelerating, if I remember correctly), so you could run over a pedestrian if something bad appears in front of you.
I don't trust Tesla, just as I don't trust ANY multinational corporation.
They are ALL wolves in sheep's clothing. As for Musk, after his rants and his firing of a ton of people, my respect for this caricature of a man keeps dipping. See his position on Starlink when asked to help a country that had been invaded. He's a garbage human being.
Look at the layoffs and the way they fire people, not long after the HR and marketing teams bragged about "families" and the "work hard, play hard" bullshit.
YOU are paid to do a job. Not to make friends, not to mingle, not to create a family at work. You are disposable, no matter what they say. You are an asset. Maybe a valuable asset. But an asset nonetheless.
 