Tesla sends cease-and-desist to CEO behind video of its cars hitting child-size mannequins

midian182

Posts: 9,756   +121
Staff member
In context: Elon Musk definitely seems to be a love-him-or-hate-him character. One of those firmly in the latter camp is Dan O'Dowd, founder of the Dawn Project and CEO of Green Hills Software, who has been running a public campaign slamming Tesla's Full Self-Driving (FSD) software as dangerous. The latest jab is a viral ad showing the vehicles running over child-sized mannequins while in FSD mode. It's resulted in a cease-and-desist letter from the automaker, which O'Dowd has responded to by calling Musk a "crybaby."

The Dawn Project is an organization launched last year calling for software in critical computer-controlled systems to be replaced with unhackable alternatives that never fail. It took out a full-page ad criticizing Tesla's FSD beta program earlier in 2022, offering $10,000 to the first person who could name "another commercial product from a Fortune 500 company that has a critical malfunction every 8 minutes."

The Dawn Project continued its campaign earlier this month by releasing a video called "Test Track - The Dangers of Tesla's Full Self-Driving Software." It claims to show a Tesla in FSD mode running over child-size mannequins wearing children's clothes at 25 mph. O'Dowd says FSD is the worst self-driving software he's ever seen and calls for Congress to shut it down.

Tesla took action against the ad. It sent a cease-and-desist letter on August 11 to O'Dowd demanding the video, which it calls defamatory and misrepresentative, be removed. It also asked O'Dowd to issue a public retraction and send all material from the video to Tesla within 24 hours of receiving the letter.

O'Dowd responded to the letter in a lengthy blog post calling Musk "just another crybaby hiding behind his lawyer's skirt."

There have been plenty of people and publications pointing out what appear to be flaws in O'Dowd's video; Electrek highlights several apparent inconsistencies.

Green Hills Software develops operating systems and programming tools for embedded systems; its real-time OS and other safety software are used in BMW's iX vehicles and others from various automakers. The company is also developing its own self-driving software.

Musk tweeted a reply to O'Dowd's blog with an emoji of a bat, a poop, and the word "crazy."

Last month, the California Department of Motor Vehicles filed a complaint against Tesla over its Autopilot and Full Self-Driving claims, alleging the company made "untrue or misleading" statements in advertisements on its website relating to its driver assistance programs.


 
Tesla is one of the worst companies for suing people when it is clearly free speech. They sued Top Gear, which is both an entertainment show and a car review show, which to me means that either way you slice it, you either can't take it seriously or it's just someone's opinion.

Like Top Gear or not the point is most companies would just dispute it publicly on social media or something.
 
Before jumping to conclusions, if this is a topic of interest to you, it would be to your benefit to follow the links in the article and read Electrek's arguments.
While some of those findings make sense, the first sentence:
"A Tesla Full Self-Driving smear campaign started by a California billionaire running for Senate"
immediately raises red flags and means this criticism needs to be checked very carefully, as the writers there are clearly mixing technology with politics pretty badly.
 
I mean, it is in beta and they know it isn't perfect; that's kind of the whole point of beta testing. Tesla even acknowledges this by telling drivers to keep their hands on the steering wheel.

I think Musk is an a**, but the criticism around beta software is absurd. They know it doesn't work right, and they need to collect data from beta testers to improve it. Lab testing and real-world testing are two completely different monsters. This needs to be tested on the road.
 
Tesla is one of the worst companies for suing people when it is clearly free speech. They sued Top Gear, which is both an entertainment show and a car review show, which to me means that either way you slice it, you either can't take it seriously or it's just someone's opinion.

Like Top Gear or not the point is most companies would just dispute it publicly on social media or something.
Libel isn’t free speech though, even in the US. Tesla doesn’t sue every publication that has a negative opinion about them. Also, when they sued Top Gear, it was over 10 years ago, when they made just a few hundred cars a year. This is an uncommon occurrence.

What’s interesting about these two cases, however, is that both knowingly lied about what actually happened when physically testing the cars, and then displayed it on TV over and over. And in both cases Tesla tried contacting the video makers several times before ever filing a lawsuit. Both video makers probably even have footage showing that they lied.
 
I mean, it is in beta and they know it isn't perfect; that's kind of the whole point of beta testing. Tesla even acknowledges this by telling drivers to keep their hands on the steering wheel.

I think Musk is an a**, but the criticism around beta software is absurd. They know it doesn't work right, and they need to collect data from beta testers to improve it. Lab testing and real-world testing are two completely different monsters. This needs to be tested on the road.
I don't think Tesla can hide behind the idea of calling it "beta". That is a ridiculous claim, IMO. So you are telling me that all companies have to do to avoid any moral and legal culpability for safety is to say "oh, it is beta software, so it is not our fault"? You can't be serious about that. We are not talking about video games or websites. We are talking about a piece of software that controls a 2-ton car driving at 70 mph. I know this is a really hard problem (I build autonomous software for a living) and Tesla has been a great innovator in this area, but the beta argument does not hold water. Would you accept the same for software that runs on commercial aircraft, or for traffic lights?

All that said, it is clear that Mr. O'Dowd is trolling Tesla and Elon for the benefit of his own company. He made what appears to me to be a video of unsubstantiated claims about Tesla software. That is a pretty low bar. However, I do find it amusing that he used Elon's love of Twitter to troll him. Elon always seems to be tweeting some BS about other companies and people, but when someone does it to him he runs to his lawyers. Classic narcissism from Musk.
 
Last edited:
Haven't looked, but someone must be doing the same tests - it seems a very easy test: rent the car, set up mannequins.

Just looking at the video - the car is channeled by cones and sees a static object; veer left and hit the object, veer right and hit the object, or go straight - now that is BS, as it should still brake.
What happens inside the car - are there warnings?
 
The most important line in the article is here:

Green Hills Software develops operating systems and programming tools for embedded systems; its real-time OS and other safety software are used in BMW's iX vehicles and others from various automakers. The company is also developing its own self-driving software.

Musk is a ******* and people have every right to be deeply suspicious of self-driving cars, but I had a feeling this O'Dowd guy had ulterior motives and sure enough...
 
While some of those findings make sense, the first sentence:
"A Tesla Full Self-Driving smear campaign started by a California billionaire running for Senate"
immediately raises red flags and means this criticism needs to be checked very carefully, as the writers there are clearly mixing technology with politics pretty badly.

I really don't think anyone is looking at that link. The article is pure speculation; it even retracts its comments and simply spins more speculation after conceding the original post was wrong.

The article ends with the author admitting the system isn't good but that they are still impressed by it (completely different than the title suggests).

However, every response in this forum that supports Tesla is taking the headline at face value (or at least the existence of unverified "apparent inconsistencies" that were later retracted).

Regardless of the person's background, this seems to be standard non-profit consumer protection. There is no hint of libel. Rather, it highlights the continued (arguably unethical) use of NDAs to suppress bad PR.
 
I mean, it is in beta and they know it isn't perfect; that's kind of the whole point of beta testing. Tesla even acknowledges this by telling drivers to keep their hands on the steering wheel.

I think Musk is an a**, but the criticism around beta software is absurd. They know it doesn't work right, and they need to collect data from beta testers to improve it. Lab testing and real-world testing are two completely different monsters. This needs to be tested on the road.

If a beta doesn't comply with CA regulations then they simply can't do that. Even if it did comply (which is uncertain), then CA can simply change the regulations to make sure Tesla can't beta test in a real-world scenario.

Both of these points seem to be the purpose of O'Dowd's non-profit: ensure regulators are aware of current problems, and if regulators are unwilling to apply the rules, change them for tighter compliance.

Just imagine if Boeing was able to beta test their software without regulatory compliance --- a ton of people would die.
 
If a beta doesn't comply with CA regulations then they simply can't do that. Even if it did comply (which is uncertain), then CA can simply change the regulations to make sure Tesla can't beta test in a real-world scenario.

Both of these points seem to be the purpose of O'Dowd's non-profit: ensure regulators are aware of current problems, and if regulators are unwilling to apply the rules, change them for tighter compliance.

Just imagine if Boeing was able to beta test their software without regulatory compliance --- a ton of people would die.
You mean like how Boeing didn't test the angle-of-attack sensors on the 737 MAX?

There is no substitute for real-world testing. Tesla tells the testers what can go wrong and gives beta testers specific directions on how to react WHEN things do. They have done all the testing they can in the lab, and now they need to see how it functions in the real world. They need more data before they can release a fully functional product. This isn't a Tesla thing; this is an issue for all self-driving cars. Google, Apple, and everyone else will need to test their software before they can release it to the public.

Tesla is just getting all the flak because they're the first ones to have a public beta test. I'm sure Google has similar issues but since it's closed testing with NDAs we aren't hearing about it.
 
Last edited:
It's a pity Techspot is giving some random guy named Dan the publicity he obviously seeks to make his life relevant. I would just put him over my knee and spank him in an effort to teach him some self-discipline, since he was obviously deprived of any teaching of discipline as a child... but can you really teach an old dog new tricks? Probably not. Techspot - just give Dan a time-out in the corner and refuse to look at him, ever.
 
Haven't looked, but someone must be doing the same tests - it seems a very easy test: rent the car, set up mannequins.

Just looking at the video - the car is channeled by cones and sees a static object; veer left and hit the object, veer right and hit the object, or go straight - now that is BS, as it should still brake.
What happens inside the car - are there warnings?
It's clearly a deepfake video of a vehicle hitting a stand-in for a person. Ignore it.
 
I really don't think anyone is looking at that link. The article is pure speculation; it even retracts its comments and simply spins more speculation after conceding the original post was wrong.

The article ends with the author admitting the system isn't good but that they are still impressed by it (completely different than the title suggests).

However, every response in this forum that supports Tesla is taking the headline at face value (or at least the existence of unverified "apparent inconsistencies" that were later retracted).

Regardless of the person's background, this seems to be standard non-profit consumer protection. There is no hint of libel. Rather, it highlights the continued (arguably unethical) use of NDAs to suppress bad PR.
Your last conclusion is incorrect. This is a deepfake video. Best not to react to it at all.
 
Tesla is one of the worst companies for suing people when it is clearly free speech. They sued Top Gear, which is both an entertainment show and a car review show, which to me means that either way you slice it, you either can't take it seriously or it's just someone's opinion.

Like Top Gear or not the point is most companies would just dispute it publicly on social media or something.
Granted, this is free speech. But it is also fake news, the propagation of which is unethical. If sites like Techspot refused to retweet it, Tesla would not have done anything about it. Free speech is also about putting your counter-opinion out there, as I am doing here. The only reason Tesla reacted the way it did is because it is fake news being propagated by sites like Techspot. So my free-speech comment to Techspot: you are quite wrong to publish this (spreading fake news), should take it down, and should re-examine the ethics policy that allowed this to happen.
 
Regardless of the person's background, this seems to be standard non-profit consumer protection. There is no hint of libel. Rather, the continued (arguably unethical), use of NDA's to suppress bad PR.
"Green Hills Software develops operating systems and programming tools for embedded systems; its real-time OS and other safety software are used in BMW's iX vehicles and others from various automakers. The company is also developing its own self-driving software"

Surely there is no commercial motivation for this "non-profit" to attack its founder's competition!
 
In my opinion, self-driving software should be responsible for all damage from crashes it causes. The software is the driver. This includes assisted-driving software. It's easy to put software behind the wheel when you're not taking any responsibility for it. If the software is driving the car, then it is the driver, not the person sitting in the seat, and the company that made it should be responsible for all damage it causes.
 
Granted, this is free speech. But it is also fake news, the propagation of which is unethical. If sites like Techspot refused to retweet it, Tesla would not have done anything about it. Free speech is also about putting your counter-opinion out there, as I am doing here. The only reason Tesla reacted the way it did is because it is fake news being propagated by sites like Techspot. So my free-speech comment to Techspot: you are quite wrong to publish this (spreading fake news), should take it down, and should re-examine the ethics policy that allowed this to happen.
This may be the single stupidest take I've ever read here, and that's saying something.
You mean like how Boeing didn't test the angle of attack sensors with the 747 max?

There is no substitution for real world testing. Tesla tells the testers what can go wrong and gives beta testers specific directions on how to react WHEN they do. They have done all the testing they can in the lab and now they need to see how it functions in the real world. They need more data before they can release a fully functional product. This isn't a Tesla thing, this is an issue for all self driving cars. Google, Apple and everyone else will need to test their software before they can release it to the public.

Tesla is just getting all the flak because they're the first ones to have a public beta test. I'm sure Google has similar issues but since it's closed testing with NDAs we aren't hearing about it.
Tesla is getting flak for selling a "self driving" feature that does not self-drive properly and can and will fail.
 
I mean it is in beta and they know it isn't perfect...

They know it doesn't work right...
I think that says it all right there.
This needs to be tested on the road
If they know that it does not work, then why is it safe enough to be on the road? If it does not work, there is a high probability of accidents and of people being seriously injured or killed.
 

In this video you can see (at 1:53) that the Tesla Autopilot avoids even a bunny at night! So that guy is obviously a liar.
 
I think that says it all right there.

If they know that it does not work, then why is it safe enough to be on the road? If it does not work, there is a high probability of accidents and of people being seriously injured or killed.
Exactly. It's also why regulatory bodies should step in if Tesla is not as safe as they claim to be.
 
I think that says it all right there.

If they know that it does not work, then why is it safe enough to be on the road? If it does not work, there is a high probability of accidents and of people being seriously injured or killed.
If it is never tested in the real world, then it can never be programmed to work in the real world. Tesla has guidelines for how people are supposed to operate the vehicle in beta. It is not 100% self-driving; they plainly state that. So if someone uses it in a 100% self-driving mode and does not follow Tesla's guidelines for operating in self-driving mode, then that operator should not be using it in self-driving mode.

This man intentionally misused it (although I'll admit for a good reason and for research purposes), but he can't point to it after misusing it and say, "see, it doesn't work."

We KNOW it doesn't work, which is why they have guidelines for operation, so when things go wrong - which they know they will, and have told drivers they will - there are directions on how to handle it.

Certainly we need regulatory organizations to keep this stuff in check, but I'm sure even they understand that it is new tech and more research is needed. This is a problem across the board with self-driving cars, not just Tesla. The only difference between Tesla and the others is an NDA.
 
If they know that it does not work, then why is it safe enough to be on the road? That it does not work means there is a high probability for accidents and people being seriously injured or killed.
Because the alternative is even worse -- vehicles driven by humans cause more than 50 million accidents per year, killing nearly 1.5 million people. Autonomous vehicles aren't perfect, and never will be perfect -- but they're already better than humans. And -- much more importantly -- they improve every year. People don't.

In a few decades, driving a car manually at highway speeds will be illegal -- and with good reason. It's far too dangerous to leave up to fallible, unreliable, easily distracted humans.
 