Meta sues maker of Crush AI nudify app over Facebook and Instagram ads

midian182

What just happened? Meta's mission to crack down on AI "nudify" programs has led it to sue the maker of one of these apps. The social media giant has launched a lawsuit against Joy Timeline HK Limited, which developed an app called Crush AI.

In the suit, which was filed in Hong Kong, where Joy Timeline is based, Meta states that Crush AI made multiple attempts to circumvent its ad review process and continued placing the ads.

The ads appeared across Facebook and Instagram, and while Meta repeatedly removed them for breaking the rules, Joy Timeline kept posting more.

One of the many negative consequences of the advancement of generative AI has been the rise of nudify apps, which use the technology to generate nonconsensual nude and explicit images of a person from ordinary photos of them.

Crush AI had been one of the most prolific advertisers among these apps. A January investigation by Alexios Mantzarlis, author of the Faked Up newsletter, found that Meta's platforms ran more than 8,000 Crush AI-related ads during the first two weeks of the year alone. He notes that roughly 90% of Crush AI's website traffic came from either Facebook or Instagram.

Crush AI avoided Meta's review process by setting up dozens of advertiser accounts and frequently changing domain names. It also ran a Facebook page promoting its service.

Senator Dick Durbin sent a letter to Meta CEO Mark Zuckerberg in February, urging him to address the ads. Durbin wrote that the ads violated Meta's Advertising Standards, including its prohibitions on ads featuring adult nudity, sexual activity, and certain forms of bullying and harassment.

Meta says it has now developed new technology designed to find and remove these types of nudify ads more quickly. It has also expanded the list of terms, phrases, and emoji that its systems flag.
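
Meta hasn't published how that flagging works, but the simplest version of the idea is matching ad copy against a growing blocklist of terms, phrases, and emoji. Here is a minimal sketch in Python; the blocklist entries and the function name are invented for illustration and are not Meta's actual code:

    # Illustrative sketch only - Meta has not published its detection system.
    # The blocklist entries and names below are invented for this example.
    FLAGGED_TERMS = {
        "nudify",
        "undress photo",
        "\U0001F51E",  # the 🔞 emoji, sometimes used to signal adult content
    }

    def flag_ad_text(ad_text: str) -> list[str]:
        """Return every blocklist entry found in the ad copy, case-insensitively."""
        lowered = ad_text.lower()
        return [term for term in FLAGGED_TERMS if term in lowered]

    print(flag_ad_text("Undress photo of anyone in seconds \U0001F51E"))
    # -> ['undress photo', '\U0001F51E'] (order varies; FLAGGED_TERMS is a set)

Real systems also have to cope with misspellings and lookalike characters, which is presumably why Meta keeps expanding the list rather than relying on a fixed one.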

The company is also working with specialist teams to stay up to date with how these app makers evolve their tactics to avoid detection. It will be sharing signals about the apps with other tech companies so they can address them on their respective platforms.
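
The article doesn't describe the format of those shared signals, but a common pattern in cross-platform abuse programs is exchanging hashes of known-bad domains so each company can check its own ads without a raw URL list changing hands. A hypothetical sketch, with an invented domain and invented helper names:

    import hashlib

    # Hypothetical: normalize a domain and hash it into a shareable signal.
    def domain_signal(domain: str) -> str:
        normalized = domain.strip().lower().rstrip(".")
        return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

    # A partner platform loads the shared signals it received...
    shared_signals = {domain_signal("crush-example.app")}  # invented domain

    # ...and checks the landing page of each ad it reviews against them.
    def is_flagged(domain: str) -> bool:
        return domain_signal(domain) in shared_signals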

In May last year, Google announced a new policy that prohibits ads for deepfake porn or those that promise to digitally undress people without consent. Soon after, the San Francisco City Attorney's office sued 16 of the most-visited "undressing" sites with the aim of shutting them down.


 
While I don’t defend the posting of non-consensual nude photos, I think our society needs to get over nudity being such a “sin”…

If the same company took a photo of you and edited it to show you using a gun to kill people, we wouldn’t bat an eye…

Got to question whether seeing a nipple is worse than seeing someone get shot…
 

Actually, ANY doctoring of ANY photo or video to show innocent people in an illicit or illegal activity should be prosecuted without question or mercy. There is a HUGE difference between people showing a nipple and being publicly harassed with deepfakes of said pictures/videos.
 
True... but I doubt anyone will be able to do anything about this... the tech is already widespread and will only get more available with time - and more effective. Maybe we just need to retrain our minds that a picture ISN'T worth a thousand words anymore...
 
Personally, I think AI nudes are handy.
At least now all the people who were getting revenge porned can just say it's an AI fake and people won't really question it anymore.
 