Report shows over 100k women virtually disrobed through AI deepfakes

Pete Flint

WTF?! A report by intelligence group Sensity, which researches faked images and other malicious visual media, revealed that a deepfake bot was made freely available on the messaging platform Telegram. With this bot, users publicly released nude deepfakes of thousands of women based on widely available social media photos.

A recent report has uncovered over 100,000 faked nude photos of women based on social media images, which were then shared online.

This type of deepfake technology is nothing new. Similar software has unfortunately been used to produce fake pornographic content of celebrities for years.

This dramatic increase in images stems from a "deepfake bot" residing in a private channel on the messaging app Telegram. Users can send innocuous photographs of women, and the bot will "disrobe" them and spread the results on the messaging platform.

The service's administrator, known as "P" online, responded to this reporting: "I don't care that much. This is entertainment that does not carry violence." P went on to say that the quality of images was unrealistic enough that it would not be used for blackmail and downplayed the application's harm, stating, "There are wars, diseases, many bad things that are harmful in the world."

Sensity, the intelligence company researching this recent surge in deepfakes, stated that between July 2019 and July 2020, a reported 104,852 women were targeted by publicly shared images. Follow-up research into pages on which the deepfake bot was advertised revealed that this number might actually be closer to an unbelievable 680,000 people targeted.

"Having a social media account with public photos is enough for anyone to become a target," warned Sensity's chief executive, Giorgio Patrini. While the technology is not novel, Patrini says targeting private individuals is a "relatively new" practice.

Reporters at the BBC tried the bot with consent from participants and found the results to be "poor," including one image of a "woman with a belly button on her diaphragm."

Some of the images revealed in this investigation reportedly appeared underage, "suggesting that some users were primarily using the bot to generate and share paedophilic content."

P has said that pedophilic content is deleted when it is found and has even claimed he will soon delete all images shared on the platform.

Surveys of the bot service's users showed that 70% of its roughly 101,000 members were from Russia and ex-USSR countries, where Telegram is a popular messaging app. Sensity's research also showed the bot was advertised heavily on the Russian social media website VK.

"Many of these websites or apps do not hide or operate underground, because they are not strictly outlawed," said Sensity's Giorgio Patrini. "Until that happens, I am afraid it will only get worse."

Victims of faked porn can be seriously impacted both personally and professionally, and as a result, certain states in the U.S. have banned the act, though it is not outlawed nationally.

A report from Durham University and the University of Kent described the state-of-play of legal protections around deepfakes and revenge porn in the UK to be "inconsistent, out-of-date and confusing."

The Sensity report's authors apparently shared this information with law enforcement and the platform and media groups in question but have not received any response.


 
" The service's administrator, known as "P" online, responded to this reporting: "I don't care that much. This is entertainment that does not carry violence." P went on to say that the quality of images was unrealistic enough that it would not be used for blackmail and downplayed the application's harm, stating, "There are wars, diseases, many bad things that are harmful in the world." "
- I can't argue with a point this valid.

You know, I may just be a really horrible person but I couldn't help cracking up constantly as I read this. What people are failing to realise (which is immediately obvious to someone who thinks) is that these are FAKE. Without actual nude pictures to use, the picture is nothing more than a computer-based GUESS which means that people still don't know what these women look like with no clothes on. It's no different than taking Scarlett Johansson's head and sticking it on Kenzie Reeves' body. People acting like this is the end of the world really need to get their priorities straight.

Hell, at my age, if someone did a deep fake of me, I'd probably find it complimentary! :joy:
 
I may just be a really horrible person but I couldn't help cracking up constantly as I read this. What people are failing to realise (which is immediately obvious to someone who thinks) is that these are FAKE. Without actual nude pictures to use, the picture is nothing more than a computer-based GUESS which means that people still don't know what these women look like with no clothes on. It's no different than taking Scarlett Johansson's head and sticking it on Kenzie Reeves' body.

Hell, at my age, if someone did a deep fake of me, I'd probably find it complimentary! :joy:
They may look fake to some - today, but technology gets better over time. And without a real photo to compare, the average Joe isn't gonna know better. You'd be giving humanity too much credit if you thought otherwise.
 
They may look fake to some - today, but technology gets better over time. And without a real photo to compare, the average Joe isn't gonna know better. You'd be giving humanity too much credit if you thought otherwise.
Oh I agree and, to be honest, I think that this is a good thing. It will mean that real actresses will no longer be pressured to take off their clothes for movies because the production house can just "deep-fake" it and, as you say, the "average Joe" won't know better but the actress sure will. And even when it "appears" real, it will still be fake because it's no more accurate than an artist's drawing. Any identifying marks (tattoos, moles, freckles, pubic hair style, piercings, etc.) won't be there so it won't look like her and she won't have to feel shame because she can laugh at how much it DOESN'T look like her.

There's no stopping this so just give it time and people's sensitivity to it will disappear. I can pretty much guarantee you that only North Americans really care. Go to Europe and discover that nudity is, in fact, not a big deal. When I was in Italy, I took a ferry out to the island of Capri and there were at least thirty flat-deck sailboats just outside of the harbour with women sunning themselves completely nude (and they were all stunners). They had to know that we were gawking tourists because they were all smiling and waving at us! I kinda think that any psychological damage caused by being seen in the buff is at least partially voluntary because these women weren't bothered one bit and there were hundreds of people on the ferry.

This reminds me of Woody Harrelson's best line from "The People vs. Larry Flynt":
" “I have a message for all you good moral Christian people who are complaining that breasts and vaginas are obscene, but don't complain to me, complain to the manufacturer and Jesus told us not to judge.” "
 
Oh I agree and, to be honest, I think that this is a good thing. It will mean that real actresses will no longer be pressured to take off their clothes for movies because the production house can just "deep-fake" it and, as you say, the "average Joe" won't know better but the actress sure will.

There's no stopping this so just give it time and people's sensitivity to it will disappear. I can pretty much guarantee you that only North Americans really care. Go to Europe and discover that nudity is, in fact, not a big deal.

This reminds me of Woody Harrelson's best line from "The People vs. Larry Flynt":
" “I have a message for all you good moral Christian people who are complaining that breasts and vaginas are obscene, but don't complain to me, complain to the manufacturer and Jesus told us not to judge.” "
Leaked nudes don't typically come from movie screenshots...
 
Fake nudes have existed since the '90s, when the first versions of Photoshop were released.

The shape of the body is defined by DNA, which is common to all humans, so a body shape can't be strictly connected to any specific person.
 
This is a complex problem - with probably no easy solution - yet I read the usual drivel comments above.
You guys are amazing - no one can scam or bluff you, as you see the real reality. You lot who can speak for everyone on earth.
The point that being nude is fine for a lot of people misses the point big time - the meta is misrepresentation, identity fraud, abuse, extortion and the ability to harass and destroy lives.

I love spiders - but some folks just freak out - I will make the assumption that a good percentage of folks clamoring for this would happily install a RAT or whatever it's called on other people's computers.

Sure, you will laugh if someone sends a "naked" **** pic of you to someone in your office while spoofing your account - but just maybe, just maybe, a few of us would be put out for a second or two - before we laugh it off on the way to the inevitable Human Resources meeting.

I find it perverse that on tech blogs everyone says "let information be free", no censorship.
Then in another article they are extolling and defending their abusive relationship with companies like Apple, who have imposed a huge amount of control on the phone and system they bought.
Also having the stupidity to say if Google is free you are the product - which is true - but come on, to Apple (and other corps) you are the product too - they want a cut of all the money you spend.

Like, could you imagine a young Steve Wozniak buying an iPhone 12? "Oh, this phone will allow me to explore my insatiable curiosity about tech - and all the limits imposed by Apple are for my safety, I suppose."
That rant over - don't presume to speak for other people - I learnt from the age of 11 that words can have powerful effects, back when I used to think it was a game winding people up, and within a one-week period one kid tried to donk me over the head with a chair and another tried to stab me with a compass.

We don't have to walk around on eggshells - but we don't have to support a-holes who post pictures or deepfake videos of folks they know, passing them off as possibly real.

Rant 2 - again on tech sites - law enforcement has no right to my info, phone, blah blah blah - scream scream scream.
How bloody idealistic.
Put a group of you in a room, all with day bags - one with a gun inside, intent on killing everyone.
"No, we should not search anyone's bag - because even though we will probably die, it's just not OK."
 
This is a complex problem - with probably no easy solution - yet I read the usual drivel comments above.
You guys are amazing - no one can scam or bluff you, as you see the real reality. You lot who can speak for everyone on earth.
The point that being nude is fine for a lot of people misses the point big time - the meta is misrepresentation, identity fraud, abuse, extortion and the ability to harass and destroy lives.

I love spiders - but some folks just freak out - I will make the assumption that a good percentage of folks clamoring for this would happily install a RAT or whatever it's called on other people's computers.

Sure, you will laugh if someone sends a "naked" **** pic of you to someone in your office while spoofing your account - but just maybe, just maybe, a few of us would be put out for a second or two - before we laugh it off on the way to the inevitable Human Resources meeting.

I find it perverse that on tech blogs everyone says "let information be free", no censorship.
Then in another article they are extolling and defending their abusive relationship with companies like Apple, who have imposed a huge amount of control on the phone and system they bought.
Also having the stupidity to say if Google is free you are the product - which is true - but come on, to Apple (and other corps) you are the product too - they want a cut of all the money you spend.

Like, could you imagine a young Steve Wozniak buying an iPhone 12? "Oh, this phone will allow me to explore my insatiable curiosity about tech - and all the limits imposed by Apple are for my safety, I suppose."
That rant over - don't presume to speak for other people - I learnt from the age of 11 that words can have powerful effects, back when I used to think it was a game winding people up, and within a one-week period one kid tried to donk me over the head with a chair and another tried to stab me with a compass.

We don't have to walk around on eggshells - but we don't have to support a-holes who post pictures or deepfake videos of folks they know, passing them off as possibly real.

Rant 2 - again on tech sites - law enforcement has no right to my info, phone, blah blah blah - scream scream scream.
How bloody idealistic.
Put a group of you in a room, all with day bags - one with a gun inside, intent on killing everyone.
"No, we should not search anyone's bag - because even though we will probably die, it's just not OK."


This is a complex problem, but the solution isn't searching the bags for the handgun and pointing fingers at the guy who has it.

It's teaching kids to be confident about themselves, and to not be so easily manipulated over things that shame their parents. If you teach shame, then there is no solution to an internet where everybody already knows your name, birthdate and Social Security Number.

You're never going to get rid of people being manipulated, but maybe a little more knowledge will teach people to re-prioritize what they get bent out-of-shape over. Maybe once you realize you can recover a lot more easily from faked nudes than someone stealing your identity, you can treat these things as harmless.

Remember, these trolls only gain power if you acknowledge them... but you're going to have to teach entire classrooms this level of confidence (or it doesn't work).
 
Well, with literally "billions and billions of porn stars' pictures on the internet" (I'm paraphrasing Stephen Hawking here), I'm pretty sure you have to be socially and sexually dysfunctional to make deepfake nudes of prominent people.

I mean really, who would you rather see naked: Glenn Close, or some 19-year-old blonde Russian nubile (who's putting it out there willingly)?

Rape laws may have to be expanded in scope to include this type of intrusion.

As it stands, we'll have to wait for someone to post a deepfake of someone under 18, so they can be rung up on "distribution of child pornography" charges.
 
People are looking at this all wrong. What a boon this is for women around the world. If and when one of your naughty pics leaks out onto the internet, you simply claim it's a deepfake. Problem solved: no muss, no fuss.
 
" The service's administrator, known as "P" online, responded to this reporting: "I don't care that much. This is entertainment that does not carry violence." P went on to say that the quality of images was unrealistic enough that it would not be used for blackmail and downplayed the application's harm, stating, "There are wars, diseases, many bad things that are harmful in the world." "
- I can't argue with a point this valid.

You know, I may just be a really horrible person but I couldn't help cracking up constantly as I read this. What people are failing to realise (which is immediately obvious to someone who thinks) is that these are FAKE. Without actual nude pictures to use, the picture is nothing more than a computer-based GUESS which means that people still don't know what these women look like with no clothes on. It's no different than taking Scarlett Johansson's head and sticking it on Kenzie Reeves' body. People acting like this is the end of the world really need to get their priorities straight.

Hell, at my age, if someone did a deep fake of me, I'd probably find it complimentary! :joy:
I think it was around the year 2000, plus or minus a few years, that the US Supreme Court ruled VIRTUAL PORN cannot be banned, including virtual images of children. Removing or altering clothes is still within that virtual realm, BECAUSE IT DOES NOT DEPICT REAL PEOPLE - and the same applies even when those images merely LOOK like real people.
 
I think it was around the year 2000, plus or minus a few years, that the US Supreme Court ruled VIRTUAL PORN cannot be banned, including virtual images of children. Removing or altering clothes is still within that virtual realm, BECAUSE IT DOES NOT DEPICT REAL PEOPLE - and the same applies even when those images merely LOOK like real people.
So, basically what you're saying is that "catching wood", on a virtual image of a naked 8 year old, is only "virtual deviance"? :eek:
 
So, basically what you're saying is that "catching wood", on a virtual image of a naked 8 year old, is only "virtual deviance"? :eek:
Better that than a real image of an 8-year-old. Sickos aren't going to stop being sickos just because you take their porn away.
 
Oh I agree and, to be honest, I think that this is a good thing. It will mean that real actresses will no longer be pressured to take off their clothes for movies because the production house can just "deep-fake" it and, as you say, the "average Joe" won't know better but the actress sure will. And even when it "appears" real, it will still be fake because it's no more accurate than an artist's drawing. Any identifying marks (tattoos, moles, freckles, pubic hair style, piercings, etc.) won't be there so it won't look like her and she won't have to feel shame because she can laugh at how much it DOESN'T look like her.

There's no stopping this so just give it time and people's sensitivity to it will disappear. I can pretty much guarantee you that only North Americans really care. Go to Europe and discover that nudity is, in fact, not a big deal. When I was in Italy, I took a ferry out to the island of Capri and there were at least thirty flat-deck sailboats just outside of the harbour with women sunning themselves completely nude (and they were all stunners). They had to know that we were gawking tourists because they were all smiling and waving at us! I kinda think that any psychological damage caused by being seen in the buff is at least partially voluntary because these women weren't bothered one bit and there were hundreds of people on the ferry.

This reminds me of Woody Harrelson's best line from "The People vs. Larry Flynt":
" “I have a message for all you good moral Christian people who are complaining that breasts and vaginas are obscene, but don't complain to me, complain to the manufacturer and Jesus told us not to judge.” "

I think you are carrying a wrong picture of Europe over there.
The women you saw are not necessarily representative of the rest of us.
Imagine going to a techno party where taking drugs is not uncommon and assuming this is how every party in Europe works.
Especially about nakedness as you described, there is a lot of disagreement here, because some apparently think the freedom to dress as you like equates to flashing people with your private parts or wearing the most seductive (or skanky, if you will) clothing as a regular thing. There are many who are not happy with this, though most people will not openly discuss it, being afraid of being called prudish.
 
I think you are carrying a wrong picture of Europe over there.
The women you saw are not necessarily representative of the rest of us.
Imagine going to a techno party where taking drugs is not uncommon and assuming this is how every party in Europe works.
Especially about nakedness as you described, there is a lot of disagreement here, because some apparently think the freedom to dress as you like equates to flashing people with your private parts or wearing the most seductive (or skanky, if you will) clothing as a regular thing. There are many who are not happy with this, though most people will not openly discuss it, being afraid of being called prudish.
I used that as an example, nothing more. Of course all of Europe isn't like that but all of Europe is a lot more laid back about the human body than the USA is. The fact is that you'd NEVER see something like that in the USA.
 
I have a question: do these fake nudes do away with excess pubic hair, cellulite, unshaved legs, pimples, blackheads and varicose veins? If so, we need more of them.
 