Google refuses to reinstate account after it flagged medical images as child abuse

midian182

A hot potato: While online services increasingly deploy safeguards to identify and flag child abuse images, these systems are not infallible, and they can have a devastating impact on the wrongly accused. Such is the case for one father whose Google account remains closed after the company mistakenly flagged medical images of his toddler son's groin as child porn.

According to a New York Times report, the father, Mark, took the photos in February last year on the advice of a nurse ahead of a video appointment with a doctor. Mark's wife used her husband's Android phone to take photos of the boy's swollen genital area and texted them to her iPhone so they could be uploaded to the health care provider's messaging system. The doctor prescribed antibiotics, but that wasn't the end of it.

It seems that the images were automatically backed up to Google Photos, at which point the company's artificial intelligence tool and Microsoft's PhotoDNA flagged them as child sexual abuse material (CSAM). Mark received a notification two days later informing him his Google accounts, including Gmail and Google Fi phone service, had been locked due to "harmful content" that was "a severe violation of Google's policies and might be illegal."

As a former software engineer who had worked on similar AI tools for identifying problematic content, Mark assumed everything would be cleared up once a human content moderator reviewed the photos.

But Mark was investigated by the San Francisco Police Department over "child exploitation videos" in December. He was cleared of any crime, yet Google still hasn't reinstated his accounts and says it is standing by its decision.

"We follow US law in defining what constitutes CSAM and use a combination of hash matching technology and artificial intelligence to identify it and remove it from our platforms," said Christa Muldoon, a Google spokesperson.

Claire Lilley, Google's head of child safety operations, said that reviewers had not detected a rash or redness in Mark's photos. Google staff who review CSAM are trained by pediatricians to look for issues such as rashes, but medical experts are not consulted in these cases.

Lilley added that further review of Mark's account revealed a video from six months earlier showing a child lying in bed with an unclothed woman. Mark says he cannot remember the video, nor does he still have access to it.

"I can imagine it. We woke up one morning. It was a beautiful day with my wife and son, and I wanted to record the moment," Mark said. "If only we slept with pajamas on, this all could have been avoided."

The incident highlights the problems associated with automated child sexual abuse image detection systems. Apple's plans to scan for CSAM on its devices before photos are uploaded to the cloud were met with outcry from privacy advocates last year. It eventually put the feature on indefinite hold. However, a similar, optional feature is available for child accounts on the family sharing plan.

Masthead: Kai Wenzel 


 
"Child sexual abuse material" Hey Google, how many syllables do you need to add to say "child porn"? Don't try to make it less disgusting by softening the terminology.

"Porn" kind of suggests something produced with intent to distribute widely. However, I have little sympathy for weirdos who sleep in the nude and expose their children to that kind of thing.
 
They have most likely already wiped part or all of the data previously stored in the account, or are simply not prepared to restore the lost data after a year, and don't want to admit it.
 
They have most likely already wiped part or all of the data previously stored in the account, or are simply not prepared to restore the lost data after a year, and don't want to admit it.
Google never deletes or wipes any user's data; it only tells users that it did. Otherwise it couldn't keep building its AI database.
 
IMO, this is a prime example of Google's outright arrogance. Google is insisting that their "pediatrician-trained experts" have more expertise than the child's own pediatrician?? Classic arrogance.

They really have no f'ing clue.

IMO, no one should upload anything at all to Google.
 
If the image had been ZIPed would Google's automatic scanners have picked it up?

Do they go into the content of archives?
 
"Porn" kind of suggests something produced with intent to distribute widely. However, I have little sympathy for weirdos who sleep in the nude and expose their children to that kind of thing.

This kind of thinking is exactly the America-centric cultural insensitivity that makes Google, Meta, and others think it is alright to write content guidelines for the entire world based on their own semi-puritanical American beliefs. There are plenty of places in the world where child nudity, even public child nudity, is perfectly normal in many contexts. As is taking pictures of them. To us it’s neither ‘child pornography’ nor ‘being a weirdo’. It’s a day at the beach with our toddler…
 
Google has now put itself above the courts of the land. Not too surprising, given its fall from rational thinking over the last 20 years and the mind-blowingly stupid decision-making going on. It's almost like employees at every level are hired based solely on their narcissistic qualities and the company has become a cult of narcissism. Time for a #GoogleMustFall campaign to start scaring off the advertisers and drive the share price down?
 
If the image had been ZIPed would Google's automatic scanners have picked it up?

Do they go into the content of archives?
Yes, for them that's trivial to do. Presumably not if it's an encrypted zip, though that may depend on the zip product used. Microsoft's built-in zip support is questionable, but WinZip and WinRAR both offer encryption - though Google may have cut a deal with Corel, which owns WinZip. PCMag, for one, doesn't believe in the integrity of zipfile encryption and doesn't recommend it for file encryption, to the point that they no longer bother to test it at all. Best to encrypt every file Google has access to, as this person found out with Google Photos.

There is a considerable amount of personal information in a single photograph, and all of it is data mined, as you see in this example. Google is the sole arbiter of how Google uses this information - and, as you can also see here, they are above the courts of the land, and it is their decisions, not the courts', that govern their actions.

PhotoDNA is pattern-matching software that can be set up to flag any pattern - for example, number sequences in the background or foreground of any photo, such as bank account numbers, SIN numbers, and so on. As this example shows, the process is completely automated, right down to acting on the information (in this case, terminating accounts and reporting to the police) - all without human intervention and all without apology.
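As a purely illustrative aside on the "do they go into archives" question: reading inside an unencrypted zip really is trivial. The sketch below (not Google's actual scanner, whose internals are not public) walks a zip file with Python's standard library and checks each member's SHA-256 against a hypothetical blocklist; members encrypted with a password it doesn't have simply fail to read and are skipped.

```python
# Minimal sketch, not Google's actual scanner: walk a ZIP archive and check
# each member's SHA-256 against a hypothetical blocklist. Encrypted members
# raise RuntimeError when no password is supplied and are skipped here.
import hashlib
import zipfile

def scan_zip(path: str, known_bad_hashes: set) -> list:
    """Return names of archive members whose SHA-256 is on the blocklist."""
    hits = []
    with zipfile.ZipFile(path) as archive:
        for name in archive.namelist():
            try:
                data = archive.read(name)   # decompresses the member in memory
            except RuntimeError:
                continue                    # encrypted member, no password supplied
            if hashlib.sha256(data).hexdigest() in known_bad_hashes:
                hits.append(name)
    return hits
```

Properly encrypting files before upload, rather than relying on the zip format's own protection, keeps the contents out of reach of a scan like this - which is the suggestion in the comment above.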
 
Guilty until proven...uh, hmmm...nope, just plain guilty IGHO (in Google's humble opinion)
There's nothing humble about Google's opinion.
This kind of thinking is exactly the America-centric cultural insensitivity that makes Google, Meta, and others think it is alright to write content guidelines for the entire world based on their own semi-puritanical American beliefs. There are plenty of places in the world where child nudity, even public child nudity, is perfectly normal in many contexts. As is taking pictures of them. To us it’s neither ‘child pornography’ nor ‘being a weirdo’. It’s a day at the beach with our toddler…
👍 Very well said.
 
"The incident highlights the problems associated with automated child sexual abuse image detection systems."

I'd say perhaps even more than that, it also highlights the problems associated with relying on cloud services for anything you do. Especially if FAGMAN are involved.

This sort of news often triggers some schadenfreude in me. Serves them right for trusting their data to the cloud. But then I remember that, the way things are going, we might soon have no choice in the matter, and then I start to worry.
 
"The incident highlights the problems associated with automated child sexual abuse image detection systems."

I'd say perhaps even more than that, it also highlights the problems associated with relying on cloud services for anything you do. Especially if FAGMAN are involved.

This sort of news often triggers some schadenfreude in me. Serves them right for trusting their data to the cloud. But then I remember that, the way things are going, we might soon have no choice in the matter, and then I start to worry.
That is the irony of the whole situation - ultimately, the people who suffer the most are the ones who were already Google's most faithful customers; the ones who heard reports like this and just discounted them as overreactions, or reasoned that they would never have issues like this, until, of course, they do.
 
Man, I'm conflicted on this one. While protecting innocent children is only a good thing, do we really want "Big Brother" to be the sole arbiter in any suspected case? Secondly, Google's services are in fact free to use, and they do have the right to suspend anyone for any reason if they really want to, but the fact that they're so intractable even after the situation was clarified is troubling.
My guess is that, just like with YouTube, they don't want to increase the number of possible claims of innocence and would rather simply be done with the issue as quickly and efficiently as possible, even if the customer suffers for it. Better to piss off a single user than potentially have to spend resources investigating and determining the true facts of the case.
 
Man, I'm conflicted on this one. While protecting innocent children is only a good thing, do we really want "Big Brother" to be the sole arbiter in any suspected case? Secondly, Google's services are in fact free to use, and they do have the right to suspend anyone for any reason if they really want to, but the fact that they're so intractable even after the situation was clarified is troubling.
My guess is that, just like with YouTube, they don't want to increase the number of possible claims of innocence and would rather simply be done with the issue as quickly and efficiently as possible, even if the customer suffers for it. Better to piss off a single user than potentially have to spend resources investigating and determining the true facts of the case.
The type of "free" service which Google claims to provide is in fact not free at all.
Google trades its services for users' personal data. In this trade, users lose hard: first, because their personal info is worth more than what Google pays to provide its services, and second, because Google deceives users into believing they are getting a "free" service, so most of them are not aware that they are making an unfair transaction with Google.
 