NCMEC unveils tool to help minors fight online spread of explicit images

Daniel Sims

In brief: The spread of intimate images, particularly those of minors, is one of the most serious issues surrounding social media. The NCMEC and a few participating platforms, including Meta's, have started offering people a way to stem the unwanted proliferation of such images while promising total privacy.

The National Center for Missing & Exploited Children (NCMEC) announced this week that it's collaborating with a handful of online platforms on a tool to help teens and others stop their intimate pictures from being shared. The measure takes a more preventative approach than typical reporting systems.

Normally, when someone wants to stop an explicit image of themselves from spreading online, they have to search for instances of it and report each one. For many, the process is likely laborious, humiliating, and ultimately ineffective due to how quickly such content spreads.

Take It Down is a platform that tags such images with digital fingerprints so participating services – currently including Facebook, Instagram, OnlyFans, Yubo, and PornHub – can automatically detect and remove them. The method has limits but could be more effective than typical measures due to its proactive and automated nature.

If a user is afraid an explicit image or video might be shared online, they can select it and head to the "Get Started" page on the Take It Down website. From there, the service generates a hash of the file, which participating platforms use to detect exact copies and remove them.
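As a rough illustration of what generating such a fingerprint locally might look like, here is a minimal Python sketch. Take It Down hasn't published the exact algorithm it uses, so the choice of SHA-256 below is an assumption meant only to show the idea of an exact-match hash computed entirely on the user's device; the file name is hypothetical.

import hashlib
from pathlib import Path

def fingerprint_file(path: str) -> str:
    # Compute an exact-match fingerprint of a local photo or video.
    # SHA-256 is used purely as an illustration; Take It Down's actual
    # hashing scheme isn't specified in the announcement.
    digest = hashlib.sha256()
    with Path(path).open("rb") as f:
        # Read in chunks so large video files never need to fit in memory.
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Only this short hex string would ever need to leave the device.
print(fingerprint_file("my_private_photo.jpg"))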

The tool is primarily for minors, but adults can also use it for content created when they were minors. It's also accessible to anyone in the world.

Those worried about sharing their images with Take It Down should note that assigning a digital fingerprint doesn't upload the content. It remains solely on their device, the staff operating Take It Down can't view it, and the hash can't be used to reproduce it. Additionally, creating a hash doesn't require submitting any personal information.
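On the platform side, detection can conceptually be as simple as comparing the fingerprint of newly uploaded content against the list of submitted hashes. The sketch below is an assumption about how such a check might look, not the actual code used by Facebook, Instagram, or the other participating services; the hash set and function names are hypothetical.

import hashlib

# Hypothetical set of fingerprints received from Take It Down.
take_it_down_hashes: set[str] = set()

def hash_bytes(data: bytes) -> str:
    # Same illustrative exact-match fingerprint as in the earlier sketch.
    return hashlib.sha256(data).hexdigest()

def should_block_upload(upload: bytes) -> bool:
    # Flag the upload for removal if it matches a submitted fingerprint.
    return hash_bytes(upload) in take_it_down_hashes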

However, users can only create a hash for a photo or video that is still on their device. The service also can't stop the spread of images that have already been uploaded, though it can slow it down, and it can't scan encrypted content. Additionally, users shouldn't upload their pictures themselves after submitting them to the service, since the system could flag the images and lead to social media bans.

The NCMEC has additional services for those who want to fight the spread of their intimate images. The organization also operates a CyberTipline for people being threatened regarding explicit images. Anyone who needs mental health services can head to the NCMEC's emotional support portal.


 
How about you minors stop being f'ing retarded and stop taking nudes of each other or yourself and sharing them. Most of the time it is their own damn fault their images get out into the public. Anything digital has the potential to be viewed by anyone.

If you're too stupid to understand this, then you're clearly too stupid to be trusted with social media platforms and you should do yourself a favor and stop using them.
 
This is a great tool. However, I do foresee it being used to censor content – e.g., Trump or Biden fans could hash a viral photo to get it removed from Facebook, especially if no one at NCMEC can view the image.
 
I assume it uses something like the NeuralHash perceptual hashing algorithm, which is designed to detect similar images rather than only exact copies – more of a fingerprint than a checksum. It sounds like the same kind of system Apple proposed for scanning child sexual abuse images on iOS devices. There's actually a very long way to go in this field – it's why you can easily find movies that infringe copyright on YouTube. Here's a nice little snippet from Wikipedia about some of the recent challenges in this area: https://en.wikipedia.org/wiki/Perceptual_hashing
Apple Inc reported as early as August 2021 a Child Sexual Abuse Material (CSAM) detection system that they call NeuralHash. A technical summary document, which nicely explains the system with copious diagrams and example photographs, offers that "Instead of scanning images in iCloud, the system performs on-device matching using a database of known CSAM image hashes provided by [the National Center for Missing and Exploited Children] (NCMEC) and other child-safety organizations. Apple further transforms this database into an unreadable set of hashes, which is securely stored on users’ devices."

In an essay entitled "The Problem With Perceptual Hashes", Oliver Kuederle produces a startling collision generated by a piece of commercial neural net software, of the NeuralHash type. A photographic portrait of a real woman reduces through the test algorithm to the same hash as the photograph of a piece of abstract art (from the "deposit photos" database). Both sample images are in commercial databases. Kuederle is concerned with collisions like this. "These cases will be manually reviewed. That is, according to Apple, an Apple employee will then look at your (flagged) pictures... Perceptual hashes are messy. When such algorithms are used to detect criminal activities, especially at Apple scale, many innocent people can potentially face serious problems... Needless to say, I’m quite worried about this."

Researchers have since published a comprehensive analysis entitled "Learning to Break Deep Perceptual Hashing: The Use Case NeuralHash", in which they investigate the vulnerability of NeuralHash, as a representative of deep perceptual hashing algorithms, to various attacks. Their results show that hash collisions between different images can be achieved with minor changes applied to the images. According to the authors, these results demonstrate that such attacks are a real possibility and could lead to innocent users being flagged and possibly prosecuted. They also state that the detection of illegal material can easily be avoided and the system outsmarted by simple image transformations, such as those provided by free-to-use image editors.
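For anyone curious what a perceptual hash actually looks like in practice, here is a toy "difference hash" (dHash) in Python. It is far simpler than NeuralHash, but it shows the core idea behind the quoted research: near-duplicate images collapse to the same (or nearly the same) short hash, which is exactly why both robustness to small edits and accidental or adversarial collisions are possible. Pillow is assumed to be installed, and the file names in the comparison example are hypothetical.

from PIL import Image  # pip install Pillow

def dhash(image_path: str, hash_size: int = 8) -> int:
    # Shrink the image, then record whether each pixel is brighter than its
    # right-hand neighbour. Resized, recompressed, or lightly edited copies
    # tend to produce the same 64-bit value.
    img = Image.open(image_path).convert("L").resize(
        (hash_size + 1, hash_size), Image.LANCZOS
    )
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    # Number of differing bits; a small distance means "probably the same image".
    return bin(a ^ b).count("1")

# Two files are treated as near-duplicates if their hashes are close, e.g.:
# hamming_distance(dhash("original.jpg"), dhash("recompressed.jpg")) <= 5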
 