People are shocked that Apple's image recognition system categorizes "brassiere" photos

midian182


Image recognition AI certainly isn’t something new. It’s a feature found in the products of many tech companies, including Apple, which introduced it with iOS 10 last year. But it became a controversial topic yesterday after a Twitter user noted how searching for “brassiere” in the iPhone’s Photos app supposedly ‘showed’ that Apple places explicit user images into a folder.

While the tweet by ellieeewbu correctly stated that Apple's software can identify certain images, it does not “save” photos of bras or move them into any special folders.

The Photos app can tag and categorize a number of different scenes and objects (along with facial expressions, places, and people) that appear in images, allowing users to search for specific photos using keywords. In a Medium post, developer Kenny Yin detailed all 4,432 of these categories, including “brassiere,” “bra,” and “bras.” Strangely, there’s no mention of any type of men’s clothing.

The most important thing to remember is that the object detection is all done locally on the device and nothing is sent to Apple’s servers. As pointed out by TechCrunch, there are exceptions for things such as child pornography, which have dedicated classifiers that can reach beyond the confines of a handset.
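Apple hasn't documented the Photos app's own classifier, but its Vision framework exposes a comparable on-device image classifier. The sketch below is a minimal Swift example, assuming iOS 13+ and the public VNClassifyImageRequest API (not the actual model Photos uses), showing how an app can label a photo entirely on the handset.

```swift
import UIKit
import Vision

// Minimal sketch of on-device image labeling with Apple's Vision framework
// (VNClassifyImageRequest, iOS 13+). Illustrates the general technique only;
// the Photos app's own classifier is not public.
func labels(for image: UIImage, minimumConfidence: Float = 0.3) -> [String] {
    guard let cgImage = image.cgImage else { return [] }

    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])

    do {
        // Runs entirely on the device; no image data leaves the handset.
        try handler.perform([request])
    } catch {
        print("Classification failed: \(error)")
        return []
    }

    // Keep only reasonably confident labels, e.g. "document", "outdoor", "sky".
    let observations = request.results as? [VNClassificationObservation] ?? []
    return observations
        .filter { $0.confidence >= minimumConfidence }
        .map { $0.identifier }
}
```

The identifiers returned by this API come from a fixed, built-in taxonomy, similar in spirit to the category list enumerated in the Medium post.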

The tweet’s reference to a folder isn’t strictly accurate, either. It implies that the images in question have been moved to a specific “brassiere” folder the system created on the phone, but that’s not what happens. It’s likely a misconception stemming from a UI that automatically creates categories from past searches.

You can’t turn Apple’s image recognition feature off, which isn’t great, but it’s still a safer system than Google Photos’ version, which stores images on Google’s servers.

For those who do want explicit photos moved into a separate (locked) folder on their device, there’s always Nude: the sexiest app ever, which uses image recognition to automatically identify naughty pictures.


 
This is a 'feature' I wouldn't find any use for and would probably have remained oblivious to its existence if it hadn't been pointed out to me.
 
Did anyone bother to check if there are other clothing tags like skirt, shoes, etc.?
It does, it is just meta-tagging the photos. Google does the same, and I'm not sure about MS, but they probably do as well. It is used to speed up searches for similar pics since few people bother to tag pictures on their own. BTW, the tagging process is done by machine learning, or I would hope so, since the tags can be incredibly wrong for some pictures. This has been done for a while now, but it seems people only figured it out just now.
 
Apple is again much better for privacy than Google, since the iPhone categorizes photos on the phone, while Google sends your photos over the internet to the cloud. That means your photos may be copied by someone during transmission (Wi-Fi connections and HTTPS can be easily cracked), or the photos can be copied from Google's servers by hackers or employees. Over and over again, Google is doing everything to invade your privacy. There should be a law against it.
 
The Fappening, that is all.
 