Google has implemented a new image search algorithm that makes it much more difficult to find pornographic photos through the popular search engine, even with SafeSearch disabled. Google claims it isn't censoring adult content, but at least some users would disagree with that statement.

A Google spokesperson said the company is simply trying to show explicit images only to people who are intentionally seeking them out. The algorithms try to determine the most relevant results for any given query, so someone looking for adult content may have to be a bit more specific in their search. According to the spokesperson, image search now works the same way as web search.

There's already a Reddit thread where users are sounding off about the change. One user recalled stumbling onto porn accidentally all the time, but said that typing in sexually explicit search terms now turns up nothing but innocent photos.

I ran a few queries with SafeSearch disabled to see if things were really as "bad" as Redditors claimed. I have no doubt that some of the terms I searched would previously have pulled up questionable photos, yet sure enough, the first couple of terms returned no pornographic results. It wasn't until I added supporting terms like "porn" or "nude" that I got hits in that direction.