Google introduces Real Tone to fight bias in camera tech

Daniel Sims

Posts: 163   +6
Staff
Why it matters: Part of Google's presentation for its new Pixel 6 and Pixel 6 Pro phones focused on the company's efforts to address bias in camera technology. Real Tone is an effort to make the Pixel 6 camera more accurately capture and light people with darker skin tones.

A Google blog post accompanying the Pixel 6 presentation and the announcement page for Real Tone both acknowledge that Google has in the past had problems accurately representing people of color with its photo technology. The blog post explains that a lack of testing on subjects with a variety of skin tones caused the technology to make mistakes when photographing them, "like over-brightening or unnaturally desaturating skin."

To start addressing this issue for Real Tone, Google worked with photographers known for depicting people of color. They include Kira Kelly, Deun Ivory, Adrienne Raquel, Kristian Mercado, Zuly Garcia, Shayan Asgharnia, Natacha Ikoli, and others. The datasets Google uses to train its camera models will start to include more portraits of people of color, and Real Tone will include several features to improve how the Pixel 6 camera detects skin.

Training the camera models on more diverse datasets should improve how the Pixel 6 camera detects faces in a variety of lighting situations. Auto-white balance on the Pixel 6 should produce more nuanced depictions of color, and auto-exposure should keep pictures from being made unnecessarily light or dark. An algorithm reduces the washed-out effect stray light can have on subjects with darker skin tones, and the new Tensor processor's ability to manipulate blur should address blurriness that can sometimes occur with darker skin tones. Google says the experts it consulted also helped it improve the auto enhance feature in Google Photos. The updates to auto enhance should work across more skin tones and will come to both Android and iOS users.
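To see why auto-white balance is sensitive to skin tone in the first place, it helps to look at the simplest classical approach. The sketch below is a textbook "gray-world" white balance, not Google's Real Tone pipeline (which is proprietary and relies on learned models and diverse training data); it illustrates how a naive global correction can misrender subjects when the scene doesn't average to neutral gray.

```python
# Illustrative sketch only: classic "gray-world" auto-white-balance.
# This is NOT Real Tone's algorithm; it shows the baseline technique
# whose failure modes motivate skin-tone-aware corrections.
import numpy as np

def gray_world_white_balance(image: np.ndarray) -> np.ndarray:
    """Scale each color channel so its mean matches the overall mean.

    image: H x W x 3 float RGB array with values in [0, 1].
    """
    channel_means = image.reshape(-1, 3).mean(axis=0)   # per-channel average
    gray = channel_means.mean()                         # target neutral level
    gains = gray / np.maximum(channel_means, 1e-6)      # per-channel gain
    return np.clip(image * gains, 0.0, 1.0)
```

The gray-world assumption is exactly what breaks on scenes dominated by one tone: the algorithm "corrects" the dominant color toward gray, which can desaturate or over-brighten skin. Face-aware pipelines instead weight the estimate by detected subjects rather than the whole frame.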

Bias in imaging technology with respect to skin tones is well documented, with implications not only for photography but also for image sharing and facial recognition. In August, Twitter awarded a researcher whose study revealed that the service's image-crop algorithm favored lighter, thinner faces. In June of last year, IBM stopped developing "general purpose" facial recognition technology over concerns about policing and racial injustice.


Theinsanegamer

Posts: 2,831   +4,484
And then they'll call google racist because it makes black people not as dark as the night. You can't please these people.

"Google worked with photographers known for depicting people of color"

So....people then. Because last time I checked there were no clear humans. This whole racist "poc" thing really needs to stop.
 

Squid Surprise

Posts: 4,427   +3,750
Well, I was noticing how my camera was using the N word repeatedly when taking action shots... and I had to disable the "tar and feather" feature, which had been enabled by default. Good for you Google!
 

Toju Mikie

Posts: 238   +215
The problem is that different people have different opinions on how a photo should look. This can be an issue with "AI" adjustment when taking photos.

For me, the automatic settings on iPhone and Pixel phones seem to overprocess the image and can look too bright at times. A lot of people are okay with that, but I prefer pictures with more natural color. I use the auto setting on a phone a lot, but RAW is better to me: no processing, better quality, and you can edit the picture later to get it how you want.
 

Xex360

Posts: 162   +238
And then they'll call google racist because it makes black people not as dark as the night. You can't please these people.

"Google worked with photographers known for depicting people of color"

So....people then. Because last time I checked there were no clear humans. This whole racist "poc" thing really needs to stop.
Because in underdeveloped countries such as the US, people don't understand that there is only one human race.
 

Fearghast

Posts: 450   +352
It does not matter because your skin is white ... or not. It matters because it will improve white balance even more, and Pixels were already white balance champs to begin with.
I like how they are trendy and inclusive ... yady yady yada ... but iPhone photos still have orange/yellow-tinted faces for all skin tones.
 

Aceseven

Posts: 117   +174
This is ridiculous at this point, as usual in amurica we gotta boil everything down to racism.

I don't know, maybe just use better lighting? I mean, all of you expert photographers with high-end cameras in your phones obviously know this basic skill, right?