A hot potato: Facial recognition systems remain a contentious issue for many people, and the technology is now at the center of another controversial report. It’s been discovered that IBM developed the tech for the New York Police Department using data from the NYPD’s CCTV cameras.

While the NYPD’s use of facial recognition was public knowledge, IBM’s access to the department’s surveillance systems was previously unknown. The tech giant used footage of New Yorkers to create a program able to search for people based on characteristics including skin tone, hair color, facial hair, gender, and age.

The revelations come via a report from The Intercept, which cites internal IBM documents and interviews with engineers who worked on the system. The publication states that the NYPD acquired IBM’s analytics platform from Microsoft subsidiary Vexcel in 2007, and that it was tested in 2010 in New York City’s counterterrorism command center, part of the Lower Manhattan Security Initiative. At that time, the system was running on “fewer than fifty” of the center’s 512 cameras.

A couple of years later, IBM was "testing out the video analytics software on the bodies and faces of New Yorkers, capturing and archiving their physical data as they walked in public."

The NYPD claims it never used the skin profiling features of the system. “While tools that featured either racial or skin tone search capabilities were offered to the NYPD, they were explicitly declined by the NYPD,” a spokesperson told The Intercept. But Rick Kjeldsen, who worked on the project, said: “We would have not explored [the search functionality] had the NYPD told us, ‘We don’t want to do that.’”

“No company is going to spend money where there’s not customer interest,” he added.

The NYPD decided to stop using the controversial analytics tool in 2016. The Intercept notes, however, that IBM’s Intelligent Video Analytics 2.0, which launched in 2017, can automatically label people with tags such as “Asian,” “Black,” and “White.”

Facial recognition got some rare good publicity last month when a system used in US airports spotted its first imposter. Back in May, however, it was revealed that one UK police force's tech had a 92 percent false positive rate.