(Lack of) privacy: Facial recognition technology has always been a source of heated debate surrounding privacy in the digital era. While many companies and governments have embraced the tech for purposes like law enforcement or machine learning, some have begun to push back against it lately. Multiple cities have banned their police departments from utilizing facial recognition tech, and now the Los Angeles Police Department is following suit, sort of.
The department has reportedly been taking advantage of software provided by the highly controversial facial recognition firm Clearview to track down criminals. According to a report from Buzzfeed News, more than 25 LAPD employees had performed nearly 475 searches as of "earlier this year," so officers have certainly gotten a decent amount of use out of the tech.
The trouble with Clearview's software, however, is that it uses images and content scraped from social media websites to build a database of faces that can be searched by its clients. That's where the controversy comes in: if given the choice to consent to this sort of scraping, we imagine most people would opt out.
Being added to a database that can and will often be used by law enforcement with no warning is a frightening prospect. Artificial intelligence is capable of making mistakes, after all, as we saw recently when an AI-powered sports camera operator mistook a lineman's bald head for a soccer ball.
Asserting its First Amendment rights, Clearview has stated in the past that it's completely free to perform this sort of scraping, and that social media platforms have no legal grounds to stop the process.
Legal or not, though, the LAPD seems to have had a change of heart: moving forward, it will be putting an indefinite "moratorium" on the use of all commercial facial recognition software.
Here's the catch, however: the police department can still use facial recognition technology; it just has to be in-house. As Buzzfeed News notes, a new policy proposal will allow the LAPD to use a "Los Angeles County system that relies on suspect booking images." That's still not an ideal situation for privacy proponents, we're certain, but it's definitely a step in the right direction.