(Lack of) privacy: It's no secret that privacy and anonymity are becoming a thing of the past in some parts of the world. This is especially evident in countries like China, where rampant, almost dystopian levels of surveillance are the norm. However, for better or worse, it seems some of that tech will be making its way to London soon.
As reported by The Guardian on Friday, London's Metropolitan Police force plans to begin using live facial recognition cameras on the city's streets in the not-so-distant future. The hope is that these cameras will help officers catch suspects or criminals on "bespoke 'watch list[s].'"
Police are primarily aiming to use the cameras to find violent or otherwise dangerous criminals, and the data of any innocent people whose faces are scanned will apparently be deleted "in seconds." According to The Guardian, the cameras are 70 percent effective at spotting wanted suspects, and 80 percent of people surveyed by police support their existence.
The Metropolitan Police say their systems have only falsely identified one individual out of a thousand as a suspect during testing, but we expect real-world results to vary.
As we here at TechSpot have reported in the past, this type of technology is far from perfect. We can't speak to the effectiveness of the Metropolitan Police's particular implementation of facial recognition, but other technologies in this sector, such as Amazon's "Rekognition," have a history of reporting false positives.
Even if the cameras in question boasted a 100 percent success rate, their deployment would still likely be met with stiff resistance from privacy advocates. Indeed, this news has already sparked several heated debates on social media. Many are asking themselves whether potentially compromising their anonymity is worth the added security that these facial recognition cameras could bring.
Whether you're for or against the use of facial recognition cameras throughout London's streets, we'd love to hear your thoughts on this news in the comments below.