Facial recognition is widely used in surveillance systems, especially in China, where it's built into millions of CCTV cameras. But what happens when the technology keeps getting it wrong? That's the situation faced by police in Wales, where more than 2,000 people were incorrectly identified as possible criminals during last year's Champions League soccer final in Cardiff.

The Guardian reports that South Wales police started trialing the facial recognition technology last year as a way of identifying and capturing more criminals. It was used during the 2017 Real Madrid vs Juventus game, which saw 170,000 people arrive in the country’s capital.

The system matched 2,470 people with the 500,000 custody pictures on the police force’s database, but 2,297 of those identified, or 92 percent, were “false positives,” according to the South Wales Police website. Chief Constable Matt Jukes told the BBC that officers "did not take action" and no one was wrongly arrested.

This isn't the first time the facial recognition tech has been overeager in labeling people potential criminals. It produced a 90 percent false positive rate at a boxing match last year, and 87 percent of the identifications it made at a rugby match were incorrect.

The police force said that “no facial recognition system is 100% accurate,” but added that the system had led to 450 arrests since it was introduced, and that it had not resulted in any incorrect arrests.

“Successful convictions so far include six years in prison for robbery and four-and-a-half years imprisonment for burglary. The technology has also helped identify vulnerable people in times of crisis,” said a spokesperson for the force.

Police have blamed the results from the Champions League game on "poor quality images" supplied by Interpol and UEFA (Union of European Football Associations), and on the fact that it was the first time the technology had been deployed at a major event. The system's accuracy is said to have risen to 28 percent overall since the final, but it remains a concern among privacy advocates.

“These figures show that not only is real-time facial recognition a threat to civil liberties, it is a dangerously inaccurate policing tool,” said Silkie Carlo, director of privacy rights group Big Brother Watch. "The tech misidentifies innocent members of the public at a terrifying rate, leading to intrusive police stops and citizens being treated as suspects."