France likely to use AI-powered surveillance at Paris Olympics despite outcry

midian182

A hot potato: In a rare (these days) story about artificial intelligence of the non-generative type, France's National Assembly has approved the use of AI to aid in video surveillance of the 2024 Paris Olympics. The move comes despite opposition from rights groups, who say its use potentially violates civil liberties and paves the way for the future use of invasive algorithm-driven video surveillance across Europe.

As The Register reports, the French government adopted Article 7 of the pending law for the 2024 Olympic and Paralympic Games, authorizing the automated analysis of surveillance video from fixed and drone cameras.

The system is said to detect specific suspicious events in public spaces, including abnormal behavior, pre-determined incidents, and crowd surges.

While the AI-powered surveillance plan could still be challenged before the country's highest constitutional court, France looks on track to become the first country in the European Union to use such a system.

It appears that France ignored the warnings of 38 civil society organizations that expressed their concerns over the technology in an open letter. They say the proposed surveillance measures violate international human rights law because they contravene the principles of necessity and proportionality and pose unacceptable risks to fundamental rights, such as the right to privacy, the freedom of assembly and association, and the right to non-discrimination.

The letter warns that should the AI system be adopted, it would set a precedent of unjustified and disproportionate surveillance in publicly accessible spaces.

"If the purpose of algorithm-driven cameras is to detect specific suspicious events in public spaces, they will necessarily capture and analyze physiological features and behaviors of individuals present in these spaces, such as their body positions, gait, movements, gestures, or appearance," the open letter reads. "Isolating individuals from the background, without which it would be impossible to achieve the aim of the system, will amount to 'unique identification.'"

As is often the case with AI surveillance, there are also discrimination fears. "Using algorithmic systems to fight crime has resulted in over-policing, structural discrimination in the criminal justice system, and over-criminalization of racial, ethnic, and religious minorities," the groups add.

Mher Hakobyan, Amnesty International's advocacy advisor on AI regulation, said the decision puts France at risk of permanently transforming into a dystopian surveillance state.

France's data protection regulator, the Commission Nationale de l'Informatique et des Libertés (CNIL), backed the bill on the condition that no biometric data is processed, but privacy advocates do not believe that is possible.

Daniel Leufer, a policy advisor at digital rights organization Access Now, said, "You can do two things: object detection or analysis of human behavior - the latter is the processing of biometric data."

Masthead: Henning Schlottmann


 
The government does not care what you think.

The government is not your friend.

The government WILL try to stomp you, and your rights, at every opportunity.

Apparently we need to keep reminding people about this. If you don't want this kind of thing to happen, you need bills that completely outlaw the use of AI in surveillance systems, backed by extremely harsh penalties that are properly enforced on those who break the law.
 
Maybe the AI will argue with its operators about whether it actually sees someone or not. 🤣
 
We're already burning the country over the retirement age; for this, you'll have to wait a little ... :p
France is being destroyed from within, and the current protests serve the same purpose. When will people understand that protests are useless? Macron and his bosses won't leave just because people kindly ask them.
 
If those cameras start actually tagging criminals, it may show they are biased, because most of the criminals will come from the ranks of the so-called "victims", people who may not be called criminals anymore.

So, the next step will be to train the neural networks to ignore anyone who looks like a "victim" and only ID people who look like standard default "oppressors". Let's wait and see if I'm right.
 