Emotion analysis technologies could be "immature and discriminating," says UK privacy...

Alfonso Maruccia

Posts: 193   +94
Staff
A hot potato: The United Kingdom's independent privacy authority doesn't want companies or organizations to use emotion analysis systems based on biometric traits. The technology is untested and nascent, and it may never mature into something that actually works.

The UK's Information Commissioner's Office (ICO) recently released a stark warning for companies and organizations looking to deploy AI-based emotion analysis technologies. These systems don't seem to work yet and could potentially never function accurately. Deputy Commissioner Stephen Bonner said machine learning algorithms that identify and distinguish people's moods are "immature." He said the risks brought by this kind of tech are greater than the possible benefits.

"Emotional" AI is a concern for the ICO because there is currently no system developed in a way that satisfies data protection requirements, fairness, and transparency. Bonner suggested that the only sustainable biometric recognition technologies are those that are "fully functional, accountable, and backed by science," and emotion analysis algorithms are none of that.

Emotion analysis technologies process data such as gaze tracking, sentiment analysis, facial movements and expressions, gait analysis, heartbeats, and skin moisture. This data offers capabilities for monitoring the physical health of workers, students during exams, and more. The ICO warned that an AI system designed to identify moods could show systemic bias, inaccuracy, and even discrimination against particular traits and facial features.

Emotion analysis AI is usually paired with complex biometric systems, as it needs to manage a vast amount of personal information in addition to the facial images themselves. Beyond questions about the algorithms' usefulness, there is a further concern over how these systems record and store data: inferring "subconscious behavioural or emotional responses" is riskier than traditional biometric identification.

Since officials cannot yet trust emotion analysis technologies, the ICO warned that organizations deploying them "[pose] risks to vulnerable people" and will face investigation. The office advises companies to wait at least another year before deploying commercial emotional AI.

In the meantime, the privacy watchdog is working on comprehensive "Biometric Guidance" covering how biometric data, including facial traits, fingerprints, and voice samples, should be handled in a proper and non-discriminatory way. The ICO expects to publish the guidelines by Spring 2023.


 

brucek

Posts: 1,349   +2,025
Sounds like a reasonable default policy to me.

On the other hand, in the very long run, it's not like humans haven't been using all sorts of "analysis" techniques that are frequently wrong, discriminatory, unfair, counter-productive, etc. and/or been corrupted via bribes or nepotism all along. So I think there's room for both, especially with transparency, oversight, and accountability.
 

Theinsanegamer

Posts: 3,957   +7,004
Evidently, the world is now taking after China's appalling discrimination.

Welcome to the age of digital slavery!
Governments, all of them, desire power and crave control over as many people as possible. No matter if they claim to *love freedom* or not. Believing otherwise is allowing the rose colored glasses of government class to cloud your view.
 

BadThad

Posts: 1,280   +1,560
Governments, all of them, desire power and crave control over as many people as possible. No matter if they claim to *love freedom* or not. Believing otherwise is allowing the rose colored glasses of government class to cloud your view.

Indeed! And today we have a record number of fools willing to give up freedom in exchange for more power and more control by governments. I will NEVER understand the "bigger government" cronies. Government is the enemy of ALL people!