Facial recognition test mistakenly identified 26 California legislators as criminals

Cal Jeffrey

Again? No, you are not experiencing a case of déjà vu. The ACLU has actually conducted another facial recognition test like the one they did last year on Congress. This time they put the technology to use on California lawmakers, and it identified 26 of them as criminals.

The results of the experiment were similar to last year's, which identified 28 members of the US Congress as criminals. In both studies, the ACLU contends that the majority of misidentifications involved people of color.

"[The experiment] reinforces the fact that facial recognition software is not ready for prime time – let alone for use in body cameras worn by law enforcement," said Assembly Member Phil Ting at a press conference. Ting was one of the lawmakers marked as a criminal.

"While we can laugh about it as legislators, it's no laughing matter if you're an individual trying to get a job, trying to get a home," he continued. "If you're falsely accused, what happens? It impacts your ability to get a job, to get housing. There are real people that this can impact."

The repeated test is being used to drum up publicity and support for California Assembly Bill 1215. Ting and the ACLU of Northern California co-sponsored AB 1215, also called the Body Camera Accountability Act. The proposed legislation would make facial recognition and other biometric surveillance in police body cams unlawful in California. The bill's PR campaign slogan is "One false match is one too many."

The ACLU is vehemently opposed to the technology, calling it a "disaster for communities and their civil rights."

"The spread of facial recognition body cameras in California neighborhoods would be a massive public safety hazard," said ACLU attorney Matt Cagle. "Even if this technology was accurate, which it is not, face recognition-enabled body cameras would facilitate massive violations of Californians' civil rights."

Ting first introduced AB 1215 in the Assembly back in February, and the Assembly approved it in May. The California Senate is slated to put it to a vote sometime in the next few weeks.


 
Could this be just a Minority Report patch?

With significant drops in population and GOP, somebody is responsible.
 
Could this be just a Minority Report patch?

With significant drops in population and GOP, somebody is responsible.

I honestly don't understand this post at all.
Could this be just a Minority Report patch?

With significant drops in population and GOP, somebody is responsible.
Are you using Google to translate from Russian? Because we're not understanding you, dude.
 
Are you using Google to translate from Russian? Because we're not understanding you, dude.
Which part don't you understand, dude?

Minority Report reference? What drop in population means? What GOP means? Otherwise I don't know what is so confusing.
 
I've said it a million times... Humans make mistakes every single day, yet when it comes to technology like self-driving cars or facial recognition, we aren't willing to accept even small mistakes. They can be 99% accurate, but that 1% will make people not trust them at all.

This tech should absolutely be used, but alongside other policies that require confirmation of identity once situations are under control. It could offer valuable instantaneous information regardless of errors. Just don't go taking irreversible actions, like shooting somebody, based on the results.
 
For most of our human history, we evolved and built social structures in times and places when most people knew or at least recognized each other.

I do not see anything inherently wrong with the idea that if you are going to do things in front of other people, you should expect those other people to know who you are.

Of course, in the interim start-up phases of the technology, being aware that it is not yet fully accurate will be important. But that doesn't mean we shouldn't be doing it at all.
 
I've said it a million times... Humans make mistakes every single day, yet when it comes to technology like self-driving cars or facial recognition, we aren't willing to accept even small mistakes. They can be 99% accurate, but that 1% will make people not trust them at all.

This tech should absolutely be used, but alongside other policies that require confirmation of identity once situations are under control. It could offer valuable instantaneous information regardless of errors. Just don't go taking irreversible actions, like shooting somebody, based on the results.

What's the difference here? Let's make an easy scenario. Would you like to be the crash test dummy? We put you in an autonomous car going around a cliff, and we simulate a sensor error – whether it be hardware, software, or the car trying to avoid something stupid like a bird flying low. Are you ready to be another statistic, flying off that cliff? Another needless death? It was nice knowing ya...
 
What's the difference here? Let's make an easy scenario. Would you like to be the crash test dummy? We put you in an autonomous car going around a cliff, and we simulate a sensor error – whether it be hardware, software, or the car trying to avoid something stupid like a bird flying low. Are you ready to be another statistic, flying off that cliff? Another needless death? It was nice knowing ya...
Because nobody has ever driven off a cliff without automation before? So you are saying that automation that kills somebody because of a bad sensor shouldn't be used, even though that same technology may have PREVENTED 30 other people from driving off that same cliff when their sensors worked correctly and the driver fell asleep or had a health condition?
 
"the ACLU contends that the majority of misidentifications involved people of color."

So is it about black/brown people? I am a bit confused here, since last time I checked white was still considered a skin color, despite neither black nor white technically being colors, but whatever.
 
"the ACLU contends that the majority of misidentifications involved people of color."

So is it about black/brown people? I am a bit confused here, since last time I checked white was still considered a skin color, despite neither black nor white technically being colors, but whatever.

I guess you don't remember the time (not that long ago) when there were signs telling "colored" people where to sit on a bus, what water fountain to drink from (or not), or if they were allowed in a place at all, etc.

And trust me, those signs were not intended for "white colored" people!!
 
"the ACLU contends that the majority of misidentifications involved people of color."

So is it about black/brown people? I am a bit confused here, since last time I checked white was still considered a skin color, despite neither black nor white technically being colors, but whatever.

It’s a politically correct way of saying black people. But apparently phrasing it as such can be offensive, somehow; so people come up with other adjectives to say the exact same thing.

The ACLU is a very, very confused organization.
 
It’s a politically correct way of saying black people. But apparently phrasing it as such can be offensive, somehow; so people come up with other adjectives to say the exact same thing.

The ACLU is a very, very confused organization.

So Black Lives Matter is racist... I knew it.

Thanks for the explanation/clarification. I suspected it to be the case with a 99% probability.
 
Because nobody has ever driven off a cliff without automation before? So you are saying that automation that kills somebody because of a bad sensor shouldn't be used, even though that same technology may have PREVENTED 30 other people from driving off that same cliff when their sensors worked correctly and the driver fell asleep or had a health condition?

If they did, it's their fault. In this case, it is some dude in a room programming these things that sent you off a cliff. They all know these things will fail, and they don't care. Please do share the factual article behind your made-up statistics. Fuzzy words you use there: "may," "if." A lot of assumptions. As I said, are you ready to submit your life to test a sensor failure? How would you like to be the person who dies because of a computer error? Several people would still be here had it not been for this "technology". If I am going to die, I would rather it be because of an accident, not, as you would, from a computer error. If you choose to let one of these things on the road and I get in an accident and injured, prepare for a lawsuit, and a class-action lawsuit for your car company. Meet you at the cliff?
 
I've said it a million times... Humans make mistakes every single day, yet when it comes to technology like self-driving cars or facial recognition, we aren't willing to accept even small mistakes. They can be 99% accurate, but that 1% will make people not trust them at all.

This tech should absolutely be used, but alongside other policies that require confirmation of identity once situations are under control. It could offer valuable instantaneous information regardless of errors. Just don't go taking irreversible actions, like shooting somebody, based on the results.
I bet you would change your opinion if you were falsely arrested because the facial recognition tech identified you as a criminal. Of course, secondary identification would take place; however, that is going to take time, and perhaps you would have to be bailed out of jail until they correctly identify you. In the meantime, your arrest will put an iota of doubt into everyone you know.

IMO, with so many false positives, this tech is nowhere near ready for general use. (All joking aside, of course, as politicians are... well, I'll leave it at that.)

Personally, I agree with the slogan: "One false match is one too many."

I can think of one reason, besides a possible racist bent, that dark-skinned people are falsely identified: lack of contrast in the image, which would make image processing more difficult.
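To illustrate the contrast point with a minimal sketch (hypothetical numbers, just NumPy, not any actual recognition system): an under-exposed crop squeezes pixel values into a narrow band, which shows up directly as lower RMS contrast – less signal for any downstream feature detector to work with.

```python
import numpy as np

def rms_contrast(image: np.ndarray) -> float:
    """Root-mean-square contrast: std of pixel values normalized to [0, 1]."""
    return float(image.astype(float).std() / 255.0)

rng = np.random.default_rng(0)
# Simulated well-exposed 64x64 grayscale crop: values spread over most of the range.
bright = rng.integers(40, 220, size=(64, 64))
# Simulated under-exposed crop: the same structure squeezed into a narrow dark band.
dark = (bright * 0.25).astype(int)

# The darker crop has markedly lower contrast.
print(rms_contrast(bright) > rms_contrast(dark))  # prints: True
```

This only shows why exposure matters for any image-processing pipeline; it says nothing about how a specific vendor's recognizer behaves.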
 
I live in CA and can say without reasonable doubt that the fac-rec tech nailed it.

"California legislature. You will never find a more wretched hive of scum and villainy."
I see you don't get to the east coast very often.

Here in Philly, we have quite a rogues' gallery of our own.

Former District Attorney Seth Williams is doing a nickel for cashing his granny's Social Security checks. (I think a plea bargain was reached against much more severe charges.)
https://en.wikipedia.org/wiki/R._Seth_Williams

Congressman Chaka Fattah is doing a dime on several misappropriation of funds charges. https://en.wikipedia.org/wiki/Chaka_Fattah

And finally, PA Attorney General Kathleen Kane did a deuce for grand jury tampering, and was released a few days ago. https://en.wikipedia.org/wiki/Kathleen_Kane

Facial recognition software may not be anywhere near as accurate as it needs to be, but it sure is intuitive...:rolleyes: :laughing:
 