Viral tweet prompts discrimination probe into Goldman Sachs

Bubbajim

TechSpot Staff
Staff member

At what point do we decide that decisions taken by machine learning algorithms are inherently discriminatory, if in fact they can be? It’s a question that is only going to become more pressing in the modern era as human brains are taken out of the decision-making process in more industries.

The latest allegations of algorithmic discrimination come from David Heinemeier Hansson, co-founder of Basecamp, who said in a series of tweets that he received twenty times the credit limit his wife did when applying for an Apple Card, despite there being no material differences in their creditworthiness.

Hansson’s original tweet was shared almost 5,000 times and garnered a lot of attention, including from the New York Department of Financial Services, which has confirmed that it will investigate whether the algorithm is inherently gender-biased.

A spokesperson for the NYDFS said, “The department will be conducting an investigation to determine whether New York law was violated and ensure all consumers are treated equally regardless of sex.” They continued, “Any algorithm that, intentionally or not, results in discriminatory treatment of women or any other protected class of people violates New York law.”

It appears that Hansson and his wife were not the only ones facing the issue. Numerous users replying to Hansson’s original post claim that they too suffered discrimination, including Apple co-founder Steve Wozniak, who wrote, “The same thing happened to us.”

Goldman Sachs, which backs the Apple Card, said in a statement, “Our credit decisions are based on a customer’s creditworthiness and not on factors like gender, race, age, sexual orientation or any other basis prohibited by law.”

Interestingly, though, Hansson’s wife had her credit limit adjusted as soon as the tweets went viral, suggesting there is room for change given the right PR disaster.


 

mbrowne5061

TS Evangelist
Is their income the same? No? Then shut up. But mah sexism.
If they're married and share accounts, then yes, their incomes are the same for all intents and purposes - because they pool their income and share it.

If a couple is married and shares all accounts, their credit scores should be nearly identical - and so should their credit limits.
 

Scshadow

TS Evangelist
So, tossing aside facts and going with a hypothetical: if a machine-learned algorithm was found to be discriminating against a certain group of people, is it racist or sexist? As a white male, I found I was discriminated against when I was getting car insurance at 16. They determined I was male and therefore a much higher risk than women... for some reason, but that was okay. Perfectly legal and accepted that I was in a high-risk category compared to women the same age. Soooo I'm pretty sure it's time to put some big girl pants on and accept that if a machine crunched some numbers and determined women are at an inherently higher risk of mishandling money, well then that isn't a problem. It's just an inconvenient truth. That, or give me back some of the premiums I paid. I'll accept that.
 

Zor Ven

TS Rookie
If they're married and share accounts, then yes, their incomes are the same for all intents and purposes - because they pool their income and share it.

If a couple is married and shares all accounts, their credit scores should be nearly identical - and so should their credit limits.
If they are essentially one unit, then perhaps the issue is not the wife but rather that the wife applied second and they were not willing to issue the same amount of credit to the second person. Have they tested, in a marriage of equals, what happens if the wife applies first? Does the husband get less credit when applying second?
 