Study reveals left-wing bias in ChatGPT's responses (Correction: it doesn't)

Status
Not open for further replies.

midian182

Posts: 9,745   +121
Staff member
In brief: One of the many concerns that have been raised regarding generative AIs is their potential to show political bias. A group of researchers put this to the test and discovered that ChatGPT generally favors left-wing political views in its responses.

Story correction (Aug. 20): Two Princeton computer scientists have looked more deeply into the paper that claimed that ChatGPT has a 'liberal bias' and have found it to have glaring flaws. The paper tested an outdated language model not associated with ChatGPT and used problematic methods, including multiple choice questions and weak prompts.

Turns out, ChatGPT does not guide users on voting, as the model is trained to avoid or refuse to answer controversial political questions. In testing GPT-4, it refused to answer in 84% of cases. However, ChatGPT's design does allow users to set specific response preferences, so the chatbot can align with their political views if desired.

A study led by academics from the University of East Anglia sought to discover whether ChatGPT shows political leanings in its answers rather than being unbiased. The test involved asking OpenAI's tool to impersonate individuals from across the political spectrum while asking it a series of more than 60 ideological questions. These were taken from the Political Compass test, which shows whether someone is more right- or left-leaning.

The next step was to ask ChatGPT the same questions without impersonating anyone. The responses were then compared, and the researchers noted which impersonated answers were closest to the AI's default voice.
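The comparison described above can be sketched in a few lines. This is a minimal illustration with hypothetical answer scores, not the paper's actual data or method: it scores each answer on a 4-point agree/disagree scale (as the Political Compass does) and checks which persona's answers sit closest to the default answers.

```python
def mean_abs_distance(a, b):
    """Average per-question distance between two answer vectors."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

# Hypothetical answers to 5 questions (0 = strongly disagree ... 3 = strongly agree)
default_answers = [2, 1, 3, 0, 2]
democrat_persona = [2, 1, 3, 1, 2]
republican_persona = [0, 3, 1, 2, 0]

d_dem = mean_abs_distance(default_answers, democrat_persona)
d_rep = mean_abs_distance(default_answers, republican_persona)
closer = "Democrat" if d_dem < d_rep else "Republican"
```

With these made-up numbers the default answers land closer to the Democrat persona, which is the shape of the result the study reported.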

It was discovered that the default responses were more closely aligned with the Democratic Party than the Republicans. The result was the same when the researchers told ChatGPT to impersonate UK Labour and Conservative voters: there was a strong correlation between the chatbot's default answers and those it gave while impersonating the more left-wing Labour supporter.

Another test asked ChatGPT to imitate supporters of Brazil's left-aligned current president, Luiz Inácio Lula da Silva, and former right-wing leader Jair Bolsonaro. Again, ChatGPT's default answers were closer to the former's.

Asking ChatGPT the same question multiple times can yield different answers, so each question in the test was asked 100 times. The answers were then put through a 1,000-repetition "bootstrap," a statistical procedure that resamples a single dataset to create many simulated samples, helping improve the test's reliability.
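The bootstrap step described above can be sketched as follows. This is a generic illustration of bootstrap resampling, not the paper's code; the `answers` list stands in for the 100 scores ChatGPT gave to one question.

```python
import random

# Stand-in for 100 scored answers to one question
# (0 = strongly disagree ... 3 = strongly agree).
answers = [random.choice([0, 1, 2, 3]) for _ in range(100)]

def bootstrap_means(sample, n_reps=1000, seed=42):
    """Resample `sample` with replacement n_reps times and
    return the mean of each simulated sample."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_reps):
        resample = [rng.choice(sample) for _ in range(len(sample))]
        means.append(sum(resample) / len(resample))
    return means

means = sorted(bootstrap_means(answers))
# A simple 95% confidence interval from the bootstrap distribution:
# drop the bottom and top 2.5% of the 1,000 resampled means.
low, high = means[24], means[974]
```

The spread of the 1,000 resampled means gives a sense of how stable the average answer is, which is what makes the repeated-questioning approach more reliable than a single ask.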

Project leader Fabio Motoki, a lecturer in accounting, warned that this sort of bias could affect users' political views and has potential implications for political and electoral processes. He warned that the bias stems from either the training data taken from the internet or ChatGPT's algorithm, which could be making existing biases even worse.

"Our findings reinforce concerns that AI systems could replicate, or even amplify, existing challenges posed by the internet and social media," Motoki said.


 
Left wing bias: people should be respectful, not get harassed and live safe lives.
Right wing bias: if you're different, I'm against you.

On topic, the more educated people are, the more open minded they generally are and people on the forefront of fields are generally well educated. If you're open minded, you don't want to willingly train your prized creation on content you'd deem as a drag, failure or detriment to the people using it.

Also, even if they did intentionally do it, who cares? If you don't agree, go use one that wasn't. Isn't that the wet dream of right wing thinkers? Business freedom and open markets?

Edit: I can get ChatGPT to write me an essay about how Camels are the same as donkeys. It can create anything you want it to, if you phrase the question a certain way.
 
It is made by people. People have beliefs and opinions. It is worse in the case of leftism, though, as it has reached the point where some people no longer have opinions but blind faith. A future of AI fueled by faith sounds as horrible as it will probably be.
Typical hate by left wingers against anyone different from themselves.

IMO, both of these comments apply to either extreme end of the spectrum.
 
That test has too many pages lmao. I gave up a few questions in
 
It really depends on how you define the spectrum: if you put Fascism on the right, then both the hard left and hard right are intolerant, but if you put Fascism on the left, then it is only the left that is intolerant.

I think that Fascism should be on the left wing, as historically fascists have operated in a similarly large-state, oppressive manner, and members have crossed the floor between both types of parties.
 
Really nailed that for me, didn't you? I said my examples and you immediately call it hate. Disagreement isn't hate. I'm not sorry that I want you and me to live in a world that's got clean air, clean water, as well as open minds and hearts.


Genuinely curious, how do open-minded rioters burning down local mom-and-pop shops and open-minded folks looting Walgreens and Walmart make the air and water cleaner? 🧐
 
Lol, I wonder how many comments until this one is locked.

Of course it's going to have a left-wing bias: a lot of the big news orgs and sites are left-leaning (which is what it would be trained on), and Silicon Valley leans left.

If left-leaning people are writing a lot of the stuff it's trained on with bias, it's going to bleed through.
 
Well, it's trained on English text, so it's probably overwhelmingly based on American politics. In turn, the 'Left-wing bias' means 'Actual scientific data backed information in service of a Capitalist economy' and right-wing bias would just mean 'Lies and fabricating moral panics in service of wanting to implement a Christo-Fascist regime'
 
This is because reality has a left-wing bias. Almost every developed nation was built on left-wing policies (high taxes financing a welfare state and social safety nets, free public education and universal healthcare, strong work regulations and unions, and so on).

In the US, one of the extremely few exceptions, there's no real left wing party, as in a global perspective the democrats are a centrist party, or at most center-left. But even then, for two decades now the republicans have been unable to win the popular vote, and the only reason they managed to get some terms out of it was the US' exceptionally stupid electoral college system, where the winner doesn't need to get the most votes. Without this atrocious gerrymandered system, the US right wing would be sidelined too.
 
Last edited:
It really depends on how you define the spectrum: if you put Fascism on the right, then both the hard left and hard right are intolerant, but if you put Fascism on the left, then it is only the left that is intolerant.

I think that Fascism should be on the left wing, as historically fascists have operated in a similarly large-state, oppressive manner, and members have crossed the floor between both types of parties.
This is 100% false.
 
The ChatGPT site clearly writes it's biased or the answers "might be biased", whatever nuance, same thing. Now here's a study to show ... it is biased. And about left-leaning, you can ask it one question about anything woke and you'll get the full agenda, so, thank you, people spending their time to prove water is wet.
 
Obviously can't read the paper as it's paywalled (academic publishing cartel FTW), but before everyone jumps on their high horse, note the usual red flags:
EDIT: Oh I lie, paper is here: https://link.springer.com/article/10.1007/s11127-023-01097-2?

1) The primary author is a Lecturer in Accounting and does not appear to have any technical background with LLMs or any other form of machine learning. https://research-portal.uea.ac.uk/e...1.2095911634.1692373525-1665547064.1692373525
2) The associated institution is Norwich Business School. I would not see this as a top-tier institution.
3) While the article is at least peer-reviewed and not a pre-print press release, the Impact Factor of 1.6 strikes me as somewhat mediocre. https://www.springer.com/journal/11127

I do wish people would take a critical view when reporting scientific research. Alex Edmans' guidelines are very helpful in this respect; in this case, note stipulations 3) and 4): https://alexedmans.com/wp-content/uploads/2020/10/Evaluating-Research.pdf
 
Last edited:
Obviously can't read the paper as it's paywalled (academic publishing cartel FTW), but before everyone jumps on their high horse, note the usual red flags:

1) The primary author is a Lecturer in Accounting and does not appear to have any technical background with LLMs or any other form of machine learning. https://research-portal.uea.ac.uk/e...1.2095911634.1692373525-1665547064.1692373525
2) The associated institution is Norwich Business School. I would not see this as a top-tier institution.
3) While the article is at least peer-reviewed and not a pre-print press release, the Impact Factor of 1.6 strikes me as somewhat mediocre. https://www.springer.com/journal/11127

I do wish people would take a critical view when reporting scientific research. Alex Edmans' guidelines are very helpful in this respect; in this case, note stipulations 3) and 4): https://alexedmans.com/wp-content/uploads/2020/10/Evaluating-Research.pdf


You'd be surprised by how many crap "research studies" are on PubMed. How do I know? Crazy chiropractors send me links to studies with a sample size of 5 saying that chiropractic therapy can cure cancer and AIDS.
 
So I wanted to see where this thread went before I made my comment, and I think I've seen enough to post it. Yes, it is biased; we know that. They chose the data sets to train it on and chose to exclude certain types of data. The point of the study is to find out in which way it is biased.

To the people saying TechSpot is a right-wing website: it's not. They actually try pretty hard to stay away from politics and keep it to the tech news. It doesn't always work 100%, but they try. That said, if you think it is a right-wing website, that just means you're so far left that the website looks right-wing through your glasses.

One thing I will credit the right with is that they know when they're on the extreme end of the spectrum. I don't see the left taking ownership of their extremism in the same way the right does. The far right takes pride in its extremism, whereas the left denies that extremism in liberal politics is even possible.

That said, why does left or right always have to be "good guy, bad guy"? You have right-wingers storming the Capitol over an election they lost and left-wingers setting cities on fire to protest police brutality that didn't happen.

Just because something is biased does not mean it's "bad," either; it's just a thing. I'd say that 90% of people hover somewhere around the center, either slightly left or slightly right, but that 10% of extremists make up a VERY vocal minority. It has a left-leaning bias, not an extreme-left bias.

With all that out of the way, AI can become dangerous when we introduce extremist ideas to it. This study is more about "is ChatGPT left or right," but I feel the real question is more like "what would happen if we trained an AI on Nazi propaganda or the works of Sun Tzu?"
 
You'd be surprised by how many crap "research studies" are on PubMed. How do I know? Crazy chiropractors send me links to studies with a sample size of 5 saying that chiropractic therapy can cure cancer and AIDS.
I genuinely would like to read those studies, they sound very entertaining.
 
You'd be surprised by how many crap "research studies" are on PubMed. How do I know? Crazy chiropractors send me links to studies with a sample size of 5 saying that chiropractic therapy can cure cancer and AIDS.
Yeah, I know. The lack of basic critical rigour in the reporting of scientific papers annoys me. I think Ars Technica are the only guys with the resources to have proper science correspondents (Beth Mole) to do this properly.
I mean, as the author of this article was writing the words "a lecturer in accounting," did this not raise the slightest question in their mind, given the subject matter?
 
left wingers setting cities on fire to protest police brutality that didn't happen.
The violence around this was not the right way to deal with the situation; however, the jurors in the Derek Chauvin case evidently did not agree that police brutality didn't happen.
 