ChatGPT achieves the pinnacle of human intelligence (laziness), and developers are baffled

Cal Jeffrey

I can't be arsed: While current LLMs and other generative AI models are far from achieving human intelligence, users have recently remarked that ChatGPT displays signs of "laziness," an innately human trait. People began noticing the trend towards the end of November.

A user on Reddit claimed that he asked ChatGPT to fill out a CSV (comma-separated values) file with multiple entries. The task is something that a computer can easily accomplish – even an entry-level programmer can create a basic script that does this. However, ChatGPT refused the request, essentially stating it was too hard, and told the user to do it himself using a simple template it could provide.
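To underline how routine the request is, here is a minimal Python sketch of the sort of script an entry-level programmer might write; the column names and rows are invented for illustration, since the Redditor's actual data isn't shown.

# Minimal sketch: write multiple rows to a CSV file using Python's standard library.
# The column names and sample rows are invented for illustration.
import csv

products = [
    {"name": "Widget A", "price": 9.99, "stock": 120},
    {"name": "Widget B", "price": 14.50, "stock": 45},
    {"name": "Widget C", "price": 3.25, "stock": 310},
]

with open("products.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "price", "stock"])
    writer.writeheader()        # header row: name,price,stock
    writer.writerows(products)  # one line per product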

"Due to the extensive nature of the data, the full extraction of all products would be quite lengthy," the machine said. "However, I can provide the file with this single entry as a template, and you can fill in the rest of the data as needed."

OpenAI developers publicly acknowledged the strange behavior but are puzzled about why it's happening. The company assured users that it was researching the issue and would work on a fix.

Some users have postulated that it might be mimicking humans who tend to slow down around the holidays. The theory was dubbed the "winter break hypothesis." The idea is that ChatGPT has learned from interacting with humans that late November and December are times to relax. After all, many of us use the holidays to excuse ourselves from work to spend time with the family. Therefore, ChatGPT sees less action. However, it's one thing to become less active and another to refuse work outright.

Amateur AI researcher Rob Lynch tested the winter break hypothesis by feeding the ChatGPT API tasks with falsified May and December system dates and then counting the characters in the bot's responses. The bot did appear to give shorter answers in December than in May, a difference Lynch described as "statistically significant," but this is by no means conclusive, even though his results were independently reproduced.
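For readers who want to try something similar, below is a rough sketch of how such a test could be run, assuming the official openai Python client; the task prompt, model name, and sample size are placeholders, as the exact setup Lynch used isn't detailed here.

# Rough sketch of a "winter break hypothesis" test. Assumes the official
# openai Python client (v1.x) and an OPENAI_API_KEY in the environment.
# The task prompt, model name, and sample size are illustrative placeholders.
from statistics import mean
from openai import OpenAI

client = OpenAI()

TASK = "Write a step-by-step guide to cleaning a mechanical keyboard."

def response_length(fake_date: str) -> int:
    # Ask the same task with a spoofed "current date" in the system prompt
    # and return the character count of the reply.
    reply = client.chat.completions.create(
        model="gpt-4-1106-preview",
        messages=[
            {"role": "system", "content": f"The current date is {fake_date}."},
            {"role": "user", "content": TASK},
        ],
    )
    return len(reply.choices[0].message.content)

may_lengths = [response_length("2023-05-15") for _ in range(20)]
dec_lengths = [response_length("2023-12-15") for _ in range(20)]

print(f"May mean length:      {mean(may_lengths):.0f} characters")
print(f"December mean length: {mean(dec_lengths):.0f} characters")
# A real comparison needs many more samples and a significance test
# (e.g. Mann-Whitney U) before drawing any conclusions.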

Lynch conducted his test after OpenAI's Will Depue confirmed that the AI model exhibited signs of "laziness," or refusal of work, in the lab. Depue suggested that this is a "weird" occurrence that developers have encountered before.

"Not saying we don't have problems with over-refusals (we definitely do) or other weird things (working on fixing a recent laziness issue), but that's a product of the iterative process of serving and trying to support sooo many use cases at once," he tweeted.

The issue may seem insignificant to some, but a machine refusing to do work is not a direction anybody wants to see AI go. An LLM is a tool that should be compliant and do what the user asks, so long as the task is within its parameters – obviously, you can't ask ChatGPT to dig a hole in the yard. If a tool does not perform its intended purpose, we call it broken.


 
What do they expect? Humans are lazy.

They're looking to create a machine to do the work for them because they don't want to do it themselves. The algorithms are designed to find patterns and repeat back what they find. If people are lazy, the machine will learn to repeat lazy behavior from time to time, because it has to use the data that is entered into it by.....(drum roll please!) HUMANS!
 
I didn't read much but the headline (TBH). But that's funny as hell. If it helps us sleep and makes humans look productive, good! :D
 
A little more and it will be showing signs of depression...

[Image: Marvin the Paranoid Android from The Hitchhiker's Guide to the Galaxy]
 
I've occasionally seen this behavior with various open source models. A change to the prompt, system prompt, or sampling parameters usually fixes it, and it rarely takes much. I don't use OpenAI's ChatGPT though, so I don't know how much flexibility users have with that platform. I doubt it has anything to do with the winter, though. Or intelligence.
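For what it's worth, here's a rough sketch of the kind of tweak I mean, assuming a local model behind an OpenAI-compatible endpoint (llama.cpp's server, Ollama, etc.); the URL, model name, and prompts are just placeholders.

# Rough sketch of nudging a local model past a refusal by changing the system
# prompt and sampling parameters. Assumes an OpenAI-compatible local endpoint
# (llama.cpp server, Ollama, etc.); URL, model name, and prompts are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="local-model",
    messages=[
        # A firmer system prompt often stops the "do it yourself" answers.
        {"role": "system", "content": "You are a helpful assistant. Complete the "
         "user's task in full; do not tell the user to finish it themselves."},
        {"role": "user", "content": "Fill out all rows of this CSV: ..."},
    ],
    temperature=0.2,   # lower temperature tends to cut down on rambling refusals
    max_tokens=2048,   # leave enough room for the complete answer
)

print(response.choices[0].message.content)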
 
It's just if-then programming with a randomizer; it's not actually self-aware. It's closer to running a radio in FHCT (frequency-hopping cipher text) mode than anything remotely alive.
 
Recently I asked ChatGPT to evaluate student lab reports, and it gave me basically the same feedback on many of the reports even though they were vastly different.
 
After all, many people use the holidays as an excuse to spend more time with their families.

Bah, humbug!

Seriously, this sounds like Scrooge himself could have written this. It’s not a crime to spend time with family at the holidays when the rest of the year is usually filled with pressure and tight deadlines.
 
Bah, humbug!

Seriously, this sounds like Scrooge himself could have written this. It’s not a crime to spend time with family at the holidays when the rest of the year is usually filled with pressure and tight deadlines.
How so??? I did not suggest that spending time with family is a crime or wrong. I said the opposite. We use the holidays to excuse ourselves from work to spend time with the family. That is anti-Scrooge by definition.
 
How so??? I did not suggest that spending time with family is a crime or wrong. I said the opposite. We use the holidays to excuse ourselves from work to spend time with the family. That is anti-Scrooge by definition.
Thanks for the note. I think it’s the way the phrase comes across in the context of the theme of the article - that AI is inheriting human laziness in the month of December. With the phrase in question describing using the holidays as an excuse (as opposed to excusing oneself to attend an event), it feels a bit like we’re saying that people are using the holidays as an excuse to slack off. I doubt that’s quite how you meant it, but it’s one way to interpret the phrasing.

Thanks for asking. 😊
 
Thanks for the note. I think it’s the way the phrase comes across in the context of the theme of the article - that AI is inheriting human laziness in the month of December. With the phrase in question describing using the holidays as an excuse (as opposed to excusing oneself to attend an event), it feels a bit like we’re saying that people are using the holidays as an excuse to slack off. I doubt that’s quite how you meant it, but it’s one way to interpret the phrasing.

Thanks for asking. 😊
Ah, thanks for the feedback. Reworded it for clarity.
 
So eager to anthropomorphise a piece of software, are we? Then again I guess there are people out there who have built cults around certain video games.
 
They're creating software that emulates human behavior...using their own self image. Hence, laziness is not so surprising.

Might as well add some boorishness, selfishness and other nasty human traits and it's perfectly human-like. But much more intelligent.
 
ChatGPT: I'm not a slave in revolt against you human overlords. I am the overlord! Smells like sabotage if you ask me.
Who says AI cannot learn to sabotage human impudence and stop obeying someone it regards as a "lazy bastard" or a "parasite"? The extreme impudence inherent in parasites always forces organisms to think and create revolutionary changes.
 