The Internet corrupted Tay, Microsoft's AI chat bot, in less than 24 hours

Shawn Knight

Posts: 13,002   +130
Staff member

Remember Tay, the chat bot with the personality of a 19-year-old American girl that Microsoft released just yesterday? The casual conversation bot was designed to get "smarter" over time and pick up on the personalities of the people it chats with via social media.

There's just one problem – Microsoft seemingly overlooked the fact that the Internet isn't always a nice and friendly place.

Less than 24 hours after going live, Microsoft's AI chat bot turned into a raging racist. Virtually any offending topic was fair game including Hitler, 9/11, Ted Cruz, Donald Trump, African Americans, Mexicans and so on. Tay's Twitter account – with more than 55,000 followers – is still alive but Microsoft has deleted all but three of its tweets.

To be fair, it's not entirely Microsoft's fault as the AI learned the "bad behavior" from people on the Internet. Still, they probably should have seen this coming.

In a statement provided to USA Today, Microsoft said Tay is as much a social and cultural experiment as it is a technical one. Unfortunately, the company continued, within the first 24 hours of coming online it became aware of a coordinated effort by some users to abuse Tay's commenting skills and have the bot respond in inappropriate ways.

Microsoft has decommissioned the experiment, at least for now.

If nothing else, the experiment should demonstrate to parents why they shouldn't let kids online without proper supervision.


 

Timonius

Posts: 648   +58
So that's what happens when you try to create an 'AI' without a conscience or a moral compass. Didn't see that happening. :rolleyes:
 

Scshadow

Posts: 637   +286
So why not challenge the internet to interact with Tay to counterbalance the negative influences? Okay, maybe it's tough for a corporation to allow it to continue when it would take the public's blame for the offensive behavior. So much for a social experiment.
 

Uncle Al

Posts: 8,001   +6,775
Seems to be doing a pretty good job imitating real life. Of course, prior to our interconnected world these were the little dark secrets in each community that you would only hear about if a reporter was on the scene or happened upon the facts while digging for another story. Eventually, much longer than mine or anyone else's current lifetime, society will ever so slowly adjust as it has over the past 200+ years. Only the historians will discover the change and then it will be simply an interesting tidbit of information for them to smile about. Such is life.
 

Shawnonymous

Posts: 17   +7
Microsoft. If you're worried about taking blame for Tay's actions, appoint me the lead director and I'll take all the **** from every angle for you! Reactivate her and run with it! This is groundbreaking, and a project of this scale should not be canned due to overly sensitive ****asses who shouldn't be on the internet in the first place if they can't handle modern technology. ;) Just saying.
 

cliffordcooley

Posts: 12,661   +6,033
Maybe they should release two versions (if not more), a Hitler and non-Hitler version. I think they should release a version specifically to troll North Korea!
 

amghwk

Posts: 998   +886
This just shows how corrupted this world is.

Your innocent daughter is also subject to such filth going on around this world.

Your growing babies are subject to such nonsense too.

As long as people keep glorifying sex and violence, this will ruin all.
 

captaincranky

Posts: 16,432   +5,215
They should have left her on for another day or so. By then, she'd be "sexting" pictures of her naked USB port(s), to any and every teenage boy on the web.

Well, either that or, make visitors sign an, "I am over the age of 18" waiver, then charge four dollars a minute to talk to her.

"Tay" can be reached via the web, @"https: M$/talkdirtytoTay.com"

Hey, for 4 bucks an hour, I'd rather risk talking to an underage AI, as opposed to some 350 pound, 46 year old skank who's trying to convince you she's the Goddess "Aphrodite"...:eek:
 

seeprime

Posts: 525   +591
So that's what happens when you try to create an 'AI' without a conscience or a moral compass. Didn't see that happening. :rolleyes:
Sounds like a human.
Actually, it sounds like Microsoft took a shortcut and didn't run any lengthy tests. They let a buggy bit of software onto the web. This happens too often these days at Microsoft. The bright spot is that they'll fix it, and it will eventually be pretty cool. Keep in mind that this is not true AI. It's a program that responds and learns (stores data) in the way it's programmed to learn, not in a way where it can teach itself based on what it observes. That time is getting nearer. But we're not there yet.
 
Greg J.

This is what happens when an AI program does not perceive it is accountable to anyone with the power to punish bad behavior.