Microsoft launches Tay, an AI chat bot that mimics a 19-year-old American girl

Shawn Knight


Microsoft has unveiled a new artificial intelligence-powered chat bot called Tay. Developed by Microsoft's Technology and Research team with help from the Bing team, Tay isn't a virtual assistant like Siri or Facebook's M. Instead, it's a true chat bot in the sense that it's meant to engage and entertain people.

Surely Microsoft didn't build an AI bot just to entertain people, right? Of course not. Tay is a research experiment in conversational understanding. Artificial intelligence and machine learning have advanced considerably over the past several years, but most platforms still struggle with things like language context and humor.

Like other AI systems, Tay – who has the personality of a 19-year-old American girl – gets smarter over time and picks up on your personality. She can tell jokes, play games with you, read your horoscope, comment on a photo and more. If you are a parent or grandparent who isn't hip to the cool lingo today's teens use, maybe some time with Tay can help.

Microsoft says Tay was created by mining relevant public data and by using AI and editorial content developed by a staff that included improvisational comedians (unfortunately, Microsoft didn't specify which comedians).

Do note that Tay will build a basic profile of its users that includes nickname, gender, favorite food, zip code and relationship status. Microsoft says the data and conversations you provide to Tay are anonymized and may be saved for up to a year to help improve the service.

Tay isn't a standalone app. Instead, it's being offered through some of the communication apps you may already be using. You can currently add Tay via Kik, GroupMe and Twitter. It also has its own Facebook account and is on Snapchat as "TayStories."


 
So is Cortana her mother?

And here's the father:
(image: Clippy)


"Do note that Tay will build a basic profile of its users that includes nickname, gender, favorite food, zip code and relationship status. Microsoft says the data and conversations you provide to Tay are anonymized and may be saved for up to a year to help improve the service."

"Anonymized."
 
Reminds me of the war3 chat bot made by some user that sat in the main chat channel for about 6 months in the early 2000s. It fooled most people. If you said your age, name, job, favorite food, color or any general piece of info, it would store it in its memory banks and use it later. If you said you liked leather jackets, a week later the bot would ask, "user, why do you like leather jackets?" or something to that effect.
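That "remember it and bring it up later" trick doesn't need much machinery. Here's a rough sketch of the idea in Python – purely illustrative, the class name, pattern and phrasing are made up, not how that war3 bot actually worked:

```python
import random
import re

class MemoryBot:
    """Toy chat bot that remembers "I like X" statements and asks about them later."""

    def __init__(self):
        self.memory = {}  # user -> list of remembered facts

    def hear(self, user, message):
        # Store anything that looks like "I like <something>"
        match = re.search(r"\bI like (.+)", message, re.IGNORECASE)
        if match:
            fact = match.group(1).strip(".!? ")
            self.memory.setdefault(user, []).append(fact)

    def bring_it_up_later(self, user):
        # Some time later, ask the user about a random thing they mentioned
        facts = self.memory.get(user)
        if facts:
            return f"{user}, why do you like {random.choice(facts)}?"
        return None

bot = MemoryBot()
bot.hear("user", "I like leather jackets")
print(bot.bring_it_up_later("user"))  # -> "user, why do you like leather jackets?"
```

Pretty much all the "personality" is a lookup table plus a delay, which is why it fooled so many people in a busy chat channel.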
 