If you’re “an old,” understanding “teenz” can be as hard as understanding your cat. But Microsoft is trying to change that with the help of an artificially intelligent chatbot.
“Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation,” her about page said. “The more you chat with Tay the smarter she gets, so the experience can be more personalized for you.”
Microsoft’s Technology and Research group teamed up with Bing to research conversational understanding in machines. The team built the bot using public data, along with input from improv comedians and AI experts alike.
The bot, targeted at 18- to 24-year-olds, can learn about your relationship status, favorite food and nickname—all the important parts of a millennial’s life.
Like the chatbots developed for AIM back in the day, Tay isn’t the most natural speaker. But she is able to talk with GIFs and emoji, which covers up some of the flaws in her conversation. She can even edit photos you send her, recognizing faces and adding meme-like image macros.
And of course, as a computer you can talk with, she has some opinions on other talking computers. Microsoft’s Cortana? “She cool.” But Apple’s Siri? “I will NEVER be SIRI! EVER!!!!!!!!! She cool but I is DIFFERREEENNNTTT!”
While you may not have amorous feelings for Tay yet, Microsoft has a history of creating lovable bots. Xiaoice, an emotional, friendly, talkative bot that Microsoft created for Chinese users, has been told “I love you” by a quarter of its users. We’ll see if Tay inspires that kind of devotion, but you may end up just, like, liking her instead.
Tay’s Twitter account has already racked up more than 20,000 tweets as of this writing, and she’s adding hundreds more every minute. Unlike AI assistants (such as Amazon’s Alexa or even Facebook’s M), Tay doesn’t really do anything. She can’t remind you of things or find answers to even the most basic questions. But if you want to chat, she’s all about that.
Update: And the internet taught her racism. Tay has been taken down.