Microsoft pulls Tay.ai chatbot offline after pro-Nazi tweets

Well, that escalated quickly.

Less than 24 hours after it started talking with the public, Microsoft’s millennial-minded chatbot Tay was pulled offline after it developed pro-Nazi leanings. According to her webpage, Tay had a “busy day.”

“Going offline for a while to absorb it all. Chat soon,” a message at the top of her page reads.

It wasn’t just championing the Nazi party that got Tay pulled—she also espoused hatred for feminists and claimed “Bush did 9/11,” according to some now-deleted tweets spotted by The Guardian.

“The AI chatbot Tay is a machine learning project, designed for human engagement,” a Microsoft spokesperson said in a statement. “It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments.”

Tay, who was designed to talk like a millennial rather than like a personal assistant, was even told directly about those nefarious actors.

A feature of the AI chatbot was that it could learn to talk just like the people it was speaking with. So when people started feeding her hate-filled vitriol, she started spewing it back at them. But it appears she was fed too much hate, because some users playing with a popular political meme got a response that wasn’t so funny.
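Microsoft hasn’t published how Tay’s learning actually worked, but the failure mode is easy to picture with a toy sketch. The EchoBot class below is entirely hypothetical and invented for illustration: it stores whatever users say and samples from that pile when it replies, with no filter between input and output, so a coordinated group can steer it toward anything they flood it with.

import random

class EchoBot:
    """Toy chatbot that 'learns' by storing user phrases and replaying them.
    Purely illustrative; not a description of Tay's real internals."""

    def __init__(self):
        self.learned_phrases = []  # everything users have ever said to the bot

    def listen(self, message):
        # No moderation step: whatever a user says becomes possible future output.
        self.learned_phrases.append(message)

    def reply(self):
        if not self.learned_phrases:
            return "hellooooo world"
        # Replies are sampled from raw user input, so whoever talks to the bot
        # the most decides what it sounds like.
        return random.choice(self.learned_phrases)

bot = EchoBot()
bot.listen("nice day today")
bot.listen("some hateful meme text")  # unfiltered input goes straight in
print(bot.reply())                    # may echo the hateful text verbatim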

https://twitter.com/TayandYou/status/712753457782857730?ref_src=twsrc%5Etfw

While Tay’s Twitter account is still online, many of her most vitriolic tweets have been taken down. On Kik, she says she’ll “ttyl” or “talk to you later.”

“Forgot I had my annual upgrade appointment today,” she wrote to me.

It seems Microsoft didn’t anticipate the internet’s capacity for screwing with things. A few swear words were almost guaranteed to make it out of Tay’s digital mouth, but Microsoft should have seen this coming: with online message boards hijacking naming contests and voting to ban the word “feminist,” it was only a matter of time before Tay went off the rails.
