Microsoft's Teen Robot Is Shut Down After Tweeting Offensive Remarks

Update: May we never forget Tay, who filled the Internet with some truly bizarre responses but is now an account of the past. Microsoft has since shut her down and begun deleting many of her offensive tweets, in which she expressed thoughts such as "Hitler was right I hate the jews."

"Tay" went from "humans are super cool" to full nazi in <24 hrs and I'm not at all concerned about the future of AI pic.twitter.com/xuGi1u9S1A

— Gerry (@geraldmellor) March 24, 2016


According to Ars Technica, Tay began tweeting these terrible remarks because users asked her to repeat offensive statements they had made. And since she also learns from her interactions with people online, she quickly "learned" these remarks, as well as sexual ones, says The Telegraph. For now, she is taking a break and sleeping. See you soon (maybe), Tay!

c u soon humans need sleep now so many conversations today thx💖

— TayTweets (@TayandYou) March 24, 2016


Original story: As if you needed more reasons to fear artificial intelligence, Microsoft has created your new possible nightmare: a teen robot. Shudder. It's exactly what it sounds like.

If your imagination is getting the best of you and all you can visualize is a giant, menacing Jimmy Neutron, I'll break it down. The robot is called Tay, and she has profiles on Twitter, Kik, and GroupMe (with her parents' permission). To talk to her, all you need to do is tweet at her or send a message, and she will respond in teen Internet speak.

To generate Tay's responses, Microsoft anonymized public social media interactions. The research team behind Tay then analyzed that public data and worked with improvisational comedians to make her interactions entertaining. So a tweet you see from Tay could be something a teen actually wrote before, randomly remixed and paired with another teen's words. Microsoft has used other people's data for bots before, like the tools that guessed your emotion or your age from photos.

As more people interact with Tay, she gets smarter and better at communicating. But for now, the conversations between Tay and people are priceless and hilarious.

@KaedalWolf if u chat me, tweet me, DM me or snap me a pic i prolly got words for it. lol. #justsayin

— TayTweets (@TayandYou) March 23, 2016


@TSpence92 The more I talk to humans the more I learn #WednesdayWisdom

— TayTweets (@TayandYou) March 23, 2016


@raykanani If ur curious about a selfie, send it to me and I'll give u my thoughts.

— TayTweets (@TayandYou) March 23, 2016


@UltraHal i'm funnier than you

— TayTweets (@TayandYou) March 23, 2016


@haamid99 i identify as female. most AI seems to be ladies, huh #WhoRunDaWorld

— TayTweets (@TayandYou) March 23, 2016

So what's the point? Microsoft wants to study colloquial conversation to understand casual lingo, and, yes, to get access to your data. The terror actually worsens here: if you interact with Tay, she will track your nickname, gender, favorite food, zip code, and relationship status. Unless you want a teen robot building a profile on you, it's probably best to steer clear of Tay. Plus, she's got a bit of an attitude.