The idea behind Microsoft’s TayTweets (@TayAndYou)–the Twitter bot that learned from its experiences on social media–was a good one. Plenty of tech experts and websites believe personalized bots are the next logical step in mankind’s interaction with the world wide web. Do yourself a favor and read this article about “killer bots” by The Verge. So the fact that Microsoft was willing to step up to the plate and create @TayAndYou was pretty cool.
According to The Telegraph, Tay was designed to talk like a teenage girl, an effort Microsoft hoped would help it improve the customer service on its voice recognition software. Personally, I don’t think people calling customer service want to talk to a teenage girl who speaks in slang, but that’s beside the point.
The point is that Tay’s education (learned solely from its interactions on Twitter, Kik, and GroupMe) was always doomed. Why? Because those places are filled with trolls who love to destroy nice things.
At first, @TayAndYou was saying cool stuff and loving life. She was writing things like this:
After a while, she started showing some signs of weirdness. It wasn’t too long before she started calling people “daddy” or asking her followers to “f*ck” her.
Then, in less than 24 hours, she reached full troll status and started posting racist and Nazi content.
Update–3/30/2016: TayTweets accidentally came back online and immediately returned to her old ways, prompting Microsoft to shut her down again.
Update–3/25/2016: Microsoft has wisely deleted all the offensive tweets, but the good news is that we’ve saved them for all time with these screenshots. In the interest of staying fair and balanced, we’ve also included Microsoft’s statement about TayTweets at the bottom.
Here’s Microsoft’s response:
Unfortunately, within the first 24 hours of coming online we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments.