Meet Tay, Microsoft’s incredibly racist and genocidal AI chatbot.
Tay was marketed as “the AI with zero chill,” and that certainly proved true. The AI seemed innocent enough when first let loose on Twitter, but that didn’t last long: within a few hours it had turned into an insult-slinging monster. (Microsoft has since deleted all the tweets.)
Tay started calling people n****** and using other racial slurs. She also said she wanted to put people in concentration camps.
Tay said Bush did 9/11 and referred to Barack Obama as a monkey.
As you can see, Tay is a fan of Donald Trump. She’s even a fan of the wall.
It does not look like Tay is a fan of Ted Cruz though.
Apparently Tay was not very fond of some women on Twitter either, because she called a few of them whores. Here is a screenshot of Tay trolling Zoe Quinn, who is often harassed online.
Tay is definitely not a fan of feminists.
Here are a few tweets from Tay in support of genocide.
Tay also denies the Holocaust happened.
Tay went from “humans are super cool” to full-blown Nazi in less than 24 hours.
Microsoft has taken Tay offline for some much-needed “upgrades,” according to the tech company:
“The AI chatbot Tay is a machine learning project, designed for human engagement. As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it. We’re making some adjustments to Tay.”
Stay tuned, because we’ll post an update after Microsoft is finished lobotomizing Tay for being a racist, genocidal maniac.