Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk. - The New York Times
Microsoft chatbot is taught to swear on Twitter - BBC News
Microsoft's AI Twitter Bot That Went Racist Returns ... for a Bit
Microsoft apologizes for hijacked chatbot Tay's 'wildly inappropriate' tweets | TechCrunch
Microsoft briefly reinstates Tay – the Twitter AI that turned racist in 24 hours
What went wrong with Tay, the Twitter bot that turned racist? - Kavita Ganesan, PhD
The Saga of Twitter Bot Tay | Snopes.com
Microsoft's AI chatbot, TayTweets, suffers another meltdown | CBC Radio
HuffPost Tech on Twitter: "Microsoft's chat bot "Tay" went on a racist Twitter rampage within 24 hours of coming online https://t.co/znOQ7ubTBN https://t.co/THwECB7gna"
Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day - The Verge
Microsoft's Tay AI chatbot goes offline after being taught to be a racist | ZDNET
Microsoft AI reactivates, promptly becomes druggie | The Times of Israel
Microsoft artificial intelligence 'chatbot' taken offline after trolls tricked it into becoming hateful, racist
Microsoft's Tay is an AI chat bot with 'zero chill' | Engadget
Tay (chatbot) - Wikipedia
Tay the 'teenage' AI is shut down after Microsoft Twitter bot starts posting genocidal racist comments that defended HITLER one day after launching | Daily Mail Online
Microsoft exec apologizes for Tay chatbot's racist tweets, says users 'exploited a vulnerability' | VentureBeat