by Andrew Lam, New America Media
Just over a year ago Microsoft introduced Tay, an AI chatbot that was designed to learn from and replicate online chatter. Tay, according to Business Insider, “responds to users’ queries and emulates the casual, jokey speech patterns of a stereotypical millennial.”
But within 24 hours, Tay was gone, the casualty of an online universe of hate and bigotry that is now shaping America’s political and social landscape.
“bush did 9/11” and “hitler would have done a better job than the monkey we have now.” That’s just a sampling of Tay’s choicest quips.