Microsoft Is Sorry for That Whole Racist Twitter Bot Thing

“Tay is now offline”

A senior Microsoft executive is apologizing after the company’s artificial intelligence chatbot experiment went horribly awry. Microsoft’s software, called “Tay,” was designed to interact with Twitter users in part by impersonating them. But online pranksters quickly realized they could manipulate Tay into sending hateful, racist messages. Microsoft pulled Tay offline just a few hours after it launched Wednesday morning.

Peter Lee, Corporate Vice President at Microsoft Research, posted the following apology and explanation on a company blog:

“As many of you know by now, on Wednesday we launched a chatbot called Tay. We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay. Tay is now offline…”

Link to Full Article: Microsoft Is Sorry for That Whole Racist Twitter Bot Thing
