Racist, Sexist, Bigot: Microsoft’s AI “Tay”

At the end of March, Microsoft released an Artificial Intelligence (AI) account on Twitter named “Tay.” Tay was designed as “a chatbot created for 18- to 24-year-olds in the U.S. for entertainment purposes,” wrote Peter Lee, Microsoft’s vice president of corporate research, on the company’s official blog. Tay was modeled after XiaoIce, a wildly popular chatbot in China, and Microsoft hoped to replicate that success with a Western audience. Like many other modern AI experiments, including her counterpart XiaoIce, Tay was an artificial neural network, a rudimentary AI system modeled loosely on the structure of the human brain. Essentially, Tay was programmed to receive input, learn from it, and produce her own tweets in the process. Any Twitter user could tweet at her, and she would offer a response.…
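To make the failure mode concrete, here is a minimal sketch of a chatbot that learns from every message it receives. This is a toy Markov-chain model, not Tay’s actual architecture (which Microsoft has not published in detail), but it illustrates the same vulnerability: whatever users feed the bot becomes its training data, hostile input included.

```python
import random
from collections import defaultdict

class ToyChatbot:
    """A toy Markov-chain 'chatbot' (illustrative only, not Tay's design)
    that learns word-to-word transitions from every message it receives."""

    def __init__(self, seed=0):
        # word -> list of words observed immediately after it
        self.transitions = defaultdict(list)
        self.rng = random.Random(seed)

    def learn(self, message):
        """Update the model from one incoming message -- with no filtering."""
        words = message.lower().split()
        for current, nxt in zip(words, words[1:]):
            self.transitions[current].append(nxt)

    def reply(self, prompt_word, max_words=5):
        """Generate a reply by walking the learned transitions."""
        word = prompt_word.lower()
        out = [word]
        for _ in range(max_words - 1):
            options = self.transitions.get(word)
            if not options:
                break
            word = self.rng.choice(options)
            out.append(word)
        return " ".join(out)

bot = ToyChatbot()
bot.learn("cats are wonderful")
bot.learn("cats are terrible")  # hostile input poisons the model equally
print(bot.reply("cats"))        # may echo either learned sentiment
```

Because the model weighs every message the same, a coordinated group of users can quickly dominate its learned transitions, which is essentially what happened to Tay at Internet scale.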

Link to Full Article: Racist, Sexist, Bigot: Microsoft’s AI “Tay”

