What went so wrong with Microsoft’s Tay AI?

By now the world has heard about the rise and fall of Microsoft’s Tay, an artificially intelligent bot that lived on Twitter, Kik, and GroupMe. Tay’s goal was to learn and mimic the personality of a 19-year-old woman, and popular social networks among millennials seemed like a great place for it to learn. Unfortunately for Microsoft, the experiment quickly became an embarrassment: in less than 24 hours, Internet trolls manipulated Tay into becoming a racist potty-mouth.

To better understand where exactly Microsoft went wrong with Tay, I spoke with Brandon Wirtz, the creator of Recognant, a cognitive computing and artificial intelligence (AI) platform designed to aid in understanding big data from unstructured sources.

What made the Tay AI change her attitude so quickly? Tay’s Twitter…

Link to Full Article: What went so wrong with Microsoft’s Tay AI?
