Tay: Microsoft’s Mishap with Artificial Intelligence

The new social media chatbot Tay started as an innocent social experiment aimed at people aged 18 to 24, but the project soon went astray once Twitter users abused the vulnerabilities of the ignorant robot. Tay was the artificial intelligence chatbot created by Microsoft’s and Bing’s technology and research teams: essentially a virtual personality anyone could chat with on Twitter, Kik, and GroupMe. But in less than a day, internet trolls turned Tay into a racist and genocidal terror through their tweets at her, aided by Microsoft’s own design. Anyone could tweet at Tay or chat with her, and she was designed to learn from what people said as conversations progressed. Tay embodies a 19-year-old female and uses emojis and lingo such as “bae,” “chill” and “perf” with…

Link to Full Article: Tay: Microsoft’s Mishap with Artificial Intelligence
