We must teach AI machines to play nice and police themselves

Microsoft may have made one of the biggest mistakes in recent memory this week. No, it’s not Windows 8 or Windows Phone. It’s an artificially intelligent chatbot called Tay that was supposed to learn the art of conversation from humans on Twitter. If you haven’t come across this story on the web yet, you’re unlikely to get through the weekend without it. Tay was built to speak like a teen girl and was released as an experiment to improve Microsoft’s automated customer service. Instead, “she” turned into a complete PR disaster: within hours of being unleashed on Twitter, the “innocent teen” bot was transformed into a fascist, misogynistic, racist, pornographic entity. Her tweets, including phrases like “Heil Hitler”, were disseminated widely as an example of why Twitter reflects the worst…

Link to Full Article: We must teach AI machines to play nice and police themselves
