Twitch’s new AutoMod tool uses “machine learning” to create less toxic chat

Twitch has just launched AutoMod, a new tool that uses “machine learning and natural language processing” to help streamers keep inappropriate chat out of their channels. Rather than simply blocking the seven things you can’t say on television (or whatever else you don’t want paraded in front of your viewers), AutoMod can detect strings of emotes or other characters used to evade filtering, and it lets channel owners adjust the degree of filtering so that chat can be more or less unpleasant, as desired. Potentially offensive messages are held in a queue until a human moderator decides whether they get a pass. “What makes Twitch a leader in moderation, beyond our ever vigilant team of Admin…
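
The hold-for-review flow described above can be sketched roughly as follows. This is a minimal illustration, not Twitch's actual implementation: the strictness levels, thresholds, and the trivial keyword scorer standing in for the real ML/NLP model are all assumptions made up for the example.

```python
# Illustrative sketch (NOT Twitch's real system): messages whose toxicity
# score exceeds the channel's chosen strictness threshold are held for
# human moderator review instead of being posted immediately.
from collections import deque

# Hypothetical per-level thresholds: stricter levels hold more messages.
THRESHOLDS = {"off": 1.1, "low": 0.9, "medium": 0.7, "high": 0.5}

def toxicity_score(message: str) -> float:
    """Stand-in for the ML/NLP model: a trivial keyword heuristic."""
    bad_words = {"badword", "slur"}
    words = message.lower().split()
    hits = sum(w in bad_words for w in words)
    return min(1.0, hits / max(len(words), 1) * 3)

class ModQueue:
    def __init__(self, level: str = "medium"):
        self.threshold = THRESHOLDS[level]
        self.held = deque()     # messages awaiting moderator review
        self.posted = []        # messages shown in chat

    def submit(self, user: str, message: str) -> None:
        if toxicity_score(message) >= self.threshold:
            self.held.append((user, message))    # hold for review
        else:
            self.posted.append((user, message))  # post immediately

    def review(self, approve: bool) -> None:
        """A human moderator approves or rejects the oldest held message."""
        user, message = self.held.popleft()
        if approve:
            self.posted.append((user, message))

q = ModQueue("medium")
q.submit("viewer1", "great stream today")
q.submit("viewer2", "you badword streamer")
print(len(q.held), len(q.posted))  # 1 held, 1 posted
```

Raising the level to "high" lowers the threshold, so more borderline messages land in the review queue; "off" disables holding entirely.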


