The chat that sits alongside livestreams on Twitch is often a nightmare — especially for women. But now the Amazon-owned video platform has introduced a new tool to make it easier for broadcasters to take control of what kind of content is permitted in their chats. The goal is to eliminate ugly situations where racists are able to harass someone like Hearthstone pro Terrence “TerrenceM” Miller on Twitch.
Twitch is launching AutoMod, an automated tool that uses algorithms to better understand the intent of a chat message so it can more accurately block hateful or inappropriate content. According to Twitch, this gives streamers more control over what never shows up in chat in the first place. AutoMod processes natural language with technologies similar to those found in products like Siri, Google, and Amazon Echo, and it combines that with machine learning (a technique that enables computers to learn tasks without being explicitly programmed for them) so the tool gets better at identifying unwanted content over time. AutoMod is available now worldwide for English.
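To make the general idea concrete, here is a minimal sketch of how a text classifier can flag unwanted chat messages: a model trained on labeled examples assigns each new message a probability of being objectionable, and messages above a threshold are held back. This is not Twitch's implementation; the training data, model choice, and threshold below are illustrative assumptions only.

```python
# Toy illustration of ML-based chat filtering, NOT Twitch's actual system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples (1 = unwanted, 0 = acceptable). A real system
# would learn from far more data and richer signals than raw text alone.
messages = [
    "great play, that was awesome",
    "gg well played everyone",
    "you are trash, uninstall the game",
    "get out of this chat, nobody wants you here",
]
labels = [0, 0, 1, 1]

# Bag-of-words features plus a linear classifier: a deliberately simple
# stand-in for the natural-language processing the article describes.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(messages, labels)

def should_hold(message: str, threshold: float = 0.5) -> bool:
    """Return True if the message should be held rather than posted."""
    prob_unwanted = model.predict_proba([message])[0][1]
    return prob_unwanted >= threshold

print(should_hold("nice stream today"))      # likely False
print(should_hold("you are trash, leave"))   # likely True
```

The "gets better over time" part would come from retraining on new labeled examples, something this sketch omits for brevity.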
In addition to English, AutoMod has a beta that works with Arabic, Czech, French, German, Italian, Japanese, Korean, Polish, Portuguese, Russian, Spanish, and Turkish.
Gaming-related video is a multibillion-dollar market, and Twitch dominates the livestreaming side of that business. It has more than 100 million monthly viewers and more than 2 million broadcasters. At that volume, moderating chat with humans alone would be neither easy nor cost effective. With AutoMod, Twitch calls itself a “leader in moderation,” though we’ll need to see the results before we can judge that claim.
“We equip streamers with a robust set of tools and allow them to appoint trusted moderators straight from their communities to protect the integrity of their channels,” Twitch moderation lead Ryan Kennedy said. “This allows creators to focus more on producing great content and managing their communities. By combining the power of humans and machine learning, AutoMod takes that a step further. For the first time ever, we’re empowering all of our creators to establish a reliable baseline for acceptable language and around the clock chat moderation.”
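The “humans and machine learning” combination Kennedy describes can be pictured as a review queue: clear-cut messages are posted or blocked automatically, while borderline ones wait for a trusted moderator. The sketch below is a hedged illustration of that flow under assumed thresholds and data structures, not Twitch's actual moderation pipeline.

```python
# Illustrative human-in-the-loop moderation queue; thresholds are assumptions.
from collections import deque
from dataclasses import dataclass

@dataclass
class HeldMessage:
    user: str
    text: str
    score: float  # model's estimated probability that the message is unwanted

class ModerationQueue:
    """Hold borderline messages for human review; auto-handle clear cases."""

    def __init__(self, block_above: float = 0.9, allow_below: float = 0.3):
        self.block_above = block_above
        self.allow_below = allow_below
        self.held = deque()

    def route(self, user: str, text: str, score: float) -> str:
        if score >= self.block_above:
            return "blocked"   # clearly unwanted: never shown in chat
        if score <= self.allow_below:
            return "posted"    # clearly fine: shown immediately
        self.held.append(HeldMessage(user, text, score))
        return "held"          # borderline: wait for a moderator's decision

    def review_next(self, approve: bool):
        """A moderator approves (post) or denies (discard) the oldest held message."""
        if not self.held:
            return None
        msg = self.held.popleft()
        # In a real system, the moderator's decision could also feed back
        # into the model as a training signal.
        return msg if approve else None

queue = ModerationQueue()
print(queue.route("viewer1", "gg everyone", 0.05))          # posted
print(queue.route("viewer2", "you are trash", 0.97))        # blocked
print(queue.route("viewer3", "borderline message", 0.55))   # held
```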
The video site also claims that this is part of its initiative to be more inclusive. Twitch is well-known for creating an environment where hundreds or thousands of trolls can harass women about their bodies, minorities about their skin color, or disabled people about their handicaps almost entirely without consequence. But the company has started a program to make its experience more welcoming to different kinds of people.
“Inclusivity is something that is important to both our community and our brand,” Twitch inclusivity boss Anna Prosser Robinson said in a statement. “One of the best ways we can help bring about change is to provide tools and education that empower all types of voices to be heard. AutoMod is one of those tools, and we hope it will encourage our users to join us in our continued focus on fostering a positive environment on social media.”