Today, Twitch has issued a statement announcing the steps it’s taking to protect its marginalized streamers.
“We’ve heard a lot about botting, hate raids, and other forms of harassment targeting marginalized creators,” Twitch wrote. “We know that we have to do more to address these problems.”
Twitch claims it has identified a “vulnerability in our proactive filters” and has released an update to fix it. It will also implement additional safety features over the next few weeks, such as improved account verification and anti-evasion detection tools.
The statement was made in response to the #TwitchDoBetter hashtag, which Twitch creator RekItRaven started to raise awareness of the harassment Black creators are experiencing on the streaming platform.
“I was hate raided for the second time in a row and I shared both of the first occurrences [on Twitter] because they were very pointed rather than the usual, ‘You’re fat, Black, and gay’ stuff,” Raven tells The Verge via direct message.
Raiding, a popular Twitch feature, lets streamers send their viewers to another streamer’s channel at the end of a broadcast. It can boost viewership, build communities, and foster connections between streamers and their viewers. Hate raids are the toxic, polar opposite: a streamer points their viewers at another creator, often one who is Black, queer, or female, to bombard that streamer with hate speech.
Raven believes they became a target for hate raids because they stream using the Black tag, part of a new Twitch feature that lets users label their streams with different markers. Streamers use these tags to categorize their streams, which helps viewers find what they’re looking for. But the tags also serve as a beacon for trolls, who use them to target marginalized streamers. After their own hate raids, Raven noticed other marginalized streamers going through the same thing. Seeing no protections from Twitch against this kind of targeted, violent harassment, Raven decided to restart the conversation.
“I started #TwitchDoBetter because I’m tired of having to fight to exist on a platform that says they’re diverse and inclusive but remained silent to the pleas of marginalized creators asking for more protections from hate raids,” Raven says.
Twitch has had difficulty keeping toxic content off its platform. Last year, streamer Critical Bard was subjected to a wave of racist trolling when he became the temporary face of the “PogChamp” emote. Twitch also removed its Twitch Cop emote amid concerns it might be used to harass creators talking about police violence after George Floyd’s murder. These incidents have frustrated creators, who see Twitch as reactive rather than proactive, and the platform’s marginalized creators have long pleaded for more proactive moderation tools.
The tools Twitch is implementing in today’s safety rollout will seemingly only address trolls who use non-Latin characters to circumvent chat filters. Streamers want more.
“I would love to see creators have more tools to control how they experience chat, like allowing creators to block recently created accounts from chatting [and] allowing mods to approve or decline raids,” Raven says.