November 28, 2021

Twitch’s First Transparency Report Is Here—and Long Overdue

Twitch today released its first-ever transparency report, detailing its efforts to safeguard the 26 million people who visit its site daily. When it comes to transparency, the decade-old, Amazon-owned service had a lot of catching up to do.

Twitch benefited from a 40 percent increase in channels between early and late 2020, buoyed by the popularity of both livestreaming technology and video gaming throughout the pandemic. That explosive growth, however, is also the company’s greatest challenge when it comes to stomping out harassment and hate. Unlike recorded videos, live content is often spontaneous and ephemeral. Things just happen, in front of live audiences of thousands or tens of thousands. That can include anything from 11-year-olds going live playing Minecraft—exposing them to potential predators—to now-banned gaming celebrity Guy “Dr Disrespect” Beahm streaming from a public bathroom at E3.

In its new transparency report, Twitch acknowledges this difficulty and, for the first time, offers specific details about how well it moderates its platform. While the findings are encouraging, what Twitch historically has not been transparent about speaks just as loudly.

Twitch early on earned a reputation as a hotbed for toxicity. Women and minorities streaming on the platform received targeted hate from audiences hostile to people who they believed deviated from gamer stereotypes. Twitch’s vague guidelines around so-called “sexually suggestive” content served as fuel for self-appointed anti-boob police to mass-report female Twitch streamers. Volunteer moderators watched over Twitch’s fast-moving chat to pluck out harassment. And for problematic streamers, Twitch relied on user reports.

In 2016, Twitch introduced an AutoMod tool, now enabled by default for all accounts, that blocks what its AI deems inappropriate messages from viewers. Like other large platforms, Twitch also relies on machine learning to flag potentially problematic content for human review, and it has invested in human moderators to review that flagged content. Still, a 2019 study by the Anti-Defamation League found that nearly half of Twitch users surveyed reported facing harassment. And a 2020 GamesIndustry.biz report quoted several Twitch employees describing how executives at the company didn’t prioritize safety tools and were dismissive of hate speech concerns.

Throughout this time, Twitch didn’t have a transparency report to make its policies and inner workings clear to a user base suffering abuse. In an interview with WIRED, Twitch’s new head of trust and safety, Angela Hession, says that, in 2020, safety was Twitch’s “number one investment.”

Over the years, Twitch has learned that bad-faith harassers can weaponize its vague community standards, and in 2020 it released updated versions of its “Nudity and Attire,” “Terrorism and Extreme Violence,” and “Harassment and Hateful Conduct” guidelines. Last year, Twitch appointed an eight-person Safety Advisory Council, consisting of streamers, anti-bullying experts, and social media researchers, to draft policies aimed at improving safety and moderation and encouraging healthy streaming habits.

Last fall Twitch brought on Hession, previously the head of safety at Xbox. Under Hession, Twitch finally banned depictions of the Confederate flag and blackface. Twitch is on fire, she says, and she sees a big opportunity to shape what safety looks like there. “Twitch is a service that was built to encourage users to feel comfortable expressing themselves and entertain one another,” she says, “but we also want our community to always be and feel safe.” Hession says that Twitch has quadrupled its content moderation staff over the last year.
