Trigger warning: This post talks about child predation and sexual abuse.
Back in September 2022, a Bloomberg report revealed that child predators were using the popular streaming platform Twitch to track and, in some cases, groom young streamers. Not long after, Twitch announced changes to combat the problem, introducing phone-verification requirements and pledging to delete accounts made by people under the age of 13. But a new Bloomberg report, published on January 5 of this year, reveals that the predator problem hasn’t disappeared so much as morphed: perpetrators have adopted a new, nefarious method of preying on children, abusing the Twitch “clips” feature, which is reportedly being used to record and share sexually explicit videos of minors.
Twitch clips are exactly what they sound like: 20-second snippets of a livestream that any viewer can capture and share on social media. The feature launched in 2016, and Twitch is planning to expand it this year by creating a discovery feed to make clips easier to find—all in an effort to compete with short-form video platform TikTok. Unfortunately, it’s these short-form videos that have reportedly given child predators a new way to spread sexualized content of minors online.
Bloomberg, in conjunction with The Canadian Centre for Child Protection, analyzed nearly 1,100 clips and found some shocking results. At least 83, or 7.5 percent, of these short-form videos featured sexualized content of children. The analysis uncovered that 34 of the 83 Twitch clips (about 41 percent) primarily depicted young boys between the ages of 5 and 12 “showing genitalia to the camera” reportedly after viewer encouragement. Meanwhile, the other 49 videos (roughly 59 percent) had sexualized content of minors either exposing other body parts or falling victim to grooming.
What makes the situation worse isn’t just the continued spread of child sexual abuse material on Twitch, but how often these clips were watched. According to Bloomberg’s findings, the 34 videos were viewed 2,700 times, while the other 49 clips were watched some 7,300 times. The problem isn’t just how easy these clips are to create, but how easily they spread. According to Stephen Sauer, director of The Canadian Centre for Child Protection, social media platforms can no longer be trusted to regulate themselves.
“We’ve been on the sidelines watching the industry do voluntary regulation for 25 years now. We know it’s just not working,” Sauer told Bloomberg. “We see far too many kids being exploited on these platforms. And we want to see government step in and say, ‘These are the safeguards you have to put in place.’”
In an email to Kotaku, Twitch sent a lengthy, bulleted list of its plan to combat child predation on the platform. Here is that list in full:
- Youth harm, anywhere online, is unacceptable, and we take this issue extremely seriously. We’ve invested heavily in enforcement tooling and preventative measures, and will continue to do so.
- All Twitch livestreams undergo rigorous, proactive, automated screening—24/7, 365 days a year—in addition to ongoing enforcement by our safety teams. This means that when we disable a livestream that contains harmful content and suspend the channel, because clips are created from livestreams, we’re preventing the creation and spread of harmful clips at the source.
- Importantly, we’ve also worked to ensure that when we delete and disable clips that violate our community guidelines, those clips aren’t available through public domains or other direct links.
- Our teams are actively focused on preventing grooming and other predatory behaviors on Twitch, as well as preventing users under the age of 13 from creating an account in the first place. This work is deeply important to us, and is an area we’ll continue to invest in aggressively. In the past year alone:
  - We’ve developed additional models that detect potential grooming behavior.
  - We’ve updated the tools we use to identify and remove banned users attempting to create new accounts, including those suspended for violations of our youth safety policies.
  - We’ve built a new detection model to more quickly identify broadcasters who may be under the age of 13, building on our other youth safety tools and interventions.
- We also recognize that, unfortunately, online harms evolve. We improved the guidelines our internal safety teams use to identify some of those evolving online harms, like generative AI-enabled Child Sexual Abuse Material (CSAM).
- More broadly, we continue to bolster our parental resources, and have partnered with expert organizations, like ConnectSafely, a nonprofit dedicated to educating people about online safety, privacy, security, and digital wellness, on additional guides.
- Like all other online services, this problem is one that we’ll continue to fight diligently. Combating child predation meaningfully requires collaboration from all corners. We’ll continue to partner with other industry organizations, like NCMEC, ICMEC, and Thorn, to eradicate youth exploitation online.
Twitch CEO Dan Clancy told Bloomberg that, while the company has made “significant progress” in combating child predation, stamping out the issue requires collaboration with various agencies.
“Youth harm, anywhere online, is deeply disturbing,” Clancy said. “Even one instance is too many, and we take this issue extremely seriously. Like all other online services, this problem is one that we’ll continue to fight diligently.”