Thorn Launches Safer Predict, a proactive AI detection solution to scale child safety on content-hosting platforms
July 22, 2024
3 Minute Read
Thorn’s new scalable solution helps protect platforms from the risks of hosting child sexual abuse material (CSAM) and text-based interactions that could lead to child sexual exploitation (CSE)
LOS ANGELES — July 22, 2024 — Thorn, a nonprofit that builds technology to defend children from sexual abuse, today announced the launch of Safer Predict, a transformative AI-powered solution to help content-hosting platforms proactively detect and mitigate the risks of hosting child sexual abuse material (CSAM) and text-based interactions that could lead to child sexual exploitation (CSE).
The National Center for Missing & Exploited Children (NCMEC) received more than 36 million reports of suspected child sexual exploitation to its CyberTipline last year. Safer Predict helps content-hosting platforms reduce the risks of hosting new or previously unreported CSAM and CSE, including text-based conversations that may signal threats of sexual harm against children.
“Child safety risks are skyrocketing, and platforms need solutions that can effectively scale protection for their users,” said Julie Cordua, CEO of Thorn. “Safer Predict gives platforms the power of Thorn’s cutting-edge child safety technology to identify new or previously unreported CSAM and CSE across images, video, and text. This allows them to take swift action, remove harmful content, and create a safer digital environment for everyone.”
Safer Predict leverages Thorn’s advanced machine learning classification models, which are trained on confirmed child sexual abuse data. These include the organization’s CSAM classifier, trained in part on trusted data from the NCMEC CyberTipline, which enables Thorn’s models to predict the likelihood that image and video content contains CSAM. Safer Predict’s text detection models are likewise trained on confirmed messages related to child sexual exploitation.
Thorn’s brand-new CSE text classification model identifies potential abuse based on the context of a conversation and allows users to “stack” multiple labels to narrow in on problem accounts or quickly target abuse. Multiple language models examine the context of complete conversations, line by line, and classify text to predict possible instances of child sexual abuse and exploitation, providing risk scores for CSAM, child access, sextortion, self-generated content, and more.
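As a minimal illustration of how a platform might consume risk scores of this kind, the sketch below stacks per-message labels to prioritize conversations for review. Every name in it (RiskScores, stacked_labels, the 0.8 threshold) is hypothetical and does not represent Thorn’s actual API or scoring scheme.

```python
from dataclasses import dataclass

# Hypothetical per-message risk scores; field names and value ranges are
# illustrative only and do not reflect Safer Predict's actual interface.
@dataclass
class RiskScores:
    csam: float            # likelihood the message relates to CSAM
    child_access: float    # likelihood of attempts to gain access to a child
    sextortion: float      # likelihood of sextortion behavior
    self_generated: float  # likelihood of self-generated content discussion

def stacked_labels(scores: RiskScores, threshold: float = 0.8) -> list[str]:
    """Return every label whose score clears the threshold, so moderators
    can filter on combinations (e.g. sextortion AND child_access)."""
    return [name for name, value in vars(scores).items() if value >= threshold]

def needs_review(conversation: list[RiskScores]) -> bool:
    """Flag a conversation when any single message stacks two or more labels."""
    return any(len(stacked_labels(msg)) >= 2 for msg in conversation)

# Example: the second message trips two labels at once and is prioritized.
conversation = [
    RiskScores(csam=0.05, child_access=0.10, sextortion=0.02, self_generated=0.01),
    RiskScores(csam=0.12, child_access=0.91, sextortion=0.88, self_generated=0.03),
]
print(needs_review(conversation))  # True
```

In practice, thresholds and label combinations would be tuned to each platform’s own moderation policies and queue priorities.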
Safer Predict offers highly customizable workflows that enable platforms to develop strategic detection plans, prioritize high-risk accounts, and expand their CSAM and CSE detection coverage. The solution also streamlines content moderation, empowering teams to dig deeper into pertinent results and report harmful material more efficiently.
Ahead of the broader launch of Safer Predict, Thorn partnered with leading social media platform X to beta test the solution’s text-based detection capabilities. The text classifier within Safer Predict proved highly effective at empowering X’s content moderation team to conduct more in-depth investigations of pertinent results, leading to more comprehensive and prioritized reports for NCMEC.
“Thorn’s issue expertise and high-quality training data made us eager to participate in the child sexual abuse text classifier beta,” said Kylie McRoberts, Head of Safety at X. “Our team gained efficiencies in clearing out queues by quickly finding actionable content. Deploying Safer Predict helps us to build on our efforts to build a technology-first approach to combating child sexual exploitation online, specifically our goal of expanding our capabilities in fighting high-harm content.”
Safer Predict builds upon the impact of Safer, which has processed more than 130 billion files and identified more than 5 million instances of potential CSAM on customer platforms since 2019.
Find Thorn at TrustCon on July 23 at the Hyatt Regency San Francisco to discuss how Safer Predict makes the web safer.
About Thorn
Thorn is a nonprofit that builds technology to defend children from sexual abuse. Founded in 2012, the organization creates products and programs to empower the platforms and people who have the ability to defend children. Thorn’s tools have helped the tech industry detect and report millions of child sexual abuse files on the open web, connected investigators and NGOs with critical information to help them solve cases faster and remove children from harm, and provided parents and youth with digital safety resources to prevent abuse. To learn more about Thorn’s mission to defend children from sexual abuse, visit thorn.org.