…the message, “Report it. Don’t share it.” The “it” in this public awareness campaign Facebook just launched is child sexual abuse material (CSAM), the accurate term for what is typically called “child pornography” in the United States.
Thankfully, it’s extremely unlikely you’ll ever see content like this. “The prevalence of this content on our platform is very low,” Facebook researchers report, “meaning sends and views of it are very infrequent.” In the first quarter of this year (the latest figure available), the prevalence figure was 0.05%, meaning that, “of every 10,000 views of content on Facebook, we estimate no more than 5 of those views” were of CSAM, according to Facebook’s transparency report. Put another way, compared with the 5.5 million pieces of bullying and harassment content that moderation teams “actioned,” 812,000 pieces of CSAM were removed from the Facebook platform, and 98.1% of those pieces were removed before anyone reported them, according to the same report.
Why is it crucial for social media users, as well as platforms, to report this content? Because every share and view re-victimizes the child depicted in the image or video. The sharing has to stop.
Sharing = re-victimizing
It’s probably hard for anyone reading this post to understand why any social media user needs to hear “report it, don’t share it.” The reason is that even people who have no intention of harming a child share this material. More than 90% of this content is the same as or very similar to previously reported content, Facebook researchers found, meaning it isn’t newly created material but re-shares of what was originally posted. Senseless, right? So much (re-)victimization of children comes from users sharing this content onward. So the researchers had to know why – what were the intentions behind this sharing?
They needed to understand this for better prevention and intervention, they write.
On the intervention side, the goal was to provide more context to go with the reports the moderation teams send NCMEC (the National Center for Missing and Exploited Children, to which US companies are required by federal law to report CSAM on their servers). More context means more information to help law enforcement find and help victims faster. On the prevention side: to refine the language in pop-ups and other educational messages in apps so they really connect with the users who don’t have any intention to harm children - who may just be appalled or in shock and want to share their outrage, people who wouldn’t share the content if they knew it re-victimizes children.
Maps to sexting typology
The research turned up a whole spectrum of intentions, both malicious (intent to harm children) and nonmalicious, and the result was a “taxonomy of intent.” I’ll let you click to this page if you’d like to see the full range, with definitions and examples at the bottom, but what’s interesting to me is how the malicious-nonmalicious taxonomy maps to the youth sexting typology that the Crimes Against Children Research Center published in 2011 as guidance for law enforcement. The CCRC’s two categories are “aggravated” sexting (involving an adult, or involving sexual abuse, extortion, threats, malicious interpersonal conflict, or creating/sharing of images without the knowledge or consent of the minor depicted) and “experimental” sexting (produced by the minor to share with an established dating partner, to “create romantic interest in other youth, or for reasons such as attention‐seeking” but with “no criminal behavior beyond the creation or sending of images, no apparent malice and no lack of consent by the minor depicted”).
Hard-working collaborators
Research insights like these are being shared and acted on in an important industry collaboration called the Technology Coalition, with 22 member companies, big and small, across different sectors and levels of the Internet, working to eradicate child sexual abuse online. I’d like to see this kind of cross-industry collaboration working as effectively against online bullying, harassment and hate speech for vulnerable users of all ages, but at least we have a proven model for that in the Tech Coalition.
As for collaboration across whole sectors – industry, government and NGOs – watch a remarkable TED Talk by Julia Cordua, CEO of Thorn, which is building technology that connects those dots “so we can swiftly end the viral distribution of abuse material and rescue children faster.” The video paints the whole picture – the problem, what needs to be done, what is being done and who’s working on it – in a little over 13 minutes. As of this writing, it has gotten nearly 1.8 million views.
Related links
- A year ago, the Technology Coalition announced “Project Protect: A plan to combat online child sexual abuse,” after “in-depth consultation with more than 40 experts on CSEA around the globe.” With its investment in independent research, a forum for experts and sharing all that’s learned with the public, this is the outstanding model I refer to above.
- Two of the experts Facebook Research worked with on the study leading up to this campaign: Prof. Ethel Quayle at the University of Edinburgh and Ken Lanning at NCMEC.
- The Crimes Against Children Research Center at the University of New Hampshire was one of the first academic centers in the world to publish research on youth online risk (their first Youth Internet Safety Survey was published in 1999).
- Thorn CEO Julia Cordua’s 2019 TED Talk “How we can eliminate child sexual abuse material from the Internet”