
How Hashing and Matching Can Help Prevent Revictimization

August 24, 2023

4 Minute Read

The spread of child sexual abuse material (CSAM) on the internet is escalating at unprecedented speed and scale and cannot be stopped by human intervention alone. In 2021, 85 million files of child sexual abuse were reported to the National Center for Missing & Exploited Children (NCMEC), up from 20 million files in 2017, with 59% of the reported files featuring pre-pubescent children.

Behind each CSAM file is a real child. When those files are circulated, they create a cycle of revictimization that continues even after the victim has been recovered from harm. Imagine your worst moment, shared over and over again, never certain who has seen it. According to a 2017 survivor survey from the Canadian Centre for Child Protection, almost 70% of CSAM survivors worried about being recognized by someone because of the recordings of their abuse.

Hashing and matching is one of the most important pieces of technology that we can deploy to help end the viral spread of CSAM and these cycles of revictimization.

What is hashing and matching?

Hashing is a foundational technology used throughout the child protection ecosystem. An algorithm converts a piece of known CSAM into a unique string of numbers, called a hash. It's like a digital fingerprint for each piece of content.

[Illustration: An image is turned into a string of code.]
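To make the idea concrete, here is a minimal sketch in Python of how a file can be reduced to a fingerprint. It uses a cryptographic hash (SHA-256) purely for illustration; production CSAM detection generally relies on perceptual hashing (such as PhotoDNA or PDQ), which can also match re-encoded or resized copies of an image. The function below is our own illustration, not Safer's implementation.

```python
import hashlib

def hash_file(path: str) -> str:
    """Return a hex digest that acts as a digital fingerprint for a file.

    Illustration only: a cryptographic hash like SHA-256 matches only
    byte-identical copies. Real CSAM detection also uses perceptual
    hashes (e.g., PhotoDNA, PDQ) that survive resizing and re-encoding.
    """
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large video files aren't loaded into memory at once.
        for chunk in iter(lambda: f.read(8192), b""):
            sha256.update(chunk)
    return sha256.hexdigest()
```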

Hashes are compared against hash lists of known CSAM. The system looks for a match of the hash without ever seeing users' content. (The matching is algorithmic, not manual.)

[Illustration: The hash is checked for a potential match.]
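In code, the matching step can be as simple as a set-membership test against the hash list. A minimal sketch, with invented placeholder values standing in for a real, verified hash list:

```python
# A hash list: fingerprints of content already verified as CSAM.
# These values are invented placeholders for illustration only.
KNOWN_CSAM_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c7",
    "9f2feb0f1ef425b292f2f94bf8c9e1dd",
}

def is_known_csam(file_hash: str) -> bool:
    """Compare a file's fingerprint against the hash list.

    Only hashes are compared; the system never views the
    user's content itself.
    """
    return file_hash in KNOWN_CSAM_HASHES
```

Because only fingerprints are exchanged, a platform can check uploads against a shared hash database without exposing user content.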

If a match is found, the CSAM can be removed from the platform and reported to authorized entities, who refer it to law enforcement in the proper jurisdiction.

[Illustration: A match for CSAM can be removed from the platform and reported to law enforcement.]
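Continuing the sketch above, the response to a match can be wired directly to that check. The remove_from_platform and report_to_authorized_entity hooks below are hypothetical stand-ins; real reporting (for example, to NCMEC's CyberTipline) follows its own required process.

```python
def remove_from_platform(file_id: str) -> None:
    # Hypothetical hook: a real platform would delete or quarantine the file.
    print(f"removed {file_id} from platform")

def report_to_authorized_entity(file_id: str, file_hash: str) -> None:
    # Hypothetical hook: real reports go to authorized entities such as
    # NCMEC, which refer them to law enforcement in the proper jurisdiction.
    print(f"reported {file_id} (hash {file_hash[:12]}...)")

def handle_upload(file_id: str, file_hash: str) -> None:
    """Remove and report a file whose hash matches known CSAM."""
    if is_known_csam(file_hash):  # from the matching sketch above
        remove_from_platform(file_id)
        report_to_authorized_entity(file_id, file_hash)
```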

Why is this technology so important?

Every time a CSAM file is shared, a survivor is revictimized. Often, images of abuse circulate for years. Hashing and matching can help end the revictimization that continually traumatizes survivors and their families.

[Illustration: A CSAM file being recirculated around the internet.]

In a recent statement from the Phoenix 11, the group of survivors described their collective experiences: “It is our abuse, our bodies, and the most horrific and exploitative instances in our lives being documented and released on the internet to be uploaded, downloaded, traded, and circulated for the rest of our lives.”

Millions of CSAM files are shared online every year. A large portion of these files are known CSAM, meaning they have already been confirmed as CSAM, hashed, and added to a hash list. Hashing and matching can and must be used to disrupt the viral spread of this abuse material.

Additionally, investigators and platforms can spend less time reviewing repeat content. This frees them up to prioritize new content, where a child may be suffering ongoing abuse. Learn more about how our CSAM classifier helps find new CSAM.

By using this privacy-forward technology at scale, we can protect individual privacy while advancing the fight against CSAM.

How do online platforms use it?

Safer, our all-in-one solution for CSAM detection, uses hashing and matching as a core part of its detection technology. With the largest database of verified hashes (29 million) to match against, Safer can cast the widest net to detect known CSAM.

In 2022, we hashed more than 42.1 billion images and videos for our customers. That empowered our customers to find 520,000 files of known CSAM on their platforms. To date, Safer has found 2M pieces of potential CSAM.

Hashing and matching is crucial to reducing revictimization for those who have experienced child sexual abuse. The more platforms that utilize this technology, the closer we will get to our goal of eliminating CSAM from the internet.

Resources

To learn more about Thorn’s technology:


