
Redefining “Child Pornography”

May 16, 2023

4 Minute Read

It’s a nightmare reality happening at alarming rates: a child is sexually abused, and evidence of that abuse spreads across the internet like wildfire. This abuse material is traumatizing, toxic, illegal and definitely NOT pornography.

Is it porn?

Child sexual abuse material (CSAM) is legally known as child pornography in the U.S. and refers to any content that depicts sexually explicit activities involving a child. Visual depictions include photographs, videos, live streaming, and digital or computer-generated images indistinguishable from an actual minor, including AI-generated content.

CSAM describes the reality of this crime better than the legal term “child pornography” does. Pornography implies consent, which a child can never give. CSAM is the documentation of the violent and horrific rape of children, often when they’re prepubescent and even non-verbal.

So it’s important that we use terminology that reflects the impact of this crime on its victims. Categorizing these images as “child pornography” simply isn’t enough. By referring to this abusive content as “child sexual abuse material,” we aim to convey the terrible truth about this imagery.

How does abuse material get shared?

The internet has made it far too easy for abusers to share CSAM. Abusers create images and videos and then share them within online communities. These communities promote communication between offenders, who push one another for ever more graphic and violent content and grow desensitized to the physical and psychological damage inflicted on the children being abused. That content then spreads widely beyond its initial recipients, recirculating across the internet like viral wildfire, perpetuating the abuse and victimizing the child many times over.

When a child takes their own “nudes,” the resulting images are known as self-generated CSAM (SG-CSAM). Children can unwittingly contribute to the circulation of this material if they send nudes to one another and the images escape that trusted relationship, or if they send them to an abuser who has groomed them.

More than 32 million reports of suspected child sexual exploitation are received by the National Center for Missing and Exploited Children (NCMEC) annually. Each report often contains multiple abuse images and/or videos: those 32 million reports encompassed 88 million reported files. There is a child behind each of these files.

How can I help survivors?

If you or someone you know has been a victim of child sexual abuse, know that you are not alone. Here are just some of the organizations working to provide survivors with resources:

  • NCMEC’s “Take It Down” is a free service that can help you remove or stop the online sharing of nude, partially nude, or sexually explicit images or videos taken of you when you were under 18 years old. You can remain anonymous while using the service and you won’t have to send your images or videos to anyone.
  • RAINN’s National Sexual Assault Hotline is free and confidential, available 24/7. Call 800.656.HOPE (4673) or use the live chat feature.
  • Brave Movement is a survivor-centered global movement fighting to end childhood sexual violence. Brave Movement demands bold and transformative action from global leaders and has a number of active campaigns for change.
  • Canadian Centre for Child Protection (C3P) is dedicated to working together with survivors to effect change that will create a safer world for children.
  • INHOPE is a global network of 50 member hotlines fighting CSAM online. It also promotes legislative and policy development.

What is Thorn doing to address this issue?

Thorn equips technology platforms with the innovative tools, insights, and connections needed to end the spread of abuse content and stop revictimization. Our industry tool Safer is designed to help tech companies detect, review, and report abuse imagery at scale.

Additionally, our prevention programs like Thorn for Parents help equip parents with the skills and resources to have meaningful, productive, and judgment-free conversations that prevent abuse. NoFiltr is a youth-focused prevention program that seeks to increase young people’s awareness of the risks of sharing nudes online, change toxic attitudes that shame victims, and provide the knowledge and tools they need to resist online threats.

Together, we can tackle child sexual abuse and create a brighter future where every child is free to simply be a kid.


