The Issue: Child Sexual Abuse Material (CSAM)

Child sexual abuse material (CSAM) comprises images and videos that document the real-life sexual abuse of a child. Traditionally, this content was produced by hands-on abusers. The emergence of generative AI broadens the scope to include AI adaptations of original abuse content, the sexualization of benign images of children, and fully AI-generated CSAM. Our digital world makes it easier than ever for bad actors to create and spread CSAM on the very platforms we all use in our daily lives. In addition, self-generated child sexual abuse material (SG-CSAM) is increasingly seen as normal behavior among youth. Thorn's mission is to eliminate CSAM from the internet and halt the cycle of trauma its circulation causes for victims.
