Category: Partners

Why I Donate My Time to Thorn


Shari Benko, User Experience Design Lead at Intel, joined us last week at Facebook Global Security HQ for the two-day Child Safety Hackathon. For the second year in a row, employees from Intel, Microsoft, Google, Facebook, Domino Data Lab, and other companies came together to develop cutting-edge solutions in the fight against child sexual exploitation. Having taken two days away from her own work, Shari shares what motivated her to combine her skills, empathy, and passion to help move our projects forward.

Technology and Thorn Are Playing a Critical Role in Addressing Child Sexual Exploitation


The following guest post was written by Ernie Allen, a recent addition to Thorn’s Board of Directors. His leadership, dedication, inspiration and ongoing work to protect children across the globe continue to push us forward.

Throughout my career, I have worked closely with technology leaders. For more than three decades I have seen the power of technology firsthand and how it has changed every aspect of our lives, including the way we search for missing children, identify those who prey upon children, and keep children safe. Yet there is a dark side: technology also facilitates the exploitation of children. We have to change that.

Cloudera Cares + Thorn = Social Impact at Grace Hopper 2016


Our team joined Cloudera Cares to host a hackathon at the recent Grace Hopper Open Source Day. Women attending the conference were invited to take part in a day-long hackathon benefiting a social impact project. The event served several purposes: drawing attention to the problem of child sexual exploitation, looking for ways to stop it, encouraging women starting out in their tech careers to contribute, and offering those same women the opportunity to be mentored by experienced engineers and data scientists.

Hacking at Microsoft //oneweek to ID Exploited Children


Oftentimes when a child is in trouble, one of the only clues we have is his or her face. We may have a picture of a missing child we are searching for, or a picture of an abused child that was distributed online and whom we want to find quickly. One of the main hurdles is connecting the dots between these images of exploited children and other photos on the open web that may help us identify them and remove them from harm.


Hackathon Creates Tech Solutions for Child Safety


Greg Clark is a Senior Program Manager at Microsoft who participated in the recent Child Safety Hackathon at Facebook.

As someone who has spent my entire career thus far working in technology, I've always looked for opportunities to use my skills and experience to make a difference beyond simply producing innovative new software and services.

I recently had the privilege of traveling to the Bay Area to attend a Child Safety Hackathon, put on by Facebook and featuring challenges from Thorn and the National Center for Missing and Exploited Children (NCMEC). The goal was to present the problem of child sex trafficking to a diverse group of engineers and see what progress could be made over two days. The result was truly amazing!


Child Safety Hackathon Brings Silicon Valley Together


Last week, we joined Facebook as it hosted the first cross-industry Child Safety Hackathon. The event brought together leaders from across the tech industry to build cutting-edge solutions that will help find victims faster, deter predatory behavior, and make platforms safer. It further highlights the power of partnerships among leading technology companies.


Microsoft’s PhotoDNA: Leading the Fight Against Child Sexual Abuse Imagery


Courtney Gregoire works as a senior attorney in Microsoft’s Digital Crimes Unit, where she fights technology-facilitated crime against vulnerable populations, including children and the elderly. Her blog post is part of Thorn’s hashing series, which highlights the benefits of hashing technology for industry, law enforcement, nonprofits, and service providers as they work to detect and remove child sexual abuse material online.

Microsoft’s Digital Crimes Unit is dedicated to helping fight the online exploitation of children. One persistent, horrendous crime is the distribution of child sexual abuse imagery on the Internet. The children victimized in this material are first harmed when their abuse is perpetrated and recorded; they are further victimized each time that record is distributed. Last year, thanks to PhotoDNA, the technology industry was able to disrupt the distribution of over 4 million images, a fourfold increase over 2014.
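PhotoDNA’s actual algorithm is proprietary and more sophisticated than anything shown here, but the core idea of disrupting distribution is matching the fingerprint of an uploaded image against a list of fingerprints of known abuse material, tolerating small differences so that resized or re-encoded copies still match. A minimal sketch of that matching step, assuming hashes are stored as bit strings (all names and the distance threshold are illustrative):

```python
def hamming(h1, h2):
    """Number of differing bits between two equal-length bit strings."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

def matches_known(candidate, known_hashes, max_distance=4):
    """Return True if the candidate hash is within max_distance bits
    of any hash on the known-image list (a near-duplicate match)."""
    return any(hamming(candidate, k) <= max_distance for k in known_hashes)

# A slightly altered copy (1 bit differs) still matches the known list;
# an unrelated image (many bits differ) does not.
known = {"1010110011010001"}
print(matches_known("1010110011010101", known))  # True
print(matches_known("0101001100101110", known))  # False
```

The near-duplicate tolerance is what distinguishes this approach from exact cryptographic hashing, where changing a single pixel would produce a completely different hash and the copy would evade detection.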



Eliminating Child Sexual Abuse Material: The Role and Impact of Hash Values


John Shehan is the Vice President of the Exploited Child Division at the National Center for Missing & Exploited Children. He also serves as Vice President of INHOPE, a network of international hotlines combating child sexual abuse online, and as an advisory board member of the College of Humanities & Behavioral Sciences at his alma mater, Radford University. His blog post is part of Thorn’s hashing series, which highlights the benefits of hashing technology for industry, law enforcement, nonprofits, and service providers as they work to detect and remove child sexual abuse material online.



Introduction to Hashing: A Powerful Tool to Detect Child Sex Abuse Imagery Online


Last month, Thorn Digital Defender Del Harvey wrote about Twitter’s use of PhotoDNA, a technology developed by Microsoft that computes hash values of child sexual abuse material (CSAM). The tool derives a unique fingerprint from each known image, allowing platforms to detect suspected material online and supporting law enforcement as they report and investigate it.
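To make the idea of an image "fingerprint" concrete: this is not PhotoDNA (which is proprietary), but a minimal average-hash sketch shows how a perceptual hash can be derived from pixel brightness, with the `average_hash` name and the 2x2 toy image purely illustrative:

```python
def average_hash(pixels):
    """Fingerprint a grayscale image, given as a 2D list of brightness
    values (0-255): each bit records whether a pixel is brighter than
    the image's mean brightness."""
    flat = [p for row in pixels for p in row]          # flatten rows
    mean = sum(flat) / len(flat)                       # average brightness
    return ''.join('1' if p > mean else '0' for p in flat)

# A tiny 2x2 "image" with bright opposite corners:
print(average_hash([[200, 50], [60, 210]]))   # 1001
# Small brightness changes leave the fingerprint unchanged:
print(average_hash([[190, 55], [60, 215]]))   # 1001
```

Because the hash depends on coarse brightness patterns rather than exact bytes, minor edits such as re-compression produce the same or a very similar fingerprint, which is what makes hash-based detection of known material practical at scale.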