Gaggle Speaks

Ideas, news, and advice for K-12 educators and administrators to help create safe learning environments.

Written by Lisa Railton on March 15, 2021

While legal terminology still refers to visual depictions of sexually explicit conduct involving minors as child pornography, many organizations and advocates around the globe are steering away from that language. These materials are neither legal nor consensual—they are abuse and exploitation, not pornography.  

Referring to these materials as pornography is damaging to the underage victims, who suffer each time someone views videos or images of their abuse. Outside of the legal system, organizations like the National Center for Missing & Exploited Children (NCMEC) and Child Rescue Coalition prefer the term child sexual abuse materials (CSAM) to more accurately describe this illegal content. 

Gaggle Safety Management helps safeguard both students and districts in incidents involving CSAM. Our machine learning technology identifies images that are likely to be explicit and flags them for further analysis. These images are then reviewed by our team of safety professionals, who are trained to handle sexually explicit content properly and to interact with NCMEC.

If our team suspects minors are being exploited or abused, the content is reported to NCMEC as required, protecting schools from the risks and liability involved in handling these materials. NCMEC strives to reduce child sexual exploitation and prevent child victimization, working with victims, families, law enforcement, and companies like Gaggle to help keep students safe. When content is reported to them through their CyberTipline, NCMEC tags CSAM for removal to make sure the files are not distributed and the minors involved are protected.

Gaggle will remove the ability to access and share content suspected of involving child abuse and exploitation from district Google Drive accounts and retain any files in question for 90 days, as required by law. Our goal is to help protect the students by not allowing the spread of inappropriate content—and the district from the hassle and liability that can accompany the sharing of this content.

During the first few months of the 2020–21 school year, Gaggle recorded a 135% increase in incidents involving nudity and sexual content, which includes content classified as CSAM. In addition, we made 3,952 reports of possible CSAM to NCMEC in 2020, helping to prevent further exploitation of underage victims. 

Whatever you call it, it’s a crime. Still, words matter: using the preferred terminology helps people understand just how harmful these materials are to the children being exploited.
