
Survivors Speak Out in Support of Critical Child Protection Legislation

11-07-2023

For the first time ever, Congress is actively considering five pieces of legislation that could move mountains for survivors of child sexual abuse material (CSAM). Each bill addresses a different aspect of the crime of online child sexual exploitation, and together these bills would strengthen protections and remedies for survivors, improve reporting to NCMEC’s CyberTipline and increase law enforcement’s ability to investigate these crimes.

First, we’ll give you a little background on the problem of CSAM, and then you’ll hear from real survivors about why they support these legislative solutions. Some of the survivors’ names have been changed to protect their privacy.

For years now, the volume of child sexual abuse images and videos that offenders distribute and share online has been exploding. The creation of CSAM is certainly not a new crime. Before the internet, these illegal images and videos were traded in person or through the mail as paper photographs or videotapes.

But things have changed dramatically with the proliferation of technology like smartphones with cameras, cheaper access to greater amounts of digital storage and more ways for users to connect one-on-one online. Let’s just look at the past decade. In 2012, NCMEC received just under half a million reports of suspected child sexual exploitation through our CyberTipline. Most of these reports related to CSAM. In 2022, that number ballooned to 32 million reports. So far this year, NCMEC has received over 30 million reports.

Before we go any further, let’s be clear about the material we’re talking about when we talk about CSAM. These are crime scene photos and videos documenting the sexual abuse and assault of children. And once they’re put online, they never go away, as offenders continue to reshare the images over and over again.

We all play a role in working to disrupt the trading of these images online and support survivors, including you – the public – and our elected representatives on Capitol Hill.

While the problem may feel insurmountable, there are viable legislative solutions on the table right now. NCMEC supports these five child protection bills currently under consideration:

STOP CSAM Act
REPORT Act
EARN IT Act/COSMA
SHIELD Act
Project Safe Childhood Act

These proposed bills work as a package to curtail the online distribution of this illegal content, support survivors and strengthen the laws to help prosecute offenders. Survivors of these crimes are speaking out in support of these pieces of legislation. To read a more thorough breakdown of what each specific bill addresses, click on the links above.

Together, these proposed bills establish essential protections to keep our children safer online and strengthen our efforts to combat online child sexual exploitation.

Empowering Survivors 

At NCMEC, we regularly hear from survivors of CSAM about the long-lasting damage and revictimization they suffer when the abusive images and videos continue to be circulated on the internet. The lack of control, both over having the images online and over their constant recirculation, leaves many survivors struggling in their recovery.

Currently, survivors have no ability to hold online platforms accountable, even when the platform knowingly distributes CSAM. Under Section 230 of the Communications Decency Act, online platforms are immune from civil lawsuits related to content on their platforms. Several of the proposed bills take aim at these sweeping and outdated protections and would allow survivors to seek legal remedies when online platforms knowingly distribute CSAM. This will be a monumental step forward for victims’ rights.

And in a first-of-its-kind provision, the STOP CSAM Act establishes a “Report and Remove” program that provides victims with a structured process to notify online platforms when they are hosting CSAM and legal recourse when the platform doesn’t promptly remove the CSAM. A newly established Child Online Protection Board will adjudicate petitions filed by survivors and will issue fines to the online platforms as appropriate. This is one more way for survivors to hold companies accountable for CSAM on their platforms and a huge step forward for survivors of these crimes. The STOP CSAM Act also strengthens reporting requirements for online platforms to NCMEC’s CyberTipline, which we will discuss later in more detail.
 

“It has never stopped. I am convinced that it will never stop. I will never have peace. I will never have closure.”
- Charlotte

CSAM images of this survivor, Charlotte, have been circulating on the internet for more than 10 years. She supports the STOP CSAM Act.

Calling It “Child Sexual Abuse Material,” Not “Child Pornography”

It’s time to update the U.S. federal statutes with terminology that accurately describes what this crime is: the sexual abuse and assault of children. These images and videos are evidence of crimes against children and should be labeled as such. The term “pornography” does not accurately describe CSAM; this crime is never legal and is never consensual. The EARN IT Act/COSMA would strike the term “child pornography” from our laws and replace it with “child sexual abuse material.” We all know that words matter, and attitudes about the severity of a crime start with the words we use to describe it.
 

“Words matter, and survivors deserve to have the crimes against them acknowledged as abuse.”
- Nicole

Nicole's child is a survivor of child sexual exploitation. She supports the EARN IT Act.

Mandatory Reporting of Child Sex Trafficking and Online Enticement

We know that technology often evolves more quickly than our laws do. Two of the fastest-growing online crimes against children that NCMEC has seen in recent years are child sex trafficking and online enticement. But the law stating which crimes online platforms are legally required to report to NCMEC’s CyberTipline is so outdated that it doesn’t even include these crimes. All instances of child sex trafficking and online enticement need to be reported to NCMEC’s CyberTipline so these crimes can be referred to law enforcement for possible investigation and child victims can be recovered and provided with services. Three of the current bills would fix this gap: the REPORT Act, the EARN IT Act/COSMA, and the STOP CSAM Act. These bills would make it mandatory for platforms to report instances of child sex trafficking and online enticement to NCMEC’s CyberTipline.

Last year alone, NCMEC received more than 18,000 reports of child sex trafficking and more than 80,000 reports of online enticement. These numbers represent only voluntary reporting by a handful of online platforms. If reporting these crimes becomes mandatory, we expect those numbers to rise significantly, giving us a much clearer scope of the problem and enabling more child victims to be recovered.
 

“It is time that survivors have OUR voices back.”
- DJ

DJ is a survivor of child sexual abuse material. He supports the EARN IT Act/COSMA and the Project Safe Childhood Act.

Allowing Time for Law Enforcement to Investigate Online Crimes Against Children

Right now, when an online platform makes a report to NCMEC’s CyberTipline, it is required by law to retain that data so law enforcement has time to review it if needed for an investigation. But platforms are required to retain the data for only 90 days, which is not always enough time for law enforcement to investigate a case. Two of the pending bills (the REPORT Act and the EARN IT Act/COSMA) would extend that retention period from 90 days to one year. This would support law enforcement’s ability to investigate and hold offenders accountable.

 

“What if this was your child? Wouldn’t you want law enforcement to have enough time to fully investigate and protect your child from harm?”
- L

L is a survivor of sextortion. She supports the REPORT Act, the SHIELD Act and the Project Safe Childhood Act.

Establishing a Best Practice for Law Enforcement to Submit Seized CSAM to NCMEC’s CVIP

NCMEC’s Child Victim Identification Program (CVIP) supports law enforcement in their efforts to identify child victims of CSAM and serves as the nation’s clearinghouse on information relating to identified child victims. NCMEC reviews and analyzes imagery containing unidentified children for any information relating to their potential location or who is responsible for their abuse. While this work is essential to identifying child victims, there is currently no requirement for law enforcement to submit imagery seized from offenders to NCMEC’s CVIP. The Project Safe Childhood Act would change this by encouraging law enforcement to submit that imagery to NCMEC’s CVIP.

The REPORT Act goes a step further to support child identification efforts by enabling law enforcement to submit seized CSAM electronically to NCMEC, which currently is prohibited by law. This would dramatically increase the content submitted to NCMEC’s CVIP for review and the number of children we are able to identify. The REPORT Act also would enable survivors to pursue additional restitution from offenders who circulate images in which they are depicted.
 

“Not knowing who is uploading, downloading, trading and possessing these images and videos haunts me. It could be my neighbor. It could be someone I pass in the grocery store. It could be thousands of people across the globe.”
- Laura

Laura is a survivor of CSAM and child sex trafficking. She supports the REPORT Act and the Project Safe Childhood Act.

Better Reporting to NCMEC’s CyberTipline

Another common theme in the package of pending legislation is improving the information online platforms provide when they report child sexual exploitation to NCMEC. Right now, while it’s mandatory for an online platform to report to NCMEC’s CyberTipline when it becomes aware of CSAM, there are no requirements or guidelines about the information a platform includes in that report. As a result, the information varies significantly by reporting party. And too often, companies do not report enough information to identify a victim, investigate an offender or even determine where in the world the abuse is occurring.

The level of substantive information in a CyberTipline report could be the difference between identifying the child victim in the image and rescuing that child – or law enforcement being left without enough information to investigate.

New proposed legislation, including the STOP CSAM Act and the EARN IT Act/COSMA, would require online platforms to include critical details when making a CyberTipline report, including user information and other substantive data, making these reports much more actionable. These bills also would provide a list of additional information that platforms should consider submitting voluntarily to help even further. This change will have life-saving consequences.

There are even more protections for children in these five bills. The SHIELD Act, for example, would criminalize the distribution of sexually exploitative images of a minor that do not meet the threshold of sexually explicit content required under the legal definition of child pornography. Closing this gap in the law would allow offenders to be prosecuted for distributing nude images of children with the intent to harass or abuse them.
 

We encourage you to read NCMEC’s one-pagers on each bill (above) and then reach out to your Members of Congress to ask for their support. You can find your Member of the House of Representatives here and your Senator here.

Tell them you listened to survivors of CSAM and you urge them to sponsor these bills, support bringing them to the floor for a vote, and vote YES to pass these critical pieces of child protection legislation.
