Congress moves to strengthen laws on online child sexual abuse materials


As the flood of reports of online Child Sexual Abuse Materials, or CSAM, continues to grow seemingly unabated, Congress is taking action to protect children by holding service providers and technology companies more accountable.

The Senate passed a bill in December that revised legislation to strengthen requirements to report suspected cases of online sexual exploitation and abuse of children. These include reporting cases where exploitation is planned or may be imminent.

The act also lengthens to a year the time platforms must preserve crucial evidence so law enforcement has adequate time to investigate and prosecute perpetrators. Currently, providers are only required to maintain data for 90 days. In addition, the financial penalties for failure to report have been increased.

The bill has moved on to the House of Representatives, which is working on similar legislation.

Online protection of children has been on the radar of SafeOC, which has provided information to families about many of the ways children can be exploited online. A Parent Support Page on the site offers a wide array of information on how to talk with children about public safety threats and dangers. SafeOC is a localized version of the national “If You See Something, Say Something” anti-terrorism awareness campaign, and it also publicizes information that helps keep the Orange County public safe.

A growing problem

The proliferation of abuse materials online and the growth in the number of children being exposed to the material is alarming.

In 2022, the National Center for Missing and Exploited Children received 32 million CyberTipline reports of potentially abusive material online. This is a 329 percent increase in the last five years, according to a report on emerging trends by Thorn: Digital Defenders of Children, a group fighting human trafficking and sexual exploitation of children.

Although U.S.-based electronic service providers are legally required to report “apparent child pornography” to the CyberTipline, according to the National Center for Missing and Exploited Children, there are no legal requirements for proactive efforts to detect this content, nor for what information a service provider must include in a CyberTipline report. The new law helps fill some of these gaps.

According to the Child Rescue Coalition, “more than 325,000 suspects have been identified as being involved in online trafficking of child sex abuse recordings, yet fewer than 7 percent of these reports have been investigated.”

As a result, the new law’s longer data-retention requirement for providers is seen as critical for law enforcement, which has been overwhelmed by the rising number of abuse reports.

“We are seeing an avalanche of child sexual abuse material coming from every corner of the internet,” stated Scott Berkowitz, founder and president of RAINN (Rape, Abuse and Incest National Network) in a release. “Internet service providers have a responsibility to do everything in their power to enable investigations of child sexual abuse online.”

Parents face a double-whammy in attempting to shield children. Research in the Thorn report indicates that minors are increasingly engaging in risky online behavior, taking and sharing sexual images of themselves, whether consensually or by coercion.

“In our digitally connected world, child sexual abuse material is easily and increasingly shared on the platforms we use in our daily lives,” John Starr, VP of Strategic Impact at Thorn, stated in a release. “Harmful interactions between youth and adults are not isolated to the dark corners of the web. As fast as the digital community builds innovative platforms, predators are co-opting these spaces to exploit children and share this egregious content.”

More scrutiny

The government is also looking to further regulate platforms where harmful material can appear.

In a New York Times op-ed, Senators Elizabeth Warren (D-MA) and Lindsey Graham (R-SC) pushed for a new federal agency to regulate Big Tech, while highlighting the harm they can do.

“Giant digital platforms have provided new avenues of proliferation for the sexual abuse and exploitation of children, human trafficking, drug trafficking and bullying and have promoted eating disorders, addictive behaviors and teen suicide,” the senators wrote.

Warren and Graham drafted and introduced the Digital Consumer Protection Commission Act in July 2023, which would create a new federal commission to regulate digital platforms. Child advocacy groups such as Enough Is Enough, which seeks to protect children online, have come out in favor of the bipartisan legislation.

The group calls the act a “crucial piece of legislation that marks a significant step forward in our collective efforts to combat the internet exploitation of children and ensure a safer digital environment for all.”

In October, in the biggest challenge to operators of social platforms to date, 33 states filed a lawsuit in California against Meta Platforms Inc., alleging the company is deliberately harming young people and contributing to a mental health crisis by designing features on Instagram and Facebook that attract and bind children to its platforms.

For more on past social media reporting by SafeOC, see

For more on the SafeOC Parent Support page, see

Sign up for the ReadyOC newsletter and the SafeOC newsletter to receive local updates, public safety alerts, and tips.