Online Safety Laws for Children & Teens Part 4
- Soni Albright
The REPORT Act and the Push for Platform Accountability

About this series:
This series offers a practical overview of the key U.S. laws that govern online privacy and safety for children and teenagers. It begins with the Children’s Online Privacy Protection Act (COPPA), passed by Congress in 1998 and first implemented by the Federal Trade Commission (FTC) in 2000, with major rule updates in 2013 and 2025. Together, these regulations form the foundation for how companies collect, use, and share data from children under the age of 13. The series also examines related federal laws, including the Children’s Internet Protection Act (CIPA, 2000), which governs schools and libraries; the REPORT Act (2024), which expands online exploitation reporting requirements; and current proposals such as the Kids Online Safety Act (KOSA) and the EARN IT Act, which aim to address risks facing teens. Each installment breaks down what these laws actually say, how they are enforced, and what they mean for families, educators, and young people navigating digital spaces today.
Introduction: From Filtering to Accountability
In Parts One, Two, and Three, we looked at the federal laws that shaped the early internet for children: COPPA, which regulates how companies collect data from kids under 13, and CIPA, which pushes schools and libraries to filter, monitor, and manage children’s access. Together, they reveal a striking pattern: for more than two decades, the federal government placed the burden of online safety on minors, parents, and public institutions, not on the companies that built and profited from the digital world children use every day.
Part Four marks a turning point in this story, because for the first time since COPPA, Congress passed a law that puts responsibility squarely where it has always belonged: on the platforms.
That law is the REPORT Act, signed into law in May 2024.
What the REPORT Act Actually Does
The REPORT Act (short for the Revising Existing Procedures On Reporting via Technology Act) updates and expands the federal requirements for when and how platforms must report suspected child sexual abuse material (CSAM) to the National Center for Missing and Exploited Children (NCMEC).
It does not regulate design, data collection, or age-based protections.
Instead, it strengthens the mandatory reporting framework originally created by 18 U.S.C. § 2258A.
Here is what the law actually changes:
1. Expands which companies must report CSAM
Previously, mandatory reporting primarily applied to:
traditional internet service providers
email services
storage providers
The REPORT Act expands this to include:
messaging apps
social media platforms
cloud providers
gaming platforms with chat
virtual reality systems
any service that “stores, computes, or transmits content provided by a user.”
In other words: the entire modern internet.
2. Increases penalties for failing to report
Companies now face significantly higher fines for failing to report CSAM or for reporting it late. The goal is simple: eliminate the incentive to ignore or hide abuse reports.
3. Creates new record-retention requirements
Platforms must now preserve:
the content in question
metadata
associated account information
for one year (up from the previous 90 days), which gives law enforcement more time to investigate cases effectively.
4. Requires faster reporting to NCMEC
The law shortens the window between detection and reporting. Delays were a consistent problem: some companies took days or weeks to submit reports, allowing offenders to escalate harm.
5. Includes measures aimed at addressing cross-platform abuse
Because abuse rarely occurs on a single service, the law encourages collaboration and information-sharing across platforms and law enforcement.
Why Congress Passed the REPORT Act
Congress did not wake up one day and suddenly decide to intervene. The REPORT Act was a response to a documented and accelerating crisis:
1. A record 36.2 million CSAM reports in 2023
According to NCMEC’s CyberTipline data, almost all of these originated from platform reports, not from law enforcement.
2. Major companies repeatedly failed to report or preserve evidence
Journalistic investigations and federal lawsuits revealed:
delayed reporting by large platforms
platforms disabling detection tools
evidence deleted before investigators could access it
inconsistent internal protocols for reporting abuse
Major platforms have repeatedly failed to report, preserve, or act on evidence of child exploitation in a timely or responsible way. For example, Telegram, one of the world’s largest messaging apps, was fined A$957,780 by Australia’s eSafety Commissioner after the platform took more than 160 days to respond to a notice involving child sexual abuse material and extremist content. Regulators noted that the delay directly hindered efforts to investigate offenders and preserve evidence that could protect victims. At the same time, newly unsealed court filings against Meta allege that the company had long-standing internal knowledge of risks to minors across Facebook and Instagram, yet failed to consistently enforce detection or reporting procedures.
Journalists and attorneys described internal systems that were inconsistent, easily overridden, or deprioritized when they conflicted with growth and engagement metrics.
The problem extends beyond delayed reporting. In hundreds of pending lawsuits, Roblox Corporation is accused of failing to prevent sexual exploitation of minors on its platform, despite repeated warnings and evidence submitted by families. Parents allege that Roblox’s internal processes for detecting abuse, particularly in chat rooms and private messages, were fragmented, inconsistent, and in some cases unable to preserve evidence long enough for investigators to act. Together, these cases demonstrate why Congress strengthened the mandatory reporting framework: when platforms control the detection tools, the reporting pipelines, and the evidence itself, delays, gaps, and failures become systemic rather than accidental.
This pattern is exactly what the REPORT Act sought to address.
3. Escalation of livestreamed and AI-generated CSAM
Firms struggled (and, in many cases, refused) to update their detection systems. This created what NCMEC has described as “the most complex landscape of child exploitation” it has ever seen.
Law-enforcement agencies and child-safety researchers have warned that livestreamed and AI-generated child sexual abuse material is escalating rapidly, often outpacing the tools platforms rely on for detection. The U.S. Department of Justice and International Justice Mission describe live-streamed child sexual exploitation as “one of the fastest-growing forms of online abuse,” noting that the real-time nature of these crimes makes evidence almost impossible to preserve. At the same time, investigations by The New York Times and CNN have shown that AI-generated CSAM now circulates widely across major platforms and routinely evades detection systems that were never designed to identify synthetic imagery. Europol has similarly warned that AI is expanding child sexual abuse material “at a scale previously unimaginable,” and the nonprofit Thorn reports that companies have not adapted their detection pipelines to identify this new category of content.
4. The massive scale of encrypted messaging apps
Messaging apps like WhatsApp, Signal, Telegram, and Discord have surged in popularity as users seek greater privacy from data collection and surveillance; WhatsApp and Signal encrypt conversations end-to-end by default, and Telegram offers end-to-end encryption as an option. But the same features that protect legitimate conversations also make these services attractive for distributing harmful content and evading detection. Law-enforcement experts note that critical evidence is now “locked behind unbreakable digital walls,” and investigations show that criminals increasingly rely on encrypted platforms to operate anonymously. As these services grow, so does the difficulty of identifying and reporting child exploitation.
The REPORT Act Isn’t a “Youth Internet Safety Law” — But It Shows Something Important
The REPORT Act is narrowly focused on CSAM reporting. It does not address:
addictive design
algorithmic amplification
harassment and online harms
teen privacy
data trafficking
targeted marketing
age-appropriate safety standards
But it does represent something new:
Congress finally placed responsibility on platforms, not schools, parents, or children.
For the first time since 1998, the federal government acknowledged that companies, not kids, must change their practices.
This is the shift that Parts Five and Six of this series will explore in greater depth.
Where the REPORT Act Falls Short
It relies on the detection systems that companies choose to implement.
If a platform disables or weakens its detection tools, the law cannot compel it to build better ones.
It does not address teen safety at all.
A 14-year-old harmed by design choices, algorithms, or predatory adults receives no new protections.
It does nothing about the upstream causes of exploitation.
Many platforms still allow anonymous accounts, frictionless content sharing, and design features that enable grooming or targeted contact.
It does not regulate AI-generated CSAM production tools.
This is an enormous and growing blind spot.
It still depends on NCMEC’s capacity.
The CyberTipline is overwhelmed, underfunded, and dependent on Congress for annual appropriations.
Part 4 Reveals What’s Coming Next
The REPORT Act is a signal. Congress is finally acknowledging:
platforms must bear responsibility for harms
data retention is a safety issue
reporting pipelines must be modernized
the old model of voluntary compliance isn’t working
This momentum lays the groundwork for:
the Kids Online Safety Act (KOSA)
online design-standards proposals
state-level age-appropriate design laws
algorithmic transparency requirements
AI-specific child safety regulations
We are entering the first genuine regulatory moment since COPPA.
What Comes Next in This Series
In Part 5, we’ll look more closely at the current proposals shaping youth safety and digital well-being:
KOSA (Kids Online Safety Act)
The EARN IT Act
State-level Age Appropriate Design Codes
Federal proposals addressing AI, biometrics, and teen data
And we’ll ask the central question that threads through this entire series:
What would it look like if we actually designed the online world with children and teens in mind?
(c) Soni Albright, 2025 — Do As I Say, Not As I Do
Catch up on past articles from this series on my Substack:

Soni Albright is a teacher, parent educator, curriculum specialist, researcher, and writer for Cyber Civics with nearly 24 years of experience in education. She has taught the Cyber Civics curriculum for 14 years and currently works directly with students while also supporting families and educators. Her experience spans a wide range of school settings—including Waldorf, Montessori, public, charter, and homeschool co-ops. Soni regularly leads professional development workshops and is passionate about helping schools build thoughtful, age-appropriate digital literacy programs. Please visit: https://www.cybercivics.com/parent-presentations