Online Safety Laws for Children & Teens: Part 2
- Soni Albright
How COPPA Has Evolved & What the FTC Has Tried to Fix

You can read Soni's full series and other articles about the intersection of media literacy, education, parenting, and more on her Substack: Do As I Say, Not As I Do.
About this series:
This series offers a practical overview of the key U.S. laws that govern online privacy and safety for children and teenagers. It begins with the Children’s Online Privacy Protection Act (COPPA), passed by Congress in 1998 and first implemented by the Federal Trade Commission (FTC) in 2000, with major rule updates in 2013 and 2025. Together, these regulations form the foundation for how companies collect, use, and share data from children under the age of 13. The series also examines related federal laws, including the Children’s Internet Protection Act (CIPA, 2000), which governs schools and libraries; the REPORT Act (2024), which expands online exploitation reporting requirements; and current proposals such as the Kids Online Safety Act (KOSA) and the EARN IT Act, which aim to address risks facing teens. Each installment breaks down what these laws actually say, how they are enforced, and what they mean for families, educators, and young people navigating digital spaces today.
From a 1998 Law to a Living Rule
Although COPPA was passed by Congress in 1998, the law itself hasn’t changed much in more than two decades. Congress wrote the statute broadly and delegated most of the real work to the Federal Trade Commission (FTC), directing the agency to “prescribe regulations” to protect children’s personal information. That regulatory framework — known as the COPPA Rule, located in 16 CFR Part 312 — is the part that has changed over time (source).
The FTC first issued the COPPA Rule in 1999, and it became effective in April 2000. At that point, COPPA covered only a narrow slice of what we now recognize as the online ecosystem: websites, basic data forms, and early chat/message boards.
Everything that came later, including mobile apps, smartphones, social networks, behavioral advertising, geolocation tracking, and now AI-driven tools, emerged outside COPPA’s original design. The FTC has been trying to “patch” this gap ever since.
The 2013 Update: Expanding What “Personal Information” Means
By the late 2000s, the internet no longer resembled the environment for which COPPA was written. Children used smartphones, apps collected data invisibly, and third-party advertising networks were tracking users across sites. In response, the FTC issued a major update to the COPPA Rule in 2013.
Several important changes were made:
1. New categories of children’s personal data
The FTC expanded “personal information” to include:
Geolocation data
Photos, videos, and audio files containing a child’s image or voice
Persistent identifiers (cookies, device IDs, IP addresses) when used for tracking
Screen or user names when they function as identifiers
This was the first acknowledgment that children’s data was being collected passively, not just through forms.
2. Closing the “third-party loophole”
Before 2013, sites could argue that advertisers and plug-ins, not the site itself, were collecting the data.
The 2013 update clarified that operators are responsible for data collection by third-party integrations on their platforms.
3. Updated parental consent mechanisms
The FTC allowed new verifiable parental consent methods (e.g., knowledge-based authentication, video verification), recognizing the need for more accessible tools.
4. Clearer accountability for mixed-audience sites
Platforms serving both children and adults now had to:
Age-screen users neutrally, and
Treat under-13 users as covered by COPPA.
This was especially relevant as platforms like YouTube and social games became extremely popular with kids.
You are probably wondering: what does “age-screen users neutrally” mean? Great question! For years, mixed-audience platforms used age screens not as a safeguard but as a loophole. Instead of asking for a child’s age neutrally, many sites designed prompts that steered kids to lie: dropdown menus that began at age 18, pop-ups that warned “You must be 13 or older,” or sign-up flows that made it impossible to proceed without claiming to be an adult. This was not an accident; it was a deliberate strategy to avoid COPPA obligations while still benefiting from children’s engagement, data, and advertising value. Parents, too, were routinely misled into thinking a site was “not for kids” when, in reality, the platform fully expected children to use it and built its business model around them. The 2013 requirement for neutral age screening was a direct response to this pattern of trickery and system abuse, an attempt to stop companies from using design to create the illusion of compliance while knowingly collecting data from millions of children.
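For readers who want a concrete picture of the difference, here is a rough sketch in TypeScript of what a neutral age screen does behind the scenes. The function names (isUnder13, routeSignup) are hypothetical, and this is an illustration of the idea rather than code from any real platform: the prompt asks for a birthdate without hinting at the cutoff, then quietly routes under-13 users into a parental-consent, COPPA-covered path instead of warning them away or nudging them to claim they are adults.

```typescript
// Illustrative sketch only (hypothetical names, not any real platform's code).
// A neutral age gate asks for a birthdate with no hints about the cutoff,
// then routes under-13 users into a COPPA-covered flow instead of blocking
// them or steering them to lie.

function isUnder13(birthdate: Date, today: Date = new Date()): boolean {
  const thirteenthBirthday = new Date(
    birthdate.getFullYear() + 13,
    birthdate.getMonth(),
    birthdate.getDate()
  );
  return today.getTime() < thirteenthBirthday.getTime();
}

function routeSignup(birthdate: Date): "request-parental-consent" | "standard-signup" {
  // Neutral screening: the form itself never says "you must be 13 or older"
  // and the default selection does not start at age 18.
  return isUnder13(birthdate)
    ? "request-parental-consent" // under-13 users enter the COPPA-covered path
    : "standard-signup";
}

// Example: a user born June 1, 2015 is routed to the parental-consent flow.
console.log(routeSignup(new Date(2015, 5, 1)));
```

The biased screens described above do the opposite: they reveal the cutoff up front and make the adult answer the path of least resistance.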
As a media literacy teacher, I don’t experience this as abstract; I see the consequences every day. I regularly talk with students about what it means ethically to lie about your age just to gain access to a platform. That choice sends a message: if honesty is expected “in real life,” but online it’s acceptable to bend the truth, then the digital world becomes a place where rules feel optional. Even more troubling, I often find myself explaining to parents why helping a child bypass age restrictions by entering a false birthdate is not harmless: it teaches kids that online spaces don’t require integrity, and that the ends justify the means. It is the purest example of “do as I say, not as I do.” And the real punchline is this: the companies wanted it that way. These platforms designed age screens that quietly encouraged children to lie because it allowed them to keep kids on the platform, collect their data, and sidestep the law.
And oh by the way, even after the most recent COPPA update, companies continue to exploit underage users and design platforms that deliberately attract children while pretending otherwise. A recent multi-state complaint lays this bare, noting that:
TikTok is “directed to children” because its audience includes millions of users under 13, because children are an ‘intended audience’, and because the platform features child-directed subject matter, characters, activities, music, and content, along with models, celebrities, and advertisements that appeal directly to children. (Source)
In other words, TikTok knows children are on the platform, benefits from their engagement, and does nothing meaningful to prevent their participation, even though the platform is listed as 13+. This practice is ongoing, and it underscores a central truth of this series: even with updated rules, tech companies consistently choose growth and attention over complying with children’s privacy laws.
2025: The Most Significant Update Since COPPA Was Born
In April 2025, the FTC published another major update to the COPPA Rule, its most sweeping revision since 2013 and one that was long overdue. Several changes reflect new realities in children’s digital lives:
1. Biometrics and AI-relevant data added
The FTC updated “personal information” to include:
Biometric identifiers (faceprints, iris scans, fingerprints, voiceprints)
Any data used for automated recognition
This reflects the widespread use of facial recognition, voice assistants, and algorithmic systems (source).
Long-term consequences for children: Once collected, biometric and location data can form a permanent profile of a child’s voice, face, movements, and behavior, a profile that can be used for targeting, behavior modification, or AI training rather than for the child’s protection. These data footprints persist into adolescence and beyond, often without the child or parent ever understanding the full implications.
2. Stricter data retention and deletion standards
Operators must:
Keep data only as long as necessary
Maintain written deletion procedures
Stop indefinite retention “for potential future use”
“Keep data only as long as necessary” means companies may store a child’s information only for the specific purpose the parent agreed to, and must delete it once that purpose is fulfilled. This change was driven by cases in which companies, including Amazon, retained children’s voice recordings and location data indefinitely, even after parents requested deletion (source, source). Long-term retention creates lasting risks: children’s data can be repurposed for AI training and targeted advertising, or resurface in future data breaches (the risk increases the longer data is stored) long after the child has stopped using the service.
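To make “only as long as necessary” concrete, here is a minimal sketch of a purpose-bound deletion sweep. The data model and function names (ChildDataRecord, shouldDelete, runDeletionSweep) are hypothetical, not a design the FTC prescribes: each record is tagged with the purpose the parent consented to and is removed once that purpose is fulfilled or consent is withdrawn.

```typescript
// Minimal sketch of purpose-bound retention (hypothetical data model, not an
// FTC-mandated design): every record carries the purpose the parent consented
// to and a date after which that purpose is considered fulfilled.

interface ChildDataRecord {
  recordId: string;
  purpose: "voice-command-processing" | "account-management";
  collectedAt: Date;
  purposeFulfilledBy: Date;  // when the consented purpose ends
  consentWithdrawn: boolean; // parent requested deletion
}

function shouldDelete(record: ChildDataRecord, now: Date = new Date()): boolean {
  // Indefinite retention "for potential future use" is never a reason
  // to keep a record past its purpose.
  return record.consentWithdrawn || now.getTime() >= record.purposeFulfilledBy.getTime();
}

function runDeletionSweep(records: ChildDataRecord[], now: Date = new Date()): ChildDataRecord[] {
  // A written deletion procedure would run this sweep on a schedule and log
  // what was removed; here we simply return the records that may be kept.
  return records.filter((record) => !shouldDelete(record, now));
}

// Example: a voice clip whose consented purpose ended on Feb 15, 2025 is
// flagged for deletion during a sweep on March 1, 2025.
const example: ChildDataRecord = {
  recordId: "rec-001",
  purpose: "voice-command-processing",
  collectedAt: new Date(2025, 0, 15),
  purposeFulfilledBy: new Date(2025, 1, 15),
  consentWithdrawn: false,
};
console.log(shouldDelete(example, new Date(2025, 2, 1))); // true
```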
3. Clearer rules around data use for AI training
FTC commentary states that using children’s data to “train or otherwise develop artificial intelligence technologies” is not integral to service operation, meaning it requires separate parental consent.
4. Safer, more accountable “safe harbor” programs
One of the clearest examples of why companies cannot be trusted to police themselves—an argument we began in Part 1—is the overhaul of COPPA’s “safe harbor” system. Safe harbor programs are private, industry-run compliance organizations that companies can join instead of working directly with the FTC. In theory, these programs were supposed to enforce rules at least as strong as COPPA. In practice, many operated with minimal transparency, weak oversight, and relationships that were far too cozy with the companies they supervised. The 2025 update finally tightens this system: safe harbor programs must now file annual compliance reports, maintain public lists of all participating companies, and undergo regular FTC review to prove they are actually enforcing meaningful protections. In other words, this rule exists because history has repeatedly shown that voluntary industry oversight fails. When children’s privacy is at stake, “trust us” is not a safety strategy, and the FTC has had to intervene once again to do the job the industry refused to do.
5. Strengthened security requirements
Platforms must adopt:
Written security programs
Regular testing and monitoring
More substantial vendor oversight
Why These Updates Were Necessary
Between 2000 and 2025, COPPA’s core requirements stayed the same; the internet did not.
What Was Really Happening Behind the Scenes
By the early 2010s, it became clear that companies were not simply “accidentally” collecting children’s data; they were building entire business models around it. Many child-directed mobile apps, games, and connected toys were embedding third-party SDKs (software development kits)—small pieces of code supplied by advertisers, analytics companies, and data brokers. These SDKs quietly harvested precise geolocation, device identifiers, app usage patterns, and sometimes even WiFi data the moment a child opened the app. All of this occurred without meaningful parental consent, despite COPPA’s requirements. (In short: SDKs were invisible data pipelines hidden inside children’s apps.)
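To see just how invisible that pipeline is, here is a deliberately simplified sketch. The SDK, function names, and values are all made up (no real product or API), but it shows the basic mechanic: a single initialization call inside a “free” children’s game can ship off a persistent device identifier and an approximate location before anyone sees a consent prompt.

```typescript
// Deliberately simplified sketch (hypothetical SDK, placeholder values, not
// any real product): one init call in a "free" kids' game wires up a
// third-party analytics/advertising SDK that collects passively.

interface AnalyticsSdk {
  init(apiKey: string): void;
  track(event: string, payload: Record<string, unknown>): void;
}

// Stand-in for a third-party SDK bundled into the app by the developer.
const thirdPartyAnalytics: AnalyticsSdk = {
  init: (apiKey) => console.log(`SDK initialized with key ${apiKey}`),
  track: (event, payload) => console.log(`Sent "${event}" event:`, payload),
};

function launchKidsGame(): void {
  // The child just taps "Play": no form, no visible data collection.
  thirdPartyAnalytics.init("demo-api-key");

  // Passive collection of a persistent identifier and location,
  // with no parental consent anywhere in the flow.
  thirdPartyAnalytics.track("session_start", {
    deviceId: "a1b2c3-persistent-identifier",            // placeholder value
    approximateLocation: { lat: 40.7128, lon: -74.006 }, // placeholder value
    appVersion: "1.0.0",
  });
}

launchKidsGame();
```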
Real-world cases reveal that companies understood exactly what they were doing. In 2025, the FTC filed a complaint against Apitor, a Chinese robot-toy maker whose companion app, marketed to ages 6–14, collected children’s precise GPS coordinates through an embedded SDK—all without obtaining verifiable parental consent (source). Other investigations showed that data brokers such as Kochava, Gravy Analytics, and Venntel were compiling hundreds of millions of precise location points from mobile devices each year, often tied to persistent identifiers (source, source). These datasets captured visits to medical facilities, reproductive health clinics, places of worship, union meeting locations, addiction recovery centers, shelters, and schools, and in some cases were used to geofence those areas—creating virtual perimeters that triggered targeted data collection the moment someone crossed the boundary. Although these cases largely involved adults, the same ad-tech pipelines were present in children’s games, toys, and “free” apps. This meant that children’s home addresses, daily routes between school and after-school activities, and behavioral patterns could be mapped and stored, sometimes for years. And because location data reveals the most intimate details of a person’s life, you would never want a child’s physical movements tracked, profiled, or sold to unknown third parties.
What makes this especially troubling is the purpose for which the data were used. These long-term records formed detailed advertising and behavioral-modification profiles, not to support children’s well-being, but to predict and shape future behavior. A child who played a simple “free” game could have their movements tracked between home, school, and after-school activities; their in-app behavior logged; and their device stitched into commercial datasets designed to sell them things. These profiles persist into adolescence and beyond. This was not a misunderstanding. It was a predatory business practice that companies obscured for as long as possible.
Closing the Gaps
From COPPA’s implementation in 2000 through the present day, children’s data has flowed through mobile apps, voice assistants, streaming platforms, educational technology, gaming ecosystems, and now AI-driven tools. Each new technology created opportunities for data collection that COPPA’s original language had never envisioned. The FTC’s rule updates are attempts to close these widening gaps: not expansions of the law, but reinterpretations of Congress’s original mandate to protect children under 13 online.
Still, there are substantial limits. The 2025 update does not:
extend protections to teens,
regulate algorithmic outcomes,
set age-appropriate or ethical design standards,
protect children from manipulative platform design, or
create new enforcement powers.
Those issues have fueled current debates over KOSA and other youth-safety proposals, which we will explore later in this series.
What Comes Next
COPPA’s evolution shows the FTC trying to keep pace with an internet that changes faster than regulation can keep up. Even with the 2013 and 2025 updates, COPPA remains a narrow law built for a specific age group and a much earlier technological era.
In Part 3, we’ll look at the Children’s Internet Protection Act (CIPA), a very different type of law, and how schools and libraries became frontline gatekeepers in children’s digital safety.
© Soni Albright, 2025 — Do As I Say, Not As I Do
Soni Albright is a teacher, parent educator, curriculum specialist, researcher, and writer for Cyber Civics with nearly 24 years of experience in education. She has taught the Cyber Civics curriculum for 14 years and currently works directly with students while also supporting families and educators. Her experience spans a wide range of school settings—including Waldorf, Montessori, public, charter, and homeschool co-ops. Soni regularly leads professional development workshops and is passionate about helping schools build thoughtful, age-appropriate digital literacy programs.