
Children's Online Privacy Protection Act

Based on Wikipedia: Children's Online Privacy Protection Act

Here's a peculiar fact about the modern internet: if you've ever wondered why so many websites ask if you're over thirteen years old, the answer traces back to a single law passed in 1998—one that effectively drew an invisible line through cyberspace, separating childhood from everything else online.

The Children's Online Privacy Protection Act, known universally as COPPA, created rules that still shape how every major website operates today. But its real legacy might be unintended: it's why your thirteen-year-old can sign up for Instagram while your twelve-year-old technically cannot.

The Wild West of Children's Data

To understand why COPPA exists, you need to picture the internet of the mid-1990s. Electronic commerce was just emerging from novelty into something that might actually matter. Websites were popping up everywhere, many of them aimed squarely at children—games, educational content, virtual communities where kids could explore and interact.

Almost none of these sites had privacy policies.

That's not an exaggeration. Studies at the time found that eighty-nine percent of children's websites were collecting personal information from their young visitors, and most weren't telling anyone—not the kids, not the parents, nobody. Names, addresses, phone numbers, email addresses, all of it flowing into databases with no oversight and no restrictions on how it might be used.

The Center for Media Education, a nonprofit advocacy group, spotted a particularly troubling example: a website called KidsCom. They filed a petition with the Federal Trade Commission, the government agency responsible for protecting consumers from unfair business practices, asking them to investigate. The FTC looked into it and issued what became known as the "KidsCom Letter"—a formal finding that yes, these data collection practices were indeed subject to legal action.

But taking action against one website at a time wasn't going to solve a systemic problem. Congress needed to set actual rules.

What COPPA Actually Requires

The law that emerged in 1998 and took effect in April 2000 is more specific than most people realize. It doesn't ban children from using the internet. It doesn't even ban websites from collecting information about children. What it does is create a framework of requirements that websites must follow when they collect personal information from anyone under thirteen.

The requirements break down into several categories. First, websites must post clear privacy policies explaining exactly what information they collect and what they do with it. Second, they must notify parents directly before collecting information from their children. Third—and this is the requirement with teeth—they must obtain what the law calls "verifiable parental consent" before collecting, using, or disclosing that information.

Verifiable parental consent means the website can't just add a checkbox saying "I have my parent's permission." They need actual verification—a signed form, a credit card transaction, a phone call, something that provides reasonable assurance that a parent actually agreed.

The law also gives parents ongoing rights. They can review what information has been collected about their child. They can demand it be deleted. They can refuse to allow further collection. And websites must maintain reasonable security procedures to protect whatever information they do collect.
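To see the statute's moving parts the way an engineer might, here is a minimal sketch in Python of an age gate, a consent record, and the parental review-and-delete rights. Everything in it is illustrative: the class names, the consent methods, and the in-memory storage are hypothetical stand-ins, not anything the law prescribes or any real compliance system.

```python
from dataclasses import dataclass, field
from datetime import date

COPPA_AGE_THRESHOLD = 13  # the statutory cutoff

def age_on(birth_date: date, today: date) -> int:
    """Whole years of age as of `today`."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1  # birthday hasn't happened yet this year
    return years

@dataclass
class ConsentRecord:
    """One parent's verifiable consent for one child (all fields illustrative)."""
    parent_contact: str   # how the parent was notified, per the notice requirement
    method: str           # e.g. "signed form", "credit card", "phone call"
    granted: bool = False
    collected: dict = field(default_factory=dict)

    # The parents' ongoing rights map onto simple operations on the record:
    def review(self) -> dict:
        return dict(self.collected)  # parents may inspect what was collected

    def delete(self) -> None:
        self.collected.clear()       # ...and demand it be erased

def collect(birth_date: date, info: dict, consent: ConsentRecord | None) -> None:
    """Refuse to store a child's personal info without verifiable consent."""
    if age_on(birth_date, date.today()) < COPPA_AGE_THRESHOLD:
        if consent is None or not consent.granted:
            raise PermissionError("verifiable parental consent required")
        consent.collected.update(info)  # keep the record tied to the consent
    # adults proceed unencumbered; a real system would persist `info` here
```

The telling detail is the asymmetry: the age check is one line, while everything hiding behind the consent record, contacting a parent, verifying an identity, keeping records, is where the cost lives. That asymmetry drives the next section.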

The Unintended Consequence

Here's where things get interesting. COPPA doesn't prohibit children under thirteen from using websites. It just makes it expensive and complicated for websites to let them.

Think about what verifiable parental consent actually means in practice. For every user under thirteen, a website would need to contact that user's parent, verify their identity, get their explicit permission, maintain records of that consent, provide mechanisms for parents to review and delete data, and ensure all third parties they share data with also follow these rules.

For a website with millions of users, that's an enormous operational burden.

So most websites chose a different path: they simply banned users under thirteen entirely. It's right there in the terms of service for Facebook, Instagram, TikTok, Twitter, YouTube, and virtually every other major platform. You must be at least thirteen years old to create an account.

This wasn't what Congress intended. The law was supposed to protect children while still allowing them to participate online with appropriate safeguards. Instead, it created a hard cutoff that pushed children into a legal gray zone where they either lie about their age or get locked out entirely.

The Safe Harbor System

The Federal Trade Commission, which enforces COPPA, created an interesting mechanism to help websites comply: the safe harbor program. Under this system, industry groups can create their own compliance frameworks. If the FTC approves these frameworks, websites that follow them get a kind of legal shelter—they'll face the industry group's disciplinary procedures rather than direct FTC enforcement.

Several organizations have received safe harbor approval over the years, including the Entertainment Software Rating Board (the organization that puts age ratings on video games), TrustArc (a privacy compliance company), and kidSAFE (which certifies children's apps and websites).

But the safe harbor system has had its problems. In 2021, a company called Aristotle withdrew from the program after FTC staff raised serious concerns about whether it was actually enforcing its rules. The Commission announced it would start scrutinizing the remaining safe harbors more closely—a tacit admission that self-regulation hadn't worked as well as hoped.

The Fines That Made Headlines

COPPA has real penalties. Courts can fine violators up to roughly fifty thousand dollars per violation, a cap the FTC adjusts for inflation, and when you're talking about a website with millions of users, violations add up fast: at that rate, mishandling the data of just a thousand children implies a theoretical exposure of fifty million dollars.

The FTC has pursued dozens of enforcement actions over the years, and some of the fines have been substantial. In 2019, the FTC settled with Musical.ly, the video app that ByteDance had acquired and folded into TikTok; the company agreed to pay $5.7 million, the largest COPPA fine in the law's history at that point. The app had been collecting information from children without parental consent, and as part of the settlement TikTok added a kids-only mode that restricts what younger users can post and share.

YouTube faced an even larger penalty that same year: $170 million for tracking the viewing history of children to serve them targeted advertisements. As part of the settlement, YouTube created a new system requiring video creators to mark their content as "made for kids," which would then disable personalized ads and certain features on those videos.

Epic Games, the company behind Fortnite, paid $275 million in 2022 for COPPA violations. The FTC alleged that Epic had collected personal information from children and made it unnecessarily difficult for parents to get that information deleted.

These big-name cases grab attention, but smaller companies have faced penalties too. A mobile advertising network called InMobi paid nearly a million dollars in 2016 for tracking user locations without consent, including the locations of children. Dating apps, gaming websites, and even the Mrs. Fields Cookies website have all drawn COPPA enforcement.

The YouTube Creator Panic

The YouTube settlement created an unusual situation that illustrates how COPPA's effects can ripple outward in unexpected ways.

When YouTube announced that creators would need to mark their videos as "made for kids," panic spread through the creator community. The stakes were high: the FTC could theoretically fine creators up to $42,530 per video that was incorrectly labeled.

But what counts as "made for kids"? The FTC listed factors to consider: the subject matter, visual content, use of animated characters, music style, age of people appearing in the video, presence of celebrities who appeal to children, and even empirical data about who actually watches.
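A toy scoring function makes the murkiness concrete. The factor names, weights, and implied threshold below are invented for this sketch; the FTC's actual determination is holistic, with no published point system.

```python
# Invented weights over the FTC's stated factors -- illustration only.
MADE_FOR_KIDS_FACTORS = {
    "child_subject_matter": 3,   # toys, nursery rhymes, and the like
    "animated_characters": 1,    # also common in content aimed at adults
    "kid_appealing_music": 1,
    "children_on_screen": 2,
    "child_celebrities": 2,
    "mostly_child_audience": 3,  # empirical data about who actually watches
}

def made_for_kids_score(signals: set[str]) -> int:
    """Sum the weights of whichever factors a video trips."""
    return sum(w for name, w in MADE_FOR_KIDS_FACTORS.items() if name in signals)

# The gray zones discussed below fall out immediately: an adult gaming
# channel can trip two factors while discussing mature themes.
gaming_commentary = {"animated_characters", "mostly_child_audience"}
toy_reviews = {"child_subject_matter", "children_on_screen", "mostly_child_audience"}
print(made_for_kids_score(gaming_commentary))  # 4 -- which side of the line?
print(made_for_kids_score(toy_reviews))        # 8 -- clearly "made for kids"?
```

Wherever you place the cutoff score, some real channel sits just on either side of it, which is exactly the position creators found themselves in.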

The problem was that plenty of content exists in gray zones. A video game commentary channel might have an adult creator discussing mature themes, but if the games appeal to children, is it "made for kids"? An animation channel might use cartoon styles that children enjoy, but create content aimed at adults. A family vlogger might have both children and adults as subjects and audience.

Creators with vastly different content—from a channel featuring a child reviewing toys to an adult political commentator to an animator known for storytelling—all found themselves trying to navigate the same murky guidelines. The simple binary choice YouTube offered, "made for kids" or "not made for kids," didn't capture the complexity of how actual content works.

The 2013 Update and Beyond

COPPA wasn't meant to be static. The law requires the FTC to review its rules periodically, and in 2013, the Commission issued significant updates to reflect how dramatically the internet had changed since 2000.

Remember, when COPPA first took effect, there were no smartphones. No apps. No social media as we know it today. YouTube didn't exist. Facebook didn't exist. The iPhone wouldn't arrive for another seven years.

The 2013 revisions expanded the definition of "personal information" to include things like photos, videos, audio recordings, and geolocation data—none of which were contemplated when the original rules were written. They also made clear that the rules applied to mobile apps, not just websites, and expanded the definition of "collection" to cover persistent identifiers like cookies that could be used to track children across different sites.
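To see why persistent identifiers earned a place on that list, consider how little machinery recognition requires. Below is a minimal sketch of a hypothetical server that assigns a first-visit cookie; nothing more is needed to recognize the same browser across sessions, and after 2013, setting such an identifier on a child's browser without consent can itself count as collection.

```python
import uuid
from http.cookies import SimpleCookie
from http.server import BaseHTTPRequestHandler, HTTPServer

class TrackingHandler(BaseHTTPRequestHandler):
    """Hypothetical server illustrating a bare-bones persistent identifier."""

    def do_GET(self):
        cookies = SimpleCookie(self.headers.get("Cookie", ""))
        self.send_response(200)
        if "visitor_id" not in cookies:
            # First visit: mint an identifier the browser will send back
            # on every future request, for up to a year.
            new_id = str(uuid.uuid4())
            self.send_header("Set-Cookie",
                             f"visitor_id={new_id}; Max-Age=31536000")
        self.end_headers()
        self.wfile.write(b"hello\n")

if __name__ == "__main__":
    # Every browser that returns to localhost:8000 is now recognizable.
    HTTPServer(("localhost", 8000), TrackingHandler).serve_forever()
```

Under the original 2000 rules, a random token like this was not "personal information" because it names no one; the 2013 revision's insight was that a durable identifier is all a tracker needs.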

The FTC opened another review in 2019, acknowledging that even the 2013 updates hadn't kept pace with technological change. Voice assistants, connected toys, artificial intelligence, facial recognition—all of these raised new questions about what it means to collect information from children and how to protect their privacy.

COPPA 2.0: The Expansion That Almost Happened

In 2024, Congress came close to dramatically expanding COPPA's reach. A bill called COPPA 2.0 would have extended the law's protections to all minors under seventeen, not just children under thirteen.

The bill passed the Senate with overwhelming bipartisan support—ninety-one votes in favor, only three against. It moved forward alongside another bill called the Kids Online Safety Act, which would have required social media platforms to give minors options to protect their information and opt out of personalized recommendations.

But neither bill made it through the House of Representatives before the congressional session ended in January 2025. Both died with the session, leaving the original thirteen-and-under threshold intact.

The near-miss illustrates an ongoing tension in how we think about protecting young people online. Thirteen was never a scientifically determined age of digital maturity—it was a legislative compromise made in 1998. A sixteen-year-old can face many of the same privacy risks as a twelve-year-old, but enjoys none of the same legal protections.

The Enforcement Gap

Having a law and enforcing it consistently are two different things. Despite COPPA's clear requirements, studies have repeatedly found that many apps and websites directed at children don't actually comply.

Part of the problem is definitional. What makes an app "directed to children"? A game with cartoon characters might appeal to children but not be designed specifically for them. An educational app might be used by children but marketed to parents and teachers. Developers often claim they didn't know children would use their products, making it harder to prove they had the "actual knowledge" required to trigger COPPA obligations.

Even when apps do comply technically, the implementation often fails children in practice. Terms of service and privacy policies are written in dense legal language that adults struggle to parse, let alone children. A website might technically disclose that it shares information with third parties, but that disclosure might be buried in thousands of words of legalese.

And the third parties themselves present problems. Research has found that data collected from children frequently ends up being used for targeted advertising, promoting products and content that children's advocates consider inappropriate—impulse purchases, unhealthy foods, content designed to be addictive.

Why Thirteen?

The choice of thirteen as the cutoff age has always been somewhat arbitrary. Congress didn't consult developmental psychologists to determine when children become capable of understanding digital privacy. They picked an age that seemed reasonable at the time: high enough to cover actual children, low enough to leave teenagers some autonomy.

Thirteen also had some precedent. It was and is the age at which children in many religious traditions are considered to reach a kind of maturity—bar and bat mitzvahs in Judaism, confirmation in some Christian denominations. It's the beginning of the teenage years, a natural-seeming boundary.

But the internet doesn't actually change on someone's thirteenth birthday. A twelve-year-old and a thirteen-year-old have essentially identical cognitive abilities, face the same risks, and are equally capable of clicking through terms of service without reading them. The law treats them completely differently.

This has led to a peculiar culture of age-lying online. Children learn early that claiming to be older grants access to platforms they want to use. Parents often tacitly approve or even help, creating accounts for children who haven't reached the magic threshold. The rule becomes more of a legal fiction than an actual protection.

The Global Context

COPPA is a United States law, but its effects extend globally. Any website or service based in the United States must comply with COPPA when dealing with children under thirteen, regardless of where those children actually live, so a child in London using an American app gets COPPA's protections. The law also claims to reach foreign companies whose services are aimed at American children, though enforcing it across borders is far harder in practice.

Other countries have taken different approaches. The European Union's General Data Protection Regulation, which took effect in 2018, includes provisions for children's data but allows individual member states to set their own age thresholds between thirteen and sixteen. The United Kingdom's Age Appropriate Design Code requires platforms to consider children's best interests in how they design their services, going beyond data collection to address broader questions of digital wellbeing.

Some countries have considered or implemented outright social media bans for younger users. Australia passed a law in late 2024 banning social media accounts for anyone under sixteen. These approaches reflect a growing sense that privacy rules alone aren't enough to address the effects of social media and digital platforms on young people.

The Deeper Questions

Twenty-five years after COPPA took effect, the fundamental questions it tried to address remain unresolved. How do we balance children's safety with their autonomy? How do we protect privacy without creating barriers that push children into unregulated spaces? How do we hold companies accountable when the technology changes faster than the law?

COPPA succeeded in establishing that children's privacy deserves special protection. It failed to create a system that actually keeps children safe from the sophisticated data collection practices that have become the foundation of the modern internet economy.

The law imagined a world where parents could meaningfully consent to data collection on behalf of their children, where privacy policies would be clear enough for families to make informed choices, where companies would act in good faith to protect their youngest users. The internet that actually emerged looks nothing like that.

What comes next is unclear. The near-passage of COPPA 2.0 suggests appetite for reform. The FTC's ongoing reviews signal awareness that the rules need updating. But any changes will face the same challenge the original law faced: trying to regulate a technology that evolves far faster than legislation can follow.

In the meantime, the invisible line at age thirteen remains, as arbitrary and consequential as ever, sorting the internet into spaces where children officially exist and spaces where they officially don't—even when everyone knows the reality is far more complicated.

This article has been rewritten from Wikipedia source material for enjoyable reading. Content may have been condensed, restructured, or simplified.