The Law That Followed the Kids

Arlo Gilbert

COPPA was signed into law on October 21, 1998. Bill Clinton was president. Google had been incorporated for about seven weeks. Neopets, soon to become the most popular website for children, was still a year from launch.

The problem COPPA was built to solve was specific and, by today's standards, almost quaint. Websites were asking kids for home addresses, phone numbers, and email addresses, then selling that information to marketers. A site called KidsCom had been collecting detailed personal profiles from children and sharing them with advertisers. The FTC investigated, Congress got involved, and COPPA became law.

The core idea was simple: if you run a website and you know a user is under 13, you need their parent's permission before you collect personal information. That was basically it.

Twenty-eight years later, that same law is about to become the most consequential AI privacy regulation in the United States.

What changed on paper

The FTC's updated COPPA rule took effect in June 2025. It's the first significant revision since 2013, and operators have until April 22, 2026 to comply with most of the new requirements. Twelve days from today.

The headline change is a single sentence in the FTC's commentary: disclosures of a child's personal information to train or otherwise develop artificial intelligence technologies are not integral to a website or online service.

That sentence is doing a lot of work. It doesn't say AI training requires extra disclosure. It doesn't say companies need to notify parents about their training practices. It says AI training is categorically not part of providing the service. If you want to use a child's data to train a model, you need separate verifiable parental consent. Every time. No exceptions.

The updated rule also expands what counts as "personal information." Biometric identifiers like voiceprints and facial templates are now included. Indefinite retention of children's data is banned. Operators must maintain a written data retention policy and a written information security program. The penalty for non-compliance is up to $51,744 per violation.
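
The retention ban is the piece that most directly touches infrastructure: "no indefinite retention" implies something in your stack has to enforce a deletion deadline. Here's a minimal sketch of what that might look like. The 180-day window, the record shape, and the function name are all illustrative assumptions, not anything the rule specifies:

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention window. The rule requires a written policy,
# not a specific number; 180 days here is an assumption.
CHILD_DATA_RETENTION = timedelta(days=180)

def purge_expired_child_records(records: list[dict]) -> list[dict]:
    """Drop child records older than the policy's retention window.

    Each record is assumed to look like:
    {"user_id": str, "is_child": bool, "collected_at": datetime, ...}
    """
    now = datetime.now(timezone.utc)
    kept = []
    for record in records:
        age = now - record["collected_at"]
        if record["is_child"] and age > CHILD_DATA_RETENTION:
            continue  # past the window: deleted, not archived
        kept.append(record)
    return kept
```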

Why COPPA and not some other law

Something I think about a lot, given Osano's work across privacy regulations, is why some laws survive technological change and others don't.

The Children's Television Act of 1990 is a useful comparison. It regulated advertising during children's broadcast programming. When cable arrived, the rules stretched to fit. When streaming arrived, they broke. Netflix and YouTube aren't "broadcasters" under the law. The regulation was anchored to a specific technology, broadcast signals, so when kids moved to a different platform, the law couldn't follow them.

COPPA took a different approach. It said: if the user is under 13 and you're collecting their personal information, these rules apply. When smartphones arrived and children moved to apps, the FTC updated the rule in 2013 to clarify that mobile apps counted. The jurisdictional anchor held because it was attached to the child, not the platform.

Now AI is the platform. COPPA followed the kids again.

The United States has no federal AI privacy law. The EU has the AI Act reaching full enforcement in August 2026. The U.S. has a patchwork of state laws and sector-specific regulations. In that vacuum, COPPA turns out to be the sharpest tool available. Not because anyone designed it for AI. Because it was designed around the right principle: protect the person, and the law outlives the technology.

What this means if you build AI products

If you operate any service where a child under 13 might create an account, use a chatbot, or interact with an AI feature, COPPA now applies to your AI training pipeline. Not just your product. Your pipeline.

Most AI companies treat user interactions as a training resource. Conversations with chatbots. Search queries. Voice commands. The operating assumption has been that terms-of-service consent covers model improvement. The updated COPPA rule says: not for children. For children's data, consent to AI training must be obtained separately from general consent. You can't bury it in a click-through.
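
To make that concrete, here is a minimal sketch of a consent gate sitting in front of a training pipeline. Everything in it is an assumption for illustration: the record shape, the `ai_training_consent` flag, and the idea that the training grant is tracked as its own field rather than folded into a terms-of-service checkbox.

```python
from dataclasses import dataclass

@dataclass
class UserRecord:
    user_id: str
    under_13: bool                      # known or inferred age status
    verifiable_parental_consent: bool   # COPPA-style consent on file
    ai_training_consent: bool           # the *separate* grant for training

def eligible_for_training(record: UserRecord) -> bool:
    """Gate a record before it enters the training corpus.

    Adult data is governed by other consent flows upstream. A child's
    record needs verifiable parental consent granted specifically for
    AI training; general consent alone is not enough.
    """
    if not record.under_13:
        return True
    return record.verifiable_parental_consent and record.ai_training_consent

records = [
    UserRecord("a1", under_13=False, verifiable_parental_consent=False, ai_training_consent=False),
    UserRecord("k9", under_13=True, verifiable_parental_consent=True, ai_training_consent=False),
    UserRecord("k7", under_13=True, verifiable_parental_consent=True, ai_training_consent=True),
]
training_corpus = [r for r in records if eligible_for_training(r)]
# Only "a1" and "k7" pass. "k9" has parental consent on file but no
# separate training grant, so the record is excluded from training.
```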

The companies most exposed aren't the ones building kids' products. Those companies already know COPPA. The most exposed are general-purpose AI platforms that haven't thought about children as a user category because they didn't build their product for children.

If your chatbot is used by millions of people, some of them are under 13. The FTC has been expanding what counts as constructive knowledge of a user's age, and "we didn't ask" is increasingly not a viable defense. We see this pattern at Osano all the time. Companies come to us thinking their privacy exposure is in one area, and the regulation they never considered turns out to be the one that applies most directly.

Where this goes

The FTC has said children's privacy is its top enforcement priority. Not one priority among several. Top priority. The updated rule gives them substantially more to work with, and the penalty math gets large fast. A platform with a million child users running AI training without separate consent could face numbers that dwarf a typical GDPR fine.
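
The back-of-the-envelope version, assuming (and this is an assumption about how violations would be counted, not settled law) that each affected child is one violation at the current figure:

```python
PENALTY_PER_VIOLATION = 51_744  # current COPPA civil penalty cap, USD
child_users = 1_000_000         # hypothetical platform

exposure = child_users * PENALTY_PER_VIOLATION
print(f"${exposure:,}")  # $51,744,000,000, about $51.7 billion

# For scale: the largest GDPR fine to date is roughly EUR 1.2 billion.
```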

The fines will get attention. The precedent matters more.

COPPA just established that AI training on a protected class of users' data requires separate, explicit consent. That principle won't stay limited to children for long. If the legal logic says children's data requires its own consent for AI training, adults aren't far behind. Several states have already introduced bills requiring opt-in consent for AI training on personal data, regardless of the user's age.

The April 22 deadline will reshape more than children's privacy. It's building the template for how every future AI privacy law in this country defines training consent.

In 1998, the biggest threat to a child's privacy was a website asking for their zip code. That law now draws the lines for trillion-dollar AI training pipelines. Nobody planned this. It turns out that when you anchor a regulation to the people instead of the technology, the regulation can grow up alongside the threats it was meant to address.

COPPA has been growing up for twenty-eight years. In twelve days, it catches up.
