For more than two decades, the Children’s Online Privacy Protection Act (COPPA) has been the primary U.S. law protecting children’s data online. Congress is now working to update that law through legislation known as COPPA 2.0, the Children and Teens’ Online Privacy Protection Act. The Senate recently passed its version of the bill unanimously as part of a broader push in Washington to address children’s safety and privacy online.
At the same time, the House of Representatives is advancing a separate package of online safety bills, the Kids Internet and Digital Safety (KIDS) Act, addressing issues such as platform accountability, artificial intelligence disclosures, safety audits, and age verification regimes. The House version of COPPA 2.0, however, was pulled from committee consideration as bipartisan negotiations continue, with no updated timeline announced.
While many policymakers agree that stronger protections are needed, some experts and parent advocates are raising important questions about how the COPPA 2.0 proposal would work in practice — particularly for teenagers ages 13–16 and their parents. Understanding the details matters, because the way these laws are designed will shape how children and teens interact with digital platforms for years to come.
A Quick Refresher: What COPPA Does Today
The original COPPA, enacted in 1998, was designed to give parents control over the personal information collected from children under 13.
Under current law:
- Operators of websites and online services directed to children under 13 must obtain verifiable parental consent before collecting personal information from those children.
- Companies must post clear notices describing what data they collect and how it is used.
- Parents have the right to review and delete their child’s data and to refuse further collection.
However, COPPA does not explicitly cover teenagers, which has long been viewed as a gap in the law. COPPA 2.0 attempts to address that gap.
What COPPA 2.0 Would Change
The proposed legislation expands the law to include teens aged 13–16 and updates how personal information is defined and regulated in the modern digital ecosystem.
Key elements include:
Updated and clarified definition of personal information
The bill clarifies and modernizes the definition of personal information to reflect modern digital systems, covering data such as persistent identifiers, biometric information, and records of online activity.
This reflects how modern platforms track and analyze user behavior across apps, services, and devices.
Restrictions on targeted advertising
The bill restricts individual-specific advertising to children and teens, limiting the ability of companies to use personal data for targeted ads.
Security and transparency requirements
The bill continues to require reasonable security practices and clear notice regarding how minors’ data is collected and used, extending these expectations to teens as part of the updated framework.
These updates are widely seen as positive steps in modernizing the law. But one aspect of the bill is generating significant debate.
The Central Policy Debate: Who Gives Consent for Teens?
Under COPPA today, parents must give consent for children under 13. COPPA 2.0 introduces a new framework for teens: for users aged 13–16, the bill has teens provide consent for themselves, rather than requiring a parent’s consent.
In practice, this means the teen, not the parent, is the party reviewing and agreeing to a platform’s terms and data practices.
Supporters argue that this reflects the reality that teenagers increasingly interact with digital platforms independently. Some child safety and privacy experts, however, believe parents may be surprised by this shift: once a child reaches 13, the legislation places primary control of consent and data rights with the teen rather than the parent. Others question whether early teens have the capacity to understand what they are consenting to.
A Real-World Question: What Does Teen Consent Mean in Practice?
A key issue that many parents may not immediately recognize is that online consent can operate differently from a traditional contract. When a user joins a platform, downloads an app, or creates an account, they are often presented with a Terms of Service agreement and a privacy policy.
These agreements often govern how companies can:
- collect, store, and share a user’s personal data;
- personalize content and recommendations;
- deliver advertising;
- retain data over time.
In many cases, these agreements function like contracts between the user and the company. Yet in the offline world, teenagers under the age of 18 generally cannot enter into binding agreements on their own.
For example, minors generally cannot sign a lease, open a credit card account, or enter into most contracts without a parent or guardian.
In nearly every context, parents retain legal responsibility and oversight for minors, with limited state-level exceptions such as specific healthcare rights for teens.
But under the COPPA 2.0 framework, a teen aged 13–16 could independently authorize companies to collect and process their personal information.
Why this raises new questions
This creates a set of practical questions many parents may not have considered:
- If a teen consents to a platform’s data practices, can a parent later revoke that consent?
- Can a parent see what data their teen has agreed to share?
- Who can intervene if a teen’s data is misused?
These questions become more complex in the modern digital environment. Most online services today rely heavily on machine learning systems that continuously analyze user data — including interactions, preferences, search history, and behavioral patterns. Even many adults struggle to understand how their data is used in these systems and how to exercise their own data subject rights. For that reason, some privacy experts believe the debate around COPPA 2.0 is not simply about advertising or social media. It is about who has the authority to make decisions about a minor’s digital identity and data footprint.
Why Some Experts Are Concerned
There are several reasons this design is drawing scrutiny.
Teens may not fully understand complex data uses
Modern digital systems often rely on complex data processing, including algorithmic profiling and artificial intelligence systems. Even many adults struggle to understand how their data is used. Some experts question whether teenagers should be expected to make fully informed decisions about these uses, especially when media and data literacy is not mandated in schools.
Parents may expect to retain oversight
Many parents assume that they retain authority over their child’s digital accounts until adulthood. Under the COPPA 2.0 framework, however, the law focuses primarily on teen consent and data rights rather than parental authorization and access rights for ages 13–16.
Data rights are structured around the teen
The bill explicitly provides teens with rights to access, delete, and correct their own personal data. Parents retain those rights for children under 13, but the framework shifts for teens. For some families, this may represent a different balance of authority than they expected.
Deletion is not the same as visibility
The Senate bill gives teens rights to access, correct, and delete their own data. Parents, by contrast, can only request deletion, and one concern is that deletion is not the same as review, oversight, or account termination. If a parent is worried that a teen may be in distress, being groomed, or engaging in risky activity online, a deletion remedy alone may not answer the question a parent is trying to resolve: What is happening, and how do I help?
That is why some critics argue that any modern teen privacy framework should address not only consent and deletion, but also whether parents retain any ability to revoke consent or seek appropriate access when a child’s safety is at stake.
How COPPA 2.0 Could Affect Parental Controls, App Stores, and Teen Accounts
Another practical question raised by the proposed legislation is how it would interact with the systems parents already rely on to supervise their children’s online activity. Today, most digital ecosystems assume that parents maintain some level of oversight over minors’ accounts.
For example:
- App stores offer family accounts in which parents approve downloads and purchases.
- Operating systems provide screen-time limits and content controls managed by a parent.
- Many social platforms offer supervised or linked accounts for younger users.
These systems are built around the assumption that parents remain the primary authority over a minor’s account. COPPA 2.0 introduces a more complex framework. The bill recognizes teen consent for ages 13–16 as sufficient for authorizing the collection and use of personal data, which raises practical questions about how parental oversight systems would function.
Who authorizes the account?
If a teen can independently consent to a platform’s data practices, it becomes less clear how parental approval flows would operate. Would a parent still need to approve the creation of an account in states that mandate parental consent? Likely not, since those state laws could be preempted. Would platforms treat the teen as the primary account holder? Likely yes, leaving platforms in a conflicted position: obligated to protect the teen’s data rights while withholding that information from the parent.
How would parental controls work?
Many parental control tools depend on account linking or parental authorization. If a teen is the legal consent holder for their data under COPPA 2.0, companies may need to reconsider how parental oversight features are implemented.
For instance:
- Could a parent still link to and monitor a teen’s account without the teen’s consent?
- Could a parent disable or terminate an account the teen independently authorized?
These questions are not explicitly answered in the legislation.
What about app stores and age verification systems?
At the same time Congress is debating COPPA 2.0, lawmakers are also considering proposals that would require app stores to verify user ages and transmit age information to apps. These proposals assume a structured age-verification ecosystem where platforms know whether a user is a child, teen, or adult. But if teens can independently consent to data collection once they reach 13, companies may have less incentive to implement robust parental consent systems beyond the under-13 threshold.
This could shift the architecture of youth online services toward a “13-and-up self-consent” model, rather than the model most families assume, in which parents and teens share oversight.
The State Law Question
Another topic receiving attention in policy discussions is how federal law interacts with state laws. The Senate version of COPPA 2.0 states that it only preempts state law where a direct conflict exists and explicitly says that states may adopt stronger protections. However, some policy analysts note that state laws requiring parental consent for teens could conflict with a federal framework that treats teen consent as sufficient.
If such conflicts arise, courts may ultimately determine which rule applies. This issue is likely to continue to be debated as legislation moves forward.
How COPPA 2.0 Fits Into the Larger Online Safety Debate
COPPA 2.0 is moving forward alongside a broader set of legislative proposals in Congress.
In the House of Representatives, lawmakers are currently considering a larger online safety package that includes:
- platform accountability and safety-design requirements;
- disclosure rules for artificial intelligence systems;
- mandated safety audits;
- age verification regimes.
These proposals reflect growing concern across political parties about children’s online safety. At the same time, they also show how complex the issue has become. Some bills focus on platform accountability and safety design, while others, like COPPA 2.0, focus specifically on data privacy and consent frameworks. Understanding how these policies align will be critical.
Why This Matters in an AI-Driven Digital World
The nature of online data collection has changed dramatically since COPPA was first enacted in 1998. Today, personal data is not used only for advertising or simple analytics. Increasingly, it contributes to large-scale artificial intelligence systems that learn from billions of user interactions. These systems analyze patterns in how people search, communicate, react, and engage online — including questions they ask, topics they explore, and behaviors they demonstrate across platforms. Over time, this data helps train systems that become more intelligent, more predictive, and more persuasive in how they shape digital experiences. For teenagers, this means the information they generate online today may influence the systems that shape the digital environment tomorrow. And these systems are not limited to influencing audiences in aggregate; they can build fine-grained understanding of, and influence over, the individuals who interact with them. This is why the debate surrounding COPPA 2.0 extends beyond advertising or social media.
At its core, the discussion is about who has the authority to consent to how a minor’s digital identity and behavioral data may be used in increasingly complex technological systems, and whether that individual has the capacity to consent.
The Bigger Question: Who Controls a Minor’s Digital Identity?
The debate around COPPA 2.0 ultimately raises a broader question that policymakers around the world are now confronting:
Who should control a minor’s digital identity?
In the physical world, society has long recognized that children and teenagers require both protection and guidance. While teens gradually gain independence, parents typically retain legal authority and responsibility for important decisions until adulthood. The digital world, however, is evolving rapidly.
Today, a child’s digital identity can include:
- social media accounts and posting history;
- search and browsing activity;
- location data;
- educational technology records;
- gaming profiles and in-app behavior;
- interactions with AI-powered services.
These systems often collect and analyze large volumes of personal data, creating a long-term digital footprint that can shape future experiences, opportunities, and risks. As a result, the question is no longer simply about advertising or app usage. It is about who has the authority to make decisions about the data that defines a young person’s digital life.

Some policy frameworks emphasize teen autonomy and digital participation. Others emphasize parental oversight and responsibility for minors. Most families likely expect some balance between the two. As governments update children’s privacy laws for the modern internet, the challenge will be designing systems that respect teens’ growing independence while still ensuring that parents remain meaningfully involved in decisions about their child’s digital identity.
The Bottom Line
At its core, the debate over COPPA 2.0 is not about whether teens deserve privacy protections — everyone agrees they do. The question is who should have the authority to make the final decisions. In the physical world, parents remain legally responsible for their children until adulthood. Whether the digital world should operate differently is now one of the central policy questions facing lawmakers. Modernizing children’s privacy laws is clearly necessary.
The digital world today is very different from the one that existed when COPPA was written in 1998. But as policymakers update these rules, the details matter. Ensuring that laws protect young people while preserving meaningful parental involvement will be one of the most important challenges in the next phase of children’s digital policy. As policymakers continue working toward stronger protections for young people online, the most important question may not be simply what rules to adopt — but how to design digital systems that allow families, platforms, and regulators to share responsibility for protecting the next generation online.