
App Store Age Assurance Bills Spark Concern Despite Good Intentions
Not a day seems to pass without a new bill aimed at bringing long-overdue and desperately needed protections to children, who are increasingly exposed to well-documented harms and risks online. Those working to better protect children online should be applauded. It's high time big tech was no longer able to turn a blind eye and willfully disregard the privacy and safety of younger users while amassing vast profits from their data.
This sudden rush to pass new regulations, an attempt to solve the problem after the horse has bolted, could lead to unintended consequences. Many big tech giants are delighted to see lawmakers putting the responsibility for verifying age and obtaining consent on the app stores. Meanwhile, app stores are pushing back, arguing that app developers are responsible for the data collection, usage, features, and functionality of the apps they create, and that developers should therefore handle age verification and consent.
Hot on the heels of Utah’s App Store Accountability Act (SB142), Texas has enacted a law requiring parental consent before users under 18 can download apps or make in-app purchases. Other states, including Ohio and Louisiana, are proposing similar legislation, and there are federal efforts as well. U.S. Senator Mike Lee, who is sponsoring a federal version of the App Store Accountability Act, claimed the bill would “bring age verification and accountability to the source of the problem.” However, apps are only one source of the problem. These bills don’t protect children across the broader digital ecosystem: on websites, in games, and in services not covered by this legislation.
So, what are the concerns with this well-meaning approach? The Texas and Utah laws require app stores to verify whether a user is an adult. Since children cannot legally enter into contracts, an adult must be verified and must then provide consent when a child tries to download an app or make an in-app purchase. The consent process must inform the parent of the app’s age rating and how their child’s data will be used.
On the surface, this move seems like a good step, solving some of the issues at the door, but in practice it could ultimately be detrimental to children. Apps collect many types of data: some visible to the user, like age or contact information, and much that is invisible, like device IDs, IP addresses, analytics, and geolocation data, to name a few. There are thousands of apps in the stores, and many developers don’t themselves have a clear picture of the data collected by the SDKs they embed to operate their apps. Nor is it always clear what happens to that data.
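To make the "invisible data" point concrete, here is a minimal hypothetical sketch. The SDK name and endpoint below are invented for illustration, not a real product; the point is that a one-line integration can gather data the developer never explicitly handles in their own code.

```typescript
// Hypothetical illustration only: "AnalyticsSDK" and the endpoint are
// invented names, not a real library.

interface DeviceFingerprint {
  deviceId: string;   // advertising or device identifier
  ipAddress: string;  // also observed server-side; doubles as coarse geolocation
  osVersion: string;
  locale: string;
}

class AnalyticsSDK {
  static init(apiKey: string): void {
    // Collected silently at runtime; none of this appears in the
    // developer's own code or, typically, in the app's age rating.
    const fingerprint: DeviceFingerprint = {
      deviceId: "example-device-id",
      ipAddress: "203.0.113.7",
      osVersion: "17.4",
      locale: "en-US",
    };
    // Shipped to the SDK vendor; what happens downstream is opaque to
    // both the developer and the app store.
    void fetch("https://vendor.example/collect", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ apiKey, fingerprint }),
    });
  }
}

// From the developer's perspective, the entire integration is one line:
AnalyticsSDK.init("MY_API_KEY");
```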
App store age ratings tend to focus on content (e.g., violence or mature themes) rather than data collection. As a result, they provide limited value to parents and fall short of supporting truly informed consent.
The language of these bills is often ambiguous. Utah’s law, in particular, raises more questions than it answers, especially regarding how consent will be implemented. Will parents receive clear, transparent notices? If the app store is responsible for obtaining consent, who is liable for misrepresentations about data collection and usage? App stores don’t build the apps; they can only police them to a degree. Valid consent can only be obtained by providing clear and transparent notice to the parent, and it’s not clear whether the notice requirements will be robust and comprehensive enough to allow the parent to make an informed choice.
Apple and Google opposed the bills. They argue that complying will require them to collect more personal data to verify age and obtain consent, and to share that data with app developers. Verifying age necessarily involves collecting and processing personal data, often sensitive personal information such as government-issued IDs, biometric data, or financial credentials. App store accountability acts, like other age verification legislation in the US, could face court battles and claims that they are unconstitutional, infringing on the First Amendment by requiring age verification before users can access content.
"There’s not enough attention on the real risks that these proposals create," Kareem Ghanem, Google’s director of public policy, said in an interview with The Center Square. "These bills would do nothing to address people’s concerns. And in the process, they’re letting Zuckerberg and Meta off the hook by providing this false sense of security that no amount of age verification at an app store level can really solve."
Meanwhile, social media companies largely support the legislation, because it shifts the burden away from them. Several platforms issued a joint statement, reported in the news, claiming that parents want a one-stop shop for age verification and consent. Some of these companies, including Meta and Spotify, have also formed the “Coalition for a Competitive Mobile Experience” to influence federal and state policy.
So, if this isn't the right approach—what is?
There’s no single solution. Assessing the risk of data collection and use should determine whether and when age assurance is required, and the level of age assurance needed. Regulations requiring this kind of risk assessment already exist: COPPA in the US has a sliding scale of consent akin to a risk assessment; the GDPR is risk based; and children’s codes in the UK and US, such as California’s AADC (currently enjoined) and laws in Maryland and Vermont, require privacy and safety by design. Where there are privacy and safety risks, the level of age assurance required should be proportionate to the risk. These laws already demand this approach, and it would work if social media companies and app creators adhered to them.
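To illustrate the proportionality principle, here is a minimal sketch of how a service might map data-collection risk to the required level of assurance. The risk tiers and method names are illustrative assumptions of our own, not drawn from any statute.

```typescript
// Minimal sketch of proportionate, risk-based age assurance.
// Tiers and method names are hypothetical.

type RiskLevel = "low" | "medium" | "high";
type AssuranceMethod =
  | "self-declaration"
  | "age-estimation"
  | "verified-parental-consent";

// Higher-risk data practices demand stronger assurance, echoing COPPA's
// sliding scale of consent and the risk-based approach of the GDPR and
// the children's codes.
function requiredAssurance(risk: RiskLevel): AssuranceMethod {
  switch (risk) {
    case "low":
      return "self-declaration"; // e.g., no personal data collected
    case "medium":
      return "age-estimation"; // e.g., analytics or contextual ads
    case "high":
      return "verified-parental-consent"; // e.g., geolocation, profiling, messaging
  }
}

console.log(requiredAssurance("high")); // "verified-parental-consent"
```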
Age should be verified once by a neutral third party, and a federated ID created for the parent. This spares the parent from repeatedly sharing sensitive data across dozens of apps, platforms, and websites. Neutral providers already offer privacy-preserving certified technologies capable of managing this, and using them ensures that age verification and parental consent are not handled by parties with commercial conflicts of interest (e.g., the stores, social media companies, or app creators and developers). Parents can then be served informed notice by the service their child wishes to access and opt in or out. Consent management, too, can be handled by trusted third parties. These solutions already exist; there’s no need to reinvent the wheel. Instead, legislation should focus on interoperability and governance, ensuring that digital services work with neutral third parties building federated family IDs and consent tools that interoperate, supporting a healthy, privacy-safe ecosystem for younger children and teens.
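As a minimal sketch, assuming a hypothetical neutral-provider API (the types and function names below are invented for illustration; real certified providers, including PRIVO, expose their own interfaces), the verify-once, federated flow might look like this:

```typescript
// Hypothetical verify-once, federated flow. All names are illustrative.

type ConsentDecision = "granted" | "denied" | "pending";

interface FederatedParentId {
  subject: string;        // opaque pseudonymous ID, not the parent's documents
  verifiedAdult: boolean; // the only fact downstream services learn
  issuedAt: number;
}

// Step 1: the parent verifies age ONCE with a neutral third party.
// The sensitive evidence (ID, biometrics) stays with that provider.
function verifyParentOnce(providerUrl: string): FederatedParentId {
  return {
    subject: `fed-${Math.random().toString(36).slice(2)}`,
    verifiedAdult: true,
    issuedAt: Date.now(),
  };
}

// Step 2: any app, platform, or website requests consent through the
// provider, passing only the opaque ID plus its own data-use notice.
function requestConsent(
  parent: FederatedParentId,
  serviceName: string,
  dataUseNotice: string, // the clear, transparent notice shown to the parent
): ConsentDecision {
  if (!parent.verifiedAdult) return "denied";
  console.log(`Notice from ${serviceName} to ${parent.subject}: ${dataUseNotice}`);
  return "pending"; // resolved when the parent opts in or out
}

// The parent verifies once, then fields consent requests from dozens of
// services without re-sharing sensitive data with any of them.
const parent = verifyParentOnce("https://neutral-provider.example");
console.log(requestConsent(parent, "ExampleKidsApp", "Collects gameplay analytics only."));
```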
As the digital world evolves, so must our strategies to protect young users. Rather than placing responsibility on the wrong gatekeepers, it’s time to adopt scalable, privacy-first solutions that respect families and ensure informed, meaningful consent. PRIVO has been leading the charge in this space for over a decade, helping companies comply with regulation and protect minors through trusted, secure age assurance and consent services.
Explore how PRIVO’s technology can support your organization’s compliance, online privacy and safety goals. Get in touch to learn more.