The European Union is moving toward one of its most consequential digital interventions yet. European Commission President Ursula von der Leyen has proposed a continent-wide age restriction on social media for children, signaling that new legislation could arrive within months. The announcement, made at an EU summit in Copenhagen on Tuesday, places child protection at the center of Europe's evolving digital governance agenda and sets the stage for a direct confrontation with major technology platforms.
Von der Leyen framed the issue in precise terms. "The question is not whether young people should have access to social media. The question is whether social media should have access to young people," she told summit leaders. That distinction is deliberate. It shifts the regulatory burden away from families and squarely onto the platforms themselves.
A Global Policy Wave, Not an Isolated Move
The Copenhagen announcement did not emerge in isolation. It reflects a gathering international consensus that children's exposure to social media platforms requires legislative guardrails, not just parental guidance or voluntary platform pledges.
Australia moved first: its ban for users under 16 became law in December 2024, making it the first country in the world to legislate at that scale, with enforcement beginning a year later. Europe has been watching closely and moving in parallel.
Within the EU alone, at least ten member states have already proposed or enacted minimum age requirements. France is targeting a ban on social media access for children under 15, aiming for implementation by September 2025. Spain plans to restrict under-16 access to combat addiction, pornography, and exposure to harmful content. Germany is focused on a ban for children under 14 with graduated restrictions up to age 16, alongside strict age verification and algorithm-free youth interfaces.
Norway plans to ban under-16 access by the end of 2026, requiring platforms to deploy verified age-check systems. Portugal has already passed legislation mandating parental consent for users aged 13 to 16 and strengthening protections for those under 13. In the United Kingdom, a major national consultation on under-16 restrictions closes on 26 May 2026, with potential bans, age verification, and content restrictions all under active consideration.
Beyond Europe, New Zealand, Malaysia, and India have all proposed comparable restrictions. The global direction is unmistakable.
Platform Accountability Is Not Optional
Von der Leyen was unambiguous on one point: age restrictions would not allow technology companies to walk away from existing obligations. The EU's Digital Services Act already grants the Commission enforcement authority, and it has deployed that authority with escalating urgency.
Last month, the Commission found that Meta's Instagram and Facebook had breached the Act by failing to prevent under-13 users from accessing their platforms. In February, TikTok faced the threat of substantial fines over addictive design features built into its platform. These actions are not symbolic; they carry serious financial and operational consequences.
The Commission's expert panel is due to deliver concrete child protection recommendations by July. Those findings will likely form the foundation of new EU-wide legislation before the year ends.
The Transatlantic Fault Line
Europe's regulatory assertiveness has generated significant friction with Washington. When the EU fined Elon Musk's platform X in December, the United States accused the Commission of politically targeting American companies. Secretary of State Marco Rubio argued that European regulators were attempting to suppress American viewpoints. Several senior European officials, including former EU commissioner Thierry Breton, were subsequently barred from entering the United States.
The tension reveals a structural conflict that extends well beyond any single fine. The United States, particularly under the current Trump administration, views European digital regulation as protectionist and ideologically motivated. The EU insists it is applying the rule of law consistently to every market participant, regardless of national origin or political alignment.
Von der Leyen addressed this without qualification: "We have set rules. It's the law, and those who break it will be held accountable."
What This Policy Shift Actually Means
The analytical question is whether age restrictions alone can deliver meaningful protection. Enforcement depends on age verification infrastructure that most governments have not built at operational scale. Platforms have historically moved slowly on compliance, and determined teenagers have consistently found workarounds to access restricted content.
Yet the regulatory pressure has reached a threshold that platforms can no longer manage with voluntary pledges and policy statements. Governments are legislating, courts are watching, and financial penalties are materializing.
The EU's position also creates normative pressure beyond its borders. When the world's largest single market sets a regulatory standard, global technology platforms frequently adapt their systems universally rather than maintaining separate compliance tracks by region. That secondary effect may prove more consequential than the legislation itself.
Von der Leyen closed her Copenhagen address with a line that will define the political tone of this debate: "Let us give childhood back to children." The political will is now publicly visible. The legislative architecture is under active construction. Enforcement is the final frontier and the most difficult one.