The European Union is intensifying its efforts to control the internet. Regulations such as the Digital Services Act (DSA), Digital Markets Act (DMA), and European Media Freedom Act (EMFA) impose strict obligations on platforms like Meta, TikTok, and X (formerly Twitter). The ostensible aim of these measures is to combat disinformation, protect children, and ensure transparency, but critics warn of excessive surveillance and the erosion of online freedom.

The Digital Services Act (DSA), fully in force since February 2024, makes social media platforms responsible for illegal user content, such as child sexual abuse material, terrorist content, hate speech, and counterfeit goods. Platforms are required to report and remove such content, including harassment, cyberbullying, and non-consensual image sharing. Very large online platforms (VLOPs), such as Facebook, Instagram, TikTok, and X, are subject to additional rules. The European Commission can impose fines of up to 6% of a platform's global annual turnover or, in extreme cases, an EU-wide ban. In February 2024, the Commission opened formal proceedings against TikTok over deficiencies in the protection of minors, advertising transparency, and harmful content.

The European Media Freedom Act (EMFA), adopted in April 2024, is intended to protect media freedom in the digital environment. Article 18 requires VLOPs to provide a functionality through which media service providers can declare their status. To qualify, an outlet must declare that it is a media service provider, that it complies with Article 6 of the EMFA, that it is editorially independent of states, political parties, and third countries, that it is subject to regulatory requirements or a recognized self-regulatory mechanism, and that it does not publish AI-generated content without human oversight.

Within the so-called trilogue (negotiations between the Commission, Council, and Parliament), a fierce battle is currently being waged over age verification on social media. The Commission demands mandatory checks for messaging services (e.g., WhatsApp, Signal) and app stores (Google Play, Apple App Store). Users would have to prove their age — for example, biometrically via a facial scan or with identity documents. The Council broadly supports this approach but insists that the methods be privacy-friendly, transparent, and non-discriminatory. The Parliament, however, rejects any compulsory requirement for messaging services, preferring to make verification optional and favoring less invasive methods such as the behavioral analysis TikTok already employs. For app stores, it proposes only age labels and parental consent.

Negotiations are still ongoing, and there is no certainty that the parties will reach an agreement. The temporary rules permitting AI-based content scanning expire in April 2026.

[The author, Aleksandra Fedorska, is a journalist for Tysol.pl and numerous Polish and German media outlets]

[Title, lead, "What You Need to Know," "What This Means for Users" sections and FAQ by the Editorial Team]