BRUSSELS, Feb 7, 2026 – The European Union has launched a major offensive against social media giant TikTok, enacting new sanctions and escalating a landmark regulatory case that accuses the platform of using addictive design techniques that harm minors and violate the bloc’s pioneering digital rules.
The action represents the most forceful application to date of the EU’s Digital Services Act (DSA), which mandates strict content moderation and user protection for large online platforms.
Core of the Case: “Addictive Design”
At the heart of the EU’s case is the accusation that TikTok’s core platform design, including its algorithmic “For You” feed, reward systems, and infinite scroll, intentionally exploits minors’ vulnerabilities to maximize engagement. Investigators argue these features create a “rabbit hole” effect, leading to compulsive use and negative impacts on mental health, privacy, and physical development.
A key focus is the platform’s age verification, which regulators deem insufficiently robust, potentially exposing younger users to inappropriate content.
Two-Pronged EU Action
The EU’s response is two-pronged:
- Formal Sanctions & Fines: The European Commission has imposed substantial periodic penalty payments on TikTok. These are daily fines that will continue to accrue until the company demonstrates full compliance with the DSA’s mandates to mitigate systemic risks. The total could reach billions of euros if changes are not made swiftly.
- Enhanced Regulatory Scrutiny: Beyond fines, the Commission is tightening TikTok’s obligations under its existing designation as a “Very Large Online Platform” (VLOP), the DSA’s strictest regulatory tier, which TikTok received in 2023. This mandates rigorous, independent annual audits of its algorithmic systems, transparent data sharing with regulators, and the implementation of specific, verifiable measures to protect minors.
TikTok’s Response and Industry Impact
TikTok has stated it will “review the decision” and continues to “engage constructively” with regulators. The company highlights recent features, such as default screen-time limits for minors and expanded parental controls, as evidence of its commitment to safety.
The case sets a powerful precedent. Other major platforms like Instagram and YouTube, also designated as VLOPs, are now on clear notice that the EU will aggressively enforce the DSA’s requirements for protecting young users from manipulative design.
Legal experts suggest the EU’s actions could force a global redesign of core social media features, moving platforms away from maximal engagement models toward designs that prioritize user well-being.