EU Moves to Curb TikTok’s Addictive Design Under Digital Safety Rules
TikTok may soon be required to overhaul key elements of its platform after European regulators concluded that the app’s design could be encouraging compulsive use, particularly among children and vulnerable users.
In a preliminary ruling, the European Commission said the video-sharing platform had likely breached the European Union’s Digital Services Act (DSA) by failing to properly assess and mitigate the risks created by its engagement-driven features.
Commission Raises Concerns Over User Wellbeing
The EU’s executive arm stated that TikTok, which has more than one billion users globally, had not done enough to evaluate how its product design affects users’ physical and mental health.
According to the commission, the app’s continuous stream of personalised content “rewards” users in ways that promote endless scrolling. This design, regulators argue, can place users in what they described as “autopilot mode,” reducing self-control and increasing the risk of compulsive behaviour.
Officials also cited concerns about excessive late-night usage, particularly among children, as a warning sign that the platform has not adequately addressed harmful patterns.
Allegations of Ignoring Compulsive Use Indicators
The preliminary ruling accuses TikTok of overlooking clear indicators of problematic usage, including prolonged screen time during nighttime hours.
Regulators said the company had failed to demonstrate that it had systematically analysed these risks or implemented strong enough safeguards to limit their impact.
As a result, the commission believes TikTok’s current approach falls short of the standards required under the DSA.
Possible Changes to TikTok’s Core Features
The European Commission is now considering requiring significant changes to how TikTok operates, stating that, at this stage, it considers the platform needs to change the basic design of its service.
Potential remedies under review include:
Gradually limiting or disabling infinite scrolling
Introducing mandatory and effective screen-time breaks
Restricting nighttime usage
Modifying the content recommendation system
These measures would directly affect the algorithms and interface elements that drive user engagement on the platform.
Criticism of Safety and Parental Controls
Regulators also criticised TikTok’s existing safety framework, particularly its screen-time management and parental control tools.
According to the commission, these features are not sufficiently robust. Screen-time reminders can be easily dismissed, while parental controls are described as time-consuming and difficult to set up.
The ruling suggests that these weaknesses undermine TikTok’s ability to reduce the risks created by its design.
Investigation Still Ongoing
Despite the strong language used, the commission emphasised that its findings are preliminary.
The current assessment does not represent a final decision, and TikTok will be given the opportunity to respond and contest the conclusions before any enforcement action is taken.
The investigation remains open, and its outcome could shape how social media platforms operate across the EU.
Growing Political Pressure on Social Media Design
The case reflects broader concerns among policymakers and campaigners about the psychological impact of social media.
Online safety advocates have long called for tougher regulation of features that encourage prolonged engagement. In the UK, crossbench peer Beeban Kidron has urged authorities to detoxify the dopamine loops embedded in many digital platforms.
These debates are increasingly influencing regulatory agendas in Europe and beyond.
Financial Risks Under the Digital Services Act
If TikTok is found to have violated the DSA, it could face substantial penalties.
The legislation allows regulators to impose fines of up to 6 percent of a company’s annual global turnover. Authorities can also mandate structural remedies, including compulsory app redesigns.
Although TikTok does not publicly disclose its revenues, estimates from the World Advertising Research Centre suggest the company could generate around $35 billion (£26 billion) this year. A maximum fine could therefore run into billions of dollars.
TikTok Rejects the Commission’s Claims
TikTok has strongly denied the allegations and said it intends to challenge the findings.
A company spokesperson said:
“The commission’s preliminary findings present a categorically false and entirely meritless depiction of our platform, and we will take whatever steps are necessary to challenge these findings through every means available to us.”
The company maintains that it has invested heavily in safety tools and user protection mechanisms.
A Growing Enforcement Record
The case against TikTok follows earlier enforcement actions under the DSA.
Last year, Elon Musk’s X was fined €120 million (£104 million) in the first major penalty issued under the legislation. Regulators cited misleading verification practices and restrictions on advertising research.
That precedent suggests the EU is prepared to use its new regulatory powers aggressively.
A Shift in Social Media Governance
The European Commission’s action against TikTok signals a possible shift in how regulators approach social media governance.
Rather than focusing only on content moderation, authorities are increasingly scrutinising the structural design choices that influence user behaviour.
If the ruling is upheld, TikTok may be forced to rethink core engagement features that have defined its success. More broadly, the case could set new standards for how digital platforms balance growth, profitability, and user wellbeing in the age of algorithm-driven media.
