Financial Gazette

The EU moves to kill infinite scrolling

  • Eliza Gkritsi
  • February 12, 2026 at 9:16 PM

BRUSSELS — Doom scrolling is doomed, if the EU gets its way. 

The European Commission is for the first time tackling the addictiveness of social media in a fight against TikTok that may set new design standards for the world’s most popular apps. 

Brussels has told the company to change several key features, including disabling infinite scrolling, setting strict screen time breaks and changing its recommender systems. The demand follows the Commission’s declaration that TikTok’s design is addictive to users — especially children. 

The fact that the Commission said TikTok should change the basic design of its service is “ground-breaking for the business model fueled by surveillance and advertising,” said Katarzyna Szymielewicz, president of the Panoptykon Foundation, a Polish civil society group.

That doesn’t bode well for other platforms, particularly Meta’s Facebook and Instagram. The two social media giants are also under investigation over the addictiveness of their design. 

The findings laid out a week ago mark the first time the Commission has set out its stance on the design of a social media platform under its Digital Services Act, the EU’s flagship online-content law that Brussels says is essential for protecting users. 

TikTok, which has said it will fight the findings, can now defend its practices and review all the evidence the Commission considered. If it fails to satisfy the regulator, the app could face fines of up to 6 percent of its annual global revenue.

It’s the first time any regulator has attempted to set a legal standard for the addictiveness of platform design, a senior Commission official said in a briefing to reporters.  

“The findings mark a turning point [because] the Commission is treating addictive design on social media as an enforceable risk” under the Digital Services Act, said Lena-Maria Böswald, senior policy researcher at think tank Interface.   

Jan Penfrat, senior policy adviser at civil rights group EDRi, said it would be “very, very strange for the Commission to not then use this as a template and go after other companies as well.”

Defining risks 

The Digital Services Act requires platforms like TikTok to assess and mitigate risks to their users. But these risks are vaguely defined in the law, so until now it had been unclear exactly where the regulator would draw the line. 

Two years after the TikTok probe was launched, the Commission has opted to strike at the heart of platform design, claiming it poses a risk to the mental health of users, particularly children. The Commission’s other concerns with TikTok were settled amicably between the two sides.


At a briefing with reporters, EU tech chief Henna Virkkunen said the findings signal that the Commission’s work is entering a new stage of maturity when it comes to systemic risks.  

Facebook and Instagram have been under investigation over the addictiveness of their platforms since May 2024, including whether they endanger children. As with TikTok, the platforms' design and algorithms are under scrutiny.

Meta has mounted a staunch defense in an ongoing California case, in which it is accused of knowingly designing addictive social media products that hurt users. TikTok and Snap settled the same case before it went to trial.

TikTok spokesperson Paolo Ganino said the Commission’s findings “present a categorically false and entirely meritless depiction of our platform and we will take whatever steps are necessary to challenge these findings through every means available to us.”

The right solution

The Commission could eventually agree with platforms on a wide range of changes that address addictive design. What they decide will depend on the different risk profiles and patterns of use of each platform — as well as how each company defends itself.

That likely means it will take a while for TikTok to make any change to its systems, as the platform reviews the evidence and tries to negotiate a solution with the regulator.

In another, simpler DSA enforcement case, it took the Commission more than a year after issuing preliminary findings to declare Elon Musk’s X was not compliant with its obligations on transparency.

TikTok may pursue a series of changes and may push the Commission to adopt a lighter regulatory approach. The video-sharing giant likely won’t “get it right” the first time, said EDRi’s Penfrat, and it may take a few tries to satisfy Brussels. 

“It could be anything from changing default settings, to outright prohibiting a specific design feature, or requiring more user control,” said Peter Chapman, a governance researcher and lawyer who is associate director at the Knight-Georgetown Institute.

He expects the changes could differ for each platform: while the findings show the Commission's thinking, interventions must be targeted to how each design feature is actually used.

“Multiple platforms use similar design features” but they serve different purposes and carry different risks, said Chapman, pointing to notifications designed to draw users back in. Notifications about messages, for example, carry a different addiction risk from those alerting a user to a livestream, he said.

Originally published at Politico Europe
