The European Commission has found that Meta and TikTok have breached the EU’s Digital Services Act by restricting independent researcher access to platform data, and in Meta’s case, making user reporting and appeal systems overly complex. The findings, issued on Friday, are preliminary but could lead to fines of up to six per cent of each company’s global turnover if confirmed.

The Digital Services Act (DSA), effective since 2024, obliges large online platforms to give independent researchers access to both public and non-public data, and to provide users with simple, transparent tools to flag illegal content, such as terrorist material or child sexual abuse imagery, and to appeal moderation decisions. These measures aim to ensure public scrutiny of platforms’ social and psychological impacts and protect user rights.

Research access and obstructive patterns

The Commission said both Meta and TikTok have procedures that leave researchers with incomplete or unreliable data, limiting their ability to investigate key platform impacts. Under the DSA, independent researchers can study a wide range of issues, including minors’ exposure to harmful or illegal content, algorithmic amplification, the spread of misinformation, and broader societal effects. For Meta, this applies to both Facebook and Instagram. By making access difficult, the platforms undermine the DSA’s goal of accountability across all aspects of their services.

Meta faced additional criticism. Regulators said its “notice and action” reporting systems include unnecessary steps and so-called “dark patterns” that could confuse or dissuade users from completing complaints. Its appeals process for content removal or account suspension also does not give users sufficient opportunity to provide explanations or supporting evidence, limiting the effectiveness of challenges.

Henna Virkkunen, the Commission’s executive vice-president for tech sovereignty, security and democracy, said:

“Our democracies depend on trust. That means platforms must empower users, respect their rights, and open their systems to scrutiny. The DSA makes this a duty, not a choice.”

Company responses and next steps

Meta said it disagreed with the preliminary findings and had already updated its reporting, appeals, and data-access tools to meet EU requirements. TikTok said it supports transparency but cautioned that some Commission requirements could conflict with privacy rules under the GDPR.

Both companies can now respond formally and propose corrective measures before the Commission issues a final ruling. If upheld, these cases would mark one of the EU’s most significant enforcement actions under the DSA, testing the bloc’s ability to hold major online platforms accountable for content moderation, user safety, and data transparency.