
EU finds Meta and TikTok in violation of Digital Services Act (DSA) transparency rules

  • Marijan Hassan - Tech Journalist
  • 12 minutes ago
  • 2 min read

The European Commission has delivered a significant blow to Big Tech giants Meta (Facebook, Instagram) and TikTok, issuing preliminary findings that both companies are in breach of key transparency obligations under the landmark Digital Services Act (DSA).

The investigation, launched in 2024, concluded that the platforms are failing in their duty to allow for external scrutiny, threatening the integrity of the EU's new digital rulebook. If the findings are confirmed, the companies face massive fines of up to 6% of their annual global turnover, potentially amounting to billions of dollars.


Core violation: Hindering researcher access

The primary breach applies to both companies. The Commission found that all three platforms (Facebook, Instagram, and TikTok) have established "burdensome procedures and tools" for researchers seeking access to public platform data.


This lack of adequate access often leaves researchers with partial or unreliable data, severely hindering their ability to study crucial societal issues, such as the potential impact of the platforms on minors’ mental and physical health or their exposure to illegal or harmful content.


The DSA treats researcher access as essential to enabling public scrutiny of how very large platforms operate.

Additional violations for Meta (Facebook and Instagram)


The Commission also preliminarily found Meta's platforms, Facebook and Instagram, in violation of further core DSA obligations related to user protection and content moderation:


  • Ineffective illegal content reporting: Meta does not appear to provide a user-friendly and easily accessible "Notice and Action" mechanism for users to flag illegal content (like child sexual abuse material or terrorist content). The system imposes "unnecessary steps and additional demands" on users.

  • Use of "dark patterns": The platforms allegedly use "dark patterns" (deceptive interface designs) in the content reporting process, which the Commission states can be "confusing and dissuading," potentially rendering the entire flagging mechanism ineffective.

  • Limited appeals process: The appeals mechanisms for content moderation decisions (when posts are removed or accounts are suspended) do not allow users to provide explanations or supporting evidence to substantiate their appeals, thereby limiting the effectiveness of their right to challenge decisions.


Company responses

Both companies have pushed back on the preliminary findings. A Meta spokesperson said the company disagrees with the suggestion of non-compliance, noting that it has already made changes to its reporting, appeals, and data access tools since the DSA came into force and is confident those solutions meet the legal requirements.


For its part, TikTok said it is reviewing the findings but raised a concern that the DSA's transparency demands around data sharing could put it in "direct tension" with the EU's strict privacy rules under the General Data Protection Regulation (GDPR).


The company urged regulators to provide clarity on how the two legal obligations should be reconciled.


Both companies now have the opportunity to formally review the Commission’s files and submit a written response before a final decision is reached.
