
The European Commission announced today that Meta's social platforms, Facebook and Instagram, have violated the EU's Digital Services Act (DSA) by failing to provide users with a simple and effective way to report illegal content.
According to the findings of the EU's executive body, Meta's user reporting process involves numerous unnecessary steps that significantly undermine the effectiveness of the platforms' content review mechanisms and may prevent illegal content from being effectively identified and removed. Meta denies the allegations.
The Commission stated: "Neither Facebook nor Instagram appears to offer user-friendly, easy-to-use 'report and action mechanisms' within Meta's services that allow users to easily flag illegal content, such as child sexual abuse material or terrorist content."
A senior EU official said the case is not only about illegal content but also about the balance between freedom of expression and content moderation. The official noted that the existing reporting process is "overly complex, making it difficult for ordinary users to complete the reporting process," thereby undermining the mechanism's effectiveness.
Meta has previously faced numerous safety concerns. In September 2024, former employee and whistleblower Arturo Béjar published research claiming that most of Instagram's new safety tools "do not effectively protect children under the age of 13." Meta responded that it already provides comprehensive management tools for parents, introduced mandatory teen accounts in 2024, and recently added content controls modeled on PG-13 film ratings.
The European Commission also noted issues with Meta's user complaints process. If users' content is blocked or their accounts are suspended, the current complaints system "does not allow them to adequately explain their reasons or provide evidence," limiting the effectiveness of the appeal channel.
The Commission believes that streamlining reporting and feedback systems would not only help combat illegal content but also curb the spread of disinformation. For example, a fake video recently circulated in Ireland falsely claiming that presidential candidate Catherine Connolly had withdrawn from the race.
The investigation is being conducted jointly by the European Commission and Ireland's Digital Services Regulator, Coimisiún na Meán, which oversees Meta's EU headquarters in Dublin.
Under the DSA, companies that fail to remedy violations by the rectification deadline may be fined up to 6% of their global annual turnover and may face ongoing periodic penalties. Meta's 2024 revenue reached $164.501 billion (IT Home note: approximately RMB 1.17 trillion at the current exchange rate), with a net profit of $62.360 billion (approximately RMB 443.771 billion).
Henna Virkkunen, Executive Vice President of the European Commission for Technological Sovereignty, Security and Democracy, said: "Our democracy relies on trust. This means platforms must empower users, respect rights, and open their systems to scrutiny. The Digital Services Act makes this a legal obligation, not a voluntary action for companies."
A Meta spokesperson responded: "We disagree with any allegations that we have violated the DSA. Since the law came into effect, we have improved content reporting options, appeals processes, and data access tools across the EU and are confident that these measures comply with EU law."