Meta’s recent changes to content moderation in the US, which do not extend to the EU, are drawing scrutiny in Europe.
Content Moderation: A Shift to “Community Notes”
Mark Zuckerberg recently announced that Meta will replace traditional fact-checking with “Community Notes” on Facebook, Instagram, and Threads in the United States. He defended the change as a move to enhance freedom of expression.
These new rules, however, will not apply in the European Union, where the Digital Services Act (DSA) sets strict requirements for platform accountability and transparency. If Meta wants to implement similar changes in Europe, it would first have to submit a comprehensive risk assessment to the European Commission.
“We don’t dictate content moderation policies. That’s up to the platforms,” said Thomas Regnier, a European Commission spokesperson. “But whatever model they choose, it must be effective.”
Digital Services Act: Enforcement and Sanctions
Under the DSA, the EU can initiate formal proceedings if a platform violates its rules. Non-compliance could result in fines of up to 6% of the company’s annual global revenue.
For more severe or “extreme” cases, the EU has additional tools at its disposal. J. Scott Marcus, a researcher at the Brussels-based CEPS think tank, cited the example of Russia Today and Sputnik, which were blocked in the EU as part of the sanctions imposed after Russia’s invasion of Ukraine.
Upcoming Discussions on Platform Regulation
The European Commission, German regulators, and major digital platforms will convene on January 24 to discuss platform regulation ahead of Germany’s early elections in February. This meeting highlights the EU’s ongoing efforts to ensure compliance with its stringent digital regulations, particularly during sensitive political periods.