The European Union is investigating Meta’s election policies

The EU has officially opened a significant investigation into Meta for its alleged failures to remove election disinformation. While the European Commission’s statement doesn’t explicitly mention Russia, Meta confirmed to Engadget the EU probe targets the country’s Doppelganger campaign, an online disinformation operation pushing pro-Kremlin propaganda.

Bloomberg’s sources also said the probe was focused on the Russian disinformation operation, describing it as a series of “attempts to replicate the appearance of traditional news sources while churning out content that is favorable to Russian President Vladimir Putin’s policies.”

The investigation comes a day after France said 27 EU member states had been targeted by pro-Russian online propaganda ahead of European parliamentary elections in June. On Monday, Jean-Noël Barrot of France’s Ministry for Europe and Foreign Affairs urged social platforms to block websites “participating in a foreign interference operation.”

A Meta spokesperson told Engadget that the company had been at the forefront of exposing Russia’s Doppelganger campaign, first spotlighting it in 2022. The company said it has since investigated, disrupted and blocked tens of thousands of the network’s assets. The Facebook and Instagram owner says it remains on high alert to monitor the network, while claiming Doppelganger has struggled to build organic audiences for its pro-Putin fake news.

Mark Zuckerberg onstage during a company keynote presentation. Profile view from his left side. (Meta)

The European Commission’s president said Meta’s platforms, Facebook and Instagram, may have breached the Digital Services Act (DSA), the landmark legislation passed in 2022 that empowers the EU to regulate social platforms. The law allows the Commission, if necessary, to impose heavy fines of up to six percent of a violating company’s global annual turnover, potentially changing how social media companies operate.

In a statement to Engadget, Meta said, “We have a well-established process for identifying and mitigating risks on our platforms. We look forward to continuing our cooperation with the European Commission and providing them with further details of this work.”

The EC probe will cover “Meta’s policies and practices relating to deceptive advertising and political content on its services.” It also addresses “the non-availability of an effective third-party real-time civic discourse and election-monitoring tool ahead of the elections to the European Parliament.”

The latter refers to Meta’s deprecation of its CrowdTangle tool, which researchers and fact-checkers used for years to study how content spreads across Facebook and Instagram. Dozens of groups signed an open letter last month saying Meta’s planned shutdown of the tool during the crucial 2024 global election cycle poses a “direct threat” to election integrity.

Meta told Engadget that CrowdTangle only provides a fraction of the publicly available data and would be lacking as a full-fledged election monitoring tool. The company says it’s building new tools on its platform to provide more comprehensive data to researchers and other outside parties. It says it’s currently onboarding key third-party fact-checking partners to help identify misinformation.

However, with Europe’s elections in June and the critical US elections in November, Meta had better get moving on its new API if it wants the tools to work when it matters most.

The EC gave Meta five working days to respond to its concerns before it would consider escalating the matter further. “This Commission has created means to protect European citizens from targeted disinformation and manipulation by third countries,” European Commission President Ursula von der Leyen wrote. “If we suspect a violation of the rules, we act.”
