Meta Faces EU Probe Over Russian Disinformation on Facebook


E.U. officials are again looking to test out their new powers under the Digital Services Act (D.S.A.), with a view to bringing social media companies into line over data usage and advertising.

After already launching investigations into X, related to the spread of misinformation on the platform, and TikTok, over the protection of minors in the app, E.U. authorities have now announced a new probe into Meta, this time over the distribution of Russian-originated disinformation in its apps.

As per the E.U. Commission:

“The Commission suspects that Meta does not comply with D.S.A. obligations related to addressing the dissemination of deceptive advertisements, disinformation campaigns and coordinated inauthentic behavior in the E.U. The proliferation of such content may present a risk to civic discourse, electoral processes and fundamental rights, as well as consumer protection.”

The note from the Commission doesn’t single out Russian-based operations by name, but Bloomberg has reported that the main target of this push is a specific Russian-based group.

As per Bloomberg:

“The probe targets the so-called Doppelganger campaign — a pro-Kremlin operation, according to people familiar with the matter who spoke on condition of anonymity. The campaign attempts to replicate the appearance of traditional news sources while churning out content that is favorable to Russian President Vladimir Putin’s policies, the people said.”

Which is similar to how Chinese-backed influence campaigns operate, generally posting benign, aggregated news, often using AI-generated text, in order to gain audience attention and reach, before then inserting pro-government propaganda into the mix.

And in a year of many major elections in the region, the risk posed by such campaigns increases, which is why the Commission is seeking to ensure that Meta, and all social apps, are doing whatever they can to curb these operations.

And Meta says that it is working to address these threats.

As per Meta’s Coordinated Inauthentic Behavior reports, the company has shut down various disinformation programs being run by Russian groups, while it also says that its detection processes have continued to improve.

As per Meta:

“We have a well-established process for identifying and mitigating risks on our platforms. We look forward to continuing our cooperation with the European Commission and providing them with further details of this work.”

However, recent research by the not-for-profit group AI Forensics found that the Doppelganger operation is gaining more traction on Facebook, and is now reaching “five to 10 times more people than previously thought”.

Which is what’s prompted this new probe, forcing Meta to provide further detail on its efforts to protect E.U. users. 

At the same time, the Commission has also taken aim at Meta’s recent shift away from political content:

“The Commission suspects that Meta’s policy linked to the ‘political content approach’, that demotes political content in the recommender systems of Instagram and Facebook, including their feeds, is not compliant with D.S.A. obligations. The investigation will focus on the compatibility of this policy with the transparency and user redress obligations, as well as the requirements to assess and mitigate risks to civic discourse and electoral processes.”

Finally, the Commission has also questioned Meta’s decision to shut down its CrowdTangle platform monitoring tool, as the move will reduce political researchers’ capacity to monitor content across its apps.

Meta will now have five working days to respond to the E.U.’s concerns, before regulators escalate to the next stage, which could result in significant fines, if Meta’s systems are found to be inadequate.

Though it’s unlikely that they will be, given the advanced detection systems that Meta has in place to identify coordinated manipulation.

But then again, with so much Facebook content being hidden in private groups, it’s also difficult to assess just how successful these approaches could be on a broad scale in Meta’s apps.

Which is part of the reason why Meta’s looking to move away from political content: it reduces scrutiny of this type, at least in a public-facing sense. Plenty of political discussion still happens in private groups and DM chats, but Meta doesn’t gain much value from facilitating public engagement around it, which is why it’s stepping away from political news in the main feed.

That could help Meta wash its hands of any culpability in such cases, though it’ll be interesting to see how E.U. investigators assess Meta’s approach to this Russian misinformation push.

Maybe that could provide more insight for others seeking to combat the same. And with the U.S. election also fast approaching, there’s a lot of value in greater transparency on this front right now.


