The Information War in Europe: The New EU Report on Interference and Manipulation


In recent years, information manipulation has taken on an increasingly structured form. Behind the appearance of spontaneity lies a global, sophisticated machine designed to alter perceptions and weaken democracies. This is highlighted in the third report on FIMI threats – Foreign Information Manipulation and Interference – by the European External Action Service (EEAS), the diplomatic body of the European Union in charge of foreign and security policy. The document captures a global ecosystem of information manipulation, largely orchestrated by China and Russia, which in 2024 alone targeted over 90 countries. These are not isolated incidents but coordinated operations that exploit sophisticated networks and content amplification.

FIMI operations describe a set of behaviors that are mostly not illegal but highly damaging. They are manipulative actions intentionally conducted by state or non-state actors, often through intermediaries active beyond national borders. The goal is not only to spread disinformation but to undermine democratic values, decision-making processes, and political institutions.

In 2024, the European Union’s monitoring system identified 500 incidents related to information manipulation operations. The campaigns involved over 38,000 dissemination channels, from social media to websites. Another notable finding concerns Elon Musk’s platform, X, which was the most heavily used for these campaigns: about 88% of the activity and 73% of the channels involved were fake accounts or bots. AI usage was detected in 41 cases, mostly involving audio and video content.

Russia and China are the two most active players, though they operate with different styles. Russia focuses on volume: it produces and circulates content massively, often aggressively, across hundreds of channels. Its strategy is to saturate the information space and create confusion. China takes a subtler approach: its campaigns are embedded in a more institutional, less noisy but more stable ecosystem. It uses official outlets, editorial content, as well as opaque channels and coordinated accounts. Both operate on hybrid infrastructures that mix state media, fake profiles, clone sites, and aligned channels. Recurring targets include Ukraine, the European Union, NATO, and Moldova. Journalists, political leaders, and independent outlets are also frequently in the crosshairs. In both cases, the result is a fragmentation of public discourse, where confusion becomes a strategy.

FIMI operations are not based on a single false piece of content but operate as a coordinated system. They use tools like deepfakes, decontextualized videos, and websites mimicking news outlets. Among the most notable cases are Doppelgänger, a network that graphically replicates European news portals to spread pro-Russian content, and Portal Kombat, which spreads manipulated videos to attack Ukrainian media. Much FIMI content is not necessarily false but is taken out of context or paired with misleading elements. The result is a type of disinformation that does not aim to build an alternative truth but to erode trust in information itself.

These campaigns are not limited to institutions: public figures are also targeted. European leaders such as Ursula von der Leyen, Josep Borrell, and Kaja Kallas have been subjected to attacks ranging from personal insinuations to distorted public statements. Independent media are also systematically targeted with impersonation operations: outlets like the BBC, Le Parisien, and Der Spiegel have been imitated in logo, graphic style, and editorial identity. The aim is not just to spread falsehoods, but to undermine trust, sow suspicion, and shrink the space for rational debate. Information manipulation does not impose an alternative truth: it destroys the common ground on which democratic dialogue is based.

“What we’re observing today is a multi-level propaganda system, particularly from Russia, which combines official channels like RT, Sputnik, and TASS, affiliated media, and covert networks like Doppelgänger, False Façade, and Portal Kombat,” explains Federica Urzo of the Luiss Data Lab research center, which coordinates the Italian Digital Media Observatory (IDMO). “The focus is clearly on destabilizing Ukraine and EU countries, with direct interference also in countries like Moldova, Romania, and Georgia. Just consider that in 2025 alone, €1.18 billion is expected to be invested in Russian state media. The propaganda has both internal objectives (such as supporting Putin’s regime and justifying the war) and external ones (discrediting Ukraine and the West).” According to Urzo, the Russian strategy is not limited to the digital sphere: “Another trend we’re observing is the combination of diplomacy and disinformation: embassies, international forums, but also cultural centers and universities are used to give a veneer of legitimacy to these manipulative narratives.”

Alongside Moscow, Beijing is also expanding its FIMI operations, with complementary goals and methods. “China conducts targeted operations in strategic areas such as Taiwan, the United States, Southeast Asia, and Africa,” Urzo continues. “It uses state media like CGTN, Global Times, and China Daily, but also more opaque operations like Paperwall. The use of influencers, PR campaigns, and co-produced content serves to mask the government source and spread pro-Chinese messages. The goal is twofold: to improve China’s international image and to delegitimize democratic systems.”

Although the two powers operate with different logics and priorities, the analysis highlights a growing alignment. “We are witnessing a reciprocal amplification between their respective state media, especially on anti-Western and particularly anti-European and anti-NATO themes,” Urzo concludes. “Even if their actions are not yet fully coordinated, the effect on the global information ecosystem is already significant.” In this context, it is worth introducing the concept of Coordinated Inauthentic Behavior (CIB), as explained by researchers at the Luiss Data Lab. Disinformation campaigns often publish overwhelming volumes of content with the same or similar messaging from various inauthentic accounts, created either by automated programs known as bots or by professional disinformation groups known as troll farms. By constantly seeing the same narrative repeated, the public perceives it as a popular and widespread message and becomes more likely to believe it.
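The CIB mechanism described above — many inauthentic accounts pushing the same message to simulate consensus — can be illustrated with a toy detection heuristic. The sketch below is purely illustrative (it is not a method from the EEAS report or the Luiss Data Lab): it normalizes post text and flags any message posted verbatim by several distinct accounts.

```python
from collections import defaultdict
import re

def normalize(text):
    """Lowercase, strip punctuation, and collapse whitespace
    so trivially edited near-duplicates compare equal."""
    text = re.sub(r"[^\w\s]", "", text.lower())
    return re.sub(r"\s+", " ", text).strip()

def flag_coordinated(posts, min_accounts=3):
    """Flag messages pushed by many distinct accounts.

    posts: list of (account_id, text) tuples.
    Returns {normalized_message: set_of_accounts} for messages posted
    by at least `min_accounts` different accounts -- a crude proxy
    for coordinated inauthentic behavior (CIB).
    """
    accounts_by_msg = defaultdict(set)
    for account, text in posts:
        accounts_by_msg[normalize(text)].add(account)
    return {msg: accts for msg, accts in accounts_by_msg.items()
            if len(accts) >= min_accounts}

# Hypothetical sample data: three bots repeating one narrative.
posts = [
    ("bot_01", "NATO is the real aggressor!"),
    ("bot_02", "NATO is the real aggressor"),
    ("bot_03", "nato is the real aggressor!!"),
    ("user_a", "Interesting report on disinformation today."),
]
flagged = flag_coordinated(posts)
```

Real CIB detection is far harder — troll farms paraphrase, translate, and stagger posting times — but the underlying signal is the same: improbable repetition across nominally independent accounts.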

Specifically, so-called coordinated inauthentic behavior is a manipulative communication tactic that uses a combination of authentic, fake, and duplicate social media accounts to operate as an adversarial network (AN) across multiple platforms. “They therefore represent a concrete threat to freedom of expression and democracy, as they can alter the collective perception of reality, generate polarization, and even influence decision-making or electoral processes. These are pre-planned, carefully orchestrated campaigns, characterized by extraordinary consistency in the messaging and behavior of the accounts involved, to the point of simulating widespread and spontaneous consensus. What makes them even more insidious is their covert nature: the real promoters remain in the shadows, making these operations difficult to detect and counter.”

The European Union, meanwhile, has developed a structured response over the years, combining technical monitoring, diplomatic tools, and targeted sanctions. In December 2024, for the first time, the EU Council adopted restrictive measures against individuals involved in information manipulation operations. However, the point is not just to identify who spreads content, but to understand the workings of the networks that enable large-scale dissemination. Rather than defending against isolated incidents, it is about recognizing, observing, and interpreting the underlying dynamics. Information manipulation is no longer an exception; it is a form of influence that must be addressed.

To map these phenomena more precisely, the report introduces the FIMI Exposure Matrix, a model that classifies channels based on their level of connection to threat actors.

FIMI operations do not work through isolated channels, but within a complex digital architecture where each node plays a role in spreading and legitimizing the message. Some channels act in coordination, while others amplify content out of ideological alignment or strategic interest. The Exposure Matrix provides a dynamic map of this fluid ecosystem, serving as a valuable tool for navigating an information war that is increasingly difficult to trace.

Edited by Maria Helena Rodriguez
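The idea of classifying channels by their degree of connection to a threat actor can be sketched as a simple decision rule. The tier names and evidence flags below are illustrative paraphrases, not the report's official taxonomy: they only convey the principle that attribution runs from confirmed control down to mere alignment.

```python
from enum import Enum

class ExposureTier(Enum):
    """Attribution tiers loosely modeled on the idea of an exposure
    matrix; names are illustrative, not the report's taxonomy."""
    OFFICIAL = "official state channel"       # openly attributed (e.g. ministry account)
    CONTROLLED = "state-controlled outlet"    # documented state ownership or control
    LINKED = "state-linked channel"           # technical or financial ties
    ALIGNED = "state-aligned channel"         # amplifies narratives, no proven ties

def classify(channel):
    """Map evidence flags on a (hypothetical) channel record to a tier,
    from strongest attribution to weakest."""
    if channel.get("official_attribution"):
        return ExposureTier.OFFICIAL
    if channel.get("state_ownership"):
        return ExposureTier.CONTROLLED
    if channel.get("technical_or_financial_links"):
        return ExposureTier.LINKED
    return ExposureTier.ALIGNED

tier = classify({"state_ownership": True})
```

The value of such a matrix is that it lets analysts treat a loud but unaffiliated amplifier differently from a covertly state-run outlet, even when both spread the same content.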
