AI-driven disinformation campaigns grew substantially in 2025, with foreign interference increasing markedly, according to recent research. Advances in communication technology mean AI is now fully integrated into influence operations that are faster, more widespread and better concealed than before.
By Iwo Mazur
The Scale of AI-Driven Duplicity
According to the European Union's EEAS, 540 foreign information manipulation and interference (FIMI) incidents were recorded in 2025, of which 147 involved AI, just over one-quarter of all cases. That is a 259% increase on the 41 AI-related cases recorded the previous year.
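As a quick sanity check, the figures above can be reproduced directly (numbers taken from the article; the rounding is mine):

```python
# Verify the year-over-year growth and share figures cited from the EEAS report.
ai_cases_2025 = 147   # AI-related FIMI incidents in 2025
ai_cases_2024 = 41    # AI-related FIMI incidents the previous year
total_2025 = 540      # all FIMI incidents recorded in 2025

# Percentage increase in AI-related cases year over year
increase_pct = (ai_cases_2025 - ai_cases_2024) / ai_cases_2024 * 100

# AI-related cases as a share of all 2025 incidents
share_pct = ai_cases_2025 / total_2025 * 100

print(f"Year-over-year increase: {increase_pct:.0f}%")  # 259%
print(f"Share of all incidents: {share_pct:.0f}%")      # 27%, just over one-quarter
```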
The rapid growth and sophistication of AI tools has enabled both Russia and China to incorporate AI into their information operations, producing targeted content at mass scale and at lower production cost.
The Most Targeted & Responsible Parties
Ukraine was hit hardest, with 112 incidents, followed by France (107), Moldova (94), Germany (71) and the USA (51); Poland ranked 8th with 17 recorded cases. Approximately 29% of incidents were attributed to Russia and 6% to China, while 65% could not be definitively attributed, although the report suggests that a substantial share of the unattributed incidents are likely connected to Russian or Chinese networks.
Attribution is difficult because most of these influence operations are covert: 10,500 unique information channels were used across these incidents, many of them employing deceptive means to obscure their funding sources and objectives. The report cautions that exposing this operational infrastructure is essential to curbing the channels' continued effectiveness and reach.
Social media and messaging platforms remain the primary conduits, offering low-cost access to global audiences. Notably, 88% of all identified activity took place on X, the platform owned by Elon Musk.
Narratives & Future Threats
The narratives advanced through these campaigns touch on sensitive geopolitical and domestic topics, such as support for Ukraine and transatlantic relations. As elections took place around the world, the operations also exploited internal political divisions; immigration was emphasised in Germany, for example, while anti-refugee narratives were pushed in Poland.
False narratives are numerous; one recurring example blames Ukrainians for attacks in Europe, piggybacking on major strategic events such as the Russian drone incursions into Poland and the sabotage of the Polish rail network.
The report expects this trend to accelerate through 2026, with strategically significant regions such as the Baltic Sea and the Arctic facing intensified influence operations in the near future.
Copyright © 2024 IBRA Digital