A pro-Russian disinformation campaign is using free AI tools to fuel a “content explosion”

A pro-Russian disinformation campaign is leveraging consumer artificial intelligence tools to fuel a “content explosion” aimed at exacerbating existing tensions around global elections, Ukraine, and immigration, among other controversial issues, according to new research published last week.

The campaign, known by many names, including Operation Overload and Matryoshka (other researchers have also tied it to Storm-1679), has been operating since 2023 and has been linked to the Russian government by several groups, including Microsoft and the Institute for Strategic Dialogue. The campaign spreads false stories by impersonating media outlets, with the apparent aim of sowing division in democratic countries. While the campaign targets audiences around the world, including in the United States, its main target has been Ukraine. Hundreds of the campaign’s AI-manipulated videos have tried to fuel pro-Russian narratives.

The report details how, between September 2024 and May 2025, the amount of content produced by those running the campaign increased dramatically and is receiving millions of views around the world.

In their report, the researchers identified 230 unique pieces of content promoted by the campaign between July 2023 and June 2024, including images, videos, QR codes, and fake websites. Over the past eight months, however, Operation Overload has churned out a total of 587 unique pieces of content, the majority of them created with the help of AI tools, the researchers said.

The researchers said the spike in content was driven by consumer-grade AI tools that are available for free online. This easy access helped fuel the campaign’s tactic of “content amalgamation,” in which those running the operation were able to produce multiple pieces of content pushing the same story using AI tools.

“This marks a shift toward more scalable, multilingual, and increasingly sophisticated propaganda tactics,” researchers from Reset Tech, a London-based nonprofit that tracks disinformation campaigns, and CheckFirst, a Finnish software company, wrote in the report. “The campaign has substantially increased the production of new content in the past eight months, signaling a shift toward faster, more scalable content creation methods.”

The researchers were also struck by the variety of tools and content types the campaign was pursuing. “What surprised me was the diversity of the content, the different types of content that they started using,” Aleksandra Atanasova, lead open-source intelligence researcher at Reset Tech, told WIRED. “It’s like they have diversified their palette to catch as many different angles of those stories. They’re layering up different types of content, one after another.”

Atanasova added that the campaign did not appear to be using any custom AI tools to achieve its goals, relying instead on AI-powered voice and image generators that are accessible to everyone.

While it is difficult to identify all the tools the campaign’s operators used, the researchers were able to narrow in on one tool in particular: Flux AI.

Flux AI is a text-to-image generator developed by Black Forest Labs, a German-based company founded by former Stability AI employees. Using the SightEngine image analysis tool, the researchers found a 99 percent likelihood that a number of the fake images shared by the Overload campaign, some of which claimed to show Muslim migrants rioting and setting fires in Berlin and Paris, were created using Flux AI’s image generation.
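For readers curious how that kind of check works in practice, the minimal sketch below queries SightEngine’s image-moderation API with its AI-generated-image detection model. It is an illustration only, not the researchers’ actual methodology; the endpoint, the “genai” model name, the placeholder credentials, and the response field are assumptions drawn from SightEngine’s public documentation and should be verified before use.

```python
# Minimal sketch: asking SightEngine how likely an image is to be AI-generated.
# Assumes the public check endpoint and "genai" model from SightEngine's docs;
# illustrative only, not the report authors' pipeline.
import requests

API_USER = "your_api_user"      # placeholder credentials
API_SECRET = "your_api_secret"  # placeholder credentials

def ai_generated_score(image_url: str) -> float:
    """Return a 0-1 score for how likely the image is AI-generated."""
    response = requests.get(
        "https://api.sightengine.com/1.0/check.json",
        params={
            "url": image_url,
            "models": "genai",          # AI-generated-image detection model
            "api_user": API_USER,
            "api_secret": API_SECRET,
        },
        timeout=30,
    )
    response.raise_for_status()
    data = response.json()
    # The score is reported under type.ai_generated in the JSON response.
    return data.get("type", {}).get("ai_generated", 0.0)

if __name__ == "__main__":
    score = ai_generated_score("https://example.com/suspect-image.jpg")
    print(f"Probability the image is AI-generated: {score:.0%}")
```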
