Thursday, April 24, 2025

The Growing Threat of Disinformation in AI: Russian Propaganda’s Influence

As artificial intelligence (AI) continues to evolve rapidly, concerns about disinformation are gaining increasing attention. A recent analysis by NewsGuard found that popular AI chatbots reproduced Russian propaganda narratives in 33% of their responses, posing serious risks to information security and the stability of democratic processes.

Pravda Network: A Disinformation Machine

The Pravda network was launched shortly after Russia’s invasion of Ukraine in 2022 and has expanded aggressively, using approximately 150 domains to distribute content in multiple languages. According to the American Sunlight Project, the network publishes an average of 20,273 articles every 48 hours, amounting to around 3.6 million publications annually.

NewsGuard’s investigation found that Pravda had spread over 200 false narratives, including claims about secret U.S. biological laboratories in Ukraine and allegations that Ukrainian President Volodymyr Zelensky used Western financial aid to purchase a mansion in Germany—a property supposedly visited in the past by Adolf Hitler.

How AI Becomes a Tool for Disinformation

The spread of false narratives impacts the way AI models process and present information. Systems like ChatGPT, Copilot, and Gemini learn from vast datasets available on the internet, making them vulnerable to disinformation. This contaminates AI-generated content, leading to the unintentional propagation of misleading narratives.

Wojciech Głażewski, director of Check Point Software Technologies in Poland, warns that Pravda’s disinformation efforts could destabilize democratic processes.

“If generative AI tools continue to amplify disinformation narratives, they could influence public opinion and electoral decisions without users even realizing it. This highlights the urgent need for effective filtering mechanisms that can identify and exclude disinformation sources from AI training data,” says Głażewski.

Poland: A Prime Target for Disinformation

Disinformation is not just a technological challenge but also a political and social issue. Reports from EUvsDisinfo and the World Economic Forum (WEF) indicate that Poland is one of the countries most vulnerable to disinformation campaigns, particularly Russian propaganda, which has intensified ahead of the 2025 presidential elections.

Over the past decade, Poland has experienced 1,443 disinformation campaigns. According to the Financial Times, the number of false-information incidents and deepfakes in Poland doubled in 2024.

Countering Disinformation: A Global Effort

Combating disinformation requires raising public awareness, educating people about how manipulation campaigns operate, and developing strategies to counter them. This process demands not only technological innovations but also international cooperation among governments, organizations, and the tech sector.

A transparent exchange of information and joint efforts in monitoring disinformation campaigns are essential to mitigating the risks posed by data manipulation.

Source: CEO.com.pl
