
AI Deepfake Videos Fuel Disinformation in the Russia-Ukraine War

Amid the ongoing Russia-Ukraine war, AI-generated deepfake videos falsely depicting unwilling Ukrainian conscripts have emerged on social media platforms such as TikTok and X. Produced with the Sora 2 AI video generator and the stolen likenesses of Russian streamers, these disinformation campaigns fabricate forced-mobilization narratives in an effort to undermine Ukrainian morale and international support. Ukrainian authorities confirm the videos are part of hybrid-warfare tactics designed to distort the reality of conscription, which legally applies only to individuals aged 25 and older.

Background & Context

The ongoing Russia-Ukraine war encompasses not only conventional military confrontations but also sophisticated information warfare, in which disinformation campaigns play a critical role. Russia has increasingly employed AI-driven deepfake technology to produce fabricated videos that undermine public trust in Ukraine’s conscription policies and military efforts, especially in contested eastern regions such as Pokrovsk and Chasiv Yar. This manipulation targets both domestic and international audiences through widespread dissemination on social media, complicating efforts to maintain morale and societal cohesion. In response, Ukraine and its allies, including fact-checking organizations in Italy and governmental counter-disinformation centers, continue to combat these narratives to preserve reliable information amid the broader geopolitical conflict.

Key Developments & Timeline

  • October 31, 2025: The first AI-generated videos appear on TikTok, falsely claiming to depict unwilling Ukrainian soldiers. These deepfakes represent a new dimension of information warfare within the ongoing Russia-Ukraine war.
  • November 2, 2025: The fabricated videos gain widespread circulation across multiple social media platforms, including X, amplifying their reach and influence globally.
  • November 4, 2025: Italian fact-checking organization open.online identifies that these videos misuse the faces of Russian streamers from Twitch without consent, exposing the underlying AI manipulation. Many of the videos include visible watermarks from the AI video generator Sora 2, revealing their artificial origin.
  • November 7, 2025: The Ukrainian Centre for Countering Disinformation officially condemns the videos as deliberate fake news designed to undermine Ukrainian society and weaken international support. This reflects defensive efforts amid ongoing Russian military buildup and hybrid warfare tactics.

This sequence of events highlights the emerging role of AI-generated disinformation within the broader geopolitical conflict. The fabricated deepfakes aim to discredit Ukrainian conscription policies by portraying forced mobilization of men below the legal conscription age, thereby eroding societal morale, particularly in eastern Ukrainian regions such as Donbas, Pokrovsk, and Chasiv Yar.

Analyses reveal technical inconsistencies and equipment anomalies in these AI videos, further confirming their fraudulent nature. The TikTok profile fantomoko has been identified as a primary source of the disinformation campaign, illustrating how digital platforms are exploited in modern warfare to influence public opinion and destabilize adversaries.

This example of hybrid warfare dovetails with physical confrontations involving conventional forces and with strategic considerations about Russian nuclear weapons and escalation risks. The dissemination of deepfake content underscores the multifaceted nature of the conflict and the importance of robust information security measures alongside military defense.

Official Statements & Analysis

Recent disclosures highlight how disinformation has become a core element of the ongoing Russia-Ukraine war, with deepfake videos emerging as a powerful tool of hybrid warfare. Aleksei Gubanov, a Russian streamer whose likeness was exploited in the fabricated videos, stated, “I have no connection whatsoever to these videos – all of them were created by someone using the Sora neural network.” He added, “These materials play directly into the hands of Russian propaganda and cause serious harm to Ukraine.” The Ukrainian Centre for Countering Disinformation explained that the campaign aims “to sow distrust within Ukrainian society, disrupt mobilisation efforts and discredit Ukraine in the eyes of the international community.”

This deliberate use of AI-generated deepfakes signals an escalation in psychological operations designed to influence public perception and morale during the conflict. By employing the stolen identities of Russian streamers, adversaries target both domestic and international audiences with false narratives of forced conscription and low troop willingness, even when visible AI watermarks betray the videos’ origin. Understanding such tactics is essential for combating disinformation and maintaining accurate situational awareness amid Russian missile attacks and military pressure. The prevalence of these videos underscores the need for vigilance in verifying social media content and shows how information warfare complements kinetic military action in shaping the broader geopolitical conflict.

Conclusion

The proliferation of AI-generated deepfakes targeting Ukrainian soldiers highlights a growing facet of the Russia-Ukraine war: sophisticated information warfare aimed at undermining morale and international support. These disinformation campaigns exploit advanced technology to create false narratives, complicating efforts to maintain public trust and accurate reporting. As hybrid-warfare tactics continue to evolve, improving deepfake detection and public awareness will be essential to counteracting psychological operations and protecting Ukraine’s defense efforts. Looking ahead, combating misinformation will remain a critical component of sustaining effective responses to Russian aggression and ensuring the integrity of military mobilization in this complex geopolitical conflict.
