UNIVERSITY of NOTRE DAME

Deepfake Fight: AI-Powered Disinformation and Perfidy Under the Geneva Conventions

Major D. Nicholas Allen

All that we are not stares back at what we are.

– W.H. Auden

Introduction

On February 24, 2022, missiles began striking major cities across Ukraine: Kyiv, Kharkiv, Chernihiv.  Russian infantry, armor, mechanized fighting vehicles, mobile artillery, aviation, trucks, and supply assets poured over Ukraine’s border from every point of the compass except the west.  The war the world had feared for years, and that had in fact been underway on a smaller, deniable scale, had begun.

But the expected quick Russian victory did not materialize.  In the days that followed, the Ukrainian military fought harder and better than Russia had planned for, leaving thousands of Russian troops killed, hundreds of Russian combat vehicles destroyed, and almost none of Russia’s apparent major military objectives achieved.  Russian forces also slogged through self-inflicted logistics woes that further degraded their ability to maneuver, caused many crews to abandon their vehicles across Ukraine, and quickly became a point of tremendous embarrassment for Russian military leaders.

In the public relations sphere, Russia found itself in arguably its deepest hole.  Worldwide condemnation of the invasion fed an enormous sanctions regime, a strengthening of the NATO alliance along with its potential expansion, and the rise of Ukrainian President Volodymyr Zelenskyy as an international hero figure.  Even at home, Moscow had to confront a significant groundswell of opposition among the Russian people, leading it to resort to Soviet-style tactics of mass arrests, severe free speech restrictions, and intimidation to suppress the dissent movement.

On March 16, 2022, a new tactic emerged.  Ukraine 24, a major television news network in Ukraine, broadcast a startling video of Ukrainian President Zelenskyy imploring his troops not to push to victory but to surrender.  In a setting similar to his daily press briefings, and one that would have been familiar to his regular viewers, President Zelenskyy appeared behind a podium with short-cropped hair and a thin growth of beard, wearing an olive-green shirt, with presidential symbols in the background.  However, instead of his usual remarks encouraging Ukrainians to remain strong and detailing his armed forces’ needs to the world, President Zelenskyy claimed that “[b]eing the president was not so easy,” that “[i]t didn’t work out,” that “[t]here is no tomorrow,” and finally, “I advise you to lay down your arms and return to your families.  It is not worth dying in this war.”  A chyron running at the bottom of the broadcast claimed that Ukraine had surrendered.

News agencies and social media companies around the world raced to analyze the video and quickly determined that this realistic video was not real at all.  It was instead the latest employment of a still-young technology: a deepfake.

 

[Fig. 1.  Side-by-side stills contrasting the deepfake Zelenskyy video (left) with a genuine video of President Zelenskyy making remarks at a news conference days earlier (right).]

 

As of this writing, the video has had no discernible direct impact on the battlefield or on Ukraine’s war effort, likely due to its relatively poor quality.  But the confusion it sowed, however temporary, produced immediate and worldwide effects in the information space and demanded valuable time and attention from President Zelenskyy and members of his administration to rebut.

The episode remains a clarion call to those who contemplate the future of media manipulation and digital deception.  The evolutionary march of digital deception leads straight to the battlefield, and few capabilities, at their full potential, are better primed to cause confusion and chaos in the battlefield’s information space than deepfake technology.

“Deepfake” is the term for ultra-realistic video and audio created not by human actors but by artificial intelligence.  Though originally associated with salacious pornographic videos depicting unwitting victims in sex acts, the technology has been used to create perceptually convincing fake videos of such figures as President Barack Obama, celebrities like Emma Watson and Nicolas Cage, and even Russian President Vladimir Putin as early as 2018.  The technology has also manipulated images of weather patterns and even depicted the life cycle of a daisy without human guidance.

The Zelenskyy deepfake is also not the first deepfake to leave its mark in a time of crisis.  In 2019, fear of a deepfake helped instigate an attempted coup in Gabon that nearly caused a civil war.  Supporters of Gabonese President Ali Bongo Ondimba became convinced that a video purporting to show the President alive, alert, and on the job, released after he had not been seen for several days, was not real but was instead a deepfake.  In support of this theory, citizens pointed to differences in the President’s demeanor and physical appearance, his apparent inability to use one hand, and even the video’s lighting.  Local newspapers had likewise speculated that the video was a deepfake, and on January 7, 2019, officers of the Gabonese armed forces attempted a coup d’état, forcibly seizing a broadcast station and sending messages in an effort to “restore democracy.”

While the coup did not succeed and the video was most likely genuine, the episode should give pause to anyone skeptical of the dangers of deepfake manipulation.  No actual manipulation was necessary: the mere existence of deepfake technology brought the country to the edge of non-international armed conflict.

With media manipulation reaching such new heights, international actors must not neglect its technical and legal impact on the battlefield.  This Article therefore assesses the current state of deepfake technology, looks ahead to its potential future applications in armed conflict, examines the ways in which current law contemplates such deception, and distills recommendations for improving governance where needed.

About the Author

Judge Advocate, United States Army.  Presently assigned as Chief of National Security Law, 25th Infantry Division, Schofield Barracks, Hawaii.  LL.M. in Military Law, 2021, The Judge Advocate General’s School, United States Army, Charlottesville, Virginia; J.D., 2010, University of Baltimore School of Law; B.A., 2006, University of Florida.  Previous assignments include Command Judge Advocate, United States Army Security Assistance Training Management Organization, Fort Bragg, North Carolina, 2018-2020; Defense Counsel, Fort Bragg Trial Defense Service Field Office, Fort Bragg, North Carolina, 2016-2018; Battalion Judge Advocate, 2nd Battalion, 3rd Special Forces Group (Airborne), Fort Bragg, North Carolina, 2014-2016; Trial Counsel, Fort Jackson, South Carolina, 2013-2014; Legal Assistance Attorney, Office of the Staff Judge Advocate, Fort Jackson, South Carolina, 2012-2013.  Member of the bar of Maryland.  The author wishes to thank the editors and staff of the Notre Dame Journal on Emerging Technologies as well as the myriad mentors, colleagues, and friends who assisted with this article.  Most of all the author thanks his wife Anna and his children Jackson and Finley for their boundless love and support.


Notre Dame Journal on Emerging Technologies ©2020  
