The Deep Impact of Deepfake Audio on Political Landscape
As political tensions rise in the United Kingdom, a potentially explosive situation has emerged around an audio file circulating on X, the platform formerly known as Twitter. The 25-second clip, posted by a user named @Leo_Hutz, allegedly features Sir Keir Starmer, the Labour Party leader, verbally abusing a staff member at a party conference.
“The problem of easy-to-create deep fake media is compounded by the fact that detection tools are not widely available.” - Sam Gregory, Executive Director at Witness.
The authenticity of the audio file is currently under investigation by the British fact-checking organization Full Fact. While the organization has not definitively determined whether the recording is genuine, AI-generated, or created by an impersonator, certain characteristics suggest it may be a deepfake, including repeated phrases delivered with identical intonation and glitches in the background noise.
The Perils of Audio Deepfakes
Audio deepfakes are increasingly recognized as a significant threat to the democratic process. As the UK and over 50 other countries move towards elections in 2024, the potential for manipulated audio content to mislead voters becomes a severe concern. The ease and affordability of creating such content and the difficulty in quickly and accurately identifying fake recordings make them a dangerous tool in the political landscape.
Fact-checkers say that such recordings can remain on social media for hours or even days before being debunked. By then, the damage has often been done: the public has already shared and absorbed the false information. This creates a precarious political atmosphere in which voters struggle to know which information to trust.
Professor Kate Dommett, a specialist in digital politics at Sheffield University, expresses concern over this issue. She believes that the uncertainty surrounding the authenticity of online content undermines the foundation of democratic debate and hampers the ability of individuals to stay informed.
Platforms and Policies
X's manipulated media policy states that audio or video content that has been deceptively altered or deceptively shared should be labeled or removed. In the case of the Starmer audio file, however, no such action has been taken. Despite requests for comment, X has not responded to queries about whether it is investigating the recording's authenticity.
Reaction to the audio file has been varied. While Starmer's team has not commented, several Members of Parliament from the ruling Conservative Party have called the recording a deepfake. MPs have expressed concern that public faith in institutions could be further undermined by the combination of AI and social media.
The Global Deepfake Dilemma
The issue of deepfake audio is not confined to the UK. Countries worldwide are grappling with how to respond to alleged deepfake recordings causing confusion and chaos. Incidents have been reported in countries ranging from Slovakia to Sudan and India, where politicians have had to deal with the fallout of supposed deepfake recordings.
Sam Gregory, executive director at Witness, a human rights group focused on technology, highlights the lack of detection tools and standards for deepfake media. He warns that politicians can exploit this absence of reliable tools to deny the authenticity of genuine recordings, placing undue pressure on fact-checkers.