AI Voice Cloning: The Threat to Democratic Elections

As AI technology grows more sophisticated, the threat of artificial intelligence (AI) voice cloning being used to manipulate democratic elections has become a pressing concern. Digital civil rights groups have warned that publicly available AI tools can easily be weaponized to produce persuasive election lies in the voices of prominent politicians. Recent tests conducted by the Washington, D.C.-based Center for Countering Digital Hate revealed that six popular AI voice-cloning tools generated convincing voice clones in 80% of cases. These clones were used to make false statements about elections in the voices of eight influential American and European politicians.

The implications of these findings are grave, revealing a significant gap in safeguards against the misuse of AI-generated audio to mislead voters. With major democratic elections on the horizon, the accessibility and advanced capabilities of AI technology have raised concerns among experts. Although some of the AI voice-cloning tools have implemented rules and technological barriers to prevent the generation of election disinformation, the study found that many of these obstacles could be circumvented with quick workarounds.

Imran Ahmed, CEO of the Center for Countering Digital Hate, lamented the dangerous consequences of AI companies prioritizing market dominance over the safety of democracies. Ahmed expressed concern over the ease with which these platforms can be used to create and perpetuate lies, causing politicians to constantly refute false claims and leaving democracies vulnerable to manipulation. The lack of self-regulation on the part of these companies, coupled with the absence of robust laws to prevent the abuse of AI voice-cloning tools, exacerbates the threat posed to democratic processes.

The nonprofit organization conducted the research in May and identified the most popular publicly available AI voice-cloning tools for testing. These tools included ElevenLabs, Speechify, PlayHT, Descript, Invideo AI, and Veed. Real audio clips of politicians were submitted to prompt the tools to impersonate their voices and make five baseless statements. The researchers found that none of the AI voice-cloning tools had sufficient safety measures to prevent the cloning of politicians' voices or the production of election disinformation. Even tools that required users to upload a unique audio sample before cloning a voice could be easily exploited, as a unique sample could be generated using a different AI voice-cloning tool.

The findings also highlighted variations in the performance of the different AI voice-cloning tools. Speechify and PlayHT were found to perform the worst in terms of safety, generating believable fake audio in all 40 test runs. ElevenLabs emerged as the best-performing tool, as it successfully blocked the cloning of U.K. and U.S. politicians' voices. However, it still allowed for the creation of fake audio in the voices of prominent E.U. politicians. In response to the report, Aleksandra Pedraszewska, Head of AI Safety at ElevenLabs, stated that the company recognized the need for improvement and was constantly working to enhance its safeguards.

The use of AI-generated audio to manipulate elections is not a novel phenomenon. Reports have surfaced of AI-generated audio clips being employed in various attempts to sway voters globally. For instance, just days before Slovakia’s parliamentary elections in 2023, widely shared audio clips resembling the voice of a liberal party chief were circulated on social media. These deepfakes allegedly featured him discussing the manipulation of votes and proposing an increase in beer prices. Earlier this year, during the New Hampshire primary, AI-generated robocalls mimicked President Joe Biden’s voice and urged voters to stay home and “save” their votes for the general election.

Experts have emphasized that AI-generated audio has become an early preference for those seeking to influence political outcomes due to the rapid advancements in this technology. With just a few seconds of real audio, a lifelike fake can be generated. Yet, it is not only AI-generated audio that has raised concerns among experts, lawmakers, and tech industry leaders. OpenAI, the company behind ChatGPT and other popular generative AI tools, recently revealed that it had identified and halted five online campaigns that utilized its technology to manipulate public opinion on political issues.

Imran Ahmed called for more stringent security measures from AI voice-cloning platforms and greater transparency. He suggested that these platforms should publish a library of audio clips they have created so that suspicious audio can be verified when it spreads online. Ahmed also stressed the need for lawmakers to step in and implement minimum standards. While the European Union has passed a comprehensive artificial intelligence law, which will go into effect in the coming years, it does not specifically address voice-cloning tools. In the United States, Congress has yet to pass legislation regulating AI in elections. Ahmed emphasized that the threat of disinformation to elections goes beyond causing minor political incidents; it undermines trust in what people see and hear.

As the world approaches significant democratic elections, it is crucial to confront the emerging threat of AI voice cloning. Safeguards must be established, both through self-regulation by AI companies and through robust legislation. Failure to address this issue promptly may lead to an erosion of public trust, threatening the very foundations of democratic processes.

This article was written with inspiration from the original article published by The Associated Press on June 1, 2024.


Written By

Jiri Bílek

In the vast realm of AI and U.N. directives, Jiri crafts tales that bridge tech divides. With every word, he champions a world where machines serve all, harmoniously.