The Rise of AI in Manipulating Elections

As the 2024 presidential campaign heats up, concerns about the role of artificial intelligence (AI) in manipulating elections are on the rise. Experts warn that AI technology has advanced to the point where it can create convincing fraudulent images and videos, making it difficult for voters to discern what is real and what is fake. Matthew Stamm, a professor at Drexel University, leads a team that detects false or manipulated political images, and he believes things will only get worse in the AI era. Last year, Stamm’s group debunked a political ad released by the campaign of then-presidential candidate Ron DeSantis, which featured a manipulated image of Donald Trump embracing and kissing Anthony Fauci. Stamm called the incident a “watershed moment” in U.S. politics, highlighting AI’s potential to generate voting misinformation.

Election experts are concerned about the ways in which AI could disrupt and damage the electoral process. They envision a future where AI-generated fake evidence, sham videos of ballot destruction or voter suppression, phony emails, and misleading texts could create mass confusion and undermine the integrity of elections. Pennsylvania Secretary of State Al Schmidt is leading a newly formed Election Threats Task Force to combat misinformation about voting. Schmidt acknowledges that misinformation has been spread using primitive methods like tweets and Facebook posts, but AI poses an even greater challenge if it is weaponized to deceive voters or harm candidates.

AI’s ability to generate fraudulent content has evolved significantly over the years. Previously, manipulating text and imagery to denigrate opponents required expertise in tools like Photoshop, but now anyone with access to AI technology can create convincing deepfakes. Deepfakes are synthetic media in which a person’s likeness is swapped or manipulated to make them appear to say or do things they never did. The Campaign Legal Center, a nonpartisan government watchdog group, pointed to an incident before the New Hampshire primary in which an AI-generated robocall simulated President Joe Biden’s voice and urged voters not to participate. Manipulation of this kind could easily mislead voters with false information and effectively disenfranchise them.

AI provides malicious actors with the means to work quickly and effectively at a low cost. Detecting AI-generated deepfakes is a constant challenge, as every time a solution is developed, fraudsters find ways to circumvent it. Political communications expert Kathleen Hall Jamieson warns that AI’s capabilities are continually improving, making it increasingly difficult to detect deepfakes. In this environment, skepticism is crucial, as we cannot trust everything we see.

Misinformation, often amplified by AI, poses a significant threat to democracy. Matt Jordan, director of the Pennsylvania State University News Literacy Initiative, describes it as a “fire hose of falsehoods” that can erode the capacity to share reality, upon which democracy depends. It not only affects politicians but also puts election workers at risk. Security specialists recommend keeping personal social media accounts private to limit access to images and voices that AI could use for nefarious purposes. To combat fabrications, experts advise delaying reposting emotionally charged material from social media until its veracity can be verified.

The battle against AI-generated misinformation is an ongoing challenge: public overreaction to a false report can be harder to undo than detecting the fake itself is in the lab. We find ourselves in uncharted waters, where the impact of AI on elections continues to evolve rapidly. With the 2024 elections imminent, it is critical to remain vigilant and cautious about the information we consume and share, as the future of our democracy may depend on it.


Written By

Jiri Bílek

In the vast realm of AI and U.N. directives, Jiri crafts tales that bridge tech divides. With every word, he champions a world where machines serve all, harmoniously.