The year 2024 is set to be a significant one for global elections, with key parliamentary and presidential elections taking place in major countries such as the United States, the United Kingdom, India, Brazil, Indonesia, and Mexico. However, the increasing prevalence of cyberattacks and AI-driven disinformation is considered a top risk to the integrity of these democratic processes, according to the World Economic Forum’s 2024 Global Risks Report.

As technology advances, deepfake audio and video have become both more sophisticated and easier to produce. This manipulated media can spread rapidly on social media, swaying public opinion and undermining trust in elections. For example, during the 2023 Slovakian election, deepfake audio clips falsely attributed to Michal Šimečka, leader of the liberal Progressive Slovakia party, spread misinformation claiming he planned to rig the election and double the price of beer if he won. Similarly, in the UK, Labour Party leader Sir Keir Starmer was targeted with a deepfake audio clip portraying him verbally abusing staff; the clip was released during his party’s annual conference.

Cyberattacks remain a constant threat, with politicians, their families, staffers, and party officials frequently targeted. In the 2020 U.S. elections, much of the interference was linked to Russia, and similar concerns exist for the 2024 elections. Other nations, such as China, Iran, and North Korea, are also potential sources of election interference. For instance, China was reported to have interfered in Canada’s 2019 and 2021 federal elections. State-sponsored hackers might target voting machines, either to compromise them or simply to create the appearance of compromised security. The Cybersecurity and Infrastructure Security Agency (CISA) is preparing for such attacks, offering resources and best practices to election officials through its #Protect2024 initiative. Additionally, ethical hackers from the Election Security Research Forum and MITRE are working to identify and fix vulnerabilities in election technology.

AI can efficiently create and spread disinformation, presenting a serious challenge to election integrity. For example, in January 2024, a robocall using an AI-generated imitation of Joe Biden’s voice discouraged voters from participating in the New Hampshire primary. Disinformation campaigns can manipulate voter behavior and erode trust in the election process, as seen in the 2016 U.S. presidential election, when Russian hackers breached the Democratic National Committee’s network.

Despite these challenges, AI also offers opportunities to enhance election security and efficiency. AI can help campaigns create targeted messages, provide instant responses to political events, and assist in election administration tasks like verifying voter eligibility and counting ballots. Moreover, AI tools can educate voters by providing reliable information about candidates and the voting process.

To protect election integrity, several strategies are being proposed or implemented. The proposed For the People Act aims to modernize and secure voter registration, mandate paper ballots, and establish a national commission to protect democratic institutions. Cybersecurity initiatives by CISA assist election officials with resources and best practices, treating election systems as critical infrastructure. Guidelines for the responsible use of AI in elections include disclosing the sources of AI-generated content and enabling independent audits. Public education campaigns aim to inform voters about AI’s role in elections, offering tools to detect misinformation and digital hubs of trusted election resources. Additionally, the U.S. and the European Union have established a joint working group on election security to share best practices and coordinate responses.

While AI and cybersecurity pose significant challenges to the 2024 elections, they also offer tools to combat deepfakes and strategies to protect and enhance the electoral process. Ensuring the ethical and responsible use of AI will be crucial in maintaining trust and integrity in global elections.

About the author: