As the 2024 U.S. presidential election nears, misinformation threatens to make the race more of a battleground than a civic activity.
At a Friday, November 17 Ethnic Media Services briefing, experts in fair elections, civil rights, and digital justice discussed how misinformation poses an urgent threat to the 2024 elections, fueled by obstacles to voting access, lax social media moderation, high turnover of election officials, and artificial intelligence “deepfakes.”
Threats to voter access
Gowri Ramachandran, deputy director of the Elections & Government Program at the Brennan Center for Justice, said that key to fair elections is undisrupted voter access. Accordingly, election officials should be resilient “in the event of touchscreen voting machines breaking down, or electronic poll books becoming unusable, or a breach of the voter registration database.”
“We recommend practices like backing up the voter database well before the election, having plenty of emergency and provisional paper ballot supplies, and doing capacity testing for electronic systems,” she continued. “Small disruptions like these can be fodder for a lot of misinformation about how voters can vote, or even about the whole election being unfair.”
Another security threat to the upcoming election involves poll worker shortages — which occurred in 2020 due to the pandemic, particularly because many poll volunteers tend to be elderly — but “you also get shortages when election workers feel unsafe due to threats and harassment,” Ramachandran said.
“To help poll workers feel safe, we recommend that election officials implement security upgrades like bulletproof glass and keycard access … and make it clear that threats like doxxing and disinformation will not be tolerated,” she added.
Offline consequences of online politics
Social media companies play a major role in perpetuating misinformation, said Nora Benavidez, senior counsel and director of Digital Justice and Civil Rights at Free Press. “Particularly since the January 6 insurrection, the biggest companies — Meta, TikTok, Google, YouTube, Twitter — finally seem to accept that their failure to moderate content played a role in undermining public safety and democracy.”
The tens of thousands of layoffs at these companies over the past year, which deprioritized content accuracy and accountability, “point to where their values lie,” she added. “There’s a downstream effect where mainstream media outlets, like CNN and the LA Times, often digest unverified misinformation and disinformation originating on social media … and that will have grave implications over the next 12 months.”
To promote accurate content around the elections, Benavidez said these tech companies should reinvest in staffing teams to “moderate information and safeguard election integrity”; moderate political ads more effectively across languages; adopt stronger transparency practices, such as sharing data analytics reports with researchers, journalists and policymakers; and bolster political ad policies to prohibit content promoting misinformation about polling locations, practices or candidates.
Election official turnover and misinformation
On the polling side, too, misinformation threats to a fair 2024 election are worsened by a high turnover of election workers, said William Adler, associate director of the Elections Project at the Bipartisan Policy Center.
Administering elections has always been a “relatively thankless, low-paid government job, which has gotten increasingly complex over the past 20 years,” he said. “As we’ve incorporated more technology into polling, they’ve become IT managers in the public spotlight, facing public threats … Communicating about their work is now a key part of their job in a way that wasn’t the case before 2016 or 2020.”
These factors, alongside safety concerns, translate into high turnover, Adler continued, citing a November 2023 Reed College survey of approximately 1,000 local election officials. In it, “31% of those surveyed said they knew other local election officials who left their jobs because of personal safety issues and threats. 11% surveyed had considered leaving because of safety concerns, and over a third of them will be eligible for retirement before 2026.”
This constitutes a dangerous cycle of misinformation, Adler explained: “Election officials face threats; they may be more inclined to leave their jobs; that results in less institutional knowledge on how to run an election, which might result in more mistakes, which may in turn undermine voter confidence, which brings more threats … Delays or errors in processing ballots create a hunger for information which misinformation peddlers are all too eager to fill.”
Artificial intelligence “deepfakes”
An increasing amount of this misinformation takes the form of AI-generated “deepfake” images and audio, said Sam Gregory, executive director of Witness.org.
This has become markedly easier over the last year, as a wide range of tools has emerged that lets anyone generate images from a text prompt or mimic voices from audio samples, “to target and push people out of the public sphere. In conversations I’ve had with people working in electoral processes, this is something they’ve seen and worry about.”
In electoral contexts, patterns of deceptive image and audio use are already on the rise. He cited recent examples: an audio deepfake of Slovakian liberal politician Michal Šimečka and journalist Monika Tódová apparently discussing how to rig the upcoming elections, and other audio deepfakes targeting UK Labour leader Keir Starmer and Chicago mayoral candidate Paul Vallas.
Another form of misinformation involves “cases when someone claims something is a deepfake when it’s actually real,” explained Gregory. Witness.org “receives many cases of deepfakes, and a lot of them are people basically relying on others’ absence of knowledge to deny a piece of audio on the basis that it has been faked.”
“Our information environment, by design, discourages engagement and conversation” across opposing political sides, he added. To combat electoral threats like these, we should “start from a baseline of what’s possible, of how misinformation is spread, so we know where to look for and stop it.” (Selen Ozturk/Ethnic Media Services)