
THREAT ASSESSMENT: RUSSIA, IRAN, AND CHINA SEEK TO INFLUENCE US ELECTIONS, UNDERMINING DEMOCRACY AND SOWING VOTER DISCORD. DISINFORMATION WILL ALMOST CERTAINLY BE THEIR PRIMARY MEANS OF INFLUENCE.

William Adams, Sabrina Bernardo, Clémence Van Damme, Yassin Belhaj, Samuel Pearson

Finley Thomas, Naureen Salim, Alice Cian, Editors; Elena Alice Rossetti, Senior Editor

October 16, 2024


Moving in the Shadows[1]


Summary

Russia, Iran, and China continue seeking influence over the 2024 US Presidential Election through disinformation campaigns aimed at advancing their interests. Each state will very likely intensify efforts to discredit key political figures and manipulate public perceptions on critical issues, using AI-driven tools to enhance the effectiveness of their operations. They will very likely attempt to shape public opinion and fuel divisions, likely aiming to create a more polarized political landscape that benefits their objectives. Foreign influence campaigns will very likely affect perceptions of election legitimacy and undermine trust in democratic institutions.


The Russian Federation (Russia) intends to reuse previous disinformation models in the US, attempting to exploit and deepen political polarization to damage America’s international image and democracy.[2] Russia’s campaigns will very likely seek to damage Zelensky’s public image, contrasting US commitment to Ukraine with the federal government’s perceived inaction on other polarizing topics, such as the US-Mexico border and immigration. Other vectors for Russian influence will very likely include narratives about the incompetence or criminality of US political leadership.


The Islamic Republic of Iran (Iran) attempts to interfere with the elections through hack-and-leak operations targeting the Trump presidential campaign.[3] Iran will very likely focus on reducing US support for Israel, likely by inciting protests and fueling political discord. Tehran’s influence campaigns will very likely prioritize preventing US presidential candidate Donald Trump’s re-election, viewing him as more opposed to Iran’s foreign policy than US presidential candidate Vice President Kamala Harris.


The People’s Republic of China (PRC) uses extensive networks of social media accounts to spread disinformation.[4] The PRC will very likely aim to maximize distrust in US institutions, likely exploiting natural disaster management[5], by accusing the US of acting against societal interests. China’s efforts to undermine US support for Taiwan will very likely focus on discrediting pro-Taiwan public figures and shaping online discourse about the risks and consequences of US involvement.


US government agencies and political parties will likely engage in counter-threat communications, including public hotlines and private liaisons with social media companies. Foreign influence will likely complicate the new administration’s efforts to establish itself, as perceptions of this influence will likely lead to claims of an illegitimate election or collaboration with US adversaries.


Recent Events

From January 1, 2023, to October 7, 2024, analysts at The Washington Post examined over 19,000 posts on X from “Spamouflage,” a Chinese disinformation network. Recent data reveals that China has targeted down-ballot elections, with 15% of Spamouflage posts mentioning local politicians like Alabama Representative Barry Moore. This strategy focuses on local elections and hot-button issues; experts note a significant increase in antisemitic rhetoric, with over 230 posts about Jews from this network since July 1. The Spamouflage network has grown sophisticated, using highly realistic fake accounts that mimic US citizens, contributing to a fivefold increase in views of its posts, which rose from 60,000 to 300,000 per week in 2024.


Operation Overload is a Russian disinformation campaign that targeted the Paris 2024 Olympics and has now shifted its focus to the 2024 US elections.[6] The operation targeted journalists, fact-checkers, and news agencies worldwide with disinformation via email, directing them to websites and social media channels promoting Russian objectives. In September 2024, emails related to US politics increased, with some undermining Harris’ campaign and spreading rumors about her low IQ. On September 2, 2024, US news agencies received an email containing manipulated images of a crashed fighter jet and several fake screenshots in an attempt to spread anti-Ukraine propaganda.[7]


In August 2024, the cybersecurity company CyberCX uncovered the Green Cicada network, consisting of over 5,000 AI-generated social media accounts on X linked to a Chinese university and aimed at infiltrating democratic discourse. The operation focused on amplifying polarizing content on sensitive US issues, such as immigration and foreign policy including conflicts in the Middle East, to widen divisions among US voters. According to CyberCX, the information operation is capable of interfering in the upcoming election, as the network’s activity has steadily increased since July 2024.[8]


On August 19, 2024, an official statement from the Federal Bureau of Investigation (FBI), the Office of the Director of National Intelligence (ODNI), and the Cybersecurity and Infrastructure Security Agency (CISA) announced that Iran had tried to interfere with the 2024 elections, attempting to access sensitive information from both presidential campaigns using social engineering tactics.[9] In June 2024, an Iranian intelligence unit reportedly sponsored a phishing attack targeting high-ranking campaign officials, gaining access to personal information about vice presidential candidate J.D. Vance. News outlets and the Democratic Party declined to publish the illegally obtained documents, but independent journalist Ken Klippenstein released the 271-page report on September 26.[10] In August 2024, OpenAI announced it had discovered and dismantled a cluster of ChatGPT accounts involved in a covert Iranian influence operation called Storm-2035. Iran-controlled accounts produced AI-generated content in English and Spanish, disseminating it online via social media and news sites covering US and global issues.[11]


Russia plans to use AI to interfere in the 2024 US elections by deepening political polarization and disrupting public discourse, attempting to raise doubts about the US government and election integrity. Russian operatives intend to use AI to tailor narratives to target audiences’ biases and reinforce their existing preferences. One tactic involves infiltrating institutions that support Western defense systems and values, inserting customized content presented as factual but designed to overwhelm the target with misleading or false information.


Assessments

Russia will very likely spread disinformation about Zelensky’s demands for US support and exaggerate his claims regarding Ukraine’s progress in the Russo-Ukrainian war. These claims very likely aim to prevent military aid from being sent to Ukraine by fostering skepticism about the effectiveness of US involvement and encouraging narratives that portray the conflict as unwinnable. Falsified claims about Zelensky’s demands for support will likely incite resentment among conservatives and anti-war advocates over the perceived lack of attention to issues like the US-Mexico border crisis or the destruction caused by extreme weather events. Russian donors will very likely use financial contributions to influence media companies to spread Russian propaganda and divert attention from news about Ukraine’s achievements in the war.


Russia will almost certainly use AI to create disinformation and misleading content that spreads divisive, anti-Ukraine rhetoric, likely creating online echo chambers that amplify its messages. Due to the ease of content production, threat actors will very likely keep pace with events and exploit the grievances most relevant to their audience, such as government spending on the Ukraine war. The rate of content distribution will very likely overwhelm readers, likely dissuading them from conducting due diligence and fact-checking. Regular viewers of this content will very likely encounter similar posts after their initial exposure due to social media recommendation algorithms. These posts will very likely increase users’ belief in false claims, possibly influencing voter opinions. Russian state or state-backed actors will very likely use manipulated images and videos of celebrities or politicians to claim they endorse Russian-aligned views. Political figures will very likely be targets of disinformation or conspiracies, depicted as incompetent and acting against US interests.


Iran will almost certainly try to undermine Trump’s re-election due to his strong anti-Iranian stance, including his withdrawal from the Joint Comprehensive Plan of Action (JCPOA) nuclear deal and his ordering of the assassination of Major General Qasem Soleimani, a top commander of the Islamic Revolutionary Guard Corps (IRGC). As tensions with Israel rise and the election approaches, Iran will very likely escalate cyberattacks and disinformation against Trump’s campaign. As Iran considers this election a matter of national security, it will likely support Harris, viewing her as a more moderate candidate than Trump.


Iran will very likely target Americans with access to privileged information, such as diplomats, scientists, government officials, academics, think tank experts, candidates, and campaign staff, to expose sensitive information about candidates and generate controversy to influence the electorate. There is a roughly even chance that Iran will release compromising information about the candidates in the weeks leading up to the elections, as Iran has been confirmed to have gained access to sensitive information on individuals from both parties. The IRGC Intelligence Organization (IRGC-IO) will very likely intensify social engineering schemes such as phishing emails impersonating news outlets, NGOs, journalists, and URL-shortening services to harvest credentials for access to privileged information. Phishing emails will also very likely contain malware, allowing operators to penetrate devices and conduct surveillance with little to no detection. Iran will likely leverage AI to boost misinformation campaigns, drive engagement, and spread false information through fake personas on social media and news outlets, fueling debates to undermine social cohesion and trust in American democratic institutions. However, it is unlikely that Iran’s misinformation campaigns will significantly impact the election’s outcome, as previous campaigns had limited reach despite attempts to personalize fake accounts and produce English-language content.


Iranian-operated fake activist accounts will very likely encourage pro-Palestine protests, likely to disrupt political events and increase threats to the safety of political candidates. There is a roughly even chance that Iran will actively target uncommitted voters in swing states with large Muslim and Arab American populations, such as Michigan and Minnesota. Iran will almost certainly disseminate anti-Israel propaganda to erode US support for Israel, circulating violent images of the conflict that highlight human rights violations, conspiracies, and antisemitic tropes to influence voter decisions and compel leaders to reconsider their positions.


China will almost certainly rely primarily on disinformation to discredit the American elections, likely aiming to enhance its global standing and cast doubt on American leadership. The PRC will almost certainly use AI to make fake accounts look more authentic while spreading disinformation, likely through realistic text, synthetic audio, and video to reach US audiences while concealing their origins. The Chinese government will very likely use AI networks, like the Green Cicada network, to deepen divisions and increase polarization on key political issues such as immigration and the economy, likely creating a wider divide among voters.


It is very likely that the Chinese influence campaign known as “Spamouflage” or “Dragonbridge” will expand as the election approaches, likely spreading scandals about key political figures to confuse voters and reduce turnout, ultimately weakening the integrity of the election process.[12] The PRC will very likely try to obtain information on congressional candidates to shape the future political and economic direction of the US, particularly on issues such as foreign policy and the Israel-Hamas war. Senate and House candidates who hold strong positions on the Taiwan issue will likely become primary targets for the PRC, as Beijing seeks to promote candidates more aligned with its geopolitical interests. Beijing will very likely exploit US natural disasters, as seen in 2023 with the Hawaii wildfires, spreading false rumors about military activity and likely undermining American confidence in institutions and officials.


Future Implications

Given that threat actors will likely prioritize propaganda and disinformation over direct electoral interference, the Intelligence Community (IC) will almost certainly continue to monitor “perception hacking.”[13] The US government and polling stations will likely establish protocols to handle concerns about election integrity processes and infrastructure, such as campaign stages, debate venues, and polling places. The ODNI Foreign Malign Influence Center will almost certainly continue collaborating with companies to stay updated on tactics and procedures used by foreign actors, including AI. CISA training will very likely assist election officials in detecting ransomware threats, securing electronic voting machines, and conducting effective post-election audits to ensure vote accuracy.


Political parties will very likely set up communication channels, such as hotlines, to address public concerns and offer real-time voting guidance. Defensive briefings or private notifications from the IC, alerting political figures that they are targets of disinformation, will very likely enable them to safeguard themselves and counter these efforts, particularly by refining their communication strategies.


US politics post-election will likely be turbulent, with foreign influence casting doubt on the validity of the results and likely making it increasingly difficult for the new administration to assert legitimacy. Claims of foreign interference will likely damage defeated candidates’ supporters’ trust in democratic institutions, leading to more radical actions that extremist groups can exploit. Foreign actors will likely attempt to instigate recounts if the outcome does not align with their expectations. As in the 2020 elections, false claims of vote tampering will very likely lead to prolonged legal challenges and delayed certifications in key states like Arizona, Michigan, or Georgia.


There is a roughly even chance that misinformation will reach intelligence personnel working on national or protective security issues, as influence campaigns will very likely attempt to further polarize voters by portraying factions or individuals as threatening or dangerous. There is a roughly even chance that authorities will miss extremist plans to disrupt presidential candidates’ movements and endanger their safety, raising the risk of successful interference with the election process and the transfer of power. Online conspiracies will very likely spread, claiming government agitators instigated violence, very likely eroding trust between the public and law enforcement.


[1] Shadows, generated by a third-party database

[2] “Overload,” a disinformation campaign originating from Russia, has been targeting the Paris 2024 Games in recent months and is now turning its attention to the US presidential election, particularly targeting Democratic candidate Kamala Harris, Franceinfo, September 2024, https://www.francetvinfo.fr/replay-radio/le-vrai-du-faux/campagne-de-desinformation-prorusse-qu-est-ce-que-l-operation-overload-qui-cible-la-presidentielle-americaine_6757519.html

[3] U.S. confirms Trump campaign claim it was breached by Iranian hackers, NBC News, August 2024, https://www.nbcnews.com/tech/security/us-confirms-trump-campaign-claim-was-breached-iranian-hackers-rcna16728

[4] China-linked ‘Spamouflage’ network mimics Americans online to sway US political debate, AP, September 3, 2024,

[5] China Sows Disinformation About Hawaii Fires Using New Techniques, The New York Times, September 11, 2023,  https://www.nytimes.com/2023/09/11/us/politics/china-disinformation-ai.html

[6] “Overload,” a disinformation campaign originating from Russia, has been targeting the Paris 2024 Games in recent months and is now turning its attention to the US presidential election, particularly targeting Democratic candidate Kamala Harris, Franceinfo, September 2024, https://www.francetvinfo.fr/replay-radio/le-vrai-du-faux/campagne-de-desinformation-prorusse-qu-est-ce-que-l-operation-overload-qui-cible-la-presidentielle-americaine_6757519.html

[8] Beijing-based 'Green Cicada' AI network uncovered on social media, fears of US election disruption, ABC News Australia, August 2024, https://www.abc.net.au/news/2024-08-13/green-cicada-beijing-ai-network-uncovered-social-media-x/104219752

[9] Joint ODNI, FBI, and CISA Statement on Iranian Election Influence Efforts, FBI, August 2024, https://www.fbi.gov/news/press-releases/joint-odni-fbi-and-cisa-statement-on-iranian-election-influence-efforts

[10] Independent journalist publishes Trump campaign document hacked by Iran despite election interference concerns, NBC News, September 2024, https://www.nbcnews.com/tech/security/ken-klippenstein-publishes-irans-hacked-trump-campaign-document-substa-rcna172902 

[11] Disrupting a covert Iranian influence operation, OpenAI.com, August 2024, https://openai.com/index/disrupting-a-covert-iranian-influence-operation/ 

[12] “Spamouflage” and “Dragonbridge” are AI-enabled online disinformation efforts linked to the PRC.

[13] “Perception hacking” means to artificially inflate the significance of an event in an audience’s perception.
