Disinformation has become a substantial threat to electoral integrity worldwide. With the rise of advanced technologies such as AI, false narratives and deceptive information spread quickly, shaping public opinion and potentially altering election outcomes. This propaganda also menaces the integrity of the state itself by casting doubt on its electoral system. The basic purpose of disinformation is to influence voter decisions, and AI-generated content can be tailored to specific goals and audiences. Governments and political parties likewise spread disinformation to sway voters. Through various techniques, AI targets voters' interests. Micro-targeting is the newest and most effective of these techniques for maximizing outreach: AI tools use voter data to personalize messages that resonate with each recipient, engaging voters far more effectively.
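To make the mechanics of micro-targeting concrete, the sketch below shows how a campaign tool might match message templates to voters based on stored interest data. It is a minimal illustration only: the `Voter` fields, the `TEMPLATES` dictionary, and the `pick_message` helper are hypothetical and do not describe any real platform.

```python
from dataclasses import dataclass, field

# Hypothetical voter profile, illustrating the kind of data micro-targeting relies on.
@dataclass
class Voter:
    name: str
    district: str
    interests: dict = field(default_factory=dict)  # interest -> weight, e.g. {"healthcare": 0.8}

# Hypothetical message templates keyed by the interest they are meant to resonate with.
TEMPLATES = {
    "healthcare": "{name}, families in {district} deserve affordable care. Our plan delivers it.",
    "economy":    "{name}, jobs are coming back to {district}. Here is how we keep the momentum.",
    "security":   "{name}, keeping {district} safe is our first promise.",
}

def pick_message(voter: Voter) -> str:
    """Choose the template matching the voter's strongest interest and personalize it."""
    fallback = "{name}, your voice in {district} matters. Vote."
    if not voter.interests:
        return fallback.format(name=voter.name, district=voter.district)
    top_interest = max(voter.interests, key=voter.interests.get)
    template = TEMPLATES.get(top_interest, fallback)
    return template.format(name=voter.name, district=voter.district)

if __name__ == "__main__":
    voter = Voter(name="Alex", district="District 7", interests={"economy": 0.6, "healthcare": 0.9})
    print(pick_message(voter))  # -> the healthcare message, the voter's highest-weighted interest
```

The point of the sketch is the data flow, not the wording: once interests are scored per voter, personalization is a cheap lookup that can be run at the scale of an entire electorate.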
Most of the time, the human mind cannot reliably distinguish AI-generated content and propaganda from authentic material. AI identifies thought leaders by monitoring their online engagement patterns and social media presence. If a person supports a certain political party, their search history feeds sentiment-analysis algorithms and behavioral-trend models that steer their feed further toward that party's narrative, and the same approach is extended to the people connected to them. AI-powered chatbots craft personalized messages, allowing campaigns to connect with voters on a deeper level. Natural Language Processing (NLP) enables AI systems to generate content that resonates with each voter's mindset and interests. AI-generated content is also used to target vulnerable communities, shaping their views in support of a particular political party.
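The profiling step described above can be illustrated with a minimal sketch: scoring a user's posts for partisan leaning so that later content can be slanted the same way. The keyword lists, scoring rule, and `estimate_leaning` function below are simplified assumptions for illustration; real systems rely on trained NLP classifiers rather than word counts.

```python
from collections import Counter
import re

# Hypothetical keyword lists standing in for a trained sentiment/stance model.
PARTY_A_TERMS = {"reform", "climate", "healthcare"}
PARTY_B_TERMS = {"borders", "taxes", "security"}

def estimate_leaning(posts: list[str]) -> str:
    """Count partisan keywords across a user's posts and return the inferred leaning.

    A real pipeline would use a trained classifier (sentiment analysis, stance
    detection) instead of raw keyword counts; this only illustrates the flow:
    behavioral data in, an inferred political profile out.
    """
    counts = Counter()
    for post in posts:
        words = set(re.findall(r"[a-z]+", post.lower()))
        counts["party_a"] += len(words & PARTY_A_TERMS)
        counts["party_b"] += len(words & PARTY_B_TERMS)
    if counts["party_a"] == counts["party_b"]:
        return "undetermined"
    return "party_a" if counts["party_a"] > counts["party_b"] else "party_b"

if __name__ == "__main__":
    history = [
        "We need healthcare reform now",
        "Climate policy should be a priority",
        "Lower taxes? Maybe, but healthcare first",
    ]
    print(estimate_leaning(history))  # -> "party_a" under these hypothetical keyword lists
```

Once such a profile exists, the personalization step from the previous sketch can be aimed at it, which is what makes the combination of behavioral profiling and generative content so effective at reinforcing existing leanings.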
The 2024 US elections saw a significant increase in AI-powered campaigns and disinformation, demonstrating how foreign actors use AI at the interstate level to try to influence voter opinions. China, Russia, and Iran each used AI during the 2024 elections, with distinct strategies. These states undermined public confidence in US democratic institutions and exacerbated societal divisions. Iran, for its part, sought to stoke political unrest in the United States at that time. The covert Iranian influence operation "Storm-2035", identified by OpenAI and Microsoft, used AI tools, specifically ChatGPT, to manipulate public opinion in the United States. Russia, too, posed an active threat during these elections, using AI-generated content to erode public confidence in US democratic leaders and institutions. The fake content was designed chiefly to damage the standing of the Democratic candidates Kamala Harris and Joe Biden while supporting the Republican candidate Donald Trump because of his stance on Ukraine.
As every state protects its own interests, Russia's use of AI served its own strategic concerns. Russia preferred the Republican presidential nominee Donald Trump and other politicians who held similar positions on Ukraine and on NATO support in the Russia-Ukraine war. Another reason for fostering anti-American sentiment was to make it harder for the government to present a united front on Russia-related issues, thereby shifting public perception of the Democratic Party. China aims to reunify Taiwan with the mainland, and its actions are geared toward that goal: it used AI to gather intelligence and to shape US public opinion on the Taiwan issue and US-China relations. The US neither supports nor opposes Taiwan's independence, yet it provides military support to Taiwan. China's AI-powered influence operations sowed doubts among American voters about the country's commitment to Taiwan and its relations with China. The US continues to walk a fine line with China, balancing cooperation with competition for influence and power while defending Taiwan's democratic values and security.
Conclusion
The risk of disinformation during elections is real and growing. As technology advances, artificial intelligence used to disseminate misinformation and sway public opinion has become a new weapon for states seeking to leverage their influence in the emerging world order. This poses a daunting challenge to democratic processes and to the sanctity of elections. Overcoming it requires building effective methods of detecting and countering disinformation: advocating for media literacy, strengthening cybersecurity practices, and advancing transparency in online political campaigning. States should ensure the integrity of democratic processes and safeguard citizens' right to make well-informed choices.




