Disinformation is commonly described as a deliberate attempt to spread false information. But it is not only about the act of spreading; it is about the idea that prevails. At its core, it is a struggle over meaning: over what is accepted as truth and what is dismissed as falsehood. And what is accepted as truth rests on two questions: who created the meaning, and how persuasive is it? These questions are interlinked, because persuasion comes with power. This is where Michel Foucault becomes essential to any discussion of disinformation. Foucault is famous for his nexus of discourse, power, and knowledge. For him, knowledge was never separate from power; it was, and continues to be, created by the discourses through which societies decide what counts as truth. Put simply, disinformation becomes information when it comes from authoritative sources. White becomes black and black becomes white when disinformation is persuasive enough to make people believe it is the sole reality. Disinformation, then, is not only about false claims; it is about creating discourses.
Disinformation succeeds not because people are misled cognitively, but because they are captivated aesthetically, through persuasion, emotional resonance, and manipulation. In democratic Athens, the Sophists were orators accused of prizing persuasion over the discovery of truth. As Foucault argued, truth depends on the struggle over who gets the authority to define it in society. Disinformation is about manipulation backed by power: the power to decide which voices are trusted and which are silenced.
But who holds the power to define truth?
For much of modern history, states held a monopoly on creating truth. Through social institutions, national media, and propaganda tools, states controlled discourse, a dynamic George Orwell captured in his novel "1984." States could label their own discourse "patriotic information" while dismissing rival claims as false propaganda. During the First World War, Britain published the Bryce Report, alleging horrific German atrocities in Belgium. It demonized Germans as "Huns" and justified Britain's war stance, helping convince neutral states, especially the United States, to enter the war in 1917. In the Second World War, Britain again deployed propaganda to sway audiences sympathetic to Nazi Germany.
Hitler's ascent to power through disinformation propaganda is a key case study. The Reich Ministry of Public Enlightenment and Propaganda, headed by Joseph Goebbels, conditioned the public into obedience to Hitler. The "big lie" strategy is famously associated with Hitler's Mein Kampf, which states that "the great masses of the people will more easily fall victim to a big lie than to a small one," the idea being that a falsehood repeated often enough comes to seem self-evidently true. An interesting further point is that with the decline of Nazi power, even the reality of the Holocaust became contested. There is a stark divide between hard-line Holocaust deniers, who claim the Holocaust never happened at all, and soft-line deniers, who concede that it happened but claim its scale was far smaller than documented.
In 2020, the EU DisinfoLab exposed a fifteen-year-long Indian disinformation campaign aimed at shaping Western narratives around Kashmir and Balochistan. Similarly, NATO describes Russia as the most significant threat to the security of its allies and the Euro-Atlantic area, and accuses Russia of running disinformation campaigns through Russia Today and Sputnik against NATO allies and partners. This has compelled NATO to strengthen the deterrence and defense of its allies. In 2023, Meta warned that China was stepping up its disinformation efforts, following Russia and Iran; about five China-based networks were taken down, the highest number for any country that year. These cases bear out Foucault's reminder that truth depends on who has the power to define it and spread it.
Diffusing Power and Disinformation: Platforms, Algorithms, and Non-State Actors:
The state's monopoly over discourse creation is now fragmented. Where a Ministry of Propaganda once monopolized information, algorithms on Facebook, YouTube, and TikTok now perform the same function. The shift is not the end of disinformation but its multiplication across many centers. With the rise of digital technologies, power is no longer state-centric; it has diffused to new actors, including media corporations, social media platforms, AI, and non-state actors. These actors do not merely curate information; they filter it algorithmically. Because of this shift, states that were once the chief propagators of disinformation now face a daunting task in curbing this non-traditional threat. Though the source of power has shifted, the underlying motive, discourse creation that blurs the line between information and disinformation, has not.
Online disinformation is more potent than earlier forms of propaganda because it simultaneously erodes societal cohesion, trust in governance and transparency, and public opinion, while delegitimizing established sources of knowledge. One case study is the "Pizzagate" conspiracy theory. During the 2016 US election, WikiLeaks released troves of emails belonging to John Podesta, a prominent Democrat who chaired Hillary Clinton's campaign. The theory held that high-ranking Democratic officials were involved in child abuse, with the hub being Comet Ping Pong, a Washington, D.C. pizza restaurant whose owner appeared in the emails. The theory emerged on 4chan and spread rapidly: days before the election, a Reddit post compiled all the supposed "evidence," and the story soon reached mainstream attention. Posts claimed that children were being held in the restaurant's basement, even though Comet Ping Pong has no basement. The New York Times and The Washington Post debunked the theory, yet it still helped Trump frame his campaign as a fight against a corrupt elite. The episode illustrates how digital tools have become central to discourse creation. Research by Dr. Marco Bastos revealed how social media bots were deployed during the 2016 Brexit referendum: about 13,493 automated accounts were active in the two weeks before and after the vote, amplifying polarizing messages and creating an illusion of grassroots support before disappearing once polling concluded. Similarly, during COVID-19, Telegram became a central hub of anti-vaccine content.
Disinformation is profitable for platforms, as demonstrated by their decisions to cut fact-checking programs. After Elon Musk's 2022 takeover of Twitter (now X), he rolled back fact-checking partnerships with various organizations and replaced them with Community Notes, a move followed by a surge in disinformation campaigns. Power in the hands of individuals thus determines what content platforms carry. Generative AI has slashed the cost of disinformation, making it far easier to tailor false content to target audiences. During the 2016 US election, hundreds of websites operated from Macedonia and Kosovo pushed pro-Trump content; investigations found that 75% of the people reached did not actually follow those pages. The aim was to entrench disinformation by inflating site traffic and generating advertising revenue. Generative AI operates on the same corporate logic: whoever can pay can amplify disinformation.
"Hahaganda" is a strategy by which disinformation agents spread false content through memes, jokes, and political comedy. Panorama, a Russian satirical site, has repeatedly lampooned Russian politics with joke stories. In 2018, a state-owned news channel was tricked into reporting Panorama's fake story about a Russian schoolteacher. On another occasion, Panorama ran a story about the suicide of the head of the Moscow anti-doping laboratory; the joke was treated as authentic news by several Russian outlets. The high-profile Russian journalist Vladimir Solovyov has been fooled twice by Panorama's satire.
Terrorist groups exploit disinformation not simply to lie but to shape narratives. Their disinformation is usually decentralized, adaptive, and often embedded in encrypted platforms. Because the EU's definition of disinformation also covers manipulated content, terrorist activity of this kind typically falls under it. ISIS has built one of the most sophisticated media operations, mimicking the style of television channels to claim responsibility for attacks, while pro-Islamic State media outlets run multilingual disinformation campaigns. According to a 2022 report by the Institute for Strategic Dialogue, about 38 pro-Islamic State channels with a collective following of 108,268 spanned Facebook, Twitter, and Telegram. The group also uses images of soldiers and officials to taunt governments. Boko Haram combines traditional media with social media platforms. This diffusion of power has widened the scope of disinformation and made its agents harder to trace, blurring the boundary between sender and receiver. In the past, accountability was traceable; in today's digital ecosystem, actors are dispersed and decentralized, and a single post can originate from anywhere. Yet this fragmentation of authorship still supports Foucault's point: disinformation comes from those with the power to create, propagate, and convince.
Conclusion:
Disinformation is more than a deliberate attempt to spread fake news; its roots lie in power and persuasion. With the rise of the digital world, Foucault's ideas on discourse, power, and knowledge have therefore re-emerged as a critical framework for analyzing how narratives are constructed, legitimized, and internalized as truth. Power is not only repressive but also productive. Disinformation is a mechanism through which power manifests itself, constructing certain truths while silencing others. Digital media and non-state actors amplify certain narratives not because they are true, but because they are engaging and profitable.




