
— by Alperen Keskin

1. Introduction: From Propaganda to Perception Management

In the era of the Internet, the lines between truth, story, and manipulation are hopelessly muddied. Propaganda, it has long been observed, is “whatever power wants us to believe as truth” (Ingram, 2016). But what exactly is this power, and how is it related to the nation-state? In an age of platform capitalism, algorithmic targeting, and borderless digital media, propaganda is no longer weaponized solely by the state. It is co-produced by transnational corporations, elite institutions, media elites, and algorithmic systems that operate globally. That transformation demands rethinking how consent is engineered in the public sphere, especially in an environment in which media ecosystems are global and attention is commodified.

This article examines the contemporary ground on which state propaganda stands, focusing specifically on the global manufacturing of consent. Drawing on Noam Chomsky and Edward Herman’s Manufacturing Consent (1988, 2002)[1], it reconsiders the original propaganda model and extends it to reflect contemporary technologies, agents, and platforms. It also examines how global corporations and social media systems are complicit in forging ideologies and validating the powerful, not just within nations but across them.

We live in a time when popular opinion can be swayed faster and on a larger scale than ever before. At the same time, critical analysis and debunking have been hampered by information overload, algorithmic filter-bubble effects, and a collapse of trust in media. In this convoluted environment, perception management, not open discussion, has become the prevailing device for obtaining legitimacy and public cooperation (Ingram, 2016).

1.1. Propaganda

The concept of propaganda is often misunderstood. It is not just about lies or blatant manipulation. Rather, it encompasses any organized attempt to influence what the public thinks, believes, or does in furtherance of a political, ideological, or financial objective. Edward Bernays, the father of the public relations industry and a prime architect of modern propaganda, defined it as “the conscious and intelligent manipulation of the organized habits and opinions of the masses” (Bernays, 1928). Bernays did not believe that propaganda was inherently bad; he saw it as a necessary tool of modern democracy, a method of informing the public whose legitimacy rested on the “intellectual integrity of the taught” and the “moral integrity of the teacher” (Bernays, 1928).

Historically, propaganda has been inextricably bound up with state power (Snow & Taylor, 2006). During the First and Second World Wars, propaganda posters, patriotic films, and radio broadcasts drummed up national support and demonized the enemy. In totalitarian systems such as Nazi Germany or Stalinist Russia, propaganda helped construct totalizing worldviews that eliminated dissent and constituted a common political self.

In liberal democracies, propaganda has been more covert. It operates through media, schooling, advertising, and cultural production. It does not so much crush opposition as marginalize it. Belief is not required, but the limits of acceptable discourse are established. It works less by compulsion than by consensus-building and emotional identification with prevailing values (Bakir et al., 2019).

1.2. Consent

By “consent” here, I do not mean an individual’s informed agreement or coerced compliance, but collective acceptance of a political narrative, policy decision, or ideological frame. In Manufacturing Consent, Chomsky and Herman argued that the mass media help create such acceptance. It is not just about reporting the facts but about presenting events in a way that serves elite interests and marginalizes opposing views. This is not centrally planned so much as structurally driven, through ownership, advertising, official sources, and ideological affinity with state power (Chomsky & Herman, 2002).

In contemporary society, consent is increasingly manufactured through gradual processes of normalization. People are not coerced into accepting visible falsehoods; rather, they are never presented with effective counter-narratives. Some events receive extensive media coverage; others receive none. Some voices are elevated while others are delegitimized or made to disappear. The appearance of open debate remains, but its scope is rigidly circumscribed.

Today, consent is not produced only by nation-states or media corporations. It is increasingly the product of global platforms, algorithmic filtering, and digital echo chambers. The levers of propaganda have become more sophisticated, more personalized, more powerful, and harder to see. This article argues that we are witnessing the emergence of a new transnational propaganda regime, one that demands renewed critical attention and theoretical revision.

2. Revisiting the Propaganda Model in a Globalized Context

The propaganda model that Noam Chomsky and Edward S. Herman proposed in Manufacturing Consent (1988) set out to expose the mechanisms by which the media in liberal democracies routinely filtered the flow of information. The media were not the check on power that the myth suggests. Instead, they largely aligned with the interests of state and business, acting as a pipeline for elite messaging. The propaganda model identified five distinct distortions or “filters” that mediated news content:

  1. media ownership,
  2. advertising as the primary income source,
  3. reliance on government and corporate press releases and other powerful sources,
  4. flak (negative responses to media statements),
  5. anti-communism as a governing political ideology.

This model was bold in 1988, but the world of 1988 is long gone. The development of globalized digital media, social media platforms, and transnational tech corporations has changed the landscape. The model remains a useful analytic lens, but it needs revising for the highly dispersed, personalized, and algorithmically filtered media ecosystem of today.

2.1. Filter 1 – Ownership: From Conglomerates to Tech Empires

In Herman and Chomsky’s original model, the main news media were owned by a few massive corporations, producing a structural, economic censorship of what news outlets could report. Today, ownership power has concentrated even further, into fewer and far more globally influential hands: tech titans such as Alphabet (Google), Meta (Facebook/Instagram), Apple, and Microsoft, among others. They own or influence not only vast digital ecosystems but also the pipes through which information is funneled.

Unlike traditional media companies, this new breed of tech firms doesn’t just curate news — they control what’s visible in the first place. They are de facto censors and editors, amplifying some messages while drowning out others, both through algorithms and content moderation policies. Their role is not so much ideological in the traditional sense as it is profit-motivated, favoring content that maximizes engagement, frequently at the cost of the truth or democratic debate.

2.2. Filter 2 – Advertising: Surveillance Capitalism and Algorithmic Incentives

Herman and Chomsky focused on the demand for advertising income and its impact on media behavior. Today, advertising is even more totalizing, operating within the framework of surveillance capitalism. Platforms such as Facebook and YouTube make money by vacuuming up vast amounts of information about their users and then selling targeted ads. The upshot is that users are provided with personalized information ecosystems that fit the contours of their established biases — a dynamic that produced what the internet activist Eli Pariser called the “filter bubble.”

This leads to a situation in which emotionally driven, polarizing content is promoted by the algorithm while nuanced and critical perspectives are penalized by it. The goal is to keep people hooked, not educated. Consent is assembled on a personalized production line, which may make propaganda more insidious than it has ever been.

2.3. Filter 3 – Sourcing: Government, Think Tanks, NGOs

Reporters have always relied on official sources — governments, companies, the military — for access to information. In the age of the internet, that filter has multiplied to include a host of PR companies, state-sponsored media, lobbyists, and international think tanks. Reporting on foreign wars, for example, typically relies on Western governments or NATO spokespeople, while oppositional views from countries such as Russia, China, or Iran are portrayed as propaganda, whether or not they are rooted in truth.

The Syria conflict, for instance, was frequently reported using sources favorable to U.S. or UK policy objectives, while opposing sources were labeled disinformation. This sourcing consensus reinforces dominant geopolitical alignments and marginalizes counter-narratives, allowing psychological operations to pass as journalism.

2.4. Filter 4 – Flak: Cancel Culture, Deplatforming, and Labeling

In the past, “flak” was a punitive response — lawsuits, complaints, or public outrage — that pressured media to stay within dominant narratives. In the digital era, flak takes many new forms: algorithmic deboosting, demonetization, shadow banning, and content flagging. Those who produce or consume content critical of U.S. foreign policy, Israeli militarism, or corporate capitalism risk having that content limited, demonetized, or barred from platforms.

Flak has also become more ideological: critics are painted as “conspiracy theorists,” “foreign agents,” or “extremists.” These labels are used not simply to suppress dissent but to discredit alternative knowledge systems. In the process, they narrow what can and cannot be said, much as flak did in the age of traditional media.

2.5. Filter 5 – Ideology: From Anti-Communism to Pro-Western Liberalism

In the Cold War era, the prevailing ideological sieve was anti-communism. Today that ideology has faded, replaced by a new dogma: pro-Western liberal internationalism. It is the belief in universal norms of free markets, human rights (applied selectively), liberal democracy, and global capitalism. Media that subscribe to this worldview are treated as normal; those that do not — whether from the Left, the Right, or the global South — are accused of authoritarianism, populism, or destabilization.

This ideological lens colors coverage of events such as the Venezuela crisis, the Hong Kong protests, or Russian elections. Stories that bolster Western policy are cast as pro-democracy, while counternarratives are relegated to the dustbin of state propaganda, even when they contain demonstrable truths.

We conclude that although the original propaganda model remains tenable, it must be updated to incorporate fundamental shifts brought about by globalization, the rise of digital media, and platform capitalism. Today, a few large tech companies control the flow of digital information along lines compatible with dominant political and economic doctrines. The result is a new architecture of consent production — one that is decentralized, increasingly automated, and far more difficult to see for what it is than the propaganda systems of the 20th century.

3. Consent Production in the Global World

The manufacture of consent in our contemporary world cannot be disentangled from the structural and technological shifts that have altered how knowledge is produced, disseminated, and consumed. Whereas Chomsky and Herman’s five-filter model described how media served elite interests in liberal democracies, consent today emerges through a more complex interplay of digital infrastructure, global tech monopolies, corporate lobbying, entertainment, and militarized communication strategies. This chapter explores how multinational corporations and digital platforms actively influence global consciousness, using Chomsky’s commentary as a reference point for contemporary (in this case, early 21st-century) conditions.

3.1. Platform Capitalism as a Consent Engine

Platform capitalism — economic systems organized around digital platforms that broker large amounts of data, user interactions, and information flows — has become key to the manufacture of consent today (Kunzru, 2019). Corporations such as Meta (Facebook, Instagram, WhatsApp), Alphabet (Google, YouTube), Amazon, Microsoft, and Apple not only host content; they build environments that track, shape, and sell attention. These companies make up the consent infrastructure of the global public sphere (Center for Humane Technology, 2021).

Take Alphabet, for example. Owner of the world’s largest video site and the search engine that dominates for more than a billion users, it is the behemoth gatekeeper of information online, able to shape who sees what, when they see it, and in what order. SEO, PageRank, and algorithmically driven recommendation engines do not work on neutral terms; they systematically privilege certain types of content, especially content that is commercially safe and politically uncontentious from the perspective of Western liberalism. Thus, what the public perceives as “popular” or “credible” content is often a consequence of invisible sorting based on financial and ideological incentives.

3.2. Examples from Manufacturing Consent

Chomsky and Herman dedicate a great deal of Manufacturing Consent to the media’s treatment of U.S. foreign policy and to what they call “worthy” and “unworthy” victims. The 1984 killing of the Polish priest Jerzy Popiełuszko by the Soviet-backed Polish regime received extensive coverage in U.S. media, for example, while the assassination of Archbishop Óscar Romero in U.S.-backed El Salvador was underreported despite being at least as politically significant. The disparity in coverage, they argued, stemmed from the way American interests dovetailed with the tale of Soviet oppression, while El Salvador’s right-wing junta was a United States ally.

The same phenomenon prevails in today’s interconnected global information economy, in forms amplified and accelerated by platforms. Media treatment of Ukrainian civilian casualties during the 2022 Russian invasion, for example, was instantaneous, emotional, and international: the Western press published vivid, repeated images of human suffering from Ukraine. Meanwhile, Yemeni civilians killed by the Saudi-led coalition, with support from U.S. and British arms, have consistently received far less coverage and far less empathy.

Chomsky’s “worthy vs. unworthy victims” paradigm now extends to the algorithmic visibility of some tragedies over others, and this has become normative. Events are not simply covered differently — they are ranked, circulated, and monetized differently. In this way, platform companies amplify the biases of media while obscuring them behind a veil of neutrality.

3.3. The Rise of Corporate-State Hybrids

Consent manufacturing is not just government influence on the media anymore. Corporate and state power have ever more converged, fused into hybrid entities that co-produce a global narrative. Think about the role played by Microsoft, Amazon Web Services, and Google Cloud in hosting the backend infrastructure for intelligence agencies, military, and police forces in the U.S. and its allies. These companies are not simply tech vendors; they are embedded in the state apparatus.

For example, Amazon’s hosting of classified information on its AWS platform for the CIA illustrates how the lines between private and public actors in information control have become increasingly blurry. Google’s brief but instructive participation in Project Maven — a Pentagon AI initiative — showed that tech giants play an increasingly critical role in military messaging and targeting (Konkel, 2024).

Such alliances show us that multinationals are not a passive reflection of hegemonic ideologies but are a determinant of them. The everyday tech platforms we use are tangled up with geopolitical imperatives, and the content they elevate can be shaped by these alliances.

3.4. Hollywood and the Cultural Apparatus

The entertainment industry, and Hollywood in particular, is another key component of the global consent-manufacturing machine. Through films, series, and newer streaming platforms like Netflix, American foreign-policy aims and assumptions are normalized around the world.

As Chomsky noted, a role of the media is to “mobilize the population for war, even against their populations.” This is accomplished through storytelling and emotional appeal as much as through news. In the digital age, it extends beyond journalism into culture. Consider films such as American Sniper or Zero Dark Thirty, which present U.S. military interventions in Iraq and Afghanistan as moral conflicts without deeper reflection on the legality of the wars or their consequences. Such films provide a cinematic rationale for violence, encouraging audiences across the globe to see the U.S. as a necessary global police force.

Even seemingly apolitical shows subtly reinforce dominant ideologies. The absence of any critique of capitalism, of elite lifestyles, or of the benevolent image of state power helps make existing power relations appear natural or fair.

4. Psychological Agents of Manufacturing Consent

As the production of consent migrates from the centralized world of legacy media to the internet and to the behavioral manipulations of surveillance capitalism, the psychological side of propaganda has moved to the foreground. Propaganda once relied on mass spectacle, patriotic imagery, or ideological slogans. Today it is more personal, covert, and parasocial, conveyed in a torrent of highly tailored content and influencer-suggested thought that runs below the register of conscious apprehension. The psychological agents of consent are no longer only state-controlled broadcasters or editorial boards; they are woven into the very fabric of platforms, the logic of algorithms, and the idiom of social media.

This chapter examines how digital media — particularly social platforms and influencers — instrumentalize psychological mechanisms to shape public perceptions and emotions and, ultimately, political and ideological consent. We discuss the cognitive impulses and emotional buttons such tools press, and how influencers, the trusted micro-propagandists of the age of platform capitalism, wield them.

4.1. Social Media and Behavioral Engineering

Social media are not neutral platforms for free expression that we use as we like — they are multibillion-dollar engineered systems for the creation and extinction of psychological convictions. In The Age of Surveillance Capitalism (2019), Shoshana Zuboff details how platforms like Facebook, YouTube, TikTok, and Instagram hoard unimaginable amounts of user data, which are then fed into predictive algorithms. These systems are designed to maximize engagement, often by amplifying outrage, fear, identity-based tribalism, and confirmation bias (Zuboff, 2019).

This behavioral programming substitutes emotion for reason, fertile ground for propaganda. Rather than providing a comprehensive ideological package, platforms serve users emotionally intense shards of content that nudge them toward particular worldviews. The effect is compounded by personalization: no two users get the same simulated “reality,” each version customized to its recipient’s psychological profile.

In this hyper-personalized world of content consumption, where attention is scarce and algorithms filter reality, the public sphere is fragmented into self-reinforcing echo chambers. In these bubbles, people see the same emotionally charged messages over and over again, creating a feeling of social proof and normalcy, key components of psychological persuasion.
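The feedback loop described above — an engagement-maximizing recommender coupled with users’ own confirmation bias — can be made concrete with a deliberately simple simulation. The sketch below is a purely illustrative toy model, not any platform’s actual algorithm: the topic names, weights, and click probabilities are all invented. Starting from a single click, the simulated feed narrows rapidly around the seed topic:

```python
import random
from collections import Counter

TOPICS = ["politics_a", "politics_b", "sports", "science"]

def recommend(click_history, n_items=10):
    """Rank topics by predicted engagement: each past click on a topic
    raises that topic's weight, so the feed drifts toward prior interests."""
    counts = Counter(click_history)
    weights = [1 + 5 * counts[t] for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=n_items)

def simulate(seed_topic, rounds=20):
    """One user: clicks the seed topic whenever it is shown (confirmation
    bias), other topics only 10% of the time; every click feeds back into
    the recommender's weights."""
    random.seed(42)  # deterministic for the example
    history = [seed_topic]
    for _ in range(rounds):
        for item in recommend(history):
            if item == seed_topic or random.random() < 0.1:
                history.append(item)
    return Counter(history)

exposure = simulate("politics_a")
share = exposure["politics_a"] / sum(exposure.values())
print(f"share of clicks on the seed topic: {share:.0%}")
```

Even with modest initial weights, the seed topic quickly dominates the user’s click history — a crude but recognizable filter bubble emerging from nothing more than engagement maximization plus selective attention.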

4.2. Cognitive Biases and Consent Manipulation

Digital propaganda preys on a number of implicit biases and mental shortcuts we use to navigate facts. Some of the key ones are:

  1. Confirmation Bias: Humans look for and accept evidence that supports their already formed opinions. Social media algorithms compound this by repeatedly presenting content that aligns with a user’s worldview.
  2. Availability Heuristic: We judge the relative importance and frequency of events based on the ease with which we can remember examples. When clips of isolated incidents go viral, they can create the impression that such behavior is widespread and prompt an overreaction.
  3. Bandwagon Effect: Observing that others believe in or support something makes it more likely that a person will also adopt that belief, particularly when indicators such as likes, retweets, or views communicate popularity.
  4. Emotional Priming: A lot of the content that these platforms and propagandists put out triggers people’s fear, anger, or empathy. Emotional priming diminishes critical thinking and enhances acceptance of the intended message.
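The bandwagon effect in particular lends itself to a simple illustration. The following sketch is a toy model with invented parameters (an illustration, not an empirical claim): each simulated user adopts a belief with a probability that rises with the number of visible prior adopters, so a seeded head start of artificial “likes” compounds into much wider adoption:

```python
import random

def run(population=1000, seed_likes=0, base_rate=0.05,
        social_weight=0.4, rng_seed=1):
    """Each user sees a like counter; adoption probability grows with
    the visible share of apparent supporters (bandwagon effect)."""
    rng = random.Random(rng_seed)
    likes = seed_likes          # e.g. a bot-inflated starting count
    adopted = 0
    for _ in range(population):
        visible_popularity = likes / population
        p = min(1.0, base_rate + social_weight * visible_popularity)
        if rng.random() < p:
            adopted += 1
            likes += 1          # each adoption is publicly visible
    return adopted

organic = run(seed_likes=0)
seeded = run(seed_likes=200)    # 200 artificial "likes" injected up front
print(f"adopters without seeding: {organic}, with seeding: {seeded}")
```

Because every adoption raises the probability of the next one, even a modest injection of fake popularity signals shifts the final outcome substantially — the mechanism propagandists exploit when they buy engagement.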

The object of modern propaganda is not to argue anyone into a new belief but to condition the emotional register through which existing narratives are received. Such strategies derail reasoned debate and make people more receptive to following the path of least resistance that someone else has laid out for them.

4.3. TikTok, Instagram, and the War for Attention

Platforms such as TikTok illustrate how the psychological machinery behind propaganda has become real-time, data-driven, and merged with the entertainment economy. TikTok’s algorithm quickly analyzes what a user responds to and feeds it back: an endless, real-time stream of highly optimized content matched to their micro-emotional responses. The result is what researcher Tobias Rose-Stockwell has called a “weaponized dopamine economy” (Rose-Stockwell, 2023).

Due to the platform’s viral logic, certain ideologies or memes can quickly flood public discourse, especially among young people. Consider the 2023 Israel-Gaza conflict, during which TikTok users were served different storylines depending on the first accounts they followed. Some users were drawn into pro-Israel, Zionist content loops; others were funneled into pro-Palestinian activist content. Each cohort was made to feel that its perspective was not just the majority view but the ethically clear one, ossifying positions and destroying nuance.

In authoritarian states, the potential of platforms like TikTok is harnessed all the more thoroughly. TikTok’s Chinese counterpart, Douyin, censors political content and surfaces educational or patriotic videos, while the global version is more chaotic and less monitored — showing that platform functionality itself can be ideologically customized for different geographies and objectives.

5. Public Sphere

The idea of the public sphere formulated by Jürgen Habermas refers to a sphere of debate in which citizens gather to talk about social problems, deliberate about matters of common concern, and influence political decisions through reasoned discussion. Normatively, it is a sphere of communicative reason and democratic participation. But in reality — and this is especially true now — the public sphere is very much influenced by power, ideology, and propaganda.

The globalized and digitized public sphere of today is fractured, unequal, and highly manipulable. It is no longer national or even place-based; it unfolds in digital space, in international media, in algorithmically constructed micro-communities. On this new terrain, both state and non-state actors use propaganda to influence public opinion, rationalize military interventions abroad, silence dissent, and build hegemonic narratives. This chapter considers the use of propaganda by significant state actors — Russia, China, the United States, and Turkey — to influence the global public sphere during well-known conflicts.

5.1. Russia: Ukraine and the Disinformation Ecosystem

Russia’s propaganda efforts in and around the Ukraine war may be the most widely discussed example of information warfare in the 21st century. The Kremlin’s playbook has advanced from Soviet-style censorship to a sophisticated disinformation ecosystem built on state-owned media, troll farms, bots, and hybrid warfare tactics.

RT (Russia Today) and Sputnik News generate an alternative to the version presented by Western media, telling a story of Russia as a victim of NATO encroachment and Ukraine as a fascist, Western puppet regime. These outlets do not simply transmit lies; they mix fact and fiction and amplify conspiracies to create what scholars have called “cognitive confusion.” The point is not so much to persuade as to erode the credibility of all narratives — a tactic known as “firehosing.”

On top of this, Russia’s Internet Research Agency (IRA) ran a vigorous campaign generating divisive social-media content designed for Western consumption. During the 2016 U.S. presidential election and the 2022 Ukraine war, Russian-linked accounts spread anti-establishment views on both the left and the right, widening polarization and undermining confidence in Western media and democratic institutions. This propaganda apparatus is adaptive, decentralized, and designed to exploit the openness of the digital public sphere.

5.2. China: Censorship and Global Image Management

Where Russia disrupts and sows disinformation, China’s model of propaganda rests on image control and internal coherence. Censorship and surveillance by the Chinese Communist Party (CCP) are highly effective in controlling the domestic public sphere. Tiananmen Square, Taiwanese independence, and the fate of the Uyghur Muslims are either censored or framed through a nationalist lens.

Externally, China invests heavily in global image management. TV channels such as CGTN (China Global Television Network) and outlets like China Daily promote China’s growth model, paint the CCP as a source of global stability, and attack the West’s double standards on human rights. During the Uyghur crisis, Chinese state media worked to recast the internment camps as “vocational training centers,” releasing documentaries and testimonials that portrayed the facilities as benign educational enterprises.

China’s officials and diplomats have also turned increasingly toward “wolf warrior” diplomacy, firing off truculent rhetoric on platforms like Twitter (now X) as they push back against Western criticism. This new assertiveness is an attempt to wrest narrative authority in the global public sphere away from Western hegemony over human rights discourse and international legitimacy.

China is also a major funder of Confucius Institutes around the world, where Chinese culture and language are promoted, but critical academic discussion on political issues is often restricted. It shows how soft power institutions can be long-term propaganda devices, ingraining authoritarian values under the guise of cultural exchange.

5.3. United States: Kosovo, Vietnam, and the Narrative of Humanitarianism

Even the United States, a nation that presents itself as the champion of free speech and democracy, has an extensive record of propaganda, particularly abroad. From the Cold War to the Global War on Terror, U.S. foreign policy has long framed its interventions not as geopolitical maneuvers but as moral imperatives.

During the Vietnam War, U.S. authorities relied on propaganda distributed through embedded journalists, staged press briefings, and the cover-up of crimes such as the My Lai Massacre. Public support was preserved through appeals to anticommunism and national security — until the Pentagon Papers, leaked by Daniel Ellsberg, revealed the scale of official deception. Vietnam became a test of propaganda’s limits as public confidence eroded in the face of widening gaps between rhetoric and fact (Antonenko, 1999; Harland, 2010).

The U.S. and NATO intervened in the Kosovo War in 1999 on humanitarian grounds, to prevent Serbian ethnic cleansing. But as real as the atrocities were, the framing of the conflict as a simple moral crusade against evil served to obscure the geopolitical calculations that underpinned it. Chomsky noted, for example, that the media dutifully repeated the official story and minimized civilian casualties from NATO bombing — a demonstration of how humanitarian propaganda is deployed selectively (Jashari, 2022; Kay, 2000).

American media environments perpetually illustrate what Chomsky and Herman termed “worthy victims”—those whose plight undergirds U.S. policy—and “unworthy victims,” who are shunned or dismissed. Civilian deaths in Palestine, Yemen, or Somalia hardly get the kind of focused attention that deaths in Ukraine or Israel do. This asymmetry isn’t unintentional; it mirrors the strategic imperatives woven into the production of U.S. information.

5.4. Turkey: Nationalism and the Syrian Civil War

Turkey offers a fascinating case of how a rising power deploys both nationalist propaganda and manipulation of the digital sphere to forge public opinion. Under President Recep Tayyip Erdoğan and the AKP (JDP), Turkey has developed a domestic media environment in which critical voices are silenced and pro-government discourse dominates (Kaya, 2023).

As the war in Syria unfolded, Turkish state media consistently portrayed the conflict as a security threat posed by Kurdish militias (YPG/PKK) and as a mission to protect Syrian ‘brothers and sisters’ from the grasp of Assad’s regime. The government whipped up public support for military incursions into northern Syria (Operation Euphrates Shield and Operation Peace Spring) with a blend of religious discourse, nationalist fervor, and anti-Western posturing.

Pro-Erdoğan outlets like A Haber and TRT World have played a large role in constructing a narrative of national unity and existential threat, often depicting refugees as pawns of Western inaction or of the terrorist group the PKK. At the same time, oppositional voices — journalists, academics, opposition politicians — who expressed skepticism or solidarity with Kurdish autonomy were branded terrorists or traitors.

Turkey has also engaged in online propaganda campaigns, employing troll armies and bot accounts to disseminate government narratives and attack critics. These online campaigns dovetail with traditional state propaganda, showing how the public sphere can be fabricated both by censoring dissent and by saturating discourse with hyperbolic messaging.

5.5. Fragmentation and the Crisis of the Global Public Sphere

The result of all these propaganda methods is a fractured global public sphere. Instead of a shared space for information and rational debate, we have enclaves that don’t communicate with one another. Russian, Chinese, American, and Turkish media produce fundamentally different realities for their audiences, and each is underpinned by different ideological aims and claims to legitimacy.

Furthermore, the emergence of nation-branding propaganda models undermines the potential for any universal standard of truth or justice. Truth is subordinated to power, and public opinion becomes a battleground. Consent is produced here not only by persuasion, but also by isolation, confusion, and identity engineering.

The digital public sphere, so long promised as the new commons of human freedom, has proved deeply untrustworthy, swamped with disinformation and deceit. The “battle over meaning” does not create better-informed citizens; it breeds more compliant subjects, more divided publics, and ever larger bodies of passive, ideologically saturated consumers.

6. Conclusion

The modern era has not banished propaganda; it has perfected it. In a world of transnational information flows, platform capitalism, algorithmic personalization, and psychographic targeting, propaganda is no longer a mere appendage of the state; it is woven into the infrastructure of everyday life. The universalized online public sphere, at first described as a space of democracy and mutual comprehension, has turned into a narrative battlefield, orchestrated by states, corporations, and micro-influencers who vie not for truth per se, but for attention, loyalty, and emotional superiority.

This article opened with a re-reading of Chomsky and Herman’s propaganda model, an iconic work in media studies that identifies the systemic filters through which media content passes in order to serve elite interests. Their model remains relevant, but it must be revised for a world in which traditional media sit alongside digital platforms driven by the logics of surveillance, data monetization, and behavioural nudging. Today’s media consumers are not only passive recipients of propaganda but also its producers, amplifiers, and data points, feeding a system that continually learns how to manipulate their beliefs and behavior more effectively.

We have observed that the manufacture of consent is now the result of a complex interplay of different processes: the algorithms of the platforms that prioritise content that triggers emotions; the influencers who act as ‘informal propagandists’; the cognitive biases that render users susceptible to ideological manipulation; and corporate-state hybrids that filter global conflicts through selective storytelling. These intermediaries not only report but also interpret reality. And, in the process, they generate public authorization for policies, ideologies, and actions that otherwise would inspire resistance or dialogue.

The case studies, from the Kremlin’s disinformation echo chamber to China’s global image management, the U.S. humanitarian-interventionist narratives, and the Turkish nationalist media assemblage, show how states tailor propaganda strategies to both domestic and foreign audiences in order to deflect criticism, build trust, and generate consensus. They also reveal that truth itself has been transformed into a strategic resource, ripe for manipulation, dissection, and weaponization.

This analysis depicts a world in which the lines between information and manipulation grow more indistinct by the day. The digital public sphere, whose democratic potential was once hailed, is now a consent factory of unprecedented scale. Instead of increasing democratic capacity, it has fueled polarization, misinformation, and public exhaustion, conditions that favor authoritarianism and elite control.

But this is not a call to cynicism or resignation. On the contrary, renewing democracy requires that we understand how propaganda functions in today’s world. It demands a new literacy: not only in reading media critically, but in scrutinizing the structures and incentives that shape our information environment. And it calls for rethinking how an open public sphere can be not only preserved but enlarged through education, regulation, and technology.

As we move deeper into the 21st century, the question is no longer whether propaganda exists. It is this: Whose propaganda are we consuming? How do we recognize it? And how do we push back against its more manipulative varieties without sliding into nihilism or disconnection?

The manufacture of consent has undergone a major upgrade: it has become more diffuse, more participatory, and more dangerous in its subtlety. Understanding this transition is the first step toward restoring agency within a system designed to obscure it.

References:

Note: Manufacturing Consent (2002 edition) by Chomsky and Herman has been used several times as the main reference. Exact page numbers could not be added for each quotation.

Antonenko, O. (1999). Russia, NATO and European security after Kosovo. Survival. https://doi.org/10.1080/713660137

Bakir, V., Herring, E., Miller, D., & Robinson, P. (2019). Organized Persuasive Communication: A new conceptual framework for research on public relations, propaganda and promotional culture. Critical Sociology, 45(3), 311–328. https://doi.org/10.1177/0896920518764586

Center for Humane Technology. (2021). The Attention Economy. https://www.humanetech.com/youth/the-attention-economy

Chomsky, N., & Herman, E. S. (2002). Manufacturing Consent: The Political Economy of the Mass Media. https://mitpressbookstore.mit.edu/book/9780375714498

Bernays, E. L. (1928). Propaganda. http://archive.org/details/in.ernet.dli.2015.275553

Harland, D. (2010). Kosovo and the UN. Survival. https://doi.org/10.1080/00396338.2010.522097

Ingram, H. (2016). A Brief History of Propaganda During Conflict. International Centre for Counter-Terrorism – ICCT. https://icct.nl/publication/brief-history-propaganda-during-conflict

Jashari, M. (2022). The Position of Russian Diplomacy Toward The Kosovo Issue 1998-1999. Vakanüvis – Uluslararası Tarih Araştırmaları Dergisi. https://doi.org/10.24186/vakanuvis.1205318

Kay, S. (2000). After Kosovo: NATO’s Credibility Dilemma. Security Dialogue. https://doi.org/10.1177/0967010600031001006

Kaya, A. (2023, October 30). The World’s Leading Refugee Host, Turkey Has a Complex Migration History. Migrationpolicy.Org. https://www.migrationpolicy.org/article/turkey-migration-history

Konkel, F. (2024). A decade-old risk led to ‘phenomenal partnership’ between AWS and the intel community—Nextgov/FCW. https://www.nextgov.com/acquisition/2024/12/decades-old-risk-led-phenomenal-partnership-between-aws-and-intel-community/401649/

Kunzru, H. (2021). Attention. Harper’s Magazine. https://harpers.org/archive/2021/07/attention-hari-kunzru/

Rose-Stockwell, T. (with Haidt, J.). (2023). Outrage machine: How tech amplifies discontent, disrupts democracy–and what we can do about it (First edition.). Legacy Lit, an imprint of Hachette Books.

Snow, N., & Taylor, P. M. (2006). The Revival of the Propaganda State. https://journals.sagepub.com/doi/10.1177/1748048506068718

Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. https://www.hbs.edu/faculty/Pages/item.aspx?num=56791


[1] The 2002 edition of Manufacturing Consent has mostly been used for this study; however, the first edition was originally published in 1988. The referencing may be confusing in this instance.
