What does propaganda have to do with academic research and teaching? Citizens can reasonably expect the academic community to generate scholarly understanding and public awareness of what propaganda is and how propaganda operates. Academics should certainly aim to ensure their own research and teaching are not influenced by it. But how well does the academic community actually deal with propaganda? Addressing this question means considering both how propaganda should be dealt with and how academics actually deal with it.

Given the distinctive social role of academics, there are five general responsibilities that it is reasonable to expect them to fulfil in relation to propaganda. The most basic is to engage in research and teaching with methods of developing and communicating knowledge quite different from those that constitute propaganda. Whereas propaganda involves strategically communicating information selected on the basis of a prior agenda, the methods of science and scholarship involve collaborative deliberation and openness to new discoveries. A second general responsibility directly follows: academics should endeavour not to succumb to the influence of propaganda in their teaching and research. While the risk may seem negligible in some fields of inquiry, on topics of public controversy academics should be alert to the possibility of having unwittingly absorbed assumptions and framings of propagandistic origin. A third responsibility is to avoid reproducing propaganda, even inadvertently, in teaching and written works. A fourth, more imperative still, is the responsibility not to engage in the active (re)production of propaganda.

The four responsibilities just referred to do come with certain caveats: not all forms of propaganda are necessarily pernicious; perhaps not all instances of it will ever be fully identified; and academics are not necessarily less vulnerable to being deceived by propaganda than are their fellow citizens. But while there is scope for discussion of caveats, the important point is to be clear about the responsibilities themselves. While recognizing a place for reasoned critiques of particular studies of propaganda, we also need to be aware of one further general responsibility, namely, to ensure that the critical study of propaganda is not actively obstructed or attacked. This distinct fifth responsibility unfortunately needs mentioning because, as we will see, the active discouragement of critical study of certain cases of propaganda comes not only from representatives of vested interests outside academia but sometimes even from people who hold positions within it.

The purpose of this article is to heighten awareness of why a conscious commitment to fulfilling those responsibilities is important for the academic community and the wider society it serves. Illustrations will generally be drawn from cases I happen to have some familiarity with, but the aim is to discern certain patterns of propaganda at work that others will be able to trace across a wider range of topics of public interest.

1. Propaganda as a subject matter of academic study

The first thing to be aware of is that effective propaganda exercises its influence largely unnoticed. Any of us, including those who happen to be academics, may not necessarily recognize it in practice, because it can be designed to work by eliciting psychological and emotional responses that bypass the more rational faculties involved in conscious deliberations such as are honed through academic study. Propaganda involves forms of communication that can be described as strategic because their purpose is to promote certain beliefs regardless of how well these are supported by evidence or dispassionate reasoning. Contemporary methods of strategic communications – often branded as ‘public relations’ – have been honed in the business world, but they are also deployed by governments and other state and non-state actors, including as a ‘soft’, or ‘non-kinetic’, aspect of military operations. They can involve complex and extensive organisation.

Strategic communication contrasts quite starkly with a model of communicative action describable as deliberative, which is what scholars and scientists aim to engage in. Academic methods and procedures, which are intended to make reliable knowledge and understanding possible, involve promoting the values of careful and thorough investigation of evidence combined with intellectual integrity in the application of reason and argument. Thus the deliberative communication of ideal academic inquiry stands opposed to the aims and methods of strategic communication.

Insofar as strategic communication is the antithesis of what academic inquiry ideally involves, there should in principle be no place for it in the practices of scholarship and science other than as an object of study. Study of it should also be suitably critical. For although there is much to be learned from the writings of practitioners – such as the seminal study, Propaganda, by Edward Bernays in 1928 or the more recent NATO Handbook of Strategic Communication – academic study of its use also requires the application of epistemological and ethical judgement in assessing the status of its truth claims and its wider socio-political impacts. The need for critical attention is accentuated by the fact – emphasised by practitioners themselves – that the most effective propaganda is not readily noticed as such. Yet, despite some notable contributions (from e.g. Bakir et al., Herman and Chomsky, Miller and Dinan), critical scholarship regarding propaganda remains a marginal activity within academia. As for the idea that propaganda could be a problem in academia, this is little discussed.

Having set out a clear contrast between the deliberative communication of academic work and the strategic communication of propaganda operations, we must acknowledge that how things ideally should be is not always how they are. In reality, the influence of strategic communications can affect academic research in a variety of ways.

2. Passive absorption of assumptions from propaganda

Although academics characteristically have epistemological capabilities that can help them resist propagandistic claims, some potential vulnerabilities to deception are attendant on those epistemological advantages.

For one thing, because academic research develops in depth as a result of specialisation, researchers can sometimes become ‘siloed’ in their narrow field, and some kinds of cross-disciplinary expertise needed for the identification of propaganda may simply not be well-developed by any particular grouping of academics.

A second disadvantage is attendant on the virtue of academic research in creating authoritative bodies of knowledge and procedures for validating them. For the academic requirement of duly acknowledging the accumulated knowledge on a given topic of inquiry, before presuming to innovate on it, can sometimes trap researchers in ways of thinking that are not well-suited to a discerning analysis of novel problems. They may find themselves at an epistemic disadvantage – i.e. liable to deception – in relation to strategic communications, particularly innovative forms of these, since, unlike academic communications, these are not constrained to observe principles of what we might call ‘epistemic due process’.

A third disadvantage attends academics’ professional freedom to pursue the ‘life of the mind’ in their highly focused endeavours, for this can mean being to some extent cloistered from the wider gamut of human affairs. Certain kinds of naivety can prevail. In the context of propaganda, an illustration would be a degree of reliance on the accuracy and good faith of reports published in the established press, with the media being considered ‘a properly constituted epistemic authority’ and, ‘in the main, trustworthy’. Of course, by no means all academics are naïve or cloistered, and not all who cite non-academic publications in their work do so uncritically, but it is not hard to find examples of academic writings influenced by assumptions generated through strategic communications that are relayed through mainstream or corporate media. In fact, this can even happen when a researcher’s actual aim is to diagnose the propaganda of a disinformation campaign, as the following case illustrates.

In a 2017 Guardian article, Olivia Solon presented what she called ‘a neat case study in the prevailing information wars’. She identified what she claimed was a Russia-backed disinformation campaign deploying a ‘network of anti-imperialist activists, conspiracy theorists and trolls’ whose purpose was ‘to discredit the White Helmets rescue workers in Syria’. The article has since been cited in dozens of academic discussions. What many of these have not noted, however, is the fact that regardless of how reliable the information promoted by the White Helmets and supporters may be, it was conveyed as part of a demonstrable propaganda campaign: the White Helmets were founded under the auspices of a UK-funded strategic communications operation; they were funded by Western governments, including the UK, which expressly acknowledged the value of their testimony in providing confidence to its statements made in condemnation of Russia. Furthermore, although Solon alleges that criticism of the White Helmets stems from a Russian disinformation campaign, the earliest sceptical publications actually came from independent journalists in Canada (Cory Morningstar) and the US (Rick Sterling). Solon ignores these and also dismisses the influential work of independent journalist Vanessa Beeley, on the advice of James Sadri, Director of the organisation responsible for the White Helmets’ public relations campaign in the West. All of the other sources that Solon cites in support of her argument also have traceable connections to NATO strategic communications operations. Neither she nor The Guardian would respond to questions about the article, which was closed to readers’ comments.

So an academic who has not independently researched the matter, but cites claims from Solon’s article as established findings, can be said to have passively absorbed propaganda. They may do this because the focus of their own work is on a different subject, and they mean simply to offer an indicative reference standing for a more generic phenomenon (e.g. Bunce 2019, Clay 2021). They may also appeal to the general defence that we all have to take some things on trust, for reasons Neil Levy has examined. Yet it is still objectively problematic, since even offhand references do help sediment propaganda claims in the academic literature.

3. Uncritical reproduction of propaganda claims

More problematic, however, is when a researcher does not merely cite a propaganda claim inadvertently when writing about something else but actively reproduces one by taking it as a substantive premise of their own work. With regard to the Solon article, for example, we find several academics have advanced their own research contributions on the premise of a disinformation campaign with the features and protagonists Solon claimed (e.g., Brandt 2021; Cosentino and Alikasifoglu 2019; Freeman 2019; Hernandez et al 2020; Horawalavithana et al 2020; Lester 2018; Levinger 2018; Martin and Shapiro 2019; Merlan 2019; O’Shaughnessy 2020; Pacheco et al 2020; Starbird et al 2019; Vilmer et al 2018; Wilson and Starbird 2020).

More problematically still, some academics have uncritically reproduced specific claims about important matters of fact – even serious crimes – that have been recorded in the literature on the basis of propaganda rather than as a result of good faith investigation.

An example concerns an event in Douma, Syria, in 2018, where 43 civilians were found dead. The Western media reproduced the claim that the deaths were a result of a chemical attack by the Syrian government – a claim cited as justification for the US, UK, and France firing missiles into Syria a week later. Yet the evidence appealed to in the allegations was found by inspectors from the Organisation for the Prohibition of Chemical Weapons (OPCW) not to support them. However, the OPCW’s management excluded the inconvenient evidence from their official report, which was then seized upon by the Western powers as confirmation of their claimed justification for their missile attack. Whatever the truth of the matter, any academic researcher would know, especially since the inspectors’ original report and documented misgivings came to light, to be cautious about accepting the official version and would certainly not uncritically reproduce it. Thus, several authors do treat the matter in an academically responsible fashion (e.g. Al-Kassimi 2021; Beal 2020; de Beer and Tladi 2019; de Lint 2021; Gray 2019; Kleczkowska 2020; Olsen 2019; Portela and Moret 2020; Tomić 2020; van der Pijl 2020; Yue and Zhu 2020), and some are quite thoroughly critical (see in particular publications by members and associates of the Working Group on Syria, Propaganda and Media). Yet we find a number of articles in academic publications that do insufficient epistemic diligence. This might be because the focus of discussion is elsewhere and reference to the event simply reproduces the assumption that the official Western version of it is authoritative (e.g. Ekzayez and Sabouni 2020; Orchard 2020; Watkin 2020). Some authors are one-sided in selecting sources, but without necessarily making specific controversial claims (e.g. Notte 2020; Van Schaack 2020). Some do explicitly uncritically affirm the West’s version (e.g. Anthony 2020, Mitton 2019, Newlee 2020), or confusedly do so (Reynolds 2020). Some who follow these matters closely are aware of critical questions yet ignore them (e.g. Koblentz 2019). Some choose even to disparage the critical questions and those asking them, as is the case with authors from the Bellingcat organisation (Fiorella et al 2021).

A more general kind of problem is that whole research programmes can sometimes become skewed in favouring one view of the world over another. Indeed, ideologies are sometimes nurtured within academia that can have significant propaganda value. For instance, Inderjeet Parmar suggests that some social sciences, including the discipline of International Relations (IR), developed as ‘products of Anglo-American early twentieth-century hegemonic elite knowledge networks aimed at securing elite and imperial-state interests’. Relatedly, critics of the way Economics is taught in universities complain that the dominant neoclassical paradigm enjoys a virtual monopoly from which it promotes ‘free-market propaganda’. Indeed, Peter Söderbaum has lamented that universities’ economics departments have become ‘political propaganda centers not very different from the think tanks that we see these days in the USA and Europe.’

When the activities of a university are not clearly distinguishable from those of a think tank, there is a heightened risk of not only reproducing but actually producing propaganda.

4. Production of propaganda

Although the rigours of academic procedures provide reasonable safeguards against the production of propaganda within universities, more insidious opportunities for it arise in what we might call the ‘para-academic’ sphere. Here the reputation of universities is in one way or another leveraged to give credibility to communications that have not been subjected to the rigours of good faith peer review and would not necessarily have survived them. This sphere notably includes think tanks and NGOs with campaigning agendas and influential backers. When such organisations promote research dedicated to the pursuit of specific objectives, the publications they authorize will not necessarily disseminate the best available knowledge on a subject but may instead present claims that fit with their particular agenda. In this respect, their approach differs decisively from that of independent academic working groups, and the same is true of their preparedness to suppress knowledge found inconvenient for their purposes. (Strategic omission of information is itself an important part of propaganda.) Yet such organisations might employ academically qualified researchers – who may also hold or have held university positions – and thus lay claim to academic credibility in virtue of those credentials.

The blurring of boundaries between universities and think tanks can undermine the academic community’s fulfilment of responsibilities regarding propaganda in several ways. As well as leveraging the academic reputations of individual researchers in the service of special interests, para-academic activities can also involve leveraging the reputation of a university itself. For instance, individuals whose principal employment is in journalism, or at a think tank or NGO, may take visiting or honorary fellowships at universities and enjoy the reputational benefit of the association without necessarily making a commensurate academic contribution. Or they may publish working papers or blogs under a university’s banner whereby a veneer of credibility is lent to material that independent academic reviewers might have regarded as partisan polemics. It is also possible for businesses to leverage university connections so as to imply academic respectability for activities that might be regarded as propagandistic. An example is the teaming up of a business ‘based at’ Goldsmiths, University of London, with Bellingcat to produce a reconstruction of the Douma ‘chemical attack’ that supports OPCW management’s official version against that of the OPCW’s actual inspectors. This involved ‘modelling’ of a situation that failed to accord with the actual records and measurements gathered by inspectors on the ground, or even with visual evidence incidentally broadcast by the BBC (Hayward 2019a; Watson 2020a, 2020b). Yet the piece, published in the New York Times, has been approvingly cited in academic literature.

The use of modelling in place of real-world evidence is more generally a versatile tool for propaganda. This was illustrated by the UK’s response to Covid-19, which relied on modelling by Neil Ferguson, the Imperial College professor with a track record of over-estimating the threat to life posed by new pathogens. Simon Ruda, one of the government’s leading behavioural scientists involved in generating public fear on the basis of Ferguson’s projections, conceded in retrospect that there had been an overemphasis on modelling and data that was ‘propagandistic’. Ruda had come to regret how the ‘nudges’ instigated by the behavioural scientists ‘inadvertently sanctioned state propaganda’. This was ‘unethical and undemocratic’ according to a group of healthcare professionals led by Gary Sidley.

In general, a degree of intellectual humility would be an appropriate element of academic diligence on the part of the media-promoted experts. It would have been helpful, for instance, in the case of a high-profile academic claiming on a BBC programme shown in UK schools that ‘Covid-19 vaccines are 100% safe, children should get the vaccine to protect their parents, and the benefits to children outweigh any risks.’ According to a complaint made on behalf of independent medics, ‘such a simplified and biased message’ was ‘deeply irresponsible’ and ‘amounts to propaganda’. Even assuming academics who promote one side of a debate do so in good faith, the point is that scholarship and science develop by means of debate and disagreement, which does not happen if only one side of an argument is heard.

This issue is all the more significant with matters unavoidably attended by a degree of secrecy, as when they concern national security or intelligence. Given that this is the situation for some of the most momentous decisions that elected representatives must make – such as going to war – there is a need for scrupulous alertness to the possibility of being swayed by propaganda. This was made very clear in the wake of the 2003 Iraq war.

Nevertheless, some academics appear to have been drawn into propaganda operations that serve to heighten geopolitical tensions. For instance, we are aware, thanks to leaked documents, of the so-called Integrity Initiative, supported by the Institute for Statecraft, which has involved academics as well as journalists in clusters engaged in covert operations to present a one-sided view of relations with Russia (Hayward 2018, Klarenberg 2019, King and Miller 2020). Nor is the problem simply with covertly directed activities. There is a university research centre overtly funded by UK security and intelligence agencies whose publications ‘are subject to review by the Security and Intelligence Agencies’. This means their readers cannot be sure there are no ‘key omissions in data, analysis or argumentation’. Such concerns are intensified when researchers find themselves obliged to sign the Official Secrets Act. This, as Massoumi and colleagues observe, ‘opens up the possibility for secret or covert research’, which ‘cannot be properly tested by others since none of the data from the research is available.’

Another use of social science for potentially propagandistic ends is in manipulating online discourse and activism to generate outcomes congenial to its sponsors. The tactics deployed by a formerly secret British propaganda unit called the Joint Threat Research Intelligence Group (JTRIG) aimed to ‘“discredit”, promote “distrust,” “dissuade,” “deceive,” “disrupt,” “delay,” “deny,” “denigrate/degrade,” and “deter”’. Although criticised by other psychologists, a psychologist involved in this work defended it on the speculative grounds that the harm it would do might be outweighed by harms it could help avert. Such tactics have received academic support also from other quarters. The case for governments intervening in citizens’ conversations – through ‘cognitive infiltration’ – has not only been approved but even positively advocated by certain academics with government connections, notably the US legal scholar and government advisor Cass Sunstein.

Sunstein has specifically developed such an argument in response to what he claims is the danger of conspiracy theories. Critics of this view, such as Kurtis Hagen and myself, highlight how the term ‘conspiracy theory’ is deployed as a slur to discourage critical discussion of public malfeasance by implicitly branding critics as delusional. Labelling certain concerns as ‘conspiracy theories’ effectively makes discussion of them taboo, which is the antithesis of a scientific or scholarly approach and can allow propaganda a free pass. Arguably, a greater threat to democracy than conspiracy theories is the pre-emption of dissenting views and the suppression of free speech this involves.

Certainly, the logic of Sunstein’s approach tends to undermine the possibility of a critical study of propaganda. That is why, I argue, there is also a responsibility of academics to defend such study against its attackers, within and beyond academia.

5. Attacks on scholars of propaganda

Mutual criticism is at the heart of academic activity and crucial to its advancement. In the rigorous testing of ideas, new insights emerge and old illusions get laid to rest. Exchanges can be robust, but good faith criticisms of an idea are not the same as attempts to discredit those who advance it. Yet when one side of a controversy is actively promoted by the dominant media, those who articulate principled concerns about it may come under personal attack.

I learned this first hand doing epistemic diligence on media reports relating to the war in Syria. On one early occasion, academic members of the Working Group on Syria, Propaganda and Media (WGSPM), including myself, were subjected to an onslaught of smears in The Times newspaper, on both front and inner pages. The attacks did not mention anything we had published, but aimed simply to undermine our reputations. These attacks were lent a veneer of apparent academic support by citing opinions of certain individuals with academic credentials known for a willingness to engage in ad hominem abuse in defence of official narratives.

This propaganda tactic has been conspicuous in Covid-19 communications, where highly respected epidemiologists, doctors and other relevantly qualified scientists have been vilified for articulating views contrary to those being promoted in the mainstream media (Broudy and Hoop 2021; Cáceres 2022; Hughes 2021). One instance was the campaign against the concerned scientists who issued the Great Barrington Declaration. They had argued, ‘in view of a thousand-fold difference in COVID mortality risk between the old and the young’, for a policy of ‘focussed protection’. But the media were ‘working 24/7 to say that lockdowns were the only option, so anyone against them was a “Covid denier.” It was brutal.’ We now know from leaked emails that Dr Anthony Fauci, the US president’s Chief Medical Advisor, responded to a colleague’s call for ‘a quick and devastating published take down’ of the dissenters’ premises by pointing to an available example: an article by Gregg Gonsalves, an academic at Yale. This was not so much a reasoned critique as the manifestation of an approach the author summed up in a tweet: ‘This f*****g Great Barrington Declaration is like a bad rash that won’t go away’.

Attacks on academics by academics generally take the form of op-eds, letters or tweets – forms of communication not subject to peer review. Smears can gain considerable public traction without peer scrutiny and without the targets having any meaningful right of reply. Academics subjected to smears can even find that their own universities are swayed by them. There have been disturbing cases of universities disciplining their own academics on the basis of externally-driven smear campaigns.

Campaigns can also target publishers, who are pressured to retract items that go against official claims. For instance, when David Hughes published an article on the silence of the IR discipline about unresolved controversies concerning events of 9/11, this was the subject of extraordinary vitriol on the part of certain other academics. Fortunately, the editor and publisher held firm in this case, but other publishers have yielded to pressures. A striking example came to light as I was writing this article. The publisher Edward Elgar announced it was removing a chapter by Piers Robinson from its recently published Research Handbook on Political Propaganda. Since the chapter in question documents campaigns to silence academics who present inconvenient truths, this occurrence is a clear case of Quod Erat Demonstrandum! [Happily, the text of Robinson’s chapter, together with notes on the story of its removal, is now freely available.]

A particularly sickening instrumentalization of universities in the service of propaganda involves the coaxing of students with journalistic ambitions to write attacks on academics – regardless of whether they have even had any dealings with the academic targeted. An egregious example of this resulted in David Miller’s recent sacking by Bristol University. That case involved a particularly sustained and coordinated campaign, but I am personally aware of other, lower level, attacks, including several on myself. One student at Sheffield collaborated with HuffPost senior editor Chris York – himself author of more than a dozen attacks on WGSPM – in attacking Piers Robinson, and then attempted to do the same to me. Fortunately, that attack was aborted when a student at my university, contacted for negative comment, instead alerted me. Then another student journalist approached me at the instigation of producers working for the BBC on its Mayday podcast series – itself the subject of a rare rebuke from the BBC’s own complaints unit – but, investigating the story independently, opted to decline the producers’ suggestion. A less principled student, however, did publish an attack on me for the Murdoch publication The Tab – without contacting me or verifying its defamatory accusations. Notwithstanding this lack of journalistic ethics, her attack was praised and amplified on Twitter by Oliver Kamm of The Times, to whom she replied ‘Thank you so much Oliver! Always available for work experience at The Times! [“Wink” emoji]’. Kamm has since deleted his tweets, but the sort of example he sets to aspiring journalists has already been noted by others (Leiter 2005, Peterson and Herman 2010, Robinson 2022, Sayeed 2016).

Academics who are not as aware as they could be of how propaganda operates in practice may not appreciate the degree of epistemic diligence that ought to be applied to mainstream narratives. A telling illustration was provided by the student who chose to speak to me rather than to HuffPost. This Syrian student spoke of cognitive unease at ‘the overwhelming outlook of those around me in Scotland and at the university,’ feeling implicit pressure to conform to a view that conflicted with personal understanding of the situation in Syria. The student avoided writing essays on Syria. The fact that academics can contribute to demoralising experiences like that, for want of awareness of how their own assumptions can originate in highly coordinated propaganda operations, should be a concern to anyone working in education. When the media single out for attack those of us making this very point, it shows how their values and ethos are fundamentally at odds with the very vocation of academics.

Attacks on academics who challenge official orthodoxies are launched under the guise of combatting disinformation, and general concerns about disinformation are certainly now widespread. But before presuming to combat disinformation, one needs to reliably identify it, which means understanding the ways in which it can be purveyed and also having a robust epistemic account of what one takes to be reliable information. These are essential responsibilities of academics and need to be applied no less to orthodox views than to those that challenge them.


Academics have unique skills and an important social role to play in maintaining open and honest communications. This article has argued that there are significant responsibilities they should acknowledge and, collectively, work to fulfil. The wider public reasonably expects universities to assure the integrity of the knowledge available for public discussion – which, in practice, means also being ready, when necessary, to help cut through propaganda claims or hold officials to account.

Even if these expectations are not always fully met, they should still guide what academics aspire to. This article has acknowledged that to engage critically with the strategic communications of powerful actors can be difficult in itself and can even encounter active resistance. That engagement can require a certain resolve, particularly when, as Bertrand Russell many years ago observed, it means taking a stand against ‘the powerful organizations which control most of human activity’. But if academics cannot hold the line against the propagandising of information in public debate, it is not clear who else will be able to do it.


Al-Kassimi, Khaled (2021) ‘The Legal Principles of Bethlehem & Operation Timber Sycamore: The “Islamist Winter” Pre-Emptively Targets “Arab Life” by Hiring “Arab Barbarians”’. Laws 10: 69.

Allan, James (2021) ‘Covid hysteria based on lies, propaganda and ignorance’, On Line Opinion, 7 October.

Anthony, Ian (2020) Strengthening Global Regimes: Addressing the Threat Posed by Chemical Weapons, Stockholm International Peace Research Institute.

Bakir, Vian, Herring, Eric, Miller, David and Robinson, Piers (2019) ‘Organized Persuasive Communication: A new conceptual framework for research on public relations, propaganda and promotional culture’. Critical Sociology 45(3): 311–328. doi:10.1177/0896920518764586

Barrows-Friedman, Nora (2022) ‘Palestinian teacher reinstated after UK Israel lobby attacks’, Electronic Intifada, 28 January:

Beal, Tim (2020) ‘US Imperialism, the Korean Peninsula and Trumpian Disruption’, International Critical Thought, 10:1, 89-112, DOI: 10.1080/21598282.2020.1741961

Bernays, Edward (1928) Propaganda, New York: Liveright.

Blumenthal, Max (2016) ‘Inside the Syria Campaign, the shadowy PR firm lobbying for regime change’. The Grayzone, 2 October:

Brandt, Jessica (2021) ‘How Autocrats Manipulate Online Information: Putin’s and Xi’s Playbooks’. The Washington Quarterly 44, no. 3: 127–54.

Broudy, Daniel and Hoop, Darwin (2021) ‘Messianic Mad Men, Medicine, and the Media War on Empirical Reality: Discourse Analysis of Mainstream Covid-19 Propaganda’. International Journal of Vaccine Theory, Practice, and Research 2, no. 1: 1–24.

Bunce, Mel (2019) ‘Humanitarian Communication in a Post-Truth World’. Journal of Humanitarian Affairs 1, no. 1: 49–55.

Cáceres, Carlos F. (2022) ‘Unresolved COVID Controversies: “Normal Science” and Potential Non-Scientific Influences’. Global Public Health, online first (15 February 2022): 1–19.

Clay, Dean (2021) ‘“David vs Goliath”: The Congo Free State Propaganda War, 1890–1909’. The International History Review 43(3): 457-474.

Cook, Jonathan (2021) ‘After Success against Corbyn, Israel Lobby Ousts UK Scholar’.

Cosentino, Gabriele and Alikasifoglu, Berke (2019) ‘Post-truth politics in the Middle East: the case studies of Syria and Turkey’. Artnodes 24: 91-100.

De Beer, Aniel Caro, and Tladi, Dire (2019) ‘The Use of Force against Syria in Response to Alleged Use of Chemical Weapons by Syria: A Return to Humanitarian Intervention?’ Zeitschrift für Ausländisches Öffentliches Recht und Völkerrecht 79(2): 205–39.

de Lint, Willem (2021) Blurring Intelligence Crime. Singapore: Springer.

Dhami, Mandeep (2011) ‘Military Social Influence’. Analyses of Social Issues and Public Policy 11(1).

Ekzayez, Abdulkarim and Sabouni, Ammar (2020) ‘Targeting Healthcare in Syria: A Military Tactic or Collateral Damage?’ Journal of Humanitarian Affairs, 2(2): 3–12

Fiorella, Giancarlo, Godart, C. and Waters, N. (2021) ‘Digital Integrity: Exploring Digital Evidence Vulnerabilities and Mitigation Strategies for Open Source Researchers’. Journal of International Criminal Justice 19, no. 1 (1 March 2021): 147–61.

Fishman, Andrew (2015) ‘Psychologist’s Work for GCHQ Deception Unit Inflames Debate Among Peers’.

Flanagan, Padraic (2021) ‘BBC Admits Syria Gas Attack Report Had Serious Flaws after Complaint by Peter Hitchens’. Daily Mail Online. Accessed 22 February 2022.

Freeman, Lindsay (2019) ‘Law in Conflict: The Technological Transformation of War and Its Consequences for the International Criminal Court’. New York University Journal of International Law and Politics 51(3): 807-870.

Gonsalves, Gregg (2020) ‘Focused Protection, Herd Immunity, and Other Deadly Delusions’, 8 October 2020.

Graupe, Silja, and Steffestun, Theresa (2018) ‘“The Market Deals out Profits and Losses” – How Standard Economic Textbooks Promote Uncritical Thinking in Metaphors’. JSSE – Journal of Social Science Education 17(3): 5–18.

Gray, Gavan Patrick (2019) ‘Evidentiary Thresholds for Unilateral Aggression: Douma, Skripal and Media Analysis of Chemical Weapon Attacks as a Casus Belli’. Central European Journal of International and Security Studies 13(3): 133–65.

Greenwald, Glenn (2014) ‘How Covert Agents Infiltrate the Internet to Manipulate, Deceive, and Destroy Reputations’. Accessed 22 February 2022.

Hagen, Kurtis (2022) Conspiracy Theories and the Failure of Intellectual Critique.

Hayward, Tim (2018) ‘Integrity: Grasping the Initiative’, WordPress blog. Accessed 22 February 2022.

Hayward, Tim (2019a) ‘Should Universities Care about the Truth?’ MR Online. Accessed 22 February 2022.

Hayward, Tim (2019b) ‘Media Coverage of OPCW Whistleblower Revelations’, Tim Hayward’s Blog. Accessed 22 February 2022.

Hayward, Tim (2019c) ‘Three Duties of Epistemic Diligence’. Journal of Social Philosophy 50(4): 536-561.

Hayward, Tim (2019d) ‘A Syrian Student Writes’. Accessed 22 February 2022.

Hayward, Tim (2020) ‘Peer Review Vs Trial By Twitter’. Tim Hayward (blog), 8 March 2020.

Hayward, Tim (2021) ‘“Conspiracy Theory”: the case for being critically receptive’. Journal of Social Philosophy, early view.

Hayward, Tim et al (2018) ‘Response to the Guardian Article by Olivia Solon’, Working Group on Syria, Propaganda and Media. Accessed 10 July 2021.

Herman, Edward S. and Chomsky, Noam (1988) Manufacturing Consent: The Political Economy of the Mass Media. New York: Pantheon.

Hernandez, Anthony et al (2020) ‘Using Deep Learning for Temporal Forecasting of User Activity on Social Media: Challenges and Limitations’, Companion Proceedings of the Web Conference 2020, Taipei: 331-336.

Horawalavithana, Sameera et al (2020) ‘Twitter Is the Megaphone of Cross-platform Messaging on the White Helmets’, pp. 235-244 in Robert Thomson, Halil Bisgin, Christopher Dancy, Ayaz Hyder and Muhammad Hussain (eds), Social, Cultural, and Behavioral Modeling, Lecture Notes in Computer Science 12268. Cham: Springer.

Hughes, David A. (2020) ‘9/11 Truth and the Silence of the IR Discipline’. Alternatives 45(2): 55–82.

Hughes, David A. (2021) ‘“Covid-19 Vaccines” for Children in the UK: A Tale of Establishment Corruption’. International Journal of Vaccine Theory, Practice, and Research 2(1): 209–47.

Johnstone, Caitlin (2020) ‘Nobody Sets Out To Become A War Propagandist. It Just Sort Of Happens’. Consortium News, 31 January. Accessed 22 February 2022.

King, Chris, and David Miller (2020) ‘Declassified UK: UK Information Operations in the Time of Coronavirus’. Daily Maverick, 30 September 2020.

Klarenberg, Kit (2019) ‘Close Associate: The Integrity Initiative’s Intimate Connections to “RussiaGate”’. Sputnik International.

Kleczkowska, Agata (2020) ‘The Illegality of Humanitarian Intervention: The Case of the UK’s Legal Position Concerning the 2018 Strikes in Syria’. Utrecht Journal of International and European Law 35(1): 35–49.

Koblentz, Gregory D. (2019) ‘Chemical-Weapon Use in Syria: Atrocities, Attribution, and Accountability’. The Nonproliferation Review 26(5–6): 575–98.

Kulldorff, Martin, and Jay Bhattacharya (2021) ‘It’s Mad That “Herd Immunity” Was Ever a Taboo Phrase’. Accessed 22 February 2022.

Leiter, Brian (2005) ‘Oliver Kamm, Marko Attila Hoare, and the Importance of Being Able to Read’. Leiter Reports:  A Philosophy Blog. Accessed 22 February 2022.

Lester, Nicola (2018) ‘Introducing a Trauma-Informed Practice Framework to Provide Support in Conflict-Affected Countries’. The RUSI Journal 163, no. 6 (2 November 2018): 28–41.

Levinger, Matthew (2018) ‘Master Narratives of Disinformation Campaigns’. Journal of International Affairs, 71(1.5): 125-134.

Levy, Neil (2007) ‘Radically Socialized Knowledge and Conspiracy Theories’, Episteme 4(2): 181-192.

Magness, Phillip, and James Harrigan (2021) ‘Fauci, Emails, and Some Alleged Science | AIER’. Accessed 22 February 2022.

Martin, Diego A. and Shapiro, Jacob N. (2019) ‘Trends in Online Foreign Influence Efforts’ (Version 1.2):

Massoumi, Narzanin, Tom Mills, and David Miller (2019) ‘Security and Intelligence Incursions in Academic Research: A Threat to All of Social Science’. Discover Society (blog), 2 October 2019.

Massoumi, Narzanin, Tom Mills, and David Miller (2020) ‘Secrecy, Coercion and Deception in Research on “Terrorism” and “Extremism”’. Contemporary Social Science 15(2): 134–52.

Merlan, Anna (2019) Republic of Lies: American Conspiracy Theorists and Their Surprising Rise to Power. Random House.

Miller, David and Dinan, William (2008) A Century of Spin. London: Pluto Press.

Mitton, John Logan (2019) ‘Lessons in deterrence: Evaluating coercive diplomacy in Syria, 2012–2019’, Journal of Strategic Studies, online first (11 December 2019): 1–28.

Morningstar, Cory (2014) ‘Syria: Avaaz, Purpose & The Art Of Selling Hate For Empire’ Wrong Kind of Green website:

NATO (2015) NATO Strategic Communications Handbook (draft for use):

Newlee, Danika (2020) ‘The Punishment Campaign’. In Seth G. Jones (ed) Moscow’s War in Syria. Center for Strategic and International Studies (CSIS).

Notte, Hannah (2020) ‘The United States, Russia, and Syria’s chemical weapons: a tale of cooperation and its unravelling’, The Nonproliferation Review 27(1–3): 201–24.

O’Shaughnessy, Nicholas (2020) ‘From Disinformation to Fake News: Forwards into the Past’, pp.55-70 in P. Baines et al (eds), The SAGE Handbook of Propaganda. Sage.

Olsen, Frances (2019) ‘Law, Language and Justice’. In Friedemann Vogel (ed), Legal Linguistics Beyond Borders: Language and Law in a World of Media, Globalisation and Social Conflicts, vol. 2: 253–88. Duncker & Humblot.

Orchard, Phil (2020) ‘Contestation, Norms and the Responsibility to Protect as a Regime’. In Charles T Hunt and Phil Orchard, Constructing the Responsibility to Protect. Routledge.

Pacheco, Diogo, Alessandro Flammini, and Filippo Menczer (2020) ‘Unveiling Coordinated Groups Behind White Helmets Disinformation’, 3 March 2020.

Parmar, Inderjeet (2017) ‘How Elite Networks Shape the Contours of the Discipline of International Relations’, in Jan Selby, Rorden Wilkinson, Synne L. Dyvik (eds), What’s the Point of International Relations? Taylor and Francis.

Peterson, David, and Edward S. Herman (2010) ‘The Oliver Kamm School of Falsification: Imperial Truth-Enforcement, British Branch’, MR Online, 22 January 2010.

Portela, Clara, and Erica Moret (2020) ‘The EU’s Chemical Weapons Sanctions Regime: Upholding a Taboo under Attack’. European Union Institute for Security Studies (EUISS), 2020.

Räikkä, Juha and Ritola, Juho (2020) ‘Philosophy and Conspiracy Theories’, in Michael Butter and Peter Knight (eds.), Routledge Handbook of Conspiracy Theories, New York: Routledge.

Reynolds, Chris (2020) ‘Global Health Security and Weapons of Mass Destruction’. In Masys, A., Izurieta, R. and Reina Ortiz, M. (eds), Global Health Security: Advanced Sciences and Technologies for Security Applications. Cham: Springer.

Robinson, Piers (2017) ‘Learning from the Chilcot Report: Propaganda, Deception and the “War on Terror”’. International Journal of Contemporary Iraqi Studies 11(1–2): 47–73.

Robinson, Piers (2019) ‘Expanding the Field of Political Communication: Making the Case for a Fresh Perspective Through “Propaganda Studies”’. Frontiers in Communication 4.

Robinson, Piers (2022) ‘Democracies and War Propaganda in the 21st Century’, Working Group on Syria, Propaganda and Media.

Ruda, Simon (2022) ‘Will Nudge Theory Survive the Pandemic?’ UnHerd, 13 January 2022.

Russell, Bertrand (1960) ‘The Social Responsibilities of Scientists’, Science, Vol. 131, Issue 3398: 391-392.

Sayeed, Theodore (2016) ‘Chomsky and His Critics’. Mondoweiss, 19 February 2016.

Sidley, Gary (2022) ‘Britain’s Unethical Covid Messaging Must Never Be Repeated | The Spectator’. Accessed 23 February 2022.

Söderbaum, Peter (2002) ‘Democracy and the Need for Pluralism in Economics’, Post-Autistic Economics Review, issue no. 11, January, article 4. Accessed 23 February 2022.

Solon, Olivia (2017) ‘How Syria’s White Helmets became victims of an online propaganda machine’. The Guardian, 18 December:

Starbird, Kate et al (2019) ‘Disinformation as Collaborative Work: Surfacing the Participatory Nature of Strategic Information Operations’. Proceedings of the ACM on Human-Computer Interaction 3(CSCW): Article 127.

Sterling, Rick (2015) ‘Seven Steps of Highly Effective Manipulators’, Dissident Voice: a Radical Newsletter in the Struggle for Peace and Social Justice (blog), 10 April 2015.

Sunstein, Cass R. and Vermeule, Adrian (2009) ‘Conspiracy Theories: Causes and Cures’. The Journal of Political Philosophy 17(2): 202–27.

Tomić, Nikola (2020) ‘Global Justice and Its Challenges: The Case of the EU’s Approach to Syria – GLOBUS’. Accessed 23 February 2022.

Tucker, Jeffrey A. (2021) ‘A Short History of the Great Barrington Declaration ⋆ Brownstone Institute’. Brownstone Institute (blog), 17 July 2021.

UK Medical Freedom Alliance (2021) ‘Open Letter to Professor Devi Sridhar Re BBC Newsround Episode on Children’s Vaccines’. Accessed 23 February 2022.

Van der Pijl, Kees (2020) ‘The Coming Trial of the MH17 Suspects. A Piece of Political Theatre?’ Accessed 23 February 2022.

Van Schaack, Beth (2020) ‘Innovations in ICL Documentation Methodologies and Institutions’. In Imagining Justice for Syria. New York: Oxford University Press.

Vilmer, J.-B. Jeangène et al (2018) ‘Information Manipulation: A Challenge for Our Democracies’. Media Freedom Resource Centre OBCT. Accessed 23 February 2022.

Watkin Kenneth (2020) ‘Humanitarian Intervention and the Responsibility to Protect: Where it Stands in 2020’, Southwestern Journal of International Law 26(2): 213-240.

Watson, Philip (2020a) ‘Two Buildings – Two Cylinders: Part 1’. Hiddensyria (blog), 5 January 2020.

Watson, Philip (2020b) ‘Douma Incident and Forensic Architecture – The Emails’. OffGuardian (blog), 7 March 2020.

Wilson, Tom and Starbird, Kate (2020) ‘Cross-platform disinformation campaigns: lessons learned and next steps’. Misinformation Review, 14 January:

Yue Hanjing and Zhu Ying (2020) ‘Great Power Game around the Chemical Weapons Attacks in Syria and the New Norm on Banning Chemical Weapons’, Scholars Journal of Economics, Business and Management, 7(9): 304-312.

(Featured Image: “Stuart Six” by Accretion Disc is marked with CC BY 2.0.)


  • Tim Hayward

    Tim Hayward is a social and political philosopher whose books include Ecological Thought: an introduction (Polity, 1995), Constitutional Environmental Rights (OUP 2005) and Global Justice & Finance (OUP 2019). His current work examines the influence of strategic communications on the development of norms of international justice. As a founding member of the Working Group on Syria, Propaganda and Media, his studies of propaganda in action have led to academic publications on ‘conspiracy theory’, ‘disinformation’, and academic duties of due epistemic diligence. Tim maintains a personal blog and Twitter account. He is Professor of Environmental Political Theory at the University of Edinburgh.