Introduction
As someone who has quite often been publicly accused of purveying disinformation, I have had reason to examine the burgeoning phenomenon of counter disinformation experts who get cited in support of those accusations. Here, I report on some recent findings. This is not an academic report, but it does focus on academic involvement in counter disinformation activities; and while the examples are activities directed against myself or people associated with me, I believe the reflections on them are of wider significance. Accordingly, this feedback is intended not only to provide a learning opportunity for the relevant experts, but also to be of interest to a wider public. For it develops into an argument that the very project of organised counter disinformation is inherently problematic and that there is a need for more explicit reflection on universities’ social responsibilities of epistemic diligence with regard to the quality of information in public circulation.
The Argument in Outline
Disinformation is widely cited as a major problem of our time. The project of countering disinformation is being advanced with the support of governments and wealthy private backers, as well as the more tacit inputs of intelligence agencies. Particularly active in this field have been journalists and a burgeoning ‘fact checking’ sector. Now a growing number of academics are contributing too.
It ought to be a welcome development that academics get involved in efforts to improve the informational quality of communications in the media and social media. For these can be confusing or contradictory with regard to major matters of significant public concern. Yet the reality is that the academic contribution to public debates about disinformation does not always – if indeed ever – live up to reasonable expectations. In fact, a serious concern is that academics are getting drawn into purported counter disinformation campaigns that actually serve to promote disinformation.
The possibility of identifying disinformation depends on a sound understanding of what the relevantly reliable information is. Assuring this understanding, however, is not necessarily a priority for communicators who are already committed to a position in an information war, as the project of countering disinformation tends to imply. Yet it is what a good journalist would aim at, and its pursuit would be a fundamental professional obligation for any academic venturing into this field. Discharging this obligation involves appropriately competent and rigorous testing of the knowledge claims in dispute. This can be referred to as due epistemic diligence, and it has to be done before any countering of disinformation could even be possible, let alone necessary or desirable.
However, if, instead of doing due epistemic diligence, academics use their credentials or university positions to support unexamined claims that are promoted by particular powerful interests, they not only contravene fundamental academic principles, but can cause a serious problem in relation to the wider public interest. For the academic imprimatur, notwithstanding its spurious basis, can be appealed to by media outlets to amplify misleading information about matters of serious concern.
Unfortunately, it appears that certain academics can be incentivised to contribute to supposed counter disinformation activities without doing appropriate epistemic diligence. In being recruited to a side in an information war – even if they believe it to be in the right – they betray the principles of independent and dispassionate investigation that their credentials are supposed to assure the public of. In the worst of cases, it is a matter of academic reputations being traded on in the laundering of their own side’s disinformation.
A more humdrum failing of some counter disinformation efforts is to misunderstand the significance of the commonly drawn distinction between disinformation and misinformation. On accepted definitions, misinformation is the communication of erroneous information, including by innocent mistake, whereas disinformation involves intent to deceive. An important point to note, then, is that deception can be accomplished without using any misinformation at all. Hence, disinformation can involve misleading people by selectively presenting true information. (The selection can be ‘constructive’, in positively steering attention towards a partial view of matters while excluding mention of crucial contrary considerations – which is sometimes called ‘being economical with the truth’; or the selection can be ‘destructive’, in that it involves the more negative strategy of disseminating irrelevant or misleading facts indiscriminately with an intent to disorientate and confuse.) Because disinformation need not involve direct misinformation, identifying cases of it cannot be achieved by simply checking for false claims. However, this does not mean the requirement of epistemic diligence is removed, only that it must also involve other kinds of investigation. Since instances of serious disinformation are generally the product of complex coordination, investigations may need to focus on the relevant coordinated communication strategies and the various agents involved in them. A crucial point, however, is that these further tasks help accomplish, but do not displace, the requirement of showing that, and how, the public is actually being deceived. This can only be shown by reference to a reliable account of the matters they are purportedly deceived about. What counter disinformation experts are wont to overlook is that this is required regardless of whether people are indirectly deceived about a state of affairs or directly misinformed about it.
As will be shown in Section 1, ‘counter disinformation experts’ who focus on tracking networks of communicators can fail to fulfil the key task of showing how these networks are actually doing any misleading or deceiving. Sometimes, it may not even be made clear how the associated actors are coordinated in any significant way. In extreme cases, as will be illustrated, that failure of diligence leads to promoting disinformation rather than countering it. So there are good reasons to be cautious about ‘countering’ disinformation without first thoroughly understanding and conceptualising it in principle and being sure how to identify cases of it reliably in practice. Yet, as Section 2 shows, there is a wider problem of academics getting drawn into counter disinformation projects without having the experience or expertise needed to understand what they are getting involved in. Section 3 shows how not only particular individuals but also networks of academics can get drawn into ‘para academic’ counter disinformation activities which are not grounded in epistemic diligence. Section 4 looks at the problem of academics getting involved in public attacks on those who are actually endeavouring to do epistemic diligence on information of public significance.
All the examples discussed here are from cases where I or my associates have been identified as alleged purveyors of the disinformation to be countered. The reader will accordingly be mindful that I could quite possibly be mistaken about the benchmark of reliable information I am assuming in each case. The whole point of my argument, however, is that it is precisely this that would need to be shown before it could be asserted that I have been purveying disinformation. Yet showing this is the crucial thing that the counter disinformation experts quoted in the attacks consistently fail to do.
1. Failure of Diligence
A very basic professional responsibility of any academic is to ensure that their work – whether research, teaching or public engagement – is conducted in accordance with appropriate scholarly or scientific methods. If an academic is to become involved in ‘countering disinformation’, it is crucial to be clear what methods of analysis are appropriate.
Some approaches involve such methods as ‘social network analysis’ and the deployment of ‘deep learning algorithms’ to identify how ‘disinformation’ is spread around social media. This kind of research aims to show how complex networks of actors are engaged in mutually supportive communications in identifiable ‘clusters’ or ‘echo chambers’. Showing whether what they are communicating is actually faulty information, however, requires different kinds of research. If this kind of epistemological investigation falls outside the expertise of the counter disinformation researchers, they should acknowledge this fact and its potential significance. The trouble is that its significance could be such as to invalidate a key premise of this kind of research project itself and make it impossible to present in good faith to the public.
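To make the point concrete, here is a minimal sketch of the kind of network analysis in question. The accounts and retweet data are hypothetical, and the clustering method is just one common choice; the point it illustrates is that community detection will readily partition accounts into ‘clusters’, while nothing in the computation touches the question of whether what those accounts communicate is true or false.

```python
# A minimal sketch of cluster detection on a (hypothetical) retweet network.
# The method groups accounts by who interacts with whom; it cannot, by itself,
# say anything about the truth or falsity of what the accounts communicate.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical retweet edges: (retweeter, original_author)
edges = [
    ("alpha", "beta"), ("gamma", "beta"), ("alpha", "gamma"),
    ("delta", "epsilon"), ("zeta", "epsilon"), ("delta", "zeta"),
]
G = nx.Graph(edges)

# Community detection finds densely connected groups of accounts.
for i, cluster in enumerate(greedy_modularity_communities(G)):
    print(f"cluster {i}: {sorted(cluster)}")

# The output labels two 'echo chambers' -- but calling either one a
# 'disinformation network' would require a separate, epistemic judgement
# about the content, which no step of this pipeline supplies.
```

That further judgement is precisely what the epistemological investigation described above would have to supply.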
Consider, for instance, a recent briefing from Boucher et al (2022). Boucher’s own research ‘focuses on applied machine learning to understand how the digital world shapes our society’, and his team consists of people with qualifications in mechanical engineering, data science, and quantitative research in political science. What is missing from their publication, however, is an answer to the question that is critical for the success of their endeavours, namely, how does one reliably identify a case of disinformation?
They define ‘disinformation’ as ‘false information intended to manipulate, cause damage, or guide groups and people in the wrong direction’. This implies, for any case of disinformation they purport to identify, claims of knowledge about what is true and in the ‘right direction’. Yet the authors do not expand on how they are confident of having such knowledge. Regarding the ‘right direction’, they cite as a source a website of the Canadian Government. As for how they identify specific instances of disinformation, they issue a peremptory list of propositions that they consider to exemplify it. Thus, they stipulate that the ‘spread of disinformation in Canada’ can be identified by reference to five ‘pro-Russia narratives’:
- Implying NATO expansionism legitimizes the Russian invasion
- Portraying NATO as an aggressive alliance using Ukraine as a proxy against Russia
- Promoting a general mistrust in institutions and elites
- Suggesting that Ukraine is a fascist state or has extensive fascist influences
- Promoting a specific mistrust of Canada’s Liberal government, and especially of Prime Minister Trudeau
On this basis, Boucher et al name 26 Twitter users whom they call the most influential promoters of Russian-backed narratives.
An obvious problem, as Dimitri Lascaris points out, is that the authors offer no evidence or argument to show why any of those narratives should be regarded as deceptive. His own view is that they ‘find abundant support in the historical record’, and this is what is reflected in the Twitter discourse.
The authors could and should have anticipated this line of objection, but their willingness to cite the Canadian Government as their arbiter of what is right shows that they did not see addressing it as part of their task. Given that their research is handsomely funded by the Canadian Government, the independence and impartiality of their briefing can be assessed accordingly.
A problem, however, is that many people who read that those 26 Twitter accounts are branded as purveyors of ‘disinformation’ will not necessarily be in a position to make a fully informed assessment. For although the briefing paper makes no overt pretence to being a peer-reviewed article, it appears with the imprimatur of the University of Calgary and has been treated as a reliable report in an article by the journalist Marie Woolf of The Canadian News for Canada’s CTV News. As Yves Engler notes, this was amplified throughout the Canadian press and broadcast stations, with Boucher himself appearing on multiple TV and radio outlets ‘accusing critical journalists of being “useful idiots” of Vladimir Putin.’ Those accused – with no right of reply – included notable independent journalists such as Aaron Maté, Benjamin Norton, Max Blumenthal, Richard Medhurst, and John Pilger. No evidence was offered to connect these individuals to Russia. More importantly, nothing those journalists have said has been shown to be false or misleading by any independent academic standards of epistemic diligence. The criteria applied appear to be that a view affirmed by the Canadian government is true whereas a view affirmed by the Russian government is false: the epistemological basis of these criteria is not explicated. If it cannot be shown why Russia’s views are invariably wrong, then it is possible that they sometimes capture truth; and since truth is in principle attainable by any who diligently seek it, this could account for how the journalists named might sometimes light upon truths that Russia has also identified. This implies no coordination between the journalists and Russia.
Boucher et al have thus provided no reason to rule out the possibility that the named journalists are among those most assiduously doing epistemic diligence on the matters they cover. To suggest that they are part of a ‘problem’ and need to be in some way ‘countered’ is therefore not merely premature but actually perverse. If it is bad enough to fail to do epistemic diligence, it is even more egregious to be involved in attacking those who do it.
If a ‘counter disinformation expert’ cannot credibly establish that certain journalists’ work actually is in any way deceptive, then to brand them as purveyors of disinformation is itself deceptive. In other words, academics who engage in this practice run a very real risk of promoting disinformation rather than countering it.
It is, in fact, a wider problem that academics who seek to participate in the project of countering disinformation fail first to perform the basic task of establishing how disinformation is reliably identified. It is self-evidently premature to think of countering something that has not yet been clearly identified. Such a practice can exacerbate rather than resolve the problem, since more harm than good is done by countering putative disinformation on the basis of assumptions that turn out themselves to serve in the promotion of verifiable disinformation. In fact, reciprocal accusations of disinformation are by no means unusual. For, as follows from the very nature of organised disinformation, a predictable strategy for its dissemination is to accuse those who impede its progress of being themselves engaged in disinformation.
Hence, it can even happen that journalists, and academics too, who engage in independent and impartial analysis of these matters, and who acknowledge stringent duties of epistemic diligence, find themselves under attack from other academics who believe they are fulfilling a mission to counter disinformation.
2. Unfounded Criticism
It should go without saying that one should be scrupulous in avoiding errors that involve defamation of those one disagrees with. Yet, unfortunately, not everyone always is, and as a cautionary tale I take the case of an academic who was given the opportunity to exhibit his expertise by the Times of London for an article with the headline ‘University of Edinburgh academic Tim Hayward accused of spreading propaganda’.
The basis of the Times’s accusation was a tweet of mine in which I had linked to an article by Max Blumenthal for the Grayzone, while adding the comment:
‘The destruction of Mariupol theatre was broadcast worldwide as an atrocity warranting West’s intervention. But what do we know of the reality?’
The point of asking the question was that the theatre bombing was seized upon by those seeking to escalate foreign involvement in the war, something that risked making a terrible situation even worse. Blumenthal’s article explores the possibility that the theatre was destroyed not by Russian forces but by a bomb planted by the Azov Battalion, who were understood to be in control of the area. For a warning had been published several days earlier on the Telegram channel of Dmitriy Steshen, a correspondent reporting from Mariupol for the Russian newspaper Komsomolskaya Pravda: the military in control had a plan – ‘given a good opportunity – to detonate the building and then scream around the world that this was by the Russian Federation Air Force and that there should be an immediate “no fly zone” over Ukraine.’ Of course, the fact that this allegation was made does not suffice to make it true, but in the absence of clear contrary evidence, it cannot reasonably be dismissed as of no significance at all.
Someone at The Times, however, evidently believed the possibility should be ruled out. The Times specially commissioned a ‘fact check’ from a Ukrainian organisation called Molfar whose brief, as Molfar describe it, was ‘to debunk Max Blumenthal’s fake article’. The Times then cites Molfar’s avowedly one-sided report as authoritative.
This approach is par for the course for The Times, but the paper also includes comments from the academic mentioned earlier, who is based at the Oxford Internet Institute (OII), and these lend apparent support to the claims made against Blumenthal.
This perplexed me, since having examined Molfar’s ‘fact check’, I found it impossible to believe that any competent academic could in good faith endorse it. I think no suitably attentive citizen could. It evades key points Blumenthal makes while including a lot of distracting material about a variety of claims from Russian sources that, for some unexplained reason, are simply assimilated to Blumenthal’s.
Accordingly, I contacted the OII academic cited by The Times, Dr Aliaksandr Herasimenka, a fellow with Oxford University’s Oxford Martin Programme on Misinformation, Science and Media. He is quoted by The Times as saying:
‘It’s normal to question and verify whatever information is coming from conflict. However, we must be very careful when doing it.’
In the context of the article, the reader would naturally understand Herasimenka to be implying that Molfar is very careful and that I – the headline target of the article’s attack – am not. Nevertheless, when I initially read this, I allowed the charitable assumption that Herasimenka had made a general point, not directed at me specifically, when speaking to the journalist. However, I then found that he had gone on to issue a press release featuring the full attack on me while also making some other points indicating his satisfaction with the Times’s hit piece. So I contacted Herasimenka asking him to confirm whether or not Molfar’s report is an instance of what he regards as ‘careful verification of evidence’.
He did not reply to the email I sent to him, which was copied to his programme director. The latter replied that I should accept this evasive tweet from Herasimenka as his response.
Notwithstanding Herasimenka’s reluctance to divulge his thinking about Molfar and academic standards of research and reporting, he had been quite prepared to tell the world via The Times and an Oxford University press release about a problem of ‘toxic news outlets’, which, in the context provided by The Times, evidently include Blumenthal’s news outlet, Grayzone. Herasimenka asserts that
“Putin wants people to concentrate on these toxic news outlets rather than paying attention to serious topics like whether he’s about to use chemical weapons.”
How Herasimenka presumes to know what Putin wants is a question I shall not dwell on. Two questions are worth asking, however, for they highlight the risks of insufficient diligence leading to the reinforcement of disinformation rather than its dissolution.
One question is how the Oxford researcher decides which news outlets are ‘toxic’, especially if he thinks that one of them is Grayzone. Grayzone has a strong record of investigative journalism that has uncovered various kinds of malfeasance in public office. This exposure may have an adverse effect on parties who are complicit in the kinds of lies exposed, but it can be highly salutary for the general concerned public: what’s good for a healthy public discussion may be ‘toxic’ for professional liars. Claims that Grayzone is a source of ‘disinformation’ have been made numerous times by certain sections of the press and social media, but in the cases I have examined, as here, here or here, I have found the reverse to be true. In fact, those claims have often come from the same quarters as those also repeated against the Working Group on Syria, Propaganda and Media (WGSPM), as Ben Norton has highlighted. (I am a founding member of WGSPM.) The suggestion that Molfar’s report is more careful than Blumenthal’s is, in my view, wrong. More significantly, Herasimenka has, to date, provided no actual reason to think otherwise, and seems unready to offer any.
Given his evident lack of appetite for reasoned discussion with the people he is complicit in smearing, I did not trouble to ask Herasimenka my second question. But I will put it on record. The question is how he decides which topics should be regarded as serious, and, in particular, how he came to single out the chemical weapons question as the one to ask in the context of Ukraine. Certainly, the use of chemical weapons is a very serious matter – it is a war crime. In that general sense it is clearly a serious topic. However, Herasimenka’s assertion implies a more specific concern, namely, that Putin could be on the verge of using them. How did he light upon this as an imminent threat? Was it through careful verification of the available information, as he commends, or through the passive absorption of a piece of propaganda?
On 22 March, there were press reports (e.g. in the Guardian) that President Biden had suggested the possibility of a chemical threat, although, as The Times of Israel dryly concluded, ‘Biden did not offer proof’, and the Reuters version of the story featured in its headline the caveat ‘without citing evidence’. Biden was directly asked at a NATO press conference on 24 March ‘whether the U.S. has gathered specific intelligence that suggests that Russian President Vladimir Putin is considering deploying chemical weapons’ (ABC News). Biden’s response was ‘I’m not going to give you intelligence data’. After that, the matter dropped out of the news.
However, Herasimenka was already being quoted in The Times by 22 March. So presumably he was influenced by talk of the possibility among US and UK spokespersons as reported in the press in the preceding weeks. In that coverage, a particularly active voice was that of Hamish de Bretton Gordon (or HBG as he styles himself). On 6 March HBG wrote in the Daily Telegraph that ‘We are wildly under-estimating the risk of chemical attacks in Ukraine’ – a claim he would go on to reiterate on BBC Newsnight, Channel 4 News, GB News, and the Express. On Twitter he claimed ‘Russian plan of attack not working’ and he invoked a historical comparison: ‘When Assad’s attack plan failed in Syria he went Chemical & it worked’. Blumenthal’s article reminds us of how contentious this claim is, given that at the only alleged Syrian use of chemical weapons to be investigated on site by the Organisation for the Prohibition of Chemical Weapons (OPCW) – that in Douma in 2018 – ‘OPCW investigators found no evidence of a chemical attack, but had their findings doctored and censored as US officials worked to pressure and co-opt the organization.’
Indeed, it was immediately following the Douma incident that The Times, as it also now reminds us, had made the first of its attacks on myself and fellow WGSPM members. The timing on that occasion, too, coincided with allegations that HBG was prominently promoting. When WGSPM posted critical analyses of evidence for alleged chemical attacks in Syria, HBG was identified as a central figure in procuring that evidence. In fact, as noted in a WGSPM Briefing Note, he ‘has described to the All-Party Parliamentary Group and elsewhere his covert role in collecting samples from alleged chemical attacks in Syria’. Yet the quality of evidence relating to those attacks has been contested. Claims and predictions made by HBG about chemical weapons have sometimes turned out to be mistaken – even farcically so, as when he claimed that gas from household fridges could be used in terror attacks.
Questions concerning the reliability of intelligence about chemical threats arise in the current case too. A few weeks after Biden cited claims of the kind made by HBG, even the mainstream media in America had become quite frank about the poor quality of intelligence that initially influenced him. As NBC News summed it up:
‘It was an attention-grabbing assertion that made headlines around the world: U.S. officials said they had indications suggesting Russia might be preparing to use chemical agents in Ukraine. President Joe Biden later said it publicly. But three U.S. officials told NBC News this week there is no evidence Russia has brought any chemical weapons near Ukraine.’
‘Multiple U.S. officials acknowledged that the U.S. has used information as a weapon even when confidence in the accuracy of the information wasn’t high.’
So, while investigators like those of Grayzone and WGSPM are striving to critically examine information, certain serviceable academics can be drawn into attempts to malign their endeavours – attempts made from a demonstrably ill-informed and propagandised position. In this case, Herasimenka compounded his unfounded accusations against Grayzone with the uncritical reproduction of unverified speculation about a threat that was only ever a propaganda construct conjured by an ‘attention-grabbing assertion’ based not on real evidence but merely on ‘low confidence intelligence’.
What this case illustrates is how a media outlet like The Times can get an academic to provide the appearance of legitimacy to its agenda – which is achieved by forsaking the scholarly responsibility of doing epistemic diligence and instead endorsing low quality intelligence, apparently because this practice is rewarded with kudos that can be vaunted in press releases.
3. Supporting Attacks on Dissidents
But the problem is not confined to individual academics whose confidence or ambition exceeds their competence. There are also coordinated covert collaborations directed by elements beyond academia. Again, I can take examples that have affected me personally.
On 19 March 2022 I received a direct message on Twitter with a link to one of my own tweets and these words:
‘Sorry won’t send anything on Tim for the coming weeks, but he is really a prick’
The message was from a lecturer at my university, sent to me by mistake, and intended for a group that the sender was reporting to about my Twitter activity. I don’t know what purpose was served by their monitoring of my tweets. What I do know, however, is that in the same period, another lecturer, also at my own university, was providing information to a UK intelligence-linked cabal. He identified me as a ‘rogue academic’ to a group of plotters coordinated by the journalist Paul Mason whose aims include getting critical independent outlets like Grayzone defunded and dissident academics discredited or fired.
Evidently, then, ‘counter disinformation’ operatives can find a certain kind of academic who shares their view that people like me are a legitimate target of their hostile attention. The question of public interest here is whether those who believe they are contributing to ‘counter disinformation’ efforts are demonstrating the diligence, competence and good faith required to be sure they are not doing more harm than good.
The academic onus is certainly on them to demonstrate this. Covert information operations are problematic on a variety of grounds anyway, but there is a specific reason why it would be reprehensible for academics to allow themselves to be drawn into them: covertly discrediting a perspective without even attempting to evaluate it – and, worse still, preventing others from evaluating it – flouts the most fundamental principle of epistemic diligence.
A basic responsibility of academic researchers is to endeavour not to succumb to the influence of propaganda in their investigations, and they certainly should not use propaganda to undermine efforts of other academics who – as I claim is the case with members of WGSPM, for instance – are doing diligent research. One would, in fact, expect that scholars studying disinformation would be duly sensitised to risks of being influenced by it.
Yet the fact is that one can find scholars of ‘disinformation’ who, while able to expound at length on the supposed failings of others, remain oblivious to the limitations of their own perspective. This issue is illustrated by the academic who is so confident in his perspective as to dismiss those of us who do not share it as ‘rogue academics’.
Huw Davies, a lecturer who specialises in ‘data and society’, thinks it important to observe a clear distinction between, on the one hand, ‘disinformation and pseudo science’ and, on the other, ‘credible, reality-based research’. This is well and good, but some effort does need to be put into understanding how the distinction is conceptualised and how it is to be applied in practice. The effort required involves more than simply identifying the dominant opinion about a matter and then deprecating any propositions not in conformity with it. For epistemic diligence is especially needed in cases where serious questions about the reliability of the dominant opinion are precisely what is at issue.
Yet across a range of cases Davies not only sides unreflectively with the official narrative, he actively maligns those who raise questions about it. For instance, he singles out for dismissive treatment an article by David A Hughes (2020) which comments on the silence of International Relations (IR) scholars concerning serious questions about the official account of events on 9/11. Davies, presumably, feels on safe ground here because Hughes’s position is certainly heterodox. Yet Hughes is not wrong that there are questions, for even assuming the official account of events is essentially correct, it is still remarkably incomplete. Despite 9/11 being a horrendous crime, criminologist Willem de Lint (2020) shows it was not investigated as such, and as I have noted elsewhere (Hayward 2021), suggestions that certain well-placed people in the West could have had some foreknowledge of events of 9/11 (Ryan 2010), or that there could be doubts about the identities and provenance of the alleged terrorists involved (Kolar 2006), or about the precise cause of collapse of World Trade Center Building 7 (WTC7) (Hulsey and Quan 2020), may be speculative but are not irrational to entertain. This is the premise of Hughes’s article. Yet the article was attacked by a number of academics on Twitter simply for entertaining that premise, and not for any specific argument advanced, since none was actually evaluated by his attackers. They simply insisted the subject should be kept off limits. I have discussed this unedifying event in Hayward (2020), and would here just point out that Davies simply sides with those who don’t like to be challenged to do more thorough epistemic diligence, while being prepared to hurl slurs at those who do it.
Now, a defence often advanced by those who decline to address an inconvenient question is that it is a distraction not worthy of their attention. Typically, it is claimed that people who are ‘just asking questions’ are really trying to kick up dust where the situation is already clear; as for those who don’t just ask questions but also advance heterodox hypotheses as possible answers, they are dismissed as ‘conspiracy theorists’. But what should be noticed about this kind of defence – which might better be described as a counter-attack – is that while it may have some deterrent effect in the rough and tumble of social media and the press, it is not academically credible, since it disregards the point in contention without engaging with it. In academia, if a question betrays an inadequate grasp of the existing state of knowledge, this can readily be shown; but the point about the questions in Hughes’s article is that they arise precisely from the inadequacy of the current state of knowledge. Hulsey’s scientific investigation into the cause of collapse of WTC7, for instance, arose from the bafflement of 3,338 architects and engineers – whose professional responsibilities demand comprehension of such matters – as to how a building could collapse in a manner that appears to defy the laws of physics. Hulsey’s answer may eventually prove to be wrong, but more research, not less, would be required to prove this.
Where there is deep scientific controversy, it is wise for ‘disinformation experts’ to be mindful of the limits of their competence to comment. Certainly, there is a case for saying it is in the public interest to ensure that, where there is a genuine scientific consensus on a matter with significant policy implications, clear communication of that consensus view is facilitated. However, it is quite another matter to declare that The Science is clear when there is significant disagreement amongst relevantly expert scientists themselves. To ‘pick a side’ in a scientific debate when one is not oneself expert in the field is to risk backing the wrong side. Then to promote that side in public communications and exclude the other is to run a real risk of doing more harm than good.
An instance of this situation has been the serious debate on the part of different scientists concerning the appropriate strategies for dealing with the spread of the SARS-Cov2 virus. Some views have been strongly endorsed by establishment media while others have been marginalised and dismissed. In particular, the proposal of ‘focused protection’ put forward by signatories to the Great Barrington Declaration (GBD) had the backing of some distinguished epidemiologists and merited proper discussion, but it was very much marginalised by the mainstream. Yet, here again we find Huw Davies – who has no background in epidemiology but great confidence in the dominant opinion – being haughtily dismissive of GBD scientists: ‘The 2 Oxford professors who backed herd immunity (sans vaccine) & dodgy mask research have given me invaluable material to teach digital literacy.’ He adds, ‘Pseudoscience/science boundary is too thin’, implying that he knows better than accomplished epidemiologists where to draw that boundary. In fact, as time goes on, more and more evidence emerges that tends to vindicate the GBD scientists (see e.g. Finley 2022). And even if there remains room for debate here, the point is that the debate should be allowed and not prejudiced by one-sided media coverage. Certainly, academics who consider themselves disinformation experts should have no part in condoning let alone supporting that one-sidedness.
The point applies to any controversial issue: it is one thing to defer to mainstream opinion because one is not an expert oneself – which is generally rational to do, for reasons Neil Levy (2007) has set out – but it is another to categorically dismiss minority expert opinions that one is not competent to evaluate. One should certainly not presume to teach ‘digital literacy’ by promoting the assumption that the views most favoured by the dominant media are necessarily the most reliable.
Hubris turns into nefariousness, however, in a move from simply being dismissive of the ‘wrong experts’ to engaging in active operations to make trouble for dissenters. Unfortunately, an instance of this recently came to light with the leaking of some email exchanges centring on the well-known journalist Paul Mason. Kit Klarenberg and David Miller, writing for Grayzone, have provided more detailed coverage of what they have dubbed #masongate, but the crux of it is that Mason had been seeking helpers for his plans to discredit anti-war activists and journalists, noting also that ‘the far left rogue academics is who I’m after’. He mentions that ‘Emma’s tipped me off to their current activities.’ This is the academic Emma Briant, an enthusiastic advisor for his project, committed to the dominant counter disinformation agenda. She introduces Huw Davies to Mason, noting that Davies ‘is potentially interested in helping with this effort to analyze these networks and do social media analysis.’ In response, Davies recommended a recent Twitter thread which he said listed ‘a few rogues’, with Tim Hayward [myself] being the first of only two people named in the tweets.
Also revealed in the emails is that one of the people Paul Mason anticipated would be involved in his project is Chloe Hadjimatheou, the BBC journalist who, a few weeks later, was to present the BBC’s File on 4 broadcast attacking Julian Schlosberg and myself. She gives Paul Mason the concluding word in her programme. Whatever her actual intentions may have been, her programme can objectively be seen as part of a now extended attempt to get people like me fired – or at least to make enough of an example of us to discourage any younger academics who might be tempted to do epistemic diligence on claims that powerful actors are strongly invested in promoting.
So we see how the blithe dismissiveness exhibited by over-confident academics can be ‘weaponised’ by other actors, with potentially harmful effects not only on academics who engage in critical analyses of contemporary propaganda but also on the general public.
The public good is undermined rather than served by these activities: campaigns against critical academics do no good, since they do not even attempt to address any substantive points of information; but they can do harm by lending gratuitous support to dominant narratives instead of providing what the public requires of academics, namely, the dispassionate assessment of competing claims and due intellectual humility where the respective merits of claims are uncertain.
If we, the targets of such campaigns, really were engaged in misleading the public, this could be revealed by doing epistemic diligence on the things we say. There would be no need to try to discredit us by nefarious means. The fact that the things we say and publish are never engaged with demonstrates to impartial and thoughtful observers that what we are saying is likely worth taking note of. This is probably why our following amongst reflective sections of the public is further enhanced with every high profile media attack on us. Furthermore, once ordinary decent people discern how matters stand, they don’t forget it. This is part of the reason why trust in mainstream media has been inexorably collapsing in recent years.
It also explains why intimidation and other dirty tricks come to be employed. And sadly, not everyone who engages in maligning dissidents does so by mistake, or due to naïveté or ineptness. The active discouragement of critical study of certain cases of propaganda comes not only from representatives of vested interests outside academia, but also from a small yet vocal group who hold positions within.
4. Engaging in Attacks on Dissent
Up to now we have considered how counter disinformation activities can fail to meet academic standards of epistemic diligence, but now we come to practices that represent the stark antithesis of academic standards. For this reason, it is unlikely to be pure chance that those most active and persistent in these practices are individuals for whom academic work is only part of what they do. For instance, we find people who work as journalists while also holding academic positions that can appear to confer epistemic authority on activities they engage in on a different basis.
Several of these individuals have repeatedly made absurd and defamatory allegations against myself and other members of WGSPM – allegations they could never make in any academic forum. Two of them were attacking some of us on social media even before we formed the group, and their opinions about us were then solicited for the first major attack on us by The Times.
Scott Lucas maintains a news website – formerly under the banner of the University of Birmingham, where he held a position – which mainly presents the US view of contemporary foreign policy issues. But it also includes tabloid-style attacks on dissent from that view, and WGSPM members have figured frequently in these. A prolific tweeter, Lucas has also very frequently attacked us on social media. He is, nevertheless, a go-to commenter for British journalists who do hit pieces on us.
So is Idrees Ahmad, a lecturer at the University of Stirling, who had become a journalist with Al Jazeera at a time when others of its journalists were leaving because of the editorial bias being imposed on its coverage of Syria by its Qatari proprietor. Ahmad went on to become editor of Pulse Media, and more recently was a founding editor of New Lines Magazine. He is notorious on Twitter for the aggressiveness of his invective against anyone who articulates criticism of claims that are endorsed as the basis of Western foreign policy. His activities were already noted by Louis Allday in 2016, and by both Ben Norton and former UN weapons inspector Scott Ritter in 2017. Max Blumenthal of Grayzone has even reported receiving a threatening phone call. Ahmad, styling himself as a ‘narrative corrector’, pursues his mission relentlessly, chastening waverers and attacking even his ostensible allies if they appear to stray from the orthodox position. His public aggressiveness has accordingly been criticised even by his political allies. Law professor Maryam Jamshidi of the University of Florida has complained that a small group of people around Idrees Ahmad ‘have made a career out of bullying and threatening people on Twitter’, and she refers to them as ‘toxic forces that are undermining the cause of a free Syria & the Syrian people.’
Among others named by Jamshidi is the journalist and blogger Joey Ayoub. For a time, Ayoub was registered as a PhD student at my own university. During that short time, before his acrimonious departure, which he made rather public, he had taken to Twitter to accuse me of being a ‘fraud’ – an allegation that goes way beyond the bounds of reasonable criticism – so I asked his head of school to have a quiet word about the need to observe those reasonable bounds. I expressly declined to escalate the matter into a formal complaint. Idrees Ahmad, however, was subsequently quoted in The Times alleging that I did just that, and strongly implying that I was bullying a student from a minority background. The two of them have since been repeating their story of an old white man failing to ‘check his privilege’ whenever they choose to attack me, including in a fact-free diatribe Ahmad published in OpenDemocracy. The same tale was warmed over again in an attack on me in the Murdoch-funded ‘student’ paper The Tab, written by Noa Hoffman, without offering me any right of reply. Hoffman, who was cheered on at the time by Oliver Kamm – the columnist who claims credit for the initial attack on us in Murdoch’s Times – is now employed by Murdoch’s Sun.
The use of students in attacks on dissident academics is something I have commented on elsewhere as a particularly distasteful practice. Its evident objective is not just to discredit them but to try to hound them out of their jobs – which would serve as a clear deterrent to other academics who might be tempted to voice public dissent.
This objective is openly avowed by some journalists, and certain academics appear to endorse it. One academic quoted in one of Chris York’s many hit pieces attacking Working Group members – this one published at the time the Working Group was exposing the ‘Integrity Initiative’ and its use of covert clusters of journalists and academics ready to spring into action against anyone deviating from the official line in favour of militarism and war – is Lydia Wilson, referred to as ‘an Oxford and Cambridge research fellow’. Wilson describes the views of working group founding member Piers Robinson as ‘dangerous to students – he’s working in a journalism department and he can’t analyse journalism sources.’ She said it was ‘ridiculous that Piers Robinson is teaching propaganda’, that this ‘raises serious questions for the University of Sheffield’, and that the ‘most troubling thing for me is how did he get this job? It’s not hard to uncover this man.’ Looking at Wilson’s publications profile, it is not clear why she is deemed to have the academic competence to criticise Piers Robinson. One might do better looking elsewhere for York’s reasons for consulting her. As it happens, she had previously been a BBC journalist and a contributor to a consortium headed by the UK-funded stratcom company ARK, whose projects included founding the White Helmets, the Free Syrian Police and CIJA. ARK’s director, Alistair Harris, can be heard chatting cosily with Chloe Hadjimatheou in one of her podcasts, explaining that if he were involved with MI6 he would not be allowed to say so. In brief, Wilson is connected to a certain discernible network.
Another familiar name in this context is Nader Hashemi, director of the Center for Middle East Studies at the University of Denver, who also supplies quotes for York’s article: ‘Piers Robinson and his friends have no interest in truth or justice’; ‘The administrators of the university that he teaches at have to be presented with this evidence’; ‘Someone who’s supposed to be objective and teaching propaganda is himself a propagandist.’ Hashemi was later called on by the BBC to provide a supposedly independent evaluation of my own academic work for their File on 4 programme. In the broadcast, the BBC toned down what he actually told them, which Chloe Hadjimatheou had put to me in my right of reply. Hashemi had been prepared to be publicly quoted as accusing me of attempting to ‘justify the horrors of Assad regime in Syria’, to ‘exonerate the Assad regime’, to support ‘totalitarian, rapacious policies’, and even of ‘genocide denial’. These accusations are so defamatory that even the BBC did not run with them.
A central reason for the persistent press attacks on WGSPM members, as made clear also by the BBC, is the group’s work of epistemic diligence on OPCW reporting of the 2018 Douma incident, which has raised very serious questions about the effective independence of the organisation. There is, in fact, now a solid body of academic literature that recognises the validity of the concerns raised by the working group, concerns which have been considerably elaborated by the OPCW’s own scientists who have spoken out (e.g. Al-Kassimi 2021; Beal 2020; Boyd-Barrett 2019, 2021; Cole 2022; de Beer and Tladi 2019; de Lint 2021; Diesen 2022; Gray 2019; Kleczkowska 2020; Olsen 2019; Portela and Moret 2020; Robinson 2019, 2022; Simons 2019; Tomić 2020; van der Pijl 2020; Yue and Zhu 2020).
The case could well be regarded as a paradigm example of how due epistemic diligence can be contrasted with ‘counter disinformation’ strategies that fail to do it. A key element of contrast can be captured by applying the test suggested by the philosopher Imre Lakatos for differentiating between scientific and pseudoscientific approaches to knowledge claims. This is to ask whether the approach generates novel hypotheses that come to be confirmed. As I have shown before (Hayward 2021), the working group’s analysis has done this. What we find within the counter disinformation community, by contrast, is no novel prediction but mainly just the endless repetition of ad hominem slurs. What is surprising is that even as attacks continue from sections of the media, they still succeed in enlisting the support of individuals with academic affiliations. So, for instance, we find yet another HuffPost article by Chris York, of 29 January 2020, aiming to discredit the Working Group, this time in the wake of leaked material that confirmed the Group’s grounds for concern about the OPCW report on Douma, for it showed that OPCW’s own inspectors had accused their managers of fraudulently editing it. York quotes University of Maryland research associate Caroline Orr, introduced as a behavioural scientist who specialises in disinformation networks on social media. She refers to the group’s work on this case as ‘the latest in a long line of misleading and defamatory claims from the “working group”.’ Without indicating any specific claim which might be misleading, let alone defamatory, she continues: ‘It’s done for the purpose of creating confusion …, to make people doubt the facts and, ultimately, to bury evidence of some of the most heinous atrocities and war crimes in modern history. If you look at the tactics they use, they’re straight out of a propaganda handbook.’ Another academic quoted, Kate Starbird, billed as ‘an expert in online disinformation networks at the University of Washington’, whose research on ‘disinformation’ has been funded by the US Navy, similarly seems to feel absolved from having to address any actual issues in the WGSPM briefing: ‘It has all the elements of “throw as many different conspiracy theories at the wall until we find one that sticks”.’ This trope is commonly invoked by certain members of the counter disinformation community to direct attention away from the need for actual assessment of the evidence.
The avoidance of evidence in Starbird’s work is something I have commented on before, and the fate of that earlier article is worth relating. My central argument there was, as here, that it is mistaken to suppose one can counter disinformation without doing any epistemic diligence to identify wherein it lies. The Harvard-based journal that published the piece I responded to, Misinformation Review, initially accepted my article for publication. Subsequently, however, referring to the challenging nature of my critique, the editors decided it should be published under the rubric of a letter to the editor so as to allow a right of reply to the authors of the article criticised. Then, three weeks after my critique was sent out to Starbird and her colleague, I was simply informed that ‘we are unable to publish letters on our site at this time.’
Another article attacking us, to which we were not permitted a reply, also attracted the backing of certain academics. It appeared in the magazine Index on Censorship, which is published by SAGE with much of the appearance of an academic journal, although, on inspection, it does not appear to operate a peer-review system and provides no formal submission guidelines. It purports to be a serious publication with an important aim: ‘Index on Censorship is a nonprofit that campaigns for and defends free expression worldwide.’ Ironically, the article in question was arguing that members of WGSPM should not be enjoying freedom of expression. The author, Nerma Jelacic, communications director of CIJA (a private organisation whose full name is the Commission for International Justice and Accountability), another of ARK’s spin-offs, was enraged that WGSPM had published articles critical of the organisation. She did not make any specific rebuttal of anything appearing in those articles, but simply hurled generic slurs like ‘unscientific behaviour’, ‘fringe conspiracists’, ‘disinformation’, ‘lies’, and ‘pseudoscientific revisionism’. Yet despite the lack of substance in the article, it attracted likes and retweets on Twitter from academics – not only Scott Lucas and Idrees Ahmad, but also Ben Gidley (London), Kelly Grotke (Cornell), and Melinda Rankin (Queensland). Jelacic also published a Twitter thread with allegations about me ‘spewing pro-Putin/Assad conspiracies’, and even this evidence-free rant found favour with Andrea Sella (London) and Jeff Kenner (Nottingham).
Although WGSPM members were permitted no right of reply to the Index article, Peter Hitchens managed to secure one, and he wrote: ‘the tone of Jelacic’s article was that of a prosecutor at a show trial, denouncing the accused to a tribunal whose verdict was not in doubt. … Abuse of this kind is not debate.’ Unlike Jelacic, Hitchens attends to the substance of OPCW whistleblowers’ claims. He reports that he ‘read the OPCW reports with care’ and made direct contact with OPCW inspectors who showed him ‘that something disturbing was indeed happening.’ He adds:
‘These decent, honest, non-political people have an important story to tell. … They are beyond doubt intimidated by the sort of language Jelacic and others employ in this controversy. They are, in a way, censored. I have little doubt that the failure of many reporters to examine what they say impartially and properly is a direct consequence of the hurricane of abuse (“War crimes denier!” “Assad apologist!” “Tool of the Kremlin!”) which is hurled at anyone who dissents from the official line. Had these methods of control been as well-developed in 2003 as they are now, we might never have known that Hussein had no WMDs. You can believe that this form of discourse is a permissible weapon. Or you can believe that censorship is wrong. But you cannot believe both.’
Hitchens’ intervention here goes to show that journalists with integrity – whatever their politics, and Hitchens declares himself no political ally of Grayzone or WGSPM members – can have something to teach academics about epistemic diligence. His words here touch on a motivation that transcends political divisions: those of us who endorse the OPCW whistleblowers’ appeal to be heard – and who support such whistleblowers in any organisation created to serve society or the international community, especially against the avoidable evils of war – believe that the only way to fight for the truth is by allowing open discussion of evidence and argument. From this perspective, counter disinformation warriors appear so confused about their own declared objective that they lean towards supporting censorship instead.
Conclusion
It is hard to know how serious a problem disinformation really is. Little research has helpfully addressed this question. We are, nevertheless, increasingly deluged with messaging from governments and corporate media about how disinformation is a grave threat to security, democracy and the public good. Ever stronger measures for dealing with it are being proposed and implemented – yet these are applied without regard to the need for the open expression of reasonable disagreement that should be at the heart of public debate as well as of science and scholarship. We are moving into a situation where dissent from officially sanctioned opinion can be pathologised and criminalised. Pressures for increasing censorship and crackdowns on dissenting opinion are fast becoming normalised.
To find some academics being drawn into this disturbing process, when they are precisely the community that should be most carefully assessing the real situation, is profoundly worrying. My purpose in publishing this feedback is to urge academics more generally – and universities as institutions of higher learning – to stand firm in support of those fundamental principles of science and scholarship that require dissenting opinion to be openly discussed and not stigmatised or suppressed. The public should be able to rely on universities to help identify the reliable information that is required for dealing with deceptive or misleading claims that may be in public circulation.