Disinformation, or Debating with a Bot

Richard Sandbrook is professor emeritus of Political Science at the University of Toronto and President of Science for Peace

Contributed article for the Working Group on Nonviolent Resistance

Image: via Wikimedia by Maqa001

“Disinformation” undoubtedly exists as a form of warfare in this contentious age of artificial intelligence. But how do we know disinformation when we come across it? The obvious danger is that officials and activists will dismiss strongly opposing views as disinformation, not to be taken seriously, or, at worst, as potential sedition to be investigated.

An article in the March 30th (2023) Globe & Mail exemplifies this danger. Entitled “Pro-Kremlin Twitter Accounts ‘Weaponizing’ Users to Erode Canadians’ Support for Ukraine, Study Finds,” the article suggests that 200,000 Twitter accounts have been established in Canada that propagate the Kremlin’s line on Ukraine. The purported aim of this campaign is to undermine Canadian support for the Ukrainian government. Three centres, two at universities, conducted the study, which was supported financially by the Canadian and United States governments. Allegedly, the disinformation campaign succeeded to the extent that both “far right” and “far left” Twitter accounts extensively “shared” the disinformation on their own networks.

What is the evidence for the impact of this alleged disinformation campaign? The article notes that 36 per cent of respondents in a recent Canadian survey believed that NATO was responsible for the war, or were unsure. Clearly, the writers of the report (entitled “Enemy of My Enemy”) believe that NATO holds no responsibility for the war. To uphold the opposite view is, by their logic, tantamount to spreading disinformation or being uninformed.

And yet the idea that NATO provoked, or at least did not act to prevent, the war is not far-fetched or merely Russian propaganda. Neither side is blameless. NATO portrays itself as a strictly defensive club of democracies that share basic values and are compelled to admit other states that subscribe to them. Yet all military alliances need an enemy to justify their existence. The external threat that gave rise to the alliance, the USSR and the Warsaw Pact, disappeared in 1991; arguably, NATO should have disappeared with it. NATO lost an opportunity in the 1990s, when the Soviet Union expired, to build bridges to Russia. Instead, NATO’s expansion to the borders of Russia and its willingness to arm Ukraine and contemplate its future membership were provocations. Russia considers Ukraine as within its cultural and political sphere of influence; Russian fears of NATO’s intentions are thus not unreasonable. However, Russia for its part has made aggressive moves in Georgia, Moldova and elsewhere that have rekindled the fears of former client states of the Soviet Union. The brutality of its invasion of Ukraine is unjustifiable. Neither side is without some responsibility for this dreadful and unnecessary war.

Consequently, debates are raging in academic and activist circles over apportioning responsibility for the conflict. Moreover, the left (contrary to the report’s view of a homogeneity of views) is split in its interpretation of the war, leading to many acrimonious exchanges.

It might be argued that the report itself is an example of disinformation. The article that describes the report provides no criteria for distinguishing the Twitter accounts engaged in disinformation, other than that the views they propound cohere with themes of the Russian narrative. The report veers toward the Red Scare techniques of the Cold War. But let’s be clear: you can advocate the view that NATO shares part of the blame for the war without in any way being part of Russia’s disinformation campaign.

Having said this much, I must also state that disinformation does exist. I know because I have debated with a bot. On a Canada-wide network of peace activists, I have been critical of those who jump to the conclusion that the war in Ukraine is a NATO war. Bots entered the intellectual fray, but one of them was very poorly programmed. It acted like a parody of a bot: in one message, claiming to speak on behalf of the ostensibly Western-oppressed “Third World” (this archaic term itself being a tip-off); in the next, posing as an outraged representative of the people of the Donbass, allegedly brutalized by the “Nazi” government in Kyiv. Its messages were over-the-top renditions of the most extreme Russian disinformation and bombast. It was a stupid bot. Bots like this give artificial intelligence a bad name.

The global situation is complex. Yes, there is disinformation undertaken by Russian agencies. But let us be careful not to equate disinformation with any ideas that conflict with the dominant Western narrative. If we allow that to happen, we will find ourselves returning to the era of US Senator Joseph McCarthy in the 1950s, when congressional “un-American activities” investigations rooted out supposed fellow travellers of the Russians.

Those in Canada who dissent on the war are not “un-Canadian,” and rarely are they engaging in disinformation. We may disagree with their positions. Yet that disagreement is a healthy aspect of our democracy. And yes, disinformation and bots are real; but some of the bots are laughably stupid.
