
Defeating disinformation: how to create a healthier national conversation

By Matt Ross on 11/02/2024 | Updated on 12/02/2024
Photo courtesy Produtora Midtrack via Pexels

Bad actors are poisoning the information well from which populations drink in the democratic world. In the final part of our investigation into overseas interference in elections, Matt Ross asks how those waters can be cleansed – and how electorates can be helped to spot tainted drinks

Over recent years, democratic nations have been subjected to a barrage of influence operations targeting their elections and, more broadly, their national conversations. In the first three sections of this five-part report, we profiled the problem; in the fourth, we explored how public servants can improve monitoring, intelligence and transparency to respond. In this fifth and final report, we’ll consider what Michael Wernick, former secretary to the cabinet of Canada, calls “the most urgent task over the next two or three years”: that of improving “the information space within which our democracy operates”.

In recent years, says Wernick, social media has “devastated the economics of conventional journalism. There’s smouldering ruins left of the old legacy media companies”. With an ever-growing proportion of the population getting their news from social media, he believes, the tech giants’ claim that they’re not responsible for the material they distribute is becoming “a less and less tenable argument as the 2020s unfold”.

Meanwhile, the threat posed by online disinformation is becoming ever greater, points out security expert Elisabeth Braw, a senior associate fellow at the European Leadership Network and a member of the UK’s National Preparedness Commission. “We face the onslaught of AI-fuelled words, images, videos, audio,” she says: if the use of AI in social media isn’t controlled, “then we’ll have total informational chaos, and people will absolutely lose faith in the public debate because they’ll have no way of verifying information”. In her view, the social media firms have been “slow and inattentive” in clamping down on the disinformation transmitted via their platforms. “I think now legislators have a right to say it just hasn’t worked,” she argues. “They have to be treated as publishers and not just as facilitators.”

Legislating for regulating

Nina Jankowicz

Nina Jankowicz, vice president of the Centre for Information Resilience and former executive director of the US government’s Disinformation Governance Board, instinctively resists the idea of regulating social media. Nonetheless, she warns that over the last year or two we’ve seen “a rollback of a lot of the self-imposed regulation” by the platforms: YouTube has reversed its policy on disinformation, for example, while Twitter has abandoned its user verification process and slashed its content moderation operation. “So I think it’s time for some more direct oversight,” she concludes, suggesting “some sort of oversight commission that looks at the content moderation decisions the platforms are making” – perhaps checking decision-making for consistency, and releasing data on performance.

Jankowicz will be “watching closely” as the UK implements its new Online Safety Act, she adds: this focuses on protecting children from harmful content and preventing people from sharing intimate images of others without consent. Meanwhile, in Australia the Labor government has picked up its predecessor’s Combatting Misinformation and Disinformation Bill, which will give its communications regulator the power to impose a code of conduct on social media platforms that consistently carry harmful disinformation.

Read more: Australia’s media regulator set for new powers to tackle internet misinformation

Like the UK Act, the EU’s Digital Services Act pays careful attention to protecting minors, while requiring the tech giants to combat electoral interference and giving users more control over what they see online. The Act gives social media firms an “obligation to deal with the risks that exist on their own platforms,” says Lutz Güllner, head of strategic communications (foreign information manipulation and interference) at the EU’s European External Action Service. Meanwhile, his team operates the EUvsDisinfo website, which identifies and challenges disinformation campaigns – focusing in particular on Russia, and offering “a regular update on trends” in the field.

Rising awareness

The emergence of legislation on disinformation reflects a growing awareness of the danger it poses. “There’s so much attention to these issues, in the United States and elsewhere, that we’re better prepared to address these sorts of operations than we were in 2016 – when we were completely caught with our pants down,” says David Salvo, a former State Department security policy advisor and managing director of transatlantic campaign group the Alliance for Securing Democracy.

In particular, Russia’s invasion of Ukraine has cost the country many of its western sympathisers. “Ukraine has been a big wake-up call,” comments UK MP Ben Bradshaw, who is taking the UK government to the European Court of Human Rights over its failure to commission an independent investigation into Russian interference. “We’re in a slightly better place than we were, in that we’re alive to the threat.” Russia’s focus on fighting a war in Ukraine also means that “their bandwidth is a little bit stunted,” adds Salvo – blunting the country’s disinformation operations.

So legislators are acting to combat disinformation; but unless their populations understand how to sift the true from the false, they will always be fighting an uphill battle. People need training in “information, and how we consume it,” argues Jankowicz, calling for more public education in electoral processes, the operation of government, how news is produced and, crucially, how social media platforms operate. “People don’t understand how algorithms work,” she says. “They don’t understand that your news feed isn’t chronological; that you’re being fed certain content based on what’s going to be most engaging for you.”
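As an illustrative aside: the difference Jankowicz describes can be sketched in a few lines of Python. This is a toy model only – not any platform’s actual system – and the predicted_engagement field is a hypothetical stand-in for the far richer behavioural signals real recommender systems use.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Post:
        author: str
        text: str
        posted_at: datetime
        predicted_engagement: float  # hypothetical: the platform's guess at how likely you are to react

    def chronological_feed(posts: list[Post]) -> list[Post]:
        # The ordering many users assume they are seeing: newest first.
        return sorted(posts, key=lambda p: p.posted_at, reverse=True)

    def engagement_ranked_feed(posts: list[Post]) -> list[Post]:
        # What engagement-driven platforms broadly do: surface whatever
        # is predicted to hold your attention, regardless of when it was posted.
        return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

The point of the toy is simply that two feeds built from the same posts can look entirely different – and only one of them is ordered by time.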

Public education

Elisabeth Braw agrees. “This can’t just be something for the government,” she says. “If we are to be more resilient to disinformation and misinformation, we have to have public awareness campaigns where people are taught how to verify information. And this isn’t just about children; it’s about training people above the age of 18. Most of us have no idea how to verify information; how can we possibly have an informed debate?”

This kind of work has shown its worth in countries such as Finland, Sweden and Estonia. “It does pay off,” says Jankowicz, urging civil servants to support it. “It’s a hard sell to politicians, because they can’t point to it at election time and say: ‘I did this!’,” she comments. “But civil servants are in this for the long run.” In her view, public education is best managed by community-based organisations rather than national governments or schools. “It shouldn’t be a guy in a suit from Westminster,” she says. “It should be libraries, civil society organisations that know their communities.”

Benjamin Fung

Benjamin Fung, professor of information studies and Canada Research Chair in data mining for cybersecurity at Montreal’s McGill University, argues that diaspora communities can also play an important role in pushing back against the narratives promoted by autocratic leaders overseas – challenging the idea, for example, that actions to combat overseas interference are themselves racist. “If more Chinese people come out and criticize the [human rights] violations done by the [Chinese Communist Party], that is the best way to fight against racism, because then the local Canadian community will understand that the Chinese community actually share the same universal values,” he says.

Put out the fire

If democratic governments can find ways to challenge disinformation, identify foreign actors operating within their national debates, rein in the social media firms and boost their own people’s ability to interrogate and test information, they’ll be well placed to combat the influence operations run by rival countries. And to some extent, the tide has already turned: since the 2016 high-water mark of Russia’s interventions, “their activities have declined,” says Braw. “There hasn’t been anything as ambitious.”

While the barrage of ‘hack and leak’ and election system cyber attacks is declining, however, this may reflect Russia’s success rather than its failure. “They realised that, at least in the States, there’s so much chaos and so much distrust already that they don’t need to overplay their hands,” comments Salvo. “I think they’ve been happy to stick to the information operation space, and just continue to pour fuel on the fire that way.”

David Salvo

That may change over the coming year, he believes: with elections scheduled in some 50 countries – including the USA, UK, India, Mexico, Russia, Pakistan and nine EU nations – the stakes are high. “Looking ahead to 2024, I think they have much more motivation to be more heavily involved,” says Salvo. “It would not surprise me to see them revert to some of the tactics they used in 2016.”

But even if – as Salvo puts it – they “stick to the information space as a vector for malign influence,” Russia’s operatives can easily create a huge amount of controversy and disruption. For the fires fuelled by Russia’s interventions are now consuming domestic politics and societies from the inside, creating a major new challenge for elected leaders and officials across the democratic world: that of calming the debate among their own politicians, campaigners and commentators, facing down the bad actors who seek to gain political advantage by stoking anger and division.

“We could hermetically seal our social media platforms against every Russian bot, but we’re not going to be fully safe until we recognise that disinformation in any form – actor agnostic – is bad for democracy,” says Nina Jankowicz. “We’re as strong as our weakest link; and right now, that link is the discord and polarisation that we see taking root and being normalised as part of politics in the 21st century.”

This is the fifth and final part of our report into the attempts by government-backed actors – particularly in Russia and China – to influence election outcomes and national debates in the democratic world, with the use of tools including disinformation campaigns, election hacking and party donations.

Read parts 1 to 4:
On Russia’s goals: Organised chaos: how Russia weaponised the culture wars
On Russia’s tools: Russia’s elections toolkit: dollars, disruption and disinformation
On China’s operations: A subtle opponent: China’s influence operations
On combating foreign interference via intelligence work and transparency rules: Knowing and showing: how intelligence and transparency can combat electoral interference

About Matt Ross

Matt is Global Government Forum's Contributing Editor, providing direction and support on topics, products and audience interests across GGF’s editorial, events and research operations. He has been a journalist and editor since 1995, beginning in motoring and travel journalism – and combining the two in a 30-month, 30-country 4x4 expedition funded by magazine photo-journalism. Between 2002 and 2008 he was Features Editor of Haymarket news magazine Regeneration & Renewal, covering urban regeneration, economic growth and community development; and from 2008 to 2014 he was the Editor of UK magazine and website Civil Service World, then Editorial Director for Public Sector – both at political publishing house Dods. He has also worked as Director of Communications at think tank the Institute for Government.
