In many countries over the past few years, the political process – and social cohesion – have been threatened by various forms of disinformation, sometimes misleadingly and inadequately called “fake news”. Politically-motivated and for-profit disinformation is blamed, among other things, for the UK’s decision to vote to leave the EU and the election of Donald Trump as US president.
Disinformation takes many forms and is driven by many factors. Foreign states sometimes try to subvert other countries’ political processes. People publish false and fabricated information masquerading as news for profit. Domestic politicians lie to their own people – and sometimes these lies are amplified by news media, by hyper-partisan activists, or spread far and wide via social media and other platforms.
These different problems are serious – and many have called on public authorities to tackle them. The question is how? Only a small part of what we encounter online can be clearly demonstrated to be true or false, and much of what ordinary people think of as “fake news” is simply poor journalism or partisan political debate. In diverse societies, where we disagree deeply about many important issues, disinformation is hard to define clearly and objectively. As a result, government responses are difficult to target precisely.
Despite this, some are reaching for content regulation – trying to ban “fake news”. Others are tasking law enforcement – or even the military and the security services – with combating disinformation. These are “hard power” responses – based on the state’s ability to command, its ability to act directly. They are also often problematic responses, especially when the target remains unclear.
Content regulation of material that – while perhaps problematic and uncomfortable – is often part of political debate smacks of censorship and is at odds with freedom of expression. Asking the executive branch to directly police acceptable speech is in direct tension with citizens’ fundamental right to receive and impart information and views without interference from public authorities. To demand that technology companies police speech on their platforms without clearly defining how exactly they are supposed to do so – and who citizens can appeal to – is simply privatising the problem.
With many of these responses, the risk is that the cure may be worse than the disease.
Power: hard and soft
Luckily, the alternative to “hard power” responses is not to do nothing – even in the US, few believe that the market alone will solve the problem. Clearly we should act to protect our open societies and permissive and plural media environments against those who want to abuse and undermine them. The alternative to crude hard power responses is a soft power approach.
The term “soft power” was coined by the American international relations scholar Joseph S Nye to capture kinds of power that aim at creating a situation in which a range of different actors cooperate in addressing a problem, often through multilateral action. It stands in contrast to older forms of “hard power”, applied more directly and often unilaterally.
In foreign affairs, soft power is building a coalition to stop Iran from developing nuclear weapons. Boring and complex, yes, but so far successful. Hard power is the Iraq invasion. More dramatic and immediately gratifying for those who strongly believe “something must be done”. But the collateral damage is much higher, and success no more certain.
Hard power forces actors to do (or not do) specific things. Soft power rewards them for constructive collaboration. As Nye has pointed out, in an ever more complex world characterised by greater and greater interdependence, soft power is increasingly central to how we approach the most important problems of our time: climate change, migration, nuclear proliferation.
Today, Europe has a chance to show that soft power also provides an effective response to disinformation. Trying to define – and ban – “disinformation” would be problematic. A better approach by far is for the European Commission and EU member states to encourage and support collaboration among the different stakeholders who are all challenged by different disinformation problems. This should start from a joint commitment to freedom of expression and the right to receive and impart diverse information and views.
If civil society organisations, news media, researchers and technology companies work together, we can increase resilience to disinformation by investing in media and information literacy, increasing the supply of credible information, better understanding the threats at hand, limiting the dissemination of harmful information online, and helping people find quality news.
[Image: European commissioner for digital economy and society Mariya Gabriel and Madeleine de Cock Buning from Utrecht University. EPA-EFE/Olivier Hoslet]
Meanwhile, the role of governments and institutions such as the European Commission in such a soft power approach should be to encourage and support collaboration to counter disinformation and increase resilience – not to try to use hard power to directly crack down on a poorly defined and perhaps necessarily unclear problem.
Like many other soft power strategies, this sounds complex and does not generate headlines the way unilateral actions do – such as the US Congress’ commitment of US$120m to combat Russian propaganda, or public authorities doing their own fact-checking.
For a soft power approach to disinformation to work it is critical that all stakeholders do in fact work together – and that public authorities primarily focus on rewarding such collaboration. This is precisely the kind of approach that the recently published EC report on disinformation calls for.
If it fails, cruder responses may be the only ones left. But let’s hope not.
This opinion piece originally appeared in The Conversation on 13 March 2018
This opinion piece reflects the views of the author, and does not necessarily reflect the position of the Oxford Martin School or the University of Oxford. Any errors or omissions are those of the author.