I thank the grand committee for the opportunity to attend. I will focus my comments on electoral interference and large-scale disinformation operations, because these are the subjects I study on a daily basis.
This is a vast and fast-moving problem set. According to the Oxford Internet Institute, 70 countries are now reported to be running organised social media information operations, up from 48 last year. We do not have enough data to prove whether this stems from a rise in operations, a rise in reporting or both. Either way, it indicates a global phenomenon. Most of these operations are aimed at domestic audiences, but we must remember that the Russian operation that targeted the US from 2014 onwards also started out by targeting the domestic opposition.
The evidence suggests that a state that has the capability to run domestic information operations can quickly pivot to external targets if the political need is there. Russia did so in 2014. Saudi Arabia did so after the murder of Jamal Khashoggi. China did so when the Hong Kong protests began. Nor is this limited to state actors. We saw the far right in Britain and America trying to interfere in the French presidential election in 2017. These operations do not solely play out on social media platforms. They also include websites and television stations. They can include on-the-ground events and local activists, some of whom are unaware of the role they are playing.
All of these problems are underpinned by a perception that online information operations are easy, cheap, effective and profitable. Since 2016, the narrative has emerged that Russia managed to tip the balance in the US election by running social media trolls. That narrative is significantly flawed, but it has caught on.
Unscrupulous marketing companies around the world are promising to "change reality" for their political clients through social media campaigns. Fake amplification on social media is very cheap: one can buy a three-year-old YouTube channel, with videos already uploaded, for just $1. Domestic actors on both sides in the US have experimented with Russia's playbook.
However, we also know that the environment in 2019 is much less permissive than it was in 2016. The platforms, law enforcement and open source researchers are all actively hunting influence operations online. The rate of takedowns has accelerated dramatically since early 2018. Over the past 12 months, we have seen more than 50 takedowns just from Facebook, covering operations from some two dozen countries. That has forced interference operations to sacrifice engagement to stay concealed.
In this environment, I bring four urgent needs to the committee's attention. These are not the only four, but they are the areas where parliamentary work can have the most immediate impact. First, and of most direct relevance to elections, parliaments and political campaigns must urgently improve their own cybersecurity to prevent the sort of hack-and-leak operations that Russia used to such devastating effect in 2016. This is not a duty that can be passed on to the social media platforms. Every parliament and every campaign should ensure that all its staff have cyber training and that contingency plans are in place. This is expensive, and many campaigns will argue that the money would be better spent on ads, but it is far less costly than seeing their emails leaked to the press a week before the election.
Second, we do not yet have a deterrence system in place. We have seen individual nations react to interference attempts, but we do not have a credible system for imposing unacceptable costs on foreign actors who attempt to interfere in elections.
Third, we need legislation that imposes systematic costs on the commercial operators who sell fake accounts or hire out their interference campaigns. Two weeks ago, we saw the first case of the Federal Trade Commission fining a US company for selling fake engagement. Social media companies can, and do, ban such operators from their platforms, but they cannot impose a direct financial cost. The black market in fake accounts is the easiest place for hostile actors to buy their assets, as China demonstrated over the Hong Kong protests.
Fourth, parliaments should lead discussions on how to reduce polarisation online, both through regulation and through education. This is a long-term challenge, but we should always remember that if we did not have domestic trolls, the foreign trolls would have no one to pretend to be. Such discussions will require technical analyses of how the platforms' algorithms suggest friends and content to users, but they will also require social analysis of how users select their online identities and tribes, and how they can be encouraged to broaden them. Every human has a complex pyramid of identities in real life, but the dynamics of social media often reduce that to one-dimensional tribalism. If our parliaments can work across party lines and lead the debate on how to reverse the spiral of ever narrower partisan groups online, that would mark a step towards reducing the scope for both online polarisation and online interference.