There are many reasons to be concerned about the #OnlineSafetyBill, the latest manifestation of which has just been launched, to a mixture of fanfares and fury. The massive attacks on privacy (including an awful general monitoring requirement) and freedom of speech (most directly through the highly contentious ‘legal but harmful’ concept) are just the starting point. The likely use of the ‘duty of care’ demanded of online service providers to limit or even ban both encryption and anonymity, thereby making all of us – and children in particular – less safe and less free, is another. The political control of censorship via Ofcom is in some ways even worse – as is the near certain inability of Ofcom to do the gargantuan tasks being required of it – and that’s not even starting on the mammoth and costly bureaucratic burdens being foisted on people operating online services. Cans of worms like age verification and other digital identity issues are just waiting to be opened, without their extensive downsides even being mentioned. And that’s not all – it’s such a huge and all-encompassing bill that there are too many problems with it to cover in a blog post.
All that, however, misses the main point. Why are we even doing this? Do we even need an Online Safety Bill?
The main reasons the government seem to be doing this are based on a kind of classic misunderstanding of the internet. In my 2018 book, The Internet, Warts and All, I wrote about how the way we look at the internet overall shapes how we think it should be regulated. The net is a complex, messy and confusing place at times – it has many warts. The challenge is to see it warts and all: to look at the big picture, to see the messy reality, and approach it accordingly.
Some people don’t even see the warts, so don’t think anything needs to be done – we should leave the internet alone, let it regulate itself. Others see only the warts, and miss the big picture. That’s what lies behind the Online Safety Bill. An obsession with the warts, and a desire to eradicate them with the strongest of caustic medicine, regardless of the damage to the face itself. That’s the view of the internet as a ‘Wild West’, full only of trolls and bots, ravaged by abuse and misinformation, where no-one dares roam without their trusty six-shooter.
The thing is, it’s just not true. Almost all the time, for the vast majority of people, the internet is something they use without much problem. They work, they shop, they get their news and their entertainment, they converse and socialise. They find romance. They buy their cars and homes – not just their books and groceries. They live. The internet does have warts – and no-one should underestimate the impact of trolling or misinformation in particular (there’s a chapter on each in The Internet, Warts and All) – but neither should we forget what the internet really is.
If we see only the warts, we end up with disastrous legislation like the Online Safety Bill. If we see the warts, but treat them as warts, we have a chance to do regulation more reasonably, and not do untold damage on the way. As an example, the inclusion of cyber flashing in the bill is very welcome. It’s a wart that can be treated, and without anything in the way of negative consequences. Smaller, piecemeal legislation dealing with particular harms is a far more logical – and effective – way of dealing with the problems we have on the net than grand gestures like the Online Safety Bill, which will almost certainly do far more harm than good.
The latest manifestation of the much heralded Online Safety Bill is due to make its appearance tomorrow. It’s a massive bill, covering a wide range of topics and a huge number of issues about what happens online – and yet there’s a gaping void at its heart, a void that means that it will have almost no chance of succeeding in any of its key aims.
There are many things that should worry us about the Online Safety Bill. The vagueness of the ‘duty of care’ that it imposes on online service providers. The deliberately grey area of ‘harmful but legal’ content. Its focus on content rather than behaviour (which means it misses a massive amount of trolling, bullying and hate). The inevitable inadequacy of Ofcom as a regulator for something it knows very little about – clever trolls and others will run rings around it, and will even take joy in doing so. And, indeed, its aim – why do we want the U.K. to be the safest place to be online rather than the most creative, the most productive, even the best place to be online?
All that is vital, and most of it has been written about by people much more expert than I am in the field. That, however, is not what this piece focuses on. This is about something rather different: a blind spot at the heart of the bill. For all its focus on online harms and online safety, the bill misses how a great deal of the harms take place – because those harms come from the people behind the bill itself. It is easy to focus on evil, anonymous trolls and bots, and on hidden Russian creators of fake news – they’re convenient enemies, particularly right now – but at the heart of a great deal of harm are people very different: mainstream politicians and journalists. Blue tick accounts. The Press. The Online Safety bill says almost nothing about them, and as a result it is highly unlikely to have any kind of success, except on the periphery.
Trolling begins at home
Everyone hates trolls – indeed, the idea that the internet is full of evil trolls was one of the reasons behind the whole online harms approach – but they rarely think the whole thing through. What is generally considered to be trolling encompasses a lot of different activities – but most people’s ideas of what a troll looks like seem to be relatively consistent. Sad, angry, anonymous people – images like furious men tapping away at their keyboards in the basement of their parents’ homes are very common. There is of course some truth in this kind of image – but it’s a tiny part of the picture. Indeed, it’s very much a symptom rather than the disease itself.
Two factors are rarely discussed enough. One is the observation that many (perhaps most) trolls don’t consider themselves to be trolls. Indeed, very much the opposite: they consider their enemies to be trolls, and they themselves are either the victims of trolls or the noble warriors fighting against evil trolls. This is true not only of those debates where there is some kind of relative equality of argument or of power, but of those where to most relatively neutral observers there’s clearly a ‘good’ side and a ‘bad’ side.
The other is to ask how trolls find their victims. How they choose who to target, who to victimise, who to abuse. One of the most direct ways is through a pile-on. That is, someone points at a potential victim, saying ‘look at this idiot,’ or words to that effect, hinting that they deserve to be attacked. When the person pointing has thousands (or millions) of followers, those followers then pile on to the victim.
Who’s the troll here? The big account who just said something relatively innocent (‘look at this idiot’) or the followers who add the abuse, the racism, the misogyny, the death or rape threats? The big account stands back, claiming innocence, and pretending that the trolls had nothing to do with them. And of course those big accounts can be politicians or journalists – indeed some of the worst pile-ons are instigated by the biggest and most mainstream of accounts. MPs. Journalists from big newspapers or broadcasters.
That’s not the only kind of trolling that MPs and journalists engage in – without recognising or acknowledging that it is trolling. Indeed, the minister responsible for the Online Safety Bill, Nadine Dorries, has herself been called out for what many would describe as trolling. And yet she would vehemently deny being a troll – and believe that she is right in doing so.
The trouble is, not only are these kinds of activities by MPs and journalists actually trolling, but they’re much more dangerous trolling than that of the small, anonymous accounts that people tend to focus on. One relatively innocent tweet by someone with 100,000 followers can bring about thousands of vicious attacks. If we want to deal with the viciousness, we need to look at the big accounts, and at the structural trolling that goes on as a result. The Online Safety Bill does nothing for that at all – because it would mean both challenging the whole structure of social media and admitting the role that politicians themselves play in the online harm they claim to be dealing with.
Fakery begins at home
It’s a similar – or even worse – story with harmful misinformation. Again, the pantomime villains are Russian trolls, creating fake news in troll farms outside St Petersburg. These, of course, do exist – but again, they’re just a small part of the picture. As I’ve written before, mainstream politicians such as Jacob Rees-Mogg employ some of the same tactics and methods as those we usually think of as spreading fake news – and he’s far from alone. Fake news and other forms of misinformation do not exist in a vacuum – very much the opposite. Fake news works when it fits with people’s existing prejudices and biases, when it confirms what they already think. So, to make fake news work, you create it to fit in with those prejudices – and you twist reality to fit with those prejudices.
If this sounds familiar, it should. Fake news isn’t something new, it’s just a new manifestation of the techniques employed by politicians and (particularly tabloid) journalists ever since politics and journalism have existed. Of course neither the politicians nor the journalists would be happy to acknowledge this. ‘Spin’ sounds much better than misinformation. And yet the relationship is very close. Spin helps create a fake narrative that is every bit as damaging as actual fake news – and far harder to detect, disprove or oppose.
As with trolling, the effect of all of this is much greater if the accounts spreading it have both credibility and large numbers of followers. That means that the ones that matter are the big, blue tick accounts rather than the dodgy anonymous trolls – and, again, the structure of social media that allows information to be spread so rapidly via those big, blue tick accounts. And again, this is not the focus of the Online Safety Bill. Safer to focus on the obviously villainous than acknowledge our own role in villainy.
Who gets a free (press) pass?
One final thought. If the Online Safety Bill gets passed – and it almost certainly will – it will mean that the press is the only bit of the media that is not regulated. Broadcast media has had statutory regulation for a long time – with Ofcom as the regulator. After the Online Safety Bill, the same will be true about social media. And yet those of us with memories long enough to remember the Leveson Inquiry will remember the vehemence with which the press resisted any idea of statutory regulation of the press, as though it were an intolerable affront to free speech.
I don’t think they were necessarily wrong – but they should be clear that statutory regulation of social media is every bit as much of an affront to free speech. Indeed, in many ways a worse one – as it is the ordinary people, rather than the relatively privileged people who run the newspapers and magazines, whose free speech is being curtailed. That ought to matter.
A gaping void
As it is, the Online Safety Bill looks likely to attack the symptoms rather than the causes of online harms. Unless it finds a way to address the underlying problems – and to confront the massive blind spot it has for the role of politicians and journalists – it will be just yet another massive game of Whac-A-Mole, doomed to failure and disappointment.
That, frankly, is what I expect to happen. The bill will be passed, everyone will trumpet how we’re finally taming the Wild West, but nothing will really happen. Trolls will continue trolling – new ones replacing those who do get caught – and misinformation will continue to spread. The powerful will still be unscathed, and the hate will still spread. And a few years later we will have another go. With the same result.