The Snoopers’ Charter: we need a new consultation

The Communications Data Bill – more commonly (and fairly accurately) known as the ‘Snoopers’ Charter’ – is due to re-emerge at any moment. We have been expecting it for some time – and yet have seen nothing official, and there has been no sign of a proper public consultation on the subject. That, to me, is wholly inadequate – so I have written to the Home Office, copying in my MP, Dr Julian Huppert. The contents of the letter are below. If you feel strongly about this matter – and I hope you do – you can add to the pressure for a proper public consultation by using the Open Rights Group’s system, which can be found at:

http://www.openrightsgroup.org/campaigns/snoopers-charter-consultation

Here’s what I wrote – it is fairly long, but still only scratches the surface of what is wrong with the overall approach to surveillance put forward in this bill:

————————————————————

Dear Home Office

Re: Draft Communications Data Bill

I write to you as a legal academic, specialising in data privacy, and as a member of the public with significant concerns over the progress of the Communications Data Bill. In my opinion we need a consultation – and a public and open consultation – on the Bill for many reasons.

The media storm – and the eventual and significant criticism levelled at the bill by the Parliamentary Committee hastily convened to scrutinise it the first time around – should have made it clear to the proponents of the bill that there is a huge public interest in this area. That has a number of implications:

  1. That the criticisms levelled at the bill need to be taken seriously.
  2. That all the interested groups, including advocacy groups and academics – and indeed the public – need to be talked to up front, not after all the work has been done.
  3. That a ‘fait accompli’ is not an acceptable solution.
  4. That the level of ‘proof’ provided by those putting the bill forward needs to be much more convincing – and much more open – than what has been provided to date. It is simply not sufficient to say ‘it’s dangerous and something must be done’, or ‘we can’t tell you why, but we need to do this’.

Those of us interested in the Bill have been waiting for the consultation to begin – there have been leaks to the media at intervals suggesting that it would start soon, but so far nothing has been made official or public. That is both unfortunate and ultimately unacceptable. We need that consultation to begin soon, and in an open and public way.

A targeted rather than universal approach to surveillance

Though in my view the Parliamentary Committee did a very good job in scrutinising the bill and in reviewing the huge amount of evidence submitted to it, there are a number of areas that I do not believe were sufficiently considered.

These areas hit at the very essence of the approach adopted by the bill. The whole idea of a ‘gather everything for later scrutiny’ approach misses many of the fundamental risks attached to this kind of surveillance: the risks of data and system vulnerability, of function creep, and of system misuse. Much of the evidence submitted to the committee that scrutinised the Communications Data Bill examined these risks – but the committee did not, in my opinion, see quite how fundamentally they undermined the overall approach of the Bill. Nor, in my opinion, did they look sufficiently into a genuinely alternative approach.

That alternative is targeted rather than universal surveillance. This kind of approach can significantly reduce all of these risks. Putting both the warranting and the filtration systems before the gathering stage rather than the accessing stage would reduce the amount of data that is vulnerable, make the systems harder to misuse, and reduce the likelihood – or the impact – of function creep. It is closer to the concept at the heart of British justice: that people are innocent until proven guilty.
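
To make the distinction concrete, here is a minimal, purely illustrative sketch – hypothetical Python pseudocode, not a description of any actual or proposed system, with all names (warrant_covers, named_subjects and so on) invented for illustration. It simply shows where the warrant and filter sit in each model: in the targeted model, data minimisation happens at the point of collection; in the universal model, everything is retained and the filter only applies at the point of access.

```python
# Illustrative sketch only: where the warrant/filter sits in each model.
# All names and structures here are hypothetical.

def warrant_covers(warrant, record):
    # Hypothetical stand-in for a suspect-specific, authorised test.
    return record.get("subject") in warrant["named_subjects"]

def universal_model(all_traffic, warrant):
    # Universal approach: everything is gathered and retained first...
    retained = list(all_traffic)  # the whole population's data sits in a database
    # ...and the warrant/filter is only applied when someone later asks to access it.
    return [r for r in retained if warrant_covers(warrant, r)]

def targeted_model(all_traffic, warrant):
    # Targeted approach: the warrant/filter is applied at the gathering stage,
    # so only data about specified subjects is ever collected or stored.
    return [r for r in all_traffic if warrant_covers(warrant, r)]

if __name__ == "__main__":
    traffic = [{"subject": "suspect_a"}, {"subject": "someone_else"}]
    warrant = {"named_subjects": {"suspect_a"}}
    # Both models return the same records to the investigator...
    print(universal_model(traffic, warrant))
    # ...but only the targeted model avoids retaining everyone else's data.
    print(targeted_model(traffic, warrant))
```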

1      Risks of Data and System Vulnerability

It is a fundamental truth of computer data that wherever data exists and however it is held it can be vulnerable – to hacking, to accidental loss, to corruption, to misinterpretation, to inappropriate transfers and to many other things. By gathering all the communications data, this approach sets itself up for disaster – it is like painting metaphorical signs saying ‘hack me’ on the databases of information. If you build it, it will be hacked.

What is more, it is not only data that is vulnerable but systems too – if ‘black boxes’ are installed at ISPs, those black boxes will also have the metaphorical ‘hack me’ signs upon them. If you make a back door, the bad people as well as the good people can come in. It doesn’t matter how secure you think your system is: it can still be broken into and hacked.

The government doesn’t have an inspiring record in keeping data secure – from the Child Benefit data discs and the MOD laptops to the numerous NHS data breaches – but this is not so much a reflection of government inadequacy as of an underlying truth about data and systems. Stories of hacks and data losses are in the news almost every day – and even those with the greatest technical ability and the greatest incentives to keep data secure have been victims, from Swiss banks to pretty much every major technology company. Facebook, Apple, Microsoft and Google have all fallen victim in recent months.

Ultimately, the only data that is not vulnerable is data that doesn’t exist at all. Furthermore, the only systems that can’t be hacked are systems that don’t exist. If targeted rather than universal surveillance is used, then the vulnerability is enormously reduced.

2      Risks of Function Creep

When data is gathered, or systems are built, for a specific purpose, that purpose can very easily end up being changed. This is a phenomenon particularly common in the field of anti-terror legislation and systems. Most people are aware of RIPA having been used to investigate such things as dog fouling and fly-tipping, or of CCTV cameras ostensibly installed for crime prevention actually being used to check children’s addresses for school catchment areas, and so forth – but these are not myths, nor are they particularly atypical. This has become something close to a ‘norm’ in this field.

There often appear to be good reasons for this function creep – not many people argued when the CCTV system for the London Congestion Charge started being used for crime prevention, for example – but it is a phenomenon that needs to be acknowledged. There isn’t really a legislative way to deal with it – caveats in laws can be sidestepped, laws can be amended in moments of ‘need’ and so forth. The only way to prevent it, as for data vulnerability, is to not build the systems or gather the data in the first place.

Again, this is a strong argument against universal data gathering – data gathered specifically, in a targeted and warranted way, presents less of a risk of function creep. Similarly, specifically designed and targeted systems are less susceptible to function creep than huge, universal surveillance systems.

3      Risks of System and Data Misuse

Another phenomenon familiar to those who study this field is that systems can be and are misused – whether it is databases searched for information about people against whom the searcher has a grudge, or collusion between the authorities and the press. The Leveson Inquiry should have made it entirely clear that such risks are not mythical – and if anyone believes that either the police and other authorities or the press have completely changed as a result of the exposure of the phone and email hacking then they are being extremely naïve.

The systems and data envisaged in this plan are particularly susceptible to this kind of misuse. The description of this kind of system as a ‘Google for communications data’ is entirely apt, and anyone who uses Google regularly should understand how easily the process of searching morphs from one thing to another. Human beings will use these systems – and human beings have human weaknesses, and those weaknesses lead almost inevitably to this kind of misuse. With universal data gathering built into the system, the database would be ripe for very serious misuse indeed.

Again, there is only one real way to deal with the possibility of system and data misuse – minimise the size and scope of the system and the amount of data involved. That, again, suggests that we need targeted rather than universal surveillance.

The Normalisation of Surveillance – and the Panopticon Chill

These are just a few of the problems that a system like this could bring about. There are many more – and I have written before about how this kind of surveillance impacts not only on privacy but on a whole array of human rights. By suggesting that universal surveillance is something that we should consider normal and acceptable we are ‘normalising’ it – and that has a whole set of implications.

Jeremy Bentham’s concept of the Panopticon, one with which I am sure you are familiar, is based on the idea that if people know that they are being observed, they modify their behaviour. He envisaged it for a prison – to help control the behaviour of potentially violent and dangerous prisoners by putting them in a position where they know that at any time they might be observed. In effect, this is what this kind of system does – it tells everyone that whatever they do on the internet can and will be observed.

What will that mean? Well, it creates a kind of ‘chilling effect’ – what I would call the ‘Panopticon Chill’. It means people will be less free with their speech – and feel less free about where they go and what they do online. That impacts upon many of our recognised human rights: freedom of expression, freedom of association and freedom of assembly to start with.

There are some who would welcome this kind of Panopticon Chill – some who think that we need to control the internet more. That, however, is to treat us as though we are prisoners in an enormous online jail. Bentham’s idea was not for ordinary people, but for potentially violent and dangerous prisoners. Is that how we all want to be treated? Is that the kind of society that we want to live in?

What kind of society do we want to live in?

That is the bottom line for the Communications Data Bill. Putting this kind of bill into place would be to set up precisely the kind of surveillance society that Orwell warned us of in 1984. Is that what we want to do?

There are other huge questions – not least the question of whether it will work at all. As the huge quantity of evidence submitted in the initial consultation revealed, few real experts believe that it will – anyone with expertise will be able to sidestep the system, leaving the rest of us to suffer the downsides of this kind of surveillance without the upsides even existing.

What is more, by promoting such a system in this country we would not only give permission for more oppressive regimes to do the same (and leaders from Putin in Russia to Assad in Syria will be watching us with eagle eyes) but we would also be kick-starting the surveillance technology industry. Whichever companies win the contracts to supply the technology to enable the Bill will be looking to exploit that technology – who will they sell their systems to? What will be the next systems to which they apply their learning and experience? The £1.8 billion (and probably more) that the UK government spends on this will bring benefits to dictators and oppressors worldwide in the coming decades.

A new draft of the Communications Data Bill?

As I understand it, only people close to the Home Office have yet seen how the Communications Data Bill will look in its new draft. I have been told that I won’t be upset when I see it – but without more information it is hard for me to be optimistic. Unless there is a fundamental change – most importantly a shift from the universal to the targeted approach, and an application of warrants and filters at the gathering rather than the accessing stage – it is hard to imagine that it will be something I can welcome, or something that will inspire public trust.

A bill based on targeted rather than universal surveillance is possible. If it is brought about I believe it could not only be more compatible with human rights, send better messages to the rest of the world and be more cost effective – but it could also be more effective and less wasteful of the scarce resources of the police and intelligence services.

It does, however, require a big shift in attitudes. I hope that shift in attitudes is possible – and, at the very least, that we can have this debate in public, on reasonable and sensible terms, and without the feeling that we are being railroaded into a particular solution without any real options being presented.

A proper consultation

That, ultimately, is why we need a proper consultation. I have a perspective to present – and very particular views to put forward. I believe they are worthy of consideration – and I am very much open to discussing them. That discussion has not yet taken place. We need it to happen – and we have a right to have it happen. A proper consultation should happen now – at the drafting stage – not after everything has already been set in stone.

One key point here is that the public has not been properly informed in this debate. Initially, it appeared as though the government wanted to get this bill passed so quickly that there wouldn’t even be time for Parliamentary scrutiny – it was only when the issue caused outrage that the committee was set up, and even then the consultation period was very brief, and at a time when many academics in particular were not in a position to submit evidence. We need more time – this is a crucial issue, and the public needs to have confidence that an appropriate decision is being made. The debate was also characterised at times by language that should have no place in a serious discussion, with opponents of the bill being accused of having blood on their hands.

This time around, we need proper consultation, with sufficient time and sufficient opportunity for all stakeholders to have their say. All options need to be considered, and in a way that both encourages and supports public participation, and develops a greater level of trust.

I am copying this letter to my MP, Dr Julian Huppert, who was on the Parliamentary Committee that scrutinised the bill, and to the Open Rights Group as well as making it public on my blog.

Kind regards

Dr Paul Bernal

Lecturer in Information Technology, Intellectual Property and Media Law
UEA Law School
University of East Anglia
Norwich NR4 7TJ
Email: paul.bernal@uea.ac.uk
