Guest post: Data Retention: I can’t believe it’s not lawful, can you? A response to Anthony Speaight QC

Guest post by Matthew White

Introduction:

Ladies and gentlemen, Bagginses and Boffins. Tooks and Brandybucks. Grubbs! Chubbs! Hornblowers! Bolgers! Bracegirdles! Proudfoots. Put your butter away for I am about to respond, rebut, rebuke and more to a recent blog post for Judicial Power Project, by Anthony Speaight QC on data retention.

Blanket data retention is unlawful, please deal with it:

Speaight starts off by referring to the recent Court of Appeal (CoA) judgment in Tom Watson and Others v Secretary of State for the Home Department [2018] EWCA Civ 70 and how the Court of Justice of the European Union (CJEU) has created problems and uncertainties with regards to data retention. As David Allen Green would say, ‘Well…’ Well, just to be clear, the position of the CJEU on blanket indiscriminate data retention is crystal clear. It. Is. Unlawful. It just happens that the CoA took the position of sticking their fingers in their ears and pretending that the CJEU’s ruling doesn’t apply to UK law, because it’s somehow (it’s not) different.

Just billing data is retained? Oh really?

Next, Speaight recaps the data retention saga so far, noting that telecommunications companies have always recorded who uses their services, when and where, often for billing purposes. A long time ago, in a galaxy far, far away (a few years ago, and anywhere with an internet connection) this position was a robust one. But the European Commission (Commission) in 2011 highlighted that:

[T]rends in business models and service offerings, such as the growth in flat rate tariffs, pre-paid and free electronic communications services, meant that operators gradually stopped storing traffic and location data for billing purposes thus reducing the availability of such data for criminal justice and law enforcement purposes.

So, in a nutshell, data for billing purposes are on the decrease. This would explain why the Data Retention Directive (DRD) (discussed more below) affected:

[P]roviders of electronic communication services by requiring such providers to retain large amounts of traffic and location data, instead of retaining only data necessary for billing purposes; this shift in priority results in an increase in costs to retain and secure the data.

So, it’s simply untrue to refer to just billing data when talking about data retention, because this isn’t the only data that is or has ever been sought.

It’s the Islamists’ fault that we have data retention:

Speaight next points out that it was the advent of Islamist international terrorism that made it advantageous to place data retention obligations on companies. Oh really? Are we going down this route? Well….. demands for data retention can be traced back to the ‘International Law Enforcement and Telecommunications Seminars’ (ILETS) (6), and in its 1999 report it was realised that Directive 97/66/EC (the old ePrivacy Directive), which made retention of communications data possible only for billing purposes, was a problem. The report sought to ‘consider options for improving the retention of data by Communication Service Providers.’ Improve? Ha. Notice how 1999 was before 9/11? Funny that.

It doesn’t stop there though. A year later (still before 9/11), the UK’s National Criminal Intelligence Service (NCIS) made a submission (on behalf of MI5, MI6, GCHQ etc.) to the Home Office on data retention laws. They ironically argued that a targeted approach would be a greater infringement on personal privacy (para 3.1.5). Of course, they didn’t say how or why this was the case, because, reasons. Charles Clarke, the then junior Home Office Minister, and Patricia Hewitt, an ‘E-Minister’, both claimed that such proposals would never happen (Judith Rauhofer, ‘Just Because You’re Paranoid, Doesn’t Mean They’re Not After You: Legislative Developments in Relation to the Retention of Communications Data’ (2006) SCRIPTed 3, 228; Patricia Hewitt and Charles Clarke, Joint letter to Independent on Sunday, 28 Jan 2000) and should not be implemented (Trade and Industry Committee, UK Online Reviewed: the First Annual Report of the E-Minister and E-Envoy Report (HC 66 1999-2000), Q93).

Guess what? A year later, Part 11 of the Anti-terrorism, Crime and Security Act 2001 (ATCSA 2001) came into force three months after 9/11 (Judith Rauhofer, 331). The Earl of Northesk, however, pointed out that ‘there is no evidence whatever that a lack of data retained has proved an impediment to the investigation of the atrocities’ on 9/11 (HL Deb 4 Dec 2001 vol 629 col. 808-9). What this demonstrates is that data retention was always on the cards, even when its utility wasn’t proven; indeed, the then Prime Minister, Tony Blair, noted that ‘all the surveillance in the world’ could not have prevented the 7/7 bombings. It’s just that, as Roger Clarke succinctly puts it:

“[M]ost critical driver of change, however, has been the dominance of national security extremism since the 2001 terrorist attacks in the USA, and the preparedness of parliaments in many countries to grant law enforcement agencies any request that they can somehow link to the idea of counter-terrorism.” (Roger Clarke, ‘Data retention as mass surveillance: the need for an evaluative framework’ (2015) International Data Privacy Law 5:2 121, 122).

Islamic terrorism was just fresh justification (7, 9) for something that was long planned: ‘the EU governments always intended to introduce an EC law to bind all member states to adopt data retention.’ Mandatory data retention was championed by the UK during its Presidency of the European Council (Council) (9) (and yes, that includes the ‘no data retention from us’ Charles Clarke (who was accused of threatening the European Parliament to agree to data retention (9))) and was described as a master class in diplomacy and political manoeuvring (Judith Rauhofer, 341) (and they say it’s the EU that tells us what to do!!). Politicians goin’ politicate. Yes, the DRD makes reference to the Madrid bombings, but the DRD was not limited to combating terrorism (6), just as the reasons for accessing communications data in UK law under s.22 of the Regulation of Investigatory Powers Act 2000 (RIPA 2000) were not solely based on fighting terrorism. There is nothing wrong with saying that data retention (yeah, but not blanket, of course) and access to said data can be important in the fight against Islamist terrorism, but would you please stop pretending that was the basis on which data retention was sought?

Data retention was smooth like rocks:

Next, Speaight points to the ‘smooth operation’ of the data retention system. Smooth how, and in what ways? Harder to answer that is, yess! Well….. in 2010, the Article 29 Working Party (WP29) pointed out that ‘the lack of available sensible statistics hinders the assessment of whether the [data retention] directive has achieved its objectives.’ The WP29 went further, pointing out that there was a lack of harmonisation in national implementation of the DRD (2). This was the purpose of the DRD (harmonising data retention across the EU), and it didn’t even achieve what it set out to do.

What about its true purpose? You know, spying on every EU citizen? Well, the European Data Protection Supervisor (EDPS) responded to the Commission’s evaluation of the DRD. WARNING: EDPS pulls no punches. First, the EDPS reiterated that the DRD was based upon the assumption of necessity (para 38). Secondly, the EDPS criticised the Commission’s assertion that most Member States considered data retention a necessary tool when its conclusions were based on just over a third (that’s less than half, right?) of them (para 40). Thirdly, these conclusions were, in fact, only statements (para 41). Fourthly, the EDPS highlighted that there should be sufficient quantitative and qualitative information to assess whether the DRD was actually working and whether less privacy-intrusive measures could achieve the same result; such information should show the relationship between use and result (para 43).

Surprise, surprise, the EDPS didn’t find sufficient evidence to demonstrate the necessity of the DRD and said that further investigations into alternatives should commence (para 44). Fifthly, the EDPS pretty much savaged the quantitative and qualitative information available (para 45-52). A few years later, the CJEU asked for proof of the necessity of the DRD. There was a lack of statistical evidence from EU Member States, the Commission, the Council and the European Parliament, and despite that, they had the cheek to ask the CJEU to reject the complaints made by Digital Rights Ireland and others anyway (ibid). Only the Austrian government were able to provide statistical evidence on the use (not retention) of communications data, and that didn’t involve any cases of terrorism (ibid). The UK’s representatives admitted (come again? The UK admits something?) there was no ‘scientific data’ to underpin the need for data retention (ibid), so one question begs to be asked: wtaf had the DRD been based upon? Was it the assumption of necessity the EDPS referred to? Draw your own conclusions. The moral of the story is that the DRD did not operate smoothly.

Ruling against data retention was a surprise?

Speaight then moves on to the judgment that started it all, Joined Cases C‑293/12 and C‑594/12 Digital Rights Ireland, in which the CJEU invalidated the DRD across the EU. According to Speaight, this came as a ‘surprise.’

I felt a great disturbance in the Law, as if thousands of spies, police, other public authorities, politicians and lawyers suddenly cried out in terror, as the State were suddenly unable to spy anymore. I fear something terrible has happened.

So, who was surprised? Was it the European Parliament, who had initially opposed this form of data retention, urging that its use must be entirely exceptional, based on specific comprehensible law, authorised by judicial or other competent authorities for individual cases and consistent with the European Convention on Human Rights (ECHR)? Was it a surprise to them when they also noted that ‘a general data retention principle must be forbidden’ and that ‘any general obligation concerning data retention’ is contrary to the proportionality principle (Abu Bakar Munir and Siti Hajar Mohd Yasin, ‘Retention of communications data: A bumpy road ahead’ (2004) The John Marshall Journal of Computer & Information Law 22:4 731, 734; Clive Walker and Yaman Akdeniz, ‘Anti-Terrorism Laws and Data Retention: War is over?’ (2003) Northern Ireland Legal Quarterly 54:2 159, 167)?

Was it a surprise to Patrick Breyer who argued that data retention was incompatible with Articles 8 and 10 of the ECHR back in 2005 (372, 374, 375)? Was it a surprise to Mariuca Morariu who argued that the DRD had failed to demonstrate its necessity (Mariuca Morariu, ‘How Secure is to Remain Private? On the Controversies of the European Data Retention Directive’ Amsterdam Social Science 1:2 46, 54-9)? Was it a surprise to Privacy International (PI), the European Digital Rights Initiative (EDRi), 90 NGOs and 80 telecommunications service providers (9) who were against the DRD? Was it a surprise to the 40 civil liberties organisations who urged the European Parliament to vote against the retention of communications data?

Was it a surprise to the WP29, the European Data Protection Commissioners, the International Chamber of Commerce (ICC), European Internet Services Providers Association (EuroISPA), the US Internet Service Provider Association (USISPA), the All Party Internet Group (APIG) (Abu Bakar Munir and Siti Hajar Mohd Yasin, 746-749) and those at the G8 Tokyo Conference? Hell, even our own assistant Information Commissioner, Jonathan Bamford, back in 2001 wouldn’t be surprised because he said ‘Part 11 isn’t necessary, and if it is necessary it should be made clear why’ (HL Deb 27 Nov 2001 vol 629 cc183-290, 252). Was it a surprise when prior to Digital Rights Ireland:

Bulgaria’s Supreme Administrative Court, the Romanian, German Federal, Czech Republic Constitutional Courts and the Supreme Court of Cyprus all [declared] national implementation of the DRD either invalid or unconstitutional (in some or all regards) and incompatible with Article 8 ECHR?

Was Jules Winnfield surprised?

The point I’m trying to hammer home is that (you’ve guessed it) the CJEU’s ruling in Digital Rights Ireland should come as no surprise. Still on the issue of surprise: for Speaight, the judgment was surprising because it departed from decisions of the European Court of Human Rights (ECtHR) and of the CJEU itself. Ok, let’s look at these ECtHR cases Speaight refers to. The first is Weber and Saravia v Germany, a case on ‘strategic monitoring.’ This is a whole different kettle of fish when compared to the DRD, as it concerned the surveillance of 10% (I’m not saying this is cool either, btw) [30, 110] of German telecommunications, not the surveillance of ‘practically the entire European population’ [56]. Ok, that may have been an exaggeration by the CJEU, as there are only 28 (we’re not so sure about one, though) EU Member States, but the point is, the powers in question are not comparable. The DRD was confined to serious crime, without even defining it [61], whereas German law in Weber concerned six defined purposes for strategic monitoring [27], which could only be triggered through catchwords [32]. In Digital Rights Ireland, authorisation for access to communications data under the DRD was not dependent upon ‘prior review carried out by a court or by an independent administrative body’ [62], whereas in Weber it was [21, 25]. Apples and oranges.

The second ECtHR case was Kennedy v UK, and it’s funny that this case is brought up. The ECtHR in this case referred to a previous case, Liberty v UK, in which the virtually unfettered power of capturing external communications [64] violated Article 8 of the ECHR [70]. The ECtHR in Kennedy referred to this as an indiscriminate power [160, 162] (bit like data retention, huh?), and the UK only succeeded in Kennedy because the ECtHR were acting upon the assumption that interception warrants only related to one person [160, 162]. Of course, the ECtHR didn’t know that ‘person’ for the purposes of RIPA 2000 meant ‘any organisation and any association or combination of persons,’ so, you know, not literally one person.

And this was, of course, prior to Edward Snowden’s bombshell of surveillance revelations, which triggered further proceedings by Big Brother Watch. A couple of years ago, in Roman Zakharov v Russia, the ECtHR’s Grand Chamber (GC) ruled that surveillance measures that are ‘ordered haphazardly, irregularly or without due and proper consideration’ [267] violate Article 8 [305]. That is because the automatic storage of clearly irrelevant data would contravene Article 8 [255]. This coincides with Advocate General (AG) Saugmandsgaard Øe’s opinion that the ‘disadvantages of general data retention obligations arise from the fact that the vast majority of the data retained will relate to persons who will never be connected in any way with serious crime’ [252]. That’s a lot of irrelevant data if you ask me. Judge Pinto de Albuquerque, in his concurring opinion in Szabo and Vissy v Hungary, regards Zakharov as a rebuke of the ‘widespread, non-(reasonable) suspicion-based, “strategic surveillance” for the purposes of national security’ [35]. So, I’d say that even Weber and Saravia is put into doubt. And so, even if the CJEU rules that data retention in the national security context is outside its competence, there is enough ECtHR case law to bite the UK on its arse.

Probably the most important ECtHR case not mentioned by Speaight (why is that?) is S and Marper v UK: this is the data retention case. Although this concerned DNA data retention, the ECtHR’s concerns ‘have clear applications to the detailed information revealed about individuals’ private lives by communications data.’ What did the GC rule in S and Marper? Oh, was it that blanket indiscriminate data retention ‘even on a specific group of individuals (suspects and convicts) violated Article 8’? Yes, it was, and it was S and Marper to which the CJEU referred on three separate occasions in Digital Rights Ireland [47, 54-5]. Tele2 and Watson (where the CJEU reconfirmed that blanket indiscriminate data retention is prohibited under EU law) is just the next logical step with regards to communications data. And so, far from being surprising, the CJEU in Digital Rights Ireland and Tele2 and Watson was acting in a manner consistent with the case law of the ECtHR.

The CJEU case law that Speaight refers to is Ireland v Parliament and Council, which was a challenge to the DRD’s legal basis, not to its compatibility with the Charter of Fundamental Rights, so I’m not entirely sure what Speaight is trying to get at. All in all, Speaight hasn’t shown anything to demonstrate that Digital Rights Ireland departed from ECtHR or CJEU case law.

You forgot to say the UK extended data retention laws:

Speaight then rightly acknowledges how the UK government replaced the UK law implementing the DRD with the Data Retention and Investigatory Powers Act 2014 (DRIPA 2014) in lightspeed fashion. What Speaight omits, however, is that DRIPA 2014 extended retention obligations from telephone companies and Internet Service Providers (ISPs) to Over-The-Top (OTT) services such as Skype, Twitter, Google, Facebook etc. James Brokenshire MP attested that DRIPA 2014 was introduced to clarify what was always covered by the definition of telecommunications services (HC Deb 14 July 2014, vol 584, 786). This, of course, was total bullshit (5), but like I said, politicians goin’ politicate.

Claimants don’t ask questions, courts do:

Speaight moves on to the challenges to DRIPA 2014; we know the story already: the High Court (HC) said it was inconsistent with Digital Rights Ireland, whereas the CoA disagreed, blah, blah. Speaight points out that the claimants had no issue with data retention in principle, which is true, but so what? Speaight also points out that the CJEU went further than what the claimants asked by ruling that blanket indiscriminate data retention was not permissible under EU law. Wait, what the fark? It’s not the bloody claimants that ask the CJEU a question on the interpretation of EU law; I’m pretty sure it was the Swedish referring court (via Article 267 of the Treaty on the Functioning of the EU, you know, a preliminary reference) that asked the CJEU:

Is a general obligation to retain traffic data covering all persons, all means of electronic communication and all traffic data without any distinctions, limitations or exceptions for the purpose of combating crime (as described [below under points 1-6]) compatible with Article 15(1) of Directive 2002/58/EC, taking account of Articles 7, 8 and 15(1) of the Charter?

And the CJEU said no. End of discussion.

The ends don’t always justify the means and for clarity, the CJEU didn’t reject shit:

Speaight also says that the CJEU in Tele2 and Watson rejected AG Saugmandsgaard Øe’s advice that the French government had found access to communications data useful in its investigations into the terrorist attacks in 2015. Such a position, however, falls victim to several questions: under what circumstances was the data sought? Was it accessed as a consequence of the legal obligation to retain? Or was it already retained for business purposes? What were the results of the use of that data? Could the same results have been achieved using less intrusive means? Saying something is useful tells us nothing, as the ECtHR has plainly said that necessity (in a democratic society) is not as flexible as expressions such as ‘useful’ [48], and as the CJEU rightly noted, a measure in and of itself, even in the general interest, cannot justify general indiscriminate data retention [103]. This demonstrates that the CJEU didn’t reject anything; they didn’t even refer to the French government’s evidence. They just said that, as fundamental as fighting serious crime may be, it and the measures it employs cannot by themselves justify such a fundamental departure from the protection of human rights. Just because you can, doesn’t mean you should. A certain ECtHR said something similar in Klass v Germany: States ‘may not, in the name of the struggle against espionage and terrorism, adopt whatever measures they deem appropriate’ [49].

The CJEU doesn’t have to answer what it wasn’t asked:

Speaight then whines about the CJEU not addressing the issue of national security. Well, they weren’t asked about national security in Tele2 and Watson, were they? Like I said, even if the CJEU doesn’t have competence to rule on national-security-based data retention, Roman Zakharov is watching you from Strasbourg (he’s not actually in Strasbourg, I don’t think, but you dig).

What’s your problem with notification?

Speaight also bemoans the obligation to notify, saying this requirement could damage investigations and surveillance and went beyond what the claimants had asked. Well, again, the claimants weren’t asking the questions, ffs, and the CJEU made this point by referring to previous case law, notably Schrems [95]. The CJEU made very clear that notification should be done ‘as soon as that notification is no longer liable to jeopardise the investigations being undertaken by those authorities’ [121]. This is consistent with the ECtHR’s stance. Both courts are aware that notification can defeat the purpose of an investigation, and that sometimes, even after it has concluded, notification may still not be appropriate. But Speaight seems to omit this crucial detail.

Lawyers getting mad:

Speaight notes that criticism of Tele2 is not confined to Eurosceptics. Sure, but you don’t have to be a Europhile to defend it either. He also notes that it was roundly condemned by all the participants at a meeting of the Society of Conservative Lawyers. Well, no shit to my Sherlock, the name kinda gave it away. He also notes that the former Independent Reviewer of Terrorism Legislation, David Anderson QC, said it was the worst judgment he knew of. Wait till Anderson reads the ECtHR’s case law on this matter then, which if anything, on a proper reading, goes further than Tele2. Speaight also points out that Dominic Grieve QC MP was pissed, and that Francois-Henri Briard, a distinguished member of the French Bar, basically said we need more conservative judges to trample on fundamental rights. If a judgment that protects the fundamental rights of all EU citizens pisses off a few lawyers, so be it.

Conclusions:

I’ve spent way too much time on Speaight’s post, and the really sad thing is, I’ve enjoyed it. It’s hard to have a conversation about data retention when you first have to sift through a load of bollocks, and there was plenty of bollocks, just to make your point. And by the time you’ve cleared through all the falsities and misleading or exaggerated points, you run close to 4k words without actually saying what your own position is. So, my position for this blog post is: we should always shoot down rubbish when it shows its ugly face, or else it festers. Actually, the point is, I can believe that blanket indiscriminate data retention is unlawful.

A disturbing plan for control…

The Conservative Manifesto, unlike the Labour Manifesto, has some quite detailed proposals for digital policy – and in particular for the internet. Sadly, however, though there are a few bright spots, the major proposals are deeply disturbing and will send shivers down the spine of anyone interested in internet freedom.

Their idea of a ‘digital charter’ is safe, bland, motherhood-and-apple-pie stuff about safety and security online, with all the appropriate buzzwords of prosperity and growth. It seems a surprise, indeed, that they haven’t talked about having a ‘strong and stable internet’. They want Britain to be the best place to start and run a digital business, and to make Britain the safest place in the world to be online. Don’t we all?

When the detail comes in, some of it sounds very familiar to people who know what the law already says – and in particular what EU law already says: the eIDAS Regulation, the E-Commerce Directive and the Directive on Consumer Rights already say much of what the Tory Manifesto says. Then, moving on to data protection, it gets even more familiar:

“We will give people new rights to ensure they are in control of their own data, including the ability to require major social media platforms to delete information held about them at the age of 18, the ability to access and export personal data, and an expectation that personal data held should be stored in a secure way.”

This is all from the General Data Protection Regulation (GDPR), passed in 2016 and due to apply from May 2018. Effectively, the Tories are trying to take credit for a piece of EU law – or they’re committing (as they’ve almost done before) to keeping compliant with that law after we’ve left the EU. That will be problematic, given that our surveillance law may make compliance impossible, but that’s for another time…

“…we will institute an expert Data Use and Ethics Commission to advise regulators and parliament on the nature of data use and how best to prevent its abuse.”

This is quite interesting – though it is notable that the word ‘privacy’ is conspicuous by its absence. It is, perhaps, the only genuinely positive thing in the Tory manifesto as it relates to the internet.

“We will make sure that our public services, businesses, charities and individual users are protected from cyber risks.”

Of course you will. The Investigatory Powers Act, however, does the opposite, as does the continued rhetoric against encryption. The NHS cyber attack, it must be remembered, was performed using a tool developed by GCHQ’s partners at the NSA. If the Tories really want to protect public services, businesses, charities and individuals, they need to change tack on this completely, and start promoting and supporting good practice and good, secure technology. Instead, they again double down in the fight against encryption (and thus against security):

“….we do not believe that there should be a safe space for terrorists to communicate online and will work to prevent them from having this capability.”

…but as anyone with any understanding of technology knows, if you stop terrorists communicating safely, you stop all of us from communicating safely.

Next:

“…we also need to take steps to protect the reliability and objectivity of information that is essential to our democracy and a free and independent press.”

This presumably means some kind of measures against ‘fake news’. Most proposed measures elsewhere in the world are likely to amount to censorship – and given what else is in the manifesto (see below) I think that is the only reasonable conclusion here.

“We will ensure content creators are appropriately rewarded for the content they make available online.”

This looks as though it almost certainly means harsher and more intense copyright enforcement. That, again, is only to be expected.

Then, on internet safety, they say:

“…we must take steps to protect the vulnerable… …online rules should reflect those that govern our lives offline…”

Yes. We already do.

“We will put a responsibility on industry not to direct users – even unintentionally – to hate speech, pornography, or other sources of harm”

Note that this says ‘pornography’, not ‘illegal pornography’, and the ‘unintentionally’ part begins the more disturbing part of the manifesto. Intermediaries seem likely to be stripped of much of their ‘mere conduit’ protection – and to be required to monitor much more closely what happens through their systems. This, in general, has two effects: to encourage surveillance, and to encourage caution about content (effectively chilling speech). This needs to be watched very carefully indeed.

“…we will establish a regulatory framework in law to underpin our digital charter and to ensure that digital companies, social media platforms and content providers abide by these principles. We will introduce a sanctions regime to ensure compliance, giving regulators the ability to fine or prosecute those companies that fail in their legal duties, and to order the removal of content where it clearly breaches UK law.”

This is the most worrying part of the whole piece. Essentially it looks like a clampdown on social media – and, to all intents and purposes, the establishment of a full-scale internet censorship system (see the ‘fake news’ point above). Where the Tories are refusing to implement statutory regulation for the press (the abandonment of part 2 of Leveson is mentioned specifically in the manifesto, along with the repeal of Section 40 of the Crime and Courts Act 2013, which was one of the few bits of Leveson part 1 that was implemented), they look very much as though they want to impose it upon the online media. The Daily Mail will have more freedom than blogging platforms, Facebook and Twitter – and you can draw your own conclusions from that.

When this is all combined with the Investigatory Powers Act, it looks very much like a solid clampdown on internet freedom. Surveillance has been enabled – this will strengthen the second part of the authoritarian pincer movement, the censorship side. Privacy has been wounded, now it’s the turn of freedom of expression to be attacked. I can see how this will be attractive to some – and will go down very well indeed with both the proprietors and the readers of the Daily Mail – but anyone interested in internet freedom should be very much disturbed.

 

Investigatory Powers Bill – my written submission

As well as providing oral evidence to the Draft Investigatory Powers Bill Joint Committee (which I have written about here; the session can be watched here, and a transcript can be found here), I submitted written evidence on 15th December 2015.


The contents of the written submission are set out below. It is a lot more detailed than the oral evidence, and a long read (around 7,000 words), but even so, given the timescale involved, it is not as comprehensive as I would have liked – and I didn’t have as much time to proofread it as I would have liked. There are a number of areas I would have liked to have covered that I did not, but I hope it helps.

As it is published, the written evidence is becoming available on the IP Bill Committee website here – my own evidence is part of what has been published so far.


 

Submission to the Joint Committee on the draft Investigatory Powers Bill by Dr Paul Bernal

I am making this submission in my capacity as Lecturer in Information Technology, Intellectual Property and Media Law at the UEA Law School. I research in internet law and specialise in internet privacy from both a theoretical and a practical perspective. My PhD thesis, completed at the LSE, looked into the impact that deficiencies in data privacy can have on our individual autonomy, and set out a possible rights-based approach to internet privacy. My book, Internet Privacy Rights – Rights to Protect Autonomy, was published by Cambridge University Press in 2014. I am a member of the National Police Chiefs’ Council’s Independent Digital Ethics Panel. The draft Investigatory Powers Bill therefore lies precisely within my academic field.

I gave oral evidence to the Committee on 7th December 2015: this written evidence is intended to expand on and explain some of the evidence that I gave on that date. If any further explanation is required, I would be happy to provide it.


 

One page summary of the submission

The submission looks specifically at the nature of internet surveillance, as set out in the Bill, at its impact on broad areas of our lives – not just what is conventionally called ‘communications’ – and on a broad range of human rights – not just privacy but freedom of expression, of association and assembly, and protection from discrimination. It looks very specifically at the idea of ‘Internet Connection Records’, briefly at data definitions and at encryption, as well as looking at how the Bill might be ‘future-proofed’ more effectively.

The submission will suggest that in its current form, in terms of the overarching/thematic questions set out in the Committee’s Call for Written Evidence, it is hard to conclude that all of the powers sought are necessary, uncertain that they are legal, likely that many of them are neither workable nor carefully defined, and unclear whether they are sufficiently supervised. In some particular areas – Internet Connection Records is the example that I focus on in this submission – the supervision envisaged does not seem sufficient or appropriate. Moreover, there are critical issues – for example the vulnerability of gathered data – that are not addressed at all. These problems potentially leave the Bill open to successful legal challenge and rather than ‘future-proofing’ the Bill, they provide what might be described as hostages to fortune.

Many of the problems, in my opinion, could be avoided by taking a number of key steps. Firstly, rethinking (and possibly abandoning) the Internet Connection Records plans. Secondly, being more precise and open about the Bulk Powers, including a proper setting out of examples so that the Committee can make an appropriate judgment as to their proportionality and to reduce the likelihood of their being subject to legal challenge. Thirdly, taking a new look at encryption and being clear about the approach to end-to-end encryption. Fourthly, strengthening and broadening the scope of oversight. Fifthly, through the use of some form of renewal or sunset clauses to ensure that the powers are subject to full review and reflection on a regular basis.


1          Introductory remarks

1.1       Before dealing with the substance of the Bill, there is an overriding question that needs to be answered: why is the Committee being asked to follow such a tight timetable? This is a critically important piece of legislation – laws concerning surveillance and interception are not put forward often, particularly as they are long and complex and deal with highly technical issues. That makes detailed and careful scrutiny absolutely crucial. Andrew Parker of MI5 called for ‘mature debate’ on surveillance immediately prior to the introduction of the Bill: the timescale set out for the scrutiny of the Bill does not appear to give an adequate opportunity for that mature debate.

1.2       Moreover, it is equally important that the debate be an accurate one, engaged upon with understanding and clarity. In the few weeks since the Bill was introduced, the public debate has fallen far short of this. As shall be discussed below, for example, the analogies chosen for some of the powers envisaged in the Bill have been very misleading. In particular, to suggest that the proposed ‘Internet Connection Records’ (‘ICRs’) are like an ‘itemised phone bill’, as the Home Secretary described them, is wholly inappropriate. As I set out below (in section 5), the reality is very different. There are two possible interpretations for the use of such inappropriate analogies: either the people using them don’t understand the implications of the powers, which means more discussion is needed to disabuse them of their illusions, or they are intentionally oversimplifying and misleading, which raises even more concerns.

1.3       For this reason, the first and most important point that I believe the Committee should be making in relation to the scrutiny of the Bill is that more time is needed. As I set out in 8.4 below, the case for the urgency of the Bill, particularly in the light of the recent attacks in Paris, has not been made: in many ways the attacks in Paris should make Parliament pause and reflect more carefully about the best approach to investigatory powers in relation to terrorism.

1.4       In its current form, in terms of the overarching/thematic questions set out in the Committee’s Call for Written Evidence, it is hard to conclude that all of the powers sought are necessary, uncertain that they are legal, likely that many of them are neither workable nor carefully defined, and unclear whether they are sufficiently supervised. In some particular areas – Internet Connection Records is the example that I focus on in this submission – the supervision envisaged does not seem sufficient or appropriate. Moreover, there are critical issues – for example the vulnerability of gathered data – that are not addressed at all. These problems potentially leave the Bill open to successful legal challenge and rather than ‘future-proofing’ the Bill, they provide what might be described as hostages to fortune.

1.5       Many of the problems, in my opinion, could be avoided by taking a number of key steps. Firstly, rethinking (and possibly abandoning) the Internet Connection Records plans. Secondly, being more precise and open about the Bulk Powers, including a proper setting out of examples so that the Committee can make an appropriate judgment as to their proportionality and to reduce the likelihood of their being subject to legal challenge. Thirdly, taking a new look at encryption and being clear about the approach to end-to-end encryption. Fourthly, strengthening and broadening the scope of oversight. Fifthly, through the use of some form of renewal or sunset clauses to ensure that the powers are subject to full review and reflection on a regular basis.

2          The scope and nature of this submission

2.1       This submission deals specifically with the gathering, use and retention of communications data, and of Internet Connection Records in particular. It deals more closely with the internet rather than other forms of communication – this is my particular area of expertise, and it is becoming more and more important as a form of communications. The submission does not address areas such as Equipment Interference, and deals only briefly with other issues such as interception and oversight. Many of the issues identified with the gathering, use and retention of communications data, however, have a broader application to the approach adopted by the Bill.

2.2       It should be noted, in particular, that this submission does not suggest that it is unnecessary for either the security and intelligence services or law enforcement to have investigatory powers such as those contained in the draft Bill. Many of the powers in the draft Bill are clearly critical for both security and intelligence services and law enforcement to do their jobs. Rather, this submission suggests that as it is currently drafted the bill includes some powers that are poorly defined, poorly suited to the stated function, have more serious repercussions than seem to have been understood, and could represent a distraction, a waste of resources and add an unnecessary set of additional risks to an already risky environment for the very people that the security and intelligence services and law enforcement are charged with protecting.

3          The Internet, Internet Surveillance and Communications Data

3.1       The internet has changed the way that people communicate in many radical ways. More than that, however, it has changed the way people live their lives. This is perhaps the single most important thing to understand about the internet: we do not just use it for what we have traditionally thought of as ‘communications’, but in almost every aspect of our lives. We don’t just talk to our friends online, or just do our professional work online: we do almost everything online. We bank online. We shop online. We research online. We find relationships online. We listen to music and watch TV and movies online. We plan our holidays online. We try to find out about our health problems online. We look at our finances online. For most people in our modern society, it is hard to find a single aspect of our lives that does not have a significant online element.

3.2       This means that internet interception and surveillance has a far bigger potential impact than traditional communications interception and surveillance might have had. Intercepting internet communications is not the equivalent of tapping a telephone line or examining the outside of letters sent and received, primarily because we use the internet for far more than we ever used telephones or letters. This point cannot be overemphasised: the uses of the internet are growing all the time and show no signs of slowing down. Indeed, more dimensions of internet use are emerging all the time: the so-called ‘internet of things’ which integrates ‘real world’ items (from cars and fridges to Barbie dolls[1]) into the internet is just one example.

3.3       This is also one of the reasons that likening Internet Connection Records to an itemised phone bill is particularly misleading. Another equally important reason to challenge that metaphor is the nature and potential uses of the data itself. What is labelled Communications Data (and in particular ‘relevant communications data’, as set out in clause 71(9) of the draft Bill) is, by the nature of its digital form, ideal for analysis and profiling. Indeed, using this kind of data for profiling is at the heart of the business models of Google, Facebook and the entire internet advertising industry.

3.4       The inferences that can be – and are – drawn from this kind of data, through automated, algorithmic analysis rather than through informed, human scrutiny, are enormous and are central to the kind of ‘behavioural targeting’ that is the current mode of choice for internet advertisers. Academic studies have shown that very detailed inferences can be drawn: analysis of Facebook ‘Likes’, for example, has been used to indicate the most personal of data, including sexuality, intelligence and so forth. A recent study at Cambridge University concluded that ‘by mining Facebook Likes, the computer model was able to predict a person’s personality more accurately than most of their friends and family.’[2]
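To make concrete how little machinery this kind of profiling requires, the following is a minimal, hypothetical sketch in Python: a standard off-the-shelf classifier trained on nothing but binary like/don’t-like signals. All of the data, the correlation and the resulting accuracy are invented for illustration; only the general technique (a plain logistic regression over a user-by-page matrix) reflects the kind of approach used in the studies cited above.

```python
# Hypothetical sketch: predicting a personal trait from binary 'Likes'.
# All data here is invented; the technique is a plain logistic regression
# over a user-by-page matrix of like/don't-like signals.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users, n_pages = 1000, 50

# 1 = user liked the page, 0 = they did not.
likes = rng.integers(0, 2, size=(n_users, n_pages))

# Invented ground truth: the trait loosely correlates with liking pages 0-4.
trait = (likes[:, :5].sum(axis=1) + rng.normal(0, 1, n_users) > 2.5).astype(int)

# Train on 800 users, test on the remaining 200.
model = LogisticRegression(max_iter=1000).fit(likes[:800], trait[:800])
print("held-out accuracy:", model.score(likes[800:], trait[800:]))
```

Nothing about any individual user is ‘read’ at any point: the prediction emerges purely from patterns across the retained data, which is exactly why the gathering stage, and not only human examination, matters.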

3.5       This means that the kind of ‘communications’ data discussed in the Bill is vastly more significant than what is traditionally considered to be communications. It also means that, from a human rights perspective, more rights are engaged by its gathering, holding and use. Internet ‘communications’ data does not just engage Article 8 in its ‘correspondence’ aspect, but in its ‘private and family life’ aspect. It engages Article 10 – the impact of internet surveillance on freedom of speech has become a bigger and bigger issue in recent years, as noted in depth by the UN Special Rapporteur on Freedom of Expression, most recently in his report on encryption and anonymity.[3]

3.6       Article 11, which governs Freedom of Association and Assembly, is also critically engaged: not only do people now associate and assemble online, but they use online tools to organise and coordinate ‘real world’ association and assembly. Indeed, using surveillance to chill association and assembly has become one of the key tools of the more authoritarian governments to stifle dissent. Monitoring and even shutting off access to social media systems, for example, was used by many of the repressive regimes in the Arab Spring. Even in the UK, the government communications plan for 2013/14 included the monitoring of social media in order to ‘head off badger cull protests’, as the BBC reported.[4] This kind of monitoring does not necessarily engage Article 8, as Tweets (the most obvious example to monitor) are public, but it would engage both aspects of Article 11, and indeed of Article 10.

3.7       Article 14, the prohibition of discrimination, is also engaged: the kind of profiling discussed in paragraph 3.4 above can be used to attempt to determine a person’s race, gender, possible disability, religion, political views, even direct information like membership of a trade union. It should be noted, as is the case for all these profiling systems, that accuracy is far from guaranteed, giving rise to a bigger range of risks. Where derived or profiling data is accurate, it can involve invasions of privacy, chilling of speech and discrimination: where it is inaccurate it can generate injustice, inappropriate decisions and further chills and discrimination.

3.8       This broad range of human rights engaged means that the ‘proportionality bar’ for any gathering of this data, interception and so forth is higher than it would be if only the correspondence aspect of Article 8 were engaged. It is important to understand that the underlying reason for this is that privacy is not an individual, ‘selfish’, right, but one that underpins the way that our communities function. We need privacy to communicate, to express ourselves, to associate with those we choose, to assemble when and where we wish – indeed to do all those things that humans, as social creatures, need to do. Privacy is a collective right that needs to be considered in those terms.

3.9       It is also critical to note that communications data is not ‘less’ intrusive than content: it is ‘differently’ intrusive. In some ways, as has been historically evident, it is less intrusive – which is why historically it has been granted lower levels of protection – but increasingly the intrusion possible through the gathering of communications data is in other ways greater than that possible through examination of content. There are a number of connected reasons for this. Firstly, it is more suitable for aggregation and analysis – communications data is in a structured form, and the volumes gathered make it possible to use ‘big data’ analysis, as noted above. Secondly, content can be disguised more easily – either by technical encryption or by using ‘coded’ language. Thirdly, there are many kinds of subjects that are often avoided deliberately when writing content – things like sexuality, health and religion – that can be determined by analysis of communications data. That means that the intrusive nature of communications data can often be greater than that of content. Moreover, as the levels and nature of data gathered grow, the possible intrusions are themselves growing. This means that the idea that communications data needs a lower level of control, and less scrutiny, than content data is not really appropriate – and in the future will become even less appropriate.
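As a toy illustration of the first of these reasons (suitability for aggregation and analysis), consider how trivially a handful of retained metadata records can be aggregated without any examination of content. The records and field names below are invented for illustration only:

```python
# Hypothetical sketch: aggregating retained communications metadata.
# Records and field names are invented; no message content is involved.
from collections import Counter
from datetime import datetime

records = [
    {"sender": "alice", "recipient": "bob",   "when": datetime(2015, 12, 1, 2, 55)},
    {"sender": "alice", "recipient": "bob",   "when": datetime(2015, 12, 1, 3, 10)},
    {"sender": "alice", "recipient": "carol", "when": datetime(2015, 12, 2, 14, 0)},
    {"sender": "alice", "recipient": "bob",   "when": datetime(2015, 12, 3, 3, 5)},
]

# Who does alice contact, and how often?
contacts = Counter(r["recipient"] for r in records if r["sender"] == "alice")

# Repeated small-hours activity: the sort of pattern profilers treat as a signal.
late_night = [r for r in records if r["when"].hour < 5]

print(contacts)                        # Counter({'bob': 3, 'carol': 1})
print(len(late_night), "late-night communications")
```

Encrypted or carefully worded content would make no difference to this analysis: the metadata itself carries the intrusion.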

4          When rights are engaged

4.1       A key issue in relation to the gathering and retention of communications data is when the relevant rights are engaged: is it when data is gathered and retained, when it is subject to algorithmic analysis or automated filtering, or when it is subject to human examination? When looked at from what might be viewed as an ‘old-fashioned’ communications perspective, it is only when humans examine the data that ‘surveillance’ occurs and privacy is engaged. In relation to internet communications data, this is to fundamentally miss the nature of the data and the nature of the risks. In practice, many of the most important risks occur at the gathering stage, and more at what might loosely be described as the ‘automated analysis’ stage.

4.2       It is fundamental to the nature of data that when it is gathered it becomes vulnerable. This vulnerability has a number of angles. There is vulnerability to loss – from human error to human malice, from insiders and whistle-blowers to hackers of various forms. The recent hacks of Talk Talk and Ashley Madison in particular should have focussed the minds of anyone envisaging asking communications providers to hold more and more sensitive data. There is vulnerability to what is variously called ‘function creep’ or ‘mission creep’: data gathered for one reason may end up being used for another reason. Indeed, where the business models of companies such as Facebook and Google are concerned, this is one of the key features: they gather data with the knowledge that this data is useful and that the uses will develop and grow with time.

4.3       It is also at the gathering stage that the chilling effects come in. The Panopticon, devised by Bentham and further theorised about by Foucault, was intended to work by encouraging ‘good’ behaviour in prisoners through the possibility of their being observed, not by the actual observation. Similarly it is the knowledge that data is being gathered that chills freedom of expression, freedom of association and assembly and so forth, not the specific human examination of that data. This is not only a theoretical analysis but one borne out in practice, which is one of the reasons that the UN Special Rapporteur on Freedom of Expression and many others have made the link between privacy and freedom of expression.[5]

4.4       Further vulnerabilities arise at the automated analysis stage: decisions are made by the algorithms, particularly in regard to filtering based on automated profiling. In the business context, services are tailored to individuals automatically based on this kind of filtering – Google, for example, has been providing automatically and personally tailored search results to all individuals since 2009, without the involvement of humans at any stage. Whether the security and intelligence services or law enforcement use this kind of method is not clear, but it would be rational for them to do so: this does mean, however, that more risks are involved and that more controls and oversight are needed at this level, as well as at the point that human examination takes place.

4.5       Different kinds of risks arise at each stage. It is not necessarily true that the risks are greater at the final, human examination stage. They are qualitatively different, engage different rights and involve different issues. If anything, however, it is likely that as technology advances the risks at the earlier stages – the gathering and then the automated analysis stages – will become more important than the human examination stage. It is critical, therefore, that the Bill ensures that appropriate oversight and controls are put in place at these earlier stages. At present, this does not appear to be the case. Indeed, the essence of the data retention provisions appears to be that no real risk is considered to arise from the ‘mere’ retention of data. That is to fundamentally misunderstand the impact of the gathering of internet communications data.

5          Internet Connection Records

5.1       Internet Connection Records (‘ICRs’) have been described as the only really new power in the Bill, and yet they are deeply problematic in a number of ways. The first is the question of definition. The ‘Context’ section of the Guide to Powers and Safeguards (the Guide) in the introduction to the Bill says that:

“The draft Bill will make provision for the retention of internet connection records (ICRs) in order for law enforcement to identify the communications service to which a device has connected. This will restore capabilities that have been lost as a result of changes in the way people communicate.” (paragraph 3)

This is further explained in paragraphs 44 and 45 of the Guide as follows:

“44. A kind of communications data, an ICR is a record of the internet services a specific device has connected to, such as a website or instant messaging application. It is captured by the company providing access to the internet. Where available, this data may be acquired from CSPs by law enforcement and the security and intelligence agencies.

45. An ICR is not a person’s full internet browsing history. It is a record of the services that they have connected to, which can provide vital investigative leads. It would not reveal every web page that they visit or anything that they do on that web page.”

Various briefings to the press have suggested that in the context of web browsing this would mean that the URL up to the first slash would be gathered (e.g. www.bbc.co.uk, and not any further, e.g. http://www.bbc.co.uk/sport/live/football/34706510). On this basis it seems reasonable to assume that in relation to app-based access to the internet via smartphones or tablets the ICR would include the activation of the app, but nothing further.
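A small sketch may help to pin down what the press briefings’ ‘URL up to the first slash’ description would actually retain. This is my own illustration of that description, not a definition taken from the Bill:

```python
# Sketch of the 'URL up to the first slash' reading of an ICR, as described
# in press briefings; an illustration only, not a definition from the Bill.
from urllib.parse import urlsplit

def icr_host(url: str) -> str:
    """Keep only the service connected to; drop the path, query and fragment."""
    return urlsplit(url).netloc

full = "http://www.bbc.co.uk/sport/live/football/34706510"
print(icr_host(full))  # www.bbc.co.uk -- the specific page is not recorded
```

Even this host-only record, as the following paragraphs argue, is simultaneously too revealing (the host alone can be deeply sensitive) and too uninformative (it says nothing about what was actually done on the service).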

5.2       The ‘definition’ of ICRs in the bill is set out in 47(6) as follows:

“In this section “internet connection record” means data which—

(a) may be used to identify a telecommunications service to which a communication is transmitted through a telecommunication system for the purpose of obtaining access to, or running, a computer file or computer program, and

(b) is generated or processed by a telecommunications operator in the process of supplying the telecommunications service to the sender of the communication (whether or not a person).”

This definition is vague, and press briefings have suggested that the details would be in some ways negotiated directly with the communications services. This does not seem satisfactory at all, particularly for something considered to be such a major part of the Bill: indeed, the only really new power according to the Guide. More precision should be provided within the Bill itself – and specific examples spelled out in Codes of Practice that accompany the Bill, covering the major categories of communications envisaged. Initial versions of these Codes of Practice should be available to Parliament at the same time as the Bill makes its passage through the Houses.

5.3       The Bill describes the functions to which ICRs may be put. In 47(4) it is set out that ICRs (and data obtained through the processing of ICRs) can only be used to identify:

“(a) which person or apparatus is using an internet service where—

(i) the service and time of use are already known, but

(ii) the identity of the person or apparatus using the service is not known,

(b) which internet communications service is being used, and when and how it is being used, by a person or apparatus whose identity is already known, or

(c) where or when a person or apparatus whose identity is already known is obtaining access to, or running, a computer file or computer program which wholly or mainly involves making available, or acquiring, material whose possession is a crime.”

The problem is that in all three cases ICRs, insofar as they are currently defined, are very poorly suited to performing any of these three functions – and better methods either already exist or could be devised. ICRs provide at the same time much more information (and more intrusion) than is necessary and less information than is adequate to perform the function. In part this is because of the way that the internet is used, and in part because of the way that ICRs are set out. Examples in the following paragraphs can illustrate some (but not all) of the problems.

5.4       The intrusion issue arises from the nature of internet use, as described in Section 3 of this submission. ICRs cannot be accurately likened to ‘itemised telephone bills’. They do not record the details of who a person is communicating with (as an itemised telephone bill would), but they do include vastly more information, and more sensitive and personal information, than an itemised telephone bill could possibly contain. A record of websites visited, even at the basic level, can reveal some of the most intimate information about an individual – and not in terms of what might traditionally be called ‘communications’. This intrusion could be direct – such as accessing a website such as www.samaritans.org at 3am or accessing information services about HIV – or could come from profiling possibilities. The commercial profilers, using what is often described as ‘big data’ analysis (explained briefly in section 3 above), are able to draw inferences from very few pieces of information. Tastes, politics, sexuality and so forth can be inferred from this data, with a relatively good chance of success.
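To illustrate how direct that inference can be, the following sketch flags ‘sensitive’ visits from host-only, ICR-style records. The domains chosen, the category labels and the records themselves are all invented for illustration:

```python
# Hypothetical sketch: inferring sensitive interests from host-only records.
# Domains, category labels and records are all invented for illustration.
SENSITIVE_HOSTS = {
    "www.samaritans.org": "possible mental health crisis",
    "www.example-hiv-info.org": "HIV information seeking",
}

icr_log = [
    ("2015-12-01 03:02", "www.samaritans.org"),
    ("2015-12-01 09:15", "www.bbc.co.uk"),
    ("2015-12-02 22:40", "www.example-hiv-info.org"),
]

for timestamp, host in icr_log:
    if host in SENSITIVE_HOSTS:
        print(f"{timestamp}: {SENSITIVE_HOSTS[host]}")
```

Nothing beyond the connection record is needed: the host and the time of access alone support the inference.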

5.5       This makes ICRs ideal for profiling and potentially subject to function creep/mission creep. It also makes them ideally suited for crimes such as identity theft and personalised scamming, and the databases of ICRs created by communications service providers a perfect target for hackers and malicious insiders. By gathering ICRs, a new range of vulnerabilities is created. Data, however it is held and whoever it is held by, is vulnerable in a wide range of ways.[6] Recent events have highlighted this very directly: the hacking of Talk Talk, precisely the sort of provider who would be expected to gather and store ICRs, should be taken very seriously. Currently it appears as though this hack was carried out not by the kind of ‘cyber-terrorists’ that were originally suggested, but by disparate teenagers around the UK. Databases of ICRs would seem highly likely to attract the interest of hackers of many different kinds. In practice, too, precisely those organisations who should have the greatest expertise and the greatest motivation to keep data secure – from the MOD, HMRC and the US DoD to Swiss banks and technology companies including Sony and Apple – have all proved vulnerable to hacking or other forms of data loss in recent years. Hacking is the most dramatic, but human error, human malice, collusion and corruption, and commercial pressures (both to reduce costs and to ‘monetise’ data) may be more significant – and the ways that all these vulnerabilities can combine makes the risk even more significant.

5.6       ICRs are also unlikely to provide the information that law enforcement and the intelligence and security services need in order to perform the three functions noted above. The first example of this is Facebook. Facebook messages and more open communications would seem on the surface to be exactly the kind of information that law enforcement might need to locate missing children – the kind of example referred to in the introduction and guide to the bill. ICRs, however, would give almost no relevant information in respect of Facebook. In practice, Facebook is used in many different ways by many different people – but the general approach is to remain connected to Facebook all the time. Often this will literally be 24 hours a day, as devices are rarely turned off at night – the ‘connection’ event has little relationship to the use of the service. If Facebook is accessed by smartphone or tablet, it will generally be via an app that runs in the background at all times – this is crucial for the user to be able to receive notifications of events, of messages, of all kinds of things. If Facebook is accessed by PC, it may be by an app (with the same issues) or through the web – but if via the web this will often be using ‘tabbed browsing’ with one tab on the browser keeping the connection to Facebook available without the need to reconnect.
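A short sketch may help to show how little an ICR-style ‘connection event’ says about a service used in this always-on way. The record format below is hypothetical – the draft Bill does not specify one – but the point holds for any format built around connection events.

    from datetime import datetime

    # Hypothetical ICR-style record for an always-on Facebook app:
    # a single 'connection event' covering an entire month.
    icr_record = {
        "device": "subscriber-device-1234",   # illustrative identifier only
        "service": "facebook.com",
        "session_start": datetime(2015, 11, 1, 0, 0),
        "session_end": datetime(2015, 11, 30, 23, 59),
    }

    duration = icr_record["session_end"] - icr_record["session_start"]
    print("One 'connection event', {} days long.".format(duration.days))

    # The record cannot say when, how often, or even whether the user actually
    # read or sent anything: the connection event is decoupled from use.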

5.7       Facebook and others encourage and support this kind of long-term and even permanent connection to their services – it supports their business model and in a legal sense gives them some kind of consent to the kind of tracking and information gathering about their users that is the key to their success. ICRs would not help in relation to Facebook except in very, very rare circumstances. Further, most information remains available on Facebook in other ways. Much of it is public and searchable anyway. Facebook does not delete information except in extraordinary circumstances – the requirement for communications providers to maintain ICRs would add nothing to what Facebook retains.

5.8       The story is similar in relation to Twitter and similar services. A 24/7 connection is possible and indeed encouraged. Tweets are ‘public’ and available at all times, as well as being searchable and subject to possible data mining. Again, ICRs would add nothing to the ways that law enforcement and the intelligence and security services could use Twitter data. Almost all the current and developing communications services – from WhatsApp and SnapChat to Pinterest and more – have similar approaches and ICRs would be similarly unhelpful.

5.9       Further, the information gathered through ICRs would fail to capture a significant amount of the ‘communications’ that can and do happen on the internet – because the interactive nature of the internet now means that almost any form of website can be used for communication without that communication being the primary purpose of the website. Detailed conversations, for example, can and do happen in the comments sections of newspaper websites: if an analysis of ICRs showed access to www.telegraph.co.uk, would the immediate thought be that communications were going on? Similarly, coded (rather than encrypted) messages can be put in product reviews on www.amazon.co.uk. I have had detailed political conversations on the message-boards of the ‘Internet Movie Database’ (www.imdb.com), but an ICR would neither reveal nor suggest the possibility of this.
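The reason is structural: an ICR, as described in the guide to the Bill, records the service connected to rather than the full address visited. The minimal Python sketch below, using an invented URL, shows how reducing a visit to its host strips out precisely the part that would indicate a communication.

    from urllib.parse import urlparse

    # A hypothetical full visit: a comment thread on a newspaper article.
    full_visit = "http://www.telegraph.co.uk/news/politics/an-article#comment-4521"

    # An ICR-level entry keeps only the service connected to.
    icr_entry = urlparse(full_visit).netloc
    print(icr_entry)   # 'www.telegraph.co.uk' - a newspaper visit, nothing more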

5.10     This means that neither the innocent missing child can be found by ICRs via Facebook or its equivalents, nor the even slightly careful criminal or terrorist located or tracked. Not enough information is revealed to find either – whilst extra information is gathered that adds to intrusion and vulnerability. The third function stated for ICRs refers to people whose identity is already known. For these people, ICRs provide insufficient information to help. This is one of the examples where more targeted powers would help – and such powers are already envisaged elsewhere in the Bill.

5.11     The conclusion from all of this is that ICRs are not likely to be a useful tool for the functions presented. The closest equivalent form of surveillance used anywhere in the world has been in Denmark, with very poor results. In their evaluation of five years’ experience, the Danish Justice Ministry concluded that ‘session logging’, their equivalent of Internet Connection Records, had been of almost no use to the police.[7] It should be noted that when the Danish ‘session logging’ suggestion was first made, the Danish ISPs repeatedly warned that the system would not work and that the data would be of little use. Their warnings were not heeded. Similar warnings from ISPs in the UK have already begun to emerge. The argument has been made that the Danish failure was a result of the specific technical implementation – I would urge the Committee to examine that claim in depth before coming to a conclusion. However, the fundamental issues noted above are only likely to grow as the technology becomes more complex, the data more dense and interlinked, and the use of it more nuanced. All these trends are likely only to accelerate.

5.12     The gathering and holding of ICRs are also likely to add vulnerabilities for all those about whom they are collected, as well as requiring massive amounts of data storage at considerable cost. At a time when resources are naturally very tight, for money, expertise and focus to be devoted to something like this appears inappropriate.

 

6          Other brief observations about communications data, definitions and encryption

6.1       There is still confusion between ‘content’ and ‘communications’ data. The references to ‘meaning’ in 82(4), 82(8), 106(8) and 136(4), and emphasised in 193(6), seem to add to rather than reduce that confusion – particularly when considered in relation to the kinds of profiling possible from the analysis of basic communications data. It is possible to derive ‘meaning’ from almost any data – this is one of the fundamental problems with the idea that content and communications data can be simply and meaningfully separated. In practice, this is far from the case.[8] Further, Internet Connection Records are just one of many examples of ‘communications’ data that can be used to derive deeply personal information – and sometimes more directly (through analysis) than often confusing and coded (rather than encrypted) content.

6.2       There are other issues with the definitions of data – experts have been attempting to analyse them in detail in the short time since the Bill was published, and the fact that these experts have been unable to agree on, or at times even ascertain, the meaning of some of the definitions is something that should be taken seriously. Again it emphasises the importance of having sufficient time to scrutinise the Bill. Graham Smith of Bird & Bird, in his submission to the Commons Science and Technology Committee,[9] notes that the terms ‘internet service’ and ‘internet communications service’ used in 47(4) are neither defined nor differentiated, and points to a number of other areas in which there appears to be significant doubt as to what does and does not count as ‘relevant communications data’ for retention purposes. One definition in the Bill particularly stands out: in 195(1) it is stated that ‘“data” includes any information which is not data’. Quite what is intended by this definition remains unclear.

6.3       In his report, ‘A Question of Trust’, David Anderson QC called for a law that would be ‘comprehensive and comprehensible’: the problems surrounding definitions and the lack of clarity about the separation of content and communications data mean that the Bill, as drafted, does not yet meet either of these targets. There are other issues that make this failure even more apparent. The lack of clarity over encryption – effectively leaving the coverage of encryption to RIPA rather than drafting new terms – has already caused a significant reaction in the internet industry. Whether or not the law would allow end-to-end encryption services such as Apple’s iMessage to continue in their current form, where Apple would not be able to decrypt messages themselves, needs to be spelled out clearly, directly and comprehensibly. In the current draft of the Bill it is not.

6.4       This could be solved relatively simply by the modification of 189 ‘Maintenance of technical capability’, and in particular 189(4)(c), to make it clear that the Secretary of State cannot impose an obligation to remove electronic protection that is a basic part of the service operated, and that the Bill does not require telecommunications services to be designed in such a way as to allow for the removal of electronic protection.
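The technical point underneath this recommendation can be illustrated directly. In a genuinely end-to-end encrypted system the private keys exist only on the users’ devices, so there is no protection the provider could sensibly be ordered to remove. The following minimal sketch uses the PyNaCl library and is purely illustrative – iMessage’s actual protocol is considerably more elaborate – but it shows why a relay that never holds the private keys has nothing to hand over.

    from nacl.public import Box, PrivateKey

    alice_key = PrivateKey.generate()   # generated and kept on Alice's device
    bob_key = PrivateKey.generate()     # generated and kept on Bob's device

    # Alice encrypts to Bob's public key; only Bob's private key can decrypt.
    ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"see you at 3pm")

    # The service provider relays only the ciphertext. Without bob_key, which
    # never leaves Bob's device, it cannot comply with an order to decrypt.
    plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
    print(plaintext)   # b'see you at 3pm'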

7          Future Proofing the Bill

7.1       One of the most important things for the Committee to consider is how well shaped the Bill is for future developments, and how the Bill might be protected from potential legal challenges. At present, there are a number of barriers to this, but there are ways forward that could provide this kind of protection.

7.2       The first of these relates to ICRs, as noted in section 5 above. The idea behind the gathering of ICRs appears on the face of it to be based upon an already out-dated understanding both of the technology of the internet and of the way that people use it. In its current form, the idea of requiring communications providers to retain ICRs is also a hostage to fortune. The kind of data required is likely to become more complex, of vastly greater volume and increasingly difficult to use. What is already an unconvincing case will become even less convincing as time passes. The best approach would seem to be to abandon the idea of requiring the collection of ICRs entirely and to look for a different way forward.

7.3       Further, ICRs represent one of the two main ways in which the Bill appears to be vulnerable to legal challenge. It is important to understand that in recent cases at both the CJEU (in particular the Digital Rights Ireland case[10] and the Schrems case[11]) and the European Court of Human Rights (in particular the Zakharov case[12]), it is not just the examination of data that is considered to bring Article 8 privacy rights into play, but the gathering and holding of data. This is not a perverse trend, but rather a demonstration that the European courts are recognising some of the issues discussed above about the potential intrusion of gathering and holding data. It is a trend that is likely to continue. Holding the data of innocent people on an indiscriminate basis is likely to be considered disproportionate. That means that the idea of ICRs – where this kind of data would be required to be held – is very likely to be challenged in either of these courts, and indeed is likely to be overturned at some point.

7.4       The same is likely to be true of the ‘Bulk’ powers, unless those bulk powers are more tightly and clearly defined, including the giving of examples. At the moment quite what these bulk powers consist of – and how ‘bulky’ they are – is largely a matter of speculation, and while that speculation continues, so does legal uncertainty. If the powers involve the gathering and holding of the data of innocent people on a significant scale, a legal challenge either now or in the future seems to be highly likely.

7.5       It is hard to predict future developments either in communications technology or in the way that people use it. This, too, is something that seems certain to continue – and it means that being prepared for those changes needs to be built into the Bill. At present, this is done at least in part by having relatively broad definitions in a number of places, to try to ensure that future technological changes can be ‘covered’ by the law. This approach has a number of weaknesses – most notably that it gives less certainty than is helpful, and that it makes ‘function creep’ or ‘mission creep’ more of a possibility. Nonetheless, it is probably inevitable to a degree. It can, however, be ameliorated in a number of ways.

7.6       The first of these ways is to have a regular review process built in. This could take the form of a ‘sunset clause’, or perhaps a ‘renewal clause’ that requires a new, full, debate by Parliament on a regular basis. The precise form of this could be determined by the drafters of the Bill, but the intention should be clear: to avoid the situation that we find ourselves in today with the complex and almost incomprehensible regime so actively criticised by David Anderson QC, RUSI and to an extent the ISC in their reviews.

7.7       Accompanying this, it is important to consider not only the changes in technology, but the changes in people’s behaviour. One way to do this would be to charge those responsible for the oversight of communications with a specific remit to review how the powers are being used in relation to the current and developing uses of the internet. They should report on this aspect specifically.

8          Overall conclusions

8.1       I have outlined above a number of ways in which the Bill, in its current form, does not seem to be workable, proportionate, future-proofed or protected from potential legal challenge. I have made five specific recommendations:

8.1.1    I do not believe the case has been made for retaining ICRs. They appear unlikely to be of any real use to law enforcement in performing the functions that are set out, they add a significant range of risks and vulnerabilities, and they are likely to end up being extremely expensive. This expense is likely to fall either upon the government – in which case it would be a waste of resources that could be put to more productive use in achieving the aims of the Bill – or upon ordinary internet users through increased connection costs.

8.1.2    The Bill needs to be more precise and open about the Bulk Powers, including a proper setting out of examples so that the Committee can make an appropriate judgment as to their proportionality and to reduce the likelihood of their being subject to legal challenge.

8.1.3    The Bill needs to be more precise about encryption and to be clear about the approach to end-to-end encryption. This is critical to building trust in the industry, and in particular with overseas companies such as those in Silicon Valley. It is also a way to future-proof the Bill: though some within the security and intelligence services may not like it, strong encryption is fundamental to the internet now and will become even more significant in the future. This should be embraced rather than fought against.

8.1.4    Oversight needs strengthening and broadening – including oversight of how the powers have been used in relation to changes in behaviour as well as changes in technology.

8.1.5    The use of some form of renewal or sunset clause should be considered, to ensure that the powers are subject to full review and reflection by Parliament on a regular basis.

8.2       The question of resource allocation is a critical one. For example, have alternatives to the idea of retaining ICRs been properly considered for both effectiveness and cost? The level of intrusion of internet surveillance (as discussed in section 3 above) adds to the imperative to consider other options. Where a practice is so intrusive, and impacts upon such a wide range of human rights (Articles 8, 10, 11 and 14 of the ECHR – and possibly Article 6), a very high bar has to be set to make it acceptable. It is not at all clear either that the height of that bar has been appropriately set or that the benefits of the Bill mean that it has been met. In particular, the likely ineffectiveness of ICRs means that it is very hard to argue that this part of the Bill would meet even a far lower requirement. The risks and vulnerabilities that retention of ICRs adds will in all probability exceed the possible benefits, even without considering the intrusiveness of their collection, retention and use.

8.3       The most important overall conclusion at this stage, however, is that more debate and analysis is needed. The time made available for analysis is too short for any kind of certainty, and that means that the debate is being held without sufficient information or understanding. Time is also needed to enable MPs and Lords to gain a better understanding of how the internet works, how people use it in practice, and how this law and the surveillance envisaged under its auspices could impact upon that use. This is not a criticism of MPs or Lords so much as a recognition that people in general do not have that much understanding of how the internet works – one of the best things about the internet is that we can use it quickly and easily without having to understand much of what is actually happening ‘underneath the bonnet’ as it were. In passing laws with significant effects – and the Investigatory Powers Bill is a very significant Bill – much more understanding is needed.

8.4       It is important for the Committee not to be persuaded that an event like the recent one in Paris should be considered a reason to ‘fast-track’ the Bill, or to extend the powers provided by the Bill. In Paris, as in all the notable terrorism cases in recent years, from the murder of Lee Rigby and the Boston Bombings to the Sydney Café Siege and the Charlie Hebdo shootings, the perpetrators (or at the very least a significant number of the perpetrators) were already known to the authorities. The problem was not a lack of data or a lack of intelligence, but the use of that data and that intelligence. The issue of resources noted above applies very directly here: if more resources had been applied to ‘conventional’ intelligence it seems, on the surface at least, as though there would have been more chance of the events being avoided. Indeed, examples like Paris, if anything, argue against extending large-scale surveillance powers. If the data being gathered is already too great for it to be properly followed up, why would gathering more data help?

8.5       As a consequence of this, in my opinion the Committee should look not just at the detailed powers outlined in the Bill and their justification, but also more directly at the alternatives to the overall approach of the Bill. There are significant costs and consequences, and the benefits of the approach as opposed to a different, more human-led approach, have not, at least in public, been proven. The question should be asked – and sufficient evidence provided to convince not just the Committee but the public and the critics in academia and elsewhere. David Anderson QC made ‘A Question of Trust’ the title of his review for a reason: gaining the trust of the public is a critical element here.

 

 

 

Dr Paul Bernal

Lecturer in Information Technology, Intellectual Property and Media Law

UEA Law School

University of East Anglia

Norwich NR4 7TJ

Email: paul.bernal@uea.ac.uk


 

[1] The new ‘Hello Barbie’ doll, through which a Barbie Doll can converse and communicate with a child, has caused some controversy recently (see for example http://www.theguardian.com/technology/2015/nov/26/hackers-can-hijack-wi-fi-hello-barbie-to-spy-on-your-children), but it is only one example of a growing trend.

[2] See http://www.cam.ac.uk/research/news/computers-using-digital-footprints-are-better-judges-of-personality-than-friends-and-family#sthash.OSQ8dqdr.dpuf

[3] Available online at http://www.ohchr.org/EN/Issues/FreedomOpinion/Pages/CallForSubmission.aspx

[4] http://www.bbc.co.uk/news/uk-politics-22984367

[5] See for example the 2015 report of the UN Special Rapporteur on Freedom of Expression, where amongst other things he makes particular reference to encryption and anonymity. http://daccess-dds-ny.un.org/doc/UNDOC/GEN/G15/095/85/PDF/G1509585.pdf?OpenElement

[6] Some of the potential vulnerabilities are discussed in Chapter 6 of my book Internet Privacy Rights – Rights to Protect Autonomy, Cambridge University Press, 2014.

[7] See http://www.ft.dk/samling/20121/almdel/reu/bilag/125/1200765.pdf – in Danish

[8] This has been a major discussion point amongst legal academics for a long time. See for example the work of Daniel Solove, e.g. Reconstructing Electronic Surveillance Law, Geo. Wash. L. Review, vol 72, 2003-2004

[9] Published on the Committee website at http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/science-and-technology-committee/investigatory-powers-bill-technology-issues/written/25119.pdf

[10] Joined Cases C‑293/12 and C‑594/12, Digital Rights Ireland and Seitlinger and Others, April 2014, which resulted in the invalidation of the Data Retention Directive

[11] Case C-362/14, Maximillian Schrems v Data Protection Commissioner, October 2015, which resulted in the declaration of invalidity of the Safe Harbour agreement.

[12] Roman Zakharov v. Russia (application no. 47143/06), ECtHR, December 2015

Finding Proportionality in Surveillance Laws – Guest post by Andrew Murray

The United Kingdom Parliament is currently in the pre-legislative scrutiny phase of a new Investigatory Powers Bill, which aims to “consolidate existing legislation and ensure the powers in the Bill are fit for the digital age”. It is fair to say this Bill is controversial, with strong views being expressed by both critics and supporters. Against this backdrop it is important to cut through the rhetoric, get to the heart of the Bill and examine what it will do and what it will mean in terms of the legal framework for British citizens, and indeed for those overseas.

The Investigatory Powers Bill

Much of the Bill’s activity is to formalise and restate pre-existing surveillance powers. One of the key criticisms of the extant powers of the security and law enforcement services is that the law lacks clarity. Indeed, it was this lack of clarity which led the Investigatory Powers Tribunal to rule in the landmark case of Liberty v GCHQ that the regulations which covered GCHQ’s access to emails and phone records intercepted by the US National Security Agency breached Articles 8 and 10 of the European Convention on Human Rights. Following a number of strong critiques of the law, including numerous legal challenges, the Government received three reports into the current law: the report of the Intelligence and Security Committee of Parliament, “Privacy and Security: A modern and transparent legal framework”; the report of the Independent Reviewer of Terrorism Legislation, “A Question of Trust”; and the report of the Royal United Services Institute, “A Democratic Licence to Operate”. All three reported deficiencies in the law’s transparency.

As a result the Bill restates much of the existing law in a way which should be more transparent and which, in theory, should allow for greater democratic and legal oversight of the powers of the security and law enforcement services. In essence the Bill is split into sections – interception, retention, equipment interference and oversight – with each of the three substantive powers split again into targeted and bulk. What this means in practice is the authorisation of three broad types of activity (each of which has sub-types): the authorisation to intercept data between sender and receiver; the authorisation to retain data, such as communications data and internet connection records (more below), for possible processing later; and the authorisation to interfere with (in colloquial terms, “hack”) systems and devices. For each of these there is a split between targeted activity, which is required when dealing with communications sent and received by individuals who are inside the British Islands (domestic communications), and bulk activity, which is permissible where either the sender or receiver (or both) of the communications is located outside the British Islands.
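The targeted/bulk split thus turns on where the two ends of a communication are located. The toy Python sketch below is my own simplification of that distinction, not the Bill’s drafting, but it captures the basic rule just described.

    def warrant_category(sender_in_british_islands, receiver_in_british_islands):
        """Toy classification of the Bill's targeted/bulk distinction."""
        if sender_in_british_islands and receiver_in_british_islands:
            return "targeted"        # domestic communication
        return "bulk permissible"    # at least one end outside the British Islands

    print(warrant_category(True, True))    # targeted
    print(warrant_category(True, False))   # bulk permissible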

Two of the more controversial aspects of the Bill are the oversight provisions and the introduction of a new form of retained data, so-called “internet connection records”.

Proportionality

The retention of internet connection records is an entirely new power found in the Bill. It is an extension of the extant, but currently legally uncertain, data retention powers found in the Data Retention and Investigatory Powers Act 2014 (DRIPA). This new power is thus controversial on two bases: (1) it fails to meet the proportionality principle, on the basis that it fails to comply with the EU Charter of Fundamental Rights; (2) even if the current law is proportionate, an extension of powers is almost certainly disproportionate. With regard to the first of these, the current law, as contained in DRIPA, is subject to an ongoing legal challenge brought by MPs David Davis and Tom Watson, supported by Liberty. The case, Secretary of State for the Home Department v David Davis MP and others [2015] EWCA Civ 1185, has recently been referred by the Court of Appeal to the Court of Justice of the European Union, with the Court of Appeal asking the CJEU to rule on whether the ground-breaking case of Digital Rights Ireland Ltd v Minister for Communications, Marine and Natural Resources & Others – the case which ruled that European data retention laws were incompatible with Articles 7 & 8 of the EU Charter – also binds national legislators in the making of domestic data retention laws. Thus the current status of domestic data retention laws is unclear, yet while this case remains under review the Bill seeks to extend the powers of the state to order the retention of data: from the simple, yet still very invasive, power to retain all traffic data on our communications, to also cover internet connection records, described in the guide to the Bill as “a record of the internet services a specific device has connected to, such as a website or instant messaging application.” This would be data such as which banking services we use and which rail company or airline we tend to favour – data which may reveal much about us, including gender, ethnicity, religious beliefs, medical conditions and much more. University of East Anglia law lecturer Paul Bernal has written upon this issue very eloquently in his blog. As he notes, despite the Home Office’s best attempts to paint these as akin to itemised phone records, they are much more invasive of personal privacy; they are also clearly likely to be more invasive than the mere retention of communications records, a practice ruled illegal under EU law in Digital Rights Ireland and which at domestic law is currently under review. It is difficult to see how this new provision could be seen to be proportionate.

The second key battleground over the Bill is likely to be the oversight procedure for the issuance of warrants. The three reports were split as to whether Ministers or judges should issue warrants. The Intelligence and Security Committee felt the power should remain with Ministers, as “Ministers are able to take into account the wider context of each warrant application and the risks involved, whereas judges can only decide whether a warrant application is legally compliant”. The Independent Reviewer of Terrorism Legislation recommended that “Specific interception warrants, combined warrants, bulk interception warrants and bulk communications data warrants should be issued and renewed only on the authority of a Judicial Commissioner”; however, he recommended that the Secretary of State should be allowed to issue a national security certificate where the application related to “the interests of the defence and/or foreign policy of the UK”, and that in such cases the “Judicial Commissioner in determining whether to issue the warrant should be able to depart from that certificate only on the basis of the principles applicable in judicial review” – a provision sometimes called a “double lock”. Finally, the RUSI report recommended something very similar to the Independent Reviewer, with warrants for a purpose relating to the detection or prevention of serious and organised crime “always being authorised by a judicial commissioner”, while warrants for purposes relating to national security (including counter-terrorism, support to military operations, diplomacy and foreign policy) and economic well-being should be authorised by the secretary of state, subject to judicial review by a judicial commissioner. The provisions of the Bill, though, are quite different. Despite the recommendations of both the Independent Reviewer of Terrorism Legislation and RUSI that warrants in relation to serious crime be issued by a Judicial Commissioner, they will continue to be issued by the Secretary of State or by Scottish Ministers. All forms of warrant, including national security warrants, will, however, be subject to review by Judicial Commissioners under cl.19 of the Bill. There remains a further complication. While the RUSI and Independent Reviewer of Terrorism Legislation reports suggested that the Judicial Commissioner should apply “principles applicable in judicial review” only in relation to national security warrants, by cl.19 all warrants will be restricted to this narrow set of principles – essentially illegality, fairness, irrationality and proportionality.

There have been a number of critiques of the way the double lock system has been set up, with, among others, David Davis MP (one of the DRIPA challengers) and the Shadow Home Secretary being highly critical. Again, the proportionality of the legislation is questionable. In terms of domestic intercept warrants, which Davis in his comment notes “should not be a political decision”, it is questionable whether the role of the Secretary of State is compliant with the spirit, if not the law, of Article 8 ECHR, as well as with Article 6’s “independent and impartial” requirement. One must ask whether it is proportionate, or even relevant, to involve a minister of cabinet rank – a political decision-maker – in a decision as to whether a warrant should be issued to intercept communications in an organised crime case. One of the many benefits of our legal systems in the United Kingdom is that judges are appointed and not elected, allowing them to remain apart from the political process. To retain a role for a political office holder in warrants such as these, against the recommendations of the RUSI and Independent Reviewer of Terrorism Legislation reports, appears disproportionate.

 

Andrew Murray is Professor of Law at London School of Economics. He is the author of Information Technology Law: The Law and Society. He is a leading expert in Information Technology Law and Regulation and has written many articles on aspects of the interface between information technology and the legal framework including surveillance and data protection laws.

Notes from the IP Bill Committee session

I was one of the panel of academic witnesses before the specially convened Draft Investigatory Powers Bill Select Committee on Monday 7th December. It was my first time before a Parliamentary Committee and I have to admit I was a little intimidated: from queueing up beneath the statue of Oliver Cromwell to walking through what CP Snow referred to as the ‘corridors of power’. It’s a cliché, but there really is a corridor off which the Committee Rooms are reached – it has a little of the Alice in Wonderland about it, but the thing that I noticed most whilst waiting to be called was that almost everyone seemed to be a bit lost. In relation to the Investigatory Powers Bill, that might be more than a little appropriate.

The panel I was on was pretty intimidating too: Professor Ross Anderson, one of the best computer science brains on the planet; Professor Sir David Omand, former head of GCHQ, Permanent Secretary at the Home Office and then Permanent Secretary and Security and Intelligence Co-ordinator in the Cabinet Office under Blair; and Professor Mark Ryan of Birmingham University, another highly distinguished computer scientist. It really was intimidating at first – feeling the weight of the place, the seriousness of the subject and the crucial part that a Parliamentary Committee is supposed to play in the process of scrutinising and passing laws. And as the chair of the Committee, Lord Murphy of Torfaen, said in his opening remarks, this bill was crucial – perhaps the most important bill in this parliamentary session.


Once the session started, though, I found the level of intimidation diminished rapidly – because, in part at least, it was impossible for me not to become immersed in the discussion. It is easy (and often appropriate) to be cynical about our parliamentary process, but seeing it first hand, in this committee at least, it was clear that enough of the members of the committee really wanted to learn, and really wanted to understand the issues, that there was at least a chance that their scrutiny would have some kind of effect. The initial questions, which had been set out before the session, were reasonably good, but the follow ups and the discussions that arose were much better.

The choice of witnesses was interesting: having Ross Anderson at one end of the panel and Sir David Omand at the other created an interesting dynamic from the start. Sir David seemed to have a particular role in mind – a ‘reasonable’ voice, confirming that everything was OK, that the Bill, as it was written, was clear, balanced, fair and ‘world-leading’. As a number of people pointed out to me after the event, you could tell whether you’d made a good point by the speed and vehemence with which Sir David responded. There were a few key moments on that score, and I hope there is proper follow-up on them.

The first is the Danish ‘session-logging’ experience – the nearest equivalent to the proposed ‘Internet Connection Record’ idea in the new Bill – which resulted in around 7 years of wasted money, time and effort, providing almost no help to the police at all, before it was abandoned. When I mentioned it, Sir David interjected immediately that the Home Office was planning to do it very differently. It would be interesting to know how they are doing it differently. I suspect that further investigation could convince the Committee that the problem wasn’t (and isn’t) the technical implementation but the fundamental approach. Session logging didn’t work in Denmark not because the Danes don’t have our technological expertise, but because it’s a fundamentally flawed approach.

The second was the idea that communications data is less intrusive than content – as the three other members of the panel know, that might have been true once, but it is no longer. The intrusion is different, but it isn’t less. Indeed, because of the possibilities for analysis, the greater difficulty in disguising it and the increasing ability to use it for profiling, it is likely that the balance will shift very much the other way, with communications data becoming more important and more intrusive than content.

There were many other things covered – but we had far less time than we needed to explore them in depth. That’s why I shall also be taking up the invitation of the Committee to submit written evidence as well as oral – and why I would seriously advise others to do the same. I was lucky enough to be on a panel – but the written evidence will be even more critical. This Committee, it seemed to me, wanted to learn, and should be given the opportunity. Do take it up! Written submissions will be accepted until 21st December. To submit, follow the link here:

http://www.parliament.uk/business/committees/committees-a-z/joint-select/draft-investigatory-powers-bill/publications/written-evidence-form/

The video of the session can be found here:

http://videoplayback.parliamentlive.tv/Player/Index/80ee52fd-8719-4a57-85a3-f64ad9567559?audioOnly=False&autoStart=False&statsEnabled=True

A ‘mature debate’ on surveillance? Yes please!

Andrew Parker, the head of MI5, has said in a speech that he is hoping for a ‘mature debate’ on what he calls ‘intercepting communications data’ rather than surveillance: I’m sure that most people working in the area would very much welcome such a call. I know that I do. Mature debate is exactly what is needed. The question that immediately springs to mind is whether what Andrew Parker means by ‘mature debate’ is the same as what I would understand by the words. The record of the intelligence and security services and the government in relation to such a debate is not a very convincing one: it has been those who challenge surveillance powers who have shown more desire and willingness to debate than the services and their masters in government.

To suggest otherwise – indeed to hint that those challenging them have behaved like petulant, hyperbolic children – flies in the face of the experience of the last few years. There has been hype on both sides, of course – I can see why Parker and others dislike the term ‘Snoopers’ Charter’, for example – but on the other side the claims have been equally lurid and offensive: the suggestions by Theresa May and others that privacy advocates have ‘blood on their hands’ for opposing new powers have been regular and repellent. The record of seeking debate, however, has been distinctly one-sided. Back in 2012, when the coalition government first put forward the Communications Data Bill – dubbed by its ‘hyperbolic’ opponents the Snoopers’ Charter – the intention was to push it through without any real debate at all. Indeed, the hints were that it would be passed in a matter of weeks, before the London Olympics. It took a lot of pressure to force the bill into proper scrutiny, and a special Joint Parliamentary Committee was eventually formed to examine it. Debate was very much sought by those concerned about interception and surveillance powers: over 600 pages of written evidence were submitted to the committee from more than 100 witnesses (including myself). So yes, we want mature debate, whenever we get the chance.

That first batch of ‘mature debate’ did not get the results that the proponents of the Communications Data Bill wanted: the report of the Joint Parliamentary Committee was highly critical, and after the intervention of the then Deputy Prime Minister, Nick Clegg, the bill was dropped, with a promise of further debate and a new Bill to scrutinise. That new Bill, however, never materialised (though I understand that it was drafted) and neither did the promised further debate. Again, it was not those who challenged surveillance and interception that were avoiding the debate. Very much the opposite: we wanted more information and more debate, and our questions were largely fobbed off.

That debate, however, began to happen even without the participation of the intelligence and security services, when in June 2013 Edward Snowden dropped his bombshell on the whole business. The debate that followed might not have been mature at all times, but it was a debate – despite the efforts of the intelligence and security services, not because of those efforts. Indeed, most of the efforts seemed to be to shut down the debate, to shut Edward Snowden up, along with those in the media who worked with him, arresting them at airports, smashing their hard drives and so forth. Keith Vaz questioning whether Guardian Editor Alan Rusbridger ‘loved his country’ was a particularly mature part of this debate. All this was accompanied by yet more mature suggestions about opponents of surveillance having blood on their hands. The maturity level was immense.

Then, when the mature debate actually began – the three big inquiries, from the Intelligence and Security Committee, the Independent Reviewer of Terrorism Legislation and the Royal United Services Institute – along came the next attempt to shut down that debate: DRIP. The shabby process through which the Data Retention and Investigatory Powers Act was rushed through parliament in a matter of days without any opportunity for public debate and only a few brief hours of parliamentary debate – in a mostly empty chamber with MPs preoccupied with preparations for the forthcoming election – was about as far from opening up to mature debate as could be imagined. Barely a debate at all, let alone a mature one.

Even after that, there was a further attempt to force through legislation without debate – four members of the House of Lords, all associated in the past with the security side of government, tacked on pretty much the entire, rejected Communications Data Bill to the back of another bill, very late in the parliamentary process, to try to sneak in those powers once more without debate.

So, Andrew Parker, let’s have this mature debate. Please. As soon and as deeply as we can. But don’t pretend that you’ve been seeking it all along, or that those who are challenging you have wanted anything else.  What is more, let’s make sure it is a mature debate, and not the sort of ‘debate’ that happens when one side has all the power and has predetermined the result, like a parent telling a three-year-old what the rules are for their behaviour. A mature debate must leave a chance for different results. In this case in particular, mature debate does not mean a Brian Clough style discussion where you tell us your opinion, we tell you your opinion, and we agree that you are right. There has to be a possibility – and you have to be open to this possibility – that the powers of the intelligence and security services are in practice (as well as in law) curtailed. If there is no possibility of change, the debate – mature or immature – is meaningless.

Are you ready for this kind of debate? I hope so. Let’s have it as soon as we can.

A shout out for the Open Rights Group!

Today is #DigitalRightsMatter day – and yes, I know there are days for many things (including, despite the complaints from some, an International Men’s Day (November 19th)). I’m usually fairly cynical about these days – but they do serve a purpose – to focus minds on significant issues, and hopefully to find ways to actually do something about them. In this case, the issue is digital rights – one close to my heart – and the thing to do is to support the Open Rights Group (ORG).

I should say, right from the start, that I’m on the Advisory Council of ORG, so I have something of a vested interest – but I’m only on the Advisory Council because I think what ORG does is of critical importance, particularly right now. Never has there been a time when digital rights have been more important, and never has there been a time when they have been more under threat. We use the internet for more and more things – from our work to our personal life, from our political activism to our entertainment, from finding jobs to finding romance. Indeed, there are pretty much no parts of our lives that are untouched by the internet – so what happens online, what happens to our digital freedoms and rights, is of ever increasing importance.

Now is when we need them

The threats that we face to our freedoms are growing at a seemingly exponential rate. Surveillance is almost everywhere, and the political pressure to increase it is frightening. Censorship, the other side of that authoritarian coin, is growing almost as fast – from more and more uses for ‘web-blocking’ to ‘porn’ filters that hide vastly more than porn, from critically important sex education websites to sites that discuss alcohol, anorexia and hate speech. David Cameron talks about banning encryption without seemingly having any idea of what he’s talking about – or the implications of his suggestions.

This last point highlights one of the reasons ORG is critically important right now. Politicians from all the mainstream parties seem to have very little grasp of how the internet works – and they reach for ‘easy’ solutions which get the right headlines in the tabloid press but are almost always counterproductive and authoritarian, and which encourage the perpetuation of damaging myths that will make things continue to get worse. The media, left to their own devices, also have a tendency to look for easy headlines and worse.

That’s one of the places that ORG comes in. It campaigns on these issues – current campaigns include ‘Don’t Spy On Us’, dealing with surveillance; Blocked!, which looks at filtering; and 451 Unavailable, which tries to bring transparency to the blocking of websites by court orders. It produces information that cuts through the confusion and makes sense of these issues – and tries to help politicians and the media to understand them better. And it works – ORG representatives are now quoted regularly in the media, and when they make submissions to government inquiries they’re the ones who are given hearings and referred to in reports.

They do much more than this. They help with court cases, working with other excellent advocacy groups like Privacy International – the current challenge to the Data Retention and Investigatory Powers Act (DRIPA) is just one of many they’ve been involved in, and these cases really matter. They don’t always win – indeed, sadly, they don’t win often – but they often force the disclosure of critical information, they sometimes bring about changes in the law, and they raise the profile of critical issues. ORG is also part of the important European organisation EDRi, which brings together digital rights groups from all over Europe to even greater effect.

Now is when they need us

ORG, like other advocacy groups, regularly punches above its weight. It doesn’t have the massive resources of the government agencies and international corporations whose activities it often has to campaign against. There are no deep pockets in ORG, and no massive numbers of staff – they rely on donations and on volunteers. That’s where #DigitalRightsMatter day comes in – ORG is trying to find new members, get more donations and gain access to more expertise. Can you help?

ORG’s joining page is here

Their blog about #DigitalRightsMatter day is here

I would encourage anyone to consider joining – because Digital Rights really do matter, and not just on #DigitalRightsMatter day.