A better debate on surveillance?

Back in 2015, Andrew Parker, the head of MI5, called for a ‘mature debate’ on surveillance – in advance of the Investigatory Powers Bill, the surveillance law which has now almost finished making its way through parliament, and will almost certainly become law in a few months’ time. Though there has been, at least in some ways, a better debate over this bill than over previous attempts to update the UK’s surveillance law, it still seems as though the debate in both politics and the media remains distinctly superficial and indeed often deeply misleading.

It is in this context that I have a new academic paper out: “Data gathering, surveillance and human rights: recasting the debate”, in a new journal, the Journal of Cyber Policy. It is an academic piece, and access, sadly, is relatively restricted, so I wanted to say a little about the piece here, in a blog which is freely accessible to all – at least in places where censorship of the internet has not yet taken full hold.

The essence of the argument in the paper is relatively straightforward. The debate over surveillance is simplified and miscast in a number of ways, and those ways in general tend to make surveillance seem more positive and effective than it is, and with less broad and significant an impact on ordinary people than it might have. The rights that it impinges upon are underplayed, and the side-effects of the surveillance are barely mentioned, making surveillance seem much more attractive than it should be – and hence decisions are made that might not have been made if the debate had been better informed. If the debate is improved, then the decisions will be improved – and we might have both better law and better surveillance practices.

Perhaps the most important way in which the debate needs to be improved is to understand that surveillance does not just impact upon what is portrayed as a kind of selfish, individual privacy – privacy that it is implied does not matter for those who ‘have nothing to hide’ – but upon a wide range of what are generally described as ‘civil liberties’. It has a big impact on freedom of speech – an impact that has been empirically evidenced in the last year – and upon freedom of association and assembly, both online and in the ‘real’ world. One of the main reasons for this – a reason largely missed by those who advocate for more surveillance – is that we use the internet for so many more things than we ever used telephones and letters, or even email. We work, play, romance and research our health. We organise our social lives, find entertainment, shop, discuss politics, do our finances and much, much more. There is pretty much no element of our lives that does not have a very significant online element – and that means that surveillance touches all aspects of our lives, and any chilling effect doesn’t just chill speech or invade selfish privacy, but almost everything.

This, and much more, is discussed in my paper – which I hope will contribute to the debate, and indeed stimulate debate. Some of it is contentious – the role of commercial surveillance and the interaction between it and state surveillance – but that too is intentional. Contentious issues need to be discussed.

There is one particular point that often gets missed – the question of when surveillance occurs. Is it when data is gathered, when it is algorithmically analysed, or when human eyes finally look at it? In the end, this may be a semantic point – what technically counts as ‘surveillance’ is less important than what actually has an impact on people, which begins at the data gathering stage. In my conclusion, I bring out that point by quoting our new Prime Minister, from her time as Home Secretary and chief instigator of our current manifestation of surveillance law. This is how I put it in the paper:

“Statements such as Theresa May’s that ‘the UK does not engage in mass surveillance’ though semantically arguable, are in effect deeply unhelpful. A more accurate statement would be that:

‘the UK engages in bulk data gathering that interferes not only with privacy but with freedom of expression, association and assembly, the right to a free trial and the prohibition of discrimination, and which puts people at a wide variety of unacknowledged and unquantified risks.’”

It is only when we can have clearer debate, acknowledging the real risks, that we can come to appropriate conclusions. We are probably too late for that to happen in relation to the Investigatory Powers Bill, but given that the bill includes measures such as the contentious Internet Connection Records that seem likely to fail, in expensive and probably farcical ways, the debate will be returned to again and again. Next time, perhaps it might be a better debate.

Labour and the #IPBill

I am a legal academic, specialising in internet privacy – a lecturer at the UEA Law School. I am the author of Internet Privacy Rights: Rights to Protect Autonomy, published by Cambridge University Press in 2014, and was one of the academics who was a witness before the Joint Parliamentary Committee on the Investigatory Powers Bill. I am also a member of the Labour Party – this piece is written from all of those perspectives.


Labour and the Investigatory Powers Bill

The Investigatory Powers Bill has its second reading on Tuesday – part of what appears to be an attempt to pass the Bill with unseemly haste. One of the biggest questions is how Labour will approach the Bill – the messages so far have been mixed. Andy Burnham’s press release on the 1st of March in response to the latest draft was from my perspective the best thing that has emerged from Labour in relation to surveillance in many decades, if not ever.

What is important is that Labour builds on this – for in taking a strong and positive approach to the Investigatory Powers Bill, Labour has a chance to help shape its future in other areas. What is more, Labour can tap into some of its best and most important traditions and realise the promise of some of its best moments.

Demand more time

The first and most important thing that Labour should do at this stage is demand more time for scrutiny of the bill. There are some very significant issues that have not received sufficient time – the three parliamentary committees that have examined the bill so far (the Science and Technology Committee, the Intelligence and Security Committee and the specially convened Joint Parliamentary Committee on the Investigatory Powers Bill) all made that very clear. The Independent Reviewer of Terrorism Legislation, David Anderson QC, has also been persistent in his calls for more time and more careful scrutiny – most recently in his piece in the Telegraph where he said:

“A historic opportunity now exists for comprehensive reform of the law governing electronic surveillance. Those who manage parliamentary business must ensure that adequate time – particularly in committee – is allowed before December 2016.”

David Anderson is right on all counts – this is a historic opportunity, and adequate time is required for that review. How Labour responds could well be the key to ensuring that this time is provided: a strong response now, and in particular the willingness to reject the bill in its entirety unless sufficient time is given, would put the government in a position where it has to provide that time.

As well as pushing for more time, there are a number of things that Labour – and others – should be requiring in the new bill, many of which were highlighted by the three parliamentary committees but have not been put into the new draft bill.

Proper, independent oversight

The first of these is proper, independent oversight – oversight not just of how the powers introduced or regulated by the bill are being used in a procedural way (whether warrants are being appropriately processed and so forth) but of whether the powers are actually being used in the ways that parliament envisaged and that the people were told, and so forth. Reassurances made need to be not just verified but re-examined – and as time moves on, as technology develops and as the way that people use that technology develops, it needs to be possible to keep asking whether the powers remain appropriate.

The oversight body needs not just to be independent, but to have real powers. Powers to sanction, powers to notify, and even powers to suspend the functioning of elements of the bill should those elements be found to be no longer appropriate or to have been misused.

Independent oversight – as provided, for example, by the Independent Reviewer of Terrorism Legislation – is not just valuable in itself, but in the way that it can build trust. Building trust is critical in this area: a lot of trust has been lost, as can be seen by the rancorous nature of a lot of the debate. It would help everyone if that rancour is reduced.

Re-examine and rebalance ‘Bulk Powers’

One of the most contentious areas in the bill is that of ‘Bulk Powers’: bulk interception, bulk acquisition (of communications data), bulk equipment interference (which includes what is generally referred to as ‘hacking’) and bulk personal datasets. These powers remain deeply contentious – and potentially legally challengeable. There are specific issues with some of them – those with bulk equipment interference sufficiently big that the Intelligence and Security Committee recommended its removal from the bill.

It is these powers that lead to the accusation that the bill involves ‘mass surveillance’ – and it is not sufficient for the Home Secretary simply to deny this. Her denials appear based on a semantic argument about what constitutes ‘surveillance’ – an argument that potentially puts her at odds with both the European Court of Human Rights and the Court of Justice of the European Union. It also puts the UK increasingly at odds with opinion around the world. The UN’s Special Rapporteur on the right to privacy, Joseph A. Cannataci, said in his Report to the UN Human Rights Council on the 8th March:

“It would appear that the serious and possibly unintended consequences of legitimising bulk interception and bulk hacking are not being fully appreciated by the UK Government.”

Much more care is needed here if the Investigatory Powers Bill is to be able to face up to legal challenge and to avoid damaging not only people’s privacy but also the worldwide reputation of the UK. Again, proper and independent oversight would help here, as well as stronger limits on the powers.

An independent feasibility study for ICRs

The Home Office have described ‘Internet Connection Records’ as the one genuinely new part of the Investigatory Powers Bill: it is also one of the most concerning. Critics have come from many directions. Privacy advocates note that they are potentially the most intrusive measure of all, gathering what amounts to substantially all of our internet browsing history – and creating databases of highly vulnerable data, undermining rather than enhancing security and creating unnecessary risks. Industry experts have suggested they would be technically complex, extortionately expensive and extremely unlikely to achieve the aims that have been suggested. All three parliamentary committees asked for more information and clarity – and yet that clarity has not been provided. The suggestion that ICRs are like an ‘itemised phone bill’ for the internet has been roundly criticised (notably by the Joint IP Bill Committee) and yet it appears to remain the essential concept and underpinning logic of the idea.
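
To make the concern concrete, here is a purely hypothetical sketch of what a single ICR-style record might contain, working from the ‘itemised phone bill’ analogy. The field names are my own illustration – the bill itself does not specify a record format this precisely.

```python
# A purely hypothetical sketch of what a single Internet Connection Record
# might contain, based on the 'itemised phone bill' analogy. Field names are
# illustrative only - the bill does not define the record format.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class InternetConnectionRecord:
    subscriber: str         # the customer account the record is attributed to
    source_ip: str          # the customer's IP address at the time
    destination: str        # the service connected to, e.g. a website's domain
    destination_ip: str
    port: int               # e.g. 443 for HTTPS
    session_start: datetime
    session_end: datetime
    bytes_transferred: int  # how much data, not what was actually said or read

# One such record for every service every customer connects to, retained for
# up to twelve months - which is why critics describe the result as, in effect,
# substantially all of our internet browsing history at the level of the sites
# we visit.
example = InternetConnectionRecord(
    subscriber="account-0001",
    source_ip="203.0.113.7",
    destination="example-health-forum.org",
    destination_ip="198.51.100.23",
    port=443,
    session_start=datetime(2016, 3, 1, 9, 45),
    session_end=datetime(2016, 3, 1, 9, 52),
    bytes_transferred=1482339,
)
```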

Given all this, to introduce the idea without proper testing and discussion with the industry seems premature and ill conceived at best. If the idea cannot be rejected outright, it should at least be properly tested – and again, with independent oversight. Instead of including it within the bill, a feasibility study could be mounted – a year of working with industry to see if the concept can be made to work, without excessive cost, and producing results that can actually be useful, can be properly secured and so forth. If at the end of the feasibility study the evidence suggests the idea is workable, it can be added back into the bill. If not, alternative routes can be taken.

Reassess encryption

Perhaps the most contentious issue of all at present is the way in which the bill addresses encryption. All three parliamentary committees demanded clarity over the matter – particularly in relation to end-to-end encryption. That clarity is conspicuous by its absence in the bill. Whether the lack of clarity is intentional or not is somewhat beside the point: the industry in particular needs clarity. Specifically, the industry needs the government to be clear in the legislation that it will not ban end-to-end encryption, demand that ‘back doors’ be built into systems, or pressurise companies to build in those back doors or weaken their encryption systems.
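
For readers who want to see what is at stake technically, here is a minimal sketch of end-to-end encryption – using the PyNaCl library purely as one example of how such schemes are commonly built, not as a description of any particular product. The point is that only the endpoints ever hold the keys, so the service carrying the message has nothing meaningful it could hand over: a ‘back door’ cannot be added for one purpose without weakening the whole system.

```python
# A minimal, illustrative sketch of end-to-end encryption using PyNaCl
# (pip install pynacl). Only the endpoints hold keys, so the relaying
# service cannot decrypt - and there is no way to give it that ability
# without weakening the scheme for everyone.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"see you at 8")

# The service in the middle only ever sees 'ciphertext' - random-looking
# bytes. Without Bob's private key there is nothing meaningful it could
# disclose, however it is compelled.

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_private, alice_private.public_key)
assert receiving_box.decrypt(ciphertext) == b"see you at 8"
```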

The current position not only puts the government at odds with the industry, it puts it at odds with computer scientists around the world. The best of those scientists have made their position entirely clear – and yet still the government seems unwilling to accept what both scientists and industry are telling them. This needs to change – what is being suggested right now is dangerous to privacy and security and potentially puts the UK technology industry at a serious competitive disadvantage compared to the rest of the world.

Working with industry and science

Therein lies one of the most important keys: working with rather than against the IT industry and computer scientists. Plans such as those in the Investigatory Powers Bill should have been made with the industry and science from the very start – and the real experts should be listened to, not ridden roughshod over. Inconvenient answers need to be faced up to, not rejected. Old concepts should not be used as models for new situations when the experts tell you otherwise.

This is where one of Labour’s longest traditions should come into play. Harold Wilson’s famous Scarborough speech in 1963, where he talked about the ‘white heat’ of technology is perhaps even more apt now than it was all those years ago. Labour should be a modernising party – and that means embracing technology and science, listening to scientists and learning from them, using evidence-based policy and all that entails. Currently, the Investigatory Powers Bill is very much the reverse of that – but it still could become that, if appropriate changes are made.

Protecting ordinary people

Labour should also be tapping into another strong tradition – indeed in many ways its founding tradition. Labour was born to support and protect working people – ‘ordinary’ people in the positive sense of that word. Surveillance, in practice, often does precisely the opposite – it can be used by the powerful against those with less power. It can be politically misused – and the history of surveillance of trade unionists and left-wing activists is one of which the Labour Party should be acutely aware. Without sufficient safeguards and limitations, any surveillance system can and will be misused, and often in precisely these kinds of ways.

Labour could and should remember this – and work very hard to ensure that those safeguards and limitations are built in. Some of the measures outlined above – proper oversight, rebalancing bulk powers, a feasibility study on ICRs in particular – are intended to do precisely that.

Not ‘soft’ but strong

Building in these safeguards, working with technology industries and scientists, and protecting rather than undermining encryption should not be seen as something ‘soft’ – and any suggestion that opposing the measures currently in the Bill is somehow being ‘soft’ on terrorists and paedophiles should not just be rejected but should be turned on its head. The current bill will not protect us in the ways suggested – indeed, it will make us less secure and more at risk from cybercriminals, create more openings for terrorists and others, and could be a massive waste of money, time and expertise. That money, time and expertise could be directed in ways that do provide more protection.

What is more, as noted above, the current bill would be much more vulnerable to legal challenge than it should be. That is not a sign of strength: very much the opposite.

Labour’s future direction

Most of these issues are relevant to all political parties – but for Labour the issue is particularly acute. Labour is currently trying to find a new direction – the challenge presented by the Investigatory Powers Bill could help it find one. A positive approach could build on the old traditions outlined above, as well as the human rights tradition built in Blair’s early years: the Human Rights Act is one of New Labour’s finest achievements, despite the bad treatment it receives in the press. A party that forges alliances with the technology industry and with computer science, one that embraces the internet rather than seeing it as a scary and dangerous place to be corralled and controlled, is a party that has a real future. Labour wants to engage with young people – so be the party that supports WhatsApp rather than tries to ban it or break it. Be the party that understands encryption rather than fights against it.

All this could begin right now. I hope Labour is up to the challenge.

 

 

The new IP Bill…. first thoughts…

This morning, in advance of the new draft of the Investigatory Powers Bill being released, I asked six questions:

[Screenshot: the six questions]

At a first glance, they seem to have got about 2 out of 6, which is perhaps better than I suspected, but  not as good as I hoped.

  1. On encryption, I fear they’ve failed again – or if anything made things worse. The government claims to have clarified things in S217 and indeed in the Codes of Practice – but on a first reading this seems unconvincing. The Communications Data Draft Code of Practice section on ‘Maintenance of a Technical Capability’ relies on the idea of ‘reasonability’ which in itself is distinctly vague. No real clarification here – and still the possibility of ordering back-doors via a ‘Technical Capability Notice’ looms very large. (0 out of 1)
  2. Bulk Equipment Interference remains in the Bill – large-scale hacking ‘legitimised’ despite the recommendation from the usually ‘authority-friendly’ Intelligence and Security Committee that it be dropped from the Bill. (0 out of 2)
  3. A review clause has been added to the Bill – but it is so anaemic as to be scarcely worth its place. S222 of the new draft says that the Secretary of State must prepare a report by the end of the sixth year after the Bill is passed, publish it and lay it before parliament. This is not a sunset clause, and the report prepared is not required to be independent or undertaken by a review body, just by the Secretary of State. It’s a review clause without any claws, so worth only 1/4 a point. (1/4 out of 3)
  4. At first read-through, the ‘double-lock’ does not appear to have been notably changed, but the ‘urgent’ clause has seemingly been tightened a little, from 5 days to 3, but even that isn’t entirely clear. I’d give this 1/4 of a point (so that’s 1/2 out of 4)
  5. The Codes of Practice were indeed published with the bill (and are accessible here) which is something for which the Home Office should be applauded (so that’s 1 and 1/2 out of 5)
  6. As for giving full time for scrutiny of the Bill, the jury is still out – the rumour is second reading today, which still looks like undue haste, so the best I can give them is 1/2 a point – making it a total of 2 out of 6 on my immediate questions.
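
For anyone checking my arithmetic, the running tally is simply the individual scores added together – a trivial sketch (in Python, purely for convenience), with the fractions exactly as I assigned them above:

```python
# The per-question scores as assigned above, just to make the running
# tally explicit.
from fractions import Fraction

scores = {
    "encryption": Fraction(0),
    "bulk equipment interference": Fraction(0),
    "review clause": Fraction(1, 4),
    "double-lock / urgency": Fraction(1, 4),
    "codes of practice published": Fraction(1),
    "time for scrutiny": Fraction(1, 2),
}

total = sum(scores.values())
print(total, "out of", len(scores))  # 2 out of 6
```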

That’s not quite as bad as I feared – but it’s not as good as it might have been and should have been. Overall, it looks as though the substance of the bill is largely unchanged – which is very disappointing given the depth and breadth of the criticism levelled at it by the three parliamentary committees that examined it. The Home Office may be claiming to have made ‘most’ of the changes asked for – but the changes they have made seem to have been the small, ‘easy’ changes rather than the more important substantial ones.

Those still remain. The critical issue of encryption has been further obfuscated, the most intrusive powers – the Bulk Powers and the ICRs – remain effectively untouched, as do the most controversial ‘equipment interference’ powers. The devil may well be in the detail, though, and that takes time and careful study – there are people far more able and expert than me poring over the various documents as I type, and a great deal more will come out of that study. Time will tell – if we are given that time.

 

The Saga Of the Privacy Shield…


(With apologies to all poets everywhere)

 

Listen to the tale I tell

Of Princes bold and monsters fell

A tale of dangers well conceal’d

And of a bright and magic shield

 

There was a land, across the bay

A fair land called the USA

A land of freedom: true and just

A land that all the world might trust

 

Or so, at least, its people cheered

Though others thought this far from clear

From Europe all the Old Folk scowled

And in the darkness something howled

 

For a monster grew across the bay

A beast they called the NSA,

It lived for one thing: information

And for this it scoured that nation

 

It watched where people went and came

It listened and looked with naught of shame

The beast, howe’er, was very sly

And hid itself from prying eyes

 

It watched while folk from all around

Grew wealthy, strong and seeming’ sound

And Merchant Princes soon emerged

Their wealth it grew surge after surge

 

They gathered data, all they could

And used it well, for their own good

They gave the people things they sought

While keeping more than p’rhaps they ought

 

And then they looked across the bay

Saw Old Folk there, across the way

And knew that they could farm those nations

And take from them their information

 

But those Old Folk were not the same

They did not play the Princes’ game

They cared about their hope and glory

Their laws protected all their stories

 

‘You cannot have our information

Unless we have negotiations

Unless our data’s safe and sound

We’ll not let you plough our ground’

 

The Princes thought, and then procured

A harbour safe and quite secure

Or so they thought, and so they said

And those Old Folk gave them their trade

 

And so that trade just grew and grew

The Old Folks loved these ideas new

They trusted in that harbour’s role

They thought it would achieve its goal

 

But while the Princes’ realms just grew

The beast was learning all they knew

Its tentacles reached every nook

Its talons gripped each face, each book

 

It sucked up each and ev’ry drop:

None knew enough to make it stop

Indeed, they knew not what it did

‘Til one brave man, he raised his head

 

And told us all, around the world

‘There is a beast, you must be told’

He told us of this ‘NSA’

And how it watched us day by day

 

He told us of each blood-drenched claw

He named each tentacle – and more

And with each word, he made us fear

That this beast’s evil held us near

 

In Europe one man stood up tall

“Your harbour is not safe at all!

You can’t protect us from that beast

That’s not enough, not in the least!”

 

He went unto Bourg of Luxem

The judges listened care’fly to him

‘A beast ‘cross the bay sees ev’rywhere

Don’t send our secrets over there!

 

The judges liked not what they saw

‘That’s no safe harbour,’ they all swore

“No more stories over there!

Sort it out! We do all care!”

 

The Princes knew not what to do

They could not see a good way through

The beast still lurked in shadows dark

The Princes’ choices seemed quite stark

 

Their friends and fellows ‘cross the bay

Tried to help them find a way

They whispered, plotted, thought and plann’d

And then the Princes raised their hands

 

“Don’t worry now, the beast is beaten

It’s promised us you won’t be eaten

It’s changed its ways; it’s kindly now

And on this change you have our vow

 

Behold, here is our mighty shield

And in its face, the mighty yield

It’s magic, and its trusty steel

Is strong enough for all to feel

 

Be brave, be bold, you know you should

You know we only want what’s good”

But those old folk, they still were wary

That beast, they knew, was mighty scary

 

“That beast of yours, is it well chained?

Its appetites, are they contained?

Does it still sniff at every door?

Its tentacles, on every floor?”

 

The Princes stood up tall and proud

“We need no chains”, they cried aloud

“Our beast obeys us, and our laws

You need not fear its blunted claws.”

 

“Besides,” they said, “you are contrary

You have your own beasts, just as scary”

The Old Folk looked a mite ashamed

‘Twas true their own beasts were not tamed

 

“‘Tis true our beasts remain a blight

But two wrongs never make a right

It’s your beast now that we all fear

Tell us now, and make it clear!”

 

“Look here” the Princes cried aloud

“Of this fair shield we all are proud,

Its face is strong, its colours bright

There’s no more need for any fright.”


The Old Folk took that shield in hand

‘Twas shiny, coloured, bright and grand

But as they held it came a worry

Why were things in such a hurry?

 

Was this shield just made of paper?

Were their words just naught but vapour?

Would that beast still suck them dry?

And their privacy fade and die?

 

Did they trust the shield was magic?

The consequences could be tragic

The monster lurked and sucked its claws

It knew its might meant more than laws

 

Whatever happened, it would win

Despite the tales the Princes spin

It knew that well, and so did they

In that fair land across the bay.


Does the UK engage in ‘mass surveillance’?


When giving evidence to the Parliamentary Committee on the Draft Investigatory Powers Bill, Home Secretary Theresa May stated categorically that the UK does not engage in mass surveillance. The reaction from privacy advocates and many in the media was something to see – words like ‘delusional’ have been mentioned – but it isn’t actually as clear-cut as it might seem.

Both the words ‘mass’ and ‘surveillance’ are at issue here. The Investigatory Powers Bill uses the word ‘bulk’ rather than ‘mass’ – and Theresa May and her officials still refuse to give examples or evidence to identify how ‘bulky’ these ‘bulk’ powers really are. While they refuse, the question of whether ‘bulk’ powers count as ‘mass’ surveillance is very hard to determine. As a consequence, Theresa May will claim that they don’t, while skeptics will understandably assume that they do. Without more information, neither side can ‘prove’ they’re right.

The bigger difference, though, is with the word ‘surveillance’. Precisely what constitutes surveillance is far from agreed. In the context of the internet (and other digital data surveillance) there are, very broadly speaking, three stages: the gathering or collecting of data, the automated analysis of the data (including algorithmic filtering), and then the ‘human’ examination of the results of that analysis or filtering. This is where the difference lies: privacy advocates and others might argue that the ‘surveillance’ happens at the first stage – when the data is gathered or collected – while Theresa May, David Omand and those who work for them would be more likely to argue that it happens at the third stage – when human beings are involved.
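
To make the distinction concrete, here is a deliberately toy sketch of those three stages – my own illustration, not a description of any real system. Whichever stage one chooses to label ‘surveillance’, the data exists, and is searchable and vulnerable, from the moment the first stage runs.

```python
# A deliberately toy model of the three stages described above - my own
# illustration, not a description of any real system. The point is simply
# that the data exists from stage one, whatever label we attach to it.

def gather(traffic):
    """Stage 1: bulk collection - everything is copied and retained."""
    return list(traffic)

def automated_analysis(records, keyword):
    """Stage 2: algorithmic filtering - machines select what 'matters'."""
    return [r for r in records if keyword in r]

def human_examination(selected):
    """Stage 3: human eyes - the only stage some would call 'surveillance'."""
    for record in selected:
        print("analyst reads:", record)

traffic = ["alice -> health forum", "bob -> protest group", "carol -> bank"]

retained = gather(traffic)           # privacy arguably engaged here already
flagged = automated_analysis(retained, "protest")
human_examination(flagged)           # only one record ever reaches a human
```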

If the surveillance occurs when the data is gathered, there is little doubt that the powers envisaged by the Investigatory Powers Bill would constitute mass surveillance – the Internet Connection Records, which appear to apply to pretty much everyone (so clearly ‘mass’), would certainly count, as would the data gathered through ‘bulk’ powers, whether by interception or through the mysterious ‘bulk personal datasets’ about which we are still being told very little.

If, however, the surveillance only occurs when human beings are involved in the process, then Theresa May can argue her point: the amount of information looked at by humans may well not be ‘massive’, regardless of how much data is gathered. That, I suspect, is her point here. The UK doesn’t engage in ‘mass surveillance’ on her terms.

Who is right? Analogies are always dangerous in this area, but it would be like installing a camera in every room of every house in the UK, turning that camera on, having the footage recorded and stored for a year – but having police officers only look at limited amounts of the footage and only when they feel they really need to.

Does the surveillance happen when the cameras are installed? When they’re turned on? When the footage is stored? When it’s filtered? Or when the police officers actually look at it? That is the issue here. Theresa May can say, and be right, that the UK does not engage in mass surveillance, if and only if it is accepted that surveillance only occurs at the later stages of the process.

In the end, however, it is largely a semantic point. Privacy invasion occurs when the camera is installed and the capability of looking at the footage is enabled. That’s been consistently shown by recent rulings at both the Court of Justice of the European Union and of the European Court of Human Rights. Whether it is called ‘surveillance’ or something else, it invades privacy – which is a fundamental right. That doesn’t mean that it is automatically wrong – but that the balancing act between the rights of privacy (and freedom of expression, of assembly and association etc that are protected by that privacy) and the need for ‘security’ needs to be considered at the gathering stage, and not just at the stage when people look at the data.

In practice, too, the middle of the three stages – the automated analysis, filtering or equivalent – may be more important than the last one. Decisions are already made at that stage, and this is likely to increase. Surveillance by algorithm is likely to be (and may already be) more important than surveillance by human eyes, ears and minds. That means that we need to change our mindset about which part of the surveillance process matters. Whether we call it ‘mass surveillance’ or something else is rather beside the point.

The Surveillance Elephant in the Room…


Yesterday’s decision in the Court of Justice of the European Union (CJEU) in what has been dubbed the ‘Europe vs Facebook’ case was, as the Open Rights Group puts it, a ‘landmark victory for privacy rights’. Much has already been written about it. I do not propose to cover the same territory in any depth – the Open Rights Group blog post linked to above gives much of the background – but instead to examine the response of the European Commission, and the elephant in the Commission’s room: surveillance.

The judgment was published yesterday morning, and its essence was very simple. The ‘safe harbor’ agreement, which effectively allows personal data to be transferred from the EU to the US by some 4,000 or so companies, was declared invalid because, though under the agreement the relevant US companies promise to protect that data in many ways – keeping it secure, not repurposing it, not misusing it, not holding it longer than necessary and so forth, essentially along the lines of European Data Protection law – there was one thing the agreement could not provide protection from: surveillance by the US authorities.

As the CJEU put it (paragraph 94 of the ruling):

“…legislation permitting the public authorities to have access on a generalised basis to the content of electronic communications must be regarded as compromising the essence of the fundamental right to respect for private life…”

This is where the European Commission comes in. It was the Commission that made the ‘safe harbor’ decision, setting up the safe harbor system, which should, in accordance with data protection law, have ensured that data was adequately protected in the US. The Commission did not ensure that – and did not even state that it did – primarily because the state of US surveillance law (and, as far as we know, US surveillance practice) could not allow it. US surveillance law means that ‘national security, public interest, or law enforcement requirements’ override privacy and other rights where non-US citizens are concerned, and EU citizens have no form of protection against this, or legal remedies available.

The Elephant in the Room

This, it must be clear, is a fundamental issue. If the US can do this, without control or redress, then whatever systems are in place, whatever systems are brought in to replace the now invalidated ‘Safe Harbor’, will similarly breach fundamental privacy rights. No new ‘safe harbor’, no individual arrangements for particular companies, no other sidestepping plans would seem to be possible. Unless US surveillance law – and US surveillance practice – is changed, no safe harbor would seem to be possible.

The Commission, however, does not seem willing – or perhaps ready – to confront this issue. Their brief statement in response to the ruling, published yesterday afternoon, does not mention surveillance even once. That in itself is quite remarkable. The closest it gets to accepting what is, in fact, the essence of the ruling, is a tangential reference to ‘the Snowden revelations in 2013’ without mentioning anything about what those revelations related to. There is no mention of US surveillance law, of the NSA, of national security or of anything else relating to it. The surveillance elephant in the room looms over everything but the Commission seems to be pretending that it does not even exist.

The US authorities, however, are quite aware of the elephant – in a somewhat panicky press release last week, issued between the Advocate General’s opinion that presaged the CJEU ruling and the ruling itself, the ‘US Mission to the European Union’ said that the ‘United States does not and has not engaged in indiscriminate surveillance of anyone, including ordinary European citizens’. They do not, however, seem to have convinced the CJEU of this. Far from it.

Heads in the sand

In a way it should not be a surprise that the Commission seems to have its head in the sand about this issue. It is not at all easy to see a way out of this. Will the US stop or change its surveillance practices and law? It is hard to imagine that they would, particularly in response to a ruling in a European court. Can they provide convincing evidence that they are not engaging in mass, indiscriminate surveillance? Again it seems unlikely, primarily because the evidence increasingly points in precisely the opposite direction.

There are big questions about what actually constitutes ‘surveillance’ – does surveillance occur when data is ‘collected’, when it is accessed automatically or analysed algorithmically, or when human eyes are involved? The US (and UK) authorities suggest the last of these, but the European Courts (both the CJEU and the European Court of Human Rights) have found that privacy rights are engaged when data is gathered or held – and rightly so, in the view of most privacy scholars. There are many reasons for this. There is the chilling effect of the existence of the surveillance apparatus itself – the ‘panopticon’ issue: we alter our behaviour when we believe we might be being watched, not just when we are watched. There is the question of data vulnerability – if data has been gathered, then it might be hacked, lost or leaked even before it is analysed. The very existence of the Snowden leaks makes it clear that even the NSA isn’t able to guarantee its data security. Fundamentally, where data exists, it is vulnerable. There are other arguments – the power of algorithmic analysis, for example, may well mean that intrusion is more effective without human involvement in the process, and then there is the importance of meta-data and so forth – but they all point in the same direction. Data gathering, despite what the US and UK authorities might wish to say, does interfere with our privacy. That means, in the end, that fundamental rights are engaged.

What happens next?

That is the big question. The invalidation of safe harbor has huge repercussions and there will be some manic lobbying taking place behind the scenes. The Commission will have to consider the surveillance elephant in the room soon. It isn’t going away on its own.

And behind that elephant there are other elephants: if US surveillance and surveillance law are a problem, then what about UK surveillance? Is GCHQ any less intrusive than the NSA? It does not seem so – and this puts even more pressure on the current reviews of UK surveillance law taking place. If, as many predict, the forthcoming Investigatory Powers Bill will be even more intrusive and extensive than current UK surveillance laws, this will put the UK in a position that could rapidly become untenable. If the UK decides to leave the EU, will that mean that the UK is not considered a safe place for European data? Right now that seems the only logical conclusion – but the ramifications for UK businesses could be huge.

More huge elephants are also looming – the various world-wide trade agreements currently being semi-secretly negotiated, from the TPP (Trans-Pacific Partnership – between the various Pacific Rim countries including the US, Australia, NZ, Japan) to the TISA (the Trade In Services Agreement), TTIP (Transatlantic Trade and Investment Partnership – between the EU and the US) and CETA (Comprehensive Economic and Trade Agreement – between Canada and the EU) seem to involve data flows (and freedom from government interference with those data flows) that would fly directly in the face of the CJEU ruling. If data needs to be safe from surveillance, it cannot be allowed to flow freely into places where surveillance is too indiscriminate and uncontrolled. That means the US. These agreements would also seem likely to allow (or even require) various forms of surveillance to let copyright holders ensure their rights are upheld – and if surveillance for national security and public safety is an infringement of fundamental rights, so is surveillance to enforce copyright.

What happens next, therefore, is hard to foresee. What cannot be done, however, is to ignore the elephant in the room. The issue of surveillance has to be taken on. The conflict between that surveillance and fundamental human rights is not a merely semantic one, or one for lawyers and academics: it is a real one. In the words of historian and philosopher Quentin Skinner, “the current situation seems to me untenable in a democratic society.” The conflict over Safe Harbor is in many ways just a symptom of that far bigger problem. The biggest elephant of all.

Ethical policing of the internet?

The question of how to police the internet – and how the police can or should use the internet, which is a different but overlapping issue – is one that is often discussed on the internet. Yesterday afternoon, ACPO, the Association of Chief Police Officers, took what might (just might, at this stage) be a step in a positive direction towards finding a better way to do this. They asked for help – and it felt, for the most part at least, that they were asking with a genuine intent. I was one of those that they asked.

It was a very interesting gathering – a lot of academics, from many fields and most far more senior and distinguished than me – some representatives of journalism and civil society (though not enough of the latter), people from the police itself, from oversight bodies, from the internet industry and others. The official invitation had called the event a ‘Seminar to discuss possible Board of Ethics for the police use of communications data’ but in practice it covered far more than that, including the policing of social media, politics, the intelligence services, data retention and much more.

That in itself felt like a good thing – the breadth of discussion, and the openness of the people around the table really helped. The Chatham House Rule applied (so I won’t mention any names), but the discussion was robust from the start – very robust at one moment, when a couple of us caused a bit of a ruction and one even almost got ejected. That particular ruction came from a stated assumption that one of the weaknesses of ‘pressure groups’ was a lack of technical and legal knowledge – when those of us with experience of these ‘pressure groups’ (such as Privacy International, the Open Rights Group and Big Brother Watch) know that in many ways their technical knowledge is close to as good as it can be. Indeed, some of the best brains in the field on the planet work closely with those pressure groups.

That, however, was also indicative of one of the best things about the event: the people from ACPO were willing to tell us what they thought and believed, and let us challenge them on their assumptions, and tell them what we thought. And, to a great extent, we did. The idea behind all of this was to explore the possibility of establishing a kind of ‘Board of Ethics’ drawing upon academia, civil society, industry and others – and if so, what could such a board look like, what could and should it be able to do, and whether it would be a good idea to start with. This was very much early days – and personally I felt more positive after the event than I did before, mainly because I think many of the big problems with such an idea were raised, and the ACPO people did seem to understand them.

The first, and to me the most important, of those objections is to be quite clear that a board of this kind must not be just a matter of presentation. Alarm bells rang in the minds of a number of us when one of the points made by the ACPO presentation was that the police had ‘lost the narrative’ of the debate – there were slides of the media coverage, reference to the use of the term ‘snoopers’ charter’ and so forth. If the idea behind such a board is just to ‘regain the narrative’, or to provide better presentation of the existing actions of the police so as to reassure the public that everything is for the best in the best of all possible worlds, then it is not something that many of the people around the table would have wanted to be involved in. Whilst a board like this could not (and probably should not) be involved in day-to-day operational matters, it must have the ability to criticise the actions, tactics and strategies of the police, and preferably in a way that could actually change those actions, tactics and strategies. One example given was the Met Police’s now notorious gathering of communications data from journalists – if such actions had been suggested to a ‘board of ethics’, that board, if the voices around the table yesterday were anything to go by, would have said ‘don’t do it’. Saying that would have to have an effect – or if it had no effect, would have had to be made public – if the board is to be anything other than a fig leaf.

I got the impression that this was taken on board – and though there were other things that also rang alarm bells in quite a big way, including the reference on one of the slides to ‘technology driven deviance’ and the need to address it (Orwell might have rather liked that particular expression) it felt, after three hours of discussion, as though there were more possibilities to this idea than I had expected at the outset. For me, that’s a very good thing. The net must be policed – at least that’s how I feel – but getting that policing right, ensuring that it isn’t ‘over-policed’, and ensuring that the policing is by consent (which was something that all the police representatives around the table were very clear about) is vitally important. I’m currently far from sure that’s where we are – but it was good to feel that at least some really senior police officers want it to be that way.

I’m not quite clear what the next steps along this path will be – but I hope we find out soon. It is a big project, and at the very least ACPO should be applauded for taking it on.

Paris damages the case for mass surveillance…

Predictably, the horrific killings in Paris have led to a number of calls for more, and more invasive, powers of surveillance for the police and the intelligence services. This always happens after an atrocity – the horrendous murder of Lee Rigby, for example – but as then, these calls are misguided at best. In particular, what happened in Paris doesn’t make the case for mass surveillance stronger – if anything, it damages that case. A huge amount has been written about this already, and I don’t want to go over the same material yet again, but there are a few key points to bear in mind.

Firstly, that France already has extensive surveillance powers. It already has ID cards. It already has more privacy invasions than we in the UK have – and we have a huge amount. That surveillance, those privacy invasions, didn’t stop the shooting in Paris. Why, therefore, would we believe that similar powers would work better in the UK? Because our police and intelligence services are somehow ‘better’ than the French? To say that’s an unconvincing argument is to put it mildly.

Secondly, and more importantly, it looks almost certain that the perpetrators of the atrocity were already known to the police and intelligence services. They had been identified, and noted. Just as the murderers of Lee Rigby had been identified. And the men accused of the Boston bombings. The intelligence services already knew who they were – so to suggest that more dragnet-style mass surveillance would have helped prevent the atrocity would simply be wrong. Let me say it again. We knew who they were. We didn’t need big-data-style mass surveillance to find them – and that’s supposed to be the point of mass surveillance, insofar as mass surveillance has a point.

Most privacy advocates such as myself are not, despite what the supporters of mass surveillance might suggest, ‘anti-police’ or ‘anti-intelligence services’. Most that I know are very much in favour of the police. None of us like terrorism – and to someone like me, a free-speech advocate, an amateur satirist and even occasional cartoonist – this particular attack hits home very sharply. When we say we oppose mass surveillance, amongst other things it’s because we don’t think it’s likely to work – and in particular, that we think other things are likely to work better. And the evidence, such as it is, seems to support that. Police and intelligence services do not have unlimited resources – far from it in this age of austerity. If the resources – time, money, energy, intelligence – currently put into mass surveillance systems that are unproven, have huge and damaging side-effects, and are even potentially counterproductive, were, instead, devoted to a more intelligent, targeted approach, counterterrorism might even be more effective. We should be looking for new ways, not going down paths that are costly in both financial and human terms.

The fundamental problem is that terrorism, by its very nature, is hard to deal with. That’s something we have to face up to – and not try to look for silver bullets. No amount of technology, no level of surveillance, will solve that fundamental problem. We shouldn’t pretend that it can.

A new Snoopers’ charter, drip by drip?

When the Data Retention and Investigatory Powers Act was passed with undue haste this summer, the one ‘saving grace’ promised to us by the Liberal Democrats, hitherto guardians of our civil liberties and killers of the Snoopers’ Charter, was the ‘sunset clause’ of December 2016, and the promise of careful and considered review of powers before then.

That careful and considered review – or rather several careful and considered reviews – began. Specifically, Parliament’s Intelligence and Security Committee continued the review that it had begun before the hasty passing of DRIP, while the Independent Reviewer of Terrorism Legislation began his own consultation. Both of these reviews do seem to have been careful and considered – I made submissions to both of them, and was invited to a highly illuminating ‘round table’ session by the ISC, as well as receiving a fast and clear response from the Independent Reviewer of Terrorism Legislation that showed he had read and understood what I said. In both cases, the feeling I was left with was one of cautious optimism. Those of us advocating a more privacy-friendly, less invasive approach were being listened to, or so it seemed….

…but at the same time, something very different seems to have been happening. There has been a series of speeches by important people that seem to be working directly against that careful, considered approach. The incoming head of GCHQ made a speech that was remarkably aggressive – effectively calling Google and Facebook tools for terrorists. The Commissioner of the Metropolitan Police followed that with what amounted to an anti-privacy tirade, in particular condemning the use of encryption, and saying that the net had become a ‘safe-haven for terrorists and paedophiles’. Both seemed to be trying to suggest that social media was an untamed wilderness that needed to be reined in – and at the same time seemed to want to inspire fear of the ‘deep, dark web’. The Culture Secretary, Sajid Javid, followed that with a speech suggesting that Article 8 – the right to a private life – had gone too far, and again invoking the threat of terrorists and paedophiles.

…and then, yesterday, Theresa May announced more powers for the police – technical powers, crucial, she said, for the fight against terror. Technical, and yet not discussed with the Internet Service Providers Association, or, seemingly, with those few MPs (such as Julian Huppert) who actually understand the internet, at least to some degree.

…and all this, at the same time as the reviews are taking place. It has the feeling of a drip, drip, dripping, trying to build up a stronger ‘anti internet freedom’ atmosphere. ‘The internet is something to be scared of, full of paedophiles, terrorists and extremists’, so it needs to be reined in. Theresa May openly admits she wants to bring back the Snoopers’ Charter – despite its defeat the last time around – so she’s trying to lay the foundations for its return. Working on our resistance. Wearing us down. Trying, it seems, to make sure that the careful, considered review is anything but careful and considered – because the invocation of terrorists and paedophiles makes it impossible to be careful and considered. If you aren’t in favour of these obviously sensible measures, you’re on the side of the extremists, of the terrorists, of the paedophiles. Today, whilst debating the subject on Twitter, I was effectively told that I would have blood on my hands if I opposed the extension of powers.

We should do our very best to resist this. The review must be careful and considered – because there are significant and important issues at stake. Privacy matters – as do all the rights and needs that it supports, from freedom of expression to freedom of assembly and association. Civil liberties like these need privacy – because without that privacy there is a distinct and direct chill. Those of us who suggest that surveillance has gone too far can point to a number of recent revelations – that communications between journalists and their sources, between lawyers and their clients, between prisoners and their MPs, have all been compromised. This matters – and needs to be taken seriously.

On the other hand, security also matters – and none of those who I know as privacy advocates deny this, despite what some of our opponents might suggest. We know that it matters – and want to have a sensible, rational, level-headed review of the whole system. We don’t expect our privacy needs to override security – but we do expect that some kind of a balance can be found. That needs an atmosphere without the kind of hyperbole that has been produced in the last few weeks. Can we find that? It does not seem very hopeful at the moment, particularly with a general election looming and the two major parties seemingly competing to see who can be ‘tougher’. I hope we can be equally tough – but tough in terms of fighting for our rights. If we’re not, then we’ll have a new Snoopers’ Charter before we’re even aware what is happening.

 

Surveillance, power and chill…. and the Chatham House Rule

Yesterday I attended a conference at Wilton Park about privacy and security – some really stellar people from all the ‘stakeholders’, industry, government, civil society, academia and others, and from all over the world. A version of the Chatham House rule applied, making the discussion robust and open…. something to which I will return.

At one point, in a conversation over coffee, one of the other delegates asked me a direct question: had I seen any evidence of the ‘chilling effect’ of surveillance? They’d been told the previous day by someone from civil society that in the US there had been a direct chill – in particular of advocacy groups – as a result of the Snowden revelations, something that has been reported a number of times before, but that is hard to ‘prove’ in ways that seem to convince people. As I sipped my coffee I thought about it – and realised that I, personally, had seen two different but very graphic and direct examples of chill in the past few weeks, though I hadn’t thought of either of them in that kind of a direct way.

The first was the Samaritans Radar debacle. Not just theoretically but individually, I had been told by more than one person that they were keeping off Twitter for a while because they felt under observation as a result of Samaritans Radar. Their tweets could be being scanned, and by people who they didn’t trust, and who they felt could do them harm. The second was even more direct, but I can’t give details. Another person, who felt under real, direct threat – their life in danger – told me they would be keeping offline for a while.

In both cases they felt threatened – not just because they felt under surveillance, but because they felt themselves under surveillance by others who have power over them. The power, it seemed to me, was one of the keys – and one of the reasons that so many people, particularly in the UK, don’t find surveillance threatening. Where Samaritans Radar was concerned, a lot of the people affected were the sort of people who are vulnerable in various ways – partly because of their mental health issues, but more directly because they were under threat, whether from trolls and stalkers or from certain people in positions of authority. Some have very good reason to worry about how the local authorities or even mental health services might treat them. Or how their relatives might treat them. For my other friend, the threat was even more direct – and proven.

So yes, the chill of surveillance is real. And, perhaps most importantly, it’s real for precisely those people that need support in freedom of expression terms. People whose voices are heard the least often – and people who have the most need to be able to take advantage of the opportunities that our modern communications systems offer. The internet can enable a great deal, particularly for people in those kinds of positions – from freedom of expression to freedom of assembly and association and much, much more – but surveillance can not just jeopardise that but reverse it. If it only enables freedom of speech for those already with power, it exacerbates the power differences, and makes those already quiet even quieter, whilst those with power and voice can get their messages across even more powerfully.

…which brings me back to the conference, and the Chatham House Rule. Even the existence of the rule makes it clear that we understand that the chilling effect exists. If we know that for people to really speak freely, they need to know that their comments will not be attributed to them – the essence of the rule – then we must make the leap to recognise that surveillance chills. Surveillance is precisely about linking people’s communications to them as individuals – not just what they say, but what they seek out to read. At our conference, we gave ourselves – the vast majority of us people with at least some power and influence – the benefits of this. Surveillance, and mass surveillance by others with power over us – whether that means our or other governments, massive corporations (Google, Facebook etc) or others – denies that benefit to us all.