A poem for Data Protection Day

Today is Data Protection Day – or Data Privacy Day in the US. I thought I’d write a little poem to mark the occasion – so here it is:

 

Privacy’s dead, I’ve heard it said

It’s time to face the truth

It’s only fogeys who complain

Just listen to the youth

 

The young don’t care, don’t care at all

They share all day and night

And only those with things to hide

Put up some kind of fight

 

But is this true, I ask myself

Do kids not really mind?

Just talk to them, you’ll find they do

Their views are much maligned

 

But what they see as privacy

May not be what you say

For privacy’s not quite so clear

As hiding things away

 

What’s private may be bad or good

It may be big or small

It may not seem to matter much

But that’s just not your call

 

And in these days of online life

Of smartphones and the net

We pour our lives out digit’ly

In ways we might regret

 

…if data’s not protected well

And that means we need law

Law that’s written well and strong

With our rights at the core

 

Can law solve problems on its own?

Of course not, don’t be fooled

But law can play a crucial part

It can be one key tool

 

That’s why, though there are problems – huge

And many a massive flaw

I still campaign and still support

Data protection law

 

————————-

Apologies for the scansion…… and some of the rhymes!

 

The Snoopers’ Charter: Shameful Opportunism

The news that four peers are trying to bring back the Snoopers’ Charter – in its last incarnation the Communications Data Bill – is depressingly predictable, but perhaps even more shameful than other attempts at legitimising mass data gathering and surveillance. It displays shameful opportunism that seems to plumb new depths – and in a number of different ways:

1     Bringing it in based on an event

It is a bit of an axiom that reactive law – knee-jerk law – is a bad idea. Law by its nature needs to be considered carefully, not passed in the heat of the moment. The more oppressive and ill-considered pieces of ‘counter-terror’ legislation, however, all too often seem to be made in exactly this way. The USA PATRIOT Act (whose full name – the Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act – is worth a read in itself) is perhaps the best-known example, but the Data Retention Directive worked just the same way, passed in the wake of the 7/7 bombings in London and even making reference to those bombings in its preamble. That this directive was declared invalid by the Court of Justice of the European Union last year should give pause for thought. The CJEU said that the directive “entails a wide-ranging and particularly serious interference with the fundamental rights to respect for private life and to the protection of personal data, without that interference being limited to what is strictly necessary.” Authoritarian legislation, passed in haste, takes a long time to overturn. Even now, the repercussions are still being felt.

2     Bringing it in based on this particular event

Hanging legislation on a hideous event is one thing – bringing it in based on this particular event, the Charlie Hebdo shootings, is even worse, as a careful examination of that event should have revealed not that more mass data gathering and surveillance is necessary, but rather the opposite. As I have written before, the shootings in Paris damage rather than enhance the case for mass data gathering and surveillance. The perpetrators were known to the authorities – they didn’t need to be rooted out by mass surveillance. The authorities had stopped watching them six months before, because, it seems, of a lack of resources – resources that might have been available if a targeted rather than mass surveillance approach had been taken. This is part of an almost overwhelming trend – the killers of Lee Rigby and the suspects in the Boston bombings were also known to the authorities. There was no need for mass data gathering and surveillance to stop them – so to use this particular event as an excuse for bringing back the Snoopers’ Charter is particularly shameful.

3     Trying to rush the legislation through

It is almost never appropriate to rush legislation through – but sadly this is also all too familiar. Last summer, Parliament brought itself into significant disrepute by rushing through the Data Retention and Investigatory Powers Act (DRIP) in a matter of mere days, with no real time for scrutiny, no opportunity for independent expert analysis, and no real opposition from any of the main parties. This is not the way to legislate – it wasn’t right then, and it wouldn’t be right now.

4     Doing this in the midst of investigations and legal challenges

The one saving grace of DRIP was that it was intended to give breathing space – to allow proper, detailed and careful consideration of the many issues involved in surveillance. At the same time, a series of reviews of surveillance legislation is in progress – by the Intelligence and Security Committee and by the Independent Reviewer of Terrorism Legislation, to start with. Moreover, DRIP itself is subject to legal challenge. To try to pass much more comprehensive and far-reaching legislation before those reviews have been completed and their reports scrutinised, and before the legal challenges even make their way into the courtroom, is also deeply shameful – prejudging the results of those reports and, in effect, disrespecting all those involved.

5     Doing this in the face of a clear CJEU ruling

What is perhaps even worse is that, on the face of it, the planned legislation directly contradicts the CJEU ruling on data retention. The ruling was strong, clear and direct – but it does not seem, on an immediate reading of the legislation, to have been taken into account at all. Of course this may be wrong – but as the new legislation only appeared yesterday, and is planned to go before the Lords on Monday, there has not been time for proper, detailed analysis – nor has there been any kind of explanation or reconciliation presented. This again highlights the importance of taking time over legislation – and of going through proper, detailed procedures.

6     Using a highly dodgy political method

The method which has been chosen to try to introduce this law is, to put it mildly, somewhat doubtful. Rather than a full Bill, the four peers have tabled an amendment – 18 pages of additional clauses – to an existing bill, the Counter-Terrorism and Security Bill, which has already gone through most of the processes necessary before becoming law. It’s like slipping in an entirely new law just before the first law is passed – it makes a mockery of parliamentary process, and in effect disrespects the whole of Parliament. Describing it as trying to sneak in the Snoopers’ Charter by the back door may even be too kind.

7     Ignoring the committee

The original Communications Data Bill was subject to analysis by a full parliamentary committee – and that committee came out with a highly critical report, a report which ultimately led to the abandonment of the Bill. By trying to bring it back now, seemingly virtually unchanged, the peers proposing the amendment are ignoring the committee and its findings – and, as a consequence, ignoring the whole process of parliamentary scrutiny.

8     Doing it at this time, in the run-up to the election

To try to push through legislation like this in the run-up to the election is in itself a highly dubious tactic. Politicians have their minds on other things – and many of them may care much more about being re-elected than about whether the details of the legislation to be passed are a good idea or not. What matters is whether they ‘look’ good, and whether that makes them more electable. Right now, in the light of the anger and fear resulting from the Charlie Hebdo shootings, opposing something that might make people safer will be difficult – and may hinder the electoral prospects of MPs. This kind of thing has happened before – the way that the Digital Economy Act was passed in 2010 springs to mind – and it again makes the timing of bringing forward the amendment feel very wrong.

Why are they doing it this way?

The whole process – all these layers of opportunism – should make the alarm bells ring. This is a hugely significant piece of law – not just in terms of what it does but in terms of what it signifies, in terms of what kind of society we want to be living in, what kind of an internet we want to have. If we are going to make decisions like this, we should make them in careful, considered ways, weighing the evidence and seeking expert opinion. That’s the idea behind the parliamentary committee system, and the time it takes to bring laws in through normal procedures.

Why, then, are these procedures being avoided, and why are these underhand methods being used? It is hard to escape the conclusion that it is because those pushing it are afraid that if it is given the appropriate amount of time, of attention, and of scrutiny, then it will once again be defeated, as it was the last time around. In the cold light of day, do we want to live in such a surveillance society? I’m not sure – but I do think that trying to make those decisions in this way, in the heat of the moment and without the opportunity to give proper thought and proper scrutiny, is a disastrous way to proceed. Those behind it should be ashamed.

Facebook And Twitter – Handling Extremism And Disorder

After extensive consultation, FAT-HEAD has been amended to take into account its lack of clarity over costs (see 8) and the unfortunate limitation of extent (see 9).


 

Facebook And Twitter – Handling Extremism And Disorder Bill (‘FAT-HEAD’)

Contents:

  1. When this Act applies
  2. Facebook and Twitter
  3. Social and Moral Responsibility
  4. Code of conduct
  5. Extremism
  6. Disorder
  7. Acceptance of blame
  8. Costs
  9. Extent, commencement and short title

A

Bill

to

Make provision as to matters concerning the social and moral responsibility of Facebook and Twitter, and to ensure proper cooperation with the authorities in relation to morality, extremism and disorder.

BE IT ENACTED by the Queen’s most Excellent Majesty, by and with the advice and consent of the Lords Spiritual and Temporal, and Commons, in this present Parliament assembled, and by the authority of the same, as follows:—

1. When this Act applies

This Act applies whenever an event of such significance, as determined by the Secretary of State, requires it to. Events include but are not restricted to acts of extremism, of disorder and of embarrassment to the Secretary of State, the government, the intelligence and security services and the police, or any other event deemed appropriate by the Secretary of State.

2. Facebook and Twitter

The powers conferred through this Act apply to Facebook, Twitter and any other online services, systems, or their equivalents, successors or alternatives (‘the services’) as determined by the Secretary of State.

3. Social and moral responsibility

The services shall recognise that they have a social and moral responsibility above and beyond anything hitherto required by the law. The requirements that constitute this social and moral responsibility shall be determined by the Secretary of State, in consultation with the editors of the Sun and the Daily Mail.

4. Code of Conduct

The Secretary of State shall prepare a Code of Conduct to cover the actions of the services, in accordance with the social and moral responsibility as set out in section 3. This Code of Conduct shall cover extremism, disorder, obscenity, dissent and other factors as determined by the Secretary of State.

5. Extremism

i)  The services shall monitor the activities of all those who use their services for evidence of extremism, including but not limited to reading all their posts, messages and other communications, analysing all photographs, monitoring all location information, all music listened to and all areas of the internet linked to.

ii)  The services shall provide real-time access to all of their servers and all user information to the security services, the police and any others authorised by the Secretary of State, including the provision of tools to enable that access.

iii)  The services shall prepare reports on all their users’ activities, including but not limited to those relating to extremism, and including contact information, personal details, locations visited and any other information that may be determined from such information.

iv)  The services shall provide these reports to the security services, the police and any others authorised by the Secretary of State.

v) The services shall delete the accounts of any user upon the request of the security services, the police or any others authorised by the Secretary of State.

vi)  The services may not report that they have provided the access or these reports to anyone without the express permission of the Secretary of State.

6. Disorder

At a time of disorder, as determined by the Secretary of State, the security services or a police officer, the services shall provide the following:

i) Immediate access to location data of all users.

ii) Immediate access to all communications data of all users

iii) Detailed information on all accounts that have any relationship to the disorder

iv) Deletion of accounts of any users deemed to be involved, or likely to be involved, in disorder.

v) Upon order by the Secretary of State, the security services or a police officer, the services shall block all access to their services in an area to be determined by the Secretary of State.

7. Acceptance of Blame

The services shall recognise that their social and moral responsibility includes the requirement to accept the blame for the existence, escalation or consequences of any extremism or disorder. This acceptance of blame must be acknowledged in writing and in the broadcast media, ensuring that the government, the security services and the police are not held responsible for their own roles in such extremism or disorder or their consequences.

8. Costs

All costs of developing, implementing, monitoring, updating and supporting the systems required for the services to comply with the Facebook And Twitter – Handling Extremism And Disorder Act 2014 shall be borne by the services.

9. Extent, commencement and short title

i) This Act extends to England, Wales, and anywhere else on the entire planet, and in addition to inner and outer space, the moon, any planets, comets and other bodies as deemed appropriate by the Secretary of State.

ii) This Act comes into force on the day on which this Act is passed.

iii) This Act may be cited as the Facebook And Twitter – Handling Extremism And Disorder Act 2014.


 

GCHQ: I’m not charmed yet….

A little over a week ago, GCHQ gave us a show. A giant poppy, part of the 2014 Armistice Day appeal. It was spectacular – and, for me at least, more than a little creepy.

[Image: GCHQ poppy]

The poppy display seems to have been part of something bigger: the term that immediately sprang to mind was ‘charm offensive’. GCHQ has, over the last year or so, been trying to charm us into seeing them as purely positive, despite the revelations of Edward Snowden. They’re trying to appear less secretive, more something to be admired and supported than something to be concerned about and made accountable. The poppy was an open symbol of that. Look at us, GCHQ seemed to be saying, we’re patriotic, positive, part of what makes this country great. Support us, don’t be worried about it. Love us.

I assume that the speech by Robert Hannigan, the new Director of GCHQ, was intended to be part of that charm offensive. For me, however, it had precisely the opposite effect. The full speech was published in the FT here – but I wanted to pick out a few points.

Privacy an absolute right?

The first, which made the headlines in the Guardian and elsewhere, is Hannigan’s statement that ‘privacy is not an absolute right’. He’s right – but we all know that, even the staunchest of privacy advocates. Privacy is a right held in balance with other rights and needs – with freedom of expression, for example, when looking at press intrusions, or with the duty of governments to provide security, and so forth. That’s explicitly recognised in all the relevant human rights documents – in Article 8 of the European Convention on Human Rights, for example, it says of the right to a private life that:

“There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others”

So we already know that privacy is not an absolute right – why, then, is Hannigan making the point? It’s hard to see this as anything but disingenuous – almost as though he wants to imply that foolish privacy advocates want to help terrorists by demanding absolute privacy. We don’t. Absolutely we don’t. What we want is an appropriate balance: for the interference with our privacy to be lawful, proportionate and accountable. At the moment, it’s not at all clear that any of that is true – there are legal challenges to the surveillance, deep doubts as to its proportionality, and little evidence that those undertaking the surveillance are properly accountable. On the accountability front, it’s interesting that he should make such a speech at a time when the Intelligence and Security Committee of Parliament is undertaking a consultation – it made me wonder whether he’s trying to steer the committee in a particular direction.

Facebook – a tool for terrorists?

The other headline from the speech is the way Hannigan seems to be attacking Facebook and others for being too helpful to terrorists – an interesting reversal of the more commonly held view that they’re too helpful to the authorities. The argument seems to be that the ‘old’ forms of terrorism, exemplified by Al Qaeda, use the ‘dark web’, while the ‘new’ forms, exemplified by IS, use social media – Facebook, Twitter and so forth. It’s an interesting point – and I’m sure there’s something in it. There’s no doubt that ‘bad guys’ do use what’s loosely called the dark web – and the social media activities of ‘bad guys’ all around the world are out there for all to see. Indeed, that visibility is precisely the point. However, on the face of it, neither of those ‘facts’ supports the need for the authorities to have better, more direct access to Facebook and so forth. Neither, on the face of it, is any justification for the kinds of mass data gathering and surveillance that seem to be going on – and that GCHQ and others seem to be asking us to approve.

By its very nature, the ‘dark web’ is not susceptible to mass surveillance and data gathering – so it requires a more intelligent, targeted approach, something to which privacy advocates have no objection. Social media – and Facebook in particular – don’t need mass surveillance either. To a great extent, Facebook is mass surveillance. All that information is out there – that’s the point. It’s available for analysis, for aggregation, for pretty much whatever the authorities want to do with it. And if Hannigan imagines that the secret activities of IS and others are undertaken on Facebook, he’s more naive than I could imagine anyone in the intelligence services being – they won’t have chosen to use Facebook and Twitter instead of the dark web, but in addition to it. The secret stuff is still secret. The stuff on Facebook and Twitter is out there for all to see.

What’s more, there are already legal ways to access those bits of Facebook and Twitter that are not public – which is why the authorities already request that data on a massive scale.

Charming – or disarming?

Hannigan must know all of this – so why is he saying it? Does he think that the charm offensive has already worked, and that the giant GCHQ poppy has convinced us all that they’re wonderful, patriotic and entirely trustworthy? They may well be – I’m no conspiracy theorist, and suspect that they’re acting in good faith. That, however, is not the point. Trust isn’t enough here. We need accountability, we need transparency, we need honesty. Checks and balances. Not just charm.

In praise of pseudonyms…

A remarkably inappropriately titled article appeared in the Telegraph this morning.

“Facebook will soon let you post using someone else’s name”

The article itself, however, said something quite different: that ‘Facebook is reportedly working on a mobile app that will let its users interact without using their real name’. If true, this could be important – and a very positive move. Facebook have long been the champions of ‘real names’ policies: for them to recognise that there are important benefits that arise from the use of pseudonymity and sometimes anonymity is a big development – because there are benefits, and pseudonymity is one of the keys to real freedom of speech and autonomy, both online and in the ‘real’ world.

Firstly, to dispose of the Telegraph’s appalling headline: a pseudonym is very rarely ‘someone else’s name’. There are cases where people try to impersonate others, but these are a tiny fraction of the times that people use pseudonyms. Pseudonyms have been used for a very long time, and for very good reasons. Many people are better known by their pseudonyms than by their ‘real’ names – and they certainly didn’t ‘steal’ them. Did Eric Blair steal the name George Orwell? Did Mary Ann Evans steal the name ‘George Eliot’? Did Gideon Osborne steal the name George? And looking at the first two of those names, did ‘Orwell’ and ‘Eliot’ ‘belong’ to someone else? Of course not. As for the third: should we insist on calling Osborne Gideon, because that was the name his parents gave him? I’m politically opposed to him in every way – but I’d defend his right to call himself George, and defend it to the hilt. Pseudonyms often belong to the people using them every bit as much as their ‘real’ names. In some ways they’re even more representative of the person: when choosing a pseudonym, people often put a lot of thought into the process, choosing something that represents them in some way, or represents some aspect of themselves.

Sometimes it’s about presentation – and sometimes it’s to protect your ‘real’ identity in an entirely reasonable way. It’s not that you have something to hide – but that your autonomy is better served by the ability to separate your life in some ways. Without that ability, your freedom of expression is chilled. As I’ve written before, there are many kinds of people for whom pseudonymity is crucial: whistle-blowers, people whose positions of responsibility make open speech difficult, people with problematic pasts, people with enemies, people in vulnerable positions, people living under oppressive regimes, young people, people with names that identify their ethnicity or religion, women (at times), victims of spousal abuse and others. It’s also something that helps people to let off steam, to explore different aspects of their lives – or simply to enjoy themselves.

I use my real name most of the time online – amongst other things because my ‘online presence’ is part of my job, and because I make professional links and connections here – but I’m in a privileged position, without any of the obvious vulnerabilities. I’m a white, middle-class, middle-aged, educated, employed, able-bodied, heterosexual, married man. It’s easier for me to function online with my real name – but even I don’t always do so. Over the last decade or so I’ve used a number of pseudonyms, and still use one now. For many years my main online presence was as ‘SpiritualWolf’, prowling the football message boards: I’m a Wolves fan. I didn’t particularly want to connect what I was doing on the football boards with my work life or even my home life – and wanted my football postings to be judged for their content, not on the basis of who I might be. Online life works like that. I created ‘SpiritualWolf’ – but I also was SpiritualWolf. It wasn’t someone else’s name – it was my name.

Even now I use a pseudonym – KipperNick – when I play at being the BBC’s Nick Robinson, in his role as cheerleader for UKIP, a role which, sadly, he often plays better than me. It’s a very different kind of identity – a clearly marked parody account – but it allows me a certain kind of freedom, and lets me have some fun. I don’t use it maliciously – at least I try not to….

…and that, in the end, is the rub. It’s not the pseudonymity that’s the problem when we’re looking at malicious communications, for example: it’s the malice. By attacking the pseudonyms we’re not just missing the target – we’re potentially shutting off a great deal of freedom, chilling speech and controlling people when that control is really unnecessary. I’m delighted that Facebook has begun to realise this – though I’ll believe it when I see it.

 

Thanks to the many people who replied to my initial tweet about this earlier today – I’ve shamelessly used your examples in the blog post!

The Ballad of Google Spain

For National Poetry Day, with apologies to anyone with any sense of poetry….

 

There was a case, called ‘Google Spain’

That caused us all no end of pain

Do we have a right to be forgotten?

Are Google’s profits a touch ill-gotten?

 

From over the pond came shouts of ‘Free Speech!’

So loud and so shrill they were almost a screech

From the ECJ came a bit of a gloat

‘We’ve got that Google by the throat!’

 

Said Google “If it’s games you play”

“We’ll do that too, all night and day”

So they blocked and blocked, and told the press

“It’s that evil court, we’re so distressed”

 

“Such censorship,” they cried and cried

Though ’twas themselves who did the deeds

They didn’t need to block the links

They were just engaging in hijinks

 

And many stood beside them proudly

Shouting ‘freedom’, oh so loudly

‘Google is our free-speech hero!’

‘We’ll fight with them, let’s be clear-oh!’

 

Others watched and raised their eyebrows

Listening wryly to these vows

And thought ‘is Google really pure?’

‘From what we’ve seen, we’re far less sure.’

 

For Google blocks all kinds of sites

‘Specially for those with copyright

And, you know, this isn’t funny,

When blocking things will make them money

 

This isn’t just about free speech

No matter how much Google preach

What matters here is really power

Is this truly Google’s hour?

 

Does Google have complete control

Or do the law courts have a role?

Time will tell – but on the way

Our privacy will have to pay…

Censorship and surveillance…

Today’s ‘Internet Injunctions’ case in the High Court (Cartier vs BSkyB) highlights one of the inherent problems with the kind of ‘porn-blocking’ censorship system that the current government has effectively forced ISPs to comply with: when you build a censorship system for one purpose, you can be pretty certain that it will be used for other purposes. As David Allen Green, who tweets as @JackofKent, described it today:

[Screenshots of four tweets by David Allen Green (@JackofKent), 25 September 2014]

I’ve argued this before – it’s question five in my ‘10 Questions about Cameron’s ‘new’ porn-blocking’ – but here it is in action, being argued in court. It was inevitable that it would be. Though people tend to deny it, ‘function creep’ or ‘mission creep’ is a reality, not a dream of the paranoid tin-foil hat brigade.

It’s not an argument restricted to censorship systems – the same applies to surveillance, and it should remind us of the links between the two, and of the need to oppose both. Just as advocates of censorship start with child-abuse imagery and then move on through ‘ordinary’ porn to other kinds of ‘offensive’ material, and then to copyright infringement, advocates of surveillance start with catching terrorists and paedophiles, then move on through catching more ‘ordinary’ criminals to finding people who are ‘offensive’ in some other way, and then to those suspected (and it is generally based on suspicion, not proof) of infringing copyright. And from there, who knows where?

The links between surveillance and censorship are strong and multifaceted – though the motivation, in the end, is the same: control over people and restriction of freedom. Surveillance can be used to support censorship – watch everyone to see where they’re going, what they’re watching and reading, and who they’re meeting, so that you can shut down their websites, close their meetings, track down the people they’re listening to, and so forth. Censorship can be used to support surveillance – particularly with things like the current ‘opt-out’ internet filters, where if you opt out of censorship, that automatically makes you suspicious, and a target for surveillance. Anyone using a pseudonym, or trying to be anonymous, is already marked down as suspicious – anyone using Tor or an equivalent, for example.

This is one of the many reasons we should reject both censorship and surveillance. We should understand that the two are linked – and that there are slippery slopes associated with both. And they really are slippery, as today’s case in the High Court should help us to see.

For more details of the case, see David Allen Green’s piece for the Open Rights Group here, and the Open Rights Group press release here.