It’s not just the porn that will be over-blocked….

Newsnight last night included a feature on how the recently introduced internet ‘porn-filters’ were actually blocking more than just porn. Specifically, they noted that sex-education websites, sexual health websites and so forth were being blocked by the filters. This comes as no surprise to anyone working in the area – indeed, my own blog post asking questions about porn-filters was itself blocked – but it is still good to see that the mainstream media is now taking it on board, albeit very late in the day.

It wasn’t a bad feature, but it only began to scratch the surface of the issue, leaving a lot of questions unanswered and a lot of crucial issues untouched. The first of these was the suggestion, insufficiently challenged, that this over-blocking was just some sort of ‘teething trouble’: once the systems get properly up and running, the problems will be ironed out and everything will work perfectly. As anyone who understands these systems will tell you, this is far from the case: over-blocking is an inherent problem, one that will never go away. Because the filters work essentially algorithmically, they will always (and automatically) pick out sites that deal with the same subject matter as the ones you are trying to block. If you want to block sites that deal with sex, you will block sites that deal with sex education, with sexuality and so forth. You are also almost certain to block sites connected with LGBT issues, leaving a lot of young and vulnerable people without access to key information. Here, as in so many cases, ignorance is not bliss. Far from it. You can clean things up bit by bit in some cases, as site owners complain – but this is a slow and painstaking process, and in any case it only works when site owners discover that they have been blocked, which is far from certain to occur. Very often, filtering/censorship will happen without anyone even noticing.
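The over-blocking problem can be sketched in a few lines of code. This is a toy illustration, not any real filter’s algorithm (the keyword list and matching rule below are invented for the example): a naive keyword-based blocker simply cannot tell a porn site from a page ‘about’ sex and sexual health.

```python
# Toy sketch of keyword-based filtering (hypothetical keyword list):
# any page mentioning a flagged word is blocked, regardless of context.

BLOCKED_KEYWORDS = {"sex", "porn", "pornography"}

def is_blocked(page_text: str) -> bool:
    """Block any page that contains a flagged keyword."""
    words = {w.strip(".,!?").lower() for w in page_text.split()}
    return not BLOCKED_KEYWORDS.isdisjoint(words)

# A sex-education page is blocked exactly as a porn site would be:
sex_ed_page = "Advice on safe sex and sexual health for teenagers"
print(is_blocked(sex_ed_page))  # True: over-blocked
```

Real filters are more sophisticated than this, but the underlying limitation is the same: they classify by subject matter, not by intent, so pages that discuss a blocked topic get swept up with the pages actually targeted.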

The second key absence from the Newsnight feature was the fact that these filters are not planned just to filter out pornography. They are planned to deal with a whole range of other material, from ‘extremism’ and ‘esoterica’ to gambling, violence and so forth. Quite apart from the immense difficulty of defining things such as extremism and esoterica – let alone whether it is appropriate to block such sites in the first place – it needs to be remembered that the over-blocking issue with porn sites will apply equally to these other categories. Blocking ‘extremist’ sites will end up blocking sites that discuss extremism, for example – and sites that might help people to find their way out of extremist groups. This won’t just fail to stop the growth in extremism – it will hinder attempts to prevent that growth. It won’t just fail to be effective – it will be actively counterproductive.

This is not an accident. Censorship in general does not work – and students of its history should be aware of this. Though freedom of speech should not be viewed as an absolute, it should be viewed as an ideal, and curtailing it without sufficient reason should only be done with great care and with great understanding. The blunt instrument of internet filtering has neither the care nor the understanding. It will do far more harm than good.

Porn filters, politics and liberalism….

This afternoon the Lib Dem conference rejected a call for ‘default-on’ porn filters – an idea being pushed strongly by David Cameron – and rejected it decisively. A great deal has been written about this before – including by me (e.g. ’10 questions for David Cameron’ and the ‘porn-filter Venn diagram’) – so I won’t rehash all of the old discussions, except to say that it seems to me that the decision to reject the plan is a victory for intelligence, understanding and liberalism. The plan would not do what it is intended to do, it would produce deeply counterproductive censorship, and it could both encourage complacency and discourage the crucial kind of engagement between parents and children that could really help in this area.

A revealing debate….

What does interest me, however, is the nature of the discussions that happened at the conference. The divide between those pushing the motion and those opposing it was very stark: it was primarily a divide of understanding, but it was also one of liberalism. The difference in understanding was the most direct: those in favour of the filters seemed to have very little comprehension of how the internet works – or even of how young people think and behave. In technical terms, their arguments were close to meaningless and the solutions they proposed were unworkable – while in terms of understanding the needs of young people, they seemed to be even further from the mark. The contributors on the other side, however, were remarkably good. I’m not a Lib Dem, but I couldn’t help but be very impressed by many of them. Three stood out: Julian Huppert, who is my MP and may well be the MP who understands the internet best, Sarah Brown (@auntysarah on Twitter, a councillor and noted LGBT activist), and Jezz Palmer (@LoyaulteMeLie on Twitter).

Jezz Palmer was particularly impressive, setting out exactly why this kind of thing would be disastrous for kids.

“Implementing these filters punishes children who are otherwise completely alone,” she said. “I know there are still kids growing up who feel how I did and there will be for generations to come. Don’t take away their only research, don’t leave them alone in the dark.”

Ultimately that’s the point. This kind of filter over-blocks – indeed, the blog post I first wrote on the subject a few months ago was itself blocked by a porn filter – and over-blocks precisely the sort of material that young people need to be able to find if they are to grow up in the kind of healthy and positive way that anyone with any sense of liberalism should promote. They need to explore, to learn, to find ways to understand – much more than they need to be controlled and nannied, or even ‘protected’.

The role of liberalism…

Is it a coincidence that those who understood the issues – both the technical issues and those concerning young people – were also those with the most liberal ideas about filtering and censorship? I don’t know, but I suspect that often there is a connection. The people I know who work on the internet as it relates to privacy and free expression come from a wide variety of political backgrounds – from the extremes of both right and left – but the one thing they tend to have in common is a sense of liberalism (with a very small ‘l’), in that they believe in individual freedom and individual rights. An understanding of, or belief in, the importance of autonomy at a personal level doesn’t fit neatly on the ‘left-right’ spectrum…

Whether the decision of the Lib Dem conference really matters has yet to be seen. The Lib Dem conference often makes fairly radical decisions – supporting the legalisation of cannabis is one of the more notable ones – but its leadership (and Nick Clegg in particular) doesn’t always follow them. They (and he) have, however, taken the advice of Julian Huppert seriously, particularly in the crucial decision not to back the Communications Data Bill (the Snoopers’ Charter). I hope they listen to Dr Huppert again, and come out as firmly against these filters as their conference has – because it could be a significant and positive move politically.

How will the other parties react?

The other parties will have been watching the debate – and both the Tories and the Labour Party are in many ways deeply untrustworthy where the internet is concerned, leaning in distinctly authoritarian directions in relation to surveillance, hard-line enforcement of copyright and the idea of censorship. I hope they noticed the key aspect of today’s debate: that the people who knew and understood the internet were all firmly against filters. I suspect that fear of the likely headlines in the tabloids will stop them being as strongly against porn-filters as the Lib Dem conference has shown itself to be, but they might at least decide not to push so strongly in favour of those filters, and perhaps let the idea melt away like so many other unworkable political ideas. I hope so – but I will be watching the other conferences very closely to find out.

Syria and the myth of simple solutions

Last night’s parliamentary defeat for the government over the proposed military intervention in Syria was both dramatic and potentially hugely significant. There will be (indeed there has already been) a great deal written about it by many, many people – from the impact it might have on the people of Syria to the possibly massive political ramifications in the UK. I don’t propose to add to those. I was hugely surprised by the events of last night: right until the moment that the result of the vote was announced, I fully expected the government to win and military action to be given the green light. That shows how much I know… and means I’ll await events rather than pretend that I know what’s going to happen next.

There is, however, one aspect that I want to say a few words about – one aspect of the debate in the House of Commons that really impressed me. That’s the way that a large number of MPs, on all sides of the House, were unwilling to accept the idea that there was a simple solution to a highly complex problem: that there was one ‘obvious’ way to deal with it. It is a highly seductive way to look at things – but it very rarely turns out to be true. The simple solution suggested here – effectively ‘surgical strikes’, limited in scope and in impact – didn’t convince me, and didn’t convince the MPs. Speech after speech asked questions to which no answers seemed to be forthcoming: most directly, ‘how do you know this will actually work?’

There’s no question that the Assad regime is nightmarish. The atrocities that we’ve seen the results of should sicken everyone. They certainly sicken me – and I would love to be able to find a way to stop them. That, however, does not mean that I would do ‘anything’ that is suggested – however bad things are, they can get worse, and they very often do when a seemingly simple solution is suggested.

From a political perspective simple solutions are very attractive – they can be ‘sold’ to the public, they can make good headlines in the newspapers – but when you look closer, they rarely provide the answers that are needed. In my specialist field, we’ve seen this again and again in recent months. The idea that you can ‘solve’ the complex challenge of pornography on the internet with a ‘simple’ filter system (about which I’ve blogged many times, starting with these 10 questions) is attractive but pretty much unworkable and has highly damaging side effects. The ideas that you can ‘solve’ the problem of abusive tweeting with a ‘simple’ report abuse button (see here) or deal with trolling comments on blogs with a ‘simple’ real names policy (see here) are similarly seductive – and similarly counterproductive. These are complex problems – and in reality they need more complex, more nuanced, more thoughtful solutions.

The MPs in last night’s debate seemed to understand that – and not to like it. They were being asked to support something essentially on trust – with minimal legal evidence and even flimsier intelligence support – and they wanted to know more. Enough of them could see that the ‘simple’ solution being suggested was not as simple as it seemed – and wanted to know more, to think more, and to be given more time. Most of all, they wanted more evidence that it would work – that it would make the situation better for the people of Syria. Personally, I don’t think we know that it would. It might – but it might well not.

I was challenged last night on Twitter by a few people to offer an alternative – and I couldn’t provide one, certainly not in 140 characters. I want to see more diplomacy, I want to see more imagination, I want to see more humanitarian support, I want to see more engagement with the people not only of Syria but Iran – but do I know that this will work? Of course I don’t. I can’t offer a simple solution. I highly suspect there isn’t one. This will be messy, this will be bloody, this will be hideous – it already is. ‘Standing by and doing nothing’ is horrible – but military intervention is horrible too. There’s no easy way out – and we should stop pretending, particularly to ourselves, that there is.

My porn-blocking blog post got porn-blocked!

Just to make the point about porn-blocking filters even more concrete, I’ve discovered that my blog post on porn-blocking has been automatically blocked by Strathmore University’s system (thanks to @LucyPurdon for pointing it out). Strathmore University is in Kenya, and I don’t know much about it, but the implication of the message is clear: the blog post was blocked because the system saw too many mentions of the word pornography – I’m still not clear about the proxies issue, though.

What does all this imply? Well, it shows the limitations of an automated system: analysing my blog post would indeed find that I mention the word ‘pornography’ rather a lot – appropriately, as I’m discussing how we deal with pornography on the net – but that certainly doesn’t make the post pornographic. Any automated system will have that kind of limitation… and will therefore block a whole swathe of material that is educational, informative and directly relevant to important issues. Automatically block things this way and you will drastically reduce access to information about crucial subjects – sex is just one of them. Cutting down access to information, quite apart from all the freedom of speech issues, will leave kids less well informed, and less able to deal with these issues. Education is the key – and filters will and do(!) reduce that.
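What a frequency-based block like Strathmore’s looks like can be sketched as follows. This is a hypothetical reconstruction – the real system’s word list and threshold are unknown, and the values below are invented – but it shows why a post that merely discusses pornography trips such a filter.

```python
# Hypothetical word-frequency filter: block a page once a flagged
# word appears "too often". THRESHOLD and the flagged word are
# made-up values for illustration, not the real system's settings.

import re

THRESHOLD = 5  # assumed cut-off: block after this many mentions

def blocked_for_word(page_text: str, word: str = "pornography") -> bool:
    """Count whole-word mentions (case-insensitive) and compare
    against the threshold. `word` must be plain text, not a regex."""
    mentions = len(re.findall(rf"\b{re.escape(word)}\b",
                              page_text, re.IGNORECASE))
    return mentions >= THRESHOLD

# A post *discussing* pornography policy gets blocked as if it were porn:
post = "pornography " * 10 + "is a subject that needs open debate"
print(blocked_for_word(post))  # True
```

The filter counts words; it has no notion of whether the page is pornographic or a critique of pornography policy – which is exactly the failure mode the blocked blog post demonstrates.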

One key thing to note: the Strathmore University system is at least transparent – it tells you why a site is blocked, which might at least give you some way to get around it. Many systems (for example the way that many ISPs implement the IWF’s blacklist) are not transparent: you don’t know why you can’t get access to a site, either getting a ‘site not found’ message or even nothing. With those systems, there’s even more of a problem – and I have a feeling that those are the systems that David Cameron is likely to push….

Porn-blocking filters not only don’t work in their own terms, they’re actually damaging!

There’s no ‘silver bullet’ for porn….

I was lucky enough to be on Radio 4’s ‘The Media Show’ yesterday, to talk about Cameron’s porn-blocking plans: I think I was invited as a result of my blog post from Monday, asking 10 questions about the plan. I didn’t have much time – and I’m still very much an amateur on the radio – and though I think I managed to get across some of what I wanted to say, I didn’t get close to persuading the other person talking about the subject – Eleanor Mills, of the Sunday Times. I think I know why: she’s in many ways correctly identified the monster that she wants to slay, and she thinks that she’s found the silver bullet. The problem is, for porn, there IS no silver bullet. It’s not that simple.

The solution that she suggested – and she said that ‘the man from Google’ told her it was possible – was a simple ‘switch’ to turn a ‘porn filter’ on or off. If you wanted to see ‘restricted’ material for some justified reason (e.g. to look at material for sex education purposes) you could turn it on, and you’d be asked a question in a pop-up, something like ‘Do you want to look at this for research purposes?’. You’d click OK, look at the stuff, then turn the filter back on. Simple. Why not do it?

It doesn’t really take a technical expert to see the flaws in that plan, even if it were possible to create such a switch – how it wouldn’t stop viewing for bad reasons (who’s going to be honest when asked why they want it?), how it avoids the fundamental question of how you define ‘porn’, and all the other crucial issues that I mentioned in my other blog. That’s not to mention the technical difficulties, the problems of over-censorship and under-censorship, and the way that the really bad stuff will avoid the filters anyway – let alone the even more fundamental issues of free speech and the need to be able to access information free of fetters or limitations… There are so many flaws in the plan that it’s hard to know where to start – but it’s easy to see the attraction of the solution.

We all want to find easy solutions – and computerised, technical solutions often promise those kinds of easy solutions. Porn, however, is not amenable to easy solutions. It’s a complex subject – and sadly for those looking for silver bullets, it needs complex, multifaceted solutions that take time, effort and attention.

We do, however, know what a lot of those solutions are – but they’re not really politically acceptable at the moment, it seems. We know, for example, that really good sex and relationships education helps – but the government recently voted down a bill that would have made that kind of education compulsory in schools. The ‘traditional’ education favoured by Michael Gove and the Daily Mail has no truck with new-fangled trendy things like that, and the puritanical religious approach still claims, despite all the evidence, that ignorance of sexual matters is bliss. It isn’t. Better education is the key starting point to helping kids find their way with sex and relationships – and to countering the ‘poisonous’ influence of ‘bad’ porn (which, it must be remembered, is generally NOT illegal), the kind of thing that Eleanor Mills justifiably wants to deal with. If she really wants to help, she should be fighting the government on that, not pushing technical, magical solutions that really won’t work.

The next stage is putting more resources – and yes, that means money – into the solutions that we know work well: the IWF in dealing with child abuse images, CEOP in dealing with sex offenders’ online activities. Work at a targeted, intelligent level. The experts know it works – but it’s hard work, it’s not headline-grabbing, and it’s not ‘instant’. What’s more, it’s not cheap.

The other part of the jigsaw, for me, is to start having a more intelligent, more mature and more honest debate about this. If the politicians didn’t go for soundbite solutions without talking to experts, but actually listened to what people said, this might be possible. Sadly, with the current lot of politicians on pretty much every side, that seems impossible. This isn’t a party-political issue: Labour are every bit as bad as the Tories on this, with Helen Goodman a notable offender. It’s an issue of politicians being unwilling to admit they don’t understand, and unwilling to take advice that doesn’t fit with their ‘world view’. It’s an issue of the corrosive influence of hypocritical and puritanical newspapers like the Daily Mail, on the one hand calling for internet porn bans and on the other parading their ‘sidebar of shame’, complete with images and stories that objectify women and girls to an extreme.

The one saving grace here is that the solution they suggest simply won’t work – and eventually they’ll realise that. In Australia, a similarly facile solution was tried, only to be ignominiously abandoned a few years later. If only that lesson was the one from Australia that Lynton Crosby managed to get across to David Cameron….

10 questions about Cameron’s ‘new’ porn-blocking

There’s been a bit of a media onslaught from David Cameron about his ‘war on porn’ over the weekend. Some of the messages given out have been very welcome – but some are contradictory and others make very little sense when examined closely. The latest pronouncement, as presented to/by the BBC, says

“Online pornography to be blocked automatically, PM announces”

The overall thrust seems to be that, as Cameron is going to put it in a speech:

“Every household in the UK is to have pornography blocked by their internet provider unless they choose to receive it.”

So is this the ‘opt-in to porn’ idea that the government has been pushing for the last couple of years? The BBC page seems to suggest so. It suggests that all new customers to ISPs will have their ‘porn-filters’ turned on by default, so will have to actively choose to turn them off – and that ‘millions of existing computer users will be contacted by their internet providers and told they must decide whether to activate filters’.

Some of this is welcome – the statement about making it a criminal offence to possess images depicting rape sounds like a good idea on the face of it, for example, for such material is deeply offensive – though quite where it would leave anyone who owns a DVD of The Accused, which depicts the rape of Jodie Foster’s character, isn’t at all clear. Indeed, that is the first of my ten questions for David Cameron.

1     Who will decide what counts as ‘pornography’, and how?

And not just pornography, but images depicting rape? Will this be done automatically, or will there be some kind of ‘porn board’ of people who will scour the internet for images and decide what is ‘OK’ and what isn’t? Automatic systems already exist to do this for child abuse images, and by most accounts they work reasonably well, but they haven’t eradicated the problem of child abuse images. Far from it. If it’s going to be a ‘human’ system – perhaps an extension of the Child Exploitation and Online Protection Centre (CEOP) – how are you planning to fund it, and do you have any idea how much this is going to cost?

2     Do you understand and acknowledge the difference between pornography, child abuse images and images depicting rape? 

One of the greatest sources of confusion over the various messages given out over the weekend has been the mismatch between headlines, sound bites, and actual proposals (such as they exist) over what you’re actually talking about. Child abuse images are already illegal pretty much everywhere on the planet – and are hunted down and policed as such. As Google’s spokespeople say, Google already has a zero-tolerance policy for those images, and has had for a while. Images depicting rape are another category, and the idea of making it illegal to possess them would be a significant step – but what about ‘pornography’? Pornography is generally legal, it comes in many forms, and to many people it has very little to do with either of the first two categories… which brings me to the third question

3     Are you planning to make all pornography illegal?

…because that seems to be the logical extension of the idea that the default position should be that ‘pornography’ is blocked as standard. That, of course, brings up the first two questions again. Who’s going to make the decisions, and on what basis? Further to that, who’s going to ‘watch the watchmen’? The Internet Watch Foundation, which currently ‘polices’ child abuse images, though an admirable body in many ways, is far from a model of transparency (see this excellent article by my colleague Emily Laidlaw). If a body is to have sweeping powers to control what content is available – powers above and beyond those set out in law – that body needs to be accountable and its operations transparent. How are you planning to do that?

4     What about Page 3?

I assume you’re not considering banning this. If you want to be logically consistent – and, indeed, if you want to stop the ‘corrosion of childhood’ then doing something about Page 3 would seem to make much more sense. Given the new seriousness of your attitude, I assume you don’t subscribe to the view that Page 3 is just ‘harmless fun’…. but perhaps you do. Where is your line drawn? What would Mr Murdoch say?

5     What else do you want to censor?

…and I use the word ‘censor’ advisedly, because this is censorship, unless you confine it to material that is illegal. As I have said, child abuse images are already illegal, and the extension to images depicting rape is a welcome idea, so long as the definitions can be made to work (which may be very difficult). Deciding to censor pornography is one step – but what next? Censoring material depicting violence? ‘Glorifying’ terrorism etc?  Anything linking to ‘illegal content’ like material in breach of copyright? It’s a very slippery slope towards censoring pretty much anything you don’t like, whether it be for political purposes or otherwise. ‘Function creep’ is a recognised phenomenon in this area, and one that’s very difficult to guard against. What you design and build for one purpose can easily end up being used for quite another, which brings me to another question…

6     What happens when people ‘opt-in’?

In particular, what kind of records will be kept? Will there be a ‘list’ of those people who have ‘opted-in to porn’? Actually, scratch that part of the question – because there will, automatically, be a list of those people who have opted in. That’s how the digital world works – perhaps not a single list, but a set of lists that can be compiled into a complete list. The real question is what you are planning to do with that list. Will it be considered a list of people who are ‘untrustworthy’? Will the police have immediate access to it at all times? How will the list be kept secure? Will it become available to others? How about GCHQ? The NSA? Have the opportunities for the misuse of such a list been considered? Function creep applies here as well – and it’s equally difficult to guard against!
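The point that scattered per-ISP records still amount to a single list is worth making concrete. The ISP names and customer identifiers below are entirely made up; the sketch just shows that combining separate opt-in records into one national list is a one-line operation.

```python
# Hypothetical per-ISP opt-in records (all names invented).
# Each ISP necessarily keeps its own record of who turned the
# filter off - that is how the opt-in would be implemented.
isp_records = {
    "ISP-A": {"customer-17", "customer-42"},
    "ISP-B": {"customer-99"},
    "ISP-C": {"customer-42", "customer-7"},
}

# The union of every provider's records is a complete list of
# everyone in the country who has 'opted in to porn'.
complete_list = set().union(*isp_records.values())
print(len(complete_list))  # 4 distinct people across three ISPs
```

However the records are fragmented, anyone with access to all of them – a regulator, the police, an intelligence agency – can trivially merge them, which is why the question of who can see the lists matters so much.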

7     What was that letter to the ISPs about?

You know, the letter that got leaked, asking the ISPs to keep doing what they were already doing, but allow you to say that this was a great new initiative? Are you really ‘at war’ with the ISPs? Or does the letter reveal that this initiative of yours is essentially a PR exercise, aimed at saying that you’re doing something when in reality you’re not? Conversely, have you been talking to the ISPs in any detail? Do you have their agreement over much of this? Or are you going to try to ‘strong-arm’ them into cooperating with you in a plan that they think won’t work and will cost a great deal of money, time and effort? For a plan like this to work you need to work closely with them, not fight against them.

8     Are you going to get the ISPs to block Facebook?

I have been wondering about this for a while – because Facebook regularly includes images and pages that would fit within your apparent definitions, particularly as regards violence against women, and Facebook show no signs of removing them. The most they’ve done is remove advertisements from these kinds of pages – so anyone who accesses Facebook will have access to this material. Will the default be for Facebook to be blocked? Or do you imagine you’re going to convince Facebook to change their policy? If you do, I fear you don’t understand the strength of the ‘First Amendment’ lobby in the US… which brings me to another question

9     How do you think your plans will go down with US internet companies?

All I’ve seen from Google have been some pretty stony-faced comments – but for your plan to work you need to be able to get US companies to comply. Few will do so easily and willingly, partly on principle (the First Amendment really matters to most Americans), partly because it will cost them money to do so, and partly because it will thoroughly piss off many of their American customers. So how do you plan to get them to comply? I assume you do have a plan…

10     Do you really think these plans will stop the ‘corrosion’ of childhood?

That’s my biggest question. As I’ve blogged before, I suspect this whole thing misses the point. It perpetuates a myth that you can make the internet a ‘safe’ place, and absolves parents of the real responsibility they have for helping their kids to grow up as savvy, wary and discerning internet users. It creates a straw man – the corrosion of childhood, such as it exists, comes from a much broader societal problem than internet porn, and if you focus only on internet porn, you can miss all the rest.

Plans like these, worthy though they may appear, do not, to me, seem likely to be in any way effective – the real ‘bad guys’ will find ways around them, the material will still exist, will keep being created, and we’ll pretend to have solved the problem – and at the same time put in a structure to allow censorship, create a deeply vulnerable database of ‘untrustworthy people’, and potentially alienate many of the most important companies on the internet. I’m not convinced it’s a good idea. To say the least.