There’s no ‘silver bullet’ for porn….

I was lucky enough to be on Radio 4’s ‘The Media Show’ yesterday, to talk about Cameron’s porn-blocking plans: I think I was invited as a result of my blog post from Monday, asking 10 questions about the plan. I didn’t have much time – and I’m still very much an amateur on the radio – and though I think I managed to get across some of what I wanted to say, I didn’t get close to persuading the other person talking about the subject – Eleanor Mills, of the Sunday Times. I think I know why: she’s in many ways correctly identified the monster that she wants to slay, and she thinks that she’s found the silver bullet. The problem is, for porn, there IS no silver bullet. It’s not that simple.

The solution that she suggested – and she said that ‘the man from Google’ told her it was possible – was a simple ‘switch’ to turn a ‘porn filter’ on or off. If you wanted to see ‘restricted’ material for some justified reason (e.g. to look at material for sex education purposes) you could turn the filter off, and you’d be asked a question in a pop-up, something like ‘Do you want to look at this for research purposes?’. You’d click OK, look at the stuff, then turn the filter back on. Simple. Why not do it?

It doesn’t really take a technical expert to see the flaws in that plan, even if it were possible to create such a switch – how it wouldn’t stop viewing stuff for bad reasons (who’s going to answer honestly when asked why they want access?), how it avoids the fundamental question of how you define ‘porn’, and all the other crucial issues that I mentioned in my other blog post. That’s not to mention the technical difficulties, the problem of over-censorship and under-censorship, of the way that the really bad stuff will avoid the filters anyway – let alone the even more fundamental issues of free speech and the need to be able to access information free of fetters or limitations…. There are so many flaws in the plan that it’s hard to know where to start – but it’s easy to see the attraction of the solution.
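To make the over- and under-blocking point concrete, here’s a deliberately naive keyword filter – a minimal sketch with an invented blocklist, not anything any ISP actually runs – in which both failure modes appear at once:

```python
# A deliberately naive keyword-based URL filter. Illustrative only:
# real ISP filters are more sophisticated, but fail in the same ways.

BLOCKED_KEYWORDS = {"porn", "sex", "xxx"}  # hypothetical blocklist

def is_blocked(url: str) -> bool:
    """Return True if any blocked keyword appears in the URL."""
    lowered = url.lower()
    return any(keyword in lowered for keyword in BLOCKED_KEYWORDS)

# Over-blocking: legitimate material gets caught...
print(is_blocked("http://nhs.uk/sex-education-advice"))  # True
print(is_blocked("http://sussexdowns.example.org"))      # True ('sussex' contains 'sex')

# ...and under-blocking: the really bad stuff sails through,
# because it has no reason to label itself helpfully:
print(is_blocked("http://example.com/x7f92qa"))          # False
```

Notice that no definition of ‘porn’ appears anywhere in that code except in the blocklist itself – which is exactly where the hard, human, contested question has been hidden.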

We all want to find easy solutions – and computerised, technical solutions often promise exactly that. Porn, however, is not amenable to easy solutions. It’s a complex subject – and sadly for those looking for silver bullets, it needs complex, multifaceted solutions that take time, effort and attention.

We do, however, know what a lot of those solutions are – but they’re not really politically acceptable at the moment, it seems. We know, for example, that really good sex and relationships education helps – but the government recently voted down a bill that would have made that kind of education compulsory in schools. The ‘traditional’ education favoured by Michael Gove and the Daily Mail has no truck with new-fangled trendy things like that, and the puritanical religious approach still claims, despite all the evidence, that ignorance of sexual matters is bliss. It isn’t. Better education is the key starting point to helping kids find their way with sex and relationships – and to countering the ‘poisonous’ influence of ‘bad’ porn (which, it must be remembered, is generally NOT illegal), the kind of influence that Eleanor Mills justifiably wants to deal with. If she really wants to help, she should be fighting the government on that, not pushing technical, magical solutions that really won’t work.

The next stage is putting more resources – and yes, that means money – into the solutions that we know work well: the IWF in dealing with child abuse images; CEOP in dealing with sex offenders’ online activities. Work on a targeted, intelligent level. The experts know it works – but it’s hard work, it’s not headline-grabbing, and it’s not ‘instant’. What’s more, it’s not cheap.

The other part of the jigsaw, for me, is to start having a more intelligent, more mature and more honest debate about this. If the politicians didn’t go for soundbite solutions without talking to experts, but actually listened to what people said, this might be possible. Sadly, with the current lot of politicians on pretty much every side, that seems impossible. This isn’t a party-political issue: Labour are every bit as bad as the Tories on this, with Helen Goodman a notable offender. It’s an issue of politicians being unwilling to admit they don’t understand, and unwilling to take advice that doesn’t fit with their ‘world view’. It’s an issue of the corrosive influence of hypocritical and puritanical newspapers like the Daily Mail, on the one hand calling for internet porn bans and on the other parading their ‘sidebar of shame’, complete with images and stories that objectify women and girls to an extreme.

The one saving grace here is that the solution they suggest simply won’t work – and eventually they’ll realise that. In Australia, a similarly facile solution was tried, only to be ignominiously abandoned a few years later. If only that were the lesson from Australia that Lynton Crosby managed to get across to David Cameron….

10 questions about Cameron’s ‘new’ porn-blocking

There’s been a bit of a media onslaught from David Cameron about his ‘war on porn’ over the weekend. Some of the messages given out have been very welcome – but some are contradictory, and others make very little sense when examined closely. The latest pronouncement, as presented to/by the BBC, says:

“Online pornography to be blocked automatically, PM announces”

The overall thrust seems to be that, as Cameron is going to put it in a speech:

“Every household in the UK is to have pornography blocked by their internet provider unless they choose to receive it.”

So is this the ‘opt-in to porn’ idea that the government has been pushing for the last couple of years? The BBC page seems to suggest so: all new ISP customers will have their ‘porn-filters’ turned on by default, so will have to actively choose to turn them off, and ‘millions of existing computer users will be contacted by their internet providers and told they must decide whether to activate filters’.

Some of this is welcome – the statement about making it a criminal offence to possess images depicting rape sounds like a good idea on the face of it, for example, for such material is deeply offensive – though quite where it would leave anyone who owns a DVD of Jodie Foster being raped in The Accused isn’t at all clear. Indeed, that is the first of my ten questions for David Cameron.

1     Who will decide what counts as ‘pornography’, and how?

And not just pornography, but images depicting rape? Will this be done automatically, or will there be some kind of ‘porn board’ of people who will scour the internet for images and decide what is ‘OK’ and what isn’t? Automatic systems already exist to do this for child abuse images, and by most accounts they work reasonably well, but they haven’t eradicated the problem of child abuse images. Far from it. If it’s going to be a ‘human’ system – perhaps an extension of the Child Exploitation and Online Protection Centre (CEOP) – how are you planning to fund it, and do you have any idea how much this is going to cost?
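For context, the automatic systems in question generally work by matching images against a database of digital ‘fingerprints’ of material that a human has already identified – Microsoft’s PhotoDNA is the best-known example. Here’s a minimal sketch of the idea, using a plain cryptographic hash rather than the perceptual hashing real systems use; the fingerprint database here is invented:

```python
import hashlib

# Hypothetical database of fingerprints of known illegal images.
# Real systems (e.g. PhotoDNA) use perceptual hashes that survive
# resizing and re-encoding; a plain SHA-256, as used here, is
# defeated by changing a single pixel.
KNOWN_IMAGE_FINGERPRINTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint from an image file's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_illegal(image_bytes: bytes) -> bool:
    """Flag an image only if it matches already-identified material."""
    return fingerprint(image_bytes) in KNOWN_IMAGE_FINGERPRINTS
```

Note what such a system cannot do: it only recognises images someone has already examined and classified, so every genuinely new image still requires a human decision – which is why the ‘who decides, and how?’ question doesn’t go away.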

2     Do you understand and acknowledge the difference between pornography, child abuse images and images depicting rape? 

One of the greatest sources of confusion over the various messages given out over the weekend has been the mismatch between headlines, sound bites, and actual proposals (such as they exist) over what you’re actually talking about. Child abuse images are already illegal pretty much everywhere on the planet – and are hunted down and policed as such. As Google’s spokespeople say, Google already has a zero-tolerance policy for those images, and has done for a while. Images depicting rape are another category, and the idea of making it illegal to possess them would be a significant step – but what about ‘pornography’? Pornography comes in many forms, is generally legal, and to many people has very little to do with either of the first two categories…. which brings me to the third question

3     Are you planning to make all pornography illegal?

…because that seems to be the logical extension of the idea that the default position should be that ‘pornography’ is blocked as standard. That, of course, brings up the first two questions again. Who’s going to make the decisions, and on what basis? Further to that, who’s going to ‘watch the watchmen’? The Internet Watch Foundation, which currently ‘polices’ child abuse images, though an admirable body in many ways, is far from a model of transparency (see this excellent article by my colleague Emily Laidlaw). If a body is to have sweeping powers to control what content is available – powers above and beyond those set out in law – that body needs to be accountable and its operations transparent. How are you planning to do that?

4     What about Page 3?

I assume you’re not considering banning this. If you want to be logically consistent – and, indeed, if you want to stop the ‘corrosion of childhood’ then doing something about Page 3 would seem to make much more sense. Given the new seriousness of your attitude, I assume you don’t subscribe to the view that Page 3 is just ‘harmless fun’…. but perhaps you do. Where is your line drawn? What would Mr Murdoch say?

5     What else do you want to censor?

…and I use the word ‘censor’ advisedly, because this is censorship, unless you confine it to material that is illegal. As I have said, child abuse images are already illegal, and the extension to images depicting rape is a welcome idea, so long as the definitions can be made to work (which may be very difficult). Deciding to censor pornography is one step – but what next? Censoring material depicting violence? Material ‘glorifying’ terrorism? Anything linking to ‘illegal content’ like material in breach of copyright? It’s a very slippery slope towards censoring pretty much anything you don’t like, whether it be for political purposes or otherwise. ‘Function creep’ is a recognised phenomenon in this area, and one that’s very difficult to guard against. What you design and build for one purpose can easily end up being used for quite another, which brings me to another question…

6     What happens when people ‘opt-in’?

In particular, what kind of records will be kept? Will there be a ‘list’ of those people who have ‘opted-in to porn’? Actually, scratch that part of the question – because there will, automatically, be a list of those people who have opted in. That’s how the digital world works – perhaps not a single list, but a set of lists that can be compiled into a complete list. The real question is what you are planning to do with that list. Will it be considered a list of people who are ‘untrustworthy’? Will the police have immediate access to it at all times? How will the list be kept secure? Will it become available to others? How about GCHQ? The NSA? Have the opportunities for the misuse of such a list been considered? Function creep applies here as well – and it’s equally difficult to guard against!
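To see how little work ‘compiling’ those lists takes, here’s a hypothetical sketch – the ISP names, customer names and field layout are all invented for illustration:

```python
# Hypothetical per-ISP records of each customer's filtering choice.
# Any participating ISP necessarily holds something equivalent.
isp_records = {
    "ISP-A": {"alice": True, "bob": False},   # True = filter left on
    "ISP-B": {"carol": False, "dave": True},
}

# 'Perhaps not a single list, but a set of lists that can be
# compiled into a complete list':
opted_in_to_porn = sorted(
    (isp, customer)
    for isp, customers in isp_records.items()
    for customer, filter_on in customers.items()
    if not filter_on
)

print(opted_in_to_porn)  # [('ISP-A', 'bob'), ('ISP-B', 'carol')]
```

The list doesn’t have to be deliberately created in order to exist; it falls straight out of the records the scheme requires the ISPs to keep.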

7     What was that letter to the ISPs about?

You know, the letter that got leaked, asking the ISPs to keep doing what they were already doing, but allow you to say that this was a great new initiative? Are you really ‘at war’ with the ISPs? Or does the letter reveal that this initiative of yours is essentially a PR exercise, aimed at saying that you’re doing something when in reality you’re not? Conversely, have you been talking to the ISPs in any detail? Do you have their agreement over much of this? Or are you going to try to ‘strong-arm’ them into cooperating with you in a plan that they think won’t work and will cost a great deal of money, time and effort? For a plan like this to work you need to work closely with them, not fight against them.

8     Are you going to get the ISPs to block Facebook?

I have been wondering about this for a while – because Facebook regularly includes images and pages that would fit within your apparent definitions, particularly as regards violence against women, and Facebook show no signs of removing them. The most they’ve done is remove advertisements from these kinds of pages – so anyone who accesses Facebook will have access to this material. Will the default be for Facebook to be blocked? Or do you imagine you’re going to convince Facebook to change their policy? If you do, I fear you don’t understand the strength of the ‘First Amendment’ lobby in the US… which brings me to another question

9     How do you think your plans will go down with US internet companies?

All I’ve seen from Google has been some pretty stony-faced comments – but for your plan to work you need to be able to get US companies to comply. Few will do so easily and willingly, partly on principle (the First Amendment really matters to most Americans), partly because it will cost them money to do so, and partly because it will thoroughly piss off many of their American customers. So how do you plan to get them to comply? I assume you do have a plan…

10     Do you really think these plans will stop the ‘corrosion’ of childhood?

That’s my biggest question. As I’ve blogged before, I suspect this whole thing misses the point. It perpetuates a myth that you can make the internet a ‘safe’ place, and absolves parents of the real responsibility they have for helping their kids to grow up as savvy, wary and discerning internet users. It creates a straw man – the corrosion of childhood, such as it exists, comes from a much broader societal problem than internet porn, and if you focus only on internet porn, you can miss all the rest.

Plans like these, worthy though they may appear, do not, to me, seem likely to be in any way effective – the real ‘bad guys’ will find ways around them, the material will still exist, will keep being created, and we’ll pretend to have solved the problem – and at the same time put in a structure to allow censorship, create a deeply vulnerable database of ‘untrustworthy people’, and potentially alienate many of the most important companies on the internet. I’m not convinced it’s a good idea. To say the least.

Privacy, Parenting and Porn

One of the stories doing the media rounds today concerned the latest pronouncements from the Prime Minister about porn on the internet. Two of my most commonly used news sources, the BBC and the Guardian, had very different takes on it. The BBC suggested that internet providers were offering parents an opportunity to block porn (and ‘opt-in’ to website blocking), while the Guardian took it exactly the other way – suggesting that users would have to opt out of the blocking – or, to be more direct, to ‘opt-in’ to being able to receive porn.

Fool that I am, I fell for the Guardian’s version of the story (as did a lot of people, judging from the buzz on Twitter), which seems now to have been thoroughly debunked, with the main ISPs saying that the new system would make no difference, and bloggers like the excellent David Meyer of ZDNet making it clear that the BBC was a lot closer to the truth. The idea would be that parents would be given the choice as to whether to accept the filtering/blocking system, which, on the face of it, seems much more sensible.

Even so, the whole thing sets off a series of alarm bells. Why does this sort of thing seem worrying? The first angle that bothers me is the censorship one – who is it that decides what is filtered and what is not? Where do the boundaries lie? One person’s porn is another person’s art – and standards are constantly changing. Cultural and religious attitudes all come into play. Now I’m not an expert in this area – and there are plenty of people who have written and said a great deal about it, far more eloquently than me – but at the very least it appears clear that there are no universal standards, and that decisions as to what should or should not be put on ‘block lists’ need to be made very carefully, with transparency about the process and accountability from those who make the decisions. There needs to be a proper notification and appeals process – because decisions made can have a huge impact. None of that appears true about most ‘porn-blocking’ systems, including the UK’s Internet Watch Foundation, often very misleadingly portrayed as an example of how this kind of thing should be done.

The censorship side of things, however, is not the angle that interests me the most. Two others are of far more interest: the parenting angle, and the privacy angle. As a father myself, of course I want to protect my child – but children need independence and privacy, and need to learn how to protect themselves. The more we try to wrap them in cotton wool, to make their world risk-free, the less able they are to learn how to judge for themselves, and to protect themselves. If I expect technology, the Prime Minister, or the Internet Watch Foundation to do all the work for me, not only am I abdicating responsibility as a parent, but I’m denying my child the opportunity to learn and to develop. Schemes like the one planned could do harm in two ways at once: they could make parents think that their parenting job is done for them, and they could also reduce children’s chances to learn to discriminate, to decide, and to develop their moral judgment….

….but that is, of course, a very personal view. Other parents might view it very differently – what we need is some kind of balance, and, as noted above, proper transparency and accountability.

The other angle is that of privacy. Systems like this have huge potential impacts on privacy, in many different ways. One, however, is of particular concern to me. First of all, suppose the Guardian were right, and you had to ‘opt-in’ to be able to view the ‘uncensored internet’. That would create a database of people who might be considered ‘people who want to watch porn’. How long before that becomes something that can be searched when looking for potential sex offenders? If I want an uncensored internet, does that make me a potential paedophile? Now the Guardian appears to be wrong, and instead we’re going to have to opt-in to accept the filtering system – so there won’t be a list of people who want to watch porn, but rather a list of people who want to block porn. It wouldn’t take much work, however, on the customer database of a participating ISP to select all those users who had the option to choose the blocking system, and didn’t take it. Again, you have a database of people who, looked at from this perspective, want to watch porn….
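In database terms, the difference between the two designs is a single negation – here’s a hypothetical sketch, with an invented customer table, to make the point:

```python
# Hypothetical customer records for one participating ISP; the
# schema is invented for illustration. Whether the scheme is
# 'opt in to filtering' or 'opt in to porn', the same trivial
# selection yields the same list.
customers = [
    {"name": "alice", "offered_filter": True, "filter_enabled": True},
    {"name": "bob",   "offered_filter": True, "filter_enabled": False},
]

declined_filter = [
    c["name"] for c in customers
    if c["offered_filter"] and not c["filter_enabled"]
]

print(declined_filter)  # ['bob'] -- readable, at a glance, as a
                        # list of 'people who want to watch porn'
```

Which way round the default sits changes the politics, but not the data: either way, the records needed to run the scheme are exactly the records needed to build the list.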

Now maybe I’m overreacting, maybe I’m thinking too much about what might happen rather than what will happen – but slippery slopes and function creep are far from rare in this kind of field. I always think of the words of Bruce Schneier, on a related subject:

“It’s bad civic hygiene to build technologies that could someday be used to facilitate a police state”

Now I’m not suggesting that this kind of thing would work like this – but the more ‘lists’ and ‘databases’ we have of people who don’t do what’s ‘expected’ of them, or what society deems ‘normal’, the more opportunities we create for potential abuse. We should be very careful…

Do we have a ‘right to be found’?

I’ve thought (and written) a lot about privacy and autonomy – but I’m fully aware that privacy and autonomy are not the only important human rights relevant to the internet. Indeed, they may not be the most important, particularly compared to freedom of expression – the internet is to a great extent a communications medium, and much of the current use of the internet relates to the expression of ideas, particularly in relation to what might loosely be described as the Web 2.0 applications: blogs, wikis, social networking and related services. Free expression can be considered another aspect of autonomy – what’s more, the privacy-related threats to autonomy can have a significant impact on freedom of expression, not just directly (for example where a dissident blogger is tracked down and arrested as a result of breaches of privacy) but through the chilling effect – a kind of ‘Internet Panopticon’ effect – that the knowledge of the potential privacy-related risks can produce.
There are also, however, aspects to the issue of free expression that do not relate just to privacy. One particularly direct aspect relates to the functioning of search engines and other methods of navigating the internet. Does the creator of a website have a ‘right to be found’? That kind of a right wouldn’t mean that a site could demand special treatment from a search engine – but it would mean that the site could be sure it wouldn’t receive specially unfair treatment, and that a search engine would treat it on its merits, according to principles that are known and understood. Any kind of a right to free expression, in relation to the internet, wouldn’t seem to mean much if what you express can’t be found.
The implementation of a right like this would face particular difficulties in relation to the search engines’ own rights to trade secrets insofar as their search algorithms are concerned, but companies and the EC have already bitten the bullet sufficiently to take Google on over possible biasing of search results in the ‘Foundem’ case, and this kind of right would relate directly to that kind of bias – for bias in favour of something is, by its very nature, bias against something else. Google have responded to that accusation in relation to Foundem by saying (amongst other things) ‘We built Google for users, not websites’ (see for example here), but in an increasingly personal internet, where users are becoming publishers, is that a sufficiently strong argument? If free expression is to be taken seriously, it may not be.
A ‘right to be found’ would be intended to prevent both bias and censorship – of the kind exercised by authoritarian regimes, for example – and would also have direct implications on the work of organisations like the Internet Watch Foundation, requiring them to be properly transparent and accountable, something that at present they generally seem not to be…