Monday, July 22, 2013

Moral panic: privacy vs porn

It was inevitable. The signs have been evident for some time. David Cameron has decreed that every household connected to the Internet will have to tell its ISP whether it wants to access Internet porn. ISPs will be forced to operate porn filters and to ask each household whether it wants to opt out of them.

The plan is described as ‘opt-out’ but that’s rather dishonest since “do you want to see porn?” is a loaded question.  It’s loaded because the plan will generate a list of people who say they want porn. History shows that when governments have data like this, they can’t help but think of exciting and horrible new ways to use it. And once the legislation is in place, it’s increasingly easy to add to. 

It’s the sort of information that could turn up in a trial to make people look bad or to discredit witnesses: “she admits she likes porn!”. Worse, it could be used to indiscriminately target suspects or to justify harassment by the authorities. The investigation of a sexual crime might begin by looking for everyone present in the area at the time who had previously opted in to porn. Worse still, everyone from a household that had opted in.

This is evidence at its most circumstantial, but it’s exactly the sort of thing that might sway juries, trump up charges or demonise people within their own communities.

This move is a reaction to the moral panic currently being whipped up about pornography, largely by organisations devoid of morality such as the Daily Mail.  The argument is that people are harmed by looking at pornography.  The evidence for this is at best dubious.  I suspect that the people who are most harmed by pornography are those who are exploited by pornographers, not the users.  I think the answer is to use porn responsibly. Legitimise it. Regulate it. Unionise it. Let people who want to work in porn do so and protect them from exploitation. And allow people to opt out of porn if they want to in the way they want to. For example, they could try not downloading it. Or they could install filters at home and learn how to use them properly.  Or they could use an ISP that filters porn. 

But don’t force people to tell the government whether they want to watch porn or not.  This is information that can very easily be misused.

Of course, it’s entirely possible that porn could have indirect negative effects, such as the continued sexualisation and objectification of (especially) women. But this is by no means limited to porn and I’m not convinced that reducing access to porn will stop that. How about more porn for women? How about more porn for couples of any gender and combinations of sexuality? How about – in other words – producing and viewing porn responsibly?  Perhaps if we all had better attitudes in the first place, porn wouldn’t be seen by some as the problem or the cause of the problem.

There’s more to the plan.  For instance, possession of “extreme pornography” is to be banned. It isn’t clear what counts as “extreme”, but it is apparently to include porn that simulates rape.  This idea is also dangerous, for at least three reasons.

First, nothing illegal is happening in the production of rape fantasy porn (assuming the adults consent). Nobody is being harmed, nobody is really being raped.

Second, it is not clear that the possession of rape fantasy porn leads to violence.  Perhaps it will be established that use of such porn really is a significant factor that leads to violent acts, but until we have that evidence, it seems dangerous to ban its possession. Personally, I find porn that depicts violence of any kind awful and I’d rather there wasn’t any.  Even if it doesn’t lead to violence – particularly against women – it seems likely that it might lead to contempt and dehumanisation, but most porn can be accused of the same thing to a certain extent. The difference is that with more mainstream porn, we tend to assume that most people can tell the difference between fantasy and reality, and that in more “extreme” cases, they can’t. I’m just not comfortable with that distinction, much as I’m sickened by violent images, simulated or otherwise. I’m happy to hear opinions on this point; I’m having trouble coming to a conclusion.

Third, defining something as “extreme” is a safe haven for feature creep. Once there is such a category, it gets easier for governments to add new things to it. We know that governments will cheerfully erode our rights in exchange for the votes of the ignorant. This seems like a perfect vehicle for doing that.

The same might be true of the plan for the Child Exploitation and Online Protection Centre (CEOP) to draw up a list of “abhorrent” search terms. I have nothing against that in itself: it’s how the list might be managed and used that concerns me.  What’s to stop new terms creeping onto the list that might be seen as ‘abhorrent’ by some but aren’t actually harmful to anyone?

But more importantly, surely this is the very definition of thought crime?  Paedophiles are sexually attracted to children. They can’t help it. We hope that they never act on their urges either by assaulting a child or by looking at porn that involves the assault of a child. I certainly think it’s right to outlaw the assault of children and behaviour that directly encourages, endorses or supports the assault of children, such as buying or sharing child pornography.  But searching for particular terms doesn’t imply that an abusive act is about to take place or even that it is more likely to take place in a particular instance. 

I agree that it’s hard to justify searching for child porn, but I don’t think it should be a crime.  Producing it, owning it, appearing in it (for adults), yes. Distributing it, yes. Searching for it? That’s a very dangerous thing to make illegal and a dangerous sort of thing – regardless of the details – to allow a government to do.
