Out-Law News

Why the IWF was wrong to lift its ban on a Wikipedia page


EDITORIAL: The Internet Watch Foundation faced a storm of criticism this week over its decision to add a Wikipedia entry to a blacklist of pages that ISPs block. Under pressure, the IWF removed the page from its blacklist. That decision was a mistake.

The IWF exists primarily to minimise the availability of indecent images of children on the internet. Its existence is now being challenged. For those who don't know the background, the IWF received a complaint last Thursday about an image that appeared on the Wikipedia entry for an album by German rock band Scorpions. The image featured a naked young girl in an erotic pose. It was the original sleeve design for the band's 1976 album 'Virgin Killer'.

The IWF deemed the image potentially illegal and added the page on which it featured to a blacklist that is enforced by all of the UK's major internet service providers (ISPs). That action prevented most UK internet users from accessing the Wikipedia page, and it had the unintended side-effect of stopping those users from editing any of Wikipedia's millions of pages: the filtering routed UK traffic through a small number of proxy servers, so UK editors appeared to Wikipedia to share a handful of addresses, which the site blocked to guard against vandalism.

The web community erupted in fury. Comments on blogs were overwhelmingly anti-IWF. On Tuesday night, the IWF gave in to some of the criticism. It removed the page from its blacklist, though, after consulting senior police officers, it reiterated its view that the image is potentially illegal.

The IWF was right to put the page on its blacklist. It was wrong to remove it.

Many people were alarmed to learn this week that there are restrictions on their freedom to surf the internet. Wikimedia, the non-profit operator of Wikipedia, said that this was the first time its site had been censored in the UK. It noted that the site has been censored at various times in China, Syria and Iran.

Like it or not, censorship exists in the UK. Our right to freedom of expression is a qualified right. The European Convention on Human Rights provides that it can be restricted by laws to protect morals or the reputation or rights of others, or by laws to prevent the disclosure of confidential information. Our laws of child protection, defamation, intellectual property and confidence all contain a right to suppress online material – which falls within Wikipedia's own definition of censorship.

Other critics dislike the fact that the IWF decides what to censor. They are right that it should not hold that power if it performs the duty badly. But it should not be judged on the basis of one decision. It assessed 35,000 complaints in the course of last year alone (and in two-thirds of cases the image was deemed lawful).

The blacklist is kept secret for obvious reasons, but internet users who try to visit the pages on it will be oblivious to the censorship – and that is a mistake, I think. ISPs present an error message that does not disclose the censorship. While the IWF can't control that message, it could encourage transparency.

There is another problem with the IWF's model: it bans pages, not the images themselves. It says this approach is simpler and more effective, though I confess that I don't understand why. Still, if that policy is disproportionate it is only slightly so: it did not blacklist an entire site.

The IWF was set up by the UK's internet industry. It began as a 'notice and takedown' body for images of child abuse that are hosted in the UK, giving the public a hotline for reporting illegal images. Web hosts do not know what images are on their customers' sites and do not need to, provided they react quickly when alerted to potentially illegal content. Only a court can officially declare an image illegal – and that is why the IWF always refers to 'potentially illegal' images.

Hosts cannot afford to await a court's declaration that an image is illegal – otherwise it may come at their own trial on charges that carry a maximum 10-year sentence. And they don't want to give their own staff the job of receiving complaints with images of child abuse attached. So they outsourced the bulk of the work of receiving and assessing complaints and the IWF was born.

When child abuse images are hosted in the UK, the IWF can identify and notify the host, and the host removes them. Its work has dramatically cut the volume of illegal images hosted in the UK.

The IWF's more controversial operation is the maintenance of its blacklist. When images are found to be hosted outside the UK, the IWF cannot ensure takedown. It reports the image to equivalent bodies and law enforcement in the hosting country and adds the URL to its blacklist, a list it updates twice a day. That list is followed by most ISPs in the UK, mainly to prevent their customers from stumbling upon images that are illegal even to view.

Some have pointed out that the blacklist only stops one means of accessing illegal images. It does not stop exchanges by file-sharing, for example. Yet some proponents of this argument also maintain that the IWF is wrong to interfere. Surely an argument that the IWF does not go far enough to meet its objectives is incompatible with one that says it goes too far?

Many people don't like that ISPs use the IWF's blacklist to censor their surfing. But ISPs do this voluntarily in the knowledge that if they don't, the government will intervene. Vernon Coaker, Minister of State for policing, security and crime, said so. He set a target for the end of 2007 for all ISPs to put in place technical measures "that prevent their customers accessing websites containing illegal images of child abuse identified by the IWF."

"If it appears that we are not going to meet our target through co-operation," Coaker warned, "we will review the options for stopping UK residents accessing websites on the IWF list."

The IWF does not write laws or lobby for legal reform – it just interprets the legislation and court rulings that exist. It has a small team of analysts who train with the police and are experts in assessing content in line with those laws. The government trusts it to do this job.

Other industries have their own self-regulatory bodies staffed by experts in the field. The Advertising Standards Authority can ban adverts from TV without a court ruling. Spamhaus blacklists spammers to protect our email inboxes. Such bodies are accountable to their industries and the IWF is no different. If it fails in its duty, ISPs can kill it. If they do, they can replace it themselves, or the government will replace it for them.

Some people have said that the Scorpions' image is harmless. My advice: treat it as illegal, because the IWF's opinion is likely to be, in legal terms, better informed than theirs – and therefore more influential in the mind of a judge.

It was blacklisted because, in the IWF's view, it is likely to fail a test of the Protection of Children Act 1978 (a law that did not exist at the time of the album's release). The Act refers to "indecent photographs" of children (under-18s) and what is indecent is, according to case law, for a jury to decide "based on the recognised standards of propriety."

Sentencing guidelines from the Sentencing Guidelines Council (see Part 6A of the Council's 144-page guidelines document) provide guidance on the mode of trial and sentence that should apply. Every court must have regard to these guidelines. They describe five levels of seriousness, level one being "images of erotic posing, with no sexual activity".

The test will not censor Michelangelo's David or a cartoon, as some have feared, because it is limited to photographs and pseudo-photographs. Some have suggested that the album cover of Nirvana's Nevermind, on which a naked baby swims towards a dollar bill, is also at risk of a ban. That is unlikely because courts have never interpreted such images as erotically posed.

The IWF invoked its appeals procedure, carried out with senior police officers, after Wikimedia complained about the blacklisting. It upheld its original ruling. But the IWF decided to remove the page from its blacklist "in light of the length of time the image has existed and its wide availability," it said. Further reports of the image being hosted abroad will not be added to its blacklist, though if the image is found to be hosted in the UK, it "will be assessed in line with IWF procedures."

"IWF's overriding objective is to minimise the availability of indecent images of children on the internet, however, on this occasion our efforts have had the opposite effect," it said.

I think the IWF was right to blacklist the page after deeming the image potentially illegal. But its decision to remove the page from the blacklist sends a dangerous message. If an image is illegal, should its age or wide dissemination excuse repeated publication? The image cannot be erased from every corner of the internet and prosecutors will not be taking action against all those who own a copy on either album sleeve or hard drive. But the IWF's job is to minimise exposure to illegal images – not to eliminate it. Under enormous pressure, I think it lost sight of that distinction.

Censorship takes place in Britain every day, for legal, moral and commercial reasons. When Wikipedia blocks those who vandalise its pages or deletes their hateful comments, it too engages in censorship. Internet companies engage in censorship because they have to – and they outsource part of that burden to the IWF. This incident has focused attention on more than a 1970s album cover: clearly some people dislike our laws, our industry's preference for self-regulation and/or the operation of the IWF. If the critics seek reform they should suggest a credible alternative, one that the industry and government would support.

Balancing our freedom of expression with the protection of children is difficult and important. It is a healthy issue to debate. But like any Wikipedia article, that debate needs some balance. This week that balance was missing.

This is a variation of a piece by Struan Robertson, Editor of OUT-LAW, which appeared earlier today on FT.com. These are the personal views of the author and do not necessarily represent the views of Pinsent Masons.


UPDATES, 15/12/2008:

  1. This article initially said that the Government had rejected plans to extend the law to cover cartoons. In fact, it proposed such an extension this year, albeit the law has not yet changed.
  2. The article also implied that the 'sentencing guidelines' contain the test of indecency. A reader rightly pointed out that it is for the jury to decide what is or is not indecent, based on the test laid down in the 1972 case of R v Stamford ("recognised standards of propriety").

I have updated this article and I apologise for these errors.
