Mobile internet filters block legitimate content, campaign group says

Mobile operators' content filters cut off access to legitimate content, a campaign group for digital rights has said. 16 May 2012

Mobile operators currently provide filter systems that enable parents to stop children accessing websites deemed to contain content suitable for individuals aged 18 or over. However, the Open Rights Group (ORG) said there are "a number of serious problems" with how those systems work.

"At present the filtering systems are too blunt an instrument and too poorly implemented," the ORG said in a report (43-page / 408KB PDF) on 'mobile internet censorship'. "Mobile Internet filtering blocks too much content, and applies to too many people, meaning it effectively adds up to a system of censorship across UK networks."

"As more people use mobile devices to access the Internet, and as the Internet continues to provide a potential platform for promoting both freedom of expression and economic innovation, it is critical that such problems are addressed. If they are not, then this form of censorship will continue to create unwanted restrictions on access to information for adults and young people, which will damage markets, undermine the free flow of ideas and open communication, and make it harder to promote responsible Internet governance internationally," it said.

ORG said that its report was based on complaints it had received about "over-blocking" and also on a 'mystery shopper' exercise it had conducted where it "complained about incorrectly blocked sites to the mobile operators and assessed the response."

The group said its findings show that the filters do not just prevent access to "adult sexual content", and that there were problems with the "classification" of websites on 'blacklists', which has led to some sites being wrongly blocked by the filtering.

Mobile operators are also "not transparent enough about how their filtering systems work or the kind of content they block" and it is "often not clear how to report mistakes and problems," the ORG said. Another problem the group identified was the difficulty in turning filters off.

"Getting mobile operators to turn off blocks often requires consumers to provide credit card details as a means of identification or to go to a store," the ORG said. "For many this may not be too onerous or problematic, although some may not want to provide credit card details either over the phone or through the page returned to a user when a site is blocked."

"A more significant concern may be finding a way for those who run websites they believe have been incorrectly blocked to ‘opt out’. It is not at all clear that it is possible for sites to have themselves removed from content filters," it said.

"The result [of these problems] is that a system ostensibly designed to help parents manage their children’s access to the Internet is effectively implementing much broader restrictions on access to information that affect a much wider group of people than intended," it said.

"The worthwhile aspiration to help parents manage their children’s Internet access has led to filtering systems that are clumsy, inaccurate, and inefficient, based on opaque and error-ridden lists of sites considered ‘blockable’," the group added.

The ORG said that some mobile internet providers are better than others at providing clear information about opting-out of filters, but that it was "clear that all the systems in use by the mobile operators suffer in some respects from these four issues."

ORG said that current mobile filtering in the UK does not follow three rules recommended by the UN's rapporteur on freedom of expression, Frank La Rue, over how filtering measures should operate. Last year La Rue reported that the filtering of content can have a "chilling effect" on freedom of expression unless the use of filters is limited to exceptional circumstances, is governed by law and a clear legal process, and is necessary and the least restrictive means required to achieve the aim, the ORG said.

The campaigners said that the UK's mobile filtering system is "overly broad, and governed by informal industry frameworks and contractual relationships with filtering service providers."

It said some of the "consequences" of the UK's system are that businesses are cut off from their market because of over-blocking and that the system can also lead to censorship.

"There are clear problems for free access to and sharing of information when decisions about access are taken out of people’s hands, and left to opaque and informal agreements or clumsy and unresponsive technical systems," it said. "This is especially problematic in a filtering system that is not ‘granular’ enough, leading to blanket filtering that covers far too much material, for example sites such as restaurant sites, blogs about shelves, or political discussion sites."

"Furthermore, if online censorship is widespread and accepted with little opposition as a way to implement a broad range of public policy issues, it becomes far harder to argue for Internet freedom elsewhere," ORG said.

Other "consequences" of problems with the current mobile filtering systems are that young people are "denied access to legitimate and age-appropriate information and resources such as sexual health information and advice". The systems also offer a "false sense of security" and may not actually adequately protect children from online risks, because filters can be circumvented and because new encryption technology is making it "impossible for an ISP to ‘check’ the web address the user is visiting," ORG said.

ORG said that "in the longer term" filtering systems should be "device-based" rather than at "ISP level" because, generally, "the closer to a user the filtering happens, the more control the user has over it".

In the shorter term, the campaigners said, mobile operators should enable customers to choose whether to opt in to censorship measures at the point of signing up to their service. The tools should be referred to as 'parental controls' rather than 'adult content' filters because "the range of material caught stretches far beyond sexual content and the terminology should reflect this," it said.

The operators should also be more transparent about what their filters block and provide adults with "clear advice about the kind of content that may be blocked, and ... with clear information on how the blocking works," including any information on the identity of any third-party suppliers of the filtering technology, ORG said. It added that there should also be "clear and easy ways to check if a site is blocked" and that there should be an easy-to-use mechanism "to complain about wrongful blocks, including at the time when an incorrectly blocked website is found."

ORG also recommended that mobile operators regularly review the operation of their filter systems and provide a means of redress to website operators in order that they can "challenge a refusal to remove their site from a blocking system."

ORG said that problems with the filtering of content may extend beyond mobile operators' systems if new Government policies on child protection filtering, or on filtering content related to terrorism, extremism and copyright enforcement, are not carefully drafted.

"Where filtering is mandatory – meaning imposed by the government or mandated by a court order with no choice to have filtering applied – questions about necessity, proportionality, and due legal process become even more significant," it said. "What mobile filtering already helps to demonstrate is that seemingly simple, laudable goals such as protecting children through technical intervention may have significant harmful and unintended consequences for everybody’s access to information."

ORG also said that no filtering systems should be introduced "by default". However, draft legislation currently before the House of Lords would, if enacted, introduce such a statutory requirement on both ISPs and mobile network operators.

The Online Safety Bill places a "duty" on ISPs and mobile network operators that provide internet access services to "provide a service that excludes pornographic images" by default. It has received a first reading in the House of Lords after it previously made "no further progress" beyond a first reading in the Lords during the last Parliamentary session.

Under the Bill, only if subscribers aged over 18 actively "opt-in" to access the adult content would the ISPs and operators allow the material to be accessed, and even then access would have to be denied unless the website featuring the images "has an age verification policy which has been used to confirm that the subscriber is aged 18 or over".
