Out-Law News 3 min. read

UK government promises new 'online safety legislation'


The UK government will set out new "online safety legislation" later this year, it has confirmed.

Included in the new legislation will be a new statutory social media code of practice along with a duty for online platforms to report on the measures taken to address both harmful and illegal content posted by users, according to the plans contained in the government's response to its internet safety strategy green paper (75-page / 1.10MB PDF).

However, further legislative reforms could also be proposed, including in relation to online platforms' liability for harmful and illegal content they host, the government said.

"We are currently assessing legislative options to modify the online liability regime in the UK, including both the smaller changes consistent with the EU's E-Commerce Directive, and the larger changes that may be possible when we leave the EU," the government said in its paper. "This legislative work will help the industry understand exactly what government and the UK public expects in relation to safety and provide better consistency across a wide range of companies."

"As a first step, we will be introducing our statutory social media code of practice and transparency reporting. Our code of practice will tackle abusive and harmful conduct and content on social media, by setting a clear, common approach to online safety. And the internet safety transparency reports will allow us to track company performance on safety and benchmark companies against each other. Taken together, the code and the reports will allow government to monitor social media companies’ online safety efforts and evaluate their success," it said.

In its paper, the government also suggested that new age verification obligations could be imposed "to assist companies to enforce terms and conditions" and prevent children from using services where they may be exposed to harmful content.

However, while the government said further action is necessary to apply standards across all major online platforms, it used its paper to highlight the work some of the largest technology companies have already engaged in to address harmful and illegal online content.

It said: "It is important to recognise that the leading social media companies are already taking steps to improve their platforms. They have developed important technical tools and successful partnerships with charities to deliver online safety initiatives - with plans to do more in this area. The growth of AI and machine learning means that algorithms are used to remove harmful content more quickly. These measures are having a positive impact."

Currently, the E-Commerce Directive protects online service providers, in some circumstances, from liability for material that they neither create nor monitor but simply store or pass on to users of their service.

Under the rules, service providers acting as intermediaries, such as internet service providers (ISPs) and online platforms, are generally not responsible for the unlawful activity of publishers or internet users. EU countries are prohibited, under the Directive, from setting rules that force service providers to pro-actively monitor for illegal activity on their service.

Under the E-Commerce Directive, online service providers are generally not liable for unlawful information they transmit, temporarily store or host on behalf of publishers. The exemption from liability for caching or hosting the information ends, however, if the service providers have 'actual knowledge' that the information they are storing or hosting is illegal, or an awareness of facts or circumstances that suggest illegality, and they fail to act "expeditiously to remove or to disable access to the information".

The government said: "We are looking at the legal liability that social media companies have for the illegal content shared on their sites. The status quo is increasingly unsustainable as it becomes clear many platforms are no longer just passive hosts."

"Whilst the case for change is clear, we also recognise that applying publisher standards of liability to all online platforms could risk real damage to the digital economy, which would be to the detriment of the public who benefit from them. That is why we are working with our European and international partners, as well as the businesses themselves, to understand how we can make the existing frameworks and definitions work better, and what a liability regime of the future should look like. This will play an important role in helping to protect users from illegal content online and will supplement our strategy," it said.
