
Out-Law News

MPs call for ethics-based internet regulation


A new code of ethics should govern the removal of harmful content from the internet and there should be "large fines" for technology companies that fail to comply with it, a prominent group of MPs has said.

The Digital, Culture, Media and Sport (DCMS) Committee made the recommendations in a wide-ranging new report on disinformation and 'fake news' (111-page / 3.59MB PDF).

The Committee said the new code of ethics, which would be similar to the broadcasting code that governs broadcasting standards in the UK and is overseen by Ofcom, would set out "clear legal liabilities … for tech companies to act against harmful or illegal content on their sites".

The new regulation should be supported by the creation of a "new category of tech company", which would mean such companies no longer have to be treated as either a ‘platform’ or a ‘publisher’ for the purposes of establishing their legal liability, it said.

"This approach would see the tech companies assume legal liability for content identified as harmful after it has been posted by users," the Committee said.

Currently, EU law prohibits member states from forcing online service providers, such as social media companies and online platforms, to pro-actively monitor for illegal activity on their services.

The E-Commerce Directive also serves to restrict when online service providers are considered liable for material that they neither create nor monitor but simply store or pass on to users. The providers are generally not liable for unlawful information they transmit, temporarily store or host on behalf of publishers.

That exemption is lost, though, when providers have 'actual knowledge' that the information they are storing or hosting is illegal, or an awareness of facts or circumstances that suggest illegality, and they fail to act "expeditiously to remove or to disable access to the information".

In the UK, a service provider must be aware of the location of the harmful content on its platform, and not just its mere existence, before 'actual knowledge' can be said to have been acquired and the requirement to act expeditiously applies.

In their report, the MPs said an independent regulator should be appointed to oversee compliance with the new code of ethics. They said the regulator should be free to act on complaints from the public, be given powers of audit and have scope to fine technology companies that fail to adhere to the new code, and that it should be funded through a levy on technology companies.

"The code of ethics should be developed by technical experts and overseen by the independent regulator, in order to set down in writing what is and is not acceptable on social media. This should include harmful and illegal content that has been referred to the companies for removal by their users, or that should have been easy for tech companies themselves to identify," the DCMS Committee said.

"The process should establish clear, legal liability for tech companies to act against agreed harmful and illegal content on their platform and such companies should have relevant systems in place to highlight and remove ‘types of harm’ and to ensure that cybersecurity structures are in place. If tech companies (including technical engineers involved in creating the software for the companies) are found to have failed to meet their obligations under such a code, and not acted against the distribution of harmful and illegal content, the independent regulator should have the ability to launch legal proceedings against them, with the prospect of large fines being administered as the penalty for non-compliance with the code," it said.

"This same public body should have statutory powers to obtain any information from social media companies that are relevant to its inquiries. This could include the capability to check what data is being held on an individual user, if a user requests such information. This body should also have access to tech companies’ security mechanisms and algorithms, to ensure they are operating responsibly," the Committee said.

The Committee also backed calls from the UK's data protection watchdog, the information commissioner, to extend the "protections of privacy law" to information models that are "used to make inferences about individuals, in particular during political campaigning".

"This will ensure that inferences about individuals are treated as importantly as individuals’ personal information," it said.

The DCMS Committee's report was issued after an inquiry lasting almost 18 months. The inquiry examined where biased but legitimate commentary can stray into propaganda and lies, and how fake news can influence the public's understanding of the world. It also looked into whether changes in the selling and placing of advertising have encouraged the growth of fake news.

The inquiry developed into a deeper examination of issues such as data sharing and guardianship by social media companies, political fundraising and foreign interference in elections.

The Committee said the Cambridge Analytica scandal was "facilitated by Facebook’s policies" and took issue with the social networking giant's practices on sharing data with developers, alleging that the company "intentionally and knowingly violated both data privacy and anti-competition laws". In response, Facebook told Out-Law.com that it rejects all claims that it breached data protection and competition laws.

The Committee concluded that data is being used by businesses to influence political decision-making by the UK electorate, and that UK electoral law is "not fit for purpose" and needs to be updated to account for "online, microtargeted political campaigning".

The Committee also called for clarity on how many investigations are currently being carried out into Russian interference in UK politics, and urged the government to open an independent investigation into a number of votes held in the UK in recent years, including the UK general election in 2017, the EU referendum in 2016 and the Scottish independence referendum of 2014.

The investigations would "explore what actually happened with regard to foreign influence, disinformation, funding, voter manipulation, and the sharing of data, so that appropriate changes to the law can be made and lessons can be learnt for future elections and referenda", the Committee said.
