The House of Commons committee on culture, media and sport said that a body should be set up to police user-submitted content because companies' own regulation of content was "unsatisfactory".
The committee has backed a proposal for a UK Council for Child Internet Safety to act as a self-regulatory body for the internet industry, compelling companies to take control of the content they publish.
The committee also said that companies should screen all content before it is hosted online. In the hearings leading to the committee's report, "Harmful Content on the Internet and in Video Games", Google told the committee that its video subsidiary YouTube received 10 hours of video material for posting every minute.
"Sites which host user-generated content – typically photos and videos uploaded by members of the public – have taken some steps to set minimum standards for that content. They could and should do more," said the report. "It is not standard practice for staff employed by social networking sites or video-sharing sites to preview content before it can be viewed by consumers. Some firms do not even undertake routine review of material uploaded, claiming that the volumes involved make it impractical."
"We were not persuaded by this argument, and we recommend that proactive review of content should be standard practice for sites hosting user-generated content," it said.
The committee also recommended that publishers provide a one-click method of reporting material to law enforcement agencies, not just to site owners.
The committee said that though self-regulation is attractive, it has failed in the internet industry and needs to be stricter.
"Rather than leap to statutory regulation, we propose a tighter form of self-regulation, under which the industry would speedily establish a self-regulatory body to draw up agreed minimum standards based upon the recommendations of the UK Council for Child Internet Safety, monitor their effectiveness, publish performance statistics, and adjudicate on complaints," said the committee's proposal. "In time, the new body might also take on the task of setting rules governing practice in other areas such as online piracy and peer to peer file-sharing, and targeted or so-called 'behavioural' advertising."
Internet service providers and website operators often refuse to pre-screen material because doing so makes it more likely that they will be held responsible for it in a defamation action or any other case based on the content of the material. If they have not pre-approved, or even seen, the material, it is far less likely that they will bear legal responsibility for it.
The committee said that this was an unfortunate consequence of the law, and urged the Government to clarify publishers' liability in such a way as to encourage responsible publishing.
"We do not believe that it is in the public interest for Internet service providers or networking sites to neglect screening content because of a fear that they will become liable under the terms of the EC E-Commerce Directive for material which is illegal but which is not identified," it said. "It would be perverse if the law were to make such sites more vulnerable for trying to offer protection to consumers."
"We recommend that Ofcom or the Government should set out their interpretation of when the E-commerce Directive will place upon Internet service providers liability for content which they host or to which they enable access. Ultimately, the Government should be prepared to seek amendment to the Directive if it is preventing ISPs and websites from exercising more rigorous controls over content," it said.
The committee also said it was concerned about children's privacy on social networking sites, and that such sites' privacy settings should be set to their most restrictive level by default.
"It is clear that many users of social networking sites, particularly children, do not realise that by posting information about themselves, they may be making it publicly available for all to see," it said. "We recommend that social networking sites should have a default setting restricting access and that users should be required to take a deliberate decision to make their personal information more widely available."