Google's 'First Click Free' policy allowed newspapers to show articles in Google search results even when those articles were part of subscription-only services. Users could read that first article, but when they clicked through to other pages they were presented with a screen demanding payment.
Google has now altered that system to allow publishers to restrict any single web user to seeing just five such pages in any one day.
"First Click Free is a great way for publishers to promote their content and for users to check out a news source before deciding whether to pay," said Josh Cohen, senior business product manager at Google. "Previously, each click from a user would be treated as free. Now, we've updated the program so that publishers can limit users to no more than five pages per day without registering or subscribing."
"If you're a Google user, this means that you may start to see a registration page after you've clicked through to more than five articles on the website of a publisher using First Click Free in a day," he said.
Some newspaper publishers have criticised Google for its use of their material. News International head Rupert Murdoch has been particularly critical recently, calling Google a "parasite".
Newspapers can easily hide their sites from Google, though, by placing a small file called robots.txt at the root of their websites or by adding a robots meta tag to individual pages. Collectively known as the Robots Exclusion Protocol, these instructions tell search engines not to crawl or index sites or pages. Publishers are reluctant, though, to forgo the traffic that Google can deliver.
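As a sketch of how the Robots Exclusion Protocol works in practice, the snippet below parses a sample robots.txt file with Python's standard-library robotparser and checks what Google's crawler would be permitted to fetch. The host and paths are hypothetical, chosen only for illustration:

```python
# Sketch: how a crawler that honours the Robots Exclusion Protocol
# interprets a robots.txt file. Host and paths are hypothetical.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /subscribers/

User-agent: *
Disallow:
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot is told to stay out of the subscriber-only area...
print(rp.can_fetch("Googlebot", "http://example.com/subscribers/story.html"))
# ...but may crawl the rest of the site.
print(rp.can_fetch("Googlebot", "http://example.com/news/story.html"))
```

A real deployment simply serves a file like the `ROBOTS_TXT` string above at the site root (e.g. example.com/robots.txt); well-behaved crawlers fetch and obey it before indexing anything.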
Google's Cohen said that the company would also now begin to index article preview pages and label them as 'subscription' content.
"We will crawl, index and treat as 'free' any preview pages – generally the headline and first few paragraphs of a story – that [publishers] make available to us," he said. "We will then label such stories as 'subscription' in Google News."
Cohen warned, though, that publishers who put content behind paywalls could struggle to find an audience for that content.
"The ranking of these articles will be subject to the same criteria as all sites in Google, whether paid or free. Paid content may not do as well as free options, but that is not a decision we make based on whether or not it's free. It's simply based on the popularity of the content with users and other sites that link to it," he said.
The newspaper industry has produced an alternative system to Google's which it hopes search engines will adopt. Called the Automated Content Access Protocol (ACAP), it is designed to give publishers more options for how they want their material to appear in search engines than is currently available with the Robots Exclusion Protocol.
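ACAP 1.0 was published as an extension of robots.txt: ACAP-prefixed fields sit alongside the standard directives and are meant to express finer-grained permissions. A rough, hedged sketch of what such a file might look like (the paths are hypothetical, and exact field names should be checked against the ACAP specification):

```
# Standard Robots Exclusion Protocol directives
User-agent: *
Disallow: /archive/

# ACAP 1.0 extension fields (hypothetical paths)
ACAP-crawler: *
ACAP-disallow-crawl: /archive/
ACAP-allow-crawl: /news/
```

Crawlers that do not understand ACAP simply ignore the unfamiliar fields and fall back on the standard directives, which is part of Sullivan's point below.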
Search engine expert Danny Sullivan noted this week, though, that publishers, including Independent News and Media, the newspaper group most fervently behind ACAP, use ACAP only to perform functions that robots.txt is already capable of performing.
"There’s nothing I see within ACAP that provides some type of crucial control that if only news publishers had, all their online woes would be over," said Sullivan.