The research, carried out in 2012, evaluated the extent to which Facebook users' emotions were affected by friends' postings. It did so by manipulating the extent to which more than 689,000 users "were exposed to emotional expressions" in friends' postings on their 'news feed' on the site.
The report on the study said that the research was carried out in accordance with Facebook's 'data use policy' which users must agree to "prior to creating an account on Facebook". This meant the researchers had users' "informed consent" for participation in the research, the report claimed.
The ICO said it would question Facebook about the study and liaise with its counterpart in Ireland, the Office of the Irish Data Protection Commissioner (ODPC), on the issue. Facebook Ireland has responsibility for all Facebook users outside of the US and Canada. The ODPC is awaiting "a comprehensive response" from Facebook to questions it has raised about privacy and consent in relation to the company's research, according to a report by technology news website The Register.
Co-author of the study, Facebook's Adam Kramer, has admitted that the company "didn't clearly state our motivations" for the study.
"The goal of all of our research at Facebook is to learn how to provide a better service," Kramer said in a Facebook post. "Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."
"While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper," he said.
Data protection law expert Kathryn Wynn of Pinsent Masons, the law firm behind Out-Law.com, said the case demonstrates the problems with relying on user terms and conditions alone as the basis for processing personal data.
"Facebook appears to be conceding that the way it conducted this study has upset some of its users even if it does maintain the research was conducted in line with data protection rules," Wynn said. "The case highlights the benefit organisations can gain from focusing on user expectations regarding privacy and asking themselves whether those users would expect to be notified about the possible processing of their personal data and the specific purpose of the processing and, where required, asked for their consent to the activity."
"Viewing legal compliance as a mere 'tick-box' exercise risks upsetting consumers and is also likely to draw regulators into assessing whether the methods being relied on for compliance, in this case users' acceptance of Facebook's data use policy, are in fact sufficient. Reliance on terms and conditions which users may never have read, may not have read thoroughly, or may not have read in months or even years is a risky approach for organisations to take," Wynn said.
"While interaction with users on privacy issues may be an administrative burden for many traditional businesses, Facebook have the ideal platform with which to communicate with users about their personal data processing activities in a way that ensures transparency with users and compliance with data protection obligations," Wynn said.
The UK's Data Protection Act requires organisations to ensure that they process personal data fairly and lawfully and that they only collect personal data "for one or more specified and lawful purposes". The subsequent processing of that personal data "in any manner incompatible with that purpose or those purposes" is prohibited, meaning businesses may need to update or reissue privacy notices and, where applicable, obtain fresh consent.
The Article 29 Working Party, a committee that represents privacy watchdogs based across the EU, including the ICO, last year published guidance on purpose limitation.
The guidance outlined examples of what would generally be considered unacceptable explanations of the purpose for collecting personal data. However, it said that, ultimately, it would be the "particular context in which the data are collected" as well as the "personal data involved" that would determine how descriptive organisations would need to be about the purpose of their collection of personal data.