

Anonymising personal data need not guarantee privacy, says ICO, while German watchdog raises internet disclosure concerns


Data anonymisation does not have to provide a 100% guarantee of individuals' privacy in order for it to be lawful for organisations to disclose the information, the UK's data protection watchdog has said.

The view of the Information Commissioner's Office (ICO), detailed in a new code of practice (108-page / 2.15MB PDF) on anonymisation it has published, is that organisations that anonymise personal data can disclose that information even if there is a "remote" chance that the data can be matched with other information and lead to individuals being identified.

The ICO said that organisations that take action to mitigate the risk of anonymised data being used to identify individuals will be considered to have complied with the Data Protection Act (DPA) even if that action cannot eradicate the threat of the data being used to identify someone. The Act "does not require anonymisation to be completely risk free," it added.

The data protection authority in Hamburg, known for its strong stance on privacy issues, told Out-Law.com that it too acknowledged that the "re-identification" of individuals, achieved from matching anonymised data with other information in the public domain or held by others, was impossible to prevent in all cases.

"Our general stance towards anonymisation is not far off of that of our British colleagues," a spokesman for the Hamburg authority said. "German privacy law defines 'rendering anonymous' as 'the alteration of personal data so that information concerning personal or material circumstances cannot be attributed to an identified or identifiable natural person or that such attribution would require a disproportionate amount of time, expense and effort'. It is therefore acknowledged that the absolute impossibility for re-identification in practice cannot always be achieved. Obviously this is addressed by the ICO in terms of a 'remote risk' remaining."

Data protection law specialist Marc Dautlich of Pinsent Masons said: "The code is a very important one and has been published at a time when the Government is increasingly seeking to liberalise public sector-held datasets for research purposes, and when the private sector is increasingly exploiting data mining techniques for commercial purposes."

In a statement, the watchdog announced that a "consortium" involving the University of Manchester, the University of Southampton, the Office for National Statistics and the government’s new Open Data Institute (ODI) would set up a new UK Anonymisation Network (UKAN). The Network will "enable sharing of good practice related to anonymisation, across the public and private sector", with information provided through a website, case studies, clinics and seminars.

"What practical impact the new UK Anonymisation Network will have remains to be seen, but it could be a potentially valuable resource for organisations seeking guidance on their own anonymisation schemes," Dautlich added.

Under its code, the ICO said that it was not always possible for personal data to be anonymised. It said that it was therefore "paramount" that data which could not be anonymised was kept secure. It said, though, that it is generally "easier" to disclose anonymised data than it is to disclose personal data because "fewer legal restrictions will apply".

"There is clear legal authority for the view that where an organisation converts personal data into an anonymised form and discloses it, this will not amount to a disclosure of personal data," the ICO said. "This is the case even though the organisation disclosing the data still holds the other data that would allow re-identification to take place."

The ICO said that it can be difficult for organisations to know whether data they have anonymised can still be classed as 'personal data'. It said, though, that a High Court ruling had made clear that "the risk of identification must be greater than remote and reasonably likely for information to be classed as personal data under the DPA".

In "borderline" cases, organisations will have to assess the individual "circumstances of the case" to determine whether there is too great a risk that disclosing anonymised data would lead to individuals being identified, the ICO said.

"In borderline cases where the consequences of re-identification could be significant eg because they would leave an individual open to damage, distress or financial loss, organisations should: seek data subject consent for the disclosure of the data, explaining its possible consequences; adopt a more rigorous form of risk analysis and anonymisation," the ICO said. "In some scenarios, data should only be disclosed within a properly constituted closed community and with specific safeguards in place. In some particularly high-risk situations, it may not even be possible to share within a closed community."

In cases where the risk of data matching is high, the ICO said, organisations can reduce that risk by disclosing only "parts of databases" in order to make "direct linkage more difficult".
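By way of illustration only (the code does not prescribe a particular technique), the short Python sketch below releases just a whitelist of generalised fields, swapping a full date of birth for a year and a full postcode for its outward code, so that an exact join against another dataset becomes harder. The field names and data are invented.

    # Invented example of partial disclosure: generalise quasi-identifiers
    # and release only a fixed whitelist of fields.
    RELEASED_FIELDS = {"year_of_birth", "region", "diagnosis"}

    def prepare_release(record):
        generalised = {
            "year_of_birth": record["date_of_birth"][:4],  # "1984-03-07" -> "1984"
            "region": record["postcode"].split(" ")[0],    # "SW1A 1AA" -> "SW1A"
            "diagnosis": record["diagnosis"],
        }
        # Only whitelisted fields leave the organisation.
        return {k: v for k, v in generalised.items() if k in RELEASED_FIELDS}

    row = {"name": "A. Smith", "date_of_birth": "1984-03-07",
           "postcode": "SW1A 1AA", "diagnosis": "asthma"}
    print(prepare_release(row))
    # {'year_of_birth': '1984', 'region': 'SW1A', 'diagnosis': 'asthma'}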

Under freedom of information (FOI) laws, organisations asked to disclose anonymised data will have to consider whether a "particular member of the public" has additional information that "could allow data to be combined to produce information that relates to and identifies a particular individual – and that is therefore personal data," the watchdog added.
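The risk the watchdog is describing is sometimes called a "jigsaw" or linkage attack: a release containing no names is matched to other information on shared attributes. A deliberately simple Python sketch, using invented data, shows how little is needed for such a match:

    # Invented linkage-attack example: an anonymised release joined to
    # information the recipient already holds, on date of birth and postcode.
    release = [{"dob": "1984-03-07", "postcode": "SW1A 1AA", "diagnosis": "asthma"}]
    electoral_roll = [{"name": "A. Smith", "dob": "1984-03-07", "postcode": "SW1A 1AA"}]

    for anon in release:
        for known in electoral_roll:
            if (anon["dob"], anon["postcode"]) == (known["dob"], known["postcode"]):
                print(f"{known['name']} re-identified: {anon['diagnosis']}")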

The ICO said that organisations will generally not require the consent of individuals to disclose anonymised data, but warned that it may not always be appropriate to disclose such information if there is a risk that an "educated guess" could be made about the identity of the person whose data has been anonymised, particularly where that guess "leads to the misidentification of an individual".

The watchdog laid out a number of safeguards that organisations should put in place to limit access to anonymised datasets. It added that organisations anonymising personal information "need an effective and comprehensive governance structure" and that they should carry out "re-identification testing ... to detect and deal with re-identification vulnerabilities".
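The code does not set out how such testing should be performed. One simple check often used in practice, assumed here rather than taken from the ICO's code, is to count how many records share each combination of quasi-identifiers and flag any combination held by fewer than k records:

    # Assumed approach to re-identification testing (not the ICO's prescribed
    # method): flag quasi-identifier combinations shared by fewer than k records.
    from collections import Counter

    def k_anonymity_report(records, quasi_identifiers, k=2):
        counts = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
        return {combo: n for combo, n in counts.items() if n < k}

    data = [{"yob": "1984", "region": "SW1A"},
            {"yob": "1984", "region": "SW1A"},
            {"yob": "1990", "region": "EC2A"}]
    print(k_anonymity_report(data, ["yob", "region"]))
    # {('1990', 'EC2A'): 1} -- a unique record that may need further generalisation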

The ICO said that organisations that adhere to its recommendations should have a "reasonable degree of confidence" that their "publication of anonymised data will not lead to an inappropriate disclosure of personal data – through ‘re-identification’".

Technology law specialist Luke Scanlon of Pinsent Masons, the law firm behind Out-Law.com, said that the watchdog's stance on anonymisation was "practical" but questioned whether it was consistent with wording in the EU's Data Protection Directive.

Recital 26 of the EU's Data Protection Directive states that the "principles of protection must apply to any information concerning an identified or identifiable person" and that to "determine whether a person is identifiable, account should be taken of all the means likely reasonably to be used either by the controller or by any other person to identify the said person".

However, the recital also states that "the principles of protection shall not apply to data rendered anonymous in such a way that the data subject is no longer identifiable". It further adds that "codes of conduct ... may be a useful instrument for providing guidance as to the ways in which data may be rendered anonymous and retained in a form in which identification of the data subject is no longer possible".

"The recital appears to place the non-identifiability of the individual in absolute terms," Scanlon said. "There is no indication in the recital which indicates that the principles of protection would not apply if an individual is only no longer 'reasonably' identifiable or in circumstances where there is a remote risk of identifiability."

"Organisations therefore should remain cautious when using anonymised data, particularly if the use of such data would be in European jurisdictions other than the UK, wherever a conclusion can be drawn that there is a remote risk of identifiability," he said.

The privacy watchdog for the German region of Schleswig-Holstein, the Independent Centre for Privacy Protection (ICPP), which has been vocal on a number of data protection issues, told Out-Law.com that, in its view, both present and future risks must be taken into account when deciding whether to disclose anonymised data.

"The [German] legal commentary argues that in some cases (similar to the ICO) 100% anonymity is not possible to achieve, but that the risk has to be minimal," Marit Hansen, deputy Privacy & Information Commissioner in Schleswig-Holstein said.

"Further, the legal commentary demands that the available knowledge (whether lawfully available or not) has to be taken into account for assessing the possible risks of re-identification. It also stresses that the assessment result may change over time, e.g. if new methods become available to link information," she said.

"This may influence the way how to treat anonymised data: If you publish data on the Internet that have been anonymised and are sufficiently protected against re-identification at one point in time, a later assessment may reveal that the protection may not be regarded adequate anymore. But then harm may already be done, and it would not be sufficient to delete the data (copies may be available, the re-identification may have been conducted already). This means that in our point of view anonymisation does not only mean to assess the risk once, but also to think of future risks, act accordingly (e.g. to refrain from publishing these data on the internet) and assess the risk again if the conditions may have changed," Hansen said.
