The Article 29 Working Party said, though, that the networks can process the images legitimately without the consent of those featured in the photos under EU data protection laws in order to assess whether that consent has been given. However, it said that sites processing images in order to verify consent must delete that information "immediately after" that processing is complete.
"Only those registered users who have a reference template enrolled in the identification database will match against these new images and therefore have a tag suggested automatically," the Working Party said in an opinion on facial recognition in online and mobile services. (10-page / 53KB PDF)
"If the consent of the individual was to be considered as the only possible legitimate basis for all processing the entire service would be blocked as, for example, there is no means to gain consent of non-registered users who may have their personal data processed during face detection and feature extraction," the watchdog said.
"Furthermore it would not be possible to distinguish between the faces of registered users who had and had not consented without first performing facial recognition. Only after identification has taken place (or a failure to identify) would a data controller be able to determine whether or not they have the appropriate consent for the specific processing," it said.
Facebook is one social network that uses facial recognition technology to automatically suggest the names of people featured in photos uploaded by users.
Users can 'tag' themselves and their friends in photos they upload to the site. The tag labels the pictures with pop-up captions to enable people who view the photos to identify who is in the shot by hovering their cursor over the picture. The company launched the feature in 2010 for users in the US and it is now widely available to users in most countries.
Social networks need consent from users in the first place in order to store "templates" of images that can be used to verify users' identity, the Working Party said.
The group said that the social networks must have the consent of the "image uploaders" in order for processing of those images to "take place for the purposes of facial recognition." Registered users must be "clearly informed" that images they upload "will be subject to a facial recognition system" before they upload them.
Under the EU's Data Protection Directive personal data can only be processed under strict conditions. Personal data must be "processed fairly and lawfully" and generally it can only be collected for "specified, explicit and legitimate purposes and not further processed in a way incompatible with those purposes".
Organisations must either obtain "unambiguous consent" from individuals before processing is lawful or satisfy one of a number of other conditions. One of those conditions is that the processing "is necessary for compliance with a legal obligation to which the [data] controller is subject".
The Working Party, which is a committee featuring representatives of all the EU's national data protection regulators, said that social networking sites "must take appropriate steps to ensure the security" of images as they are being uploaded to the site. "This may include encrypted communication channels or encrypting the acquired image itself. Where possible, and especially in the case of authentication/verification, local processing should be favoured," it said.
"Technical controls" should also be installed to "reduce the risk that digital images are further processed by third parties for purposes for which the user has not consented to", the watchdog said. It added that, where settings automatically restrict third parties' access to the content, users of social networking sites should have access to "tools" in order to "control the visibility of their images that they have uploaded".
Social networking sites must also ensure that image "templates" they create do not contain more information about individuals than is "necessary" in order for those individuals to be identifiable during facial recognition processing.
Users of social networks also have to be given "appropriate mechanisms" in order to "exercise their right of access" to both the images and their templates, the Working Party said.
Social networks would probably need to store templates in order to use them to authenticate and verify individuals' identities, but they must ensure that this information is kept securely, the Working Party said.
"The data controller must consider the most appropriate location for storage of the data," it said. "This may include on the user’s device or within the data controller’s systems."
"The data controller must take appropriate steps to ensure the security of the data stored. This may include encrypting the template. It should not be possible to obtain unauthorised access to the template or storage location. Especially for the case of facial recognition for the purpose of verification, biometric encryption techniques may be used; with these techniques, the cryptographic key is directly bound to the biometric data and is re-created only if the correct live biometric sample is presented on verification, whereas no image or template is stored (thus forming a type of 'untraceable biometrics')," the Working Party said.
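The key-binding idea the Working Party describes can be illustrated with a toy sketch. This is not a production biometric encryption scheme (real systems use fuzzy commitment or similar error-tolerant cryptography); the function names, feature vectors and quantisation step below are illustrative assumptions. The point it demonstrates is the one in the opinion: a key is derived directly from the biometric sample and re-created only when a sufficiently close live sample is presented, while no image or template is ever stored.

```python
import hashlib

QUANT_STEP = 0.25  # coarse quantisation absorbs small capture noise (toy choice)

def derive_key(features):
    """Derive a cryptographic key directly from a biometric feature vector.

    The vector is coarsely quantised so that a slightly noisy re-capture of
    the same face falls into the same buckets and therefore yields the same
    key. Neither the raw features nor any template is retained.
    """
    buckets = tuple(round(f / QUANT_STEP) for f in features)
    return hashlib.sha256(repr(buckets).encode()).digest()

def enroll(features):
    """Store only a one-way verifier of the key, never the image or template."""
    return hashlib.sha256(derive_key(features)).hexdigest()

def verify(stored_verifier, live_features):
    """Re-create the key from the live sample and check it against the verifier."""
    candidate = hashlib.sha256(derive_key(live_features)).hexdigest()
    return candidate == stored_verifier

# Enrolment sample and a slightly noisy re-capture of the same face.
enrolled = [0.61, -1.20, 0.33, 2.05]
verifier = enroll(enrolled)

same_face = [0.60, -1.21, 0.34, 2.04]    # small noise, same buckets: verifies
other_face = [1.90, 0.02, -0.75, 0.40]   # different face: fails to verify
```

In a real deployment the error tolerance would come from error-correcting codes rather than naive quantisation, but the privacy property is the same: only the verifier is stored, so a breach of the storage location yields neither a face image nor a reusable template.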
Facebook was the subject of an audit by the Irish data protection regulator last year over its privacy practices and policies. In its audit report the Office of the Irish Data Protection Commissioner said the company's decision to introduce facial recognition technology on an 'opt-out' basis should have been handled "in a more appropriate manner". In response Facebook Ireland said it would notify users up to three times in order to give them more information on adjusting their settings for the feature.