The IAPP published an article, “Is encrypted data personal data under the GDPR?”, which provides a good summary of the issues surrounding encryption as a means to protect data when it is not in use. The issues change, however, when a data controller actually wants to make use of the data: as soon as the data is decrypted, it is indisputably personal data and, in its decrypted form, is no longer protected against misuse.
Encryption Does Not Support GDPR Compliant Data Use
Encryption does not protect personal data in use, because once decrypted the data is exposed and vulnerable to misuse. Similarly, Differential Privacy, Static Tokenisation and data masking do not protect personal data from unauthorized re-identification when data sets are combined and used for multiple purposes via the “Mosaic Effect.” In contrast, Pseudonymisation has gained attention through its explicit codification in the GDPR, and legal experts have highlighted the potential for new Pseudonymisation technologies to address the unique privacy issues raised by lawful possession and processing of personal data.
Article 4(5) of the GDPR now specifically defines Pseudonymisation as “the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person.”
Static Tokenisation (where a common token is used to replace different occurrences of the same value – e.g., replacing all occurrences of “James Smith” with “ABCD”) fails to satisfy GDPR definitional requirements, since unauthorized re-identification is “trivial between records using the same Pseudonymised attribute to refer to the same individual.”
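The linkability problem can be seen in a minimal sketch. The snippet below (an illustration only; the function and field names are hypothetical, not any vendor's implementation) uses a keyed hash as a static token: because the same input always yields the same token, an observer can link every record about one person without ever reversing the token.

```python
import hashlib

def static_token(value: str, secret: str = "table-key") -> str:
    """Static tokenisation: the same input always yields the same token."""
    return hashlib.sha256((secret + value).encode()).hexdigest()[:8]

records = [
    {"name": "James Smith", "purchase": "book"},
    {"name": "James Smith", "purchase": "laptop"},
    {"name": "Ann Jones", "purchase": "pen"},
]
tokenised = [{**r, "name": static_token(r["name"])} for r in records]

# Both "James Smith" rows carry an identical token, so their attributes
# can be combined (the Mosaic Effect) without reversing the token.
assert tokenised[0]["name"] == tokenised[1]["name"]
assert tokenised[0]["name"] != tokenised[2]["name"]
```

This is why static tokens are described as “persistent or recurring”: the token itself becomes a stable identifier for the data subject.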
As a result, Static Tokenisation does not satisfy the “Balancing of Interests” test necessary to meet Article 6(1)(f) requirements for Legitimate Interests processing, nor is it among the technical safeguards listed in Article 6(4) that help ensure secondary processing such as Analytics, AI & ML is a lawful, compatible purpose. The Article 29 Working Party has highlighted “the special role that safeguards play in reducing the undue impact on the data subjects thereby changing the balance of rights and interests to the extent that the data controller’s legitimate interests will not be overridden,” that “safeguards may include technical and organizational measures to ensure functional separation,” and that “Pseudonymisation…will play a role with regard to the evaluation of the potential impact of the processing on the data subject, and thus, may in some cases play a role in tipping the balance in favour of the controller.”
The Article 29 Working Party further highlights that “functional separation includes secure key-coding personal data transferred outside of an organization and prohibiting outsiders from re-identifying data subject[s]” by using “rotating salts” or “randomly allocated” dynamic tokens rather than static, persistent or recurring ones.
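To contrast with the static case, here is a minimal sketch of dynamic tokenisation under the Article 4(5) model (the class and method names are illustrative assumptions, not a real product API): each occurrence of a value receives a fresh random token, and the token-to-value map is the “additional information” that must be kept separately under technical and organisational controls.

```python
import secrets

class DynamicPseudonymiser:
    """Illustrative dynamic tokenisation: every occurrence of a value gets
    a fresh, randomly allocated token; the lookup table is the separately
    held 'additional information' of GDPR Article 4(5)."""

    def __init__(self) -> None:
        self._lookup: dict[str, str] = {}  # kept separately, access-controlled

    def tokenise(self, value: str) -> str:
        token = secrets.token_hex(8)  # random, non-recurring token
        self._lookup[token] = value
        return token

    def reidentify(self, token: str) -> str:
        # Only the holder of the lookup table, under functional
        # separation, can attribute tokens back to a data subject.
        return self._lookup[token]

p = DynamicPseudonymiser()
t1 = p.tokenise("James Smith")
t2 = p.tokenise("James Smith")

# The same person receives different tokens in different records, so the
# records cannot be linked without the separately held lookup table.
assert t1 != t2
assert p.reidentify(t1) == p.reidentify(t2) == "James Smith"
```

Because the tokens carry no stable relationship to the underlying value, combining tokenised data sets no longer links records about the same individual, which is the functional separation the Working Party describes.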
GDPR-compliant Pseudonymisation represents a unique means to support the actual use of data in lawful secondary processing such as Analytics, AI & ML by technically enforcing functional separation.
Anonos technology is the only technology that has been certified by EuroPrivacy as satisfying GDPR requirements for Pseudonymisation.