Legitimate Interest Processing Webinar
(60 Min replay)

Presentation Transcript
Gary LaFever
CEO & General Counsel
Anonos
Dr. Sachiko Scheuing
European Privacy Officer,
Acxiom
Martin Abrams
Chief Strategist
Information Accountability Foundation (IAF)
Gary LaFever (Anonos):
[00:04] Thank you, everyone, for joining us today! We're very appreciative of the assistance of the Data Protection World Forum and my co-hosts today. We first wanted to thank all of you for making time for this webinar in the uncertain times in which we are currently living. I'm sure all of us have had loved ones, family, coworkers, and colleagues impacted by the Coronavirus (COVID-19). I think everyone is still committed to doing business and having commerce and society continue. This is how we will get through this.

[00:39] We will be using a particular case study to give the topic of “Pseudonymisation-Enabled Legitimate Interest Processing” relevance in today’s uncertain times. The case study will be direct marketing. In such uncertain times, we think that direct marketing is critical because for an unknown period of time going into the future, we will not have the opportunity to communicate with and to do business with customers in many of the traditional ways. And so, direct marketing will be important. How do we get communications to customers in a way that enables us to meet their needs, as well as the needs of the commercial entities involved, all in a way that satisfies new legal and ethical requirements?

[01:29] My name is Gary LaFever. I am the CEO and General Counsel of Anonos, which is a Pseudonymisation technology company. Anonos has over 8 years of R&D and subject matter expertise in Pseudonymisation. You may think: “Well, the GDPR hasn't been around for 8 years.” The reality is that Anonos has been working for over 8 years on technology that enforces Organisation for Economic Co-operation and Development (OECD) and Fair Information Practice Principles (FIPPs) concepts that are now reflected in laws such as the GDPR. My dual roles as CEO and General Counsel at Anonos are highly relevant because, as you will see from this webinar, in order to do business effectively under the GDPR and other evolving data privacy laws, you need to not only have knowledge of the law and awareness of business objectives, but you must also have technology enforced accountability controls that are demonstrable and verifiable. Sachiko, would you please introduce yourself?
Dr. Sachiko Scheuing (Acxiom):
[02:27] Sure. Thank you, Gary. My name is Sachiko Scheuing. I am the European Privacy Officer of Acxiom, a global leader in the marketing services industry.
Gary LaFever (Anonos):
[02:35] Marty, would you please introduce yourself?
Martin Abrams (IAF):
[02:37] I am Marty Abrams. I am the Chief Strategist and Leader of The Information Accountability Foundation (IAF). The IAF is mostly business funded, but we also work with regulators to evaluate assessment processes for ethical use of data. So, we work with both industry as well as with regulators.
Gary LaFever (Anonos):
[03:01] For any questions that are not addressed during the webinar, or if you'd like to get copies of the slides, feel free to email LearnMore@anonos.com. And with that, we will start.
[03:21] We will focus on the particular use case of direct personal marketing to give relevance to the general topic of Pseudonymisation-Enabled Legitimate Interest Processing. But the principles we're going to discuss are going to apply to more than just marketing, direct marketing, and personal marketing. And the reality is if you can't answer “yes” to all four of these questions, you have to stop personal marketing. And this is not just an alarmist comment. We will walk through some recent regulator enforcement actions and draft guidance that makes this very clear.

[04:01] So, let's walk through these four questions. First, does your technology demonstrably and verifiably enforce policies? What's important is that these policies protect data when in use. That's where a lot of companies fall short. We will go into more detail on this point. Protecting data when in use is necessary for lawful legitimate interest processing. An example is encryption, which a lot of people rely upon - and they should. But encryption protects data at rest and in transit, not in use. So, the first question is “Does your technology demonstrably and verifiably protect data in use?”

[04:48] Secondly, are you using dynamically changing identifiers? The standard for years has been static identifiers, but static identifiers are not adequate anymore if you want to rely on the legal basis of legitimate interest processing. In today’s webinar we will provide several interactive websites that you can go to for more information. The first interactive website we're highlighting here is www.MosaicEffect.com. The example provided for this question is persistent marketing identifiers, which are used rampantly throughout the industry. There is nothing wrong with persistent marketing identifiers unless you're trying to justify the use of legitimate interest processing, because static identifiers do not protect against re-identification via the Mosaic Effect. So, that's the second question. Are you using dynamically changing identifiers?
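To make the linkage risk concrete, here is a minimal, invented sketch (all datasets, identifiers, and values are hypothetical, made up purely for illustration) of why a static identifier enables the Mosaic Effect: the unchanging ID acts as a join key across independently collected datasets.

```python
# Toy illustration of the Mosaic Effect: a static, persistent identifier
# acts as a join key, so independently collected datasets can be merged
# back into a single per-person profile. All records here are invented.

purchases = [
    {"id": "7abc1a23", "item": "prenatal vitamins", "store": "A"},
    {"id": "99fe0b1c", "item": "coffee", "store": "A"},
]
location_pings = [
    {"id": "7abc1a23", "postcode": "SW1A 1AA", "hour": 8},
    {"id": "7abc1a23", "postcode": "EC2V 7HH", "hour": 13},
]

# Because the identifier never changes, anyone holding both datasets
# can rebuild a mosaic of one person's life with a trivial join:
profiles = {}
for record in purchases + location_pings:
    profiles.setdefault(record["id"], []).append(
        {k: v for k, v in record.items() if k != "id"}
    )

# One static ID now links shopping habits to daily movements.
print(len(profiles["7abc1a23"]))  # 3 records tied to the same person
```

No names appear anywhere in these records, yet the static token alone is enough to assemble a profile - which is exactly why static tokenization falls short of the GDPR's definition of Pseudonymisation.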

[05:37] The third question is whether your technology satisfies the new requirements for Pseudonymisation. Many people are told by their legal counsel: “Don't Pseudonymise your data, anonymise it. Why? Because if you anonymise data then it is outside of the scope of the GDPR.” But Sachiko will touch upon how, while that is a correct statement, it limits your use of data. And we will touch upon the fact that Pseudonymisation is actually highly rewarded under the GDPR and has very specific requirements that are not what you probably think.

[06:14] The fourth question is: “Does your Pseudonymisation technology satisfy new EU standards?” The focus of our conversation today is that regulators are concerned that what is missing is actual technical enforcement that can prove data minimization, purpose limitation, and data protection by design and by default - necessary because policies alone are simply not enough anymore.
[06:44] So, direct marketing is the case study that we will talk about. We want to avoid a world where there is no advertising or direct marketing with any level of personalization or even minimal analysis. If you're trying to reach your customers in today's uncertain society, you will need to be able to speak to them in a language that they understand. Today, we will touch upon how consent, contract, and anonymisation simply don't get you all the way to where you want to be. However, you can embrace the concept of Pseudonymisation - which has been newly defined under the GDPR - to actually enable you to do business. What none of us want is for regulators to eliminate personalized, segment-based marketing. You may think this is an overstatement, but by the end of today’s webinar, I think you will see that it's not. So, with that, Marty, would you please touch upon the ICO draft code?
Martin Abrams (IAF):
[07:50] Sure. Thanks so much. Regulators, particularly European regulators, are increasingly cynical that processing is conducted in a manner that is lawful first and then under control. “Lawful first” means that observed data - the data collected from cookies, for example, but in other settings as well - is collected based on consent. The ePrivacy Directive and its enactment in national laws requires consent for that actual collection, but the continued processing must be done in a lawful manner. So, this whole question of how you do segmentation of markets has to be handled lawfully, and “lawful” at the end of the day is really simple. It means that the processing is done under control, and we'll increasingly talk about what it means for it to be done under control.

[08:49] The UK ICO, the Information Commissioner, just completed a consultation on the direct marketing code of practice. And that code of practice, when finalized, will then be sent to parliament. So, it's got some legal stature. This follows guidance from the Belgian Commissioner on direct marketing as well, but we're going to focus on the ICO. The ICO is indicating that segmentation is profiling. In other words, this process of taking populations and separating them into those who might want to buy your product and those who are not a good candidate for your product is profiling, and they're also suggesting this profiling has legal or similarly significant effect. Therefore, since segmentation as they're looking at it has legal or similarly significant effect, it requires consent. And every additional step after that also has to be done under consent, and that consent is not the old-fashioned consent. It's the consent that is required by the GDPR.

[10:03] The reason that regulators are going there is because, to be honest, the previous guidance had been very open to using legitimate interest for marketing. The reason they are going there now is they don't trust industry to have rules in place in companies that inform, and then to enforce those rules with verifiable technology controls. In other words, the rules say what you can and can't do with the data when the data has impact and when it does not, and then you enforce those rules with technology.

[10:37] The IAF, my organization, was going to do a session in Ireland at the end of this month, under sponsorship from the Irish Commissioner, on how you do legitimate interest balancing. And the reason the Irish Commissioner was asking us to do that is they don't believe they've seen demonstrations by business that indicate businesses understand what it means to do a balancing process. So, there's little evidence that organizations are doing things under control. Legitimate interest requires not just an assessment showing that the interest of the organization and the full range of interests of the individuals impacted have been weighed. It also requires that you be able to demonstrate that you have the controls to enforce those rules, to make sure that balance is maintained when you're actually doing the processing. And this carries through much further.

[11:37] The fact is, segmentation is a process that we've often talked about as thinking with data. And thinking with data in theory, if you have the right controls in place, has no impact on individuals. It’s about beginning to understand who might be a customer for your product and who may not. So, it has no effect. And if it has no effect, then it should not require consent. For that to be proved to regulators, there needs to be processing under controls with effective Pseudonymisation. And that means the data cannot be connectable to the individual without system permissions that you as the Privacy Officer or Data Protection Officer govern. That means no Mosaic Effect, and we may explain the Mosaic Effect in this discussion.

[12:29] The IAF has pushed back on this guidance from the ICO. In essence, we're arguing that regulators should be less cynical about the ability of organizations to put controls in place. But for that to happen, organizations actually need to be able to demonstrate that they can do segmentation without impact on the individuals subject to the segmentation themselves. In other words, this understanding of the market can be done without having impacts on individuals that are not anticipated.
Gary LaFever (Anonos):
[13:05] Thank you, Marty. Now, Sachiko is going to highlight two comment letters that were filed in response to the ICO draft code.
Dr. Sachiko Scheuing (Acxiom):
[13:14] Yes. So, here are two comments from the two UK marketing trade associations. The IAB said the ICO should not be prescriptive about which legal bases are or are not available for a particular processing activity. And they continue: “Our view is that subsequent processing of personal data is subject to the legal basis provisions of the GDPR. Legitimate interest is therefore a possible legal basis for processing personal data in these circumstances.”

[13:52] A similar comment came from the DMA (Data & Marketing Association) in the UK. Quoting from their response that I have here in bold: “By more or less denying legitimate interest for marketing purposes, ICO is ignoring the fundamental premise of the GDPR and the EU Charter of Fundamental Rights.” DMA’s response is also interesting because it touches on the topic of Pseudonymisation. I quote here again: “Pseudonymisation, for instance, helps marketers gain insights into their clients while the identities of the data subjects are protected. Consent provides a stronger level of permission initially and acts as a gateway for subsequent legitimate interest use.” Gary, to you.
Gary LaFever (Anonos):
[14:52] Thank you, Sachiko. We've already had several requests for copies of the slides. As I mentioned, if you send an email to LearnMore@anonos.com, we will provide you with copies of the slides, as well as a replay of the webinar itself, and a summary and a link to the comment letters, so everyone can have access to all the information that we are covering. On this slide, we are highlighting the comment letters filed by Marty's organization, The Information Accountability Foundation, and by my company, Anonos. I would like to highlight four questions. We began and ended the Anonos comment letter with these four questions, and we genuinely hope that the ICO will provide answers to these four questions.

[15:37] The first question is: “May different legal grounds co-exist to support separate processes comprising lawful direct marketing, or must a single, unitary legal basis be established to support all end-to-end processing steps (e.g., collection, analytics, outreach, etc.) of personal data for direct marketing?” Now, many of us would have thought that this question didn't have to be asked, but when you read the ICO draft code, you have to ask the question because it really comes through that consent is the only legal basis that works. The second question is: “Can direct marketing itself serve as the purpose for which data is collected based on consent?” If there's appropriate notice to the data subject that their data will be used in a legitimate interest fashion leveraging Pseudonymisation-enabled controls to enforce data minimization, purpose limitation, data protection by design and by default, can direct marketing itself be the purpose? Again, one would have thought that question didn't have to be asked, but when you read the draft code, you feel it must be. Question number three is: “Can the further processing of personal data for direct marketing purposes be based on Legitimate Interests when supported by pseudonymised microsegments to respect and enforce the fundamental rights of data subjects?” The fourth and last question is: “Does all profiling necessarily constitute automated decision making?” If you're interested, the IAF and Anonos comment letters are available at www.MicroSegmentation.com/CommentLetters.
Gary LaFever (Anonos):
[17:02] This next slide graphically highlights what would happen if the ICO draft code were adopted in its current form: business would be stopped, like hitting a brick wall. And it's because you would rely only on consent - Sachiko is going to go into a lot more detail on this, and Marty already referred to it. This is not old-fashioned consent. It's new GDPR consent, and it's very restrictive - purposely, to protect the rights of the data subject. But in doing so, you would lose the ability to do a lot of thinking with data and iterative processing. And so, saying that you can only use consent is a problem. Consent is critical and is at the heart of the GDPR. But there's a reason there are another five legal bases, and the picture on the far right highlights that there's meant to be a doorway for further processing if controls are in place that can show you're managing the risks to data subjects. A focus of this presentation is on how Pseudonymisation enables that to happen. So with that, Sachiko, would you like to pick up on this slide?
Dr. Sachiko Scheuing (Acxiom):
[18:12] Sure. Thank you, Gary. So, the importance of legitimate interest processing for ongoing lawful direct marketing and/or other innovative uses of data is highlighted by the limitations of consent, contract, and anonymisation. So, this is why we have these three points on this slide. So, let us begin by taking a look at consent. The GDPR sets a very high bar for consent. There are actually four requirements for consent which I would like to highlight here. First of all, consent must be freely given. Examples of situations where the freely given requirement is not satisfied include cookie walls, where visitors may not access the site unless they agree to the use of cookies for tracking; employers' processing of employee personal data; school systems' processing of student personal data; and clinical studies' processing of research subject personal data. All of these are due to an imbalance of power between the parties. GDPR Recital 43 also states that if a controller does not seek separate consent for each purpose, there is a lack of the freedom necessary to use consent as a legal basis.

[19:46] Secondly, consent must be specific. The requirement for specificity is closely aligned with the requirement for purpose limitation. The Article 29 Working Party, which is now the EDPB, has stated that a vague or general purpose - such as using data to improve users' experience, or for marketing purposes, IT security purposes, or future research - does not satisfy the specificity requirements for consent to serve as a lawful basis for processing. Another point showing that consent has to be specific came back in 2018. In their decision against Vectaury, a demand-side platform or DSP, CNIL, the French Regulator, said that showing the message “authorize app for this and that” to access the position of your device, and then asking the user to click either the accept or refuse button, is not specific enough for the company, in this case Vectaury, to use the data.

[21:02] Freely given. Specific. And the third requirement of consent would be to be informed. Consent must be informed. GDPR Recital 42 requires that the identification of all controllers and joint controllers must be specified at the time of initial collection in order for consent to be informed. This means the initial data controller must identify itself - and here's the thing - must also name all future third party controllers at the time of initial data collection. So, you know, the point is who can actually tell the future third party controllers in the AdTech context? That would be very difficult. If a party purchases consented data, that consent is only valid for a new party processing the data that was specifically identified at the time the data was initially collected using consent.

[22:09] And now, I move on to the last important requirement, which is that consent must be unambiguous. The CNIL, once again the French Regulator, fined Google for using pre-ticked boxes to authorize the use of personal data since there was no unambiguous or clear affirmative action taken by the user.
[22:34] So, now, let's take a look at “necessary for contract” as a legal ground for processing. The European Data Protection Board has held that “necessary for contract” should be strictly interpreted to cover the minimal requirements under a contract and no more. So, for instance, a contract is not considered an appropriate lawful basis for improving a service or developing new functions within an existing service. The rationale is that a customer enters into a contract to procure a service or product in the format that it exists in at the time of entering into the contract. While the right to receive improvements may be included in the contract, these improvements are not objectively necessary for the performance of the contract.

[23:33] Another reason why I think a contract is not considered a suitable legal ground for building consumer segments based on users' tastes and lifestyle choices is that a data controller is not contracted to carry out profiling, but rather to deliver particular goods or services. Here, I'm actually using the word “consumer segment” as being similar to profiling in our context.
[24:02] Let's take a look at anonymisation under the GDPR. Under the GDPR, for data to be anonymous, the data must not be capable of being cross-referenced with other data to reveal identity, taking into account all the means reasonably likely to be used, such as singling out, either by the controller or by another person, to identify the natural person directly or indirectly. This is a very high standard, and it is required because if data does satisfy the requirements for anonymity, it is treated as being outside the scope of legal protection under the GDPR. Why? Because of the very safe and protected nature of data that actually satisfies the requirement of not being cross-referenceable or re-identifiable. However, very often in the context of AdTech, data is only meaningful if it can be cross-referenced.

[25:12] Now, according to the joint Spanish AEPD (Agencia Española de Protección de Datos) and EDPS (European Data Protection Supervisor) guidance, anonymisation could not support direct marketing to customers since their identity could not be revealed. Well, I think that this is a very blanket statement made without recognizing the different ways that data can be used. Using data for creating insights, for instance, is very different from using data to deliver a specific ad to a specific device. That's what Marty just earlier referred to as “thinking with data” by creating insights as opposed to acting with data or doing with data by showing and delivering the specific ad.

[26:08] But let me continue. The AEPD and EDPS guidance states that anonymisation procedures must ensure that not even the data controller is capable of re-identifying the data holders in an anonymised file. So, that means unless serious measures are taken - like using data attributes aggregated on, say, a neighborhood or micro-geographic level instead of data attributes on a personal level - regulators will not consider data anonymous. This leads us to legitimate interests processing as an available legal basis for direct marketing as well as other innovative uses of data. GDPR Recital 47 explicitly states that the processing of personal data for direct marketing purposes may be regarded as carried out for legitimate interest. The term “may be” is used because you always need to carry out a balancing exercise, of course.

[27:22] An established requirement for relying on legitimate interest processing is the application of a three-part test to show that data subjects’ rights and interests are considered and protected. Those three tests being: One, the purpose test: “Are you pursuing a legitimate interest?” Second, the necessity test: “Is the processing necessary for the desired purpose?” And thirdly, the balancing of interests test: “Are technical and organizational safeguards in place so that data subjects’ interests do not override the legitimate interests of the data controller or third party in the result of the desired processing?” I will now transition to Marty Abrams to discuss why demonstrable technology-enforced accountability is critical for satisfying the critical third test, the balancing of interests test. Marty?
Martin Abrams (IAF):
[28:25] Thank you very much. So, the fact is that demonstrable accountability is the standard that regulators are beginning to look to, first starting in Canada but spreading beyond Canada. And it requires that you have the assessment processes at every stage to balance and understand the interests of all the stakeholders who are impacted by the processing. And you need to be able to determine that there is no legal or similarly significant effect if you don't have consent. So, that requires that you have a rule set. And that rule set has to be very specific, and it has to lead you to doing assessments that you are free to share with regulators when asked. But the regulator is going to ask you: “How do you enforce those rule sets that you put in place to make sure that the use of the data is balanced?” And you need to be able to show that the technology is indeed in place and that you, at the end of the day, can demonstrate that that technology was effective in protecting the interests of the individuals.

[29:39] So, you need to have a Pseudonymisation process to back up your use of legitimate interest, one that in turn proves there is not a risk of unintended re-identification of the individual by some participant in the process - to avoid this concept of the Mosaic Effect. So, you need to be able to demonstrate from soup to nuts that you understand rules need to be in place; that you have those rules in place; that they have led to the right type of assessments and decisions; that you then have the Pseudonymisation technology to assure that the interests of the individual are maintained while you process the data for segmentation; and that you can then, once you have the insights, re-identify cohorts of individuals that might meet your market and market to those individuals. So, it has to be a soup-to-nuts process. You need to have the policies, the implementation standards, the implementation of those standards, the technologies that support them, and last, the demonstration that those technologies work.
[30:57] So, if you think about the old days - and the old days are not that far in the past - data was connected fairly robustly from multiple endpoints, and that data was really put together without any regard for having controls in place. As we think about implementation of the GDPR, when you're thinking about consent-only based marketing, that tends to be within very limited, defined silos. So, data that was collected in one silo stays in that silo, is processed within that silo only, and leads to marketing to individuals based only on the rules in that silo. You can't in any way push the data together for analysis to better understand the market. Today, data stays in silos; sometimes we refer to these silos as walled gardens.

[31:56] To really get the effect of “thinking with data” - being able to truly segment your market so you can understand who truly are the candidates for the products and services that you sell - you need to have a controlled system for intermingling the data. I wish, in terms of these three fish bowls, we could actually show control pipes between them to indicate that the data has been Pseudonymised as it goes into the middle bowl for processing, because that's really what it requires: that you have the controls that allow for segmentation without the actual identification of the individuals, and that when you're bringing data together, it can't be re-identified through common keys that are used across all of the datasets that might be brought together. So, you need to have this ability to bring the data together to do segmentation, to begin to figure out who your market is, to create the cohorts that allow you to, with a lawful basis based on all the requirements of data protection law, reach your market in a fashion that is effective.

[33:22] So, we go to the ICO code of direct marketing. The ICO is leery that market segmentation based on the vast amounts of data involved can be done in a controlled fashion, and because they don't see data being used in a controlled fashion, they don't see individuals as having the ability to exercise their rights and take control directly. So, in terms of moving forward, the argument needs to be made to regulators that consent governs that initial collection, but you can use legitimate interests in a demonstrable fashion with verifiable technology controls, so that you can begin to create those cohorts and then market to your marketplace in a fashion that can be demonstrated as under control. So, really what we're showing in that last slide, as you go through the door, is that the cohorts have been put together in a fashion that is respectful of individual interests, thinking about the full range of fundamental rights that accrue to individuals, and that you have those controls in a demonstrable way so you can reach your market.
Gary LaFever (Anonos):
[34:42] So, Marty, we have a question already. And by the way, I really want to encourage people to ask questions through the interface. If we do not get to them during the webinar, we will follow up afterwards; you can also reach us at LearnMore@anonos.com. We had a particular question that I think is worth taking right now. The participant says: “I'm confused. The ICO has always maintained that consent is the last resort to rely on for a legal basis given the resource overhead and maintainability together with the constraints around legacy infrastructures” and asks us to provide some more context. I think, Marty, this is why many people are so surprised by the draft ICO code, where it seems to say that consent is, if not the best, the only legal basis - and yet in the past the ICO said it was the last resort. Do you want to comment on that briefly?
Martin Abrams (IAF):
[35:34] It goes back to my initial comments that when you talk to regulators, they believe that industry has not demonstrated that it actually takes its responsibility to do processing in a controlled fashion seriously. And because they don't think the industry can do that processing in a controlled fashion, they're doubling down on this concept that if you're profiling, all your segmentation has a legal effect - the mere fact that you're thinking with data has a legal or similarly significant effect on the individual. So, they're saying: “Since we don't trust you to have controlled mechanisms, we're going to double down on this concept of consent.”

[36:21] So, yes, what's being said in the draft code of practice is definitely more rigid than what the ICO has said in the past. And it's really based on a rethinking of what marketing entails. So, part of that goes to this concept that marketing was used in Cambridge Analytica to reach individuals and try to take away their right to vote in an informed fashion. And so, they're saying that marketing isn't just about marketing a product - it's about marketing ideas - and “we don't trust the controls that are in place.” So, unless someone can show them a mechanism that proves there is control, this cynicism has crept into what they're putting on the table. So, yes, it's a change. It's based on their sense that the market is not effectively putting controls in place. So, they'll double down on consent. Is that helpful to you, Gary?
Gary LaFever (Anonos):
[37:23] Thank you, Marty. Yes. So, let's look at these four questions. We're going to go into each of these questions in detail. So, I'm going to hit the first two first. And what we're talking about here is first off: “Are you protecting data when in use?” and “Are you using dynamically changing identifiers?”
[37:43] I want to recommend to you the www.MosaicEffect.com link at the bottom of this slide. If you go to www.MosaicEffect.com, you will see a dynamic representation of what happens when static identifiers are used to replace a person’s name across multiple transactions. This is a famous study that shows that when someone's name is replaced with a static identifier - in this instance, “7abc1a23” - watching multiple purchases makes re-identification of up to 90% of all people in the study possible.

[38:24] So, regarding the idea of replacing identifying information with a static identifier - I need to point out that some people still call what I'm describing Pseudonymisation - static identifiers do NOT deliver Pseudonymisation as newly defined under the GDPR. One of the biggest shifts here is that what previously passed as anonymisation, along with pre-GDPR forms of tokenization, simply does NOT cut it anymore. Now, compare the top table, where you can re-identify someone through the Mosaic Effect - the Mosaic Effect is where you can piece together different parts of a person's life, activities or behavior to create a mosaic of that person in order to identify who they are - with the bottom table, where what were previously consistent (persistent or static) identifiers associated with the same party have been replaced with different identifiers. In order to re-identify a person, “Additional Information” - this is a term of art - is now required to link the different transactions together. If, and only if, you satisfy the requirements for GDPR-compliant Pseudonymisation do you get the significant benefits under the GDPR. These explicit GDPR statutory benefits are surprising to many people.
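[Editor's illustration] The contrast between the two tables can be sketched in a few lines of Python. This is a hypothetical toy, not Anonos's actual technology: a single static token keeps every record linkable (the Mosaic Effect), while per-record dynamic pseudonyms can only be re-linked via a lookup table that plays the role of the separately held “Additional Information.”

```python
import secrets

# Three transactions belonging to the same (hypothetical) customer.
purchases = ["pharmacy", "bookstore", "clinic"]

# Top table: static tokenization. The same token replaces the name in every
# record, so the records remain linkable to one another via the Mosaic Effect.
STATIC_TOKEN = "7abc1a23"  # the example identifier from the slide
static_records = [(STATIC_TOKEN, p) for p in purchases]

# Bottom table: dynamic pseudonymisation. A fresh identifier is issued per
# record; re-linking the records now requires the lookup table ("Additional
# Information"), which is kept separately under controlled access.
lookup_table = {}  # stored apart from the dataset
dynamic_records = []
for p in purchases:
    while True:  # draw until unique, so each record gets its own identifier
        pseudonym = secrets.token_hex(4)
        if pseudonym not in lookup_table:
            break
    lookup_table[pseudonym] = "Jane Doe"
    dynamic_records.append((pseudonym, p))

print(len({t for t, _ in static_records}))   # 1 identifier: records linkable
print(len({t for t, _ in dynamic_records}))  # 3 identifiers: not linkable
```

Without `lookup_table`, the three dynamic records look like three unrelated people; with it, an authorised party can still re-link them.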
[39:52] Oftentimes, you’ll hear: “Anonymise your data. Don't pseudonymise it.” This slide highlights eight statutorily specified benefits under the GDPR if you properly satisfy the new definitional requirements for compliant Pseudonymisation under the GDPR. The citations are at the bottom of this slide. [Citations: 1. Tips the balance in favor of Legitimate Interests - Articles 5(1)(a), 6(1)(f), WP 217; 2. More flexible change of purpose - Article 5(1)(b), WP 203; 3. More expansive data minimisation - Articles 5(1)(c), 89(1); 4. More flexible storage limitation - Articles 5(1)(e), 89(1); 5. Enhanced security - Articles 5(1)(f), 32; 6. More expansive further processing - Article 6(4), WP 217; 7. More flexible profiling - WP 251rev.01 - Annex 1 - Good Practice Recommendations, Article 22, Recital 71; 8. Ability to lawfully share and combine data - Articles 11(2), 12(2); 9. GDPR definition of Pseudonymisation - Article 4(5); 10. Incentives to apply Pseudonymisation - Recital 29].
 
Let's look at these benefits. The Article 29 Working Party has literally said that Pseudonymisation can help tip the balance in favor of the data controller for legitimate interest processing. It allows for greater use of data in many different respects - data minimization, the length of time you can store it. It reduces potential liability in the event of a breach. It allows for more expansive processing - even profiling. 

[40:47] The Article 29 Working Party guidance on profiling and automated decision-making highlights Pseudonymisation in its best practices as a way to do this lawfully. In the current market, where your data chain includes the parties from whom you receive data and the parties to whom you deliver data - both your customers and partners - the ability to lawfully share and combine data is critical. But look on the right-hand side: you have to satisfy the new definition of Pseudonymisation. This is not a generalized term; it is a statutorily defined term. So, what does it require? Very simply, that you can prove, demonstrably and verifiably, that you have separated the information value of data from identity - but not as in anonymisation, where the link is never recoverable or re-linkable. Rather, going back and forth over that wall at the top must require access to the Additional Information that is kept separately. We refer to that as the courtyard. This is actually a higher standard than anonymisation.
[42:03] If you look at the definition of anonymisation in Recital 26 of the GDPR, it has some reasonableness language in it: in consideration of the protections that are in place, is it reasonable that the data can be re-identified? With Pseudonymisation, it's absolute - the data cannot be re-identified except with access to this Additional Information. There is significant helpful guidance on this. ENISA, the EU cybersecurity agency, has published two reports, in November 2018 and November 2019, and you can find them at the URL at the bottom - www.ENISAguidelines.com. At that web page we have also provided a checklist where you can compare your technology against the 50 specified capabilities that Pseudonymisation-compliant technology should have. So, for people who ask for greater specificity and detail - there it is. Again, in order to get the statutory benefits associated with Pseudonymisation, you must comply with these requirements.
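[Editor's illustration] One way to make “re-identifiable only with the Additional Information” concrete is keyed-hash pseudonymisation, one of the technique families discussed in the ENISA reports. The sketch below is a hypothetical example (the names and key are invented): the secret key acts as the separately held Additional Information, so pseudonyms are reproducible for the key holder but unlinkable across contexts for anyone else.

```python
import hmac
import hashlib

def pseudonymise(identity: str, context: str, key: bytes) -> str:
    """Keyed-hash (HMAC) pseudonym: deterministic for the key holder, but
    without the key - the separately held 'Additional Information' - the
    identity cannot be recovered and pseudonyms in different contexts
    cannot be linked to one another."""
    message = f"{context}:{identity}".encode()
    return hmac.new(key, message, hashlib.sha256).hexdigest()[:16]

# Hypothetical key, held apart from the pseudonymised dataset.
SECRET_KEY = b"kept-separately-under-controlled-access"

p_a = pseudonymise("jane@example.com", "campaign-A", SECRET_KEY)
p_b = pseudonymise("jane@example.com", "campaign-B", SECRET_KEY)

assert p_a != p_b  # same person, different contexts: unlinkable without the key
assert p_a == pseudonymise("jane@example.com", "campaign-A", SECRET_KEY)
```

The design choice here is that dynamism comes from the `context` input: each processing purpose gets its own pseudonym space, while the key holder can still resolve any pseudonym back to the person.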
[43:12] And I just want to point out here that this is not by accident. I spend a lot of time with people who were involved in the working groups drafting the GDPR. There is a reward for you: if you have GDPR-compliant Pseudonymisation to demonstrate these technical controls, these are the benefits that you get. That's what our clients tell us, and that's why they work with Anonos to use Anonos GDPR-compliant technology.
[43:37] But to get those statutorily provided benefits, you have to satisfy the statutory requirements. And again, you can see those at www.ENISAguidelines.com.
[43:47] So, I'd like to raise a case that I think has actually caused more concern in the direct marketing community than even the ICO draft code. At the end of last year, the Dutch Data Protection Authority took action against a lawn tennis association, and this has been misreported as standing for the proposition that you cannot use commercial interests as the basis for your legitimate interest. Unfortunately, if you read just the summaries available in English, you can come away with that impression. If anyone's interested, we're happy to provide an English-language translation of the original case, because when you review the full case, the actual holding becomes quite clear.

[44:38] What the Dutch DPA is saying is that commercial interest by itself is not sufficient to give you a legal basis for legitimate interest processing; you must combine that commercial interest with, as Marty described, demonstrable, technically enforced accountability. And this is where Pseudonymisation technology that supports the statutory requirements of that term comes in - by actually enforcing data minimization, purpose limitation, and data protection by design and by default, it allows you to continue to do business as you want to. So, it's not that consent is bad. Consent is very important, but consent is limited and somewhat fragile. It's not that contract is bad; you need contracts for primary processing. It's not that anonymisation is bad. The fact is, they all have limitations. And legitimate interest processing, WHEN implemented with the right technical controls to enforce the right policies and procedures so that they're demonstrable and verifiable, allows you to still conduct business.
[45:48] So, just a little foreshadowing: if you've been interested in what we've covered here today, we have a follow-on webinar on April 9th, and if you're registered for this webinar, we will make sure to invite you. This slide shows the proposed approach under which you could in fact continue to reach out and do direct marketing. There are three steps. The first is data collection that is consent based - you have to comply with the requirements of consent, but you also put the data subjects on notice that you're going to use legitimate interest processing to do direct marketing to them. They always have the opportunity to opt out, and their identity will be protected by being put into pseudonymised cohorts and classes. The second step is where the data science occurs - the “thinking with data” step - and that would be based on legitimate interest. And the third is the outreach and fulfillment, based on either legitimate interest and/or consent. So, if you're interested, we will invite you to that follow-on webinar. Yes, Marty?
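[Editor's illustration] The “pseudonymised cohorts and classes” idea in step two can be sketched roughly as follows. This is a hypothetical example with invented attribute bands, not the actual product approach: names are dropped at the cohort boundary, and the “thinking with data” step runs on generalised cohort counts rather than named individuals.

```python
customers = [
    {"name": "Alice", "age": 34, "interest": "gardening"},
    {"name": "Bob",   "age": 37, "interest": "gardening"},
    {"name": "Carol", "age": 62, "interest": "travel"},
]

def cohort_of(customer: dict) -> tuple:
    """Generalise a record into a cohort; the name never leaves this step."""
    band = f"{customer['age'] // 10 * 10}s"  # e.g. 34 -> "30s"
    return (band, customer["interest"])

# Step two, "thinking with data": analytics over cohorts, not individuals.
cohorts: dict = {}
for c in customers:
    key = cohort_of(c)
    cohorts[key] = cohorts.get(key, 0) + 1

print(cohorts)  # {('30s', 'gardening'): 2, ('60s', 'travel'): 1}
```

Outreach in step three would then target a cohort, with re-linking to named individuals handled under the applicable legal basis.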
Martin Abrams (IAF):
[46:52] So, this concept of “thinking with data” - the data science step - I know that is the topic for the next call. The distinction between thinking with data and acting with data is a topic that has been vetted with regulators since 2013. There has been a lot of conversation, a lot of interest, and a lot of understanding that thinking with data and acting with data are two different things and can't really be bundled together under this concept of profiling. So, the fact is that this is not something that has just sprung to mind. It has been well vetted.
Gary LaFever (Anonos):
[47:30] Thank you, Marty. So, before we go to the questions, I just want to come back to these four points and also highlight that for three of them there are interactive websites that could be very helpful to you - www.MosaicEffect.com, www.Pseudonymisation.com, and www.ENISAguidelines.com. You need to be able to show you’re protecting data when in use. We used to protect data in use through contracts and by restricting processing to an enclave or controlled-access environment. The fact of the matter is, contracts are not enough, as Sachiko described, because they're very narrowly construed. And if you're limiting the processing to an enclave - a controlled environment - that may well work for that purpose, but the second you do an outreach to a partner or a customer, you have violated that restriction. That's why you need technology that protects data when in use, and that's what Pseudonymisation does. Pseudonymisation helps to defeat unauthorised re-identification via the Mosaic Effect - that's question number two - and the website www.MosaicEffect.com highlights how using dynamically changing identifiers dramatically reduces the risk of unauthorised re-identification.

[48:40] Number three: what are the benefits of Pseudonymisation? People are often surprised to find that Pseudonymisation is mentioned 15 times in the GDPR. Anonymisation is mentioned three times, all in the same recital; encryption is mentioned four times. It's not by accident that Pseudonymisation is mentioned 15 times, with relaxed obligations when you have put technical and organizational safeguards in place. Again, www.Pseudonymisation.com provides more information. And then, what are the standards for EU certification? ENISA has come out with two great reports, and you can get more information at www.ENISAguidelines.com. So, with that, we'd like to move on to questions. Please submit questions even if we don't answer them during the webinar - we will get back to you - or request additional information at LearnMore@anonos.com. So, with that, we'll start with some of the questions that have come in.

[49:44] Marty, I think this one is probably for you. Can you explain again what you mean by demonstrable, technically enforced accountability?
Martin Abrams (IAF):
[49:55] Sure. And if someone wants more information about this concept of Accountability 2.0, or demonstrable accountability, I would direct them to the IAF website, and we can supply that to individuals.
Gary LaFever (Anonos):
[50:11] Marty, do you want to provide a URL for that?
Martin Abrams (IAF):
[50:15] It’s www.informationaccountability.org - look for “Publications” and “Accountability 2.0.” Also see the guidance that we provided in Canada, which is very useful for understanding this. In fact, next Monday or Tuesday, we will be publishing a paper we wrote for the Canadian government on the concept of “people beneficial processing.” The concept of accountability and its essential elements were established in 2009 and were the subject of an Article 29 Working Party Opinion in 2010. That 2010 opinion is what sparked accountability becoming part of the GDPR, but all the elements of accountability are not well described there. What has developed is this: accountability requires an organization to be responsible and to stand ready to be answerable for how it processes data, and it requires policies, implementation standards, internal review, a way for individuals to participate, and the ability to stand ready to demonstrate that your processes are sound.

[51:32] But after 10 years, regulators are becoming dubious that organizations actually know how to do accountability. So, today they're asking organizations to demonstrate every step in accountability, including the technology controls that enforce it. It's no longer enough to adopt policies and implementation standards for those policies, or to do the data protection assessment. You have to actually demonstrate that you have the technology controls to make sure that, once those decisions are made, they are governed throughout the processing. So, it's now a higher level of answerability: the ability to proactively demonstrate that everything you say you do, you did, and that you can prove it was effective.
Gary LaFever (Anonos):
[52:22] Thank you, Marty. Sachiko, if you want to take this one. Someone asked us to explain again the Dutch DPA decision against the lawn tennis association, and that they had heard that that meant commercial interests cannot support legitimate interests. Do you want to touch on this one, please?
Dr. Sachiko Scheuing (Acxiom):
[52:39] Yes, I’d be delighted to. Thank you, Gary. Apart from the fact that providing concrete technical and organizational measures may, so to say, alleviate the situation, there are three areas I would like to highlight. Number one, legitimate interest has to be real, concrete, direct, and not speculative, says the AP (Autoriteit Persoonsgegevens), the Dutch regulator. The second thing they say - and I'm not sure this is how all other regulators and lawmakers would understand it - is that a legitimate interest has to be named in a law. So, it can't be just any interest. Thirdly, in the way they interpret the freedom of entrepreneurship in the EU Charter of Fundamental Rights, they say this is not an unlimited right: it gives entrepreneurs the freedom to choose with whom they do business, the freedom to determine prices, and so on, but it does not mean there is a general fundamental right to make money. All in all, I personally feel that this in a way contradicts the very existence of Recital 47, cited earlier, which says that direct marketing may be carried out for a legitimate interest.
Martin Abrams (IAF):
[54:53] Okay, to jump in: there's a sense that anytime you process data, there is risk to individuals from that processing. So, the technical controls need to be there to mitigate that baseline level of risk. But there is also a sense that if you're going to have any level of risk, you need to really understand the impact of the intent behind the processing on all of the stakeholders. So, an organization really needs to understand what its legitimate interest is and how it impacts, and may create value for, the individuals to whom the data pertains. Essentially, they're asking for a more robust demonstration of why you're using the data and how it creates value in the end for others.
Gary LaFever (Anonos):
[55:52] This is a segue to our last question. As Marty mentioned, you need to be able to show the benefits to the data subject, the data controller, commerce, and society as a whole, and then you need to show the controls you have in place to limit the risk. Someone asked: “Please explain why you say static identifiers are not compliant with the GDPR.” Please don't misinterpret what we’re saying. Static identifiers are a good security technique. But if you're using static identifiers to try to protect data when in use, they fail because of the Mosaic Effect - and again, you can see an example of this at www.MosaicEffect.com. So, we're not saying that static identifiers and tokenization are not effective security techniques. What we're saying is that when you make use of data in a distributed fashion and people can see those static identifiers repeated again and again (think of marketing IDs), the reality is the regulators are smart. They say: “Well, wait a minute. People can figure out who that is.” There are numerous examples of where anonymisation using static identifiers does not work. So, we're not saying there's no place for them. What we're saying is that if you're trying to show demonstrable accountability that's technologically enforced, they don't carry the day. So, you have to think through the benefits to the data subject, the data controller, commerce, and society as a whole, and then ask what controls you have in place to protect the data when in use, so that the processing is in balance.

[57:33] So, before we head out, I just want to mention that we have had over 700 registrants for this webinar. I think that is an amazing ratification of the importance of this topic. Things have changed, and in order to continue to do business, you need to have new controls in place. Those controls enforce your policies and procedures, and they evidence the balancing of interests among those involved. And if we as an industry don't start doing things in this manner, regulators will say, as is already happening: “Since you haven't shown us that you can do it, that you know how to do it, that you're willing to do it, we're going to take a step back and only allow you to process on the basis of consent.” So, in the last few minutes - Marty for 30 seconds, and then Sachiko - why do you think we don't want to rely on consent alone, and why are these controls and policies important? Marty, please.
Martin Abrams (IAF):
[58:38] Innovation comes from thinking with data, and innovation doesn't just serve commercial interests. It serves all of our interests in having a growing economy, new choices, better choices. So, the fact is, we cannot sleepwalk into losing our ability to “think with data.”
Gary LaFever (Anonos):
[58:57] Sachiko?
Dr. Sachiko Scheuing (Acxiom):
[58:58] Gary, I echo Marty's excellent summary.
Gary LaFever (Anonos):
[59:02] Fantastic. In closing, these are the four questions you have to ask. And if you can't answer YES to all of them, you may well have to stop personal marketing. We appreciate your checking in with us today. We invite you to come back on April 9th. Please feel free to send any questions to LearnMore@anonos.com and best of luck to everyone in these trying and very uncertain times. Thank you very much. Have a good day.
 

Are you facing any of these 4 problems with data?

You need a solution that removes the impediments to achieving speed to insight, lawfully & ethically

Roadblocks to Insight
Are you unable to get desired business outcomes from your data within critical time frames? 53% of CDOs cannot achieve their desired uses of data. Are you one of them?
Lack of Access
Do you have trouble getting access to the third-party data that you need to maximise the value of your data assets? Are third-parties and partners you work with worried about liability, or disruption of their operations?
Inability to Process
Are you unable to process data due to limitations imposed by internal or external parties? Do they have concerns about your ability to control data use, sharing or combining?
Unlawful Activity
Are you unable to defend the lawfulness of your current data processing activities, or data processing you have done in the past?
THE PROBLEM
Traditional privacy technologies focus on protecting data by putting it in “cages” or “containers,” or by limiting use to centralised processing only. These limitations are imposed without considering the context of the desired data use, including decentralised data sharing and combining. Such approaches are based on decades-old, limited-use perspectives on data protection that severely restrict the kinds of data uses that remain available after controls have been applied. On the other hand, many newer data-use technologies focus on delivering desired business outcomes without considering the roadblocks that may exist, such as the four problems noted above.
THE SOLUTION
Anonos technology allows data to be accessed and processed in line with desired business outcomes (including sharing and combining data) with full awareness of, and the ability to remove, potential roadblocks.