IAPP Pseudonymisation Webinar

Presentation Transcript
Can Pseudonymisation Save AdTech: Pseudonymisation and the 5th Cookie Initiative
Dave:
Welcome to the IAPP Web Conference: Can Pseudonymisation Save AdTech: Pseudonymisation and the 5th Cookie Initiative, brought to you today by Anonos. My name is Dave Cohen. I'm the IAPP’s knowledge manager and I'll be your host for today's program. We'll be getting started with the presentation in just a minute, but before we do, a few program details. Participating in today's program will automatically provide IAPP-certified privacy professionals who are the named registrants with one CPE credit. Others who are listening in can apply for those credits through an easy-to-use online form on our website.

We'd also like to remind you that today's program is being recorded and will be provided free to registered attendees approximately 48 hours following the live event. We encourage you to ask questions at any time during the program by typing them into the Q&A field that's just to the right of your PowerPoint window and your questions will be answered by the presenters after the presentation.
Welcome & Introductions
Now, onto our program and I'd like to introduce today's panelists. Paul Comerford is a principal technology policy advisor with the ICO and had significant input into the development of this program and the slides, but unfortunately, is unable to be with us due to an unavoidable conflict.

So we're sorry to miss Paul, but do know that his input will be reflected within the presentation today. Joining us on the panel today, Marty Abrams is chief strategist at the IAF. Marty, welcome to the panel. Can you tell us a little bit about your background and the IAF?
Marty:
Sure. The Information Accountability Foundation is a policy-centered think tank. It is a nonprofit. We do privacy theory internationally. I personally have been doing privacy work for 30 years and have run think tanks for the last 20.
Dave:
Excellent. It’s a real honor and pleasure to have you on the panel with us today, Marty. Joining Marty, Gary LaFever is the CEO at Anonos. Gary, welcome and can you tell us a little bit about your role in Anonos?
Gary:
Thank you, David. Appreciate that. My name is Gary LaFever and I am both the CEO and general counsel at Anonos and both of those titles and roles are important because our belief is that in order to truly make most effective use of data, you have to both address the business issues, hence the CEO, but also in a way that complies with the legal requirements and ethical requirements, hence the general counsel. And so, Anonos is a pseudonymisation technology developer and provider.

Thank you.
Dave:
Excellent. Thanks, Gary. And to round out our panel today, Dr. Sachiko Scheuing is a European privacy officer at Acxiom. Sachiko, welcome to the panel and can you tell us a little bit about your role over there at Acxiom?
Sachiko:
Yes. I am the European privacy officer of Acxiom, which is a marketing services company. I have also been a data protection officer, based in Germany, for 15 years.
Dave:
Fantastic. And with that, let's go ahead and get started with the conversation. So I'm going to turn it over to Gary to begin that. Gary, it's all yours.
GDPR: Balancing Interests & Rights
Gary:
Thank you, Dave. So today's webinar is actually going to be in four parts. In the first part, we're going to be talking about the GDPR treatment of pseudonymisation, both with regard to ad tech and beyond. The important thing to realize is that pseudonymisation is a suggested, recommended and actually rewarded technical control. And so, that part is going to be very much grounded in the terms of the GDPR. That's the first quarter.

Then Marty is going to speak for about five minutes on his perspective on these issues of accountability and ethical processing of data. Again, as it applies to ad tech and generally. We'll then talk about what the 5th Cookie Initiative is, what the working group is, how you can join, what its objectives are.

Then Sachiko is going to close with about a five-minute presentation, or discussion, of her perspective on ethical data processing. And then we'll go to the closing and questions.

Questions from those of you watching and participating in the webinar are a big part of this as well. So please think, as you're going through and watching the webinar, of questions you'd like to ask. We very much appreciate that.

So we're going to start off here with this slide, which is sometimes surprising to people, and that is that the GDPR actually is all about balancing fundamental rights to privacy and data protection on the one hand and data value on the other. And this slide highlights that it's not an absolute right on either side of the scale.

And so, what it comes down to is, I think just about everyone would agree that when consent can be used for ethical and lawful processing, that is the preferred legal basis.

The discussion we're having here is about when consent can't work. And what I mean by that is that consent has been very clearly defined as requiring specificity, that it be provided voluntarily, and that the data subject knows what it is that they're agreeing to at the time that consent is requested and provided.

There are certain processes that simply don't satisfy one of those criteria. In those situations, you have to be willing to ask, if consent doesn't work here, should this process occur? And you have to be willing to ask that question and live with the answer.

There may well be, in fact there is, certain processing that, when consent doesn't work, should not occur. But that's not what we're here to talk about. We're here to talk about, as a complement to consent, what technical and organizational controls exist and how they can be properly used under the GDPR to enable legitimate, lawful and ethical processing.
Pseudonymisation BEFORE & AFTER GDPR
And we have a particular focus here on one term, pseudonymisation. I really want to stress that if you have not read the definition of pseudonymisation in Article 4(5) of the GDPR, you need to, because this definition is only four years old. I like to emphasize that because if you're using a definition or a concept of pseudonymisation that is older than four years, it's the wrong one. We'll walk through how pseudonymisation, as now defined under the GDPR, requires demonstrable accountability, and that means something.
GDPR: Balancing Interests & Rights
So really, what the GDPR is about is balancing interests. And you'll notice in this graphic, and it is purposeful, that while privacy and value are balanced, the privacy side gets slightly more attention, and that's appropriate, because what we're trying to do is achieve what's at the top of this slide, right? Which is all sides working together and happy with the result, as opposed to being adverse to one another. And that requires new technical controls.
Pseudonymisation BEFORE GDPR
So why is it that pseudonymisation is not the same as it has been in the past? Well, the term pseudonymisation has been used in the past to refer to what I would now call static tokenization: you replace a data value with a token. And I'm purposely not using the term pseudonym, because I do not believe replacing a data value with a static or persistent token is pseudonymisation. We'll walk through why that's the case.

And there are several examples of what is known as the Mosaic Effect, which takes its name from the fact that if I can get different pictures of parts of you and assemble them almost as in a puzzle, I create a mosaic of who you are. One of the most well-known examples of re-identification by the Mosaic Effect is one by Latanya Sweeney, where she showed that you can get three different datasets from the US Census Bureau, each of which on its own is privacy respectful.

I can get a data set of US citizens based on birthdate and my name will not appear in there. It will be replaced with a static token. I can get another dataset of US citizens based on zip code. Once again, my name will not appear. It's been replaced with the same static token. And I can get a third dataset which is US citizens broken down by gender. And once again, my name will not appear. It will be replaced with the same static token.

But Latanya Sweeney and others at Harvard University showed that when you combine those three different datasets, each combination reduces the uncertainty, or what's known as entropy, as to identities, to the point that you can actually identify up to 87% of the US population by name.
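To make the linkage mechanics concrete, here is a minimal Python sketch of that kind of join, using invented records and token names rather than Sweeney's actual data: because all three releases reuse the same persistent token, they collapse into a single profile.

```python
# Hypothetical illustration of the Mosaic Effect: three separately
# "privacy respectful" releases that replace each name with the SAME
# persistent token. All records and tokens are invented for this sketch.
by_birthdate = {"tok_4f2a": "1975-03-14"}   # token -> birth date
by_zip       = {"tok_4f2a": "02138"}        # token -> ZIP code
by_gender    = {"tok_4f2a": "F"}            # token -> gender

# The static token acts as a join key across all three datasets.
profiles = {
    tok: (bd, by_zip.get(tok), by_gender.get(tok))
    for tok, bd in by_birthdate.items()
}
print(profiles)  # {'tok_4f2a': ('1975-03-14', '02138', 'F')}
# The triple {birth date, ZIP, gender} is unique for roughly 87% of the
# US population, so linking it to any identified public list (say, a
# voter roll) recovers the name behind the token.
```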

So again, the traditional use of static tokens to replace data values no longer works in the big data world. I would encourage you to go to www.mosaiceffect.com. It is a website that shows the difference between anonymization using static tokens and pseudonymisation using dynamically changing tokens, and the ability of those dynamic tokens to defeat the Mosaic Effect.

And it's critical to realize that the conversation we're having is not about technical controls that fail to defeat the Mosaic Effect. So before you can determine whether or not ad tech and digital marketing is a lawful and ethical purpose, you need to be using the state of the art in technical controls.
Pseudonymisation THROUGH & AFTER GDPR
So now, let's look at what it is that the GDPR holds out and requires for pseudonymisation. And it's very critical that the personal data cannot be attributed to a specific data subject without the use of additional information that is kept separately and protected by technical and organisational measures, okay?

That's actually a higher standard than what you see for anonymization. Anonymization is framed in terms of what is reasonably likely; this is a flat prohibition: the personal data can no longer be attributed to a specific data subject, okay?

If you have controls in place that make that first part true, it is actually permissible for the data controller, using additional information that it keeps under its control, to do the re-linking. But what I was focusing on there is that it's the data controller who controls the re-linking, not outside parties. So again, the Mosaic Effect website has a number of different examples that show you interactively how this happens.
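By way of contrast, here is a minimal sketch, entirely our own simplification rather than Anonos's product or the ENISA reference design, of dynamically changing tokens with controller-held additional information: every release gets a fresh token, and only the controller's separately kept table can re-link.

```python
import secrets

# The controller-held "additional information" of Article 4(5): a lookup
# table kept separately from the released data, under access controls.
_relink_table: dict[str, str] = {}

def pseudonymise(identifier: str) -> str:
    """Issue a fresh token on every use, so two releases about the same
    person cannot be joined on the token (defeating the Mosaic Effect)."""
    token = secrets.token_hex(8)
    _relink_table[token] = identifier
    return token

def relink(token: str, purpose_authorised: bool) -> str:
    """Re-linking is possible, but only by the controller and only for
    an authorised purpose."""
    if not purpose_authorised:
        raise PermissionError("re-identification not authorised")
    return _relink_table[token]

t1 = pseudonymise("alice@example.com")
t2 = pseudonymise("alice@example.com")
assert t1 != t2  # dynamic: same person, different token per release
```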
Pseudonymisation AFTER GDPR: ENISA Guidance
And there's actually some fantastic guidance that's come out from a number of different sources on pseudonymisation. So you don't have to try to figure this out on your own.

ENISA, the cybersecurity agency for the European Union, has two different reports, and there are links to these on the mosaiceffect.com website. The first one came out in November 2018, and these quotes are from it. It highlights that pseudonymisation was purposely and intentionally put into the GDPR to provide a means to lawfully use data in a way that's still privacy respectful.

And so again, the 2018 report is somewhat higher level and talks about the strategic and business benefits. Then in November of 2019, they came out with another report that goes into great detail on the technical means to achieve these goals. So again, ENISA is a fantastic resource.
Pseudonymisation AFTER GDPR: GDD Draft Code of Conduct
In addition, the GDD, the German Association for Data Protection and Data Security, has come out with a draft code of conduct on pseudonymisation, which is available. That as well provides some great insight into what pseudonymisation under the GDPR involves.
Anonymisation vs. Pseudonymisation under GDPR
So let's look quickly at the differences between anonymization and pseudonymisation. There are some interesting stats here. The term anonymization appears only three times in the GDPR, all within the same recital, and it's talking about how, if you properly anonymize data, it falls outside the scope and jurisdiction of the GDPR.

Pseudonymisation is listed 15 times. 15 times. Five times as many times in the GDPR. And it's about how you stay within the GDPR. And most people stop the analysis right there and they say, “Well look, I'd prefer to anonymize my data and be outside of the GDPR than pseudonymise my data and stay within the GDPR.” But that's not as clear of a decision as it may seem at first. And here's why.

First off, it's harder and harder to anonymize data, given the availability of other data that can be combined with the dataset to re-identify it. And one of the things that has to be taken into account is the increasing number of data breaches, which release cleartext data into the overall data ecosystem.

So again, anonymization is not just about what you can do with data; it's about what might reasonably happen to the data. And therefore, you have to include within your assessment of whether data is truly anonymized the availability of data from third parties, whether that data was lawfully acquired or not.

That analysis of the reasonableness of its re-identifiability has to take all that into account. And if for whatever reason, even for factors beyond your control, the tests for anonymization cannot be met and you relied on being outside of the scope of the GDPR, you likely will have none of the controls in place and therefore, you are exposed and could be liable.

Pseudonymisation, as we'll walk through, carries numerous benefits that are provided in the GDPR specifically to encourage data controllers and processors to put controls in place that reduce the risk to data subjects. And again, all of this is in the context of a process which a data controller feels is in the best interest of the data subject, business objectives and society as a whole, where you're trying to determine whether technical controls can make that processing lawful and ethical because consent does not work in this particular situation.
GDPR Recognition of 'Pseudonymisation'
Here are some examples of situations where the GDPR recognizes pseudonymisation, and one of the interesting things is that no other privacy-enhancing technique gets this treatment in the GDPR. It mentions encryption four times, and two of those times it's in the same sentence as pseudonymisation, but it doesn't mention differential privacy. It doesn't mention static tokenization. It doesn't mention other privacy-enhancing techniques. And yet it mentions pseudonymisation 15 times.

When the GDPR talks about the opportunity and benefit of codes of conduct that could help people understand how to operate within the GDPR while still serving the needs of data subjects, business and society generally, pseudonymisation is the only technique it mentions, and that's where the GDD's proposed code of conduct on pseudonymisation comes into play.

The GDPR also, in many different places, talks about different benefits. And it's an interesting thing to note: there are two terms that exist for the first time in the GDPR, and those two terms are pseudonymisation, as defined in Article 4(5), and data protection by design and by default, as defined in Article 25. And when defining what data protection by design and by default is, it references pseudonymisation.

It is important when evaluating and construing a statute that you look at the terms of the statute because the language is determinative. So data protection by design and by default is the highest example of privacy by design but merely complying with privacy by design requirements does not mean you've satisfied data protection by design and by default requirements.

You must look at the requirements in the statute. And similarly, pseudonymisation, as now defined in Article 4(5), must be strictly adhered to. But again, there are many benefits if you do.
GDPR Statutory Benefits of 'Pseudonymisation'
Here's another highlight of some of the benefits that go to Pseudonymisation, but what we're really going to focus on here is the bottom right, okay?

The Article 29 Working Party in prior guidance highlighted the fact that properly pseudonymised data can actually tip the scales in the favor of the data controller if properly implemented.
Pseudonymisation Benefits According to Article 29 Working Party Guidance
So again, this is all coming back to the role and importance of technical controls. We're focusing on pseudonymisation here because it's newly defined as a way to balance interests, and because of the ability of those controls to enforce appropriate policies and procedures and deliver demonstrable accountability. And it is that demonstrable accountability that can be viewed and verified by auditors, both internal and external, and by regulators, to prove that methods and procedures were put in place to reduce risk to data subjects.

So with that, I'd like to turn it over to Marty to get his perspective on just this, the importance of demonstrable accountability, tools and controls to technologically enforce policies and procedures to ensure accountability and ethical processing.

Marty?
Martin Abrams, Chief Strategist at the IAF
Marty:
Well, thank you very much, Gary and I'm going to actually talk from the perspective of societal expectations, regulatory expectations, the current external environment for organizations. So I'm going to talk in those terms.

As I mentioned before, I am a privacy theorist. It is my job to think about what policy solutions make it possible for data, over time, to serve people. So I'm really into this concept of data serving people. And for the past two decades, I've been on an endless quest to assure there is a space where we can actually learn from the data that is out there. We can think with data. Thinking with data is a term that we at the IAF coined back in 2013, and it's one that has been discussed in the regulatory environment.

But I would also say this: back in 2009, there was something called the Global Accountability Dialogue, which defined the concept of accountability. And the basis of accountability that's in the GDPR is the same basis of accountability that you find, for example, in Canadian interpretations of accountability. It goes to this concept that I need to have things under control. I need to have sound policies. I need to have an implementation strategy for those policies. I need to be able to test that those policies work.

And now, we're increasingly talking about, I need to demonstrate that to external parties, be they accountability agents or be they regulators. So the fact is that we have to have this concept of demonstrable accountability.

So we're talking today about the tools that make it possible for those accountability rules within an organization to actually be enforced, to be proven to regulators to be real and enforceable, and to be evidenced as effective.

When we think about this topic, we really need to think about the concept of data, and when we think about consent, consent is most effective when data originates from people in a way that they consciously provide that data. At the IAF, we define data based on its origin, in a paper for the OECD that has been used by multiple regulators, distinguishing the concepts of provided data, observed data and inferred data.

And it's important to start with that concept of types of data because it begins to let us understand the patterns that have occurred over the last few years. The fact is that provided data has stayed steady, while observed data, this ability to either actively or passively watch what people do, has been accelerating since the mid-1990s and continues to accelerate. And there's no way it won't continue to accelerate, because as we as individuals begin to adopt smart stuff like defibrillators that alert our cardiologists, or smart cars, or even smart refrigerators, this internet of things environment continues to generate more and more observed data.

But the fact is that the sheer number of parties watching has begun to really spook regulators, and that's not new. I think that began with the beginning of the century. But over the summer, the New York Times did some survey work and found that when you go to the Washington Post website, there are about 500 parties that might be watching you while you're there, and that truly shakes people's sense that things are under control.

So as we think about this observed data environment, regulators have been trying unsuccessfully to stem it since, really, the turn of the century, when Do Not Track systems began to emerge. In part, that's because they have a sense that just the number of parties observing is unseemly. Yet prohibiting either observation, or the ability to think with that data once you've observed it, would be counterproductive, because we're a knowledge society and it's this observed data that sparks the data science that lets us come up with new insights. Some of those insights are in the ad tech industry, which is what we're talking about specifically today.

As we think about this ability to think with data, and as we think about where consent is fully applicable, we're going to see ad tech guidance in 2020, first from the ICO in the UK and then from other regulators as well, that will look for simple solutions to this complex problem: how do we assure that we can observe where we need to observe, assure that we can think with that data, but actually have a controlled system with a more manageable number of players?

And that then brings us back to this concept of accountability, which has been in play for about 20 years. So to be able to demonstrate that we are accountable in an ad tech environment, we need to be able to demonstrate that we have sound policies in place at organizations, but also effective controls over what we do with that data, so that we can actually think with that data and come up with the insights that let us provide someone with the appropriate ads, the ads that they will be interested in, the ads that perform better than random ones.

And we also need to have effective transparency. So if I'm an organization that uses pseudonymisation as the technical control to back up my policies, I actually have to be able to describe that to the public. That is also part of the solution of demonstrable accountability: technology solutions that are truly effective in avoiding the Mosaic Effect, where I can describe why they're there and begin to serve the public in a more meaningful way.

I've taken a leadership role in this initiative, the 5th Cookie Initiative, because I think there needs to be a conversation about how we maintain diversity in the ad tech environment and have effective controls that prove to regulators that we really do have things under control, but avoid a system where only a narrow few organizations can support the ad tech industry.

So I'm looking for technology solutions. I'm looking for a conversation about how those technology solutions are part of a policy solution within organizations that is demonstrable by others. And like I said, the 5th Cookie Initiative is a conversation to create an alternative as regulators begin to think about how they gain control in this marketplace.

Gary?
Maximizing Ethical & Lawful Data Value
Gary:
Thank you, Marty. So I'd like to pick up on two words that Marty used: conversation and observation. The 5th Cookie working group is all about just that: a conversation among the different market participants on ways that we can continue to manage and use data, leveraging technology controls so that observational data is managed. And this slide actually highlights two of the traditional ways in which privacy was addressed.

One is contract. The problem with contracts is, who's enforcing the contract? And is the remedy of enforcement actually something that makes data subjects whole? When you have hundreds of participants, who's the contracting party? What's the enforcement mechanism?

So look, improved contracts are definitely worthwhile and should be done, but if your approach is contract-based only, then there aren't sufficient controls in place and you still have the risk of misuse of observational data. So then you can look to a walled garden proposal.

The issue with walled gardens is, first off, you're taking away some of the power of what we're trying to do with data, right? I'm old enough that I remember Prodigy, CompuServe and AOL in its early days, all walled gardens. We all celebrated when we moved to the open internet; look at the power of the information. We've gone beyond even using the term internet because it's so pervasive. The interconnectedness of vehicles and of different observational data, we want to encourage that.

So consider the two traditional approaches to dealing with distributed data processing. Contracts are a big part of what the IAB proposal is about; it's a great proposal, we're just saying it could be improved with controls. And the walled garden proposals, such as the one proposed by Google, have absolute merits, but there are also detriments that we think could be overcome with technical controls.

So now, we're introducing the concept of the 5th Cookie proposal. What is the 5th Cookie? I will explain it. But before I do, I just want to spend a moment on this slide. What we're trying to say here is, “Look, that's not normal behavior for fish, to jump from one bowl to another, okay?” But if you do things slightly differently, with the right tools, methodology and intent, you can do almost anything that you did before the GDPR. We would actually argue you can do even more. And the reason you can do more is because you now have demonstrable accountability, with technically-enforced controls that enable you to show both your internal and your external auditors that you have put these in place to respect, honor and enforce the rights of data subjects.

And so, this is a shift, okay? We're not saying that other proposals to ad tech and other data uses don't make sense. They make a lot of sense. What we're doing is starting a conversation that if augmented with technical controls, they actually get better.
What is the 5th Cookie
So the 5th Cookie is just that. It's a means of establishing greater conversation about these issues with a focus on, though not exclusively on, pseudonymisation. If you're interested in joining the 5th Cookie working group, you can go to https://www.5thcookie.com/, and that works both with the number, 5-T-H, or spelled out, fifth. And the 5th Cookie working group is about public education.

So this webinar is the first outreach by the 5th Cookie working group. There'll be others, and what the 5th Cookie becomes will evolve based on the people who want to support it. But again, the primary goal here is one of communication, one of helping to explain why it is that technical controls are different than they were before and how it can help.
Can Pseudonymisation Save AdTech
So let's get a little more into what the 5th Cookie is. You hear about first party cookies. You hear about third party cookies, okay? First party cookies are cookies that the publisher of a website, and I'll just use a website in this example, controls. Third party cookies are the cookies that third parties attach to a browser that comes to the website, not under the control of the publisher.

We believe in restricting the number of cookies, and I'm talking about first party cookies now, almost akin to the limited spaces at a retail shopping mall. So you have a limited number of cookies that will be managed by the actual publisher of the website. And just for argument's sake, purely as a basis for conversation, we're assuming that the four major players, Apple, Amazon, Google and Facebook, would each get a cookie.

And what we're saying is, allocate one or more additional cookies for other working groups, and that's why it's the 5th Cookie, right? If you allocated one to each of those four players and you allocated just one more, it would be the 5th Cookie. That's the significance of the title. It's not meant to determine how this goes; it's just a title so we have something by which to refer to the working group.

Okay. So how does it work?
Three Steps of The 5th Cookie Metaphor for Ethical AdTech
Well, the reality is, the 5th Cookie is advocating that you absolutely predicate this on consent, and that, as Marty indicated, data subjects would have the right to consent and you would have to comply with all the requirements of consent, okay? But you would enable data subjects to control the ability to use three different types of data: provided data, inferred data and observed data. At step one, they would actually consent to the use of one, some or all of those different types of data. They would also be put on notice, and it would be explained to them, how pseudonymisation controls would be used to protect their rights in connection with legitimate interest processing of that data to provide them ads. Not as individuals, but as members of small cohorts, small groups; we refer to them as micro-segments. And there's a lot of detail on the 5th Cookie website on this proposal.

But the idea is, in step two, you're still able through data science to actually target ads to groups of people that are small enough to reflect individual characteristics and behavior, making them likely to be interested in the ad, but large enough that you don't know the actual identities of the people within them.
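As a minimal sketch of that step-two idea, with the segment keys and the minimum size of 50 as purely illustrative assumptions rather than anything the proposal specifies:

```python
from collections import defaultdict

MIN_COHORT_SIZE = 50  # illustrative threshold, not part of the proposal

def build_micro_segments(users: list[tuple[str, str]]) -> dict[str, set[str]]:
    """Group pseudonymous users into micro-segments by a coarse
    interest/behaviour key; `users` holds (pseudonym, segment_key) pairs."""
    segments: dict[str, set[str]] = defaultdict(set)
    for pseudonym, segment_key in users:
        segments[segment_key].add(pseudonym)
    # Release a segment for ad targeting only if it is large enough that
    # membership does not single out any individual; suppress the rest.
    return {key: members for key, members in segments.items()
            if len(members) >= MIN_COHORT_SIZE}
```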

There'd be one or more, but a very small number of, trusted parties who would broker, as it were, between the small cohort groups and the individuals, because in essence, someone has to deliver the mail, okay? But the data subjects would always have the right to say, “Stop sending me those ads.”
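And a matching sketch of that step-three trusted party, with all names and structure invented for illustration: the advertiser hands over only a segment id and a creative, the broker alone resolves segment membership to delivery handles, and opted-out data subjects are always excluded.

```python
class TrustedDeliveryBroker:
    """Illustrative only: a trusted party that 'delivers the mail'
    between micro-segments and individuals."""

    def __init__(self) -> None:
        self._members: dict[str, set[str]] = {}  # segment id -> handles
        self._opted_out: set[str] = set()        # "stop sending me those ads"

    def deliver_ad(self, segment_id: str, creative: str) -> int:
        """The advertiser sees only the segment id and a delivery count;
        the individual delivery handles never leave the broker."""
        recipients = self._members.get(segment_id, set()) - self._opted_out
        for handle in recipients:
            self._send(handle, creative)
        return len(recipients)

    def opt_out(self, handle: str) -> None:
        """Honour a data subject's 'stop sending me those ads'."""
        self._opted_out.add(handle)

    def _send(self, handle: str, creative: str) -> None:
        ...  # placeholder for the actual delivery channel
```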

So you're leveraging well-established GDPR principles of consent in step one. And in step two, again, you're showing that you in fact have the technological controls in place to perform a balancing-of-interests test and to show that the interests of the data controller and third parties are balanced in such a way that the data subjects’ rights are protected, and therefore it's lawful.

And this comes down to a little parenthetical under step two, okay? The profiling, and I hesitate to even use that word because it sometimes has a bad connotation, but this allocation of people to these cohort groups, these micro-segments, would have to be determined not to have a legal effect. That's a huge issue under the GDPR.

And putting people in groups that they have the right to opt out of, but aren't identified as being within, we think accomplishes that goal. The key to this is that the process does not have a legal effect on the individuals, but improves their experience and the ability of companies to reach out to them. And while we are focused on ad tech, the broad-based applicability and power of pseudonymisation as a more effective technical control is not limited to ad tech.

And then the last step is … please, Marty. Yes.
Marty:
So really understanding when profiling is going to have a legal effect on individuals, or something similarly significant, and when it's not, is an incredibly important distinction. It's a distinction that is made in the GDPR, but in practical terms, we're having difficulty proving to those who oversee these markets that there is indeed a difference between profiling where we just gain knowledge and then use that knowledge in a non-impactful way, versus those situations where we actually intend the knowledge to be impactful.

And this whole ability to differentiate profiling that has a legal effect, or something similar, from profiling that does not is an important part of this process, and pseudonymisation is a proof point that what you're doing indeed won't have a legal effect.

Gary?
The 5th Cookie Working Group - Coverage
Gary:
Thank you, Marty. That's absolutely critical, because the 5th Cookie working group is all about focusing on the fact that you can use the GDPR to enhance your ability to serve your customer base, by using the provisions and the protective mechanisms that it respects and rewards. It's all about this balance.

So compliance with the GDPR is not just about avoiding liability. Compliance with the GDPR, we believe, can maximize privacy, security and data utility. And it's embracing these concepts of technical enforcement of controls and we focus on Pseudonymisation as a means of achieving this balance.

So the working group has already had very good coverage. And again, the whole idea here is to get the word out and have people look critically at what pseudonymisation is.
GDPR Pseudonymisation = Technical & Organisational Controls
And again, pseudonymisation is not a silver bullet, a golden shield or a magic wand, and it's different from what you've done in the past. So it's worth a minute or two here, before I pass it on to Sachiko to wrap us up, to talk yet again about what pseudonymisation requires. In essence, you need to be able to show that you have technical controls in place so that the processing of the information value, on the top left, does not reveal the identity of the individual.

And yes, new technical approaches to processing are required to do this, but as I pointed out, there are many resources. I would highly recommend the ENISA guidance from both November 2018 and November 2019; there are links to those at mosaiceffect.com.

So what it's saying is, if you can show that you have these technical controls in here, the brick wall at the top left, so that you can't cross between information value and identity without permission, then it's okay for the data controller to hold the additional information without which the two can't be re-linked, with that re-linking limited to authorized purposes.
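To picture that brick wall, here is one final minimal sketch, again our own illustration with invented field names: at ingestion, each record is split into an information-value side that processing can see and an identity side that stays behind the wall.

```python
import secrets

def split_record(record: dict) -> tuple[dict, dict]:
    """Split a record at the 'brick wall': the value side carries a token
    plus non-identifying attributes for processing; the identity side is
    stored separately, under the controller's exclusive control, and is
    consulted only for authorized purposes. Field names are invented."""
    token = secrets.token_hex(8)
    value_side = {
        "token": token,
        "interests": record["interests"],
        "age_band": record["age_band"],
    }
    identity_side = {token: record["email"]}
    return value_side, identity_side
```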

So that's my view of the definition of Pseudonymisation. But I recommend that you take a look at these resources, ENISA, GDD and others that go into a lot of detail.
GDPR Benefit Highlights: Pseudonymisation
So the last thing I want to do is go back to the point of why most people try to get out of the GDPR via anonymization rather than embrace pseudonymisation. And as I mentioned, anonymization is a dangerous path, because if you don't pull it off, you're exposed, you have liability, and you have not stepped up to your obligation vis-a-vis data subjects and their fundamental rights.

Pseudonymisation, though, is actually recommended, okay? It's a new state of the art, there's no question. But if and when used properly to enforce policies, procedures and accountability, it gives you greater uses of data. So there's another option here, and it has to be considered.

And so, in the context of ad tech, what we are really saying is yes, pseudonymisation is hard to say. It's hard to spell. Is it an S? Is it a Z? It's 16 letters. But it's worth the effort to get to know, because it carries these benefits and it can in fact allow all parties to win.

Again, provided data, inferred data, observational data are very powerful, but that power has to be controlled so that you balance the interest of companies and society and data subjects in a way that makes sense.

And with that, I’d like to hand it over to Sachiko to close us out for another five minutes or so, and then we'll take questions.
Dr. Sachiko Scheuing, European Privacy Officer
Sachiko:
Thanks, Gary. What I actually wanted to do is provide a practitioner's point of view. Of course, in my daily work life, I come in contact with a lot of data protection officers and privacy professionals who are wrestling with the question of how to engage in the ad tech industry in a compliant manner.

However, it is so complex. And unfortunately, the solution to this complexity offered so far by the regulators has been limited to consent, as Gary has previously explained. And, if I refer back to what Gary has stated, that promotes the walled garden solution.

Let me actually just take one step back and think, “Well, what is it that the marketers really, really want to do?” In very rudimentary terms, what they really want to do is show ads. They want to communicate their advertising messages. And they want to show ads to the relevant people. In doing so, improving by 10% the chance of communicating the advertisement to the relevant people will not only save a lot of money, but will also avoid an awful lot of unnecessary irritation for the consumer.

So once again, as Marty said, it is not data of surgical accuracy, collected through perpetual tracking of an individual, that they need. Considering this need, this demand, you can say, “Well, if that's the story, maybe this is the very area where a balance can be struck.” This is how maximizing the value of data to marketers and the protection of personal data can be balanced.

And this is the very reason why Acxiom has become part of the 5th Cookie Initiative. At our company, we have been using pseudonymous data as a means of protecting data for well over two decades, as part of our ethical data approach. And we thought, “Well, the idea is to actually add another layer of data protection through the 5th Cookie. And if we did that for the ad tech industry, maybe this would lead to one of the solutions for creating a sustainable marketing ecosystem.”

I will pass this back over to Gary.
Summary Take-Aways
Gary:
Thank you very much, Sachiko. I appreciate that. So in closing, I just wanted to hit upon some of the key points that we were hoping to convey, okay? The first one is that pseudonymisation is newly defined. It is absolutely imperative that you not assume that what you previously knew as pseudonymisation is what you should use in your evaluation of what is appropriate going forward. It is something that is newly defined at the EU level for the first time, okay?

And when done correctly to implement and enforce reasonable policies, it delivers demonstrable accountability to enable greater data value and utility, while preserving fundamental rights. So again, it's this balancing, okay?

As Marty said, consent does not work in some circumstances. For some data uses, if you rely entirely on consent, one of two things happens: you take away from the precision of the definition of consent under the GDPR, or with a broad-based consent you just put all the risk on the shoulders of the data subject, asking them to agree even though you can't describe to them what you're going to do. That's simply not effective consent, okay?

But technical measures like pseudonymisation, again, work hand in glove with policies to enforce them, okay? That's what helps to balance data innovation and the assurance of rights, okay?

The 5th Cookie, it's a metaphor. It's not a technical solution being delivered to the market. It's a metaphor for what all the other proposals could do as well, which is take a step back and look at the benefits of pseudonymisation as a technical control to help balance the interests, rights and privileges of all the parties involved and to bridge these consent gaps. Because there are consent gaps: situations where you basically cannot describe the processing in sufficient detail in advance, or where there is no true voluntariness.

There's a reason that legitimate interest is one of the six legal bases, but legitimate interest is not just a label that you throw on a process that you want to accomplish. It is a requirement to have technical and organizational safeguards that enforce policies, so you can show you put the safeguards in place. And when you do that, you can balance.

And then, I think Sachiko’s point is an important one. You have to be able to provide consumers with the value that they want and need. And you want to do it in a way that still meets the requirements for the brands. And if you look at this whole topic on a broader basis than just ad tech, there is a societal benefit as well.

The GDPR has within it a path to that. Pseudonymisation is mentioned 15 times for a reason. It's worth your consideration and evaluation, and that's what the 5th Cookie is about. It's about a conversation at the industry level as to how we might do a better job of balancing these interests so that we can achieve what makes sense, which is the objectives of companies and society, but of data subjects and consumers as well. And we believe that embracing the GDPR provides an answer, as opposed to trying to avoid the GDPR.

So with that, Dave, if you want to take us to the questions and answers.
Questions & Answers
Dave:
Fantastic. Thanks, Gary. And thanks Sachiko and Marty for your comments and remarks as well. We are now entering the question and answer portion of the program. So as a reminder to all of you out there listening in the audience, if you'd like to submit a question, you can type them in the field that's just to the right of the PowerPoint window there and we'll tackle as many as we can during the time that we have allotted.

So let's go ahead and get started with Sachiko on this first question, which has to do with one-to-one personalized advertising versus what's known as contextual advertising. The question is: why is personalized advertising necessary when ads can be delivered in a contextual way, that is, based on the website where the ad appears, particular content areas, et cetera?

So Sachiko, would you like to tackle that first question?
Sachiko:
Yes, I can. Very good question. And my immediate response to that is, well, if that's what the marketers want, if that's what the brands want, then personalized advertising is still going to be something that we want to provide them with. It's really a supply and demand story.

However, very generally speaking, compared to contextual advertising, and there are many contextual advertisements that are very creative in their own right, personalized advertising is still perceived as the more effective way of improving the relevance of advertisements to the consumer.
Dave:
Terrific. Anybody else care to comment there?
Gary:
Yeah. So I think that's a great question and it highlights, hopefully this is coming across, that the 5th Cookie working group is not about an either/or, or us versus them. It's about a broader conversation. In those situations where contextual advertising meets the needs of the consumers, the data subjects and the brands, it makes complete sense. But as Sachiko referred to, there are efficiencies to more targeted advertising that benefit both the consumers and the brands, the people paying for the advertising. And so, I think it's probably a combination of both.
Dave:
Excellent. Makes sense. Marty, anything there?
Marty:
Not on that topic.
Dave:
Okay, great. Well Marty, this next one is actually, I think I'm going to start with you. Here's the question. Why is it important to determine if data science used in ad tech has a legal effect on data subjects? You referred to this in your comments. Care to elaborate there?
Marty:
So in the simplest mode, thinking with data is using statistical methodology to figure out what the data tells you, or to actually test hypotheses, depending on the methodologies you're using. From my perspective, that is knowledge discovery, but it comes down to the question behind the data science: what is your general question? What is your motive? What is your reason?

So if your reason in a data science perspective is to determine what is the best market for a particular product and then offer that product to an individual, to a cohort, not to a particularly named individual, but to a cohort, that's really not data science that has legal effect. It's data science whose purpose is to segment markets.

If your purpose, as was the case, or at least suggested, in the last US election, is to define the right messages to discourage individuals from voting, well, voting is a legal right. So that would be data science or profiling that does have a legal effect, because you're taking away a specific right.

So when you're doing the data science, the motive behind the data science is important. It's not the technical processes, it's the motive. And then when you apply those learnings, again, it goes to, what is the effect of applying those learnings?
Dave:
Okay. That's helpful, Marty. And just staying with you for a moment for this next question with regards to the purpose of direct marketing, if the GDPR says that we can use legitimate interest to process data for direct marketing, then what is the problem you're trying to solve? Care to comment there?
Marty:
The problem is that the GDPR was designed with a couple of motives. One was to move to a risk-based approach to data protection, and data protection covers the full range of fundamental rights and freedoms, which is a fairly broad range. Another was to assure that the right legal basis for permissioning data was applied, and that the legal basis be effective in protecting individuals when it's applied.

I can tell you, from my conversations with regulators, that right now we have a trust deficit with regulators as it relates to legal bases other than consent. And part of what we need to do is demonstrate to regulators that there is process and theory to assure that when organizations are using legitimate interest, they've done a balancing process that is empowering for individuals as well as for the organizations.

So part of why we have a question in play is that there is a trust deficit among regulators as it relates to legal bases other than consent.
Dave:
Excellent. Thanks, Marty. And …
Gary:
If I could just jump in on that, because we do a lot of work with regulators and legislators, people that were involved in the working group for the GDPR. And quite candidly, there's a good deal of surprise on their part that people haven't picked up on Pseudonymisation. You don't mention the term 15 times by accident. But the term by itself doesn't cause magic to occur.

So legitimate interest is in fact, specifically identified in the GDPR as being applicable to direct marketing, digital marketing, but it's not just the term. You have to satisfy that term and it's the intermarrying of Pseudonymisation and legitimate interest processing that enables this to occur because as Marty said, you need to address this trust deficit with the regulators who have seen too many companies merely claim a legitimate interest without being able to show the technical and organizational safeguards they put in place to satisfy its requirements.

And so, that's really what this whole conversation is about. It's really what the 5th Cookie working group is about. It's, let's elevate this conversation as to how technical controls can enable us to address this trust deficit so regulators have greater confidence in data controllers and processors, data subjects and consumers have greater confidence in what's being done and not only businesses benefit, but society benefits.

There's a lot of benefits from data processing, a lot of innovation that can come from AI, machine learning, other processes, but it has to be balanced against the rights of the individuals and the expectations of the regulators and legislators.

And so, I truly underscore what you said, Marty. This is all about addressing the trust deficit that we currently have with regulators.
Marty:
So I can tell you, I've had direct conversations with numerous European regulators and, recently, with Canadian regulators, who have a different set of laws to apply. And what I'm being told by them is that they have literally had no company come in and demonstrate a methodology for legitimate interest that shows them the company actually understands what that balancing process is.

So the regulators are defaulting to consent because they haven't seen from organizations the knowledge that they understand how to apply legitimate interests.
Gary:
And there is guidance out there: the two ENISA reports on pseudonymisation, the GDD draft code of conduct. They show how this can be done. It's not how it's been done before. That was the point of the picture of the goldfish jumping from bowl to bowl, okay? That's not normal activity for fish, but if you show them how to do it and you give them the tools to do it, they can do it.

And so, this is absolutely critical, I believe, which is, it's the intermarrying of the technical controls to enforce the policies and the procedures to show accountable processing that regulators can then say, “Okay, I understand what you've done. I understand how you did it. And now, you have the right to do what you're about.”
Dave:
Excellent. Thanks so much.
Sachiko:
Hey, Gary. I want to just add something to what you said about the GDD code of conduct. This GDD code of conduct on pseudonymous data is actually sponsored by the German Federal Ministry of the Interior. And it was presented at the so-called Digital Summit, an event hosted by Angela Merkel about how Germany can grow in the digital economy. They really do bank on accountable organizations using protective measures such as pseudonymous data.

So it is not only the regulators; at the governmental level as well, there is a strong interest in applying pseudonymous data, along with, of course, carrying out appropriate legitimate interest assessments.
Gary:
Thank you for that, Sachiko. And I think a key point here, at least as I see it, is that it's not an either/or decision. There is a way to balance the interests. There's a path and a tool and a means that the GDPR talks about, and it doesn't only talk about pseudonymisation, I'm not saying that. But it reflects the idea that technologies can be used not just to track and serve ads to individuals, but also to reduce the risk to them, and to do it in a way that can still be targeted to their individual interests, characteristics and behavior, but not in a way that is inherently identifying. And I think that's what we're talking about here, right?

So countries, regions, the globe benefits, if and when we pull this off correctly. So thank you for pointing out, it’s not just the regulators.
Marty:
So to that point, the Canadian government which is in the midst of thinking about how to revise their private sector privacy law, has asked the IAF to think about how you can process data for advanced uses in a trustworthy fashion with means other than consent.
Dave:
Fantastic. Well, thank you, Marty and Sachiko and Gary. That brings us to the end of our question and answer period here. There were a couple of questions that we didn't get a chance to get to. But I can tell you that we'll do our best to export those and try to provide some answers to them either on the IAPP website or the Anonos site.

And as you probably noticed, the email addresses for all of us are available there on the screen in front of you, or if you're watching this as a live presentation, you can go ahead and click those. If you didn't get your question answered, you might fire it off to one of the folks on the panel.
Web Conference Participant Feedback Survey
So with that, and before you drop off the line, I would like everybody to please, if you can, take literally just two minutes and hop in and take a quick survey for us. We here at the IAPP really value your input. Very much appreciate hearing whether this webinar met your needs and your expectations. And importantly, there's a field in that survey that allows you to tell us what topics you'd like to hear about on upcoming programs.
Thank You!
So with that, I want to say a very quick thank you to Anonos for making this program available to the IAPP membership and beyond, and to thank all of you on the panel: Sachiko and Marty and Gary. And Gary in particular, thanks to Anonos for making it possible to bring this to everybody.
Gary:
Absolutely, Dave. It's our pleasure and please, if anyone's interested, send me an email. We actually have significant documentation and published papers on Pseudonymisation. I'd be happy to share it with anyone that's interested.
Attention IAPP Certified Privacy Professionals
Dave:
Fantastic. Thanks, Gary. And as I mentioned at the beginning of the program, if you're an IAPP-certified privacy professional, you have your certification with us and you registered through our website, you're going to be automatically granted one CPE credit. So you don't have to do anything; we'll just update your certification records.

And if you're watching, for instance, a recording and you didn't actually register through the website, which is fine, you can get that CPE credit by going to the Certification tab on the IAPP website and there's a very easy-to-fill-out form and you can submit for that CPE and we'll be happy to update your records.

And if you're an attorney and you're wondering about continuing legal education credits, we do not pre-certify these programs, but they are often eligible depending on your jurisdiction. So you'll need to apply to your particular bar for that.
For questions on this or other IAPP Web Conferences or recordings or to obtain a copy of the slide presentation please contact
And if you need supporting materials, feel free to contact me directly and I'll do what I can to provide those for you: a copy of the slides, et cetera.

And by the way, as I mentioned earlier, if you are listening to this program live now and you want to hear the recording, we'll have that posted in the MYIAPP section of the website. Within about 48 hours, you can go back and listen to the full program, including the Q&A section.

And there's a live link just below the recording window where you can click on that and download a PDF of the slides. So if you're interested in the slide materials, you can get them that way. Feel free to reach out to me with anything else you might wonder about.

And one last time, thanks everybody for joining us today. Hope you enjoyed the program and hope to see you on another IAPP privacy education web conference soon.

And with that, I will take us to a program close.