Saving Direct Marketing in the Post-Pandemic Economic Recovery:
The Role of Robust Pseudonymisation Controls Under the GDPR

Presentation Transcript
Dave Cohen
Knowledge Manager
IAPP
Marc M. Groman
Principal, Groman Consulting Group LLC; Adjunct Professor
Georgetown University Law Center
Christopher Docksey
Honorary Director General
EDPS
Gary LaFever
CEO & General Counsel
Anonos
Summary Slide from Webinar
Dave Cohen (IAPP)
[00:03] Welcome to the IAPP web conference “Saving Direct Marketing in the Post-Pandemic Economic Recovery” brought to you today by Anonos. My name is Dave Cohen. I'm the IAPP Knowledge Manager, and I'll be your host for today's program. Now, onto our program and I would like to introduce today's panelists.
[00:21] Marc Groman is the Principal of Groman Consulting Group LLC, an adjunct professor at Georgetown University Law Center. Marc, welcome to the webinar. Can you tell us a little bit about your professional background?
Marc M. Groman (Groman Consulting Group LLC)
[00:29] Thank you, Dave. It's a pleasure to be here. I served as the President and CEO of the Network Advertising Initiative, a US-based self-regulatory organization for digital advertising and I left that organization to join the Obama White House where I served as the Senior Advisor for Privacy for the second term of the Obama administration. And since then, I am doing consulting, teaching, and I also serve as an advisor to the US Department of Commerce’s National Institute of Standards and Technology on privacy and cybersecurity.
Dave Cohen (IAPP)
[01:05] Excellent. Thanks, Marc. Welcome to the panel today. And joining Marc is Christopher Docksey. He’s the Honorary Director General at the EDPS. Chris, it’s a real honor to have you with us on the program today. Thank you for joining us from overseas. Can you tell us a bit about your professional background?
Christopher Docksey (EDPS)
[01:22] Hi Dave. I used to be the Director of the Office of the EDPS, and they kind of gave me the title of Honorary Director General. I'm now a member of the Guernsey Data Protection Authority and the Board of the European Centre on Privacy and Cybersecurity at Maastricht University and I’m one of the editors of the OUP Commentary on the GDPR.
Dave Cohen (IAPP)
[01:50] Excellent. Thank you, Chris. And to round out our panel today, Gary LaFever is the CEO and General Counsel at Anonos. Gary, can you tell us about Anonos and your role?
Gary LaFever (Anonos)
[01:59] Thank you, Dave. Yes, I serve in a dual capacity as CEO and General Counsel of Anonos, which is a company with 20 years’ experience in research and development on reconciling conflicts between data innovation and data protection by leveraging technology to accelerate speed to insight, lawfully and ethically. We’ve spent the last 8 years operationalizing functional separation to achieve business objectives. And prior to Anonos, I was a partner at the international law firm of Hogan Lovells and a consultant at Accenture.
Dave Cohen (IAPP)
[02:36] Terrific. Thanks, Gary. And with that, we'll turn it over to you to begin the program. Gary, the stage is yours.
Gary LaFever (Anonos)
[02:42] Thank you very much. With this first slide, I'd like to introduce the concept of “Rethink or Become Extinct” in the context of our current pandemic conditions. The term communicates the need for transformative change: enabling digitization and advanced data processing capabilities by evolving beyond traditional approaches to privacy, in order to support new requirements for innovation and insight in a post-pandemic world. But we need to be smart about these changes. This slide highlights that even before COVID-19 increased the pace of digitization, roughly two-thirds of projects failed, and you do not want to be this company.
[03:37] The overwhelming increase in people working from home and purchasing goods online has dramatically accelerated the transition to a digital world. To survive and thrive, companies must have better data-driven insights about their customers, as well as supply chain partners, to anticipate and quickly react to changing behaviors. Just this week, there was an article in The Verge reporting that the algorithms Walmart uses are not working for its supply chain partners during the pandemic. This highlights the need for current and effective data to stay on top of and manage the situation.
[04:20] McKinsey reports that timeframes for digitization are being compressed as a result of the pandemic. And when I say “compressed,” I mean from years to months. The companies that can efficiently collect and analyze data to target the places where consumers are now gathering, purchasing, and actively engaging, and that provide those consumers offers that communicate accountability and transparency, will be the winners in the post-pandemic world of digitally savvy consumers.
[05:00] The current pandemic crisis serves as a launchpad for this increasing digitization and direct-to-consumer commerce. One need only look at the extraordinary adoption rates for digital payment apps, driven by consumers unable to visit brick-and-mortar locations, as proof of this radical shift. But, and here's the irony, the ongoing availability of the electronic direct marketing and supporting AdTech necessary to satisfy this customer demand is at risk.

[05:41] This slide highlights the dire situation in which brands and publishers currently find themselves. They're represented by the T-Rex balancing precariously atop the melting iceberg. Brands and publishers want nothing more right now than to help consumers more quickly find relevant products that meet their specific needs in this time of crisis. However, on one side, they're surrounded by regulators, the Court of Justice, and non-governmental organizations that are rightfully concerned about privacy issues and candidly see no point in being flexible if the industry remains adamantly non-compliant.

[06:28] And on the other side, they're supported by an AdTech ecosystem that simply refuses to change. An example is the ongoing reliance on TCF 2.0 notwithstanding its dependence on third-party cookies, which are soon to be abolished, as well as other noted shortcomings. But this concept of “Rethink or Become Extinct” is not limited to direct marketing and AdTech. In fact, the T-Rex could be your company in today's uncertain situation, with customers on one side and competitors on the other. Whether or not it's through the functional separation approach covered in this webinar, all stakeholders across multiple industries must rethink their current situation to resolve these conflicts and meet the increasing demand for lawful and ethical digitization. And with that, I'll turn it over to Marc Groman.
Marc M. Groman (Groman Consulting Group LLC)
[07:31] Gary, I thought that was a great introduction. I'm going to pivot to a slightly more optimistic tone. I don't know that I would go quite as far as extinction. But I do think that in the current environment, we are definitely facing some significant challenges. And certainly in this AdTech space and the AdTech ecosystem, the challenges are even greater in the current environment. But I want to say that I am optimistic and I have seen AdTech pivot. I've seen companies being nimble and evolve as challenges arise. And I'm optimistic that AdTech and personalization will continue not just here in the US, but in the EU as well under GDPR and the privacy framework in the EU. And so, I’m excited to talk about that.

[08:19] The first point on my slide, though, and it seems silly and I don't know if this is a discussion across the world, but certainly there has been this buzz here in the US that in this post-COVID world privacy is dead. I've seen lots of articles and speculation about that, and I just want to push back and say that is definitely not true, and I could give you many anecdotes. But it's interesting and I want to keep it current. In today's Washington Post, which is our primary newspaper here in Washington, DC, there was a piece on this exact topic. A poll was conducted in the United States by the Washington Post and the University of Maryland about whether Americans will use the voluntary COVID contact tracing apps. Interestingly, only 40% of the cell phone owners polled said they would, and one of the biggest issues that popped up was that the Americans who were polled don't trust the technology companies. I thought that was a really interesting point, worth highlighting for many reasons.

[09:30] One, privacy is not dead. And two, it does indicate that we, the industry, have work to do, and we've got to take that seriously. I love this: in today's paper, one of the individuals who responded to the poll actually said, “I don't feel like the companies have a good track record of taking care of people's privacy and data. I'm not going to take the time and give them more data.” So, privacy is not dead, but everything is in flux. And that's part of the theme of this presentation. Certainly, if you're in AdTech or marketing and many other areas, we're experiencing the impact of the economic downturn, the reduction in advertising spend. Everyone's feeling it, although let's be honest, everyone's not feeling it equally, and there are issues around that. Beyond that, and the reference Gary made to the pace of digitization, the pace of technological change is also a factor for AdTech. And so, we're facing issues around cookies and changes in browsers and changes in operating systems and new business models.

[10:35] So, AdTech is grappling with all of that change. At the same time, it's dealing with the GDPR, evolving regulations and changes in laws both in the US and EU, and the impact of the pandemic. So, the challenge now in this environment is to take stock and then build a long-term strategy. But let's be fair, and I really do want to be fair and say that for some entities survival right now is the priority. Business decisions need to be made; I'm working with clients on that. But once we get to that longer-term approach, we want to have risk management, because I have found that when technology and new practices are instituted in a crisis mode, security and privacy are often sacrificed. We don't want that to happen. We understand it, but we want to mitigate it.
[11:27] So, turning to the issues we're addressing today: the relationship between the GDPR and marketing, innovation, and competition. From my perspective, it's a challenging environment. But personalized marketing and advertising, like privacy, is not dead in the EU, and there are a number of issues companies are working through to address the demands of compliance. Among them, we’ll talk about consent, legitimate interest, and processing. But before I get to that, I do want to note the issue of competition. In this discussion about the future of marketing and extinction and what the world looks like after COVID-19, I'm hoping the world isn't only five companies, and that is a very difficult issue in the world of AdTech and privacy. As we move forward, it is my hope that policymakers, regulators, and others take into account that I, at least, don't want a world that has only five companies, four or five of which would be American. I don't want that to be what the Internet looks like, and I hope others share that concern.

[12:46] So, how does AdTech move forward? Under the GDPR, I think it can, and I'm optimistic about that. But there are issues to work through, and the industry and AdTech companies need to step up, meet their obligations, and work with regulators in order to make it work in the EU framework and globally, frankly. Consent: does it work? Will it work? I know that under the GDPR it is certainly more challenging, and we've seen cases about that. But it's evolving, and the onus is on the AdTech industry to make sure that consent mechanisms address all of the GDPR's requirements around specific, informed, and explicit consent. That will require modifications to consent mechanisms, to the way information is presented and its timing, and to ensuring that the controller and third parties are actually identified. It can be done, I think, but it's challenging, and we'll see how it applies to RTB (Real-Time Bidding) and AdTech and whether it's something that can work.

[13:57] But the industry can't try to fake it. There's been a perception by regulators, sometimes justified, that the industry is not willing to do what it takes to actually meet all of its obligations under the GDPR, and we'll need to move beyond that. The second most talked-about basis for lawful processing under the GDPR is legitimate interest, and that's where I know Chris is passionate. There's a lot of misunderstanding, I would say and concede, in the US about how to apply legitimate interest to advertising. It's a balancing test, and I think that in many cases AdTech and other industries haven't done a sufficient job at the balancing test and at demonstrating to regulators that we understand there's a balancing test, and that we take data subject rights and interests into account as we decide that we have a legitimate interest. So, there's work to be done there as well. There are also other issues with respect to whether it's profiling or automated processing, and I think most of us would agree that most targeted advertising meets that definition, which triggers another analysis around whether there are legal or similarly significant effects from that.

[15:21] And so, those multiple steps require more work than many companies are accustomed to: rigorous analysis, and being able to prove and be accountable for compliance. I'm optimistic that work can be done globally, with companies from the US, EU, and elsewhere, to modify practices and consent mechanisms, to do that kind of balancing, and to demonstrate accountable business practices for advertising, including personalized advertising, such that it can go forward in a way that achieves the business interest, is good for consumers and the marketplace, and respects and protects the fundamental rights to privacy and data protection, which are so key and essential in the EU and many parts of the world. And that leads me to my point about accountability.

[16:16] This part is missing in too many entities, and it's fundamental to the GDPR and is going to be more and more integrated into US law: the idea that you don't just get consent and stop. You do the balancing, and you have policies that show you can actually engage in that balancing, evaluate risk, mitigate risk, be accountable for remaining risks, and prove and audit it - all of which is important. And I think that as we go into the future, new technologies will be available to help support that. I have been doing privacy for a very long time, and I've been looking for new ways to make points and illustrate issues as I teach at Georgetown University Law School and elsewhere.
[17:04] So, again, my point being that privacy is not dead. It is an issue that we are going to continue to address, and there is a raging debate here in the US around how technology companies like Google and others will, in fact, implement contact tracing around the world. To me, that doesn't show that privacy is dead. On the contrary, I am rather impressed by the extent to which the privacy issue has risen to the surface, even here in the US, and maybe that surprises some of our listeners from around the world, but it has really generated a huge debate, and that Washington Post poll I referenced certainly bore that out.
[18:52] That’s a great one to close on, I think, and allow Chris to talk: this is a global issue. Of course, it's a global issue. We're seeing trends across the globe around tension in trade. Certainly, we're experiencing it here in the US between the US and China, and that impacts tech and business. There are occasionally tensions or misunderstandings on the two sides of the Atlantic around the GDPR, and that doesn't benefit anyone. It is very difficult to keep data within borders. There is this new effort of data localization, where the data has to stay in the country. I think global companies will find that a challenging environment, and it won't necessarily provide benefits while placing burdens on companies.
[19:45] So, I was asked to do a blog post with a gentleman who used to be the CPO of Microsoft, to rather bluntly try to address some of the issues we're talking about and what we've been observing. I have a very strong view about how US privacy law ought to develop, and based on what I've seen in the past, I am very skeptical of consent. A core element of this is that I don't think we, meaning policymakers in any country, should place too heavy an emphasis on consent, because it shifts the burden to consumers, and I would argue that it is impossible for consumers to understand the choices we're being asked to make at dozens and dozens of websites, apps, and technologies in our homes and elsewhere. Therefore, we need to look for a new way forward. We also need to recognize that the industry is going to have to step up and invest resources. That will be true regardless of whose legal framework or approach applies. Companies are going to have to invest resources in privacy and data protection if it's going to be meaningful.

[20:56] No, there are no cheap seats in this game. We concluded with thoughts about risk-based accountability and ensuring companies have the right mechanisms in place to implement requirements, whether under the GDPR and its regulations or in the US. Accountability ought to allow a company, in AdTech or otherwise, to pivot and map its data practices to laws globally. My clients are mostly global, and my advice is generally: develop one global, continuous, strategic privacy program, and try not to map every business model to every law, but start with a foundation of good practices and accountability. If you do that, the other things start to fall into place more easily. It's not easy, but it makes more sense and it's a better approach. And with that, I'm going to stop. I'm really eager and excited for a lot of questions and hopefully a rigorous debate and some fun with my colleague, Chris, in Brussels.
Christopher Docksey (EDPS)
[21:58] Hi there. Thanks a lot, Marc. Before I go into my spiel, I'd like to say I really agree with what you were saying, especially your blog on “Demonstrable Accountability.” I recommend everybody read it.
[22:13] Hello, everybody. I'm on this panel today because I want to give you a heads up about something important that is happening in the enforcement of EU law at the moment, and it answers the question in this great cartoon from Marc: “Do we need to comply with the GDPR?” And this is really quite personal for me. I've been working on the EU Charter of Fundamental Rights since it was given constitutional status a decade ago. I've been working on the GDPR, and the Data Protection Directive before that, and the ePrivacy Directive over the same period, and I've been tracking the case law of the Court of Justice of the European Union for almost 20 years now - first as a litigator before the Court, then as a regulator, and now as a teacher. And they're all coming together. I can see a tipping point developing in a way that means it's a good time to come into compliance.
[23:24] In a nutshell, the GDPR and the Court of Justice have brought about a step change in the application and enforcement of the law. Three things are happening. First, a significant increase in regulators’ enforcement powers under the GDPR, which means new powers to suspend or ban unlawful processing and to impose heavy fines. Second, there's a new right of non-governmental organizations to bring representative complaints under the GDPR, which is giving them the power to drive the enforcement agenda. And third, the rapidly expanding case law of the Court of Justice and the national courts in the 30 European states of the EU and the EEA has really revolutionized the legal landscape.
[24:24] I've singled out four landmark rulings: the ruling in Google Spain in 2014 on the right to be forgotten, and three more recent rulings in 2018 and 2019 on profiling, transparency, and consent - Wirtschaftsakademie, Fashion ID, and Planet 49. The famous right to be forgotten ruling in Google Spain hardly needs any introduction, but it was revolutionary at the time. Contrary to expectation, the Court found that Google was liable in the EU for processing data in the US that was originally published in a newspaper in Spain.
[25:05] The Court made its agenda absolutely clear: to ensure effective and complete protection of the fundamental rights and freedoms of natural persons, and in particular their right to privacy with respect to the processing of personal data. So, this was the first ruling in what became known as the accountability case law of the Court of Justice. The principle of accountability itself is set out in Articles 5(2) and 24 of the GDPR. And what I find really interesting is that in 2018, in a speech to world privacy regulators, Koen Lenaerts, the President of the Court of Justice, described the principle of accountability as the central theme of the GDPR. And he stressed how the principle of accountability is underpinned by the Court in the case law, which demands high levels of accountability.
[26:07] The second accountability ruling was Wirtschaftsakademie. In this case, a German business school used a Facebook fan page to advertise its services. As a result, Facebook was able to place cookies on visitors’ devices, and these cookies allowed the business school, Facebook, Facebook’s partners, and third parties to track the internet users visiting the fan page, to profile them, and thus to offer them more relevant content. The problem: there was a complete lack of transparency by both the business school and Facebook about this processing of the personal data of visitors to the fan page. So, the local Data Protection Authority ordered the business school to deactivate the fan page. The Court of Justice held that the business school was a controller, along with Facebook as a joint controller, and confirmed that the regulator therefore had the right to order the business school to deactivate the fan page. To be clear, they were ordered to stop the monitoring and profiling.

[27:19] In Fashion ID, there was an online fashion store that didn’t have a fan page. It had its own website, but it placed a social media plugin, a Facebook “Like” button, on that website. Visiting a website containing this plugin immediately triggered monitoring and tracking by Facebook in the same way as visiting a fan page. The Advocate General, Mr. Bobek, said the case was essentially about the collection and transmission of personal data for the purpose of advertising optimization. By the way, and this is a complete parenthesis, Mr. Bobek is the only member of the Court of Justice, to my knowledge, to have used the opening words of Star Trek as part of his legal reasoning. So, for all Star Trek fans, I think we have to agree that he has to be taken very seriously in his analysis. In any event, the Court applied the same approach as in Wirtschaftsakademie: whether you're on the web using a fan page, a “Like” button, or even a pixel, the result will be the same - responsibility and accountability.

[28:29] Now, President Lenaerts, in his 2018 speech, told us that Planet 49, which had not yet been decided, was going to be a really important case. And when it came, we found that the Court of Justice laid down a very strict approach to consent: neither bundled consent nor implied consent is legally acceptable. In addition, and I think I've underlined this, the Court held that transparency under the GDPR requires a website operator to provide information on the duration of cookies and whether third parties may have access to those cookies. The Advocate General, Mr. Szpunar, added in this case: “If third parties have access to these cookies, their identity must be disclosed.” So, this accountability case law intimately concerns the AdTech industry as of now. And there are two more things happening which you should know about.
[29:33] First, the case law is rocketing. There was a slow start between 1998 and 2010, but since then there have been 53 rulings over the last 10 years. Don't forget that the Court of Justice is a general court, like the US Supreme Court, so this is a lot of cases in one area. At the moment, in April 2020, there are 13 more cases pending, 17% of the total, many of which are likely to be decided this year. The second point is that regulators don't have a free hand. On the one hand, they have the Court of Justice and the rocketing case law, and the case law is not just about substance; it's also about procedure. In fact, in the Schrems ruling that invalidated the Safe Harbor, the Court told us regulators have to handle complaints with due diligence, and that means care, and that means time in preparing cases.
[30:40] And on the other hand, the regulators are being targeted by NGO complaints. Article 80 of the GDPR empowers NGOs to bring collective complaints, which are our local version of class actions. This has triggered an explosion of complaints to regulators, and these complaints have an agenda: social platforms and AdTech are squarely in their sights. On the day the GDPR entered into force in May 2018, NOYB (None of Your Business) lodged complaints against Google, Instagram, WhatsApp, and Facebook with regulators in Austria, Belgium, France, and Germany. As a result, in January 2019, the French regulator, the CNIL, imposed the first and so far the heaviest GDPR fine, 50 million euros, on Google.

[31:40] In November 2018, Privacy International lodged another series of complaints with the regulators in France, Ireland, and the UK against seven data brokers - credit reference data brokers and AdTech data brokers - alleging typical elements of private surveillance: lack of transparency, lack of adequate consent, lack of a legal basis for the processing, and unlawful profiling. It's interesting to outline what has happened since these complaints and cases. Why is the CNIL fine the only massive fine imposed so far? I mean, if you saw that New York Times article the other day, you’d assume that nothing is happening over here. But when Wirtschaftsakademie and Fashion ID came out in 2018 and 2019, commentators said that data protection regulators were likely to give the industry time to adapt rather than start imposing fines immediately. And indeed, the Information Commissioner’s Office (ICO) in the UK said in its 2018 update report: “We intend to provide market participants with an appropriate period of time to adjust their practices. After this period, we expect controllers and market participants to address that concern.” But now it’s 2020, and the ICO, as reported, has consulted. Once it brings out its code of practice, the next stage will be enforcement, whether fines, blocking orders, or bans. So, the ICO is doing due diligence; the code of practice is a step in the enforcement process.
[33:32] So, what I really want to share with you today is that these significant developments in the case law, the legislation, and the complaints are coming together. We're approaching a tipping point, and we can expect aggressive enforcement of EU data protection law. So, it's a question of choice for the AdTech industry: start moving towards accountability now - the Court and the regulators will take that into account, and you can dialogue with them on innovative solutions - or carry on as before and face the consequences. One of those consequences is that the case law is crystal clear: there is no excuse not to come into compliance. Accountability means taking your role seriously, and failure to do so will now be seen as intentional behavior, not negligence, which triggers stricter remedies and higher fines under the GDPR. And, of course, in the absence of dialogue and trust with the regulators, they will simply impose strict, inflexible solutions. And with that, Gary, I'm going to hand it over to you.
Gary LaFever (Anonos)
[34:50] Thank you, Chris. I'm going to cover how, by doing things differently and leveraging functional separation, you can enable the accountability desired by the Court and compliant legitimate interests processing, so that parties can actually achieve their desired business results.
[35:12] I love this cartoon by Marc Groman because it highlights the importance of the accuracy of defined terms under the law. While many were very surprised to discover that IP addresses are in fact protected as personal data under the GDPR, many more might be surprised to learn that IP addresses do not satisfy GDPR requirements for Pseudonymisation if parties are able to re-link the identity of a data subject to their IP address without requiring access to additional information that is kept separately by the data controller.
[35:47] And here I'd like to emphasize a couple of facts about Pseudonymisation. Pseudonymisation is newly defined under the GDPR, so the term as you knew it in the past is not the term under the law, and that's why knowing the proper defined term is so critical. Traditional tokenization is not compliant Pseudonymisation under the GDPR, and it's untrue that failed anonymisation equals Pseudonymisation. In fact, I believe it's appropriate to view the definition of Pseudonymisation under Article 4(5) as requiring a much higher standard than anonymisation under Recital 26. If you want more information on the differences between anonymisation and Pseudonymisation, it's available at www.MosaicEffect.com; the Mosaic Effect is one of the biggest shortcomings of anonymisation.

[36:45] In essence, under the GDPR, it cannot be possible - and importantly, there is no qualification of reasonableness here - to re-link the information value related to a data subject back to that data subject’s identity without access to additional information that is kept separately by the data controller. And again, specific details regarding the requirements for GDPR Pseudonymisation have been produced by ENISA, the European cybersecurity agency, and you can see more information about that at www.ENISAGuidelines.com.
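The separation requirement described above can be sketched in code. The following is a minimal illustration only - not Anonos's product, and not ENISA's full recommended scheme: a keyed hash (HMAC) replaces the identifier, and the secret key plays the role of the "additional information" the controller must hold separately. All names here are hypothetical, and note that a static key still allows linking records pseudonymised under the same key; ENISA's guidance discusses stronger, dynamic pseudonymisation policies.

```python
import hashlib
import hmac
import secrets

class Pseudonymiser:
    """Minimal sketch of keyed pseudonymisation (illustrative, not a compliance tool)."""

    def __init__(self, key=None):
        # The key is the separately held "additional information":
        # without it, pseudonyms cannot be re-linked to identities.
        self.key = key if key is not None else secrets.token_bytes(32)

    def pseudonymise(self, identifier: str) -> str:
        # Keyed hash (HMAC-SHA256): deterministic under one key,
        # computationally infeasible to reverse without the key.
        return hmac.new(self.key, identifier.encode(), hashlib.sha256).hexdigest()

# The controller keeps the key apart from the dataset it protects.
record = {"ip": "203.0.113.7", "segment": "outdoor-gear"}
p = Pseudonymiser()
protected = {**record, "ip": p.pseudonymise(record["ip"])}
```

A party holding only `protected` sees the information value (the segment) but cannot recover the IP address without the separately held key, which is the functional separation the GDPR definition demands.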
[37:25] So, why should you care about the requirements for Pseudonymisation under the GDPR? Well, the reality is the GDPR provides express statutory benefits. These are not loopholes and not workarounds; they are express statutory benefits baked into the GDPR, and you can see them cited here on the right, available if you can satisfy the requirements for GDPR Pseudonymisation. I want to highlight two that are very relevant for AdTech and direct marketing: the first one tips the balance in favor of processing by the data controller, and the last one, the eighth, is the ability to lawfully share and combine data. Both are critical to AdTech and direct marketing. Again, you can get more information at www.Pseudonymisation.com.
[38:17] GDPR-compliant Pseudonymisation is a critical part of a functional separation approach that resolves conflicts between innovative data use and data protection, thereby enabling the successful global digitization we need in the post-pandemic economy. This is possible because functional separation overcomes the limitations of traditional approaches to data protection and privacy, which are decades old and do not support modern data use.
[38:51] Back in 2015, an EDPS report on big data highlighted the potential for functional separation to help resolve these conflicts between innovative data use and data protection. The EDPS noted at the time that few organizations had experience in the area, but hoped that others would develop further expertise.
[39:13] When it comes to personalized electronic marketing, it has evolved over the years from an initial broad-based segmentation, to more efficient but less privacy-respectful broad-based internet-enabled marketing, to the ultimately very efficient, but not very privacy-respectful, cookie-enabled identity resolution profiling. And this leads us to our discussion about how GDPR Pseudonymisation can enable functional separation to support digitization strategies using dynamically generated micro-segments that we call mSegs.
[39:50] The best way to think about mSegs is as look-alike audiences that are small enough to represent the distinct behavior, attributes, characteristics, and even location necessary to achieve the business objectives of direct marketing, AdTech, and other data uses, but large enough not to enable inference, singling out, or linking to the identities of data subjects. So, in essence, small enough in size to have high relevance, but with less sensitive information so as not to reveal identifying information.
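One way to picture the "large enough" side of that trade-off is a k-anonymity-style minimum segment size. The sketch below is purely illustrative - a hedged assumption about the general technique, not Anonos's actual mSeg implementation; the threshold and attribute names are invented:

```python
from collections import defaultdict

MIN_SEGMENT_SIZE = 5  # hypothetical threshold: large enough to prevent singling out

def build_msegs(records, keys):
    """Group records into micro-segments on coarse attributes, then drop
    any segment too small to hide its members (a k-anonymity-style rule)."""
    segments = defaultdict(list)
    for r in records:
        segments[tuple(r[k] for k in keys)].append(r)
    return {seg: rows for seg, rows in segments.items()
            if len(rows) >= MIN_SEGMENT_SIZE}

records = [{"region": "NL", "interest": "tennis"} for _ in range(6)]
records += [{"region": "NL", "interest": "chess"}]  # a segment of one
msegs = build_msegs(records, keys=("region", "interest"))

assert ("NL", "tennis") in msegs     # relevant, but not identifying
assert ("NL", "chess") not in msegs  # too small: would single someone out
```

The design intuition matches the talk: segments keep enough behavioral relevance to be useful for targeting, while the size floor prevents inference or singling out of any individual member.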
[40:24] This slide highlights some of the business benefits of functional separation and related mSegs. I'd like to focus on the first three: increased speed of access to lawful analytics, AI, and ML, which, as you can tell, is critical in this post-pandemic economy to be able to react to changing conditions; enabling data sharing and combining between organizations; and expanded lawful high-risk, high-reward data use cases. Again, you can see the correspondence between the express rights under the GDPR for compliant Pseudonymisation and the benefits of functional separation.
[40:59] This slide highlights the shortcomings of traditional approaches to data protection enablement for needed global digitization. And that's because what works in a low risk, controlled environment depicted by the small boat in a “bathtub” simply does not scale to support high risk use out in the open ocean, as it were, of decentralized processing, which is critically necessary for global digitization. And this is because the techniques that were designed and architected to protect data in low risk environments simply don't scale. They become ineffective in high risk, high volume environments like those surrounding advanced analytics, data sharing, and combining.
[41:43] But this shouldn't come as a surprise since these “bathtub” approaches have been around for decades and were developed to protect data uses where you control all aspects of processing - data, data users, uses, and distribution. As a result, they only work for limited, small scale, centralized use cases.
[42:04] In contrast, functional separation enabled mSegs allow you to achieve legitimate business objectives in a more timely and accurate manner, and allow data controllers to unlock the full value of data for processing under decentralized global applications while protecting individual privacy rights and providing transparency and auditability. This is critical to enable ongoing access to the data that you need to achieve your business results and, once you have that access, to process the data in the way that you need to achieve those results. And, more and more, to defend practices that you've engaged in for decades that, due to changes in law, are no longer permissible without improved controls.
[42:52] So, let's look at how functional separation can enable greater privacy-respectful use of data in the case of a membership organization. This example is extracted from the Dutch DPA tennis association case, in which the Dutch DPA imposed a fine of 525,000 euros on KNLTB for improperly selling data about its members.
[43:13] The Dutch DPA found that KNLTB failed to provide adequate technical and organizational controls necessary to support legitimate interest processing. What the Dutch DPA found most troubling was the inadequate consent and unbridled broad grant of access to the entirety of the KNLTB database as evidenced by the two red X's on the slide.
[43:40] We believe the result would be different if KNLTB had secured compliant consent to send personalized ads and provided proper notice of lawful legitimate interest processing using functional separation to deliver privacy respectful advertisements. In this way, data minimization and purpose limitation can be enforced.
[44:04] So, now let's look at the AdTech ecosystem. This is actually taken from the identified shortcomings by the ICO and the ICO proposed code of conduct for direct marketing. The application of functional separation to the AdTech ecosystem has been referred to as the 5th Cookie Initiative. If you want more detail on the 5th Cookie Initiative, you can get that at www.anonos.com/5thcookie/.
[44:30] So, some of the ICO's biggest concerns center around noncompliant bundled consent, as Chris also highlighted in the case law of the Court of Justice, as well as the surveillance of data subjects using third-party cookies and the wanton collection of data that is shared among hundreds if not thousands of participants.
[44:49] But here again, by providing consent at a more detailed level - which we will get to in the next slide - and improved transparency and legitimate interest processing, you can actually address these shortcomings.
[45:02] In this instance, we're showing that, to provide more transparency and detailed specificity as to the type of consent being collected, we're going to divide the data that's collected into three different categories.
[45:15] And those three categories are: Provided Data, Observed Data, and Inferred Data. This gives data subjects more specificity as to the type of data that will be collected, and then puts them on notice that legitimate interest processing is going to be used to actually work with the data, analyze the data, and send offers to the recipients. Chris, on an earlier webinar, you had noted something on these slides. Do you have anything to comment here?
Christopher Docksey (EDPS)
[45:44] Yes, I think this is really one of the most important slides in the deck actually, Gary, because here you’re actually doing the transparency that’s required by the GDPR. And maybe this isn’t the definitive set of information that has to be put over but it gives us a real feel for what you have to put in front of people when you want to profile them basically. This slide I think is crucial for that. Well done.
Gary LaFever (Anonos)
[46:21] Well, thank you, Chris, and I think what it sets out here is what we’re looking at is the theme of “Rethink or Become Extinct.” And Marc may disagree whether it's actually extinction or what the implications may be, but the reality is by showing a willingness to invest and do things differently, the desired business results and the need for digitization on a global basis, actually can still be achieved. So, here you break it into three steps. The first step is you have data collection with compliant consent based on the different types of data that's collected, and you put the data subjects on notice that legitimate interest will be used for the second step of data science and analysis. But that's a type of analysis that because of its nature and the use of Functional Separation and mSegs, will not have a legal or similarly significant effect. And then, the last step of customer engagement will actually be left up to the data subject as to whether or not they avail themselves of the offers that come to them or whether they exercise their right to opt out of receiving additional offers. So, with that, Dave, I think we should take some questions.
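As a hypothetical illustration of that first step, unbundled consent across the three categories - plus notice that legitimate interest covers the analysis step - might be recorded along these lines. The field names and structure are assumptions for illustration, not a prescribed schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """Unbundled consent: one flag per data category, captured separately,
    plus a record that notice of legitimate interest processing was shown."""
    subject_pseudonym: str
    provided_data: bool = False   # data the subject actively supplied
    observed_data: bool = False   # data collected by watching behaviour
    inferred_data: bool = False   # data derived by analysis
    li_notice_shown: bool = True  # notice of legitimate interest processing
    timestamp: str = field(default_factory=lambda:
                           datetime.now(timezone.utc).isoformat())

consent = ConsentRecord("tok_7f3a", provided_data=True, observed_data=True)
assert not consent.inferred_data  # no bundled, all-or-nothing grant
```

Keeping each category as a separate, timestamped flag is what lets a controller later demonstrate that consent was specific rather than bundled.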
Dave Cohen (IAPP)
[47:29] All right. That sounds great, Gary. Thank you, Marc. Thank you, Chris. Thank you, Gary, for that excellent presentation. We do have some time left and we've got some questions here.
Marc M. Groman (Groman Consulting Group LLC)
[47:39] I can't resist the opportunity to have Chris Docksey on a call with me. And so, I wanted to pose a question to Chris, if that's okay since he's here, that I think would be of interest, which is: We both referenced legitimate interest as a mechanism for lawful processing under the GDPR for AdTech. And so, Chris, can you pretend that I’m your client and I’m in AdTech? Now I have to do balancing. I have to balance the legitimate interest of my company in advertising, or personalized advertising, against the rights of the data subject. That's a little alien for a lot of people outside Europe. How would I go about doing that?
Christopher Docksey (EDPS)
[48:23] Oh, boy, thanks for that question, Marc. But I think what I have to do is maybe go through how the Court of Justice does it and then embellish that a bit. You saw that analysis in the Google Spain case, so if anybody wants to read it, they can find it there. And the Court did it in the subsequent accountability cases. There's basically a three-step formula. First, what's the legitimate interest of the controller? The EU Charter includes the freedom to conduct a business among its fundamental rights, and President Lenaerts pointed to this as one of the freedoms guaranteed by the Charter. So, I would plead the freedom to conduct a business as a fundamental right, possibly also the benefits of innovation and competition, which are really in the public interest. As you said, Marc, we don't want to come out of the Coronavirus experience with there being just five companies left. So, first, what's the legitimate interest of the controller? And there is one.

[49:31] Secondly, does this infringe the rights to privacy and data protection? If so, the balance is against the processing. And if you just say freedom to conduct a business per se and you do whatever you want, then that's going to infringe the rights to privacy and data protection. So, thinking about your IAF column, I would ask a number of questions. Was there information about the processing? That's the transparency requirement, which was not respected in Wirtschaftsakademie and Fashion ID. Is it necessary for the specific business purpose, or does it go further? That's the necessity requirement, which is in Planet 49, where they were bundling different things for consent.

[50:15] Could it be achieved in a less intrusive way? That's the proportionality requirement, which was in Google Spain. And this is crucial: are there safeguards which adequately protect the rights, such as risk analysis and mitigation after a data protection impact assessment, or human intervention with regard to automated decision making using profiling that produces legal effects? These are the issues the regulators will also have in mind, because they're worried. And the ICO in its 2019 report said that the scale of the creation and sharing of personal data profiles in RTB appears disproportionate, intrusive, and unfair, particularly when data subjects are unaware that it's taking place.

[51:08] And so, you have those tests, and if you can put in the safeguards and comply with the need to be specific and transparent, then you can move the balance in your favor as an accountable company. And finally, just to complete the analysis, it may be that there is actually a public interest in the data being out there - for example, freedom of the press, which was the subject of another recent ruling in a CNIL case from France. Is that okay, Marc? Does that answer your question?
Marc M. Groman (Groman Consulting Group LLC)
[51:46] It does. I guess the one nugget that I sometimes get stuck on is in the context of advertising or direct marketing, what does “necessary” mean in that balancing?
Christopher Docksey (EDPS)
[52:02] Well, you look at the purpose. So, it would depend which of the actors in the AdTech ecosystem you're looking at. Is this the publisher? Is it the advertiser? Is it the AdTech in the middle? They will all have their own specific business purposes. And once you've identified the business purpose - let's take the easy one, the advertiser: “I want to sell something” - then you have to identify what is necessary for that purpose, as opposed to being extra: I would like it, but it's not actually necessary.
Marc M. Groman (Groman Consulting Group LLC)
[52:49] Thank you, Chris. I appreciate that. Now, let's turn to the other questions from the audience.
Dave Cohen (IAPP)
[52:57] Terrific. Thanks. That was great. And staying with you for a moment here, Chris, can you comment on how much divergence there is now between member states and how you suggest companies navigate those differences across the EU? Does this make accountability more difficult?
Christopher Docksey (EDPS)
[53:14] You've got two questions there, actually: the level of divergence, and whether it makes accountability more difficult. In principle, the GDPR is like an Act of Parliament. It's a regulation. It should be the same law in every member state, but they did allow a surprising amount of divergence between member states for a regulation: over a dozen areas where member states can make their own rules or go further. In fact, some of those I don't think should be there, but they were the price of adopting it. But there are three saving graces for these differences. Firstly, if there is a difference, it's up to national regulators to find solutions that guarantee the free flow of data in the internal market. They cannot block the free flow of data. Second, national rules can't undermine the protections in the GDPR. And as we've seen in the case law, the Court of Justice is in charge of policing both controllers and regulators, and it's insisting on guaranteeing the most effective protection of the rights to privacy and data protection. So, if there's a difference, you take the higher standard.

[54:27] And third, and this is the reply to the question about accountability, I would turn it around and I would say accountability is the solution for the differences. It's not that the difference is causing problems for accountability. It's true they can make compliance more challenging, but an accountable controller has thought in advance about what it needs to do and how it can do that whilst it's respecting the privacy and data protection rules. Accountability is hardwired into the GDPR. It is actually probably the most important innovation in the GDPR. So, a company doesn't need to know the GDPR by heart, or these differences, so long as it has got a privacy professional in place that does know the law or know where to find it. Now, I'll just finish by saying the Article 29 Working Party has asserted that the DPO or the privacy officer is the cornerstone of accountability.
Dave Cohen (IAPP)
[55:32] Terrific. Thanks, Chris. That makes very good sense. And, Gary, I'd like to direct this next question to you, if I may. And it's this: Does all profiling for advertising have similarly significant effects? If yes, how is this a risk-based framework? Certainly, all profiling is not equal, but it appears to be treated that way. How would you respond to that, Gary?
Gary LaFever (Anonos)
[55:55] It's a great question, and there's a lot of confusion on this. If you just read some of the case law or the guidance, it sounds as if all profiling by definition has a legal or similarly significant effect, but in fact that's not true. I just want to make a couple of notes here in case people want to verify it on their own. The Article 29 Working Party guidelines on automated decision making and profiling say: “In many typical cases, the decision to present targeted advertising based on profiling will not have a similarly significant effect on individuals. For example, an advertisement for mainstream online fashion outlets based on a simple demographic profile.” That's one of the reasons why functional separation enabled micro-segments (mSegs), which are targeted to small groups of people and leave it up to the individuals within those groups whether or not to respond, take away the legal or similarly significant effect. Even the ICO report, which is causing a lot of concern, says that automated decision making and profiling can - not necessarily does - have a significant effect on individuals, even in the context of real-time bidding.

[57:09] So I think what's really, really important is that a data controller be able to show that they went through the analysis - the data protection impact assessment, the legitimate interest impact assessment, they've applied principles of accountability and proportionality, and they have technology and controls in place that support the policies coming from that analysis to show that they've sufficiently mitigated the risk to the individuals. That's what it's about. You have to show demonstrable accountability and technical safeguards that enforce your policies and procedures to mitigate the risks to the data subjects. That's when the balancing of interest tests can be won by the data controller so they can continue to process the data. And in that instance, profiling should not and I believe does not involve a legal or similar effect.
Dave Cohen (IAPP)
[57:59] Okay. Thanks, Gary. And speaking of demonstrable accountability, there is a good follow-on here and I think we'll address this one to Marc. Why is demonstrable accountability an acceptable solution? Isn’t that a soft option? Marc?
Marc M. Groman (Groman Consulting Group LLC)
[58:12] Thanks. I'd like to unpack the question, because it suggests that demonstrable accountability is the solution; it depends on the question we're posing and the problem we're exploring. But I'd like to go back to the answer that Gary just gave, which really laid out where accountability plays a critical and integral role, not only for GDPR compliance, but for any company's global, continuous, and comprehensive risk-based data protection program. And so, we talk about it in the context of the GDPR, where we do analysis under legitimate interests, where we do balancing, where it's baked into the GDPR that we have to show a regulator or a business partner that we have the right processes and procedures in place. That we have data governance; that we are taking steps, whether it's Pseudonymisation or encryption or other technologies or data protection by design - all of that goes into accountability, which is really the foundation underneath the other requirements that get a lot more attention.

[59:22] We always talk about consent or legitimate business interest, but below that there has to be a framework of accountability: to show compliance with the GDPR; to show, at least in the context of the EU, that the rights and interests of data subjects have been effectively taken into account and considered; or, on this side of the Atlantic, that equivalent risks have been addressed or mitigated. So, it is a critical part of GDPR compliance. And I think it will be a critical part of any global company's strategic data privacy program going forward. It's absolutely pivotal. It doesn't get the attention of the other concepts, but in some respects it's more important.
Dave Cohen (IAPP)
[1:00:11] Terrific, Marc. I’m staying with you for a moment here; there's a related follow-on question for you. Why do you think the ICO is so skeptical about legitimate interest for subsequent processing, then? And after you answer, Marc, I'd be curious to hear Chris’s and Gary's answers as well.
Marc M. Groman (Groman Consulting Group LLC)
[1:00:29] Well, I think that, unfortunately, there's a lot of skepticism directed from regulators across Europe toward the industry in general, and sometimes the larger tech companies that are here in the US - a general skepticism about a willingness to really invest the resources to conduct the balancing, to demonstrate accountability, to show you've done the risk assessment. And that skepticism, I think, pervades other issues and the relationship going forward. It is, unfortunately - not in all cases, but in many cases - justified, based on at least their perception of how compliance with the GDPR has gone to date. But on the other hand, we've seen companies and trade associations reacting to guidance. Hopefully, that's a positive step. And what will be critical is the ability to have a really thoughtful and nuanced discussion, so that we can approach the balancing and the analysis thoughtfully, based on the facts of how technology and data are being used, and not reach a conclusion ahead of the analysis. I don't want there to already be a conclusion that: “Oh, you probably don't have it.” That would not be helpful for regulators, for consumers around the world, or for the business.
Dave Cohen (IAPP)
[1:01:51] Terrific. Thanks, Marc. Chris?
Christopher Docksey (EDPS)
[1:01:55] Well, Dave, if you look at what the ICO itself has said, I think it is worried. Even if you could argue in favor of relying on legitimate interest, what it has found and put in this report is that a lot of controllers don't know what legitimate interest means and what it requires. They think it's a soft option compared to consent - I’m only telling you what they think, and to think of legitimate interest as an easy option is really unbelievable. And what the ICO said is that controllers are simply not carrying out the legitimate interest balancing test that Marc and I just discussed, or implementing the technical controls or the safeguards that you need. So, the ICO has been looking at the industry, and I think it has found it wanting.
Dave Cohen (IAPP)
[1:03:07] It makes good sense and we’ll see how this plays out over the coming months for sure. Gary, I know you have some comments here as well. Yes?
Gary LaFever (Anonos)
[1:03:15] Yes. What I find interesting is that a lot of data controllers believe the legitimate interest test is an outcome-based test, meaning they feel they have a legitimate interest in using the outcome of the processing, when actually legitimate interest is a process-based test. There are three parts to it. First, do you actually have a legitimate purpose? There, they may satisfy the first of the three. The second is the necessity test: do you have to get this particular data from this source to achieve your legitimate purpose? And the third, the actual balancing of interests test, is a process-based test where you have to show you have in fact done the analysis and have tools in place that enforce the policies and procedures that come out of your analysis, so as to mitigate the risks to the data subject. If you can't pass that process test, you don't get the benefit of the outcome. So, I think there is, for whatever reason, a feeling that if you have a legitimate interest in the result of the processing, you should be able to rely on legitimate interest - and everything that Marc and Chris just said is true: that's a naive assumption and mindset when it comes to legitimate interest. It can be done, but not with that mindset. And so, I do believe - and that's my hope for the industry, not just for AdTech and direct marketing but for many innovative uses of data - that if the controls are put in place to enforce the appropriate policies and procedures and the concepts of accountability and transparency, processing can still occur for the benefit of all society, and that includes AdTech, direct marketing, and many other applications.
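The process-based nature of the test can be summed up as a simple checklist. This sketch is illustrative only and is not a legal determination: each flag stands for documented evidence of one of the three parts Gary describes, and a missing step defeats the assessment regardless of how valuable the outcome would be.

```python
def legitimate_interest_assessment(purpose_documented,
                                   data_necessary_for_purpose,
                                   balancing_done,
                                   safeguards_in_place):
    """Process-based, not outcome-based: every step must be evidenced.
    Returns a failure reason, or None if the assessment can be relied on."""
    if not purpose_documented:
        return "purpose test failed: no documented legitimate purpose"
    if not data_necessary_for_purpose:
        return "necessity test failed: less data would achieve the purpose"
    if not (balancing_done and safeguards_in_place):
        return "balancing test failed: analysis or enforcing controls missing"
    return None

# A strong business case cannot rescue a missing balancing analysis.
assert legitimate_interest_assessment(True, True, False, True) is not None
assert legitimate_interest_assessment(True, True, True, True) is None
```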
Dave Cohen (IAPP)
[1:05:02] Terrific. Thanks very much, Gary. We're running out of time here, so we have one last question, and I'm going to direct it toward Chris. It’s a question on the current environment that also looks to the horizon and what may be in store down the road for companies as they seek to comply in this new environment. And here's the question, Chris: why has there been such a massive increase in decisions by the Court of Justice? What do you think is going on here?
Christopher Docksey (EDPS)
[1:05:30] Well, the first thing to note, Dave, is that the Court is essentially reactive. It doesn't have its own agenda; it decides the cases that have been referred to it by the national courts. And if you look at those accountability cases that I discussed, they were cases brought at the national level either by data protection authorities or by consumer protection organizations, and of course there are other cases brought by NGOs. So, the simple answer first is that a lot more people are going to court at the national level, and the national courts are asking the Court of Justice to rule on what European Union law means. This is having a self-fulfilling effect: the more decisions there are, the more interest there is among national lawyers and the courts in using the Court of Justice to get clarification on the law. So, I think these processes will go on, and I really think we're going to see more and more decisions coming out of the Court of Justice as a result.
Dave Cohen (IAPP)
[1:06:47] Wonderful. Thanks very much, Chris. And with that - it's been a wonderful and rich discussion here today, but unfortunately we have run out of time. So, thanks very much, Marc, Chris, and Gary for sharing your time and expertise on the program today. We very much appreciate all of you making time for us. Thank you all very much.
Gary LaFever (Anonos)
[1:07:06] Our pleasure. Thank you, Dave.
Dave Cohen (IAPP)
[1:07:08] And a big hearty thank you to Anonos for sponsoring this program. We here at the IAPP certainly appreciate the support and the privacy education for our membership. We couldn't do it without you. So, we wanted to give a big shout out to you for underwriting this program. It's very much appreciated. And if you have a moment and you haven't dropped off the line yet and have the ability to click on this live link and give us some feedback as to how you enjoyed this program recording, we'd really appreciate that. It takes you to a very short survey. It takes literally 2 minutes to fill out. Importantly, there's a field in there where we can hear about issues and topics you'd like to hear about on future programs.

[1:07:48] If you are an IAPP Certified Privacy Professional and you're wondering about CPEs or Continuing Privacy Education credits for this program, and you registered through the IAPP website, you're going to automatically be granted one (1) CPE credit. You don't have to do anything. It will just be credited to your account. If you're listening to this recording and you'd like to receive that CPE and you didn't actually register through the website, you can still receive that credit. There's an easy-to-fill-out form under the certification tab on the IAPP website and go there, fill that out, and we'll get that credit for you. If you're an attorney, and you're wondering about Continuing Legal Education credits, we don't actually provide those pre-certified but this program will be eligible in some jurisdictions in the United States. So, you'll need to apply within your particular jurisdiction for that. And if you need supporting materials for that, you can feel free to reach out to me with that or any other feedback. My email address and phone number are here on the slide in front of you. So, please feel free to contact us. We'd love to hear from you and that would be terrific. So, with that, thanks again one last time everybody for joining us today. We really appreciate it and we hope to see you at another privacy education web conference in the near future. With that, I will take us through a program close.

Are you facing any of these 4 problems with data?

You need a solution that removes the impediments to achieving speed to insight, lawfully & ethically

Roadblocks to Insight
Are you unable to get desired business outcomes from your data within critical time frames? 53% of CDOs cannot achieve their desired uses of data. Are you one of them?
Lack of Access
Do you have trouble getting access to the third-party data that you need to maximise the value of your data assets? Are third-parties and partners you work with worried about liability, or disruption of their operations?
Inability to Process
Are you unable to process data due to limitations imposed by internal or external parties? Do they have concerns about your ability to control data use, sharing or combining?
Unlawful Activity
Are you unable to defend the lawfulness of your current data processing activities, or data processing you have done in the past?
THE PROBLEM
Traditional privacy technologies focus on protecting data by putting it in “cages” or “containers,” or by limiting use to centralised processing only. This is done without considering the context of the desired data use, including decentralised data sharing and combining. These approaches are based on decades-old, limited-use perspectives on data protection that severely restrict the kinds of data uses remaining available after controls have been applied. On the other hand, many new data-use technologies focus on delivering desired business outcomes without considering that roadblocks may exist, such as the four problems noted above.
THE SOLUTION
Anonos technology allows data to be accessed and processed in line with desired business outcomes (including sharing and combining data) with full awareness of, and the ability to remove, potential roadblocks.