Legitimate Interest Microsegmentation-Based Direct Marketing

Presentation Transcript
Christopher Docksey
Honorary Director General
EDPS
Martin Abrams
Chief Strategist
Information Accountability Foundation (IAF)
Gary LaFever
CEO & General Counsel
Anonos
Dr. Sachiko Scheuing
European Privacy Officer
Acxiom
Summary Slide from Webinar
Gary LaFever (Anonos)
[0:00] Welcome everyone! We appreciate you taking the time, particularly in these uncertain times, to join us today for a very insightful and timely message - Legitimate Interest Microsegmentation-Based Direct Marketing. We have a fantastic panel and I certainly can't do a better job of introducing them than they can themselves, but I will start first. My name is Gary LaFever. I am the CEO and General Counsel of Anonos, which is a Pseudonymisation and functional separation company, and it's important actually that I have those dual roles, because I believe you will find in this presentation that it's the bridge between achieving business goals and complying with legal requirements that is both the challenge and, more importantly and excitingly, the opportunity that we have going forward to achieve the dual goals of data utility and privacy. And with that, I'd like to ask Christopher Docksey. Chris, if you could introduce yourself, please.
Christopher Docksey (EDPS)
[01:04] Thanks, Gary. I am Chris Docksey. I used to be the Director at the EDPS, and I now hold the title of Honorary Director General. I’m honored to be a member of the Guernsey Data Protection Authority. I’m on the Board of the European Centre on Privacy and Cybersecurity (ECPC) at Maastricht University, and I’m one of the authors and editors of the Oxford University Press commentary on the GDPR.
Gary LaFever (Anonos)
[01:29] Thank you, Chris. We also have Martin Abrams, Chief Strategist of the Information Accountability Foundation, generally known as the IAF. Marty, if you could please introduce yourself.
Martin Abrams (IAF)
[01:42] Sure. I'm Marty Abrams, and I'm a privacy purist and strategist. The IAF is a nonprofit research and educational organization. Mostly, I’m known as the person who led the global accountability dialogue that created the structure for modern accountability as adopted in the European Data Protection Regulation and in other jurisdictions as well.
Gary LaFever (Anonos)
[02:10] Thank you, Marty. We also have Dr. Sachiko Scheuing. Sachiko, if you could please introduce yourself.
Dr Sachiko Scheuing (Acxiom)
[02:18] Sure. Thank you, Gary. I am Sachiko Scheuing. I am the European Privacy Officer of Acxiom, a global marketing services company. I'm also the co-chairwoman of FEDMA, a Brussels-based association, the Federation of European Direct and Interactive Marketing.
Gary LaFever (Anonos)
[02:41] Fantastic. So, we have a great panel. We have an hour. We have a lot to cover. So, let's get to it.
[02:47] Slide two has a repeating theme that you'll see throughout the webinar: these two questions. "Do I really need to do things differently?" And you will see by the end of this webinar, if not early on, that the answer to that question is YES. The current status quo for direct and personalized marketing and AdTech is simply not lawful. But the good news is the second question: "Can I still achieve my business objectives?" And there, the answer is also YES! You just have to approach things differently in high-risk environments.

[03:26] And a particular focus of this webinar is the concept of functional separation. It is a technique that was recommended back in a prescient 2015 report by the EDPS. You will also see this game board to the right used throughout this webinar, sometimes with different wording on its face. But it's all about the need to balance interests and objectives.
[03:54] Slide three highlights the EDPS report that I focused on, and I just want to quote from it. Again, this is back in 2015, a very prescient report. The EDPS noted a specific area where “Innovative engineering solutions should be encouraged related to the concept of functional separation.” The reason they said that was because functional separation has been identified for years as an aspirational way to balance the interests of data utility (usefulness, value, and innovation) while protecting the fundamental rights of data subjects. But as noted on this slide, only a few organizations had pursued it in the past, and the EDPS was hoping that companies would pick this up. Unbeknownst to the EDPS at the time, Anonos, my company, had actually been working on this concept for 3 years. And now, 8 years later, we will spend a small portion of the webinar talking about how Anonos does functional separation. But more importantly, let's turn it over to Chris now on slide four as he starts his part of the panel. Chris, if you would, please.
Christopher Docksey (EDPS)
[05:14] Thanks, Gary. I'd like to focus on the need to respect and to balance the fundamental rights enshrined in the GDPR and in the overarching EU Charter of Fundamental Rights, especially, of course, the rights of privacy and data protection. These rights are being vigorously interpreted and applied in the rapidly expanding case law of the Court of Justice of the European Union and in the national courts of the EU Member States. And the result is one of the central themes of this webinar: things have changed, and it is not possible to carry on as before.
[05:54] But you’ve heard a lot about the GDPR and the reform of EU Data Protection Law. And if we go to the next slide, we'll see that EU Data Protection Law has other components that are not limited to the GDPR. I’ll stress two other elements in particular. The first element is the massive influence of the EU Charter of Fundamental Rights, which is our European Bill of Rights. The starting point of the Charter is that we are dealing, for the most part, with fundamental rights of human beings, not simply rights for consumers, as Article 1 of the Charter opens with the ringing declaration that human dignity is inviolable. Part of that human dignity is the right to privacy in Article 7 of the Charter and the right to protection of personal data in Article 8. The Charter also protects other fundamental rights, such as freedom of expression and freedom of information under Article 11, the freedom to conduct a business under Article 16, and the right to property under Article 17. The second crucial element is the role of the Court of Justice, and we'll see that in a moment in its recent case law.

[07:08] These two elements have had a powerful influence on the regulators who enforce the law and on NGOs (nongovernmental organizations) that now have the right under Article 80 of the GDPR to bring representative complaints. So, I think that what the GDPR has actually brought is not so much a change in the law as a step change in its application and its enforcement. The law may look complex, but much of it had been in place since the previous 1995 Data Protection Directive came into force in 1998. Consent, legitimate interest, necessity and proportionality were all there, and even where not stated explicitly, the Court of Justice has made it plain that Pseudonymisation and profiling were also there. So, the level of complexity imposed by European law is not new. People should have been working hard to respect it long before now. But now that the GDPR has been enacted, the Court of Justice is intervening forcefully to support it.
[08:11] As we go to the next slide here, we'll see that privacy and data protection are in the process of fundamental change, not only because of the reform of the legal framework but also because of the development of the case law of the Court of Justice and a number of landmark rulings. I would single out Google Spain in 2014 on the right to be forgotten; the rulings in 2018 and 2019 on social plugins and the need to provide information in the Wirtschaftsakademie and Fashion ID cases; and, on profiling and implied unbundled consent, Planet 49.
[09:01] And indeed, the next slide shows us that the number of cases is increasing almost exponentially. If we look at the data protection cases brought before the court over the period from 1998 to the present, we can see a slow start up to 2010. But then the court found its feet, and in the decade from 2011 to now there have been 53 rulings. And at this moment in April 2020, there are 13 more cases pending, many of which are likely to be decided this year. These cases, by the way, are almost all national cases, which were referred to the Court of Justice on points of law, and they show that the pace of enforcement within the EU Member States is equally dramatic. And if we look then at the legislation and at the case law, what is really crucial is an emphasis on two things: accountability and enforcement.
[10:05] Accountability, on the next slide, is a major element of the legislative reform. And it was anticipated and supported by the case law of the Court of Justice. In a speech in 2018, the president of the court, Mr. Lenaerts, described accountability as the central theme of the GDPR. He stressed how accountability is underpinned by the court's case law. He also said that the case law of the court on profiling was very significant. This case law supports the GDPR’s new system of respect for data protection rules and the need to move away from what we call the private surveillance model. The problem is not profiling in itself. There are ways of carrying out profiling lawfully if the essential GDPR requirements are met, in particular transparency and the need for fair and lawful processing. The problem is the hidden surveillance which has developed as the business model of the Internet.

[11:13] The GDPR and the case law of the court say this has to stop, and that takes us to enforcement. The GDPR gives new powers to regulators, including the power to suspend or block unlawful processing and the power to impose antitrust-level fines on companies that carry on as before. And the case law of the court has reinforced the GDPR by laying down a series of uncompromising requirements on regulators to hear complaints and to enforce.
[11:49] As well as this, the GDPR has opened up a new avenue of enforcement by empowering NGOs to bring collective complaints on behalf of individuals. This has triggered an explosion in complaints to regulators. On the day the GDPR entered into force in May 2018, the NGO None of Your Business lodged complaints against Google, Instagram, WhatsApp and Facebook with the regulators in Austria, Belgium, France, and Germany (in Hamburg). As a result, in January 2019 the first fine came out: 50 million Euros imposed on Google by the French regulator, the CNIL.

[12:31] In November 2018, the NGO Privacy International lodged a series of complaints with the regulators in France, Ireland, and the UK against seven data brokers, credit reference agencies, and AdTech companies. These complaints alleged the typical elements of private surveillance: lack of transparency and consent, lack of a legal basis for the processing, and unlawful profiling of individuals. If you want to understand what EU law requires and why the ICO has made its Draft Code so tough, then read these complaints and read the ICO update report of June 2019. They show why the ICO, and maybe all the regulators and the EDPB, are so concerned about the massive lack of compliance involved in AdTech and real-time bidding. So, this is the context in which we have to look at the ICO Draft Code. It is not just guidance. It is a step in the enforcement process.
[13:38] And one last point, if we look at the next slide on proportionality and necessity. When you're considering how to carry out profiling lawfully, it's crucial to understand what the Court of Justice means by the concept of proportionality. I mentioned right at the start that the EU Charter protects the fundamental rights of human beings. It's not simply a charter for consumers. It is vital for the GDPR to note that these rights have to be balanced. Protection of personal data is not an absolute right. It must be balanced against other fundamental rights, such as freedom of expression and the freedom to conduct a business, in accordance with the principle of proportionality.

[14:25] And it's a common mistake to assume that the principle of proportionality in EU law is something like the concept of reasonableness under common law. It is, however, much more demanding when combined with the requirement of necessity. Article 6, Paragraph 1(f) of the GDPR, the so-called “balancing clause,” provides that processing shall be lawful only if, and to the extent that, it is necessary for the purposes of the legitimate interests pursued by the controller. The requirements of proportionality are strengthened by the requirements of necessity. Not only must the processing be proportionate to the objective, it must be the least intrusive way of achieving that objective. If there is another, less intrusive means of achieving the objective, that is the means that should be used. It is the only legally viable option.
[15:21] So, to conclude here, this is how I come to the idea that we are approaching a tipping point, because of these significant developments in the legislation and the case law, and this tipping point is the context in which we have to assess the ICO Draft Code. Regulators and the courts are losing patience with the systemic lack of compliance, and it seems to me they are proceeding to enforcement. Consent is restrictive, but it is legally the safest option in the face of such enforcement. The Court of Justice will not criticize a regulator for being strict. But consent is not the only lawful basis for processing, and regulators do not completely rule out a more flexible approach as long as controllers ensure compliance.

[16:16] You can see it in the Draft Code where the ICO says that: “Generally speaking, the two lawful bases that are most likely to be applicable to your direct marketing purposes are consent and legitimate interests.” And the gateway to a more flexible solution is accountability. Now, Marty is going to talk about accountability, but just let me say here that it is not a form of self-regulation. In the EU, accountability is the core of the regulatory framework. It's at the heart of it. It means proactive and demonstrated compliance with the rules, and compliance has to be effective. It has to be hardwired into the technology using the GDPR tools such as privacy by design, encryption, and Pseudonymisation.

[17:05] Accountability is actually the basis of the legitimate interests approach. The Article 29 Working Party said in 2014 that the proper balancing of interests is an accountable process and requires an analysis of what processing is necessary, how it impacts the privacy of individuals, and how you mitigate that impact to get to the right side of the balance. So, accountability sets a very high standard for legitimate interests. The ICO already warned in June last year that the scenarios where legitimate interests could apply are limited. It is not an easy option. It requires work. Things have changed and it is not possible to carry on as before. Maybe one avenue is an accountability-based ecosystem that can build trust with the regulators. And on that point, it sounds like a good time for you to take over, Sachiko.
Dr Sachiko Scheuing (Acxiom)
[18:05] Thank you, Chris. So, this part of the webinar focuses on the evolution of processing activities, resulting in privacy practices which may have worked in the past, as Chris said, but no longer protect personal data in today's complex environment for digital personalized marketing and advertisement. And this is why the answer to the first question on the screen, “Do I really need to do things differently?”, is YES. My talk creates a bridge to Gary’s section later on, which covers how functional separation can help enable lawful, personalized marketing and advertisement, which is why the answer to the second question, “Can I still achieve my business objective?”, is also YES.
[18:57] Several decades ago, marketers and advertisers used broad-based segmentation to reach desired audiences. By advertising in publications and on television shows, they believed they would attract the type of customers interested in the advertised goods and services. This broad, segment-based approach was very privacy respectful because individual consumers were not targeted, but rather large segments of the population. However, the approach was not efficient in reaching the best qualified prospects. With the introduction of broad-based internet-enabled marketing, advertisers like Pampers, Ford, or the Financial Times could purchase ad space directly from publishers of websites based on the customer types likely to visit the website. This approach was more efficient than broad-based segmentation and was also privacy respectful, since no data was accessed or observed about the visitors.
[20:08] Very quickly, however, access to cookies made it possible to, for instance, find out which websites were visited by a user. The easy attribution of data to individuals was very efficient, but it was not privacy respecting. This is because risk-based measures that protect privacy when personal data is used in controlled environments, like first-party cookies that improve visitors’ experience on websites, are ineffective at protecting privacy when used on a widespread basis. For example, third-party cookies used in today's distributed AdTech ecosystem can be accessed by hundreds if not thousands of parties in the ecosystem. So, rather than protecting privacy, the open nature of the digital advertisement environment inadvertently enabled access to people's browsing behavior, causing concern among lawmakers, regulators, and the courts.
[21:18] The problem is that even though you don't have directly identifying data like names and email addresses, accumulating attribute data elements (like age group, interest in saving, or “has a cat”) associated with a static ID can be used to single out the person behind the ID. This process is known as the Mosaic Effect. For these reasons, organizations that desire to process data at the scale and speed required for effective personalized marketing and advertisement must adopt advanced risk-based measures that mitigate risk in high-risk processing environments.
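The singling-out described above can be shown in a few lines. This is an illustrative sketch with a hypothetical dataset: no attribute is directly identifying on its own, but their combination leaves exactly one candidate record.

```python
# Illustrative sketch of the Mosaic Effect: none of these attributes is
# directly identifying, but their combination can single out one person.
# The dataset, IDs, and attribute names are hypothetical.

records = [
    {"id": "u1", "age_group": "30-39", "interest": "saving", "has_cat": True},
    {"id": "u2", "age_group": "30-39", "interest": "saving", "has_cat": False},
    {"id": "u3", "age_group": "40-49", "interest": "saving", "has_cat": True},
    {"id": "u4", "age_group": "40-49", "interest": "travel", "has_cat": True},
]

def matches(record, **attrs):
    """True if the record carries every given attribute value."""
    return all(record[k] == v for k, v in attrs.items())

# An observer who knows only three "harmless" attributes about a target...
candidates = [r for r in records
              if matches(r, age_group="40-49", interest="saving", has_cat=True)]

# ...is left with exactly one candidate: the person behind "u3" is singled out.
print(len(candidates), candidates[0]["id"])  # 1 u3
```

The point is that the static ID never needs to be "cracked": the attribute mosaic around it does the identifying.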

[22:07] If data is anonymous, the GDPR does not apply. And if it isn't, the GDPR applies. So, under the GDPR, every processing of personal data needs to be based on one of the six legal bases listed in Article 6(1). Consent, necessity for a contract, and legitimate interests are usually the three legal grounds used for marketing. Now, I just spoke about how rudimentary anonymisation techniques do not always work in large distributed environments. You may wonder whether organizations can rely on necessity for a contract or consent for lawful personalized marketing and advertising.

[22:59] Let's first look at contract. The European Data Protection Board (EDPB) said the term “necessary for contract” should be strictly interpreted. For example, a contract is not considered an appropriate lawful basis for improving an offering. While the right to receive improvements may be included in a contract, these improvements are not necessary for the performance of the contract. Contract is also not considered a suitable legal ground for maintaining a customer intelligence database containing users' tastes and purchasing behavior, because the company is only contracted to deliver a specific good or service.

[23:47] Next, let's look at consent. The requirement that consent requests must be clear and easy to understand creates difficulties when explaining complex processing to data subjects, such as AI or machine learning-based blackbox tools. While the importance of consent under the GDPR and laws like the ePrivacy Directive or its successor cannot be overstated, the clear standards for securing GDPR-compliant consent can be difficult to achieve. This produces two unacceptable risks. First, the risk of nullifying protection for data subjects by watering down the requirement: a data subject must be sufficiently informed and aware of what they are being asked to consent to. Second, the risk of removing from the data ecosystem all societal and economic benefits of processing that is simply too difficult to explain, as may be the case for personalized marketing and advertising.
[25:03] The shortcomings of anonymisation, contract, and consent highlight the importance of legitimate interests as an available legal basis for marketing and advertisement. While you are not allowed to switch between legal bases - and allow me to emphasize that there is no repurposing possibility for consent-collected data under the GDPR - you are allowed to specify more than one legal basis when data is collected. So, having legitimate interests as an additional legal basis helps to overcome a number of limitations that apply when consent is used by itself. For example, if you rely exclusively on consent, you must be prepared to support withdrawal of consent as well as requests for data erasure or data portability to your competitor.

[25:58] In addition, reliance on consent by itself requires the disclosure of all of the data recipients at the time of data collection, in addition to securing consent for each separate process, which may be difficult to provide at the time of data collection. These requirements do not apply when legitimate interests are used to support the desired processing. But here’s where it’s really important: merely claiming to have a legitimate interest in the results of processing is not enough. Compliant legitimate interest processing requires an organization to successfully pass a three-part test. First, the Purpose Test: “Are you pursuing a legitimate purpose?” Second, the Necessity Test: “Is the desired data necessary for that purpose?” And third, the Balancing of Interests Test: “Are technical and organizational safeguards in place, for instance, so that the data subject’s interests do not override the legitimate interest of the data controller or third party in the results of the desired processing?”
[27:18] So, this brings us to the benefits of GDPR Pseudonymisation. First, we must stress that Pseudonymisation is newly defined under the GDPR and that not all tokenization is equal to GDPR-compliant Pseudonymisation. Before the GDPR, static tokenization, like simply substituting clear-text identifiers with an ID, was generally considered Pseudonymisation. Under the prior German Federal Data Protection Act, Pseudonymisation was defined as replacing the data subject’s name and other identifying features with another identifier in order to make it impossible or extremely difficult to identify the data subject. Today, however, Pseudonymisation under the GDPR requires, first, protection of both direct and indirect identifiers; second, keeping the additional information needed for re-identification separately; and third, implementation of safeguards to ensure data subjects cannot be singled out, even by combining attribute data elements.
[28:45] GDPR-compliant Pseudonymisation requires the separation of the information value of personal data from the means of associating the data back to the identity of the data subject. The two must not be re-linkable without access to additional information kept separately by the data controller. So, in the example on this slide, the information value is "male and middle-aged," and the fictitious data subject is John J. Jeffries, who is 47 years old. Merely assigning a token to replace the name John J. Jeffries and the age 47 does not make the data pseudonymous under the GDPR if other data is available that can re-identify John J. Jeffries. For example, suppose you knew that only one record in the dataset is for a male who is 47 years old, you knew that John J. Jeffries is in the dataset and is 47 years old, and you had access to the information indicating which token is used to indicate 47 years old. Then you would know which record relates to John J. Jeffries.
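The separation just described can be sketched in code. This is a minimal illustration (not any vendor's actual implementation): direct identifiers are replaced by random tokens, the re-linking table is kept separately by the controller, and the indirect identifier (the exact age 47) is generalised so the token assignment alone cannot give it away.

```python
import secrets

# Minimal sketch of the separation GDPR Pseudonymisation requires:
# information value travels without identity, and the re-linking table
# is held separately by the controller. Names and values are hypothetical.

source = [{"name": "John J. Jeffries", "sex": "male", "age": 47}]

relink_table = {}  # the "additional information kept separately" by the controller

def pseudonymise(record):
    token = secrets.token_hex(8)          # random, not derived from the data itself
    relink_table[token] = record["name"]  # identity stays behind the "brick wall"
    return {
        "token": token,
        "sex": record["sex"],
        "age_band": "40-49",              # generalise the indirect identifier too
    }

shared = [pseudonymise(r) for r in source]

# Recipients of `shared` see only the information value...
assert "name" not in shared[0] and "age" not in shared[0]
# ...and only the controller, via the separately kept table, can re-link.
assert relink_table[shared[0]["token"]] == "John J. Jeffries"
```

Generalising 47 to a band addresses the slide's point: if the exact age remained unique in the dataset, a token would be no protection at all.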

[30:08] To be pseudonymous under the GDPR, it must not be possible to go back and forth over the brick wall in the graphic, indicating the separation of information value from identity, without access to additional information that is kept separately by the data controller - protected, in this graphic, by what you see there as the enclosed brick courtyard. The EU Cybersecurity Agency, ENISA, has published two reports, in November 2018 and again in November 2019, on recommendations and requirements for GDPR-compliant Pseudonymisation.
[30:47] In addition, the German Ministry of Interior has published a draft for a code of conduct on the use of GDPR-compliant Pseudonymisation.
[30:58] So, why should you care about GDPR-compliant Pseudonymisation? First, Pseudonymisation is specifically recognized by the Article 29 Working Party and the current EDPB as a technical and organizational safeguard that can help to tip the balance in favor of a data controller in the balancing of interests test for legitimate interest processing. In addition, Pseudonymisation is explicitly recognized under the GDPR as helping to enable more flexible change of purpose, more expansive data minimization, more flexible storage limitation, enhanced security, more expansive further processing, more flexible profiling, and the ability to lawfully share and combine data. Now, over to you, Gary.
Gary LaFever (Anonos)
[31:54] Thank you, Sachiko. So, on slide 21, we're looking at the same two questions again. “Do I really need to do things differently?” And I think by now you've realized the answer is YES. But the exciting part here is: “Can I still achieve my business objectives?” And there, the answer is YES as well.
[32:15] So, on slide 22, I'd like to highlight the benefits of functional separation in high-risk processing environments by focusing on just one of the items noted by Sachiko, the lawful sharing and combining of data. When engaging in high-risk processing like data sharing and combining, you simply cannot turn anonymity, contract, or consent into something they were never intended to be.
[32:46] Slide 23 shows that capabilities which work in a low-risk controlled environment, depicted by the small boat in a bathtub, simply do not support high-risk use out in the open ocean, as it were, of decentralized processing. Techniques that protect data in a low-risk environment do not scale and in fact become ineffective in high-risk environments like data sharing and combining. As one example, data that is anonymous in a low-risk, self-contained environment like a bathtub quickly becomes non-anonymous, and therefore personal data subject to the GDPR, when combined with additional data available in the open ocean of decentralized processing. Several of the cases cited by Chris make it clear that an entity does not itself have to process personal data to be considered a joint controller. In fact, if a party helps to determine the purposes and means of processing and causes the processing of data to start, the fact that another party causes the data to lose its anonymity at a later stage of processing does not necessarily relieve the first party from potential liability as a joint controller of personal data. So, as you can see, when you move from the little boat in the bathtub to the open ocean, the risk increases measurably. And the same problem exists for consent, as noted by Sachiko. Consent that supports lawful use in the bathtub of low-risk processing does not generally scale to support high-risk processing like the sharing and combining of data.
[34:36] So, on slide 24, we start to get to the good news. In contrast to the limitations of anonymisation, contract, and consent, functional separation can help enable legitimate interest processing and actually supports increased data protection, because it mitigates the risks in high-risk processing environments as a complement to consent, with the balancing of interests requirements enforced technologically at all times. So, in the context of direct marketing and AdTech, this can be done by targeting data subjects as members of privacy respectful lookalike audiences that we will call dynamic micro-segments, or mSegs.
[35:29] Slide 25 highlights the three steps that Anonos uses for functional separation. I encourage everyone to check out the three websites that are noted on this slide. Due to the limited time of this webinar and the desire to cover the different perspectives, I'm only going to be able to go into a certain level of detail. But in summary, Anonos technology enables functional separation via the three steps noted on slide 25. First, the information value of personal data is separated from identity, with re-linking only possible via separately held additional information. This is the definition of GDPR Pseudonymisation. And more information on this is available at www.Pseudonymisation.com.

[36:20] In the second step, controls are embedded in the data at the data element level so that data use policies flow with the data and remain enforced even when processing is decentralised beyond your control. This is done using dynamically changing identifiers that insert maximum entropy into the data to defeat unauthorized re-identification via the Mosaic Effect, which Sachiko noted earlier. Again, more information on this step is available at www.MosaicEffect.com.

[36:55] And lastly, control over the re-linkability of the data is put in the hands of a data steward who selectively enables the re-identification of individuals, but only for authorized processing. For more information on this step, visit www.ENISAguidelines.com where we show how Anonos technology complies with the guidelines established by the EU Cybersecurity Agency, ENISA, which Sachiko referred to previously.
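The second and third steps described above can be sketched together. This is a hedged illustration under stated assumptions, not Anonos's actual API: the same person receives a different pseudonym in each processing context, so identifiers cannot be joined across contexts, and only a data steward holding the lookup tables can selectively re-link for an authorized purpose. Class and method names are invented for the example.

```python
import secrets

# Illustrative sketch: dynamically changing, per-context identifiers plus
# steward-controlled re-linking. Names are hypothetical, not a vendor API.

class DataSteward:
    def __init__(self):
        self._lookup = {}  # (context, dynamic_id) -> real identity, held by steward only

    def issue_id(self, context, identity):
        """Mint a fresh, random identifier for one processing context."""
        dyn_id = secrets.token_hex(8)  # fresh entropy per context
        self._lookup[(context, dyn_id)] = identity
        return dyn_id

    def relink(self, context, dyn_id, purpose_authorised):
        """Re-identify only when the purpose has been authorised."""
        if not purpose_authorised:  # the policy decision stays with the steward
            raise PermissionError("re-identification not authorised")
        return self._lookup[(context, dyn_id)]

steward = DataSteward()
id_campaign_a = steward.issue_id("campaign_a", "subject-123")
id_campaign_b = steward.issue_id("campaign_b", "subject-123")

# The two contexts cannot be joined on the identifier itself...
assert id_campaign_a != id_campaign_b
# ...but the steward can re-link for an authorised purpose only.
assert steward.relink("campaign_a", id_campaign_a,
                      purpose_authorised=True) == "subject-123"
```

Because each context sees an unrelated random token, an observer accumulating attributes across contexts cannot use the identifier as the glue for the Mosaic Effect.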
[37:27] So, let's move on to slide 26. The best way to think of mSegs is as lookalike audiences that are small enough to represent the distinct behavior, attributes, and characteristics necessary to achieve the business objectives of direct marketing and AdTech campaigns, but large enough to prevent the inference, singling out, or linking of individual data subject identities.
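The "large enough" constraint on mSegs can be sketched as a minimum-audience-size check, in the spirit of a k-anonymity threshold. The grouping keys, threshold, and data below are hypothetical; a real deployment would set the threshold from its own risk analysis.

```python
from collections import defaultdict

# Illustrative sketch: form candidate segments by shared attributes and
# suppress any segment with fewer than k members, so no audience is small
# enough to single out an individual. Threshold and keys are hypothetical.

K_MIN = 3  # minimum audience size for this example

def build_msegs(records, keys, k_min=K_MIN):
    segments = defaultdict(list)
    for r in records:
        segments[tuple(r[k] for k in keys)].append(r)
    # Keep only segments large enough to be privacy respectful
    return {seg: rows for seg, rows in segments.items() if len(rows) >= k_min}

records = (
    [{"age_band": "30-39", "interest": "travel"} for _ in range(5)]
    + [{"age_band": "40-49", "interest": "saving"} for _ in range(2)]  # too small
)

msegs = build_msegs(records, keys=("age_band", "interest"))
print(sorted(msegs))  # only the 5-member segment survives: [('30-39', 'travel')]
```

Advertisers would then target the surviving segments as audiences, never the suppressed ones, so campaign reach comes from segment membership rather than from any individual profile.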
[37:58] Now, let's go to slide 27. All of the parties who registered for this webinar will receive copies of these slides. So, rather than reading all the legal benefits of functional separation, I want to highlight that the accountability and transparency enabled by functional separation improves the integrity of the entire data supply chain, enabling compliant direct marketing campaigns to scale at a global level.
[38:28] This leads us to slide 28 where even more exciting than the legal benefits are the business benefits that functional separation brings to the table. And I won't read the slide to you but the reality is this is much more than just compliance. You actually get access to the data that you want faster, access to greater opportunities to use the data, and all in a way that allows you to operate in a global manner.
[38:55] So now, let's turn to slide 29 and look at a couple of examples of functional separation. The first one on slide 29 is an example that's derived from the Dutch DPA Tennis Association case where the AP imposed a fine of 525,000 Euros on KNLTB for selling personal data. I want to note that we believe the AP ruling has been largely misunderstood by people who've only had access to summaries and not the full analysis, and this is why we at Anonos have made an unofficial full English translation of the AP case available at the URL indicated at the bottom of this slide. This is critical because a full reading of the AP case shows that KNLTB was not penalized because it used personal data to achieve commercial purposes. Rather, it was penalized because all it had was its claim of commercial interest. The AP analysis highlights that KNLTB failed to provide adequate technical and organizational safeguards to ensure demonstrable accountability.
[40:09] Slide 30 shows that what the AP found most troubling was inadequate consent and an unbridled, broad grant of access to the entirety of the KNLTB database, as evidenced by the two red X's on the slide.
[40:24] In contrast, slide 31 highlights why we believe the result of the AP analysis would have been different if KNLTB had secured compliant consent to send personalized ads and provided proper notice of legitimate interest processing, using functional separation to deliver privacy-respectful advertising. In this way, data minimization and purpose limitation would have been enforced by offering advertisers access to association members only in the context of mSeg lookalike audiences, with the KNLTB serving as a data steward to deliver ads to its members.
[41:06] So, now let's move to slide 32 and look at a simplified use case for AdTech, abstracted from the general deficiencies noted by the ICO in its analysis of the AdTech industry, as well as in the ICO Draft Code for direct marketing.
[41:24] On slide 33, we see that some of the ICO’s biggest concerns center around: first, noncompliant bundled consent; second, the surveillance of data subjects using third-party cookies; and third, the wanton collection and sharing of personal data among hundreds if not thousands of market participants, as evidenced by the three red X's on this slide.
[41:50] So, now let's go to slide 34. The ICO and other regulators would be more supportive of AdTech and real-time bidding that involves, first, proper consent and transparency for data collection, storage, and receipt of ads and, second, legitimate interest processing of that data. The data can be analyzed to determine the best ads to satisfy customers’ expectations for a customized experience, delivered via privacy-respectful mSegs that do not produce a legal or similar effect for the data subjects. In the AdTech situation, data minimization and purpose limitation would be enforced by offering advertisers access to data subjects only in the context of mSeg lookalike audiences, and a data steward function would serve the needs of multiple web publishers, using first-party cookies allocated by each publisher for this very purpose to deliver the ads to the desired mSegs.
[42:58] So, on slide 35, what we see is that in order to enable consent and transparency for the data collection, storage, and agreement to receive customized ads, three types of data are captured - provided data, observed data, and inferred data.
[43:20] And slide 36 gives us a double click - a closer look at this. This slide is a mockup of the kinds of questions that could be used to improve the specificity and transparency of consent requests to the data subjects participating in a functional separation-enabled AdTech ecosystem.
Christopher Docksey (EDPS)
[43:43] Gary, could I pitch in here?
Gary LaFever (Anonos)
[43:46] Please. Please do, Chris.
Christopher Docksey (EDPS)
[43:48] Actually, I think this is one of the most important slides in the webinar. You've made a start here on a very necessary but really difficult task. This information is absolutely crucial if the GDPR transparency requirement is to be met. One of the main problems in the accountability case law I mentioned is that there was absolutely no information whatsoever from the website operator or Facebook on the processing that was going on behind the scenes. The court regarded this as totally unacceptable, and it found that both the website operator and Facebook were accountable for it. Indeed, the ICO fined Facebook for lack of transparency in the second case. One of the huge challenges of the GDPR is effective transparency, regardless of whether you’re using consent or legitimate interest. So, information like this on provided data, observed data, and inferred data would be a huge step in the right direction. Marty, do you have a take on this slide?
Martin Abrams (IAF)
[45:04] I do. Thank you very much. In 2014, the OECD, a global international organization of the industrialized world, published a paper setting out a taxonomy of data based on its origin. This concept of provided data, observed data, and inferred data was defined in that paper, and it's the basis for how many policymakers around the world think about the types of data. Being able to take that classification of data, put it into your own analysis of what's appropriate and what's not, and be public about the data you use is incredibly important. If someone wants to see that taxonomy, it's available on the IAF website.
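The origin taxonomy Marty describes can be made concrete with a small sketch. Everything here is illustrative: the enum values reflect the three OECD-style categories named above, but the field names and the mapping are invented examples, not any published standard or API.

```python
# Illustrative sketch only: labelling data elements by origin using the
# provided / observed / inferred taxonomy discussed in the webinar.
# The example fields and mapping below are hypothetical.

from enum import Enum

class DataOrigin(Enum):
    PROVIDED = "provided"   # actively supplied by the data subject (e.g. a form)
    OBSERVED = "observed"   # captured from behaviour (e.g. pages visited)
    INFERRED = "inferred"   # derived by analysis (e.g. a propensity score)

# Hypothetical classification of fields in a marketing profile:
FIELD_ORIGIN = {
    "email_address": DataOrigin.PROVIDED,
    "pages_visited": DataOrigin.OBSERVED,
    "purchase_propensity": DataOrigin.INFERRED,
}

def origins_in_use(fields):
    """Summarise which origin classes a given processing purpose relies on,
    so the classification can feed a transparency notice or an assessment."""
    return {FIELD_ORIGIN[f].value for f in fields if f in FIELD_ORIGIN}
```

A classification like this is what lets an organization be specific in its notices: instead of "we process your data," it can say which uses rest on data the individual provided versus data that was observed or inferred.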
Gary LaFever (Anonos)
[45:56] Thank you, Chris and Marty. I appreciate that. And that's really what we're talking about here. The two questions we keep asking are: “Do I have to do things differently?” and “Can I still achieve my business objectives?” The answer to both is: YES, if you do things differently. If you put in place technical and organizational controls that support policies reflecting the balancing of interests and proportionality, we believe you can not only do everything you've done in the past, but perhaps even more. And best of all, in a lawful and ethical manner.
[46:29] So, let’s move on to slide 37. The Anonos approach to functional separation-enabled direct marketing and AdTech can be divided into three phases: first, data collection; second, data science and analysis; and third, customer engagement and interaction. It's important to focus on the fact that consent serves as the cornerstone for lawful data collection, storage, and approval to send customized ads, and that legitimate interest leverages that consent to serve as a lawful basis for “thinking with data,” as it were, and for the delivery of customized experiences to consumers.
[47:14] So, for my last slides, slides 38 and 39, I just want to note that in the remaining minute I have, I don't have time to go through all the benefits of functional separation. But again, you will get these slides. I want to highlight that core capabilities the industry relies upon today, such as frequency capping and recency, frequency, and monetary (RFM) analysis, are still supported by functional separation-enabled mSegs. That's done by leveraging the data steward function, which provides the integration points necessary among the different industry stakeholder groups. In addition, and importantly, new and improved capabilities, like the fraud detection noted on slide 38, surveillance bidding deterrence, and omnichannel cross-device support that is not just AI pixie dust, are all possible with functional separation.
[48:16] Again, as I noted, everyone will receive a copy of this slide deck, and I'm happy to answer questions through the chat interface on the webinar. Afterwards, we will publish frequently asked questions in response to anything related to this subject matter that we're unable to cover during the webinar itself. So, with that, Marty, I turn the floor over to you so you can finish up before we start taking questions, as we move on to the next slide.
Martin Abrams (IAF)
[48:48] Thank you very much. Please go to the next slide. The fact is that yes, you have to do things differently if you're going to be complying with the law and still meet your business objectives. And what I'm saying is, while it takes work, you can actually do that. Please go to the next slide.
[49:19] My colleagues Marc Groman and Peter Cullen of the IAF published a blog, available on our website, which makes the point that right now regulators are pushing back and requiring consent in the area of AdTech and direct marketing. In part, they’re doing so out of frustration at seeing the lack of skills in organizations to actually do it right. But dropping back onto consent is a trap, because once you drop back on consent, there’s a question of whether that consent is actually adequate for the purposes, and the regulators will find that it's not. So, the fact is that all of the organizations on this call know that they need to step up. The question is, how do you get the skills and the willingness to do that? Please change the slides.
[50:18] As we saw before with the game piece my colleagues developed, legitimate interest is really about this ability to balance and not fall into the trap holes that are there. It's about a multi-factor balancing process. Some of us think of legitimate interest under the GDPR as a teeter-totter: my desire to use data is balanced against the individual's desire for the data not to be used. That's really not the case. Proportionality requires a balancing process that’s more algorithmic, which means thinking about all the interests in play to make sure that your use of data is appropriate at the end of the day and that all of the stakeholders’ interests are recognized. Please change the slide.
[51:18] There are at least two models for how one can do a legitimate interest balancing. Both were developed with involvement by regulators, particularly the ICO in the UK. The first is the model developed by the Data Protection Network. The second is a model that we at the Information Accountability Foundation published in 2017. Both models require you to truly understand all of the stakeholders impacted by the processing of the data, to be able to show that you understand the benefits and the risks to those parties and can balance them out, and to have a mechanism within your process to make sure that the outcomes you expect are indeed the outcomes you get. In other words, you have to have an internal enforcement process to make sure that your assessments are not just done, but done in an appropriate and adequate fashion. Please change the slide.
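As a toy illustration of the kind of record such a multi-factor balancing might produce, here is a minimal sketch loosely inspired by the DPN and IAF assessment models mentioned above. The factor names, the 0-5 scoring scale, and the class itself are all invented for illustration; a real assessment is a documented, largely qualitative exercise, not a single number.

```python
# Toy sketch of a multi-factor legitimate-interest balancing record.
# All names and the scoring scale are hypothetical illustrations.

from dataclasses import dataclass, field

@dataclass
class BalancingAssessment:
    purpose: str
    # Each stakeholder maps to (benefit, risk) scores on a 0-5 scale (assumption).
    stakeholders: dict = field(default_factory=dict)
    mitigations: list = field(default_factory=list)

    def add(self, stakeholder, benefit, risk):
        self.stakeholders[stakeholder] = (benefit, risk)

    def net_position(self):
        """Aggregate benefits minus risks across all identified stakeholders."""
        return sum(b - r for b, r in self.stakeholders.values())

    def demonstrable(self):
        """Crude proxy for accountability: if any stakeholder bears risk,
        at least one documented mitigation must be on file."""
        any_risk = any(r > 0 for _, r in self.stakeholders.values())
        return (not any_risk) or bool(self.mitigations)
```

The point of the sketch is the structure, not the arithmetic: the assessment must name every stakeholder (not just the controller and the data subject), record benefits and risks for each, and tie residual risks to documented mitigations that an internal enforcement process can check.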
[52:22] So, think about the fact that the law requires you not just to do something; it requires you to understand what it is you're doing. You need to understand the nature of the data that you're processing. Is it appropriate? Is it compatible with the uses you intend? Have you put an appropriate transparency process in place? Do you know all the parties that are going to be impacted by the processing, not just the data subject and you as a company? What does it mean for the marketplace? What does it mean for competition? What does it mean for society as a whole? You actually have to identify the stakeholders. You need to understand both the positive outcomes and the negative outcomes; you've got to accentuate the positive and mitigate the negative. You need to be transparent to the public. The European Court of Justice has been clear that you need to figure out how to communicate what you're going to do with the data and what your objectives are. That doesn't just mean a privacy notice; it means outside-the-box thinking about what it means to be transparent, and that's an area where we really need a lot of improvement. And you have to be able to mitigate the risks that come with the use of data, including the risks to the fundamental right to autonomy. In part, the separation of data factors is part of that mitigation, ensuring that the fundamental rights to autonomy and to family life are enhanced as you think about the other fundamental rights in place. Please change the slide.
[54:02] And when you think about accountability, we're stepping up in accountability. The global accountability dialogue in 2009 developed the essential elements of accountability. Today, we need to apply them in a more demonstrable way. Back in 2009, it was said that you need to stand ready to demonstrate. Now, you need to proactively demonstrate that you have all of the pieces in place to do an effective legitimate interest assessment and an enforcement process that works well with it, and you have to be able to demonstrate that you're competent to do the type of analysis that regulators require. If you can't demonstrate effectively that you truly can comply with the law, then the regulators and the courts will fall back on the question: “Did you get consent from the individual, and was that consent effective?”

[54:57] So, the fact is that everyone on this call has the ability to step up and do demonstrable accountability. It just requires the commitment and the knowledge to do so, and all the tools to do that are out there if you're just willing to use them. And I want to leave you with this positive message: every organization can step up and do the demonstrable accountability that's required by the GDPR, by the California Consumer Privacy Act, and by the other laws coming on board that move beyond consent to this concept of a legal basis to process. And with that, Gary, let's open up for questions.
 
 
Gary LaFever (Anonos)
[55:39] All right, so we'll close up on slide 47 very quickly and basically say: yes, you can continue to achieve your business objectives if you incorporate the different elements that we've talked about during this webinar. So, let's take the first question, which is an interesting one, since the ICO Draft Code for direct marketing actually raised some concerns: “Are there good things in the draft code?” Chris, do you think there are positive elements of the ICO Draft Code?
Christopher Docksey (EDPS)
[56:15] Yeah, I would say there are three. Firstly, as I mentioned, if you read the draft code, and if you read the June 2019 update report, you'll have a very clear idea of what is required to do AdTech advertising lawfully. Secondly, the definition of direct marketing purposes, which would include the stages of analysis discussed in this webinar. And finally, the emphasis on accountability and accountability tools, such as data protection by design and by default and, in particular, data protection impact assessments.
Gary LaFever (Anonos)
[57:04] Thank you for that. Marty or Sachiko, do you have anything you wanted to add on that?
Martin Abrams (IAF)
[57:10] I think Chris is right. The fact is that if you want to be able to do profiling for market segmentation, you have to prove to the regulator that you have the competency and the integrity to truly do it in a fashion that is respectful of the individual's fundamental rights, including the right to understand and to object. And if you can't do that, then you're limited in what you can do with the data in terms of market segmentation.
Gary LaFever (Anonos)
[57:46] So, Sachiko, perhaps you could answer the second question and it’s related. “Why is the ICO so skeptical about legitimate interest for subsequent processing?”
Dr Sachiko Scheuing (Acxiom)
[57:58] I think, and this is once again just my assumption, that there is this myth that consent is a superior legal basis to all the other legal bases. I want you to really think about why the GDPR evolved in the way that it did, embracing the accountability principle. It is because the European Data Protection Directive was considered inadequate, due to the shortcomings of consent. So, I find that very interesting.
Gary LaFever (Anonos)
[58:34] That's fascinating. We apologize that we could not get to more of your questions. If you want to send a follow-on question to LearnMore@anonos.com, we will follow up on any questions received during the webinar through the interface or submitted to that address. We very much appreciate your time today; thank you very much, and best of luck to everyone on the call. Thank you. Goodbye.




Are you facing any of these 4 problems with data?

You need a solution that removes the impediments to achieving speed to insight, lawfully & ethically

Roadblocks to Insight
Are you unable to get desired business outcomes from your data within critical time frames? 53% of CDOs cannot achieve their desired uses of data. Are you one of them?
Lack of Access
Do you have trouble getting access to the third-party data that you need to maximise the value of your data assets? Are the third parties and partners you work with worried about liability or disruption of their operations?
Inability to Process
Are you unable to process data due to limitations imposed by internal or external parties? Do they have concerns about your ability to control data use, sharing or combining?
Unlawful Activity
Are you unable to defend the lawfulness of your current data processing activities, or data processing you have done in the past?
THE PROBLEM
Traditional privacy technologies focus on protecting data by putting it in “cages” or “containers,” or by limiting use to centralised processing only. These limitations are imposed without considering the context of the desired data use, including decentralised data sharing and combining. Such approaches are based on decades-old, limited-use perspectives on data protection that severely restrict the kinds of data uses that remain available after controls have been applied. On the other hand, many new data-use technologies focus on delivering desired business outcomes without considering that roadblocks may exist, such as the four problems noted above.
THE SOLUTION
Anonos technology allows data to be accessed and processed in line with desired business outcomes (including sharing and combining data) with full awareness of, and the ability to remove, potential roadblocks.