Introduction to Fair Trade Data - Balancing Innovation and Data Privacy

Fair Trade Data Webinar
Presentation Transcript
Gary LaFever
CEO & General Counsel
Anonos
Günther Leissler
Counsel
Schoenherr Attorneys at Law
Introduction to Fair Trade Data - Balancing Innovation and Data Privacy
Gary LaFever:
We would like to welcome everyone to the first webinar on Fair Trade Data, and I would like to start by introducing my co-host, Günther Leissler. Günther, would you like to introduce yourself, please?
Günther Leissler:
Thank you very much, Gary. My name is Günther. I’m an attorney specializing in data protection law. I’m based here in Vienna, Austria, so kind of in the heart of the GDPR and everything that comes along with it. We have been involved in lots of discussions about the whole variety of legal aspects that come along with the GDPR, and we are now facing numerous proceedings across that same spectrum. Given the breadth of what we have faced thus far, I'm more than excited about this new approach of putting all this regulatory substance into the techniques, into the data, into the processing of the data. I think that is a completely new approach, and a driver not only for those having to process and work with data, but also for attorneys having to give advice on the processing of data. That's my view on that.
Gary LaFever:
Fantastic. And I should let the audience know that when Günther and I first met, he was representing a client using BigPrivacy technology, and we really struck a chord because we both realized that while compliance with the law is important, what's actually more exciting is that the GDPR provides a means not only to comply but also to innovate - and quite arguably even to expand opportunities for data use, both internally, if you comply with technical and organizational requirements, and externally: the ability to get data from other parties in a way that is privacy-respectful and enhances the value of your data, and the ability to share your data externally. So really, Fair Trade Data is about that. It is going to be primarily focused on secondary uses, and this is a concept that is starting to take off.
Achieving AI’s Full Potential with Fair Trade Data
You should see on your screen right now links to two materials. The first is an article in Lexology entitled “Achieving AI’s Full Potential with Fair Trade Data,” and we may refer to that from time to time. It's a short article, but it has some good substantive points related to Fair Trade Data. The other one is a link to the EU Agency for Network and Information Security - you may know them as ENISA - and their report on privacy by design in big data. So this is going to be a relatively brief webinar, where Günther and I simply have a discussion about what Fair Trade Data may or may not mean and compare it to this one article and this one ENISA report, and we hope all the viewers get some value out of it and learn something from it.

So with that, Günther, I'm just curious. I'm going to start off with what the Lexology article defines as Fair Trade Data and Conflict Data, and would just love to get your reaction to these concepts.
Fair Trade Data refers to data that has embedded, technically-enforced, granular privacy controls to eliminate the risk of 'Conflict Data' and to protect against bias, discrimination, and violation of data subjects' privacy.
So Fair Trade Data refers to data that has embedded, technically-enforced, granular privacy controls. And why? To limit Conflict Data, which I'll define, so that it can protect against bias, discrimination and violation of data subjects’ fundamental right to privacy. So, that's what Fair Trade Data is. When we use the term “Fair Trade Data,” what we're going to be referring to is technology that has embedded controls to accomplish those goals. So, what's Conflict Data? Conflict Data is the risk of personal information concerning an individual (obviously) being used to the disadvantage of that very same individual. It's analogous to conflict diamonds, where a rebel group may take minerals, diamonds, et cetera, that are actually harvested or mined from a sovereign entity and use the proceeds of their sale against that very sovereign entity. So we're talking about data that can be used against the best interest of the person to whom the data relates. And we want to make it clear here that misuse may not always be intentional. We're not just talking about criminals and bad people. We're talking about the use of data in ways beyond what the data subject expected or what the law allows.

So just curious, Günther, to get your reaction to those two concepts - Fair Trade Data and Conflict Data.
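To make the idea of "controls embedded in the data" a little more concrete, here is a minimal, purely illustrative sketch in Python (the field names, policy and API are hypothetical, not a description of any particular product such as BigPrivacy): each record travels with its own use policy, consumers only ever see the output of a purpose-specific view, and identifying values are replaced with pseudonyms unless the policy says otherwise.

```python
# Illustrative only: a record that carries its own technically-enforced use policy.
import hashlib
from dataclasses import dataclass, field

def pseudonym(value: str, salt: str) -> str:
    """Deterministic pseudonym so records stay linkable without exposing identity."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

@dataclass
class ProtectedRecord:
    data: dict
    # purpose -> fields that may be revealed in intelligible (cleartext) form
    policy: dict = field(default_factory=dict)
    salt: str = "rotate-me-per-release"

    def view(self, purpose: str) -> dict:
        allowed = self.policy.get(purpose)
        if allowed is None:
            raise PermissionError(f"No permitted use defined for purpose: {purpose}")
        return {
            k: (v if k in allowed else pseudonym(str(v), self.salt))
            for k, v in self.data.items()
        }

record = ProtectedRecord(
    data={"name": "Alice Example", "postcode": "1010", "diagnosis": "J45"},
    policy={"analytics": {"postcode", "diagnosis"}},  # name is never revealed here
)
print(record.view("analytics"))   # name comes back pseudonymized, the rest intelligible
# record.view("marketing") would raise: no permitted use was defined for that purpose
```

The point of the sketch is only that the decision about what is revealed is made by the data object itself, not by whoever happens to hold a copy of it.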
Günther Leissler:
Thank you, Gary. From my perspective, what is really interesting about this is … I can tell you, I'm located here at a law firm, Schönherr Attorneys at Law, based in Vienna, and we have coverage across a lot of member states in Europe, with all the variety and all the different levels of knowledge that brings. So if you have a company located, for instance, in Germany or in Austria, and you have your branches or subsidiaries somewhere in the CEE countries, you should not expect the very same level of knowledge of the GDPR and the regulatory framework in each of these countries. So the more you are able to shift this compliance level, this level of legal responsibility, into your data - the more intelligent your data processing is - the less you need to rely on local data protection knowledge and the better you can ensure a uniform level of compliance throughout your corporate group, no matter where you are located. From the attorney’s way of thinking, that's one of the very, very innovative aspects of this concept. And on the other hand, of course, if you are able to use intelligent data protection services, you're also better off when it comes to giving evidence that you have secured what can be secured under the GDPR. It gives relief to you as the CEO, the DPO, the compliance manager or whoever - the more you can put this into the processing techniques themselves.
Gary LaFever:
So what I hear you saying, which is fascinating, is that for multinational companies - companies doing business in a variety of different countries and jurisdictions - the sophistication of the people in each of those countries will differ. And by embedding the controls - the Fair Trade Data controls, as it were, the intelligent controls - within the data itself, you not only get more uniformity of processing across the different jurisdictions, but you can also support your obligations as a data controller or data processor to specify what you've done, how you've done it and the predictability of it, so that with regard to your customers, your partners and the regulators, you have a more defensible position. Is that accurate?
Fair Trade Data supports centralized compliance over decentralized processing because the controls flow with the data to enable more efficient decentralized DPO practices with greater uniformity of governance expertise.
Günther Leissler:
Yeah, that's totally true. And there are some other advantages as well that we didn't mention so far. For instance, if you appoint a central DPO within your corporate group - located, for instance, at a parent company - that DPO is also better off understanding how the system works no matter which branch is involved, as opposed to the current situation where, in a lot of cases, you have to nominate DPOs on a decentralized basis, each covering their own jurisdiction. So the compliance handling would be easier as well if you follow this Fair Trade Data approach.
Gary LaFever:
Interesting. So it's almost centralized compliance over decentralized processing, because the controls flow with the data. It's a good way to look at it. So there are five things that are mentioned in the Lexology article, and what I thought I'd do is just touch upon each of those, and then we'll move over to the ENISA report.
Technical and Organizational Safeguards Required for Pre-GDPR Data and Advanced Processing to be Legal
What the article is saying is basically that Fair Trade Data principles embody these five things. The first is technical and organizational safeguards required for pre-GDPR data to be legal and for advanced processing to be legal. So we'll touch upon both of those quickly.

So the first one deals with Recital 171, under which there is no grandfather clause. Upon enactment of the GDPR - which a lot of people don't realize came two years after adoption: it was adopted originally in April, and then, once it was available in all the different languages in May, two years were provided for people to come up to speed - so by May 25, 2018, companies were supposed to have taken all the data that had been collected previously and brought it up to the new standards of the GDPR. The inability to support those standards - I would think primarily because of ineffective consent, broad-based generalized consent that doesn't meet the new requirements - had to be rectified, and there is no grandfather clause saying it's okay that you collected this in the past. So we'll just stop on that one. If Fair Trade Data controls mean controls embedded in the data, and you have an obligation to bring data you collected in the past up to the requirements of the GDPR, how does that happen? Is it because it enforces the controls necessary for legitimate interest processing, or how might that occur?
Fair Trade Data can help transform illegal pre-GDPR data to become legal by enforcing controls necessary to support legitimate interest processing.
Günther Leissler:
I would say it helps on that end because, as you correctly said, there's no grandfather clause or anything that gives you absolute clearance to keep processing the data under the GDPR as you've done before. So you have to react. And very often, it's not easy to react on a purely legal basis. It's not easy to re-establish new consent under the GDPR. There, again, you have different requirements depending on the jurisdiction - how to obtain consent, the voluntariness of the consent and all that. It's not so easy to do. But if you can shift this, at least partially, to the nature of the data you're working with - if you can relieve the pressure on your legal basis by shaping the data to a level of intrusiveness that complies with the GDPR - it might allow you, because you have changed the nature of the data, to shift from consent to legitimate interest. That, of course, is a huge advantage for every company under the GDPR, and it's essentially what we have said before.

Also, it allows you to act in a more harmonized way, because you no longer have to deal with consent specifics country by country. You could do a kind of centralized balancing test based on this GDPR-approved nature of the data you’re working with.
Gary LaFever:
And we should point out that while typically you can't switch from one legal basis to another, the Article 29 Working Party, in the April 2018 Guidelines on Consent, absolutely acknowledged that because the law's requirements were changing, controllers had a one-off opportunity to do this. So that's very important: typically you do not have this opportunity, but the regulators explicitly acknowledged a one-off opportunity because of the change in the law. So that's great, and that hits the historical data. On a similar note, the Lexology article talks about how Fair Trade Data could be helpful in supporting a non-consent and non-contract legal basis - in essence, legitimate interest processing. And I just want to highlight, for those who may or may not be aware: the April 2018 Consent Guidelines talked about what consent doesn't cover and about the fact that consent has to be voluntary - there must be a non-consent means of using ... in other words, you have to give a data subject the right to access a product or a service without consenting to the use of their data for secondary processing.
Fair Trade Data can help support legitimate interest processing for lawful analytics, AI, product development and profiling not supported by consent or necessary for contract.
So if you don't give them that alternative approach, it's not freely given - it's a condition precedent, right? Giovanni Buttarelli calls it blackmail consent. So you can't do that. So the April 2018 Guidelines on Consent identify the limitations of consent. Similarly, just recently, in the April 2019 European Data Protection Board Guidelines on contract, they identified the things that "necessary for contract" won't support. So profiling, product development, and a lot of the different processing and analytics that people want to do, thinking they can rely on consent or contract - they can't.

And so again, it highlights perhaps the value of Fair Trade Data capabilities for advanced analytics, AI and things like that. Could you speak to that?
Günther Leissler:
Absolutely. I think no matter what you’re referring to - it might be profiling, it might be scoring, or it might even be purely masked data processing - you want to rely on a balancing-of-interests test because you don't have a contractual relationship whatsoever.

The overall outcome of what the GDPR wants to have ensured is this: if you weigh the company's interest in holding and processing the data against the secrecy interest of the data subject - whether that's an employee, a consumer or whomever - then your overall outcome should be that the company's interest overrides, or at least is not overridden by, the secrecy interest of the individual. And the more the data gets shaped - the more the data by itself says, “Okay, now I'm in a state that allows big data mining,” or the fewer conflicting secrecy interests you have with the data subject - the more it supports the balancing of interests.

And so, that's a completely new approach. So far, legal thinking has treated the data purely as data, without having in mind that the data can change its nature. And by changing its nature, it can become supportive of your company's interests.
Gary LaFever Gary:
So I want to jump over for a minute. We've been talking about the Lexology article. But the ENISA report actually had some interesting concepts that tied into what you're talking about. I'm just going to read a quick quote here from page five.
We take the view that selectiveness (for effectiveness) should be the new era of analytics.
“We take the view that selectiveness (for effectiveness) should be the new era of analytics. This translates to securely accessing only the information that's actually needed for a particular analysis (instead of collecting all possible data to feed the analysis).” So what you just touched upon is kind of a data minimization perspective at collection, but it seems that Fair Trade Data concepts actually support data use minimization, where you have the data and you can manage its use for different purposes. It would seem to me that this enables you to have data for a broader variety of uses, because you can show that you're enforcing data minimization, purpose limitation and even data protection by design and by default on a use-case-specific basis, to help with the balancing-of-interests test. Any thoughts or perspectives on that?
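As a rough illustration of that "selectiveness (for effectiveness)" idea and of use-case-specific data minimization - the columns, use case and helper names below are invented for the example - a per-purpose view can carry only the fields, and only the level of detail, that a given analysis actually needs, rather than handing over the full dataset:

```python
# Illustrative sketch: derive a minimized, use-case-specific view of a fuller dataset.
from typing import Callable

FULL_RECORDS = [
    {"name": "Alice", "birth_year": 1984, "postcode": "1010", "spend": 120.0},
    {"name": "Bob",   "birth_year": 1999, "postcode": "1090", "spend": 80.0},
]

# Each use case declares the minimum it needs; anything else never leaves the store.
USE_CASE_VIEWS: dict[str, dict[str, Callable]] = {
    "churn_analytics": {
        "age_band": lambda r: f"{(2024 - r['birth_year']) // 10 * 10}s",  # generalized
        "region":   lambda r: r["postcode"][:2] + "xx",                   # not exact
        "spend":    lambda r: r["spend"],
    },
}

def minimized_view(use_case: str) -> list[dict]:
    spec = USE_CASE_VIEWS[use_case]
    return [{col: fn(rec) for col, fn in spec.items()} for rec in FULL_RECORDS]

print(minimized_view("churn_analytics"))
# e.g. [{'age_band': '40s', 'region': '10xx', 'spend': 120.0}, ...] - no names at all
```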
Günther Leissler:
I think that's also in line with the thinking of the GDPR. If you turn your data, for instance, from a personalized basis to an anonymized basis, if you soften the intrusiveness of the data, then the more you can do with the data - the more you're allowed to collect it and explore it. So it all comes down to this: the more you diminish the intrusiveness and personalization of the data, the more flexibility you have in what you want to do with your data, and the GDPR will not prevent you from doing so.
Gary LaFever:
Yeah. I almost get the impression that data protection by design and by default was explicitly hoping these types of capabilities would come about, because I think, quite arguably, it requires them.
Günther Leissler:
Yeah. Compared to the previous data protection law we had here in Europe, this is a new concept - the concept of privacy by design and privacy by default - which puts the technical aspects of data processing at the center. And as it is a new concept, there are no guidelines, no case law or equivalent here yet. So you can make the argument for what you're doing if you can present a conclusive picture under the GDPR, and one of those arguments is to say, “I'm using data, and the data on its own satisfies certain technical standards, so we feel we can argue it is compliant with the GDPR.” That's the way we can argue for using it as we use it. And essentially, that's privacy by design. You say, “I'm shaping my data, and I'm processing my data in that shape.” And that can be one of several alternatives to satisfy the privacy-by-design requirement under the GDPR.
Transparency and Audit Controls: Enable the availability of statistical properties of data sets to aid in interpreting decisions made using the data and to ensure auditable compliance with data privacy and use policies.
Gary LaFever:
Well said. So we've hit upon the first two items in the Lexology article. The next one is transparency and audit controls, which gets a lot of attention, particularly in the context of AI. I personally think of transparency oftentimes as, “Are you providing the data subject the ability to see what data you have about them and who you share it with?” But I think this is a different kind of transparency. It's explaining to the data subject, in a way they can understand, what you're doing with their data - not necessarily how the algorithms themselves work, because that may not even be possible to explain in advance - but saying that you're applying privacy-enhancing techniques and that you're going to be taking these steps to put the controls in the data itself.

And in that respect, you can explain those in advance - not at the level you'd have to in order to support consent, but at a level where you're actually putting them on notice of how you're going to use the data. So that's the transparency side, at least as I see it. And then there's auditability: when a regulator comes in, or even your own internal auditors or external financial auditors - and I could see, as time goes on, external auditors asking before they render a financial fitness opinion, “What do you do to do these things?” So, on this next point of transparency and auditability, do you have any particular thoughts or perspectives on how Fair Trade Data concepts might be helpful in that regard?
Günther Leissler:
I would say it’s similar. As you said, there are two ways to look at transparency. The first is being transparent towards the user, towards the customer, which essentially means you have to satisfy the requirements under GDPR Articles 13 and 14 and provide proper information. And there, of course, it's going to be crucial: the more innovative the methods and techniques you're using, the more you have to draw a picture for your customers and your employees of what you're doing, so that they can understand the way you're processing their data. The other part of transparency is being transparent to regulators or to courts of law. So far, it has been silent here in Europe on that end - there's no case law on the extent to which you have to be transparent when you get audited. What we have experienced so far is that the regulator wants to see the processing register you have, and if you have done a privacy assessment, you have to demonstrate that as well. So essentially, it is paperwork. But when it comes down to inspecting your automated systems, your databases or whatever you have, it gets more complicated for the regulators as well. The regulator needs to have technicians and needs to understand the way you are processing your data from an IT perspective. And if your databases have layers showing where your data is addressed, where it is encrypted and where it is clear data, that of course forms something you can, from a legal perspective, also call transparency under the GDPR. In my view, this should not be underestimated. So far, people have focused more on transparency by explaining to the data subjects what they're doing. But when it comes to an audit, it's no less important to demonstrate transparency by making the regulator understand how you process your data when they are really inspecting on-site.
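One way to picture the audit side of this - again only a sketch, with hypothetical fields rather than any prescribed format - is an append-only log of every decision the embedded controls make, so a regulator or auditor can later replay who asked for which data, under which purpose, and what was actually released:

```python
# Illustrative sketch: an append-only audit trail of data-use decisions.
import json, time

AUDIT_LOG = "data_use_audit.jsonl"

def log_decision(requestor: str, purpose: str, fields_requested: list,
                 fields_released: list, decision: str) -> None:
    entry = {
        "ts": time.time(),
        "requestor": requestor,
        "purpose": purpose,
        "fields_requested": fields_requested,
        "fields_released": fields_released,
        "decision": decision,          # "granted", "partially granted", "denied"
    }
    with open(AUDIT_LOG, "a") as f:
        f.write(json.dumps(entry) + "\n")

log_decision(
    requestor="analytics-team",
    purpose="product_improvement",
    fields_requested=["name", "postcode", "diagnosis"],
    fields_released=["postcode", "diagnosis"],   # name was withheld by policy
    decision="partially granted",
)
```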
Gary LaFever:
You're really making their job easier. It's a roadmap. Yeah.
Günther Leissler:
The whole audit, I guess, is going to be more structured, which will be beneficial for both sides, needless to say.
Gary LaFever:
Well, good perspective.
Cross-Sectional Policy Enforcement: Enable common data store(s) to programmatically support data protection and privacy rights management policies applicable to different entities and locations (i.e. companies, industries, states, countries, regions, etc.) - and to do so simultaneously.
So we have two more things, at least from the Lexology article. We've already touched upon this, but I just wanted to go back to it briefly. The fourth item is cross-sectional policy enforcement, and it reads: enabling common data stores to programmatically support data protection and privacy rights management policies applicable to different entities and locations - and to do so simultaneously.

So you touched earlier upon the fact that different countries may have different levels of sophistication with regard to the GDPR. But truly international companies have to comply not only with the GDPR but also with vertical industry requirements, data sovereignty and localization requirements in other countries, and cross-Atlantic regimes. So there are a lot of issues involved, and it would seem to me that the principles of Fair Trade Data, where you're technologically enforcing the controls in the data, should help to - and I hesitate to say automate, because you're not taking the discretion away from the data controller; the data controller would always be the one determining the policies - but the Fair Trade Data principles could enforce those policies, could operationalize them. So do you have any further comments on that one point of cross-sectional policy enforcement?
Günther Leissler:
From an attorney’s perspective, I would call it language. It creates a kind of common language within the whole corporate group worldwide, no matter which jurisdictional requirements you’re subject to. The more centralized the data processing becomes, the more the whole corporate group needs one common language in terms of data processing.
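A minimal sketch of what that "common language" could look like in code (the jurisdictions and rules here are invented for illustration): one common data store, with the transformation applied to each record selected programmatically from the policy of whichever jurisdiction or sector the processing falls under, and all of them served simultaneously.

```python
# Illustrative sketch: one record, several jurisdiction-specific policies, applied in code.
RECORD = {"name": "Alice", "national_id": "1234", "salary": 52000}

POLICIES = {
    "EU": {"drop": ["national_id"], "pseudonymize": ["name"]},
    "US": {"drop": [],              "pseudonymize": ["national_id"]},
}

def apply_policy(record: dict, jurisdiction: str) -> dict:
    rules = POLICIES[jurisdiction]
    out = {}
    for key, value in record.items():
        if key in rules["drop"]:
            continue                                   # field never leaves the store
        # NOTE: hash() is only a placeholder; a real system would use a keyed,
        # deterministic pseudonymization function.
        out[key] = (f"token-{hash(value) & 0xffff:04x}"
                    if key in rules["pseudonymize"] else value)
    return out

# The same store can serve every jurisdiction at once - the policy lives in the data
# layer, not in each local team's head.
print({j: apply_policy(RECORD, j) for j in POLICIES})
```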
Real-Time Policy Adjustment: Adjust in real-time to the changing requirements of policies by dynamically modifying the intelligible form of data into which dynamically obscured data are transformed.
Gary LaFever:
All right, the last one here, which is kind of interesting - and again, these are just five things within this particular Lexology article that are associated with Fair Trade Data - is real-time policy adjustment. What's going to be interesting is that, as you pointed out, some of these principles have not yet been litigated. There's no case law. People are doing their best to anticipate what the law will be. But as time goes on, there'll be more enforcement actions, lawsuits, et cetera. And as there's more and more clarity on what these rules mean, it seems to me that, again, Fair Trade Data principles would enable you to adjust the parameters and controls in real time to track the evolution of how these laws are interpreted, as opposed to doing it manually. So I'm just curious if you have any particular thoughts on that as our life with the GDPR expands and continues.
Günther Leissler:
From the perspective of a legal advisor, the huge advantage there would be that the GDPR is a union-wide uniform law. So whenever something happens - be it at a court or with a regulator in one country - it has an impact on all the other countries, because we are talking about one uniform law. So whenever you have to react - for instance, if you have to change the way you delete your data or restore your data because you're obliged to do so in one country - you have to react in the other countries as well, because it's one uniform law. So the faster and the more centrally you can do that, of course, the better off you are in all those other countries as well.
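To illustrate the real-time adjustment point (the policy flag and fields below are hypothetical): if the pipeline reads the policy at the moment of use, tightening a rule centrally - say, after new guidance or an enforcement action - changes every subsequent output immediately, without manually reworking each local pipeline.

```python
# Illustrative sketch: the intelligible form of the output follows the current policy.
policy = {"reveal_birth_year": True}

def render(record: dict) -> dict:
    # hash() is just a stand-in for a proper keyed pseudonymization function
    out = {"customer": f"pseudo-{hash(record['name']) & 0xffff:04x}"}
    if policy["reveal_birth_year"]:
        out["birth_year"] = record["birth_year"]
    else:
        out["age_band"] = f"{(2024 - record['birth_year']) // 10 * 10}s"
    return out

record = {"name": "Alice", "birth_year": 1984}
print(render(record))                      # birth year still intelligible

policy["reveal_birth_year"] = False        # central, real-time policy adjustment
print(render(record))                      # same pipeline now emits only an age band
```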
Gary LaFever:
Fantastic. So I'd just like to hit one or two other things that are in the ENISA report. And it ties into one of the other things that changes under the GDPR: the liability between co-controllers, and between controllers and processors. So again, liability can and does arise between co-data controllers and between data controllers and processors, and both of them can be liable to a data subject.

Only if they have no liability at all are they excused. And so, if any of those parties is in the least bit culpable, they could be held responsible for the entirety of a data subject's claim, and then, and only then, have the right to go back against their co-controllers and their processors.
Therefore, there is a need for automated policy definition and enforcement, in a way that one party cannot refuse to honor the policy of another party in the chain of big data analytics.
And there's an interesting point in the ENISA report - this is on page 6: “There is a need for automated policy definition and enforcement, in a way that one party cannot refuse to honor the policy of another party in the chain of big data analytics.” And then, later on the same page, it says, “Thereby putting automated enforcement of privacy requirements and preferences into the data.” So it's almost as if the ENISA report anticipated, or hoped, that principles and capabilities would arise and develop to enable this to happen. Because otherwise, if I give my data to a co-controller or a processor and rely on policies and contracts for them to process it correctly, I expose myself to liability, and I limit my opportunities for sharing the data to those parties I feel comfortable with under that contract. Whereas if I can embed those controls, I have greater comfort that those controls will be abided by, because they're technologically enforced - which may actually increase the opportunities for data sharing and combining. I'm just curious if you have any perspectives on that, or anything else in the ENISA report that you found relevant to Fair Trade Data concepts.
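A simple way to picture enforcement that a downstream party cannot refuse to honor - again an illustrative sketch with invented names, not the ENISA report's own design - is that only the protected form of the data is ever shared, and the lookup needed to re-identify it stays with the originating controller, so the recipient can analyze but cannot un-pseudonymize, regardless of what its own policies or contracts say.

```python
# Illustrative sketch: the originating controller shares only protected data and keeps
# the re-identification key, so downstream parties cannot opt out of the protection.
import secrets

class OriginatingController:
    def __init__(self):
        self._lookup = {}                      # token -> identity, kept internally

    def share(self, records: list[dict]) -> list[dict]:
        shared = []
        for r in records:
            token = secrets.token_hex(8)
            self._lookup[token] = r["name"]    # retained here, never shared
            shared.append({"token": token,
                           "postcode": r["postcode"][:2] + "xx",   # generalized
                           "spend": r["spend"]})
        return shared

    def reidentify(self, token: str) -> str:   # only the originator can do this
        return self._lookup[token]

controller = OriginatingController()
outbound = controller.share([{"name": "Alice", "postcode": "1010", "spend": 120.0}])
print(outbound)   # the co-controller or processor receives only this protected form
```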
Günther Leissler:
I think it is quite interesting from a legal perspective when talking about liabilities. We have had some complex scenarios where a lot depends on the contract between the controller and the processors. Needless to say, it also depends on which data is processed and which damage has occurred. But on a more general basis, the GDPR acknowledges joint controllership in Article 26 and, coming along with that, joint responsibility. Joint responsibility between two or several controllers means they have to find a common understanding about their responsibilities, because if somebody - a data subject or another third party - raises claims against one of those controllers, that controller can be held liable for the whole group of controllers and has to satisfy the claim, if justified, and only then can call in the other controllers. But again, here I would call it technical language. The more these joint controllers speak the same technical language, and the more intelligent the data becomes at satisfying GDPR requirements, the less room there is for disputes about misinterpreted policies, outdated policies, or contracts that did not sufficiently cover all aspects of the joint controllers' data processing. So I would say the battleground for disputes between controllers might be diminished, the more the future lies in technically shaping data under the GDPR. Again, a very innovative approach, but something that is certainly important given this giant layer of joint liability under Article 26 of the GDPR.
Gary LaFever:
Fantastic. Well, in closing, what I'd like to do is just ask your perspective on the GDPR’s intent to actually help facilitate innovation, because I think a lot of people just think of it as an attack on innovation rather than as a means to actually facilitate innovation - and then, the extent to which you see Fair Trade Data principles helping to facilitate that.
Günther Leissler:
In an overall view, what we see as one of the key difficulties is role allocation around the controller. In reality, with the data processing we are talking about, it is very often quite burdensome to do proper role allocation under the GDPR. I would expect that it gets more and more complicated the more artificial intelligence comes in, because then the systems themselves take more and more control, and you want to avoid ending up in years-long disputes about who has to take responsibility for data processing techniques that may act on their own. So I would say it is for sure the more advanced approach if the system by itself takes care of GDPR compliance, so you never get onto the path of having to discuss who has to take responsibility, who is the owner of the data, who is the owner of the algorithm. The more you can put it on this avenue, the less you have to discuss responsibilities in this giant role-allocation debate.
Gary LaFever:
Very helpful. And the one thing I've heard, which is interesting, is the concept of friction. The idea of Fair Trade Data may at first seem to introduce new friction, because you're inserting new processes and pre-processing on the data that you didn't have before. But I would actually say that this friction creates frictionless data use and sharing, because of what you've established in the data. And so, I think it's really a mind shift. The Fair Trade Data concept might appear to require new elements in processing and controls that didn't exist before, but that's actually because of the law. If you incorporate them, though, I really think what we're saying is that it may not only keep you out of trouble, it may actually increase your opportunities for further partnerships - both taking in data from third parties and sharing data in a privacy-respectful way - as well as with your customers, because you can give your customers a greater sense of confidence and trust that you're conducting your business in a way that respects their fundamental rights. So, fantastic conversation, Günther. Anything in closing that you'd like to share with the audience?
Günther Leissler:
I think there's nothing more to add to what you’ve said.