Introduction to
Functional Separation

Gary LaFever (Anonos)

[00:07] I like to tell the story, and it's very true. When people talk about European masters, we typically talk about artists. But if you want to know about data protection, the Europeans are the masters without question. And there's a concept that was developed in European law called functional separation. What's fascinating now is to look at the new laws. India has a proposed law, one of those where you can go to jail. Brazil has one. California has one. And for those of you who know anything about the US: when California does something, sooner or later the country does it, right?

[00:43] They all have a concept of functional separation. They may call it different things. Under the GDPR, it's reflected in Pseudonymisation, which is a term of art. As a recovering attorney, I'll note that you don't use a term in law the way you do in conversation. You have to look to see what the law says. So, Pseudonymisation has a very specific requirement, which is that you separate information value, so you can process it, from identity.

[01:09] The French CNIL actually has a copy of our software, BigPrivacy, that they're looking at because of the French government's announced artificial intelligence initiative. Can you still process the data and have less discrimination? There's no silver bullet. So, what I wanted to very quickly do is take about half a dozen points that have been raised throughout the day and connect them to this concept of functional separation, because we think it's very powerful, and no one has a patent on functional separation.

[01:36] As I said, it emanated from European data protection law. The best place to read about it is a 2015 guidance from the European Data Protection Supervisor that does a great job of explaining it. Fundamentally, it's technical and organisational controls, or safeguards, that allow you to controllably separate, at a granular level, "who" someone is from "what" they represent, with controlled means of relinking. So, this is not anonymisation, which is another ill-understood term. It is truly Pseudonymisation.
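The core idea, separating the "who" from the "what" while keeping a controlled means of relinking, can be illustrated in code. This is a hypothetical minimal sketch, not Anonos's BigPrivacy implementation; the class name, record fields, and token scheme are all assumptions for illustration only:

```python
import hashlib
import hmac
import secrets

class Pseudonymiser:
    """Illustrative sketch: swap direct identifiers for pseudonyms,
    keeping the relinking material under separate, controlled custody."""

    def __init__(self):
        # The key and lookup table are the "controlled means of relinking":
        # whoever holds them can re-identify; analysts cannot.
        self._key = secrets.token_bytes(32)
        self._lookup = {}  # pseudonym -> original identity

    def pseudonymise(self, identity: str) -> str:
        # Keyed hash so pseudonyms are consistent within this key's scope
        # but unlinkable without the key.
        token = hmac.new(self._key, identity.encode(), hashlib.sha256).hexdigest()[:16]
        self._lookup[token] = identity
        return token

    def relink(self, token: str) -> str:
        # Authorised re-identification via the separately held table.
        return self._lookup[token]

# A hypothetical record: the "who" (name) is separated from the "what"
# (diagnosis), which remains available for analysis.
p = Pseudonymiser()
record = {"name": "Alice Example", "diagnosis": "flu"}
protected = {"name": p.pseudonymise(record["name"]),
             "diagnosis": record["diagnosis"]}
```

Analysts working on `protected` see the information value without the identity; the controller, holding the key and lookup table separately, can still relink when lawfully required. That functional split between data and relinking material is what distinguishes this from anonymisation, where no relinking is possible.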

[02:12] So, again, just a couple of things. In one of our first sessions someone said: "It seems to me what you need to do is to integrate controls into the data-driven organization. You can't rely on contracts and policies." The concept of functional separation has that requirement built into it: to benefit from functional separation under the GDPR, Pseudonymisation demands those integrated controls. If you look, Pseudonymisation is never required under the GDPR, but there are 15 places where, if you do it, you get special rights and special usage. Why? Because you have raised the bar to protect and respect the rights of individual data subjects, so you get something in return.

[02:46] Next, business cases, which came up in several different sessions. You'll never get funding for data governance if you don't lead with the business case. And it's true, but functional separation actually enables you to show business cases and benefits.

[03:02] We had a great conversation on the carrot versus the stick. One of the examples was: "If I can show a carrot, which is new use of data, an expanded use of data and value of data, I can get funding." Someone else said: "Yeah, but I'm just going to carry a stick." And the biggest stick that was mentioned was the fact that most historical data was collected using broad-based consent. That's now illegal to possess or process. Encrypting it doesn't make it any less illegal.

[03:26] And so, what was raised is: "Does this functional separation concept give you a potential answer?" Well, actually, yes. Because if you convert the data and pseudonymise the data to support legitimate interest processing, you now have the ability to use it on a non-consent model. So, another example. Consent does not support iterative analytics. Fascinating conversation. "How do I get, in advance, a specific and unambiguous consent for something that I don't know yet? I want to keep data. Why? I'm not sure. But it might be useful in the future." The reality is, once again, the GDPR and these other laws say: "If you can show you have technical measures that enforce functional separation, that may give you legitimate interest, again, a term of art, not a topic of discussion, to maintain that data."

[04:11] And this has actually been one of my favorite conversations here - the difference in privacy and ethics. I love the shifting to data teams later in the GDPR cycle because that's what we've seen. Initially, it wasn't the privacy people or the compliance people. But toward the end of the day, the CDO team says: “Can I still use my data?” Again, functional separation, a well established principle of EU Data Protection Law, might be the answer there. And how do you automate? How do you take some of the burden away from the business team? The reality is, again, the technical and organizational measures that can support functional separation can be an answer there.

[04:45] I haven't heard that quote from Drucker "Culture eats strategy" in a long, long time, but we all know it's true. Yet, again, if you have technical and organizational measures to help enforce some of your culture, strategy has a little hard time losing or winning. So, again, many, many different ways. And the last one is just ethics generally.

[05:06] So, again, we promised to do this throughout the day. The tension slide you saw earlier showed how different approaches to data governance might actually provide both governance capabilities and business opportunities, and reconciling these can be done through a well-established EU data protection principle: functional separation. So, with that, we'd like to thank everybody here. And if you're interested, IDC is actually going to be coming out with a whole new report on functional separation. But they have a preliminary report here that I have copies of, and there are copies down at the stand. Thank you all for your commitment today.


Are you facing any of these 4 problems with data?

You need a solution that removes the impediments to achieving speed to insight, lawfully & ethically

Roadblocks to Insight
Are you unable to get desired business outcomes from your data within critical time frames? 53% of CDOs cannot achieve their desired uses of data. Are you one of them?
Lack of Access
Do you have trouble getting access to the third-party data that you need to maximise the value of your data assets? Are third-parties and partners you work with worried about liability, or disruption of their operations?
Inability to Process
Are you unable to process data due to limitations imposed by internal or external parties? Do they have concerns about your ability to control data use, sharing or combining?
Unlawful Activity
Are you unable to defend the lawfulness of your current data processing activities, or data processing you have done in the past?
THE PROBLEM
Traditional privacy technologies focus on protecting data by putting it in "cages" or "containers," or by limiting use to centralised processing only. These limitations are imposed without considering the context of the desired data use, including decentralised data sharing and combining. These approaches are based on decades-old, limited-use perspectives on data protection that severely restrict the kinds of data uses that remain available after controls have been applied. On the other hand, many new data-use technologies focus on delivering desired business outcomes without considering that roadblocks may exist, such as those noted in the four problems above.
THE SOLUTION
Anonos technology allows data to be accessed and processed in line with desired business outcomes (including sharing and combining data) with full awareness of, and the ability to remove, potential roadblocks.