And there's an interesting point in the ENISA report. It says, this is on Page 6, “There is a need for automated policy definition and enforcement, in a way that one party cannot refuse to honor the policy of another party in the chain of big data analytics.” And then, later on the same page, it says, “Thereby putting automated enforcement of privacy requirements and preferences into the data.” So it's almost as if the ENISA report anticipated, or hoped, that principles and capabilities would arise that would enable this to happen. Because otherwise, if I give my data to a co-controller or a processor and I'm relying on policies and contracts for them to process it correctly, I expose myself to liability, and I limit my data-sharing opportunities to those parties whose contracts I feel comfortable with. Whereas, if I can embed those controls, I have greater comfort that they will be abided by, because they're technologically enforced, which may actually increase the opportunities for data sharing and combining. I'm just curious if you have any perspectives on that, or on anything else in the ENISA report that you found relevant to Fair Trade Data concepts.

Günther:
I think it is quite interesting from a legal perspective when talking about liabilities. We have had some complex scenarios where much depends on the contract, the controller, and the processors. Needless to say, it also depends on which data is processed and which damage has occurred. But on a more generalized basis, the GDPR acknowledges joint controllership in Article 26 and, coming along with that, joint responsibility. Joint responsibility between two or several controllers means they have to find a common understanding about their responsibilities, because if a third party raises claims against one of those controllers, that controller can be held liable for the whole group of controllers and has to satisfy the claim, if justified, and then might call in the other controllers. But again here, I would call it technical language. The more the parties in this joint approach speak the same technical language, and the more the data itself becomes intelligent enough to satisfy GDPR requirements, the less potential you have for disputes about misinterpreted policies, outdated policies, or contracts that did not sufficiently cover all aspects of the joint controllers' data processing. So I would say the battleground for disputes between controllers might be diminished, the more we move toward data that is technically shaped under the GDPR. Again, a very innovative approach, but something that is certainly important under this giant layer of joint liability under Article 26 of the GDPR.

Gary:
Fantastic. Well, in closing, what I'd like to do is ask your perspective on the GDPR's intent to actually help facilitate innovation, because I think a lot of people just think of it as an attack on innovation rather than as a means to facilitate it, and then, to the extent you see it that way, how Fair Trade Data principles might help facilitate that.

Günther:
Taking an overall view, one of the key difficulties we see is role allocation among the controllers. In real-world data processing, it is very often quite burdensome to do proper role allocation under the GDPR. I would expect it to get more and more complicated as artificial intelligence advances, because then the systems increasingly take control by themselves. Who has to take responsibility for data processing techniques that evolve on their own? To avoid ending up in years-long disputes, the more advanced approach would surely be for the system itself to take care of GDPR compliance, so you don't go down the path of having to discuss who has to take responsibility, who is the owner of the data, who is the owner of the algorithm. So I would expect that the more you can put it on this avenue, the less you have to argue over responsibilities in this giant role allocation discussion.

Gary:
Very helpful. And one thing I've heard which is interesting is the concept of friction. The idea of Fair Trade Data may at first seem to introduce new friction, because you're inserting new processes and pre-processing on the data that you didn't have before. But I would actually say that this upfront friction creates frictionless data use and sharing downstream, because of what you've built into the data. And so I think it's really a mind shift. This Fair Trade Data concept might appear to require new elements in processing and controls that didn't exist before, but those requirements actually come from the law itself. If you incorporate them, though, I really think what we're saying is that it may not only keep you out of trouble, but it may actually increase your partnership opportunities, both in taking in data from third parties and in sharing data in a privacy-respectful way, as well as with your customers. You can give your customers a greater sense of confidence and trust that you're conducting your business in a way that respects their fundamental rights. So, fantastic conversation, Günther. Anything in closing that you'd like to share with the audience?

Günther:
I think there's nothing more to add to what you’ve said.