As discussed in Part I of our article on compliance issues with the GDPR, many organisations are driving just above the speed limit in terms of data processing, hoping that because everyone else is doing it, they won’t get caught. This practice is due to the magnitude of organisational change required with the introduction of the GDPR: very quickly, companies had to consider how their data collection and use was being performed, and didn’t know where to start in terms of fixing their mistakes.
Companies need to process large amounts of data to take competitive market positions. In some industries it is not simply a matter of competition either: data processing (particularly data repurposing) is now a requirement for survival. One notorious example is the AdTech industry, which we will now cover in Part II. We will examine the problems with data privacy in AdTech and Real Time Bidding (RTB), and the solution proposed by Anonos to stop, or at least slow down, speeding cars.
The core challenge that the AdTech industry is facing is: How do we unlock data potential to connect with customers, develop advertisements, and grow our brands if we cannot collect, share and repurpose the data that will fuel it?
THE ADTECH SITUATION
Advertisers need to reach their customers in the moments that matter, in the environments where those customers spend their time, and at a cost per acquisition that enables them to grow. For now, and for the foreseeable future, that operating environment is mostly online. Technology advancements have compromised traditional publisher revenue streams and forced publishers to maximise digital inventory in order to evolve and innovate. The technology providers who sit inside this world exist to bridge the gap between these two parties and the consumer. So what exactly is the problem? While I am an optimist and do not believe any company was created simply to abuse this reality, abuse appears to be happening nonetheless.
The problem is both a compliance one and an ethical one. Recent issues arose when the Information Commissioner’s Office (ICO) in the UK released a report on the privacy problems inherent in the AdTech industry, particularly when it comes to RTB. Some of the problems that the ICO noted were that:
- “RTB carries a number of risks that originate in the nature of the ecosystem and how personal data is processed within it”; and
- Between industry participants it has “become apparent … that there are substantially different levels of engagement and understanding of how data protection law applies, and the issues that arise.”
A core psychological concept that applies in this situation is the theory of neutralisation, popularised by Sykes and Matza in the 1950s. This theory explores how people justify their illegitimate acts by denying their responsibility, the existence of a victim, or any injury caused, or by suggesting that their acts serve a greater good.
In AdTech, many people had seen the issues with RTB for a long time, in particular the well-known groups that fight to protect the privacy rights of individuals around the world. However, because numerous companies were engaged in questionable data processing practices, and because many organisations have a poor understanding of what the regulations even require, it took a series of complaints before the ICO approached the issue. That series began in Ireland with complaints from Brave, the Open Rights Group, and University College London, followed by further complaints in Poland.
Even though the problems were clear and well known, there was little industry appetite to take the first step towards fixing them, and many companies were “hiding in plain sight”, simply doing things (incorrectly) like everyone else. Psychological deindividuation in the AdTech industry is a significant issue, perpetuated by the industry’s multi-billion-dollar price tag. Most companies know their practices involve data privacy complexities, but they continue with them because the speed cameras are only just being switched on. Now that the ICO has flagged these problems and the speed limits are becoming more visible, AdTech companies are on high alert to fix things.
But the problem remains: even if companies decide to put more effort into compliance, how do they comply? In many cases they are not aware of, or have not found, a technology or approach to compliance that is actually effective under the GDPR. Moreover, the regulations themselves remain open to subjective interpretation.
EXAMINING SPEED LIMIT WARNINGS
AdTech is fuelled by combining, sharing and analysing very large datasets. The idea is that if someone consents to sharing their data with a particular party, the value exchange is a “free”, personalised and relevant experience. A large number of Big Data analytics and machine learning processes behind the scenes enable this. As we have learned, a wide range of legal issues make compliance difficult in this ecosystem, including problems with meeting some of the legal grounds for processing under the GDPR. For example, processing under the consent and contract grounds is suitable only in some circumstances.
With AdTech, for example, some of the complex big data processes involved cannot be described with enough specificity to meet the GDPR requirements for valid consent. As recently highlighted by Josip Bilaver, State Secretary at the Croatian Ministry of the Sea, Transport and Infrastructure, when speaking on the actions of the Croatian Presidency: “finding a formula to solve it is not easy. I would even say it’s an alchemy of sorts” (quoted from the IAPP newsletter on 22 January).
However, there is a potential solution: a separate legal ground for processing called “Legitimate Interests”, which has its “own natural field of relevance”, as noted by the Article 29 Working Party. Companies that are struggling to comply because of complex technologies and the repurposing of data should examine this ground. Processing under Legitimate Interests is not a silver bullet that fixes every processing problem, nor is it intended to be used to get around issues with consent.
Instead, the “Legitimate Interests” processing ground requires that three tests be met before it can be relied on: (1) the data controller must have a legitimate interest in the desired use, (2) the data must be necessary to achieve the desired use, and (3) a balancing test must be applied to assess the impact of the use on the interests and fundamental rights and freedoms of data subjects.
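The key point is that the three tests are cumulative: failing any one of them rules the ground out. As a minimal illustration (the class and field names below are hypothetical, purely to show the cumulative structure, and are in no way a substitute for a proper legal assessment):

```python
from dataclasses import dataclass

@dataclass
class LegitimateInterestsAssessment:
    """Hypothetical checklist mirroring the three cumulative tests."""
    purpose_is_legitimate: bool    # Test 1: controller has a legitimate interest
    processing_is_necessary: bool  # Test 2: the data is necessary for that use
    balancing_test_passed: bool    # Test 3: data subjects' rights not overridden

    def may_rely_on_ground(self) -> bool:
        # All three tests must pass; failing any one rules the ground out.
        return (self.purpose_is_legitimate
                and self.processing_is_necessary
                and self.balancing_test_passed)

# An intrusive RTB profiling use might pass tests 1 and 2 yet fail the
# balancing test, so the ground cannot be relied on:
assessment = LegitimateInterestsAssessment(
    purpose_is_legitimate=True,
    processing_is_necessary=True,
    balancing_test_passed=False,
)
print(assessment.may_rely_on_ground())  # prints False
```

The point of the sketch is simply that the balancing test (test 3) cannot be skipped even when the first two tests are comfortably satisfied.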
In many situations it is difficult to meet the GDPR’s tests for Legitimate Interests to be a valid processing ground. Nonetheless, the Article 29 Working Party notes that Legitimate Interests processing “can be relied upon in a wide range of situations, as long as its requirements, including the balancing test, are satisfied.” It also specifically outlines a number of scenarios where the balancing test is applied to direct marketing, indicating that this processing ground (and the accompanying balancing test) can potentially be applied in the advertising industry.
ANONOS BIGPRIVACY SOLUTION
Whether or not the balancing test is met will be a matter of fact and degree, and will depend on the level of technical and organisational controls applied to protect the data subject’s privacy. Pseudonymisation (as defined in the GDPR) is specifically noted as a valid technical and organisational control that can support Legitimate Interests processing. This can go a long way towards helping companies comply with the GDPR and gain the data insights they are hoping for. Importantly, the Anonos BigPrivacy microsegmentation solution implements GDPR-compliant Pseudonymisation and other technical controls to help you stick to the newly arriving data privacy speed limits. Even better, it does so without taking away any of the benefits of data utility and processing.
It’s important to remember that in most cases, the only reason organisations are “driving above the speed limit” with data use is that they want to maximise data utility for market competition, product development, customer outreach, and so on. However, you don’t need to act unlawfully to do so: data utility does not have to come at the expense of data privacy. You can process within the rules and achieve maximum data utility while protecting data subjects and their fundamental rights: driving at the speed limit means getting where you want to go without harming any pedestrians along the way.
While some companies may continue to assume that non-compliance isn’t that big of a deal (because nobody else is complying either), the AdTech industry exposure and report from the ICO shows that sooner or later, the speed cameras will be turned on for each and every industry. Industries now need clear guidance on what the speed limit is, as well as clear road markings that help companies see how to drive. Even though everyone is speeding at the moment, very shortly, the risks of non-compliance will become a lot bigger.
This article originally appeared on LinkedIn. All trademarks are the property of their respective owners. All rights reserved by the respective owners.