Gary LaFever | March 13, 2019

Avoid the Trust Deficit with AI


Focusing on data breaches, subject access requests (SARs), and consent is distracting and confusing when it comes to AI. What the industry needs is clarity on what is necessary to lawfully POSSESS and USE personal data for AI. This must be clarified NOW, before AI design implementations become widespread and significant investments are made. Once those investments are made, companies and countries will resist change because – from an economic return-on-investment perspective alone – it will be more cost-effective to engage in regulatory arbitrage than to retool AI technology.

TRUSTWORTHY AI FAQs:

Q1: How Can Personal Data Be Unlawful to POSSESS Under the GDPR?

A1: Data Controllers Must Take Affirmative Action to Transform Data Collected in Noncompliance with the GDPR for it to be Lawful to Possess.

Q2: What Is Necessary for Lawful USE of Personal Data In AI?

A2: An Alternate (Non-Consent) Legal Basis with Technical and Organizational Safeguards Is Required to Protect the Rights of Data Subjects.

CLARITY ON DATA AND USE

It’s great to see the ICO taking action to address the “trust deficit” with AI, including working with the Alan Turing Institute and AI expert Reuben Binns. [See https://iapp.org/news/a/icos-mcdougall-were-losing-the-battle-for-trust-but-theres-a-solution]. While attention is rightfully paid to the enormous potential benefits of AI, this IAPP article highlights the need for AI to be “trustworthy,” “ethical,” and “non-discriminatory.” However, whether a car is driven in a “trustworthy,” “ethical,” and “non-discriminatory” manner is irrelevant if the people in the car don’t even have the legal right to use the vehicle. Data controllers must take action to ensure that both their Possession and Use of personal data for AI comply with the requirements of the GDPR as well as other evolving data protection regulations. If this message is not widely publicized before AI processing is put into widespread practice, data subjects will suffer: billions will be invested by countries and commercial enterprises in AI built on unlawfully held personal data, and the rights of data subjects will continue to be sacrificed, with no reversal, as organizations (i) elect to fight in court rather than change the AI processes in which they have invested, and (ii) decide that the most cost-effective course of action is “regulatory arbitrage,” given the low risk and cost of enforcement action.

POSSESSION – The GDPR changed the nature of personal data, turning it into highly regulated data. The last paragraph on page 31 of the WP29 April 2018 Guidance on Consent [See https://www.anonos.com/hubfs/20180416_Article29WPGuidelinesonConsent_publishpdf.pdf] makes clear that data controllers must take one of four specifically enumerated actions for data collected in noncompliance with the GDPR to be lawful to possess. Doing nothing means the data is unlawful to possess, since the GDPR has no “grandfather” or savings clause:

  • Re-consent the data in compliance with requirements for specificity, unambiguity and voluntariness, together with a separate non-consent legal basis for data subjects who do not re-consent, so that consent is truly voluntary.

  • Anonymize the data (in the true sense of the word) so that it cannot be used to infer, single out or link to data subjects.

  • Delete the data (in the case of financial services firms and other regulated industries this could mean keeping a copy of data solely to comply with reporting obligations but not providing access to the data for any other processing).

  • Transform the data to a non-consent legal basis while ensuring that continued processing is fair and accounted for.
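The anonymisation option above sets a high bar: released data must not allow a data subject to be singled out, linked, or inferred. A common first check for the "singling out" risk is k-anonymity over quasi-identifiers. The sketch below is a minimal, hypothetical illustration (the field names and threshold are assumptions, and passing a k-anonymity check alone does not establish GDPR anonymisation, which also covers linkage and inference):

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Return True if every combination of quasi-identifier values
    appears in at least k records, i.e. no individual can be
    singled out below the chosen threshold."""
    combos = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in combos.values())

# Illustrative records: age band and postcode are the quasi-identifiers.
records = [
    {"age_band": "30-39", "postcode": "EC1", "diagnosis": "A"},
    {"age_band": "30-39", "postcode": "EC1", "diagnosis": "B"},
    {"age_band": "40-49", "postcode": "EC1", "diagnosis": "A"},
]

# The 40-49/EC1 group contains only one record, so this release
# fails a k=2 check: that individual could be singled out.
print(is_k_anonymous(records, ["age_band", "postcode"], k=2))  # False
```

In practice, failing records would be generalised (wider age bands, truncated postcodes) or suppressed until the check passes, and linkage and inference risks would still need separate assessment.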

PROCESSING – Certain data processing activities – like AI – cannot be described at the time of data collection with sufficient detail to support GDPR requirements for consent to serve as a valid legal basis. Legitimate interest, as an alternative legal basis, requires technical and organizational safeguards that truly mitigate the risk to data subjects, so that the legitimate interest of the data controller survives a balancing test against the interests of the parties. These safeguards must protect against unauthorized re-identification of data subjects via both direct identifiers and indirect identifiers that can be linked together to re-identify a data subject – the “Mosaic Effect.” Traditional technologies developed before the increasing volume, velocity and variety of data – like encryption and static tokenisation – are incapable of combating the Mosaic Effect. The IAPP first published my article on this issue in 2014 – What Anonymization and the TSA Have in Common [See https://iapp.org/news/a/what-anonymization-and-the-tsa-have-in-common/].
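The Mosaic Effect arises because a static token stays the same everywhere it appears: two separately released datasets can be joined on the token and their indirect identifiers combined to re-identify someone. Dynamic pseudonymisation breaks that linkage by issuing a fresh token per data use, with the token-to-identity mapping held separately under the controller's exclusive custody. The classes and method names below are purely illustrative, not any vendor's implementation:

```python
import secrets

class StaticTokeniser:
    """Static tokenisation: one persistent token per identity.
    The same token recurs in every dataset, enabling joins."""
    def __init__(self):
        self._tokens = {}

    def tokenise(self, identifier):
        if identifier not in self._tokens:
            self._tokens[identifier] = secrets.token_hex(8)
        return self._tokens[identifier]

class DynamicPseudonymiser:
    """Dynamic pseudonymisation: a fresh token per use/context.
    The lookup table is retained only by the controller."""
    def __init__(self):
        self._lookup = {}  # token -> (identifier, context), kept under controller custody

    def pseudonymise(self, identifier, context):
        token = secrets.token_hex(8)
        self._lookup[token] = (identifier, context)
        return token

static = StaticTokeniser()
t1 = static.tokenise("alice@example.com")  # released in dataset 1
t2 = static.tokenise("alice@example.com")  # released in dataset 2
assert t1 == t2  # identical tokens let the two datasets be joined: the Mosaic Effect

dynamic = DynamicPseudonymiser()
p1 = dynamic.pseudonymise("alice@example.com", "marketing-analysis")
p2 = dynamic.pseudonymise("alice@example.com", "ai-training")
assert p1 != p2  # unlinkable across releases without the protected lookup table
```

The design choice being illustrated is simply where linkability lives: with static tokens it is baked into every release, while with dynamic tokens it exists only in a mapping the controller can govern, audit, and revoke.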

Anonos technology is the only technology that has been certified by EuroPrivacy as satisfying GDPR requirements for Pseudonymisation. [See https://www.prnewswire.com/news-releases/anonos-saveyourdata-software-officially-certified-by-europrivacy-meets-the-requirements-of-the-eu-general-data-protection-regulation-gdpr-300741945.html]

Are you facing any of these 4 problems with data?

You need a solution that removes the impediments to achieving speed to insight, lawfully and ethically.

  • Roadblocks to Insight – Are you unable to get desired business outcomes from your data within critical time frames? 53% of CDOs cannot achieve their desired uses of data. Are you one of them?

  • Lack of Access – Do you have trouble getting access to the third-party data that you need to maximise the value of your data assets? Are third parties and partners you work with worried about liability, or disruption of their operations?

  • Inability to Process – Are you unable to process data due to limitations imposed by internal or external parties? Do they have concerns about your ability to control data use, sharing or combining?

  • Unlawful Activity – Are you unable to defend the lawfulness of your current data processing activities, or data processing you have done in the past?
THE PROBLEM
Traditional privacy technologies focus on protecting data by putting it in “cages” or “containers,” or by limiting use to centralised processing only. These restrictions are imposed without considering the context of the desired data use, including decentralised data sharing and combining. Such approaches are based on decades-old, limited-use perspectives on data protection that severely limit the kinds of data uses that remain available after controls have been applied. On the other hand, many newer data-use technologies focus on delivering desired business outcomes without considering that roadblocks may exist, such as the four problems noted above.
THE SOLUTION
Anonos technology allows data to be accessed and processed in line with desired business outcomes (including sharing and combining data) with full awareness of, and the ability to remove, potential roadblocks.