May 4, 2020 | Magali Feys

Data Use vs Privacy: A False Trade-Off?

The concerns surfacing ahead of proposed COVID-19 tracing apps show that privacy isn’t dead: far from it. Fears of government overreach and corporate tracking (if left unaddressed) could doom the apps to failure: for the apps to be useful, experts say they need at least 60% of the population to adopt them.

Adoption rates will vary among countries depending on whether the use of tracing apps is obligatory or voluntary. In the US and EU, a greater focus on civil liberties means that people are much less likely to download an application that has any perceived risk of surveillance, regardless of whether that risk comes now or later. Trust issues will be “make or break” for COVID-19 tracing apps, and this has led to a serious discussion of just how much privacy we are willing to trade away in order to protect ourselves.

While the underlying need to make this difficult choice appears valid, it is actually a false trade-off. Yuval Noah Harari explains that “when people are given a choice between privacy and health, they will usually choose health.” It is a binary question that nobody wants to answer. Fortunately, technical controls that can enforce the legal and ethical rights underlying privacy are now available, allowing the choice to be reframed from an “either/or” answer into “both”.

Privacy and Trust Issues

Interest in COVID-19 tracing apps began when governments realised that a vaccine would not be available quickly, and that ongoing lockdowns would harm the economy. Current proposals take either a “centralised” or a “decentralised” approach: the latter is intended to be more protective of privacy, while proponents of the former argue that it provides “more insight into COVID-19’s spread.” These two approaches have created a massive debate over privacy, data use, and trust.

The issue is that governments face a real and urgent problem: they need to roll out COVID-19 monitoring apps to manage the spread of disease with an exit strategy in mind. But without trust, people won’t use the apps, and without widespread adoption, the apps are useless. Moving past this impasse requires recognising that everyone is framing the problem as a false, binary choice that forces a trade-off between privacy and data use.

A New Hope: Embedded Technical Controls

As technology and law have developed, newer approaches have emerged that mitigate privacy risks in a way that still enables data use. These approaches, such as GDPR-compliant Pseudonymisation and data protection by design and by default, provide superior privacy protection without degrading the accuracy of data. The technical controls are embedded into the data and travel with it, protecting it while in use. By combining such dynamic controls with a functional separation approach to processing, it becomes possible to process information about people without knowing who those people are, delivering both data utility and privacy protection at the same time.
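To make the idea of functional separation concrete, here is a minimal, hypothetical sketch of purpose-specific pseudonymisation. It is not Anonos’s actual implementation; it simply assumes a keyed hash (HMAC) whose secret key is held by a separate trusted party, so that analysts receive only purpose-bound tokens and cannot reverse them to identities.

```python
import hmac
import hashlib

# Hypothetical key: in a functional-separation setup this would be held
# by a separate controller, never by the analysts who receive the data.
SECRET_KEY = b"held-by-a-separate-trusted-party"

def pseudonymise(identifier: str, purpose: str) -> str:
    """Derive a purpose-specific pseudonym: the same person receives
    different, unlinkable tokens for different processing purposes."""
    msg = f"{purpose}:{identifier}".encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()[:16]

records = [
    {"id": "alice@example.com", "tested_positive": True},
    {"id": "bob@example.com", "tested_positive": False},
]

# Analysts receive only pseudonymised records: still usable for analysis
# (the same token recurs for the same person within one purpose), but
# identities are not recoverable without the separately held key.
shared = [
    {"token": pseudonymise(r["id"], "contact-tracing"),
     "tested_positive": r["tested_positive"]}
    for r in records
]
```

Because the pseudonym depends on both the identifier and the purpose, data sets prepared for different purposes cannot be trivially linked back together, which is the dynamic property that static, one-time pseudonyms lack.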

This risk-based approach provides numerous benefits to organisations, governments, businesses, and society. It means that governments can roll out COVID-19 tracing apps that ensure privacy without compromising the potential value of these tools.

Next Steps?

When moving away from traditional models of privacy and data use, it is crucial to remember that just because not every regulator is aware of the new technical controls, it does not mean they do not exist. When a binary choice between data utility and privacy dominates the debate, new solutions can be overlooked simply because they sit outside the traditional approaches to the issue.

All big crises provide big opportunities for significant positive change, and the next steps we need to take include the adoption of an integrated framework of data use alongside privacy protection. This kind of data use can ultimately provide more benefits to society, and we need to be ready to reap those rewards.

This article originally appeared in ID BULLETIN. All trademarks are the property of their respective owners. All rights reserved by the respective owners.

 

Are you facing any of these 4 problems with data?

You need a solution that removes the impediments to achieving speed to insight, lawfully and ethically.

Roadblocks to Insight
Are you unable to get desired business outcomes from your data within critical time frames? 53% of CDOs cannot achieve their desired uses of data. Are you one of them?

Lack of Access
Do you have trouble getting access to the third-party data that you need to maximise the value of your data assets? Are the third parties and partners you work with worried about liability, or disruption of their operations?

Inability to Process
Are you unable to process data due to limitations imposed by internal or external parties? Do they have concerns about your ability to control how data is used, shared, or combined?

Unlawful Activity
Are you unable to defend the lawfulness of your current data processing activities, or of data processing you have done in the past?
THE PROBLEM
Traditional privacy technologies focus on protecting data by putting it in “cages” or “containers,” or by limiting use to centralised processing only. These restrictions are imposed without considering the context of the desired data use, including decentralised data sharing and combining. Such approaches are based on decades-old, limited-use perspectives on data protection that severely reduce the kinds of data uses that remain available after controls have been applied. Conversely, many new data-use technologies focus on delivering desired business outcomes without considering that roadblocks may exist, such as the four problems noted above.
THE SOLUTION
Anonos technology allows data to be accessed and processed in line with desired business outcomes (including sharing and combining data) with full awareness of, and the ability to remove, potential roadblocks.