We need to stop the data economy before humanity pays the ultimate price

08 October 2020


by Carissa Véliz
Associate Professor in Philosophy

Carissa Véliz is an Associate Professor at the Faculty of Philosophy and the Institute for Ethics in AI, and a Tutorial Fellow at Hertford College, University of Oxford. Her research focuses on ethics and AI, moral and political philosophy more generally, and public policy. She is the author of Privacy Is Power (Transworld, 2020).

If you've watched Netflix's documentary The Social Dilemma, you'll know that it paints a terrifying – and accurate – picture of the damage that digital technology is causing to individuals and societies.

Screen addiction, increased rates of suicide and swayed elections are just a few of the horrors laid at Silicon Valley’s door. But despite its apocalyptic tone, what the documentary fails to emphasise enough is the engine driving this social destruction: the systematic violation of our right to privacy.

The examples above are worrying enough, but they're just the tip of the iceberg when it comes to the implications of privacy loss. Facial recognition – a technology so fit for abuse that it ought to be banned altogether – is advancing at speed. We only need to look back at IBM's punch-card machines, and how they enabled the Nazi authorities to count and categorise German citizens, to see how this kind of surveillance technology is ripe for deployment by an oppressive regime. If tech companies – and governments – want to be on the right side of history, they would do well to protect our privacy.

Even in the most capitalist of societies we agree that certain things are not for sale—people, votes, organs, the outcomes of sports matches. We should add personal data to that list. No one should be able to profit from exploiting sensitive information. That we are allowing companies to make money from the knowledge that someone has a disease, or has lost their child in a car accident, or even that they have been the victim of rape, is outrageous.

Part of the problem is that the line between surveillance for profit and surveillance for public protection is becoming increasingly blurred. In the UK, Uber's sharing of data on customers, drivers and journeys has won it support from the police in its ongoing licence battle. The National Police Chiefs' Council argues that this intelligence is a 'vital' tool in tackling crime. That may well be true, but citizens might be left wondering whether it is right for public institutions to encourage services that might be detrimental to society overall (consider the employment and safety problems Uber has had) just because those services provide a surveillance opportunity.


To get to grips with what the digital economy really means for privacy, we need to view personal data as a toxic asset. It poisons individuals by making them vulnerable to unfair discrimination, public shaming, identity theft, and more. It poisons institutions, because every data point is a possible leak, a possible lawsuit. It poisons societies, because it puts equality and democracy at risk. We are no longer treated as equals. We are each treated according to our data. We don’t see the same content, we don’t pay the same price for the same product, we are not offered the same opportunities.

Tech doesn’t need to trade in our personal data to work well—the data economy is just a business model. Good tech should work for citizens, not for advertisers or data brokers. It should respect our rights and our liberal democracies, and protect our privacy. This sort of digital ethos might seem like utopia from where we stand now, but it’s not. It’s perfectly achievable. We can outlaw destructive practices, and use technical solutions like encryption, differential privacy, and other promising methods to manage data while preserving privacy.
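The mention of technical solutions is not hand-waving. As a flavour of what managing data while preserving privacy can look like in practice, here is a minimal sketch of one such method, differential privacy's Laplace mechanism, in Python (the function names and figures are illustrative, not drawn from this article): an organisation publishes an aggregate statistic while mathematically limiting what the result reveals about any one person.

```python
import random

def laplace_noise(scale):
    # The difference of two independent exponential samples with mean
    # `scale` follows a Laplace(0, scale) distribution.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(true_count, epsilon, sensitivity=1.0):
    # Epsilon-differential privacy for a counting query: Laplace noise
    # with scale sensitivity/epsilon masks any single person's presence
    # or absence in the underlying data.
    return true_count + laplace_noise(sensitivity / epsilon)

# Hypothetical example: report roughly how many users share a sensitive
# attribute without revealing whether any particular individual does.
print(round(private_count(true_count=412, epsilon=0.5)))
```

A smaller epsilon means more noise and stronger privacy; the published aggregate stays useful while no individual record is exposed.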

We are currently at the very beginning of a civilising process similar to the one that made our offline life more liveable. Regulation made sure that the food being sold was edible, that customers could return faulty products, that cars had safety belts, and that prospective employers couldn’t legally ask whether you were planning to have children. The present historical moment is crucial if we want to tame the Wild West of the internet. The ground rules that we set now for personal data will determine the privacy landscape of the next few decades. It is critical that we get things right, and curbing the dark side of tech is going to require changing the business model of the data vultures who live off our online trails. Experts need to send a clear message to governments about what is needed: an end to the data economy – a complete ban on personal data trades.


We also need regulatory solutions to curb the use of personal data so that it can only be used in our interest, and not against us. Fiduciary duties exist to protect individuals in a position of weakness against professionals who are supposed to serve them but who might have conflicting interests. Just as doctors, lawyers, and financial advisers are bound by these duties, anyone holding our personal data should be obligated to use it in our interest only.

If we had banned data trades in time, and regulated data controllers and processors properly, we wouldn’t have to worry about Uber handing data over to the police, about TikTok being a threat to national security, about personalised propaganda swaying elections online, or about contact-tracing apps misusing our data. But it’s not too late to fix our privacy landscape. We need to take back control of our personal data, and with it, our ways of life and our democracies.

  • Carissa Véliz's book, Privacy Is Power: Why and How You Should Take Back Control of Your Data, is out now

This opinion piece reflects the views of the author, and does not necessarily reflect the position of the Oxford Martin School or the University of Oxford. Any errors or omissions are those of the author.