A GLOBAL HEALTH CRISIS
Pandemic shows current data laws are simply not fit for purpose
An interview with Lawrence Lessig, Member of AXA Research Fund’s Scientific Board
The COVID-19 pandemic is raising important questions about data privacy. In your view, do we have the right regulations in place to deal with these questions?
No, because our current framework tries to lock down data. Our objective shouldn’t be to lock down data; it should be to regulate improper use. Years ago, we had a similar issue with copyright. At the time, we said to copyright owners, "give up control over copies, and let’s talk about how to regulate use." A commercial use ought to be regulated; a non-commercial use shouldn’t be. The fact is, we can’t control access to data. It’s just not controllable. What we can control is its use.

COVID provides a perfect context for thinking about this. The one thing we need to combat this pandemic effectively is easy access to contact tracing data. Once we know somebody has the disease, we need to identify who that person has been in contact with, so we can get them tested as well. In a pandemic, that is a perfectly legitimate use of data. But if you want to use that same data for another purpose, for example to assess my health so you can increase my insurance costs, that wouldn’t be a permissible use. In that case, you would be using public data that affects me personally in a way I would have no reason to agree to, and for which there is no “public good” justification.

If we embrace the idea that data has a value for society, then we could be using data to help us manage this crisis more effectively. As it is, we’re confronted by an infrastructure of privacy that puts up hurdle after hurdle and prevents us from gathering the data we need. The pandemic is a great opportunity to radically rethink what privacy is about. It’s not hard to imagine how we might build a new infrastructure in which we could be confident about how companies, especially, were using data. Such a regime would be about auditing data use and the business models based on data. It would also be about having the right penalties in place. If the cost of violating a rule is high enough, companies won’t violate the rule.
“Our current framework tries to lock down data. Our objective shouldn’t be to lock down data. It should be to regulate improper use.”
Are our regulators strong enough to enforce this kind of system?
No, not now. And in fact, we may not have the capacity in government to regulate the use of data properly, particularly with respect to big tech companies. On one hand, we need a better way of controlling privacy. On the other, we don’t have the democratic capacity, especially in the US, to bring that regulation about. We tend to be extreme in our punishment of individuals, but lax in our punishment of corporations; it should be the other way around. We know that corporations will change their behavior if the fines are severe enough. I’m not saying it’s going to be easy to get to the right rules. I’m saying that the right rules would have the right effect, and that, in turn, would protect privacy much more effectively than the current system does.
Wouldn’t this approach restrict individual choice?
We have to be realistic about “choice.” Nobody knows how they want their data to be used right now. And nobody is really exercising autonomy at this moment, because nobody actually goes through the process of reviewing, on screen, the terms and conditions of the agreements they are consenting to. We live a lie, constantly affirming that we’ve read documents we’ve never read, and we pretend this underpins the entire infrastructure of privacy we have right now. We’ve got to be honest: nobody is doing it. We need to take a step back and say that 95% of these decisions should be made as a matter of policy, leaving only a small fraction subject to individual autonomy. We’ve got to move these choices out of the realm of the individual, because, frankly, the individual isn’t making these choices, and generally doesn’t have the capacity (in time, not intelligence) to make them.
“There’s an economy to data, which is devastating for competition because it’s a winner-takes-all market.”
Is there an opportunity now, with COVID-19, to change how we manage data – because people recognize how socially useful it can be?
In countries where you have effective contact tracing, that may be the case. There will be more trust. People will say, “governments faced a problem, and dealt with it efficiently and effectively. There’s a clear argument for tracing, and for enabling the government to use data about me.” In the US, that hasn’t been our experience. We’ve had failure after failure by the federal government. So the consequence is that people look at the government and just say, “see, government can’t do anything. Government is the problem.”
What about business – are companies changing the way they use data?
The last 10-15 years have taught us that wealth in the 21st century comes not from control over natural resources, but from control over behavioral data. There’s an economy to data, which is devastating for competition because it’s a winner-takes-all market: you need a lot of data to make it valuable. So companies will look for ways to extract data from their consumers, and will resist regulation, because every regulation is a lost revenue opportunity. It’s like the Wild West: people are doing everything they can to maximize wealth now, waiting for the marshal to turn up and bring order to society. Everyone knows that sooner or later the marshal will turn up. The question is how much can be extracted before that happens.
Lawrence Lessig is the Roy L. Furman Professor of Law and Leadership at Harvard Law School in the US. He was the director of the Edmond J. Safra Center for Ethics at Harvard. Previously, he founded the Center for Internet and Society at Stanford Law School. Larry is also a member of the American Academy of Arts and Sciences, and the American Philosophical Association. He has written widely on issues of politics, governance and digital technologies. He was named one of Scientific American’s top 50 visionaries.