Privacy and Covid: the real questions to ask ourselves
Applications that track citizens’ movements to prevent the uncontrolled spread of the virus raise serious privacy concerns. Yet every day we voluntarily expose and give away our data without realizing it.
Tommaso Buganza, Professor of Leadership and Innovation
Daniel Trabucchi, Assistant Professor of Leadership and Innovation
School of Management Politecnico di Milano
We are living in an unprecedented situation. The global pandemic depicted in so many Hollywood movies is now a reality and – without the popcorn – it looks quite different.
In Italy – as will hopefully happen soon in other countries – the infection rate has finally started to decrease. The discussion is now moving to the management of “Phase 2”. What will “the new normal” look like? What will it be like to live in a world in which the virus is under control, but still present?
Digital experts are proposing possible “futuristic scenarios” in which mobile applications will track us to immediately inform potentially exposed people and interrupt the transmission chain (see, for example, the Pan-European Privacy-Preserving Proximity Tracing project). Recently – on April 17th – the Italian government signed a contract for the voluntary tracing app “Immuni”, which aims to carry out contact tracing via Bluetooth.
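To give a concrete sense of how such Bluetooth-based proximity tracing is designed to work, here is a minimal, purely illustrative sketch in Python. The identifier scheme and matching logic below are simplifying assumptions made for this example, not the actual Immuni or Apple–Google protocol: phones exchange rotating random identifiers, only the identifiers of users who test positive are published, and each phone checks its own contact log locally.

    # Simplified sketch of decentralized Bluetooth contact tracing.
    # The identifier scheme and matching below are illustrative assumptions,
    # not the actual Immuni / Apple-Google protocol.
    import secrets

    class Phone:
        def __init__(self):
            self.broadcast_ids = []   # random identifiers this phone has emitted
            self.heard_ids = set()    # identifiers received from nearby phones

        def new_broadcast_id(self) -> str:
            """Rotate to a fresh random identifier (no personal data involved)."""
            rid = secrets.token_hex(8)
            self.broadcast_ids.append(rid)
            return rid

        def hear(self, rid: str):
            """Record an identifier picked up over Bluetooth from a nearby phone."""
            self.heard_ids.add(rid)

        def check_exposure(self, published_ids: set) -> bool:
            """Compare the public list of infected identifiers with local contacts."""
            return bool(self.heard_ids & published_ids)

    # Two phones meet: each hears the other's current identifier
    alice, bob = Phone(), Phone()
    bob.hear(alice.new_broadcast_id())
    alice.hear(bob.new_broadcast_id())

    # Alice tests positive: only her random identifiers are published
    published = set(alice.broadcast_ids)

    # Bob's phone checks locally, without revealing who he met or where
    print("Bob potentially exposed:", bob.check_exposure(published))  # True

The point of this kind of design is that no central authority learns who met whom; each phone only learns whether it has been near an identifier later reported as infected.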
Meanwhile, Google is already sharing anonymized data to identify people who are breaking the lockdown (Giuffrida, 2020; Hamilton, 2020) and has partnered with Apple to develop a contact tracing technology (Apple, 2020). At the same time, the world is starting to look ahead, asking “how” – since “if” is no longer an option – this global emergency will change our lives in the years to come. In an article recently published in the Financial Times, Yuval Harari describes the possible downsides of using available technologies to track people’s movements and behaviors. On the one hand, this would help keep national healthcare systems sustainable; on the other, the cost could be a “new normal” in which our vital functions are constantly measured, stored and analyzed. It is easy to imagine how this huge database, along with the growing knowledge of human biometrics and the impressive advances in AI, might lead to an erosion of democracy and civil rights in our countries (Harari, 2020).
People seem frightened, not only by the pandemic but also by the loss of privacy (Brody and Nix, 2020). Are we slowly but inexorably losing our freedom?
But what do we already know about data? Data is the new oil, a precious resource to be exploited. Data is valuable because it allows us to understand things we otherwise couldn’t. It shows us things we don’t know, both as individuals and as a community. It shows us patterns that are there, but that are too complicated for individual human brains to see.
Think about Netflix, one of the services many of us have enjoyed most during these lockdown days. Choosing a new movie or series can feel like an epic adventure. But you might not know that Netflix has already made it much easier for you. You have probably noticed that the “match score” (the percentage indicating how likely you are to enjoy a title) is often very high. That is because Netflix tracks your behavior, the shows you have watched and how often you watch, and then, leveraging AI, proposes only movies and series you are likely to enjoy. If you are curious, try a little experiment: search the whole catalog and you will discover many more titles… and probably won’t like them!
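For illustration, a “match score” of this kind can be approximated in a few lines of code: build a taste profile from the titles a user has watched and compare it with each candidate title. The genre vectors and the cosine-similarity scoring below are our own assumptions for the sake of the example, not Netflix’s actual recommendation system.

    # Illustrative only: a toy "match score" based on viewing history.
    # The genre vectors and cosine-similarity scoring are assumptions for
    # the sake of the example, not Netflix's actual recommendation system.
    import numpy as np

    # Hypothetical catalog: each title described by genre weights
    # [comedy, drama, sci-fi, documentary]
    catalog = {
        "Space Saga":      np.array([0.1, 0.2, 0.9, 0.0]),
        "Courtroom Drama": np.array([0.0, 0.9, 0.0, 0.1]),
        "Nature Series":   np.array([0.0, 0.1, 0.0, 0.9]),
    }

    # What a (hypothetical) user has already watched
    watch_history = ["Space Saga", "Courtroom Drama"]

    # The user's taste profile is simply the average of what they watched
    profile = np.mean([catalog[title] for title in watch_history], axis=0)

    def match_score(title: str) -> int:
        """Cosine similarity between the user profile and a title, as a percentage."""
        v = catalog[title]
        similarity = v @ profile / (np.linalg.norm(v) * np.linalg.norm(profile))
        return round(100 * similarity)

    for title in catalog:
        print(f"{title}: {match_score(title)}% match")

The more you watch, the sharper the profile becomes: the service learns your tastes from the data you hand over with every click.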
This is just one example among the services many of us use every day… and the list is long. Spotify suggests songs, Amazon suggests products, and our fitness apps – Runkeeper, Runtastic, Freeletics… – optimize the next training session using the data we provided through previous ones.
This may sound like something unique to the app economy, but it’s not. Google built its empire on a data-driven business model, targeting ads on its search engine, using players to tag images (with the game Google Image Labeler) and CAPTCHAs to decipher addresses (Perez, 2012) or Street View images… possibly to help train self-driving cars (Kid, 2019). Even Starbucks – a brick and mortar company – uses the data from its mobile app to gain insights into its customers’ habits and tastes (Gallea-Pace, 2020).
We all know these stories, but now, all of a sudden, we find ourselves more fearful and privacy-sensitive. We are afraid of the impact that technology may have on our lives, but we fail (again) to recognize that it has already happened.
Major digital companies know us intimately; they know far more about us than we imagine. And some of them, over the years, have become extremely good at capturing value and profiting from data (Trabucchi et al., 2017, 2018).
They “pay” us back with more personalized services or, in some cases, with free services…which we like even more.
Companies – of course – must respect all privacy laws, and GDPR in Europe has played a huge role in this. However, companies can do a lot with the data we provide them because we accept the terms of use… usually without reading them.
Curiously, this isn’t even the first time the privacy issue has exploded so violently. Two years ago, social media around the world was filled with the #LeaveFacebook movement.
The Cambridge Analytica scandal highlighted Facebook’s data-based business model and its implications for our privacy, and even the impact that data can have on our lives through micro-targeting and similar phenomena (Cadwalladr, 2019). In those days, it seemed as if the world had realized what had been there for years: data is valuable, and companies use it. Watching Mark Zuckerberg in a suit and tie in front of the US Congress, many of us thought that Facebook would become as empty as our cities are today. The reality is that it didn’t happen. After the clamor, we went back to our habits… we enjoy our free digital services too much to bother considering that we actually pay for them with our data.
And here we are again. In this unique historical moment, we think a lot about what the “new normal” will look like. If we really care about our privacy, we should ask ourselves: will our near future be different only in terms of social relationships and physical movement, or will it also call into question our well-established digital lives?
We can still get our privacy back… if we want.
But are we ready to give up the wonderful services provided by Netflix, Waze, Amazon, Spotify, Instagram, Facebook, TikTok, Twitter, Snapchat, and all the others?
So, this is our point: is the sudden upheaval around privacy, triggered by the use of personal data to protect public health, really justified? Perhaps we should rather accept that the “new normal” poses even more frightening questions.
Do we value free services more than public health?
Do we trust private companies more than our governments?