By Thea Anderson, Director, Beneficial Technology, and Elizabeth M. Renieris, Fellow, Harvard
Part one of our exploration into the pathways and pitfalls for good global data protection in an evolving landscape.
Technology has become a lifeline to friends and families, a source of livelihood, and a vital resource for information and organizing in the era of social distancing and stay-at-home orders. For example, Facebook has seen a 50 percent increase in messaging and video calls across its various platforms, TikTok downloads are up by 12 percent, and in the first quarter of 2020, Twitter saw 23 percent more users viewing ads on its platform compared to 2019.
Alongside our increasing reliance on it, technology continues to evolve faster than the laws and norms that govern it. The rapid pace of technological advances means that existing and new regulations can quickly become obsolete. In general, the digital world remains largely unregulated and prone to amplifying power imbalances, especially when it comes to protecting the mass quantities of personal data being actively collected, stored, and shared, and even passively processed by smart devices and the Internet of Things (IoT). This gap between technology and governance is particularly concerning and problematic in light of some of the measures and technological interventions being deployed to combat and contain the spread of the virus.
To highlight a few recent examples: Israel rushed through an executive order allowing the police to track the cellphones of individuals who tested positive for, or are suspected of having, the virus, mapping the spread of the infection and enabling the prediction of outbreak clusters. In a similar move, South Africa asked mobile phone companies to track people’s movements. Singapore’s Ministry of Health made information about patients public and turned the information into an interactive map. In Argentina, the free app Testeate enables direct information exchange between the Ministry of Health and the public to track individuals in quarantine. While personal data can be critical for government responses and for informing the public, health and location data can just as easily be used in ways that expose people to new privacy and safety risks, as documented by Privacy International.
Given these and pre-existing concerns, Omidyar Network will explore the pitfalls and pathways for good global data protection in an evolving landscape through a forthcoming blog series. We will explore how data protections apply (or don’t) to different technologies in practice, the implications for power dynamics in various political and geographical contexts, and what it could mean 10 to 20 years in the future. We’ve analyzed how well various legal instruments, including laws and regulations, protect fundamental human rights in their substance, with the hope that this analysis can become a guide to avoiding the pitfalls and finding realistic and effective pathways to good data protection – both in this pandemic and going forward.
Initial Observations on Data Protection in an Evolving Landscape
In our review, which started in 2019 and became especially salient in this health emergency, we selected a mix of legal instruments from countries large and small (with a focus on emerging markets) at different points in the evolution of their data protection laws. Some, such as Argentina, the first Latin American country to adopt a data protection law in 2000, have a decades-long tradition; others, such as Kenya and Uganda, are just beginning to build the relevant legal foundations. We also assessed these laws in context to determine how strong they are “on paper” vs. in practice, which we already see tested in real time with the spread of the coronavirus and the expanded tracking of sensitive health data.
How will different countries respond in the short and long-term? What legal instruments will be most effective in such rapidly changing circumstances?
To date, between 120 and 150 countries have enacted data protection or privacy laws, with the exact figure depending on how such laws are classified. We found many are not fit for purpose to address the surge in use, nor the emerging technologies we’re seeing today. These gaps could increase discrimination, automate inequality, and further exacerbate power and information imbalances between authorities or corporations and the public, as well as pose serious threats to individuals’ wellbeing with no meaningful recourse. Beyond privacy invasions like surveillance of our web traffic and location histories, we now risk a loss of anonymity online and off with the ubiquity of facial recognition technologies, digital identity schemes, and mounting ambient data collection, often with disproportionate impact on vulnerable populations.
Just after the World Health Organization declared COVID-19 a pandemic, forcing many to move their personal and professional lives online, the Global Privacy Assembly, which represents 130 data protection and privacy authorities, pledged that “data protection authorities stand ready to help facilitate swift and safe data sharing to fight COVID-19,” while the UN Special Rapporteur called on governments not to abuse emergency measures to “squash dissent.” Several countries, including Mexico and Belgium, have released new guidance on handling sensitive personal data, and the Moroccan Data Protection Authority has called for revisions to the national data protection law to strengthen the protection of personal data and to balance health needs against risks to privacy.
Moving forward, we will need to think more creatively and expansively about how to leverage existing legal frameworks and core legal principles (such as necessity and proportionality) to account for emerging technologies and new scenarios like a global pandemic, including their immediate and future implications. We also need to resist the reactive call for new laws and regulations when not needed.
In that vein, let’s take a look at what data protection laws exist today and how they hold up. The vast majority of laws we have reviewed are either newly enacted or significantly upgraded, most often to directly reflect Europe’s General Data Protection Regulation (GDPR) – a trend described as the “Brussels effect” – regardless of geography or socio-political relevance.
The first comprehensive data privacy law in the US, the California Consumer Privacy Act (CCPA), largely mimics the GDPR, as do many laws emerging across Latin America and Africa. We see reflections in the recently overhauled Council of Europe Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (commonly referred to as Convention 108+), with a growing number of non-European signatories, including Mauritius, Senegal, and Tunisia. Globally, we see a rise in regionally-focused frameworks, such as the ASEAN Framework on Personal Data Protection, and increased calls to harmonize frameworks across African countries as the African Continental Free Trade Agreement takes effect.
But this doesn’t mean we have universal definitions of effective data protection. When we look across the globe, we see relative uniformity across the legal frameworks for data protection in their high-level objectives, components, and principles. But they show more variety at the detailed implementation level, which will be tested in the coming weeks and months in light of the pandemic.
Similarities in data protection approaches include:
- broadly similar rules for the collection, processing, and treatment of personal data;
- the authorization of a regulator or authority responsible for enforcement;
- broad similarity in the scope of data protected, the nature of exemptions, and the core principles articulated in law; and
- long-arm or extraterritorial application, highlighting the complexity of cross-border data flows.
Most of the key differences live outside the regulations and are highly dependent on the broader legal foundations, rule of law, quality of governmental and administrative processes and procedures, and a complex array of other social, political, and commercial factors in a given jurisdiction.
The data protection frameworks also exhibit varying philosophical underpinnings, with some laws more focused on protecting an individual’s right to personal identity, others taking a more paternalistic approach, and some more focused on commerce and competition. For example, Brazil’s data protection law emphasizes “free personality development” and imposes heightened requirements on any processing that could result in discriminatory treatment or allow the unequivocal, persistent identification of a data subject. This is especially concerning in light of Brazilian authorities’ call to the country’s largest wireless carrier, Telefonica Brasil SA, to provide cellphone data to help authorities slow the spread of coronavirus in Sao Paulo state. India’s law takes a more custodial approach, imposing certain duties on data fiduciaries and “significant data fiduciaries” and giving the government broad powers to determine what is in the best interests of India’s data sovereignty. Nigeria takes a more commercial approach, naming as a key objective of its law ensuring “that Nigerian businesses remain competitive in international trade through the safeguards afforded by a just and equitable legal regulatory framework on data protection.”
Privacy rights and laws that are designed around notions of physical boundaries, like the home and the body, are threatened as we move into an age of more invasive technologies, and now a global pandemic. We are seeing the emergence of a $20B industry around “emotion detection” technologies that claim to detect and identify emotional states, mind control technologies that claim to treat neurological illness or control our behavior, and AI-enabled cameras meant to predict criminal states of mind before crimes occur. As a result of this rate of change, many existing laws may have a more limited impact than previously anticipated, with some calling for new legal frameworks and even new bodies of rights, such as the NeuroRights Initiative out of Columbia University.
We also know that most data protection laws are designed with existing digital infrastructure in mind. This is further complicated as existing tech platforms expand globally, such as the increased use of WhatsApp across African countries. These limitations put individuals and communities at risk of new and additional harms, including psychological, emotional, mental, financial, and, in the context of a pandemic like COVID-19, even physical harms. Studies have long shown that surveillance breeds self-censorship and can alter behavior. In a pandemic, individuals subject to increasing surveillance may be less willing to search for relevant medical information or seek help for fear of restraints on their liberties or other measures. Those needing mental health or emotional support may be reluctant to reach out virtually where communications are unencrypted, thereby heightening feelings of isolation and fear.
More Insights in Multi-Part Opinion Series
We will share more learnings and perspectives in forthcoming posts, including:
- Implications for data protection, and recommendations, as many countries digitalize social protection payments and financial services in response to the pandemic and as part of long-term strategies;
- Interviews with data protection and competition authorities to understand how they are adapting to COVID-19 and how it may impact their approach to regulating new and emerging technologies and tech platforms;
- Reflections on where current GDPR-style and alternative models for data protection fall short, and what other factors determine the efficacy of laws that govern data protection;
- How gaps in the existing legal structure could be augmented by alternative data governance models, including global frameworks that address the increasingly cross-border world and opportunities to explore community-level data protections; and
- The broader context around data governance, including:
  - Variations in data protection and privacy laws by geography, including deep dives on emerging trends across both Africa and Latin America.
  - How responsive and reflective different laws are to evolving political, social, and economic situations and scenarios.
  - What consumer protection and competition laws can do in support of data protection and privacy, especially when emerging tech is at play.
We’d like to make this series responsive to any additional questions that you feel need to be explored and answered, especially in light of coronavirus and the unprecedented drive for data. Please comment to share your ideas and issues you want us to explore.