Covid-19: how can we avoid locking in bad digital development outcomes?
Co-authored with Becky Faith and first published here by the Institute of Development Studies.
As Covid-19 tracing apps proliferate this month, governments of all hues will gain the ability to track citizen movements in real time and to collect the largest trove of personal, real-time location data ever assembled. This could save thousands of lives. It could also be a historic inflection point in civil liberties, the depth and scale of which were unimaginable to George Orwell or Michel Foucault. The introduction of these tracing apps raises important questions about this consolidation of power, and about how this moment risks entrenching unaccountable Big Tech companies as institutions essential to the basic functioning of the state and society.
Irrespective of the original intent of their inventors, technology is appropriated by different groups to pursue their own agendas, projects and interests. Good people with the right intentions are producing Covid-19 apps to save lives. However, the unintended consequences of technology use are shaped by political, economic and social influences beyond the control of innovators; research from India, for example, shows a consistent lack of respect for user privacy.
Power – in whose hands?
Billions of people are using mobile phones and social media to video call grandparents, share amusing images and videos, and organise community fundraising events. It is widely known that Cambridge Analytica and powerful political groups used the same technology to grab personal data, pervert election outcomes and consolidate power. None of these uses was in the minds of the technology's inventors. As Melvin Kranzberg famously said: “Technology is neither good nor bad; nor is it neutral.”
As Covid-19 tracking apps are rolled out, our analysis must not be limited to their effectiveness – although there is plenty of reason to question that. When digital technologies are presented as solutions to urgent humanitarian crises or compelling development problems, it is critical that we apply a power lens and assess whose interests are served.
Governments and corporations may use the data in ways that serve their interests in control and profit, consolidating their positions of power. The same may be true of the digital identity systems which underpin many social protection schemes. Whilst these systems have enabled the rollout of emergency cash transfer programmes in lower- and middle-income countries, they are often introduced without a proper mandate and accountability, and they punish marginalised groups.
But will they be effective?
Over the next few weeks, we will all feel pressured to download a Covid-19 app and be tracked along with many millions of other citizens. Yet there is only limited evidence of the efficacy of these apps – especially given that 19% of the UK population lack fundamental digital skills such as turning on devices or opening apps, and that there is a genuine risk of mission creep.
Will these apps save lives, or would the money be better spent on testing and protective equipment? Would it be better to follow the example of South Africa? Rather than relying on apps, the country mobilised 28,000 health workers to screen over 7 million people, more than one in ten of its population.
Community-led, rights-based approaches to digital development
When we look back at 2020, will we see it as the point when citizens were forced into a false choice between health and privacy, and when corporations and authoritarian governments gained the ability to trace our every thought and move in ways unimaginable to George Orwell and Michel Foucault? In the face of weakened state institutions in democratic and non-democratic states alike, the current crisis threatens the space for accountability and civil society.
Alternatively, we could seize an opportunity to put in place approaches to digital development that enhance our agency and rights and help build sustainable livelihoods. These approaches are community-led, politically aware and recognise intersectional inequalities: people-centred approaches which start from the standpoint of excluded people rather than the profit margins of technology platforms. We have seen multiple examples of this in recent months, from mutual aid groups on Facebook responding to community needs to communities of people with 3D printers producing protective visors for frontline workers. Further research is needed to illuminate and amplify these opportunities for technology to promote human rights and development outcomes, and to stop dangerous trends of amplifying inequalities and infringing our rights from becoming locked in.