Abolition of Technologies for Social Justice

This is the core of my contribution to a Nethope 2022 seminar at the invitation of Oxfam US:

The Abolition of Some Digital Technologies Is Essential to Human Development and Social Justice

a) Technology abolition is not only possible – it is commonplace.

b) Technologies of all kinds are regularly abolished – and have been throughout history.

c) Abolition is not only common – it can be considered to be a measure of social progress.

d) Every generation identifies new social evils and abolishes them.

e) We are all familiar with many examples of abolition – historical and current.

f) These include the abolition of technologies of production such as child labour and slavery; technologies of torture including the rack, the pillory, breaking wheels and the instep borer; weapons technologies including cluster bombs, nerve gas and biological weapons; and industrial technologies containing carcinogens or other toxic materials such as lead and asbestos.

g) Not only is abolition commonplace and progressive; the methods for achieving it are well known and well established.

h) Abolition is achieved when citizens name injustices, build coalitions, raise awareness, influence political processes, reform laws and abolish social evils.

i) When abolition is framed in this way, as a feature of social progress, it becomes clear not only that we can abolish technologies, but that we have a duty to abolish any technology that leads to injustice, just as we act collectively to abolish other social evils.

j) This having been established, the question becomes: which applications of technology should be abolished? Which digital technologies might we consider to be social evils, i.e. obstacles to development and social justice? I would argue that they include:

  • Racist algorithms (see Safiya Noble in Algorithms of Oppression and Ruha Benjamin in Race After Technology)
  • Sexist algorithms (see Virginia Eubanks in Automating Inequality and Safiya Noble in Algorithms of Oppression)
  • Government digital services that punish the poor (see Cathy O’Neil’s Weapons of Math Destruction)
  • Bulk surveillance of citizen communications (see the Snowden revelations)
  • Targeted surveillance of opposition politicians/journalists/judges/activists (see the Pegasus spyware story)
  • Biometric refugee surveillance (see Mirca Madianou’s The Biometric Assemblage)
  • Covert political advertising/disinformation (see Cambridge Analytica)
  • Social media profiling / amplifying antagonisms for profit (see Shoshana Zuboff in The Age of Surveillance Capitalism)
  • School proctoring/surveillance tech (see the Electronic Frontier Foundation).
