ABSTRACT

This chapter examines the main problems that call for the development of the new field of algorithm audit. It begins by presenting the black box problem and its relevance in science and technology studies (STS). Opening black boxes is not always possible, nor always desirable, and few people understand the socio-technical processes that frame algorithms. Yet it is only by unpacking black boxes that one can retrace the technological, political, social, and economic apparatuses that constitute the algorithmic assemblage. Scarce knowledge of coding and data science, together with limited access to technology and the internet, may leave people unable to grasp the risks of algorithmic decision-making. The chapter then defines digital social inequalities, which lie at the intersection of social disadvantage and a lack of digital and data literacy. It concludes with the consequences that algorithms may have for the trust that citizens and customers place in corporations and organisations: if algorithmic processes risk producing unjust outcomes, this undermines citizens’ sense of security and their trust both in the processes themselves and in the institutions that use them.