ABSTRACT

This chapter explores the intersection of digital risk prediction and race. It argues that studies demonstrating the limited validity of risk technologies with respect to black, Asian, and minority ethnic (BAME) groups reinforce the position of penological scholars who have theorized risk and implied that it is an instrument of racialized control. The studies call into question the objective fairness of the technologies. Objective fairness in this context concerns the ethicality of using flawed predictions to incapacitate BAME people or to compel them to undertake more intensive rehabilitative work than required. The chapter also explores the issue of subjective fairness and argues that this dimension of risk prediction has been ignored. In particular, the chapter focuses on the nexus of race, predictions, legitimacy, and compliance in the context of rehabilitative work. It contends that systemic racial discrimination (for example, the application of risk technologies imbued with predictive bias) can have implications for the perceived legitimacy of authority, which can in turn undermine the processes and outcomes of rehabilitative work.