ABSTRACT

The growing influence of algorithms and data-driven decisions on our lives has summoned a wide spectrum of critical voices. Most of the algorithms used in Big Data applications are inaccessible even to those who use them, let alone to those affected by the results of algorithmic evaluations. Yet opening black boxes is not enough, because genuine algorithmic scrutiny is difficult, if not impossible, to achieve. Opening black boxes and scrutinising algorithms is part of disentangling these complicated relations, but it must not entice us to think that the algorithm – or the source code – is the definitive instance that gives us leverage on the important ethical and political issues. Asking for scrutiny of a biased algorithm implies that one could build an unbiased one. If we want transparency, accountability, and scrutiny, we need people who perform transparently and accountably, and people who scrutinise their work.