ABSTRACT

This part conclusion presents some closing thoughts on the key concepts discussed in the preceding chapters. The part shows that nothing is ever unbiased or objective, not even international human rights conventions, biometric technologies, or the combination of the two. It examines how the databases used by tech developers in laboratories to develop and test facial recognition algorithms are biased in that they consist mostly of white males. As a result, biometric technologies implemented at, for example, border sites carry a higher risk of producing errors and misidentifying black people who are crossing borders. The universal definition of the human being, as set out in international conventions, originated in the aftermath of the Second World War. All human beings were, at least according to the principles underlying the conventions, considered equal and entitled to a minimum of care and freedom.