ABSTRACT

Data analytics programs are frequently used in decisions governed by the concepts of disparate treatment and disparate impact. Disparate treatment is the US legal term for differentially treating a protected class (e.g., by gender, race, ethnicity, religion, or national origin) and requires proof not only of the differential treatment but also of the intent to treat a class of individuals differently based on their protected status. Disparate impact, by contrast, refers to facially neutral practices that nonetheless disproportionately burden a protected class and does not require proof of intent. A number of relevant laws governing decisions made by humans (and those augmented by data analytics) include the concepts of disparate treatment and disparate impact as measurements of discrimination: the Fair Housing Act of 1968, the Americans with Disabilities Act, the Age Discrimination in Employment Act, the Equal Credit Opportunity Act, and the Civil Rights Act of 1964. Readings include Solon Barocas and Andrew Selbst on the many ways data analytics, specifically data mining, can discriminate through the design of a program, and Anna Lauren Hoffmann on the limitations of discussing discrimination only with regard to algorithms. The two related cases concern (1) Amazon’s AI hiring program and (2) a predictive program used by banks.