ABSTRACT

Voice-activated assistants such as Siri, Alexa, Google Assistant, and Cortana default to female voices and are often perceived as white, reinforcing the stereotype of the helpful, subservient woman and normalizing whiteness. This case study problematizes the norms and violence perpetuated through virtual assistants and highlights design responses. It draws on multiple interviews with technology developers and critics, including Marco Iacono, an early member of the Apple Siri team. The big four virtual assistants have been the subject of sustained critique in recent years, and this chapter highlights the power of social critique and critical design interventions to help effect change, even within major technology companies like Apple, Amazon, Google, and Microsoft.