This chapter explores the development of technologies that apply forms of artificial intelligence (AI) to emotion-related data (much of which has been developed within "affective computing"). Our focus is on efforts to track, identify, interpret, replicate, and potentially manipulate emotional activity. AI has been associated with emotion across media, the military, the state, private industries, and academia.

The discipline of affective computing has grown considerably in the last twenty years and now comprises a range of areas. These include (but are not exhausted by) capturing emotion through facial expression, bodily expression, speech, text, physiological data such as skin conductance and heart rate, and senses such as touch. Affective computing also involves areas focused on affect generation, such as virtual characters, emojis, and physical robots (informed by techniques for labelling posture, gesture, and motion in dance) (Picard, 2000). Much of this work aims to create "emotional agents" (virtual and/or physical) capable of social interaction with humans. Affective computing is a field of significant size and growing influence, especially in industry. It remains, though, a sub-field of engineering, one that recruits existing models of emotion primarily from psychology and secondarily from the social sciences. As such, it is important to analyse the psychological theories that underpin the models developed in affective technologies and the associated labelling of emotion therein.