ABSTRACT

The past decade has witnessed an unprecedented growth in user interface and human-computer interaction (HCI) technologies and methods. The synergy of technological and methodological progress on the one hand, and changing user expectations on the other, is contributing to a redefinition of the requirements for effective and desirable human-computer interaction, and is influencing social interaction and collaboration. A key element in these developments is the increasingly important role of affect. The ability to accurately recognize human emotions enhances the effectiveness of HCI and contributes to improved human performance. The ability of machines to manifest behaviours that appear to reflect particular emotions enhances their effectiveness across a range of applications, including training and education, treatment, and emerging relational agents. In this paper I provide an overview of the state of the art in the emerging technologies supporting emotion sensing and recognition by machines. I also outline several options for the generation of expressive machine ‘behaviours’, which humans can interpret as reflecting particular emotions. I present a framework for describing approaches to emotion recognition across multiple modalities, including facial expressions, physiological signals, speech, body postures, and gestures and movements. I conclude with a discussion of some fundamental questions regarding the integration of affect in HCI.