ABSTRACT

In this chapter, the authors propose an adaptation mechanism based on reinforcement learning that reads subconscious body signals from a human partner and uses this information to adjust interaction distance, eye contact (gaze meeting), and the speed and timing of motion in human–robot interaction. Social robots work closely with people in daily life, and because different people have different preferences, robots need to adapt to each of them. The expression of emotions by robots has been well studied. There are also robots that infer a user's state from the emotions the user expresses, for example to generate affective reactions or to estimate user context. Emotion itself has been the subject of extensive research. P. Ekman argued for the existence of basic emotions common to all humans and proposed six of them: anger, disgust, fear, joy, sadness, and surprise. Emotions have also been classified by their duration. The shortest is autonomic emotion, which is typically triggered by a stimulus and lasts only seconds; joy and fear are examples.
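The abstract does not specify how the adaptation mechanism is formulated. Purely as an illustration of the general idea, the sketch below frames the adjustment of a single parameter (interaction distance) as tabular Q-learning driven by a scalar "comfort" reward standing in for the partner's subconscious body signals. All names, values, and the reward function are hypothetical assumptions, not the authors' implementation.

```python
import random

# Minimal sketch, assuming: a discrete set of candidate distances, three
# actions (step closer, stay, step farther), and a comfort score in [-1, 1]
# estimated elsewhere from body signals. None of this is from the chapter.
DISTANCES = [0.5, 0.8, 1.2, 1.8]   # candidate interaction distances in metres (assumed)
ACTIONS = [-1, 0, +1]              # move to a closer, the same, or a farther distance

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2
Q = {(s, a): 0.0 for s in range(len(DISTANCES)) for a in ACTIONS}

def read_comfort() -> float:
    """Placeholder for a comfort score derived from subconscious body signals
    (e.g. posture shifts, gaze aversion). Hypothetical; random here."""
    return random.uniform(-1.0, 1.0)

def step(state: int, action: int) -> int:
    """Move to the adjacent distance index, clipped to the valid range."""
    return max(0, min(len(DISTANCES) - 1, state + action))

state = 1  # start at 0.8 m
for _ in range(1000):
    # epsilon-greedy action selection
    if random.random() < EPSILON:
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=lambda a: Q[(state, a)])

    next_state = step(state, action)
    reward = read_comfort()  # comfort feedback observed after the robot moves

    # standard one-step Q-learning update
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
    state = next_state

best_state = max(range(len(DISTANCES)), key=lambda s: max(Q[(s, a)] for a in ACTIONS))
print("Preferred interaction distance:", DISTANCES[best_state], "m")
```

The same loop could in principle be extended with additional state dimensions for gaze behaviour and motion speed or timing; the chapter itself should be consulted for the mechanism actually proposed.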