ABSTRACT

Assessing the effects of audio-visual media can help content creators understand how audiences perceive their work. However, this process can be challenging, in part because traditional methods rely on self-reported questionnaire data, which are subject to several forms of bias (participant bias, experimenter bias, or simply human variability). To address this subjective variability, this work proposes a method that supplements questionnaires with physiological readings for assessing affective responses to video content. Specifically, it combines subjective self-report questionnaires with electroencephalogram (EEG) and heart-rate readings. Participants viewed short Japanese television commercials and news programs through a VR headset platform. Support vector machines were used to detect different types of affect related to the context of the videos. Finally, the generalizability of the trained models was analyzed by applying a model trained on short TV commercials to a smaller dataset collected while viewing news programs. The results suggest a possible link between positive emotions and clarity of understanding.
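As a rough illustration of the classification step mentioned above, the following is a minimal sketch, not the authors' actual pipeline, of training an SVM on combined EEG and heart-rate features with scikit-learn; the feature layout, label coding, and dimensions are hypothetical stand-ins.

```python
# Minimal sketch (assumed setup, not the paper's pipeline): an SVM
# classifying affective state from combined EEG and heart-rate features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical data: one row per viewing segment. Columns could be
# EEG band powers (e.g., alpha/beta per channel) plus mean heart rate.
X = rng.normal(size=(200, 9))     # 8 EEG features + 1 heart-rate feature
y = rng.integers(0, 2, size=200)  # 0 = negative affect, 1 = positive affect

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Standardize features before the SVM, since EEG band power and
# heart rate live on very different numeric scales.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

Evaluating a model trained on one video genre against data from another, as the abstract describes for commercials versus news programs, would amount to calling `clf.score` on the second dataset instead of the held-out split.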