A Toolkit for Exploring Affective Interface Adaptation in Videogames
From its humble beginnings in the early 1960s, the videogame has become one of the most successful forms of HCI to date. However, if we look more closely at the interactions between game and gamer, it becomes evident that little has changed since the advent of Spacewar [] back in 1961. These interactions are for the most part static and thus predictable: given a particular set of circumstances, a game will always react in one particular manner regardless of what the player actually does. Because of this, the expected lifespan of a videogame is inherently dependent on the choices it provides; once all possible avenues have been explored, the game loses its appeal. In this paper we focus on adapting techniques from the field of Affective Computing to address this stagnation in the videogames market. We describe the development of a software development kit (SDK) that allows the interactions between man and machine to become dynamic during play by monitoring the player's physiological condition.