ABSTRACT

Multimodality and social semiotics together may bring real benefits to understanding apt forms of communication through a better understanding of design, whether in the private or the public domain, in pleasure and entertainment as much as in work. Incorporating the olfactory mode into a consonant multimodal ensemble has the potential to increase immersion and representative realism in a videogame and to bring a worldliness to in-game artifacts. Beyond customizable and modified controllers, principles of multimodality can help us envision further ways to improve accessibility for players with motion impairments. These principles can also help videogame and hardware designers approach challenges of accessibility and universal design, while experimental methods viewed through the lens of multimodality may provide novel insights into how meaning is conveyed in videogames. Within multimodal studies, gesture is typically discussed in the context of speech and dialogue; in the context of gaming and interactive interfaces, however, there is a robust body of research on gesture within Human-Computer Interaction (HCI).