ABSTRACT

In the past, accessibility in Human-Computer Interaction (HCI) was primarily concerned with the selection of suitable equipment to enable alternative computer access for people with disabilities. As a result, it was mainly treated as an afterthought and reflected a reactive approach, whereby Assistive Technology solutions addressed problems introduced by a previous generation of technology (Akoumianakis and Stephanidis 1999; Savidis and Stephanidis 1995; Stephanidis 1995). This reactive approach relies primarily on adaptations. Adaptations facilitate access to the interface through suitable mechanisms, such as filtering, dedicated interaction techniques, such as scanning, and specialized input/output devices (e.g., Braille displays, switches, eye-gaze systems). Typically, adaptations result in the reconfiguration of the physical layer of interaction and, when necessary, the translation of the visual manifestation of the interface into an alternative modality. For example, access to a graphical user interface (GUI) by a blind user requires “filtering” of the contents of the screen, using appropriate software (e.g., a screen reader), so as to present them in an alternative modality (e.g., tactile or auditory).
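To make the notion of “filtering” concrete, the minimal sketch below walks a simplified widget hierarchy and re-expresses it as a linear sequence of utterances that could be handed to a speech synthesizer or Braille back end. All names here (Widget, render_as_speech, the example dialog) are illustrative placeholders under assumed conditions, not the API of any actual screen reader or GUI toolkit.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative stand-in for the visual interface hierarchy exposed by a GUI toolkit.
@dataclass
class Widget:
    role: str                       # e.g. "window", "button", "text"
    label: str = ""                 # accessible name or visible text
    children: List["Widget"] = field(default_factory=list)

def render_as_speech(widget: Widget) -> List[str]:
    """Filter the widget tree into a linear sequence of utterances,
    the kind of translation a screen reader performs before passing
    the result to a speech or tactile output device."""
    utterances = [f"{widget.role}: {widget.label}".strip(": ")]
    for child in widget.children:
        utterances.extend(render_as_speech(child))
    return utterances

# Hypothetical dialog with a prompt and two buttons.
dialog = Widget("window", "Save changes?", [
    Widget("text", "Your document has unsaved changes."),
    Widget("button", "Save"),
    Widget("button", "Discard"),
])

for line in render_as_speech(dialog):
    print(line)   # in practice, routed to a text-to-speech engine or Braille display
```

The point of the sketch is only that the visual layout is not reworked at the source; instead, its contents are intercepted and translated after the fact, which is precisely what characterizes the reactive, adaptation-based approach described above.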