ABSTRACT

A system is described which enables a user to select and control ICT devices in their environment by means of their eye gaze behaviour. The user’s environment is imaged continuously and interrogated, using SIFT image analysis algorithms, for the presence of known ICT devices. The locations of these devices are then related mathematically to the user’s measured point of gaze, so that the device at which the user is looking can be determined. A laboratory-based prototype system has been set up to carry out ICT device control by eye point of gaze; it consists of four main components: an eye tracker, an object monitor, a user-configurable panel and a controller. ICT devices in the user’s environment are identified by the algorithms designed and built into the ART system. At present the ART system components operate separately, and current research effort is focused on integrating these modules into a fully cohesive ART system that works in real time.
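To illustrate the kind of processing the abstract outlines (SIFT-based detection of known devices in the scene image, followed by relating the detected device locations to the measured point of gaze), a minimal sketch is given below. It is not the ART system’s actual implementation: the function names, the ratio-test threshold and the assumption that the eye tracker reports gaze in scene-camera pixel coordinates are all illustrative assumptions, and the example relies on OpenCV with SIFT support.

```python
# Illustrative sketch only: not the ART system's implementation. Assumes
# OpenCV (>= 4.4, with SIFT) and that gaze coordinates are expressed in the
# same scene-camera image frame as the detected devices.
import cv2
import numpy as np

sift = cv2.SIFT_create()
matcher = cv2.BFMatcher(cv2.NORM_L2)

def detect_device(template_gray, scene_gray, min_matches=10):
    """Locate a known ICT device (given as a template image) in the scene.

    Returns the device outline as a polygon in scene coordinates, or None.
    """
    kp_t, des_t = sift.detectAndCompute(template_gray, None)
    kp_s, des_s = sift.detectAndCompute(scene_gray, None)
    if des_t is None or des_s is None:
        return None

    # Lowe's ratio test keeps only distinctive feature matches.
    pairs = matcher.knnMatch(des_t, des_s, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    if len(good) < min_matches:
        return None

    src = np.float32([kp_t[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_s[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None

    # Project the template outline into the scene to get the device region.
    h, w = template_gray.shape
    corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(corners, H)

def gaze_hits_device(gaze_xy, device_polygon):
    """Relate the measured point of gaze (x, y) to a detected device region."""
    return cv2.pointPolygonTest(device_polygon, gaze_xy, False) >= 0
```

In use, a loop of this kind would run per video frame over a library of stored device templates; a device whose projected region contains the current gaze point would then be offered for selection, for example via the user-configurable panel described above.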