ABSTRACT

Human beings naturally look at objects that they want to manipulate or use. For applications in which an operator is controlling a computer-based function or device, harnessing the direction of the operator’s gaze promises to be a natural and efficient control interface. With such control, the computer initiates a predefined action once it receives an input derived from the operator’s point-of-gaze. Based on the assumption that an operator looks in the general direction in which the head is pointing, “gaze-based” control has already been employed in several applications. These systems translate the position of an operator’s head into a system input, enabling “head-based control.” More recently, “eye-based control” has become commercially available, enabling the eye point-of-gaze itself to be translated into a system input. Unless the head is stationary (or, with some tracking systems, held within a small motion box), determining the eye point-of-gaze also requires tracking the head’s position and rotation. For gaze-based control to be useful, it is important that the operator’s eye and/or head movements remain natural and do not involve unusual blinking, lengthy fixations, or fatiguing inputs.