ABSTRACT

Backpropagation artificial neural networks (ANNs) were trained to distinguish two visually similar dynamic arm gestures produced by a subject with severe speech and motor impairment due to cerebral palsy (SSMICP). Data were collected using a six-degree-of-freedom magnetic position tracker attached to the forearm. Recognition error rates of ANNs trained on both static and motion parameters were compared with those of ANNs trained on static parameters alone and on motion parameters alone. The results show that ANNs can be trained to recognize “noisy” arm gestures produced by neuro-motorically impaired individuals. The findings indicate that both static and motion parameters are important in the machine recognition of such gestures. The results also confirm that dynamic gestures can be used to increase the bandwidth of information transfer across the human-machine interface (HMI).
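
To make the setup concrete, the following is a minimal sketch (not the authors' code) of the kind of classifier the abstract describes: a small backpropagation network trained on feature vectors that combine static parameters (e.g., the final pose reported by the 6-DOF tracker) with motion parameters (e.g., mean frame-to-frame change). The feature layout, network size, learning settings, and synthetic stand-in data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_features(samples):
    """samples: (T, 6) array of tracker readings (x, y, z, azimuth, elevation, roll).
    Returns a feature vector combining static and motion parameters (assumed layout)."""
    static = samples[-1]                       # static: final pose of the gesture
    motion = np.diff(samples, axis=0).mean(0)  # motion: mean frame-to-frame change
    return np.concatenate([static, motion])    # 12-dimensional combined feature

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class BackpropNet:
    """One hidden layer, trained by gradient descent on squared error (illustrative)."""
    def __init__(self, n_in, n_hidden=8, lr=1.0):
        self.W1 = rng.normal(0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.1, (n_hidden, 1))
        self.b2 = np.zeros(1)
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)
        return sigmoid(self.h @ self.W2 + self.b2)

    def train_step(self, X, t):
        n = len(X)
        y = self.forward(X)
        # Backpropagate squared-error gradients through both sigmoid layers.
        d_out = (y - t) * y * (1 - y)
        d_hid = (d_out @ self.W2.T) * self.h * (1 - self.h)
        self.W2 -= self.lr / n * (self.h.T @ d_out)
        self.b2 -= self.lr / n * d_out.sum(0)
        self.W1 -= self.lr / n * (X.T @ d_hid)
        self.b1 -= self.lr / n * d_hid.sum(0)

# Synthetic stand-in data: two gesture classes, each a short 6-DOF trajectory.
gestures = [rng.normal(c, 0.2, (20, 6)) + np.linspace(0, c, 20)[:, None]
            for c in (0.5, 1.0) for _ in range(20)]
labels = np.array([0] * 20 + [1] * 20).reshape(-1, 1)
X = np.stack([make_features(g) for g in gestures])

net = BackpropNet(n_in=X.shape[1])
for _ in range(5000):
    net.train_step(X, labels)
print("training accuracy:", ((net.forward(X) > 0.5) == labels).mean())
```

Dropping either half of `make_features` (the static pose or the motion term) would mimic the "static parameters alone" and "motion parameters alone" conditions compared in the study; the actual features and comparison protocol used by the authors are described in the body of the paper, not reproduced here.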