ABSTRACT

Speech-disabled people normally use sign gestures to communicate, but sign language is difficult for most hearing people to understand. Our aim is therefore to develop a device capable of translating sign language, making it easier for speech-disabled people to communicate with the general public. For this purpose, a smart hand-gesture recognition system is proposed to reduce the communication gap between disabled and non-disabled people (Hubbell 2014). The system takes the form of a glove fitted with five flex sensors, one for each finger, and a 3-axis accelerometer on the back of the palm, all connected to an Arduino MEGA board. The flex sensors measure the bend of each finger, while the accelerometer measures the tilt of the palm. Data from these sensors is collected for each gesture to create a dataset (Deller et al. 2006). The system is based on American Sign Language. Machine learning algorithms are trained on this dataset to predict which gesture the user is making. The goal of this smart system is to minimize the communication gap and make day-to-day communication easier for disabled people.
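The pipeline described above (five flex-sensor readings plus three accelerometer axes per sample, labeled by gesture and fed to a classifier) can be sketched as follows. This is a minimal illustrative sketch, assuming an 8-value feature vector per sample and using a simple nearest-centroid classifier; the gesture labels, sensor values, and function names are hypothetical and do not come from the paper's actual dataset or chosen algorithm.

```python
import math

def centroid(samples):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def train(dataset):
    """dataset: {gesture_label: [feature_vector, ...]} -> one centroid per label."""
    return {label: centroid(samples) for label, samples in dataset.items()}

def predict(model, sample):
    """Return the gesture label whose centroid lies nearest to the sample."""
    return min(model, key=lambda label: euclidean(model[label], sample))

# Toy dataset: two gestures with made-up readings.
# Each vector is [flex1..flex5 (bend units), accel x, y, z (g)].
dataset = {
    "A": [[80, 75, 78, 82, 10, 0.0, 0.1, 1.0],
          [82, 74, 80, 79, 12, 0.1, 0.0, 0.9]],
    "B": [[ 5,  4,  6,  5,  8, 0.0, 0.2, 1.0],
          [ 6,  5,  4,  7, 10, 0.1, 0.1, 1.0]],
}
model = train(dataset)
print(predict(model, [79, 76, 77, 81, 11, 0.0, 0.1, 1.0]))  # → A
```

In a deployed system the feature vectors would be streamed from the Arduino MEGA over serial, and a stronger model (for example an SVM or a small neural network) would typically replace the nearest-centroid rule; the structure of the data flow, however, stays the same.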