Sign Language Word Recognition using Wearable Sensors
Communication and collaboration between deaf and hearing people are hindered by the lack of a common language. Although there has been a great deal of research in this domain, there is still room for a system that is ubiquitous, non-invasive, works in real time, and can be trained interactively by the user. Such a system would be powerful enough to translate gestures performed in real time, while also being flexible enough to be fully personalized for use as a platform for gesture-based HCI. We propose SCEPTRE, which uses two non-invasive wrist-worn devices to decipher gesture-based communication. The system uses a multi-tiered, template-based comparison scheme to classify input data from accelerometer, gyroscope, and electromyography (EMG) sensors. This work demonstrates that the system can be trained with just one to three instances each for twenty randomly chosen signs from the American Sign Language (ASL) dictionary, as well as for user-generated custom gestures. The system achieves an accuracy of 97.72% on ASL gestures.
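The abstract describes the template comparison only at a high level. As an illustrative sketch, the snippet below shows how nearest-template classification over multi-channel sensor recordings might look using dynamic time warping (DTW), a common choice for comparing gesture traces performed at different speeds. The function names and the DTW formulation are assumptions for illustration, not the exact SCEPTRE pipeline.

```python
import numpy as np

def dtw_distance(a, b):
    """DTW distance between two multi-channel sequences a (n, d) and
    b (m, d). Illustrative only; SCEPTRE's multi-tiered comparison
    may differ in detail."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])  # per-frame distance
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(sample, templates):
    """templates: dict mapping sign label -> list of recorded instances
    (one to three per sign, as in the paper). Returns the label of the
    nearest template."""
    return min(
        ((label, dtw_distance(sample, t))
         for label, ts in templates.items() for t in ts),
        key=lambda pair: pair[1],
    )[0]
```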
Dataset:
Please find below the link to the dataset used for this project. If you use the dataset in your research, please cite the following publication:
Publication:
P. Paudyal, A. Banerjee, and S. K. S. Gupta. "SCEPTRE: A Pervasive, Non-Invasive, and Programmable Gesture Recognition Technology." In Proceedings of the 21st International Conference on Intelligent User Interfaces (IUI '16), ACM, 2016. (ACM Digital Library Link, pdf)
Dataset: 8 users, 4 repetitions, 10 signs
Dataset Description:
The dataset consists of .csv files collected from two Myo armbands. File names follow the format [word_name]_[id], where 'word_name' is the English translation of the American Sign Language word performed and 'id' is a unique identifier. The .zip archive for each of the links above contains a sub-folder for each user.
Each file has 50 columns representing a sub-sampled data collection from the two Myo devices worn on the left and right hands of the signer. The first column, 'Counter', runs from 1 to 50.
The remaining columns follow the format [Sensor][pod/axis][L/R]. For instance, the reading for the first EMG pod (out of 8) on the left hand is named EMG0L, and the accelerometer reading for the Z axis on the left hand is named AZL.
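To make the layout concrete, here is a minimal loading sketch. It assumes pandas, a header row containing the column names described above, and the unzipped per-user directory layout; the root path and helper name are hypothetical.

```python
import glob
import os
import pandas as pd

def load_sign_files(root):
    """Walk the per-user sub-folders and yield
    (user, word_name, file_id, DataFrame) for every
    [word_name]_[id].csv file under 'root'."""
    for path in glob.glob(os.path.join(root, "*", "*.csv")):
        user = os.path.basename(os.path.dirname(path))
        stem = os.path.splitext(os.path.basename(path))[0]
        # Split on the last underscore, e.g. "hello_03" -> ("hello", "03")
        word_name, _, file_id = stem.rpartition("_")
        df = pd.read_csv(path)  # assumes the column names form a header row
        yield user, word_name, file_id, df

# Example usage: select the left-hand EMG channels (EMG0L..EMG7L).
# for user, word, fid, df in load_sign_files("sceptre_dataset"):
#     left_emg = df[[f"EMG{i}L" for i in range(8)]]
```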
Methods:
Faculty Advisor:
Dr. Sandeep Gupta
Research Faculty:
Dr. Ayan Banerjee
Current Students:
Prajwal Paudyal
Junghyo Lee
Sponsors:
Collaborators:
Paul Quinn
Dr. Tamako Azuma