‘Smart glove’ developed by Amrita University students converts Indian sign-language into speech

Students at Amrita School of Engineering's Amrita Robotics Research Lab (ARRL) have developed a 'smart glove' called MUDRA that converts Indian sign language hand gestures into spoken English, potentially paving the way for speech-impaired individuals to communicate more effectively with others. The feat was achieved by a team of four B.Tech students – Abhijith Bhaskaran, Anoop G Nair, Deepak Ram and Krishnan Ananthanarayanan – mentored by HR Nandi Vardhan, Asst. Professor, Dept. of ECE, Amrita School of Engineering, Amrita University.

Said Dr. TSB Sudarshan, Head – Research, Amrita School of Engineering, Amrita University: “According to the 2011 census, 12 million Indians have some kind of speech or hearing disability. They face many issues in society because they are unable to communicate normally or express themselves effectively. They communicate through sign language and hand gestures which are often ridiculed by others, creating social insecurities in them. The Amrita University students have developed the ‘MUDRA’ smart glove to help bridge this gap between speech-impaired people and others.”

Added HR Nandi Vardhan: “The glove is designed to convert hand gestures in Indian sign language to voice. Though this is our initial focus, the glove is multi-purpose and versatile. It can be reprogrammed for a range of applications in which motion-sensor technology plays an important role, such as gaming stations, virtual reality, remote control of devices, and the robotics and medical industry. The glove has a vast potential due to its simplicity and powerful algorithm.”

The lightweight MUDRA glove can be worn as comfortably as a riding glove. It recognizes hand gestures in all directions and at all angles using flex resistors, an accelerometer and a gyroscope, and the corresponding output is transmitted as speech through inbuilt speakers. Its defining feature is cost-effectiveness without compromising quality or efficiency.

Said student Abhijith Bhaskaran: "The glove is much cheaper than similar gesture-sensing products available today. The prototype took us 16 weeks to build and costs Rs 7,500. The glove can currently recognize numbers from 1 to 10, as well as Indian sign language gestures corresponding to words such as morning, night, goodbye and thank you. It can detect four different states of each finger, and as many as 70 gestures can be configured. The glove is now in an advanced stage of the production cycle. We have begun validating its social feasibility, and the preliminary results are very encouraging." The team intends to conduct field trials once it has designed user experiments covering all possible conditions and permutations.
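The numbers quoted above imply a simple combinatorial design: four states per finger across five fingers gives 4^5 = 1024 raw combinations, of which roughly 70 are configured as gestures. A minimal sketch of such a lookup might resemble the following (the state labels and the gesture table entries are illustrative assumptions, not MUDRA's actual mapping):

```python
# Hypothetical gesture lookup: each of the five fingers is read as one of
# four discrete states (0 = straight ... 3 = fully bent; labels assumed),
# and a 5-tuple of finger states indexes a configured gesture table.

GESTURES = {  # entries are illustrative, not the team's actual mapping
    (0, 0, 0, 0, 0): "hello",
    (3, 3, 3, 3, 3): "goodbye",
    (0, 3, 3, 3, 3): "thank you",
    (0, 0, 3, 3, 3): "morning",
}

def recognize(finger_states):
    """Return the configured word for a 5-tuple of finger states, or None."""
    return GESTURES.get(tuple(finger_states))

# Four states per finger, five fingers: 4**5 = 1024 raw combinations,
# of which the article says about 70 are configured as gestures.
print(4 ** 5)                        # 1024
print(recognize([0, 3, 3, 3, 3]))    # thank you
```

A table-driven design like this would also explain the claim that the glove is easy to reconfigure: changing the vocabulary only means editing the table, not the recognition logic.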

The students faced many challenges while developing the glove. Initially, they intended to use a camera-based device, but it proved to be bulky and expensive. After much research, flex sensors were tested, refined and integrated with the glove to recognize four different positions of each finger. The design of the glove was crucial as a stiff hold was required on the fingers. A range of values was calibrated precisely for each specific position of the finger and the rest was filtered out.
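The calibration step described above – a range of values for each finger position, with everything else filtered out – can be sketched as a band classifier. The thresholds and ADC scale below are assumptions for illustration, not the team's actual calibration values:

```python
# Minimal sketch of band-based flex-sensor calibration: each finger
# position corresponds to a calibrated range of sensor readings, and
# readings that fall outside every band are discarded as noise.

BANDS = [  # (low, high, state) for a hypothetical 10-bit ADC
    (100, 250, 0),   # straight
    (300, 450, 1),   # slightly bent
    (500, 650, 2),   # half bent
    (700, 900, 3),   # fully bent
]

def classify_flex(reading):
    """Map a raw flex-sensor reading to a finger state, or return None
    if it falls between calibrated bands (treated as noise)."""
    for low, high, state in BANDS:
        if low <= reading <= high:
            return state
    return None

print(classify_flex(560))  # half bent -> 2
print(classify_flex(275))  # between bands -> None
```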

[Photo: Student Deepak Ram with the MUDRA glove, Amrita University]

The movement of the hand posed another challenge. Although the inertial measurement unit (IMU) provided readings, noise made them inaccurate, so filtering techniques were adopted for precision. Since differentiating between the hand's various orientations and movements with only one sensor was proving difficult, the students developed a novel method of state estimation.
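The article does not name the filtering technique used. One common way to tame IMU noise is a complementary filter, which blends the gyroscope's short-term accuracy with the accelerometer's drift-free tilt estimate; the sketch below is purely illustrative, with the blend weight, sample rate and axes all assumed:

```python
import math

# Illustrative complementary filter for one tilt axis (not the team's
# actual algorithm): the gyro is accurate short-term but drifts, while
# the accelerometer is noisy but drift-free; blending the two gives a
# stable angle estimate.

ALPHA = 0.98  # weight on the gyro-integrated angle (assumed)
DT = 0.01     # sample period in seconds, i.e. 100 Hz (assumed)

def complementary_filter(angle, gyro_rate, accel_x, accel_z):
    """Fuse a gyro rate (deg/s) with an accelerometer tilt estimate (deg)."""
    accel_angle = math.degrees(math.atan2(accel_x, accel_z))
    return ALPHA * (angle + gyro_rate * DT) + (1 - ALPHA) * accel_angle

# One update: previous angle 10 deg, gyro reads 5 deg/s,
# accelerometer implies a tilt of ~0 deg.
angle = complementary_filter(10.0, 5.0, 0.0, 1.0)
print(round(angle, 3))  # 9.849
```

With a single low-cost IMU, some fusion step of this kind is usually needed before orientation readings are stable enough to distinguish gestures.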

The inspiration for the MUDRA glove came from another project carried out at the Amrita Robotics Research Lab (ARRL). A robot was to be controlled remotely, and the students were evaluating various techniques for it. One of the ideas was to control the robot through hand gestures. During the course of the project, the students realized the potential of the application, and the idea of the speech glove was born.

The students started off with camera-based gesture recognition and evolved the design into the current glove. The device's on-board processing unit can be reprogrammed and reconfigured. The algorithm the students developed to track the hand in 3-D space is unique and will soon be published at a robotics conference. It gives the glove the capacity to be integrated easily with other applications.
