Gesture to Speech/Text Converting Glove

by Shja7942 in Circuits > Wearables



[Images: finished glove (cover), MMA8452Q accelerometer, Arduino Uno, flex sensor, Raspberry Pi]

The motivation behind this project was to help people who have difficulty communicating through speech and instead communicate using hand gestures, most commonly American Sign Language (ASL). The project can be a step towards giving these people the opportunity to work in a collaborative environment with others who do not understand sign language. It could also enable them to give public speeches without an actual human interpreter. As a start, I only tried to detect some of the easier gestures, such as the letters A, B, and I, and also assigned certain gestures to common words and greetings such as 'Hello' and 'Good Morning'.

Circuit Assembly

[Images: MMA8452Q hookup diagram (breadboard view), example circuit schematic, complete circuit diagram]

Project Details

This project is built around a wearable glove with four flex sensors stuck/embedded into it: one each for the little finger, middle finger, index finger, and thumb. No flex sensor was used for the ring finger, both because of the limited number of analog input pins on the Arduino Uno R3 and because the ring finger shows little independent movement in sign languages. An MMA8452Q accelerometer, mounted on the back of the palm, measures the orientation of the hand. The readings from these sensors are analyzed to detect the current gesture. Once a gesture is detected, the corresponding character/message is saved into a variable, and these characters and messages keep being concatenated until a certain pre-determined gesture is made that indicates the sentence is complete. When that special gesture is detected, the Arduino sends the saved sentence string to the Raspberry Pi over the USB cable. The Raspberry Pi then forwards the string to Amazon Polly, the AWS text-to-speech service, and plays the returned speech on a speaker connected to the Pi via the AUX cable.
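
To make the Raspberry Pi side of this pipeline concrete, below is a minimal sketch of a script that reads a finished sentence from the Arduino over serial, sends it to Polly, and plays the result. This is only an illustration, not the actual code in Raspberry_pi_code.zip; it assumes the Arduino shows up as /dev/ttyACM0, that the pyserial and boto3 packages are installed, and that the mpg123 command-line player is available for MP3 playback.

import subprocess
import serial   # pyserial, for the USB serial link to the Arduino
import boto3    # AWS SDK for Python, used here for the Polly text-to-speech service

# Assumed port name and baud rate; match these to the Arduino sketch.
ser = serial.Serial('/dev/ttyACM0', 9600, timeout=1)
polly = boto3.client('polly')   # credentials are read from ~/.aws/credentials

def speak(sentence):
    # Ask Polly to synthesize the sentence as MP3 audio.
    response = polly.synthesize_speech(Text=sentence, OutputFormat='mp3', VoiceId='Joanna')
    with open('speech.mp3', 'wb') as f:
        f.write(response['AudioStream'].read())
    # mpg123 is one common command-line MP3 player on the Pi; any player will do.
    subprocess.run(['mpg123', 'speech.mp3'])

while True:
    # The Arduino sends the concatenated sentence, terminated by a newline,
    # once the special "sentence complete" gesture is detected.
    line = ser.readline().decode('utf-8', errors='ignore').strip()
    if line:
        print('Received:', line)
        speak(line)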

This project is just a proof of concept; with better equipment, more planning, and better calibration it could detect many more gestures and hand movements. Currently, only limited functionality is programmed in: basic gesture detection and text-to-speech output.

Code

Steps

1. Connect the flex sensors and the MMA8452Q accelerometer to the Arduino as per the circuit diagram provided.

2. Upload the program Final_Project.ino (found in the Arduino_code.zip file) to the Arduino.

3. Connect the Arduino to the Raspberry Pi via a USB cable (Type A/B).

4. Power up the Raspberry Pi, copy the Raspberry_pi_code.zip file onto it, and extract the archive. Connect the speaker to the Raspberry Pi.

5. Copy your AWS account credentials, i.e. aws_access_key_id, aws_secret_access_key, and aws_session_token, into the ~/.aws/credentials file (a sample layout is shown after these steps). This step is required to communicate with the AWS cloud and to use AWS services.

6. Run the seria_test.py program found inside the folder extracted in step 4.

7. Now make the gestures to form a sentence, then make the special gesture that signals sentence completion: keep your fingers and palm straight and in line, with the palm facing away from you, then rotate the wrist downward so that the palm faces you and your fingertips point down toward your feet.

8. Keep checking the terminal for useful information.

9. Listen to the converted speech being played on the speaker.
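
For step 5, the ~/.aws/credentials file follows the standard AWS format shown below; replace the placeholder values with the keys from your own AWS account (the aws_session_token line is only needed for temporary credentials).

[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
aws_session_token = YOUR_SESSION_TOKEN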

References