Hear Me

by claire4693 in Circuits > Wearables

2272 Views, 12 Favorites, 0 Comments

#IntelIoT #2015HackNTU

This project aims to help people with hearing and speech disabilities by offering them a better way to communicate.

It uses an Intel Edison, a Leap Motion controller, an audio device, and the cloud computing services offered by Microsoft Azure to build a wearable device that translates sign language into spoken language in real time.

Parts and Materials

(Screenshots: parts and materials list.)

Software and Configuration

OS/Image: ubilinux for Edison

  1. Because the libraries are large, we have to "expand" the root partition. To do this, we moved /var and /usr onto a micro-SD card and created symbolic links back to their original paths.
  2. The Leap Motion library depends on newer versions of its dependencies, so we have to upgrade ubilinux to Debian jessie: replace every "wheezy" in /etc/apt/sources.list with "jessie", then run "apt-get update; apt-get upgrade; apt-get dist-upgrade".
  3. For Bluetooth speaker support, install pulseaudio and pulseaudio-module-bluetooth with apt-get, and edit /etc/bluetooth/audio.conf to add the following line: Enable=Source,Sink,Media,Socket
  4. The Leap Motion daemon has a bug on Debian and cannot be started with the service command, so we add the line "leapd &" to /etc/rc.local.
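The steps above can be sketched as shell commands. This is a rough sketch, not the exact procedure we used: the SD-card device name and mount point are assumptions, and moving /usr on a live system is safest from a console session with nothing else running.

```shell
# 1. Move /var and /usr to the micro-SD card and symlink them back
#    (assumes the card is /dev/mmcblk1p1, mounted at /media/sdcard)
mount /dev/mmcblk1p1 /media/sdcard
cp -a /var /usr /media/sdcard/
mv /var /var.old && ln -s /media/sdcard/var /var
mv /usr /usr.old && ln -s /media/sdcard/usr /usr

# 2. Upgrade ubilinux (wheezy) to Debian jessie
sed -i 's/wheezy/jessie/g' /etc/apt/sources.list
apt-get update && apt-get upgrade && apt-get dist-upgrade

# 3. Bluetooth audio support
apt-get install pulseaudio pulseaudio-module-bluetooth
echo 'Enable=Source,Sink,Media,Socket' >> /etc/bluetooth/audio.conf

# 4. Start the Leap Motion daemon at boot (its service script is broken on Debian)
echo 'leapd &' >> /etc/rc.local
```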

Collecting Gestures

(Diagrams from the Leap Motion SDK: palm vectors and finger model.)

We record the position and orientation of each palm and the relative positions of the fingers using the Leap Motion Python API and SDK, sample frames at appropriate intervals, and save the data into a multi-dimensional array. After labels are attached manually, this becomes the training data.
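A minimal sketch of the feature extraction, assuming the Leap Motion v2 Python bindings (`Leap.Controller`, `hand.palm_position`, and so on). The exact feature layout is our assumption here, not necessarily the project's format; the helper works on plain (x, y, z) tuples so it can run without the SDK installed.

```python
# Build one training row from a Leap Motion frame: palm position and
# normal, plus each fingertip position relative to the palm.

def hand_features(palm_position, palm_normal, fingertips):
    """Flatten palm and finger data into one feature vector."""
    px, py, pz = palm_position
    features = [px, py, pz] + list(palm_normal)
    for fx, fy, fz in fingertips:
        # Fingertips relative to the palm, so the features do not depend
        # on where the hand sits above the controller.
        features += [fx - px, fy - py, fz - pz]
    return features

# With the real SDK this would be fed from a frame, roughly:
#   import Leap
#   controller = Leap.Controller()
#   frame = controller.frame()
#   for hand in frame.hands:
#       row = hand_features(
#           (hand.palm_position.x, hand.palm_position.y, hand.palm_position.z),
#           (hand.palm_normal.x, hand.palm_normal.y, hand.palm_normal.z),
#           [(f.tip_position.x, f.tip_position.y, f.tip_position.z)
#            for f in hand.fingers])

if __name__ == "__main__":
    row = hand_features((0.0, 100.0, 0.0), (0.0, -1.0, 0.0),
                        [(10.0, 120.0, -5.0)])
    print(row)  # [0.0, 100.0, 0.0, 0.0, -1.0, 0.0, 10.0, 20.0, -5.0]
```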

Machine Learning Model and Web Server

(Screenshots: Microsoft Azure Machine Learning Studio.)

We upload the training data to Microsoft Azure, use the machine learning tools Azure provides to generate candidate models, and publish the best one as a web service. This time we chose a Multiclass Neural Network.

Real-time Gesture Matching

The Leap Motion sends raw data to the Edison; after some processing on the Edison, the data is submitted to Azure over Wi-Fi to detect which gesture it is.
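A sketch of submitting one feature vector to an Azure ML Studio (classic) request-response web service. The endpoint URL, API key, and column names below are placeholders, not the project's real values; only the JSON envelope follows Azure's documented request format.

```python
# Send a processed feature vector to the Azure ML web service and
# read back the predicted gesture label.
import json
import urllib.request

ENDPOINT = "https://example.azureml.net/execute?api-version=2.0"  # placeholder
API_KEY = "YOUR_API_KEY"  # placeholder

def build_payload(features):
    """Wrap a feature vector in the Azure ML request-response JSON format."""
    columns = ["f%d" % i for i in range(len(features))]  # assumed column names
    return {
        "Inputs": {"input1": {"ColumnNames": columns,
                              "Values": [[str(v) for v in features]]}},
        "GlobalParameters": {},
    }

def classify(features):
    """POST one sample to the scoring endpoint; the response JSON
    contains the predicted gesture label."""
    body = json.dumps(build_payload(features)).encode("utf-8")
    req = urllib.request.Request(ENDPOINT, body, {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + API_KEY,
    })
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```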

Sounds Output

We use the Edison to connect to the Bluetooth speaker and play the sound file that corresponds to the detected gesture.
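The playback step could look like the sketch below, which maps each predicted label to a recording and plays it through PulseAudio (which owns the Bluetooth sink configured earlier). The directory, file names, and labels are assumptions for illustration.

```python
# Map each predicted gesture label to a sound file and play it
# through the Bluetooth speaker via PulseAudio.
import os
import subprocess

SOUND_DIR = "/home/root/sounds"  # assumed location of the recordings
GESTURE_SOUNDS = {               # assumed label -> file mapping
    "hello": "hello.wav",
    "thank_you": "thank_you.wav",
}

def sound_path(label):
    """Return the recording path for a gesture label, or None if unknown."""
    name = GESTURE_SOUNDS.get(label)
    return os.path.join(SOUND_DIR, name) if name else None

def speak(label):
    path = sound_path(label)
    if path:
        # paplay plays through PulseAudio, which routes to the Bluetooth sink.
        subprocess.call(["paplay", path])
```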

Final Thoughts
