Computer Vision ASL Detection Robotic Arm

by amarsbar in Design > 3D Design

Live Hand Tracking Robotic Hand | Machine learning and Mechatronics

Have you ever wanted to bridge the communication gap between the hearing and deaf communities? This project creates a robotic hand that can both recognize American Sign Language (ASL) letters through computer vision and physically demonstrate them through precise servo motor control.

The system works in two fascinating ways: it can watch you sign letters through a camera and translate them to text, or you can type letters and watch the robotic hand form the corresponding ASL signs. Built around a Raspberry Pi 5 with custom 3D-printed components, this project combines machine learning, computer vision, mechanical engineering, and real-time control systems.

Why I Built This: 2.80% of the USA's population uses ASL as their primary means of communication, while only 0.09% of Canadians can converse in ASL. This massive communication gap inspired me to create a bidirectional translation system that could help both communities communicate more effectively.

What Makes This Special:

  1. Real-time computer vision using MediaPipe and OpenCV
  2. Custom-trained machine learning model with 100% accuracy on test data
  3. Precise servo control with calibration system
  4. Fully 3D-printable mechanical components
  5. Runs entirely on Raspberry Pi 5 without external computing
  6. Open-source design for community improvement

Skill Level: Advanced (requires experience with 3D printing, basic electronics, and command line interfaces)

Time to Complete: 2-3 weeks (including print time and model training)

Supplies

Electronic Components

Main Processing Unit:

  1. 1x Raspberry Pi 5 (8GB RAM recommended) with SD card and power supply
  2. 1x USB Camera Module (OV2643 or similar with good macro focus)

Servo Control System:

  1. 1x PCA9685 16-Channel PWM/Servo Driver Board
  2. 8x MG996R Metal Gear Servo Motors (or similar high-torque servos)
  3. 1x Adjustable bench DC power supply (at least 10A capacity; set its output to 5-6V, within the MG996R's operating range)
  4. Various jumper wires (male-to-female, male-to-male)

Hardware & Assembly:

  1. 2x Spools PLA filament for 3D printing (2-3kg total)
  2. 1x Spool nylon fishing line (0.60mm diameter, clear)
  3. Super glue or epoxy adhesive
  4. Small screws and bolts (M3 x 12mm recommended)
  5. Heat shrink tubing or electrical tape

Tools Required

3D Printing:

  1. 3D Printer with at least 200x200x150mm build volume
  2. 3D printing slicer software (PrusaSlicer, Cura, or similar)

Electronics Assembly:

  1. Soldering iron and solder (optional, for permanent connections)
  2. Wire strippers
  3. Small screwdrivers (Phillips and flathead)
  4. Multimeter for troubleshooting

Computer Setup:

  1. Computer for initial Raspberry Pi setup
  2. Monitor, keyboard, and mouse for Pi configuration

Software Dependencies

Pre-installed on Raspberry Pi:

  1. Raspberry Pi OS (Bookworm recommended)
  2. Python 3.12+
  3. Git for code repository access

Python Libraries (will be installed during setup):

  1. OpenCV 4.7.0.68
  2. MediaPipe 0.10.0+
  3. NumPy
  4. scikit-learn 1.3.0+
  5. pickle (built-in)
  6. threading (built-in)

C++ Dependencies:

  1. g++ compiler
  2. I2C development libraries
  3. Standard C++ libraries

3D Print the Mechanical Components

The robotic hand consists of several key printed parts that work together to create realistic finger movements.

Download the 3D Models: Access the complete STL file collection from the project repository.

Parts List to Print:

  1. 5x Finger assemblies (thumb, index, middle, ring, pinky)
  2. 1x Palm/forearm structure
  3. 1x Rotating base platform
  4. 1x Electronics housing/mount
  5. Various small connector pieces and brackets

Print Settings:

  1. Layer Height: 0.2mm
  2. Infill: 25-30%
  3. Print Speed: 50mm/s
  4. Support: Yes (for overhangs > 45°)
  5. Build Plate Adhesion: Brim recommended

Important Printing Notes:

  1. Print fingers with the tips pointing up for best surface finish
  2. The palm piece will require support material
  3. Print the base in two parts if your print bed is smaller than 250mm
  4. Small holes may need drilling after printing for perfect servo fit

Estimated Print Time: 4-6 days total across all parts

Post-Processing:

  1. Remove all support material carefully
  2. Test-fit all moving parts before final assembly
  3. Drill out servo mounting holes to exact diameter if needed
  4. Sand any rough surfaces that will contact fishing line


Assemble the Mechanical Hand Structure

Now we'll build the physical hand mechanism that will bring your ASL signs to life.

Threading the Fishing Line:

  1. Cut fishing line into 5 pieces, each approximately 50cm long
  2. Thread one line through each finger, starting from the fingertip
  3. The line should run through small guides in each finger joint
  4. Secure the fingertip end with a small knot and super glue

Attaching Servos to Fingers:

  1. Mount each servo in its designated position in the palm
  2. Connect the fishing line to the servo horn (the rotating attachment)
  3. Ensure the line has slight tension when the servo is at 90°
  4. Test each finger's range of motion before finalizing

Installing the Palm Assembly:

  1. Insert all five servos into their mounting positions
  2. Secure with the provided screws
  3. Route all servo wires toward the back of the palm
  4. Attach the palm to the forearm structure

Base Platform Assembly:

  1. Mount the wrist rotation servo in the base
  2. Connect the forearm to the rotating platform
  3. Ensure smooth rotation without binding
  4. Install the electronics housing behind the hand

Testing Mechanical Movement: Before proceeding to electronics, manually test each finger:

  1. Full extension (straight)
  2. Full flexion (bent)
  3. Smooth motion without catching
  4. No excessive friction in the fishing line guides


Electronics and Wiring Setup

(Image: Raspberry Pi GPIO pinout diagram)

Time to bring intelligence to your mechanical creation.

PCA9685 Servo Driver Setup: The PCA9685 board controls all servos through I2C communication with the Raspberry Pi.

Power to the PCA9685:

  1. Connect the bench power supply (set to servo voltage, 5-6V) to the servo power terminals
  2. Connect 5V from the Raspberry Pi to VCC on the PCA9685
  3. Connect GND from both the power supply and the Pi to GND on the PCA9685

I2C Communication Wiring:

  1. Pi Pin 3 (SDA) → PCA9685 SDA
  2. Pi Pin 5 (SCL) → PCA9685 SCL
  3. Pi GND → PCA9685 GND

Servo Connections (PCA9685 channels 0-7):

  1. Channel 0: Thumb
  2. Channel 1: Index finger
  3. Channel 2: Middle finger
  4. Channel 3: Ring finger
  5. Channel 4: Pinky
  6. Channels 5-7: Available for future expansion
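Before relying on the project's own programs, it can help to confirm the wiring by driving a single channel directly. The following is a minimal, hypothetical smoke test (not part of the project repository), assuming the third-party `smbus2` library, I2C bus 1, and the board's default 0x40 address; the register addresses and prescale formula come from the PCA9685 datasheet, and the channel map matches the assignments above.

```python
# Hypothetical PCA9685 smoke test -- not from the project repo.
# Assumes: I2C bus 1, address 0x40, servos wired per the channel map above.

MODE1 = 0x00        # mode register (sleep / restart / auto-increment bits)
PRESCALE = 0xFE     # PWM frequency prescaler
LED0_ON_L = 0x06    # first channel's registers; each channel uses 4 bytes

CHANNELS = {"thumb": 0, "index": 1, "middle": 2, "ring": 3, "pinky": 4}

def prescale_for(freq_hz, osc_hz=25_000_000):
    """Datasheet formula: prescale = round(osc / (4096 * freq)) - 1."""
    return round(osc_hz / (4096 * freq_hz)) - 1

def us_to_ticks(pulse_us, freq_hz=50):
    """Convert a servo pulse width (microseconds) to 12-bit PWM ticks."""
    period_us = 1_000_000 / freq_hz
    return int(round(pulse_us / period_us * 4096))

def set_pulse(bus, channel, pulse_us, addr=0x40):
    """Set one channel to switch on at tick 0 and off after the pulse width."""
    ticks = us_to_ticks(pulse_us)
    base = LED0_ON_L + 4 * channel
    bus.write_i2c_block_data(addr, base, [0, 0, ticks & 0xFF, ticks >> 8])

RUN_ON_HARDWARE = False  # set True only on the Pi with the board wired up
if RUN_ON_HARDWARE:
    import time
    from smbus2 import SMBus  # third-party: pip3 install smbus2
    with SMBus(1) as bus:
        bus.write_byte_data(0x40, MODE1, 0x10)              # sleep to set prescale
        bus.write_byte_data(0x40, PRESCALE, prescale_for(50))
        bus.write_byte_data(0x40, MODE1, 0x00)              # wake up
        time.sleep(0.005)
        bus.write_byte_data(0x40, MODE1, 0xA0)              # restart + auto-increment
        set_pulse(bus, CHANNELS["index"], 1500)             # index finger to mid-travel
```

At 50 Hz, a 1.5 ms pulse is roughly mid-travel for most hobby servos; check your servo's limits before sweeping toward the extremes.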

Camera Connection:

  1. Connect USB camera to any available USB port on the Pi
  2. Position camera to clearly see the hand gesture area
  3. Ensure good lighting for computer vision accuracy

Power Distribution:

  1. Use the bench power supply for servo power (they draw significant current)
  2. Power the Raspberry Pi separately with its dedicated power supply
  3. Never attempt to power servos directly from the Pi

Safety Check: Before powering on:

  1. Verify all connections are secure
  2. Check for any short circuits with a multimeter
  3. Ensure servo wires aren't pinched or damaged
  4. Confirm power supply voltage settings are correct

Raspberry Pi Software Installation

Let's install the brain of the operation - the software that makes everything work.

Initial Pi Setup:

  1. Flash Raspberry Pi OS to your SD card using the Raspberry Pi Imager
  2. Enable SSH and I2C in raspi-config if you plan to work remotely
  3. Update your system: sudo apt update && sudo apt upgrade

Clone the Project Repository:

bash

cd ~
git clone https://github.com/your-username/robotic-asl-hand.git
cd robotic-asl-hand

Install Python Dependencies:

bash

pip3 install opencv-python==4.7.0.68
pip3 install "mediapipe>=0.10.0"
pip3 install "scikit-learn>=1.3.0"
pip3 install numpy
pip3 install pynput

Install C++ Build Dependencies:

bash

sudo apt install build-essential
sudo apt install libi2c-dev
sudo apt install git

Enable I2C Interface:

bash

sudo raspi-config
# Navigate to Interface Options > I2C > Enable
sudo reboot

Verify I2C Connection: After reboot, check that your PCA9685 is detected:

bash

i2cdetect -y 1
# You should see address 0x40 in the output grid

Compile C++ Control Programs:

bash

cd ~/robotic-asl-hand
g++ -o sign_language_hand sign_language_hand.cpp -std=c++17
g++ -o calibration_system calibration_system.cpp -std=c++17
g++ -o hand_mirror hand_mirror.cpp -std=c++17


Training the Computer Vision Model

This is where the magic happens - teaching your computer to recognize ASL letters.

Understanding the Training Process: The system uses MediaPipe to track hand landmarks, then trains a Random Forest Classifier to recognize letter patterns from these landmarks.

Collecting Training Data:

1. Take Photos for Dataset:

bash

python3 take_letter_pics_100.py

  1. This will prompt you to show each letter from A to Y (excluding J; J and Z require motion)
  2. Hold each letter steady while the program captures 100 images
  3. Ensure good lighting and clear hand positioning
  4. Total images: 2,400 (24 letters × 100 images each)

2. Process the Images:

bash

python3 create_dataset.py

This program:

  1. Analyzes each image with MediaPipe
  2. Extracts hand landmark coordinates
  3. Normalizes the data for consistent training
  4. Saves the processed data to 'data.pickle'

3. Train the Classification Model:

bash

python3 train_classifier.py

This creates the machine learning model:

  1. Splits data into training (80%) and testing (20%) sets
  2. Trains a Random Forest Classifier
  3. Reports accuracy score (should be close to 100%)
  4. Saves trained model to 'model.p'
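If you're curious what this step looks like in code, the sketch below reproduces the pipeline described above (80/20 split, Random Forest, accuracy report) on synthetic stand-in data, since the real 'data.pickle' only exists after the previous step. The shapes mirror the dataset (24 letter classes, 42 features from 21 landmarks × 2 coordinates), but everything else is illustrative rather than the project's actual script.

```python
# Illustrative sketch of the training step, on synthetic stand-in data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# 24 letter classes, 100 samples each, 42 features (21 landmarks x 2 coords).
# Each class is clustered around a different mean so the toy data is separable.
X = np.vstack([rng.normal(loc=c, scale=0.1, size=(100, 42)) for c in range(24)])
y = np.repeat(np.arange(24), 100)

# 80% training / 20% testing, stratified so every letter appears in both sets
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, shuffle=True, stratify=y, random_state=0)

model = RandomForestClassifier(random_state=0)
model.fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
print(f"{acc * 100:.1f}% of samples were classified correctly!")
```

The real script finishes by pickling the trained model to 'model.p' so the live system can load it later.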

Expected Output:

100.0% of samples were classified correctly!
Process finished with exit code 0

Troubleshooting Training Issues:

  1. If accuracy is low (<95%), retake photos with better lighting
  2. Ensure hand is fully visible in all training images
  3. Check that MediaPipe is detecting all 21 hand landmarks
  4. Consider expanding dataset with more varied hand positions
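To see what "normalizing the data" means in practice, and to sanity-check landmark detection, here is a hedged sketch: the min-subtraction normalization is a common approach for MediaPipe landmark features (the project's exact preprocessing may differ), and the MediaPipe portion is gated behind a flag since it needs the library and an image ('hand.jpg' is a placeholder filename).

```python
def normalize_landmarks(points):
    """Shift (x, y) landmark coords so the minimum x and y become 0.

    This makes the feature vector independent of where the hand sits
    in the camera frame, which helps the model generalize.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    min_x, min_y = min(xs), min(ys)
    out = []
    for x, y in points:
        out.extend([x - min_x, y - min_y])
    return out  # 42 features for 21 landmarks

RUN_WITH_MEDIAPIPE = False  # set True on the Pi once mediapipe is installed
if RUN_WITH_MEDIAPIPE:
    import cv2
    import mediapipe as mp
    hands = mp.solutions.hands.Hands(static_image_mode=True)
    img = cv2.cvtColor(cv2.imread("hand.jpg"), cv2.COLOR_BGR2RGB)
    result = hands.process(img)
    if not result.multi_hand_landmarks:
        print("No hand detected - check lighting and framing")
    else:
        lm = result.multi_hand_landmarks[0].landmark
        assert len(lm) == 21, "expected all 21 MediaPipe hand landmarks"
        features = normalize_landmarks([(p.x, p.y) for p in lm])
```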


Servo Calibration System

Every servo and mechanical assembly is slightly different, so we need to calibrate for precise movements.

Why Calibration is Critical: Each ASL letter requires specific finger positions. Without calibration, your letters might be unclear or incorrect.

Running the Calibration Program:

bash

sudo ./calibration_system A

Calibration Interface: When calibrating, you'll see:

Calibrating letter A - Finger 0 (Position: 263)
Controls:
w/s - Increase/decrease position by 5
a/d - Increase/decrease position by 25
n - Next finger
p - Previous finger
r - Reset to straight position
v - View current letter
q - Save and quit
Command:

Calibration Process for Each Letter:

  1. Start with finger 0 (thumb)
  2. Use w/s keys for fine adjustments
  3. Use a/d keys for large adjustments
  4. Press 'v' to see the complete letter formation
  5. Move to next finger with 'n'
  6. Repeat for all 5 fingers
  7. Save with 'q' when satisfied

Calibration Tips:

  1. Reference actual ASL alphabet charts for accuracy
  2. Test each letter formation multiple times
  3. Ensure fingers don't collide or bind
  4. Save frequently during long calibration sessions

The Calibration File: Calibrated positions are saved to 'calibration.conf':

A,263,150,150,150,150
B,150,375,375,375,375
C,263,263,263,263,263
...
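The format is simple enough that your own tools can read it too. A small parsing sketch (not part of the repository), matching the comma-separated layout shown above:

```python
def load_calibration(path="calibration.conf"):
    """Read 'letter,pos0,...,pos4' lines into a letter -> positions map."""
    positions = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            letter, *values = line.split(",")
            positions[letter] = [int(v) for v in values]  # thumb .. pinky
    return positions
```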

Testing Calibrated Letters:

bash

sudo ./sign_language_hand A
sudo ./sign_language_hand B

Real-Time Computer Vision System

Now let's put it all together with the live computer vision system.

Understanding the Complete System: The main program runs three concurrent threads:

  1. Camera Thread: Captures video and processes hand landmarks
  2. Input Thread: Handles keyboard text input for letter spelling
  3. Prediction Thread: Manages servo movements and timing
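The three-thread layout can be sketched as a queue-based skeleton. Everything camera- or servo-specific is replaced with hypothetical stand-ins here; the point is that both producers feed a single consumer, so servo commands never overlap.

```python
# Skeleton of the three-thread design: two producers, one servo consumer.
import queue
import threading

commands = queue.Queue()

def camera_thread():
    # real version: grab frames, run MediaPipe + classifier per frame
    for letter in "HI":                  # stand-in: pretend we recognized H, I
        commands.put(letter)

def input_thread():
    # real version: read typed words from stdin and enqueue each letter
    for letter in "OK":                  # stand-in: pretend the user typed "ok"
        commands.put(letter)

def prediction_thread(seen):
    # real version: invoke the C++ servo program once per dequeued letter
    while True:
        letter = commands.get()
        if letter is None:               # sentinel: shut down cleanly
            break
        seen.append(letter)

seen = []
producers = [threading.Thread(target=camera_thread),
             threading.Thread(target=input_thread)]
consumer = threading.Thread(target=prediction_thread, args=(seen,))
consumer.start()
for p in producers:
    p.start()
for p in producers:
    p.join()
commands.put(None)                       # tell the consumer to stop
consumer.join()
print("".join(sorted(seen)))             # prints HIKO
```

Because only the consumer thread ever touches the servos, there is no need for locking around the hardware itself.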

Running the Complete System:

bash

python3 fixed_inference_classifier.py

System Interface: You'll see:

Starting Sign Language Interpreter
You can:
1. Show hand signs to the camera
2. Type words to spell out
3. Press ESC to quit
4. Type 'q' and press Enter to quit

Initializing camera...

====================================================
TYPE HERE AND PRESS ENTER TO SPELL A WORD
Type 'q' to quit
====================================================
>>

Using the System:

For ASL Recognition:

  1. Position your hand clearly in the camera frame
  2. Form ASL letters steadily
  3. The system detects letters and displays them on screen
  4. After a 2-second cooldown, the robotic hand will mirror the detected letter
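The 2-second cooldown can be expressed as a small gate object. This is an illustrative sketch rather than the project's actual implementation; the clock is injectable so the logic can be tested without waiting.

```python
import time

class CooldownGate:
    """Only lets an event through if the cooldown has elapsed since the last one."""

    def __init__(self, cooldown_s=2.0, clock=time.monotonic):
        self.cooldown_s = cooldown_s
        self.clock = clock
        self.last_fired = None

    def ready(self):
        now = self.clock()
        if self.last_fired is None or now - self.last_fired >= self.cooldown_s:
            self.last_fired = now
            return True
        return False
```

In the camera loop this would look like: `if letter_detected and gate.ready(): send_to_hand(letter)` (with `send_to_hand` standing in for the servo call).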

For Text-to-ASL:

  1. Type any word in the input prompt
  2. Press Enter
  3. Watch as the robotic hand spells out each letter
  4. The system pauses between letters for clarity

Performance Optimization: The system is optimized for the Raspberry Pi 5's capabilities:

  1. Threading prevents camera lag
  2. Frame processing is optimized for real-time performance
  3. Memory usage is managed efficiently
  4. Servo commands are queued to prevent conflicts


Testing and Troubleshooting

Let's ensure everything works perfectly and address common issues.

System Testing Checklist:

Mechanical Tests:

  1. All fingers move smoothly through full range
  2. No binding or catching in the fishing line
  3. Servo horns are securely attached
  4. Base rotation works smoothly
  5. No excessive noise from servos

Electronic Tests:

  1. I2C communication working (i2cdetect shows 0x40)
  2. All servos respond to individual commands
  3. Power supply provides stable voltage under load
  4. Camera captures clear, well-lit images
  5. No overheating of components

Software Tests:

  1. Computer vision detects hand landmarks accurately
  2. Machine learning model loads without errors
  3. Calibration system saves and loads correctly
  4. Real-time system runs without crashes
  5. Threading performance is acceptable

Common Issues and Solutions:

"Failed to open I2C device"

bash

sudo raspi-config
# Enable I2C interface and reboot

Servo movement is erratic:

  1. Check power supply capacity (needs 10A minimum)
  2. Verify all ground connections
  3. Ensure servo wires aren't damaged

Computer vision accuracy is poor:

  1. Improve lighting conditions
  2. Clean camera lens
  3. Retrain model with more diverse data
  4. Check MediaPipe landmark detection

System runs slowly:

  1. Close unnecessary programs
  2. Reduce camera resolution in code
  3. Optimize frame processing rate
  4. Ensure adequate cooling for Pi

Fingers don't form letters correctly:

  1. Re-run calibration for affected letters
  2. Check fishing line tension
  3. Verify servo mounting alignment
  4. Compare with ASL reference charts

Advanced Features and Customization

Take your project to the next level with these enhancements.

Hand Mirroring Mode: For real-time mimicry, use the hand mirror program:

bash

sudo ./hand_mirror

This makes the robotic hand copy your movements in real-time rather than recognizing specific letters.
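Conceptually, mirroring maps a per-finger "curl" estimate from the landmarks onto each servo's calibrated range. The sketch below is a simplified illustration of that idea (the real program's heuristics may differ); the straight/bent positions echo the calibration-file values shown earlier.

```python
def curl_from_distance(tip_to_wrist, knuckle_to_wrist):
    """Rough curl estimate in [0, 1]: a bent finger brings its tip near the wrist."""
    ratio = tip_to_wrist / knuckle_to_wrist if knuckle_to_wrist else 0.0
    return max(0.0, min(1.0, 1.0 - ratio))

def curl_to_position(curl, straight=150, bent=375):
    """Linearly interpolate between calibrated straight and bent servo positions."""
    return round(straight + curl * (bent - straight))
```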

Calibration Backup and Restore: Save your calibration settings:

bash

cp calibration.conf calibration_backup.conf

Adding New Letters or Gestures:

  1. Modify the letter configurations in the C++ code
  2. Add corresponding entries to the Python label dictionary
  3. Collect training data for new gestures
  4. Retrain the model with expanded dataset

Performance Monitoring: Add system monitoring to track:

  1. Frame processing rate
  2. Servo response time
  3. Memory usage
  4. CPU temperature
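A minimal sketch of two of these monitors: a frame-rate counter and the CPU temperature read from sysfs (the thermal_zone0 path is standard on Raspberry Pi OS). Servo response time and memory usage could be tracked with the same pattern.

```python
import time

class FPSCounter:
    """Call tick() once per processed frame; fps() gives the running average."""

    def __init__(self):
        self.count = 0
        self.start = time.monotonic()

    def tick(self):
        self.count += 1

    def fps(self):
        elapsed = time.monotonic() - self.start
        return self.count / elapsed if elapsed > 0 else 0.0

def cpu_temp_c(path="/sys/class/thermal/thermal_zone0/temp"):
    """CPU temperature in Celsius, or None off-Pi (sysfs reports millidegrees)."""
    try:
        with open(path) as f:
            return int(f.read().strip()) / 1000.0
    except OSError:
        return None
```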

Custom Vocabulary: Create preset word lists for common phrases:

python

common_phrases = {
    "hello": "HELLO",
    "thank you": "THANK YOU",
    "please": "PLEASE",
}

Integration Possibilities:

  1. Connect to home automation systems
  2. Add speech synthesis for audio output
  3. Create web interface for remote control
  4. Add Bluetooth connectivity for mobile apps

Maintenance and Care

Keep your robotic hand running smoothly for years to come.

Regular Maintenance Tasks:

Weekly:

  1. Check fishing line for wear or fraying
  2. Clean camera lens for optimal vision
  3. Verify servo mounting screws are tight

Monthly:

  1. Backup calibration settings
  2. Update software packages
  3. Check power supply connections
  4. Clean 3D printed components

As Needed:

  1. Replace fishing line when worn
  2. Recalibrate after any mechanical adjustments
  3. Update machine learning model with new data
  4. Replace servos if they become noisy or weak

Upgrade Path: Consider these improvements for future versions:

  1. Higher resolution camera for better recognition
  2. Additional servos for wrist and finger joint articulation
  3. Pressure sensors for haptic feedback
  4. Wireless communication capabilities
  5. Voice command integration

Storage and Transport:

  1. Power down system properly before moving
  2. Protect camera from impacts
  3. Secure loose wires during transport
  4. Store in a dust-free environment

Conclusion and Next Steps

Congratulations! You've built a sophisticated robotic system that bridges the communication gap between hearing and deaf communities. Your robotic hand can now:

  1. Recognize ASL letters through computer vision with high accuracy
  2. Physically demonstrate ASL letters through precise servo control
  3. Operate in real-time on a single Raspberry Pi 5
  4. Be calibrated for perfect letter formation
  5. Process both live gestures and typed text input

What You've Learned:

  1. Advanced 3D printing and mechanical assembly
  2. Computer vision and machine learning implementation
  3. Real-time multi-threaded programming
  4. I2C communication and servo control
  5. System integration and troubleshooting

Sharing Your Project:

  1. Document your build process with photos and videos
  2. Share your calibration improvements with the community
  3. Contribute code enhancements to the open-source repository
  4. Help others troubleshoot their builds

Future Possibilities: This project opens doors to many exciting developments:

  1. Expanding to full ASL words and phrases
  2. Creating bilateral communication systems
  3. Developing educational tools for ASL learning
  4. Building assistive technology for daily communication

Community Impact: Your robotic hand represents more than just a technical achievement - it's a bridge toward more inclusive communication. By making this technology accessible and open-source, you're contributing to a world where communication barriers can be overcome through innovation and empathy.

Resources for Continued Learning:

  1. ASL learning resources for accuracy validation
  2. Computer vision and MediaPipe documentation
  3. Advanced servo control techniques
  4. Machine learning model optimization
  5. 3D printing design improvements

Thank you for building technology that makes the world more accessible for everyone!