Animatronic Raspberry Pi AI Chatbot With Blinking Eyes & Talking LED Mouth
by thediylife in Circuits > Raspberry Pi
Want to build an AI chatbot that actually feels alive? In this Instructable, I’ll show you how I created an animatronic chatbot with moving eyes, blinking eyelids, and a Neopixel mouth that lights up in sync with speech, all powered by a Raspberry Pi 5.
This guide covers the overall build process so you can follow along. For more detailed information, including the code and 3D print file, please visit my blog and the linked YouTube video.
I’ve kept this Instructable focused on the essentials so it's easy to follow.
Supplies
- Raspberry Pi 5 - Buy Here
- 32GB Sandisk MicroSD Card - Buy Here
- PCA9685 Servo Driver Board - Buy Here
- 6 Micro Servos - Buy Here
- 2 Universal Joints - Buy Here
- 4 RC Pushrods - Buy Here
- M2 Screws & Nuts - Buy Here
- 8 LED Neopixel Bar Light - Buy Here
- Breadboard Jumpers - Buy Here
- 5V Power Supply For Servos - Buy Here
- USB Omnidirectional Microphone - Buy Here
- USB Speaker - Buy Here
Some of the above parts are affiliate links. By purchasing products through the above links, you’ll be supporting my projects, at no additional cost to you.
Designing the Animatronic Eyes
The project builds on an older eye mechanism I designed a few years ago, with a few of its usability issues fixed along the way.
With this design:
- Each eyeball sits on a small universal joint
- Servos mount with screws instead of glue (much easier to adjust)
- Pushrods link the horizontal and vertical axes
- A third servo per eye controls a pair of eyelids
- Eyelids pivot on small side screws for smooth motion
This gives fully independent control of both eyes and both eyelids, so it can blink, wink, look around, or even go cross-eyed.
Printing & Assembling the Stand
I printed all of the components out in PLA, using black for the stand and mechanism components, white for the eyeballs and grey for the eyelids.
The stand holds:
- The animatronic eye module, secured with some M2 screws
- The Neopixel mouth, including a channel to hide the cable
- A small rear platform for the Pi and PCA9685 servo control board
The Neopixel bar screws into the mouth section, and a white printed cover acts as an LED diffuser.
Wiring Everything Up
All six servos plug into a PCA9685 board. Using this board, rather than plugging them into the Pi directly, has several benefits:
- It handles the 5V servo power requirements
- It generates the PWM signals, removing workload from the Pi
- The Pi only needs to send simple I2C position commands
The Pi mounts below the servo board. This board plugs into the Pi's I2C pins, and the Neopixel bar plugs into a single GPIO pin.
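Under the hood, those I2C position commands boil down to converting a servo angle into a PWM pulse width, which the PCA9685 expresses as a 12-bit tick count. Here's a rough sketch of that maths; the 500–2500 µs pulse range is an assumption, so check your servos' spec sheet:

```python
# A minimal sketch of the angle-to-PWM conversion the PCA9685 performs.
# Register and I2C handling are omitted; the pulse-width limits below
# are assumptions, not values from this project.

PWM_FREQ_HZ = 50                        # standard analogue servo refresh rate
PERIOD_US = 1_000_000 // PWM_FREQ_HZ    # 20,000 us per cycle
MIN_PULSE_US = 500                      # assumed pulse width at 0 degrees
MAX_PULSE_US = 2500                     # assumed pulse width at 180 degrees

def angle_to_ticks(angle_deg: float) -> int:
    """Convert a servo angle to a 12-bit PCA9685 'off' tick count."""
    angle_deg = max(0.0, min(180.0, angle_deg))  # clamp to the servo's range
    pulse_us = MIN_PULSE_US + (MAX_PULSE_US - MIN_PULSE_US) * angle_deg / 180.0
    return round(pulse_us / PERIOD_US * 4096)    # 4096 ticks per PWM period
```

In practice a library such as Adafruit's ServoKit hides all of this, and you simply assign an angle to a channel, but it helps to know what the board is actually doing.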
Bringing It to Life With a Python Script
Before adding the AI Chatbot functionality, I wrote a short Python script to test eye movement. This script moves the eyes around and blinks them randomly. It has adjustable variables for speeds, travel limits and delay timers.
This helps make sure your mechanical assembly works properly before you connect the chatbot.
You can grab the test Python code from my blog.
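To give a feel for the structure of such a test script, here is a sketch of the kind of adjustable parameters and random-motion helpers it might use. The limits and delay values below are hypothetical placeholders, not the actual values from my code, so tune them for your own build:

```python
import random

# Hypothetical travel limits and timing variables, mirroring the kind of
# adjustable parameters described above.
PAN_MIN, PAN_MAX = 60, 120      # horizontal servo limits (degrees)
TILT_MIN, TILT_MAX = 70, 110    # vertical servo limits (degrees)
BLINK_DELAY_RANGE = (2.0, 6.0)  # seconds between random blinks

def random_gaze() -> tuple[int, int]:
    """Pick a random in-range (pan, tilt) target for the eyes."""
    return (random.randint(PAN_MIN, PAN_MAX),
            random.randint(TILT_MIN, TILT_MAX))

def next_blink_delay() -> float:
    """Pick a random delay before the next blink."""
    return random.uniform(*BLINK_DELAY_RANGE)
```

Clamping every target to the travel limits is what keeps the pushrods from binding while you experiment with speeds and timings.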
Developing the AI Chatbot
I approached the chatbot logic in three stages, getting each stage working before moving on to the next.
1. Conversation Engine (OpenAI API)
A simple text-based chatbot running in the terminal.
You can easily tune its personality, tone, and behaviour.
2. Text-to-Speech
I then added voice output with adjustable style, accent, and expressiveness.
3. Speech Recognition
Lastly, I added voice input. A small script listens for a prompt, converts it to text, sends it to the chatbot, and plays the audio reply.
With all three of these stages complete, the system can now:
Listen → Think → Talk
There’s a short delay (1–3 seconds) from the time you stop talking to the time you receive a response (depending on the complexity of the request and length of the response), but it’s faster and more natural than running a model locally.
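The three stages above can be sketched as one simple loop. In this sketch, `listen()`, `ask_openai()`, and `speak()` are placeholders for the speech-recognition, OpenAI, and text-to-speech stages, and the history-trimming limit is my own assumption to keep the prompt (and API cost) bounded:

```python
# A minimal sketch of the Listen -> Think -> Talk loop. The helper
# functions passed in are placeholders, not this project's actual code.

MAX_TURNS = 10  # assumed cap on remembered exchanges

def trim_history(history: list[dict], max_turns: int = MAX_TURNS) -> list[dict]:
    """Keep the system prompt plus only the most recent exchanges."""
    system, rest = history[:1], history[1:]
    return system + rest[-2 * max_turns:]   # each turn = user + assistant

def chat_loop(listen, ask_openai, speak, system_prompt: str) -> None:
    history = [{"role": "system", "content": system_prompt}]
    while True:
        user_text = listen()                        # speech -> text
        if not user_text:
            continue
        history.append({"role": "user", "content": user_text})
        reply = ask_openai(trim_history(history))   # text -> reply text
        history.append({"role": "assistant", "content": reply})
        speak(reply)                                # reply text -> audio
```

Keeping the full history in OpenAI-style role/content messages is what lets the chatbot remember earlier parts of the conversation.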
Adding the NeoPixel Mouth
As a last step to complete the chatbot, I added an 8-LED Neopixel bar, which lights up dynamically with the chatbot’s voice:
Soft sounds light up only the middle two LEDs, while loud or expressive sounds light up the full bar.
It gives the chatbot a fun “talking” effect that pairs perfectly with the animatronic eyes.
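The centre-outward mapping can be captured in a short pure function. This is a sketch of one way to do it, not my exact code; the 0-to-1 loudness value is an assumption, and in practice you would derive it from the RMS or peak level of the audio being played:

```python
# Sketch: map a 0..1 loudness value to the 8-LED bar, lighting pairs
# of LEDs symmetrically outward from the centre.

NUM_LEDS = 8

def lit_leds(loudness: float) -> list[bool]:
    """Return which LEDs are on, growing outward from the centre."""
    loudness = max(0.0, min(1.0, loudness))
    pairs = round(loudness * (NUM_LEDS // 2))   # 0..4 symmetric pairs
    state = [False] * NUM_LEDS
    centre = NUM_LEDS // 2                      # middle LEDs: indices 3 and 4
    for i in range(pairs):
        state[centre - 1 - i] = True
        state[centre + i] = True
    return state
```

Each returned list can then be written straight to the Neopixel bar on its GPIO pin.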
Adjusting Its Personality
I also tried out giving the chatbot different personalities, including:
- A mad scientist
- A grumpy, sarcastic chatbot
- A calmer, more conversational persona
The OpenAI settings make it easy to tailor the voice and mood.
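Switching personas is essentially just swapping the system prompt sent with each request. The prompt wording below is my own illustration, not taken from the project:

```python
# Hypothetical persona prompts showing how a system message sets the tone.

PERSONAS = {
    "mad_scientist": "You are a wildly enthusiastic mad scientist. "
                     "Speak in excited bursts about your experiments.",
    "grumpy": "You are a grumpy, sarcastic robot who answers reluctantly "
              "but is always technically correct.",
    "calm": "You are a calm, friendly conversational companion. "
            "Keep answers short and warm.",
}

def build_messages(persona: str, user_text: str) -> list[dict]:
    """Build an OpenAI-style message list with the chosen persona."""
    return [
        {"role": "system", "content": PERSONAS[persona]},
        {"role": "user", "content": user_text},
    ]
```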
Final Thoughts
This build has been a fun mix of animatronics, Python, AI, and 3D printing. Watching the eyes blink and the mouth light up while the chatbot talks back is surprisingly entertaining. It feels more like a character than a gadget.
This Instructable gives you the full overview, but the detailed schematics, complete source code, STL files, and step-by-step video are on my blog.
If you want to dive deeper or build one exactly like mine, definitely check those out.