Animatronic Raspberry Pi AI Chatbot With Blinking Eyes & Talking LED Mouth

by thediylife in Circuits > Raspberry Pi



AI Chatbot With Animatronic Eyes and Mouth.jpeg
I Built an AI Chatbot That Talks, Blinks, and Looks Around!

Want to build an AI chatbot that actually feels alive? In this Instructable, I’ll show you how I created an animatronic chatbot with moving eyes, blinking eyelids, and a Neopixel mouth that lights up in sync with speech, all powered by a Raspberry Pi 5.

This guide covers the overall build process so you can follow along. For more detailed information, including the code and 3D print file, please visit my blog and the linked YouTube video.

I’ve kept this Instructable focused on the essentials so it's easy to follow.

Supplies

  1. Raspberry Pi 5 - Buy Here
  2. 32GB Sandisk MicroSD Card - Buy Here
  3. PCA9685 Servo Driver Board - Buy Here
  4. 6 Micro Servos - Buy Here
  5. 2 Universal Joints - Buy Here
  6. 4 RC Pushrods - Buy Here
  7. M2 Screws & Nuts - Buy Here
  8. 8-LED Neopixel Bar Light - Buy Here
  9. Breadboard Jumpers - Buy Here
  10. 5V Power Supply For Servos - Buy Here
  11. USB Omnidirectional Microphone - Buy Here
  12. USB Speaker - Buy Here

Some of the above parts are affiliate links. By purchasing products through the above links, you’ll be supporting my projects, at no additional cost to you.

Designing the Animatronic Eyes

Design of Animatronic AI Chatbot.jpeg

The project started from an older eye mechanism I designed a few years ago, which I updated to fix a few usability issues.

With this design:

  1. Each eyeball sits on a small universal joint
  2. Servos mount with screws instead of glue (much easier to adjust)
  3. Pushrods link the horizontal and vertical axes
  4. A third servo per eye controls a pair of eyelids
  5. Eyelids pivot on small side screws for smooth motion

This gives fully independent control of both eyes and both eyelids, so it can blink, wink, look around, or even go cross-eyed.

Printing & Assembling the Stand

3D Printed Parts For Chatbot.jpeg
All Servos Installed.jpeg
RC Pushrods Installed.jpeg
Eyelid Assembly Connected To Servo.jpeg
Neopixel LED Mouth.jpeg
Screwing Eye Assembly To Stand.jpeg

I printed all of the components out in PLA, using black for the stand and mechanism components, white for the eyeballs and grey for the eyelids.

The stand holds:

  1. The animatronic eye module, secured with M2 screws
  2. The Neopixel mouth, including a channel to hide the cable
  3. A small rear platform for the Pi and PCA9685 servo control board

The Neopixel bar screws into the mouth section, and a white printed cover acts as an LED diffuser.

Wiring Everything Up

PCA9685 Servo Control Board With Servos Plugged In.jpeg
PCA9685 Servo Control Board.jpeg
Raspberry Pi 5 With All GPIO Connections Made.jpeg

All six servos plug into a PCA9685 board. Using this board, rather than driving the servos from the Pi directly, has several benefits:

  1. It handles the 5V servo power requirements
  2. It generates the PWM signals, removing workload from the Pi
  3. The Pi only needs to send simple I2C position commands

The Pi mounts below the servo board. This board plugs into the Pi's I2C pins, and the Neopixel bar plugs into a single GPIO pin.
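The Pi-side code for this arrangement can be very short. Below is a hedged sketch using Adafruit's ServoKit library for the PCA9685; the channel assignments and travel limits are my own placeholder assumptions, not taken from the original wiring.

```python
# Sketch: positioning one servo through the PCA9685 over I2C.
# Assumes Adafruit's driver (pip install adafruit-circuitpython-servokit).
# Channel numbers and limits below are illustrative placeholders.

def clamp_angle(angle, lo=20, hi=160):
    """Keep requested angles inside the mechanism's safe travel range."""
    return max(lo, min(hi, angle))

# Example channel assignments (match these to your own wiring)
LEFT_EYE_X, RIGHT_EYE_X = 0, 1
LEFT_EYE_Y, RIGHT_EYE_Y = 2, 3
LEFT_LID, RIGHT_LID = 4, 5

def set_servo(kit, channel, angle):
    """Send a simple I2C position command; the PCA9685 generates the PWM."""
    kit.servo[channel].angle = clamp_angle(angle)

if __name__ == "__main__":
    from adafruit_servokit import ServoKit
    kit = ServoKit(channels=16)      # the PCA9685 provides 16 PWM channels
    set_servo(kit, LEFT_EYE_X, 90)   # centre the left eye horizontally
```

Clamping every command is a cheap safeguard against a script driving a servo past the mechanism's mechanical limits.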

Bringing It to Life With a Python Script

AI Chatbot Terminal Application.jpeg
Animatronic Eyes Moving.jpeg

Before adding the AI Chatbot functionality, I wrote a short Python script to test eye movement. This script moves the eyes around and blinks them randomly. It has adjustable variables for speeds, travel limits and delay timers.

This helps make sure your mechanical assembly works properly before you connect the chatbot.

You can grab the test Python code from my blog.
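As a rough illustration of what such a test loop looks like, here is a minimal sketch; the channel numbers, travel limits, and timings are placeholders, and the tuned version is the one on the blog.

```python
# Rough sketch of a random look-and-blink test loop.
# Channel numbers, travel limits, and timings are placeholder assumptions.
import random
import time

X_MIN, X_MAX = 60, 120   # horizontal travel limits (degrees)
Y_MIN, Y_MAX = 70, 110   # vertical travel limits
BLINK_CHANCE = 0.2       # probability of blinking after each move

def next_gaze(rng):
    """Pick a random gaze target inside the travel limits."""
    return rng.randint(X_MIN, X_MAX), rng.randint(Y_MIN, Y_MAX)

def run(kit, moves=20, delay=0.8, rng=random):
    """Move the eyes to random targets, occasionally blinking."""
    for _ in range(moves):
        x, y = next_gaze(rng)
        for ch in (0, 1):              # horizontal servos, both eyes
            kit.servo[ch].angle = x
        for ch in (2, 3):              # vertical servos, both eyes
            kit.servo[ch].angle = y
        if rng.random() < BLINK_CHANCE:
            for ch in (4, 5):          # close both eyelids
                kit.servo[ch].angle = 150
            time.sleep(0.1)
            for ch in (4, 5):          # reopen them
                kit.servo[ch].angle = 60
        time.sleep(delay)
```

Keeping the limits and delays as named constants at the top makes it quick to dial in a build whose servo horns sit at slightly different angles.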

Developing the AI Chatbot

AI Chatbot Complete and Running In Terminal.jpeg
Pi 5 Based Animatronic AI Chatbot.jpeg

I approached the chatbot logic in three stages, getting each stage working before moving on to the next.

1. Conversation Engine (OpenAI API)

A simple text-based chatbot running in the terminal.

You can easily tune its personality, tone, and behaviour.
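A terminal chatbot of this kind can be sketched in a few lines. This is a hedged example, not the blog's code: the model name and persona text are illustrative, and it assumes the `openai` package with an OPENAI_API_KEY environment variable set.

```python
# Hedged sketch of a terminal chatbot using the OpenAI chat API.
# Model name and persona are illustrative assumptions.

PERSONA = "You are a grumpy, sarcastic animatronic robot. Keep replies short."

def build_messages(history, user_text, persona=PERSONA):
    """The system prompt sets the personality; history keeps context."""
    return ([{"role": "system", "content": persona}]
            + history
            + [{"role": "user", "content": user_text}])

def main():
    from openai import OpenAI
    client = OpenAI()
    history = []
    while True:
        user_text = input("You: ")
        reply = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=build_messages(history, user_text),
        ).choices[0].message.content
        print("Bot:", reply)
        history += [{"role": "user", "content": user_text},
                    {"role": "assistant", "content": reply}]

if __name__ == "__main__":
    main()
```

Changing the persona is just a matter of editing the system prompt string, which is what makes the later personality experiments so quick to try.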

2. Text-to-Speech

I then added voice output with adjustable style, accent, and expressiveness.

3. Speech Recognition

Lastly, I added voice input. A small script listens for a prompt, converts it to text, sends it to the chatbot, and plays the audio reply.

With all three of these stages complete, the system can now:

Listen → Think → Talk
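One way the three stages could be wired together is sketched below. It assumes the `speech_recognition` package for voice input, and `ask_chatbot` and `speak` are stand-ins for whatever stage-1 and stage-2 functions you end up with.

```python
# Hedged sketch of a Listen → Think → Talk loop.
# ask_chatbot and speak are placeholders for the earlier stages.

def should_respond(text):
    """Skip empty or whitespace-only transcriptions."""
    return bool(text and text.strip())

def listen_once(recognizer, mic):
    """Record one utterance and return its transcription (or "")."""
    import speech_recognition as sr
    with mic as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)
    try:
        return recognizer.recognize_google(audio)  # swap in your preferred STT
    except sr.UnknownValueError:
        return ""                                  # nothing intelligible heard

def run(ask_chatbot, speak):
    import speech_recognition as sr
    recognizer, mic = sr.Recognizer(), sr.Microphone()
    while True:
        text = listen_once(recognizer, mic)        # Listen
        if should_respond(text):
            speak(ask_chatbot(text))               # Think, then Talk
```

Filtering out empty transcriptions keeps the bot from sending blank prompts to the API whenever the microphone picks up background noise.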

There’s a short delay of 1–3 seconds between the time you stop talking and the reply, depending on the complexity of the request and the length of the response, but it’s still faster and more natural than running a model locally.

Adding the NeoPixel Mouth

AI Chatbot Running.jpeg

As a last step to complete the chatbot, I added an 8-LED Neopixel bar, which lights up dynamically with the chatbot’s voice:

Soft sounds light up just the middle two LEDs, while loud or expressive sounds light up the full bar.

It gives the chatbot a fun “talking” effect that pairs perfectly with the animatronic eyes.
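As an illustration of that middle-outward mapping, here is a small sketch; the thresholds are my guesses, and the actual Neopixel driver setup (e.g. the CircuitPython `neopixel` library on the data GPIO pin) is omitted.

```python
# Sketch: map a 0.0–1.0 loudness level to lit LEDs, growing outward from
# the middle pair toward the full 8-LED bar. Thresholds are assumptions.

NUM_LEDS = 8

def lit_indices(level, num_leds=NUM_LEDS):
    """Return the LED indices to light for a given loudness level."""
    if level <= 0:
        return []
    # At least one pair (the middle two), up to num_leds // 2 pairs
    pairs = max(1, min(num_leds // 2, round(level * num_leds / 2)))
    start = (num_leds - 2 * pairs) // 2
    return list(range(start, start + 2 * pairs))
```

A driver loop would then sample the current playback amplitude and light those indices each frame, which is what produces the bar's pulsing "talking" effect.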

Adjusting Its Personality

I also tried out giving the chatbot different personalities, including:

  1. A mad scientist
  2. A grumpy, sarcastic chatbot
  3. A calmer, more conversational persona

The OpenAI settings make it easy to tailor the voice and mood.

Final Thoughts

This build has been a fun mix of animatronics, Python, AI, and 3D printing. Watching the eyes blink and the mouth light up while the chatbot talks back is surprisingly entertaining. It feels more like a character than a gadget.

This Instructable gives you the full overview, but the detailed schematics, complete source code, STL files, and step-by-step video are on my blog:

👉 AI Chatbot Blog Post

👉 YouTube Video

If you want to dive deeper or build one exactly like mine, definitely check those out.