Simple Robin Williams Animatronic

by ssulliv30 in Circuits > Arduino


Simple Robin Williams Animatronic

title.jpg

This animatronic interacts with the user through ultrasonic sensors, a tilt sensor, and a photoresistor. The goal was to create a Robin Williams look-alike that narrates specific quotes when the tilt sensor is triggered, mimics eye movement with the ultrasonic sensors, and starts a game when the photoresistor is activated. The eyes shift left or right depending on which ultrasonic sensor is activated, and the jaw hinges up and down to mimic speaking. Supplies included the basics of what can be found in an ELEGOO UNO Project Super Starter Kit plus wood pieces.

Supporting Structure

supporting_strucutre.png
wood blocks.png

The supporting structure is composed of measured pieces of wood cut from two 48 in garden stakes, plus cardboard. First, the garden stakes need to be cut into measured blocks:

· 10 in block x3

· 8 in block x2

· 11 in block x2

· 8.62 in block x1

These pieces were cut using a miter saw; if one is not available to you, a hand-held wood saw or bandsaw works as well.

Base

base.png

Next, the “base” can be constructed. We used an electric drill and electric screwdriver to create a sturdy foundation; wood glue could be used instead if time permits clamping and curing the pieces overnight. A 7/64 in drill bit is used, and the smallest screwdriver bit should be used so the screws hold securely.

Then, flip the base and drill upward (it is strongly recommended to use two table clamps). The 11 in pieces attach where the red circles are marked on the image above, and the 10 in piece connects them at the top. The wood base is now complete.

The entire structure can be scaled down depending on the size of the face. Once the face is printed, it should be backed with cardboard and attached to the front of the structure with glue.

Joints

main_joint.png
support1.png
support2.png

Two layers of cardboard back the face to create a sturdy base, as seen in the images. A larger rectangle is cut out of the second layer so that the servo motor arm has enough space for its linear motion. An oval (the shape of the eyes) is cut out of the first layer so that no white space shows from the front.

Then the eyes were attached to the servo motors as shown above. Each eye was held in place with tape and a pin pushed through one of the holes in the servo motor arm. Each servo was then taped to a piece of cardboard tall enough that the eye shows through the cutout at the appropriate height. These cardboard columns were glued to the frame to hold them in place.

The mouth was attached to several pieces of cardboard to add more thickness. The servo was then attached to the mouth by creating a hole in the cardboard pieces and using tape and a pin to secure the mouth to the servo motor arm.

Actuators

Three micro servos were used in this project: one to move each eye and one to move the mouth up and down.

Two Arduinos were used, so two laptops were also needed to power the microcontrollers.

Sensors

Two ultrasonic sensors, one tilt sensor, and one photoresistor were used. The images below visualize the wiring for each set of sensors using Tinkercad. It is recommended that the ultrasonic sensors and the photoresistor be programmed on one Arduino, while the tilt sensor is programmed on a separate Arduino. It is also recommended that, while the ultrasonic sensors and tilt sensor can share a breadboard, the photoresistor get its own small breadboard so that it can more easily be fitted into the face. Otherwise, a combination of soldering and longer wires would need to be used.

Ultrasonic Sensor

newservo.png

The two ultrasonic sensors respond to distance, each one driving one of the two micro servos. Both micro servos begin at an origin of 90 degrees. If a sensor detects an object 10 cm away or closer, the eyes move left or right.
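As a reference, a minimal sketch for reading one of the HC-SR04 sensors might look like the following. The trigger and echo pin numbers are assumptions and should match your own wiring.

```cpp
// Minimal sketch, assuming an HC-SR04 wired to the pins below.
const int trigPin = 9;   // placeholder pin
const int echoPin = 10;  // placeholder pin

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

long readDistanceCm() {
  // 10 microsecond trigger pulse
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  // Echo pulse width (microseconds) converted to centimeters
  long duration = pulseIn(echoPin, HIGH);
  return duration * 0.034 / 2;
}

void loop() {
  Serial.println(readDistanceCm());  // print the distance for testing
  delay(100);
}
```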

The holders for the ultrasonic sensors were 3D printed to ensure the sensors stay upright and perpendicular to the surface. The model was found on GrabCAD and adjusted to the size of our project, with some walls removed. The link can be found here: https://grabcad.com/library/hc-sr04-sensor-support-1.

Tilt Sensor

tiltsensor.png

When the tilt sensor is tilted to 90 degrees, the animatronic plays a movie quote. We attached the tilt sensor to a block, and when the top block is tipped over, the audio plays. Remember that the photoresistor is on its own breadboard, taped to the back of the forehead where the photoresistor peeks out.

Photoresistor

phototink.png

The photoresistor reading is mapped to an output value; when that value crosses a set threshold, a game starts on the corresponding laptop.

Programming

When the left ultrasonic sensor detects a distance of 10 cm or less, both micro servos move to 60° to mimic eye tracking. When the right sensor detects the same distance, both servos move to 109°. As the object moves away, the eyes revert to the original position, quickly or slowly depending on the speed of the object.
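A rough sketch of this eye-tracking logic is shown below. The servo and sensor pins and the readDistanceCm() helper are assumptions; the 60°, 90°, and 109° positions are the values described above.

```cpp
#include <Servo.h>

Servo leftEye, rightEye;
// Placeholder pins; adjust to your wiring.
const int leftTrig = 2, leftEcho = 3;
const int rightTrig = 4, rightEcho = 5;

long readDistanceCm(int trigPin, int echoPin) {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  return pulseIn(echoPin, HIGH) * 0.034 / 2;
}

void setup() {
  pinMode(leftTrig, OUTPUT);  pinMode(leftEcho, INPUT);
  pinMode(rightTrig, OUTPUT); pinMode(rightEcho, INPUT);
  leftEye.attach(6);
  rightEye.attach(7);
  leftEye.write(90);   // both eyes start at the 90 degree origin
  rightEye.write(90);
}

void loop() {
  long leftDist  = readDistanceCm(leftTrig, leftEcho);
  long rightDist = readDistanceCm(rightTrig, rightEcho);

  int target = 90;               // default: look straight ahead
  if (leftDist > 0 && leftDist <= 10) {
    target = 60;                 // object on the left: both eyes look left
  } else if (rightDist > 0 && rightDist <= 10) {
    target = 109;                // object on the right: both eyes look right
  }

  leftEye.write(target);
  rightEye.write(target);
  delay(50);
}
```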

When the tilt sensor detects an angle of 90 degrees, a condition becomes true and the quote plays. Depending on the hysteresis, the tilt sensor can be made as sensitive or as unresponsive as needed.
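A minimal version of the tilt-sensor check could look like this. The pin number, the INPUT_PULLUP wiring, and the playQuote() placeholder are assumptions, since the actual playback depends on the MP3 player used.

```cpp
const int tiltPin = 2;             // placeholder pin for the tilt switch
bool quotePlaying = false;

void playQuote() {
  // Placeholder: trigger the MP3 player and drive the jaw servo here.
}

void setup() {
  pinMode(tiltPin, INPUT_PULLUP);  // tilt switch between the pin and ground
}

void loop() {
  // When the block is tipped past roughly 90 degrees the switch changes state.
  if (digitalRead(tiltPin) == LOW && !quotePlaying) {
    quotePlaying = true;
    playQuote();
  } else if (digitalRead(tiltPin) == HIGH) {
    quotePlaying = false;          // re-arm once the block is set upright again
  }
  delay(50);                       // small delay acts as simple debounce/hysteresis
}
```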

When the photoresistor detects a mapped output greater than 600, a game starts. The game was taken from hackster.io and modified so it launches on this trigger.
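The photoresistor check can be as simple as the sketch below. The analog pin, the map() calibration range, and the "START_GAME" serial message the laptop listens for are assumptions; the 600 threshold is the value used here.

```cpp
const int photoPin = A0;        // placeholder analog pin for the photoresistor divider
bool gameStarted = false;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(photoPin);              // 0-1023
  int mapped = map(raw, 200, 900, 0, 1023);    // example calibration; adjust to your readings
  if (mapped > 600 && !gameStarted) {
    Serial.println("START_GAME");              // laptop-side script watches for this line
    gameStarted = true;
  }
  delay(100);
}
```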

Attached is the code for each laptop.

Final Product

title.jpg
supporting image_1.png
supporting image_2.png
ME511_Animatronic


Lessons/Mistakes

Initially, the tilt sensor was meant to attach directly behind the animatronic head. However, since the final product did not secure the two Arduinos or the breadboard to the frame, this would not work, which is why we used two blocks instead. The Genie from Aladdin was drawn on the block to connect it to Robin Williams' role in the film and the quote. To avoid this issue, a holder can be made for the breadboard and Arduinos so that everything can be mounted on the animatronic.

The other lesson we learned was to test one part at a time. It can become very confusing troubleshooting an entire project at once, so test each portion individually to ensure that the code and wiring are working properly before moving on to the next part of the code and circuit. 

The two micro servos that controlled eye movement had a difficult time running while the jaw servo was running. To avoid this problem, the jaw servo and the two eye servos were split onto two Arduinos, which required two laptops. If the two Arduinos were synced, only one laptop would be needed, which would be essential if you were working without a partner.

Syncing the micro servo that controlled the lower jaw to the audio, to mimic realistic mouth movements, was very time-consuming. Since we used an MP3 player and not an SD card adapter, we could not map the audio output and set servo conditions based on it. Instead, we had to time the mouth movements by hand and add individual delays. This is why we implemented only one quote instead of the many we had planned. In the future, syncing the mouth and servos should be done before any other work.
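For reference, the delay-based jaw timing amounts to hard-coding a sequence like the sketch below. The pin, the open/closed angles, and the individual delays are illustrative only and have to be tuned by ear against the recording.

```cpp
#include <Servo.h>

Servo jaw;
const int jawClosed = 90;    // placeholder angles; tune to your linkage
const int jawOpen   = 60;

// One open/close cycle of the jaw, held for the given times in milliseconds.
void syllable(int openMs, int closeMs) {
  jaw.write(jawOpen);
  delay(openMs);
  jaw.write(jawClosed);
  delay(closeMs);
}

void setup() {
  jaw.attach(9);             // placeholder pin
  jaw.write(jawClosed);
  // Trigger the MP3 player here, then step through the quote by ear:
  syllable(150, 100);
  syllable(120, 80);
  syllable(200, 150);
  // ...one syllable() call per mouth movement, timed to the audio.
}

void loop() {}
```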

Improvement

The upper jaw and lower jaw can both have a microservo to obtain a more realistic version of speaking.

The eyes could be more carefully cut out and attached to the microservos to avoid holes.

Only micro servos were used for the joints. Using a gear system could have created a more seamless linear path for the eyes and mouth to follow. The servo motor connected to the mouth could also be connected in a different way so that the mouth does not stick out as far; researching different joint choices would make the mouth look more realistic.