WALL-E: an Animatronic Build
The goal of this project was to build an animatronic figure of a well-known character that interacts with the user. Our team selected WALL-E from the Disney/Pixar animated film WALL-E. In passive mode, the animatronic plays “La Vie en Rose” while gently articulating its eyes. When a user approaches, WALL-E’s eyes light up and he begins to “dance” by swinging his arms and rotating his neck; placing an object inside him triggers mechanical recycling sounds. The supplies used for this project include a microcontroller, sensors, actuators, a chassis, and audio devices.
Supporting Structure
To create the supporting structure for the WALL-E animatronic, the model was first designed in SOLIDWORKS. The main supporting structure is the body, which has dimensions of 30 cm × 32 cm × 25 cm and includes a front hatch that can open. The next supporting structures designed in SOLIDWORKS were the neck and the eyes for the animatronic. Finally, a gear system was modeled in SOLIDWORKS for WALL-E’s tracks, which he uses to move around in the movie. Once all components were successfully modeled, the parts were printed on a Bambu Lab 3D printer using PLA filament. The total print time for all supporting structures was 40 hours.
Joint Design
The joints for the WALL-E animatronic were all modeled in SOLIDWORKS. The design included joints for the front hatch, arms, neck, and eyes, each using a specific joint type suited to its range of motion. The hatch uses simple pins that extend from the sides and fit into circular slots in the body to allow rotation. The arm joints use a crank-slider mechanism, while the neck and eye joints each use a crank arm. Once the joint models were completed, they were prepared for printing, with a total print time of 8 hours.
Actuators
The actuators for the WALL-E animatronic were integrated into the design in SOLIDWORKS to ensure proper fitment with the supporting structures. All motion was driven by servo motors, which were chosen for their ability to provide controlled articulation. The design of the mounting locations and joint interfaces was guided by the physical size and geometry of the selected servo motors to ensure clean and secure mating. These servo motors enabled articulation of the arms, neck, and eyes once assembly began. After all parts were 3D printed, the physical servo motors were installed into their designated mounts during final assembly.
Sensors
The sensors for the WALL-E animatronic consisted of a photoresistor and a force-sensing resistor (FSR). The photoresistor detects changes in light levels, and the design relies on the idea that when an object gets close to WALL-E, the light reaching the sensor decreases. This change allows WALL-E’s eyes to light up and signals him to begin his dancing sequence. The force-sensing resistor changes its resistance based on the amount of pressure applied, so when an object is placed on top of it, the added force triggers the mechanical recycling sounds. To read the sensor values and relay the information to the DFPlayer and the servo motors, an Arduino Uno R3 was used along with Arduino programming, which is discussed in the next section.
Programming
The programming for the WALL-E animatronic was built around a sensor-driven state machine that controls movement and audio based on user interaction. The algorithm continuously monitors the ambient light and force sensor readings, using timed thresholds to determine when WALL-E should transition between passive “resting” behavior, active “awake” behavior, or start his dance routine. Servo commands are processed in real time and mapped to specific motion profiles for the head, neck, eyes, and arms, while separate routines smooth each movement to create natural animation. The audio module is triggered by specific sensor events, allowing WALL-E to play different sounds depending on the detected action. Together, these programmed routines coordinate the lighting conditions, force inputs, servo articulation, and audio cues into a cohesive and responsive interaction system. An image of the circuit setup is provided for reference.
Final Product
The final product is a fully assembled WALL-E animatronic that successfully demonstrates all interactive features outlined in the project goals. The completed model includes a functioning body with a front hatch, articulated arms, neck, and eyes, and integrated sensors and actuators that allow WALL-E to respond to user interaction. In passive mode, WALL-E gently moves his eyes while playing “La Vie en Rose,” and when a user approaches, his eyes illuminate and he begins his dancing routine. Placing an object inside the hatch activates the force sensor, triggering mechanical recycling sounds through the audio system.
With the supporting structure, joints, actuators, sensors, and programming working together, the final animatronic closely reflects the character’s behaviors and offers an engaging and responsive user experience.
Lessons Learnt
Throughout the development of this project, several key lessons were identified that can improve future iterations of the animatronic. One major takeaway was the importance of designing joint mechanisms more efficiently so that fewer servo motors are required to achieve the same range of motion. Reducing the number of actuators would simplify assembly, decrease wiring complexity, and lower overall power consumption. Another lesson learned was the need for a more organized and dedicated electronics housing. A cleaner internal layout would make it easier to access components, perform troubleshooting, and transport the animatronic without risking loose connections or damage. Addressing these points would significantly improve both functionality and maintainability in future designs.