Cammy

by erisarakipaj in Circuits > Arduino



Cammy (Confrontation-Avoidant Machine) is a unique take on the classic concept of the "useless machine." Unlike traditional useless machines designed to perform trivial or paradoxical tasks, our robot is designed to interact with human emotions through voice modulation. She responds to vocal cues by moving away when yelled at and approaching when spoken to softly and nicely, mimicking a basic form of emotional sensitivity. The project involved building a sound-tracking robot that uses two microphones to detect the direction of a sound and respond accordingly.

The software for the Confrontation Avoidant Machine was developed using the Arduino IDE. The primary functionalities included:

  • Sound Analysis: Capturing and analyzing audio inputs from two microphones.
  • Motor Control: Managing the robot's movement based on the sound analysis.
  • Response Execution: Determining and executing the robot's behavior based on sound direction and amplitude.
      ◦ For loud sounds, the robot moves backward (indicating fear or distress).
      ◦ For soft sounds, the robot moves forward (indicating attraction or curiosity).
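That last decision reduces to two threshold comparisons. A minimal sketch in plain C++ (the threshold values here are illustrative and would need tuning to your microphones):

```cpp
enum Action { MOVE_BACKWARD, MOVE_FORWARD, STAY };

// Illustrative thresholds; tune these against your own KY-038 readings.
const int SHOUT_THRESHOLD  = 400;  // above this, the sound counts as yelling
const int NORMAL_THRESHOLD = 100;  // below this, treat it as background noise

Action chooseAction(int averageAmplitude) {
    if (averageAmplitude > SHOUT_THRESHOLD)  return MOVE_BACKWARD;  // fear/distress
    if (averageAmplitude > NORMAL_THRESHOLD) return MOVE_FORWARD;   // curiosity
    return STAY;  // too quiet to react
}
```

On the robot, `averageAmplitude` would come from the microphone sampling described in the Code Logic section below.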

Supplies


Here are the essential components for the electronic part of Cammy:

  • (x2) Wheels 
  • (x2) Motors
  • (x2) AZ-Delivery KY-038 Microphones
  • (x1) Breadboard
  • (x1) Arduino (or compatible microcontroller)
  • (x1) 9V Battery
  • (x1) L293D H-bridge motor driver

Additional materials and tools, which may be substituted or customized, include:

  • 3D printer
  • Isolation Tape
  • Glue
  • Googly eyes (Optional)
  • Ribbon for Bows (Optional)

Setting Up the Electronics


Build the circuit as shown above, making sure all parts of the circuit share a common ground connection. We started by placing the L293D H-bridge in the middle of the breadboard, straddling the center gap, and then connected its pins appropriately.

Then we connected the microphones using male-female jumper wires. Each microphone's output was connected to one of the Arduino's analog inputs, such as A0 and A1.

Assemble the Robot


In the next step, we prepared the robot chassis. Using Rhino Software, we designed a compact 3D model to accommodate all components and conceal the cables. The Arduino, breadboard, and battery were strategically placed within the chassis to keep the components neatly hidden. We then mounted the DC motors underneath the chassis, connecting them to the wheels on the side. Finally, we positioned the microphones on the chassis to effectively capture sound from different directions.


Code Logic


We used the Arduino IDE to prepare the code. To get our robot working, we designed an algorithm to interpret sound amplitude for the robot's self-orientation and movement.

The robot uses sound data collected by two microphones from the environment to determine the sound source. Similar to how human ears work, the robot can roughly detect the direction of the sound based on the different amplitudes received by each microphone. First, the robot LISTENS (collecting sound data and calculating/comparing amplitudes). After this calculation and comparison, it NAVIGATES to orient itself toward the sound source. Once oriented, it MOVES either toward the user or away, depending on whether the user is speaking nicely or shouting, according to a set threshold.
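The direction estimate from the two amplitudes can be sketched like this (plain C++; the tolerance value is an assumption to illustrate the idea that the two "ears" are never exactly equal):

```cpp
#include <cstdlib>  // abs

enum Turn { TURN_LEFT, TURN_RIGHT, CENTERED };

// Amplitude difference below which the two microphones count as balanced.
// Illustrative value; tune it to your sensors' noise floor.
const int BALANCE_TOLERANCE = 20;

Turn estimateDirection(int leftAmp, int rightAmp) {
    if (abs(leftAmp - rightAmp) <= BALANCE_TOLERANCE) return CENTERED;
    // The louder side is the side the sound is coming from.
    return (leftAmp > rightAmp) ? TURN_LEFT : TURN_RIGHT;
}
```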

We use a state machine method to ensure the robot transitions smoothly between states (LISTEN, NAVIGATE, MOVE). When a stable sound source is detected, the robot will repeatedly alternate between LISTEN and NAVIGATE until it is correctly oriented, then it will enter the MOVE state.
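A minimal sketch of those transitions (plain C++; the boolean inputs stand in for the checks the real sketch performs on each pass through the loop):

```cpp
enum State { LISTEN, NAVIGATE, MOVE };

// One transition step of the LISTEN -> NAVIGATE -> MOVE cycle.
// soundDetected: a significant sound was heard;
// oriented: both microphones report roughly equal amplitude;
// done: the current move has finished.
State nextState(State s, bool soundDetected, bool oriented, bool done) {
    switch (s) {
        case LISTEN:   return soundDetected ? NAVIGATE : LISTEN;
        case NAVIGATE: return oriented ? MOVE : LISTEN;  // re-listen until centered
        case MOVE:     return done ? LISTEN : MOVE;
    }
    return LISTEN;  // unreachable; keeps the compiler happy
}
```

Note how NAVIGATE falls back to LISTEN when the robot is not yet centered, which produces the repeated LISTEN/NAVIGATE alternation described above.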

  1. Setup and Initialization
      • Include the standard integer and math libraries, and declare the global variables.
      • Define constants for the motor control pins and microphone pins.
      • Declare variables for storing microphone values, amplitude calculations, and timing.
      • Define a structure to store listening results.
      • Define constants for the shouting threshold (which decides the movement) and the normal threshold (which filters out background noise).
      • Define an enumeration for the different states of the machine.
      • Check the sensitivity of both microphones (they should be even).
  2. Setup Function
      • Initialize pin modes for motor control.
      • Set up basic functions for motor actuation (MoveForward, TurnRight, TurnLeft).
      • Set the initial state to LISTEN.
      • Begin serial communication for debugging.
  3. Main Loop
      • Continuously check the current state and call the appropriate function: listen, navigate, or move.
  4. State Functions
      • Listen: Enter the listening state and obtain sound levels. Gather AveAmp (the average amplitude collected by the microphones over the duration of SampleWindow) to avoid sudden irritation and confusion. Transition to the NAVIGATE state if a significant sound is detected.
      • Navigate: Enter the navigation state and control the direction based on sound levels. Repeat LISTEN and NAVIGATE until there is no amplitude difference between the two microphones, then transition to the MOVE state.
      • Move: Enter the move state and control the machine's movement based on the average amplitude. Return to the LISTEN state when the task has been executed.
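The amplitude gathering in the Listen step can be sketched as a peak-to-peak scan over one SampleWindow of readings (plain C++ over an array, standing in for repeated analogRead() calls on a microphone pin):

```cpp
#include <cstddef>  // size_t

// Peak-to-peak amplitude over one window of samples.
// On the robot, the samples would come from analogRead() on a KY-038 pin;
// here they are passed in as an array for illustration.
int peakToPeak(const int *samples, size_t n) {
    if (n == 0) return 0;
    int lo = samples[0], hi = samples[0];
    for (size_t i = 1; i < n; ++i) {
        if (samples[i] < lo) lo = samples[i];
        if (samples[i] > hi) hi = samples[i];
    }
    return hi - lo;  // a larger swing means a louder sound
}
```

Averaging several such windows gives a value like AveAmp that is less jumpy than a single reading.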

Challenges and Testing

1. Movement precision: since the motors we are using cannot count their own turns and can only be controlled by how long current runs through them, it is difficult to determine the precise angle the robot turns. The same delay time can also produce different turning angles because of voltage differences. When trying your own setup, you are encouraged to adjust the timings accordingly.

2. Hardware assembly: the two wheels should be mounted parallel to each other to keep the movement smooth. Shell-shaped covers around the microphones (working like ears) are also encouraged, as they improve the data-gathering performance.

3. Noise reduction: since the noise the motors generate while running can affect the sound data collection, some normalization of the readings helps performance.

4. Speed control: a natural next step would be to add speed control to the code; it could be built into the basic motor control functions.
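One way to sketch such speed control is to map the measured amplitude onto a PWM duty cycle (0–255) suitable for analogWrite() (plain C++; the amplitude bounds are illustrative and would need tuning):

```cpp
// Map an average amplitude onto a PWM duty cycle (0-255) for the motors.
// ampMin/ampMax bracket the useful amplitude range; values are illustrative.
int amplitudeToSpeed(int amp, int ampMin, int ampMax) {
    if (amp <= ampMin) return 0;    // too quiet: stop
    if (amp >= ampMax) return 255;  // clamp at full speed
    // Linear interpolation, like Arduino's map() for this range.
    return (amp - ampMin) * 255 / (ampMax - ampMin);
}
```

The result could be fed to analogWrite() on the L293D enable pins inside the basic motor functions.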

References

  • j-bellavance/Tutorials — State Machines Tutorial, BlinkWithDelay.ino (GitHub)
  • bminch/PIE — FSM1.ino (GitHub)
  • "Build a Sound-Tracking Search and Rescue Robot," Science Buddies (sciencebuddies.org)
  • https://youtu.be/fD0TESgnnAI?feature=shared