Autonomous Jenga-Playing Robot (J_Bot)

by ArpanK8 in Circuits > Robots

Introduction and Overview of the J_Bot

Hi everyone! My name is Arpan Kumar, and today I'll be walking you through how to build a Jenga-playing robot! Over the last month, I've been developing a robot that is capable of playing Jenga. While it's very far from playing as well as a human, I found it to be a worthy proof of concept. Throughout the project, an overarching goal was for the J_Bot to replicate human playing behavior as best it could, in a form of biomimicry. The basic functionality of the J_Bot includes teleoperated and autonomous modes of control through a gamepad or a webcamera.

I want to emphasize that this is a complex project and I won't be walking you through every detail of it. I am quite certain that anyone who wants to build a J_Bot will be capable of figuring out the smaller mechanical, electrical, and programming details themselves. That being said, don't be shy about reaching out to me in the comments section! I'd be happy to help you out if you're struggling with something (chances are that I've struggled with the same thing).

Supplies

Enough talk! Let's begin! The parts you will need are listed below:

  • 1 Raspberry Pi 3B+
  • 1 Arduino Mega
  • 1 Logitech C270 Webcamera
  • 1 Ultrasonic Sensor
  • 2 Limit Switches
  • 2 DC Motors (Tetrix Motors are what I used)
  • 2 Servos (Standard Size)
  • 1 Breadboard
  • 1 Logitech F310 Gamepad
  • 1 LED
  • 1 220 Ohm Resistor
  • 1 Seeed Studio Motorshield V2
  • 1 9V battery (or alternate external power source)
  • All accessories/peripherals needed in conjunction with the parts listed above (ex. wires)

Please note that if you have different components that you think will work, please go ahead and use them! These are just the parts I used.

The Leadscrew Mechanism (DC Motor)

The first mechanism I started with was the leadscrew. It makes up the backbone of the robot and is the first stage of the arm. In general, any vertical linear motion mechanism would work for the J_Bot. However, I chose a leadscrew because it offers a high degree of torque and accuracy. The only downside of the leadscrew mechanism is that it's slow. The idea behind it is pretty simple: the leadscrew rotates, and a hollow piece with internal grooves that mesh with the leadscrew's threads travels up and down along it. I included a closeup video of its workings.

The Dual Pulley Mechanism (Servo)

To move the blocks back and forth, I use a dual pulley mechanism. In essence, one string is connected to one side of the object, and the other string is attached to the opposite side. When one string pulls, the other string has to be released at the same rate, and vice-versa. Because both strings move at the same rate in opposite directions, all you have to do is wind both strings onto the pulley in opposite directions. That way, when the pulley rotates one way, one string is pulled in while the other is paid out, allowing the object in the middle to be pulled back and forth.

The Claw Mechanism (Servo)

Claw - Detailed View

The claw mechanism relies on a dual rack-and-pinion system to simultaneously open and close both sides in a linear manner. By placing one rack above and one rack below the pinion, both racks move in opposite directions whenever the pinion turns. Turning the pinion clockwise causes the racks to diverge, while turning it counterclockwise causes them to converge. By attaching each side of the claw to one of the racks, both sides can simultaneously close or open depending on the direction of rotation of the pinion.

Adding the Base of the J_Bot

At this point, you should have the completed 3 DOF arm with you. Great! Now you should find a large baseplate to mount the entire assembly on. This base will hold the robot arm, the turntable, the Arduino, the Raspberry Pi, and the breadboard, so you should pick a relatively large plate. I'm using a 4.5" by 12" Actobotics plate, but feel free to use whatever you want (ex. a wooden sheet).

The Turntable Mechanism (DC Motor)

Out of the entire robot, the turntable remains the biggest "wow" factor. It is also the easiest mechanical feature to implement, consisting of little more than a motor and a flat base piece. The turntable is positioned right under the Jenga stack. When the turntable rotates, so does the stack. This allows the robot arm to stay fixed to the base while still being able to pick up blocks from all sides of the stack. I wanted to mount my motor sideways for logistical reasons, so I ended up using a pair of bevel gears. However, you can mount it any way you want. While mounting the motor + turntable assembly onto the base, make sure it is within reach of the robot arm. The entire Jenga stack will sit on top of the turntable, so make sure it is well positioned with respect to the robot.

Arduino and Motorshield

Now onto the testing phase! Add the Arduino and breadboard onto your base and attach the Motorshield to the Arduino. I would highly recommend that you individually test every motor in your build before continuing. Doing so can reveal issues in your mechanical design or even your electronics. I used the Seeed V2 Motorshield to control the 2 DC motors in the J_Bot. Early on, I faced some issues with the motors not turning. After some debugging, I realized that very little power was reaching them. It's imperative that you provide an external power source for your J_Bot, because otherwise it simply won't run. I would recommend either a 9V or a 12V battery pack. After you power up your Motorshield, try moving the motors at different speeds and in both directions.

If you need more assistance, try this link: https://wiki.seeedstudio.com/Motor_Shield_V2.0/

Sensors

To make sure that the J_Bot doesn't extend past its physical limits, I included 2 limit switches and 1 ultrasonic sensor. The limit switches are placed at the base and the top of the leadscrew mechanism, positioned so that the claw triggers them if it ever reaches them. Whenever a limit switch is activated, it instantly blocks further motion in the direction it guards. This acts as a failsafe in case the J_Bot ever malfunctions. The ultrasonic sensor is meant to verify the distance that the dual pulley mechanism has travelled; however, at the J_Bot's present stage of development, it is used only for telemetry and feedback purposes.
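The ultrasonic reading used for telemetry boils down to converting an echo time into a distance. Here's a minimal sketch of just that conversion (the pin handling itself lives on the Arduino; the 0.0343 cm/µs figure is the standard room-temperature speed of sound):

```python
# Converting an ultrasonic echo pulse into a distance. Sound travels at
# roughly 343 m/s (0.0343 cm per microsecond) at room temperature, and the
# pulse covers the sensor-to-object distance twice (out and back), hence
# the division by 2.

SPEED_OF_SOUND_CM_PER_US = 0.0343

def echo_to_cm(echo_duration_us):
    """Distance in cm for a given echo pulse width in microseconds."""
    return echo_duration_us * SPEED_OF_SOUND_CM_PER_US / 2
```

An echo of about 1166 µs, for example, corresponds to roughly 20 cm.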

Serial Communication Between the Raspberry Pi and Arduino

I have used both a Raspberry Pi and an Arduino in order to offload the heavy-duty computation to the Raspberry Pi, which then sends signals to the Arduino over the serial port telling it which parts of the robot to move. We'll start with the basics.

  1. Download the Arduino IDE onto the Raspberry Pi so we can program the Arduino directly from the Pi:
    1. sudo apt-get install arduino
  2. Run the Arduino program linked above
  3. Run the Raspberry Pi program linked above to test whether the Arduino program works

If you need more help, check out this link: https://www.instructables.com/Connect-Your-Raspber...

One note: if you're having trouble with COM port errors, try switching the port to ACM1 or USB0.
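To make the Pi-to-Arduino protocol concrete, here's a small sketch of the sending side. The single-character command mapping, the port name `/dev/ttyACM0`, and the 9600 baud rate are all assumptions for illustration; use whatever values match your own setup, as long as both sides agree:

```python
# A sketch of the Pi -> Arduino command protocol: each action maps to one
# character that the Arduino sketch switches on. The specific characters
# here are illustrative assumptions, not a fixed standard.

COMMANDS = {
    "move_up": "u",
    "move_down": "d",
    "claw_open": "o",
    "claw_close": "c",
    "stop": "s",
}

def encode_command(action):
    """Translate an action name into the byte sent over the serial port."""
    return COMMANDS[action].encode("utf-8")

if __name__ == "__main__":
    # Requires pyserial (pip install pyserial) and a connected Arduino.
    import serial
    import time

    ser = serial.Serial("/dev/ttyACM0", 9600, timeout=1)
    time.sleep(2)  # the Arduino resets when the port opens; give it a moment
    ser.write(encode_command("move_up"))
    ser.close()
```

The two-second pause after opening the port matters: opening the serial connection resets most Arduino boards, so bytes sent immediately afterwards get lost.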

Gamepads and Function Based Programming

If you're still here, I'm glad to see you've made it! Okay, now that we've established a simple serial connection between the Raspberry Pi and the Arduino, it's time to add a bit of complexity. Building on the basic logic in the previous step, if we assign every unique character a unique response on the Arduino's side, we can trigger a variety of preprogrammed outputs. By creating functions in the Arduino IDE, we can respond specifically to each serial command.

To give multiple commands, though, we need a more sophisticated input device. For this, we are going to use a USB controller/gamepad; I am using the Logitech F310. To connect button inputs from the gamepad to the Raspberry Pi code, we are going to use Pygame, which, as it happens, comes pre-installed on the Raspberry Pi. I created a sample piece of code that helps identify which button connects to what function. It is a little out of date and the code is a little messy, but I linked it above. When you plug in your controller, you can bind whichever buttons you would like to whichever serial outputs you would like.
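A button-to-command mapping on the Pi might look like the following sketch. The button indices are assumptions (they vary between controllers and between the F310's DirectInput/XInput modes), so run a button-identification script first to find yours:

```python
# A sketch of mapping gamepad buttons to serial command characters. The
# indices below are illustrative assumptions -- verify them with a
# button-identification script for your own controller.

BUTTON_TO_COMMAND = {
    0: "o",  # e.g. A button -> claw_open
    1: "c",  # e.g. B button -> claw_close
    4: "u",  # e.g. left bumper -> move leadscrew up
    5: "d",  # e.g. right bumper -> move leadscrew down
}

def command_for_button(button_index):
    """Return the serial character for a button press, or None if unmapped."""
    return BUTTON_TO_COMMAND.get(button_index)

if __name__ == "__main__":
    # Requires pygame and pyserial, plus a connected gamepad and Arduino.
    import pygame
    import serial

    pygame.init()
    pygame.joystick.init()
    pad = pygame.joystick.Joystick(0)
    pad.init()
    ser = serial.Serial("/dev/ttyACM0", 9600, timeout=1)
    while True:
        for event in pygame.event.get():
            if event.type == pygame.JOYBUTTONDOWN:
                cmd = command_for_button(event.button)
                if cmd is not None:
                    ser.write(cmd.encode("utf-8"))
```

Keeping the mapping in one dictionary makes rebinding buttons a one-line change.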

Now that we have Pygame and we have connected different buttons on our controller to different serial outputs, we need to program the Arduino to respond to the specified outputs. I would start by creating a few major functions:

move_up(int speed_)

move_down(int speed_)

no_go()

go_forward()

go_backward()

claw_open()

claw_close()

rotate_counterclockwise(int speed_)

rotate_clockwise(int speed_)

stop_rotation()

By combining the above functions, we can, in essence, write a program to pick and place a block. This is exactly what we're going to do. The Arduino is only responsible for knowing what to do when it is told. In contrast, the Raspberry Pi can use its connection to the webcamera to figure out when to execute each of the above functions. Above, I have linked my final Arduino program that utilizes function-based programming to respond to each serial command it receives.
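On the Pi side, combining those Arduino functions into a pick routine can be as simple as sending commands in order with rough waits between them. This is a sketch only: the command characters and durations are illustrative assumptions, not my exact values:

```python
# A sketch of sequencing the Arduino functions into a "pick" routine from
# the Raspberry Pi. Each step is a serial command plus a rough wait for the
# Arduino to finish; characters and timings here are illustrative.
import time

# (command character, seconds to wait for the motion to complete)
PICK_SEQUENCE = [
    ("o", 0.5),  # claw_open
    ("f", 2.0),  # go_forward: dual pulley extends toward the block
    ("c", 0.5),  # claw_close
    ("b", 2.0),  # go_backward: pull the block out of the stack
    ("u", 4.0),  # move_up: leadscrew raises the block above the stack
]

def run_sequence(send, sequence, sleep=time.sleep):
    """Send each command in order, pausing between steps.

    `send` is any callable taking a one-character command string (e.g. a
    wrapper around serial.Serial.write). Returns the commands sent, in order.
    """
    sent = []
    for command, wait_s in sequence:
        send(command)
        sent.append(command)
        sleep(wait_s)
    return sent
```

Passing `send` and `sleep` in as parameters keeps the sequencing logic testable without a robot attached.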

OpenCV Image Processing and Autonomous Navigation

We're finally here: the final and most formidable step of this tutorial. To begin, we'll need to install OpenCV on the Raspberry Pi. That in itself is almost a full tutorial, so I included a link for you to follow to accomplish this.

https://pimylifeup.com/raspberry-pi-opencv/

From here on out, I'll assume you have OpenCV installed on your Pi. For the webcamera to recognize the block, we're going to use Template Based Matching. By matching a template image to the real feed from the webcamera, we are going to be able to get the relative coordinates of the top-right-hand corner of the blocks the webcam recognizes. Quite a bit of my code is pulled from the official documentation from OpenCV, so here is the link in case you want a deeper understanding of the code: https://opencv-python-tutroals.readthedocs.io/en/l....

A high-level overview of my code is this:

1. We import, initialize, and define things that need to be imported, initialized, and defined.

2. We compare the template image to the feed from the webcamera, find where the confidence is higher than the threshold, and plot the corresponding rectangles.

3. We find the block that's closest to the claw (so it has to move the least) and begin moving upwards whilst tracking the block.

4. When the claw reaches the block, it picks it up, pulls it out, and moves upwards.

5. To check whether we are at the top of the stack, we check how many blocks we are able to see and their relative locations.

6. If we are only able to see one block and its relative location is very low in the camera's frame, we know that we are at the top and it is time to place the block we are holding onto the stack.
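Steps 2 and 3 above can be sketched as follows. The claw position `(320, 240)`, the confidence threshold, and the template filename are assumptions for illustration; the `cv2.matchTemplate` thresholding itself follows the official OpenCV tutorial the way my code does:

```python
# A sketch of steps 2-3: threshold the matchTemplate result to get candidate
# block locations, then pick the one nearest the claw. Claw position,
# threshold, and filenames are illustrative assumptions.

def closest_block(matches, claw_xy):
    """Return the (x, y) match nearest the claw, or None if there are none."""
    if not matches:
        return None
    cx, cy = claw_xy
    return min(matches, key=lambda m: (m[0] - cx) ** 2 + (m[1] - cy) ** 2)

if __name__ == "__main__":
    # Requires OpenCV (pip install opencv-python) and a connected webcam.
    import cv2
    import numpy as np

    template = cv2.imread("block_template.png", cv2.IMREAD_GRAYSCALE)
    cap = cv2.VideoCapture(0)
    THRESHOLD = 0.8  # confidence cutoff; tune for your lighting and template

    ok, frame = cap.read()
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
        ys, xs = np.where(result >= THRESHOLD)  # np.where returns rows first
        matches = list(zip(xs.tolist(), ys.tolist()))
        target = closest_block(matches, claw_xy=(320, 240))
        print("target block at:", target)
    cap.release()
```

Note that `matchTemplate` typically produces clusters of overlapping detections around each real block, so in practice you may also want to merge nearby matches before choosing a target.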

Thank you for reading my Instructable! I apologize that it could not be made in greater detail; however, I am certain that if you're interested in this project, you will be able to use this tutorial as a guide to making your own J_Bot while also adding your own spin on it!