Cairn - End-Effector for KUKA Robot


[Images and videos: the finished stack; the process — feed a piece, grab the piece, lift the piece... a cairn!]

“A cairn is a man-made pile of stones raised for a purpose. Cairns have been and are used for a broad variety of purposes. In prehistoric times, they were raised as markers, as memorials and as burial monuments. In modern times, cairns are often raised as landmarks, especially to mark the summits of mountains.”


Cairn is a collaborative machine that detects any scrap piece of material you give it, picks it up by suction, and stacks the pieces to create an unexpected and balanced artwork.


Let's dive into Material Manipulation, and fabricate a custom end-effector for a KUKA robot. This tutorial is an opportunity to get familiar with KukaVarProxy and adaptive robot path generation, and to explore the possibilities of computational design applied to a digital fabrication framework.

We strongly recommend being familiar with computational design and scripting in Grasshopper, having basic Arduino knowledge, and having some experience with robot programming before following this tutorial!


This project was conducted by Ivana Trifunovic, Ioannis Moutevelis, and André Aymonod as a part of the Computational Design and Digital Fabrication seminar in the ITECH master program.

Supplies


Electronics:

  • Arduino UNO and Breadboard
  • DC 12V Micro Electric Vacuum Air Pump (link)
  • Pneumatic Suction Cup Kit (link)
  • 4mm diameter silicon hose (link)
  • Transistor (from Arduino Starter Kit)
  • Diode (from Arduino Starter Kit)
  • Webcam (link)
  • (optional: 12V Power supply)
  • 2x 5m USB Cables (to connect to the KUKA Robot)
  • Jumper wires


End-Effector:

  • 2x 14mm plywood plates (see plans and dimensions in the next section)
  • 4x M6 Rods
  • M6 Nuts and washers


Software:

  • Rhinoceros 7 and Grasshopper, with plug-ins: Firefly, Peacock, Virtual Robot Framework, and Simulacrum (download here)
  • KukaVarProxy
  • Visual Studio, or any code editor
  • Optional (to simulate the robot program): KUKA OfficeLite inside a virtual machine (VMware Workstation) and KUKA WorkVisual


Tools:

  • Soldering iron
  • Duct tape
  • Spanner
  • 6-axis Robotic Arm (in our case, a KUKA 125-2 Robot)


Material:

  • White and Black A2 paper sheets
  • Scrap pieces of ceramics, glass, plastic... Or whatever flat material you have at your disposal!

Concept

[Videos and images: robot path planning; releasing the suction]

The aim of Cairn is to enhance real-time interaction between material, machine, and user through a sensing loop and adaptive, parametric toolpath generation. The result is a playful pick-and-place game between a user and the robot, which together create surprising stacked sculptures.


Working principle

Cairn relies on a simple physical mechanism to grab and release pieces of material: suction. A suction cup connected to a vacuum air pump and controlled by an Arduino UNO can pick up any small piece of flat material.


Process

The workspace is organised in two areas: scanning and stacking.

The robot works in a simple motion loop. It starts by moving to a photo spot above the scanning area, from where we take a snapshot of the webcam stream. This snapshot allows us to identify the contour of the piece and its position, which inform a toolpath adapted to the piece. Once this new toolpath is computed, the robot moves toward the piece and the suction mechanism is activated via the Arduino. The robot lifts the piece from its center of mass and brings it to the stacking area, where it is released. Once this is completed, the robot returns to the photo spot, and the user can place a new piece.


Image collection and processing

To identify the contour and the location of each piece, we rely on a webcam that is mounted on the end-effector.

There are many other ways to perform these tasks. We first explored them in Python using OpenCV to collect and process our images, but then decided to switch to Grasshopper to integrate the workflow with the rest of our toolpath generation as smoothly as possible.
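If you'd like a code-based reference for this step, here is a minimal detection sketch in C++ with OpenCV (the camera index and threshold value are assumptions to tune for your lighting and set-up):

#include <opencv2/opencv.hpp>
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    // Webcam mounted on the end-effector; index 0 is an assumption.
    cv::VideoCapture cam(0);
    if (!cam.isOpened()) return 1;

    cv::Mat frame, gray, binary;
    cam >> frame;                                  // snapshot taken at the photo spot
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY); // convert to grayscale
    cv::threshold(gray, binary, 128, 255, cv::THRESH_BINARY_INV); // dark piece on white background

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(binary, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    if (contours.empty()) return 1;

    // Assume the largest contour is the piece.
    auto piece = *std::max_element(contours.begin(), contours.end(),
        [](const auto& a, const auto& b) { return cv::contourArea(a) < cv::contourArea(b); });

    // The centroid from image moments gives the pick point in pixel coordinates.
    cv::Moments m = cv::moments(piece);
    std::cout << "centroid: " << m.m10 / m.m00 << ", " << m.m01 / m.m00 << std::endl;
    return 0;
}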

Grasshopper's Firefly plug-in allowed us to stream the webcam in real time and take a snapshot every time the robot was at the defined photo spot. We then converted this image to black-and-white to facilitate processing, and used the Rasterize component from the Peacock plug-in to retrieve the piece's contour curve.

We also needed to locate each piece within our scanning area. To facilitate this process, we created a black square frame with white edges, in which the user can freely place the piece. It is very important that this frame is fully included in the picture, so that the rasterization yields the piece's contour curve in relation to the frame.

The known dimensions of the frame then enabled us to scale the resulting contour curve and move it to the correct position in our digital model.
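To make that mapping concrete, here is a hedged sketch (names and conventions are illustrative placeholders, not our actual Grasshopper definition): knowing the frame's physical dimensions and its surveyed position in the robot base frame, converting a pixel coordinate into a robot target is just a scale and a translation.

// Map a pixel coordinate inside the scanning frame to robot base coordinates.
// All parameters are placeholders to replace with your own measurements.
struct Point2 { double x, y; };

Point2 pixelToBase(Point2 px,
                   double imgW, double imgH,     // rasterized image size (pixels)
                   double frameW, double frameH, // physical frame size (mm)
                   Point2 frameOrigin) {         // surveyed frame corner in base coordinates (mm)
    // Scale per axis; image y grows downward while robot y grows upward.
    double x = frameOrigin.x + (px.x / imgW) * frameW;
    double y = frameOrigin.y + (1.0 - px.y / imgH) * frameH;
    return {x, y};
}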

Building the Hardware

[Images: end-effector close-up, wiring diagram, and wiring schematic]

Let's start by building our end-effector.

Build a home for the electronic circuit

Manufacture the two plates provided in the file below. We made ours out of 14mm plywood, but you could also lasercut them from a strong material such as MDF.

Insert the suction cup in the central hole. Then, attach the two plates together with the M6 rods.


Suction mechanism set-up

The next step is to build your electronic circuit according to the attached wiring diagram. We highly recommend testing the components individually before fixing the circuit inside the effector. For troubleshooting, we provide a simple Arduino sketch that activates and releases the suction pump every 5 seconds.
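That test sketch boils down to a few lines; a minimal version along these lines pulses the transistor that drives the pump (the pin number is an assumption, match it to your wiring):

// Pulse the vacuum pump through the transistor every 5 seconds.
// PUMP_PIN is an assumption: use the digital pin that drives your transistor.
const int PUMP_PIN = 9;

void setup() {
  pinMode(PUMP_PIN, OUTPUT);
}

void loop() {
  digitalWrite(PUMP_PIN, HIGH);  // transistor on -> pump runs, suction active
  delay(5000);
  digitalWrite(PUMP_PIN, LOW);   // transistor off -> pump stops, piece released
  delay(5000);
}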

Once the circuit is ready, securely fasten every component inside the two plates you built previously using duct tape, superglue, or screws.


Give sight to the effector

Attach the webcam to the bottom plate, with the camera facing down. Make sure to use a wireless webcam, or to have a cable long enough to reach your computer (ours was 5 meters long). Set up the camera resolution and zoom.


Once everything is ready...

You are ready to mount the end-effector on your robot flange using suitable bolts. Make sure that the end-effector is firmly attached and won't move during the process.

Adaptive Parametric Toolpath Generation

As explained above, all our computation happens in Grasshopper. We have attached both our Rhino and Grasshopper files, ready to use (download them here, as the format is not supported: 3DM File and Grasshopper File).

The script starts by capturing a snapshot of the webcam stream and processing the picture to retrieve the contour curve and center point. The 3D model contains a replica of our robot cell and workspace set-up, so once we get the piece's geometry, we can place it in the 3D space according to its real position. We then compute the new position of the piece in the stacking area. These locations allow us to generate the planes that constitute our toolpath.
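To make the waypoint logic concrete, here is an illustrative C++ sketch (offsets and names are placeholders, not our actual Grasshopper definition): from the piece centroid we derive an approach position, a grab position, and a stacking position whose height grows with each placed piece.

#include <vector>

// The tool orientation stays vertical throughout (A 90, B 0, C -180 in our
// set-up), so positions alone are enough for this sketch.
struct Pose { double x, y, z; };

std::vector<Pose> buildToolpath(Pose piece, Pose stackBase, int piecesPlaced,
                                double pieceThickness, double clearance) {
    double stackZ = stackBase.z + piecesPlaced * pieceThickness;
    return {
        {piece.x, piece.y, piece.z + clearance},        // approach above the piece (GRABING)
        {piece.x, piece.y, piece.z},                    // press the suction cup onto it (PUSHING)
        {piece.x, piece.y, piece.z + clearance},        // lift
        {stackBase.x, stackBase.y, stackZ + clearance}, // travel above the stack
        {stackBase.x, stackBase.y, stackZ},             // release height (STACKING)
    };
}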

The script includes a section that writes the KRL code we'll run on the controller. Note that it is only necessary to generate the KRL code once: the position variables it reads are then updated in real time through KukaVarProxy for every new piece.

The last part of the script handles the connection and communication with KukaVarProxy. This includes: a boolean variable to update the KRL code, custom variables for each position that is updated, and the Firefly component that sends commands to the Arduino.
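For readers who want to see what actually travels over the wire, here is a hedged C++ sketch of a KukaVarProxy variable write, based on the publicly documented openshowvar message format (big-endian message ID and payload length, a mode byte, then length-prefixed variable name and value); verify it against your KukaVarProxy version before relying on it.

#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <string>
#include <vector>

static void put16(std::vector<uint8_t>& b, uint16_t v) {
    b.push_back(v >> 8);    // big-endian: high byte first
    b.push_back(v & 0xFF);
}

bool writeVar(const char* ip, const std::string& var, const std::string& val) {
    std::vector<uint8_t> msg;
    put16(msg, 1);                                  // message ID, echoed back by the proxy
    put16(msg, static_cast<uint16_t>(1 + 2 + var.size() + 2 + val.size())); // payload length
    msg.push_back(1);                               // mode: 1 = write, 0 = read
    put16(msg, static_cast<uint16_t>(var.size()));
    msg.insert(msg.end(), var.begin(), var.end());  // e.g. "PHOTOREADY"
    put16(msg, static_cast<uint16_t>(val.size()));
    msg.insert(msg.end(), val.begin(), val.end());  // e.g. "TRUE"

    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) return false;
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(7000);                    // KukaVarProxy's default port
    inet_pton(AF_INET, ip, &addr.sin_addr);
    if (connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) { close(fd); return false; }

    send(fd, msg.data(), msg.size(), 0);
    uint8_t reply[256];
    ssize_t n = recv(fd, reply, sizeof(reply), 0);  // the reply echoes the value on success
    close(fd);
    return n > 0;
}

// Usage: writeVar("192.168.1.10", "PHOTOREADY", "TRUE");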

KRL Code in KukaVarProxy

[Image: communication workflow diagram]

Now comes the most interesting part: the communication between our Grasshopper script and the robot, which yields the adaptive toolpath!

You can download the KRL code here as well as the corresponding .dat file here. Let's walk through the code.

DEF cairn()
;------- Declaration section -------
;FOLD DECL
DECL E6AXIS HOME
;ENDFOLD
;------- Initialization ---------
BAS (#INITMOV,0)
BAS(#VEL_PTP, 20)
$VEL.CP = 0.4
$BASE = $NULLFRAME
$TOOL = TOOL_DATA[8]
HOME = {A1 0,A2 -90,A3 90,A4 0,A5 45,A6 0,E1 0, E2 0, E3 0, E4 0, E5 0, E6 0}
;PHOTO = {E6POS: X 1329.282, Y 979.818, Z 502, A 90, B 0, C -180, E1 0, E2 0, E3 0.0, E4 0.0, E5 0.0, E6 0.0}


UPDATE = TRUE
PHOTOREADY = FALSE

The Declaration and Initialization sections are pretty straightforward. We define a cairn() function and declare all our variables in the .dat file. We then assign a Base, Tool, and Home position, as well as a custom position, PHOTO, which corresponds to the position where we take a snapshot of the webcam stream. Finally, notice the PHOTOREADY boolean, set to FALSE, which we will later use to trigger an update of the variable positions.

;----------- Main section ----------
PTP HOME
WHILE UPDATE

  LIN PHOTO C_VEL
  HALT ;take picture + set PHOTOREADY to TRUE

We then jump into the main section of the code, where we start by going to the HOME position and then to our PHOTO position. This is when we take a snapshot of the webcam stream, which triggers the whole Grasshopper script to recompute. Once this is done, we set the PHOTOREADY boolean to TRUE in Grasshopper, which signals that we are ready to execute the motion loop.

IF PHOTOREADY THEN

    LIN GRABING C_VEL
    LIN PUSHING C_VEL

    HALT
    SUCTION = TRUE
    HALT

    LIN {E6POS: X 1594.601, Y 574.28, Z 622.5, A 90, B 0, C -180, E1 0, E2 0, E3 0.0, E4 0.0, E5 0.0, E6 0.0} C_VEL
    LIN {E6POS: X 1500.871, Y -359.291, Z 633.5, A 90, B 0, C -180, E1 0, E2 0, E3 0.0, E4 0.0, E5 0.0, E6 0.0} C_VEL
    LIN STACKING C_VEL

    HALT
    SUCTION = FALSE
    HALT

    PHOTOREADY = FALSE
ENDIF


This loop carries out the movement we previously described: moving toward the piece via the variable positions GRABING and PUSHING. At this point we turn another boolean, SUCTION, to TRUE, which activates the pump via the Arduino and starts the suction. Once the robot has grabbed the piece, it carries it to the variable position STACKING, where the pump is turned off and the piece is released. This is the end of the loop, and PHOTOREADY is set back to FALSE. The robot then returns to the PHOTO position and waits for PHOTOREADY to be turned to TRUE again (once a new piece has been placed).


We used KUKA OfficeLite inside a virtual machine (VMware Workstation) along with KUKA WorkVisual to simulate this toolpath and check the robot's reachability and the communication between our scripts. This step is optional and requires a more involved set-up; if you are interested, we recommend getting familiar with it beforehand :)

Robot Implementation

[Videos and images: approaching, lifting, and releasing a piece]

If you carried out all the previous steps: congratulations! By this point, you should be ready to implement your effector in a real robot cell and see it come to life :)

The first step is to mount the manufactured effector on the robot flange and to calibrate it. To do so, we used the XYZ 4-Point Method, which consists of approaching a reference point from 4 different directions.

After this, we prepared our workspace and installed the scanning frame. We surveyed its position and entered this data in our Grasshopper script, so that the real and digital locations match perfectly.

After connecting our computer to the robot controller and KukaVarProxy, we performed a few tests to adjust the camera and the pressure the suction cup applies to the piece.

And finally... We are ready to go!

We started by running our program very carefully in T1 and, after a few successful loops, switched to T2 and increased the velocity until reaching 100%. In the videos, you can see the result with 12 pieces, from the user's POV and from the effector's POV :)

What Comes Next

We created an interactive system between material, machine, and users. The person feeds in a piece, and the robot senses and computes the material's position, then grabs and releases it.

Now that this system is working smoothly, there are many ways in which we could build on it, especially on the computational design side. To widen the variety of our results, we could arrange the pieces into a mosaic, or into specific shapes following the principle of a jigsaw puzzle. We could also integrate structural feedback to play with balance and create more impressive stacks.


We had a lot of fun and learnt a lot with this project, and we hope you enjoyed reading through this tutorial! We are always curious to hear feedback and suggestions, so the comment section is wide open for you :D