AI-Aided Inchworm Robot for Colonoscopy

by diosparkbrando

AI-Aided Inchworm Robot for Colonoscopy

Free (5).png
robot overview.png
Mr.Bull2025-01-0123.54.37-ezgif.com-optimize.gif

Here is a robot I made for smart colonoscopy. It uses an inchworm mechanism to move.

In particular, it moves like an inchworm, using pneumatically driven elongation and contraction to generate its gait. It also has an AI module that detects possible lesion regions (colorectal polyps, the precursors of colorectal cancer). Thus, it can conduct colonoscopies with minimal human control, ensuring both efficiency and accuracy in detection.

Why Did I Make It?

Colorectal cancer is the third most common cancer worldwide, but if it is detected at an early stage, while still in the form of polyps, the patient's survival rate improves from 14% to over 90%.

Yet a traditional colonoscopy, controlled entirely by the doctor, takes a long time to produce a good diagnosis, and human error pushes the misdiagnosis rate up to 25%. Indeed, my father once had an unpleasant experience with a colonoscopy diagnosis at the hospital. So I thought of creating a robot that solves this problem.

Inspirations

  1. Diagnosis Efficiency Issue: To tackle this issue, I mimic the inchworm gait, a simple, repeating gait defined by the robot's elongation and contraction. This lets the robot advance at a steady pace and maintain a stable detection rate. In addition, this mechanism fits the winding structure of the colorectum.
  2. Skill-Requirement Issue: The AI detection module lets the robot detect polyps automatically, with no human effort required. Thus, the robot maintains high detection accuracy without the help of experts.

Overall Design:

The robot consists of:

  1. The main body: mimics the motion of an inchworm, using a section of four bellow pipes to elongate and contract, with one balloon at the head and one at the tail to anchor against the wall of the colorectum.
  2. The electronics system: controlled by a central PCB, it powers the robot's motion by driving the pneumatics system, which consists of 6 relays, one 6-bundle valve, and two gas pumps, one for pumping gas in and one for sucking it out.
  3. The AI system: uses a Sentry2 camera module running a trained YOLOv2 network to detect polyps inside the colorectum.

To learn more details about this project, feel free to refer to the project poster, paper, and engineering journal.

Now, let's go through the steps to build this robot!

Supplies

Tools:

  1. M2 Screwdrivers
  2. M3 Screwdrivers
  3. Soldering Iron
  4. A Pair of Scissors
  5. 3D Printer
  6. (Optional) Laser Cutter (For acrylic parts)

Mechanical Parts:

  1. 4x Bellow Pipes
  2. 6x 2mm Silicone Tubes
  3. 6x 4mm Silicone Tubes (for connecting to the valves)
  4. 2x Rubber Sleeves

Electronics:

  1. 1x ESP32 Board
  2. 2x Electronic Gas Pumps (one for positive pressure and one for negative pressure)
  3. 1x Sentry2 Camera Module (for AI Detection)
  4. 6x 7HA Relays
  5. 1x 6-Bundled Valve
  6. 1x Buck Converter
  7. 1x 4-pin Dupont Wire
  8. 1x 12V Power Source
  9. Some 1-pin Dupont Wires
  10. (Optional) 1x DC-DC Buck Converter
  11. (Optional) 1x LED

Software:

  1. Python 3
  2. LabelImg
  3. Arduino

Design the Main Body

Below are the 3D-printed parts for the robot body, which I designed in Autodesk Fusion.

  1. Head and Tail: the two ends of the robot, which are moved by the elongation and contraction of the bellow pipes. Gas is pumped through the air holes embedded in their hollow chambers to inflate the balloons (silicone sleeves) that anchor the robot inside the colon.
  2. Bellow Plates (Head and Tail): fixed to the head and tail respectively; this is where the 4 bellow pipes connect the two ends of the robot.
  3. Hull Lids: responsible for clamping the silicone sleeves onto the head and tail to ensure air tightness.

Notice that the head and the tail have slightly different designs.

Why Those Differences?
This is so that the silicone tubes can connect properly to the air holes at the head, tail, and tail bellow plate without interfering with other parts.


One of the biggest challenges I faced when designing these parts was the screw holes: if I designed them to the strict M2 standard, the screws might not fit easily because of the dimensional inaccuracy of regular 3D printing. As a result, I increased the diameter of each screw hole by 0.3 mm.

Important Note:
This design is intended to be printed and assembled using a regular 3D printer. If you are using photocuring (resin) 3D printing to replicate this project, please adjust the screw holes back to the M2 standard (2 mm) if you can.

Print the Parts

8971739090674_.pic.jpg

3D print one copy of each of the STL files.

Notice that the colors of the parts are not all the same, because I had printed sets of parts before, and some of the parts in the bundles were used in my earlier attempts to assemble the robot.

When printing, I recommend using tree supports.

Assemble the Robot

7861735715882_.pic.jpg
8981739093090_.pic.jpg
7891735716583_.pic.jpg
7901735716591_.pic.jpg

Now, after printing all the parts we need, we can assemble the main body of the robot.

Non-3D printed materials you need:

  1. 2mm silicone rubber tubes
  2. 4 bellow pipes
  3. A 4-pin Dupont wire
  4. M2 Screws

Here are some tips for assembling the robot:

  1. Make sure to cut the 2mm silicone rubber tubes to a suitable length before assembling them.
  2. Connect the silicone tubes to where they belong first, before joining the 3D-printed parts with screws; otherwise it will be impossible to attach them afterward (ask me how I learned this lesson :< ).
  3. Don't forget to pass the 4-pin Dupont wire through the robot first; you will need it to connect the Sentry2 camera module in a later step.
  4. When you slide the silicone rubber sleeves over the body of the robot, push them on straight and pierce through the part that covers the screw holes, so that assembly goes smoothly when you put the lids on.
  5. Use only M2 screws to connect the parts of the robot.
  6. You can screw in all eight screw holes, but if you want to finish this step quickly, you can screw them in as shown in the second image above.

Make sure you test the air tightness of the balloons to see whether they can expand and contract properly!

Design the PCB

electronic system overview.png
截屏2025-02-09 18.05.29.png
截屏2025-02-09 18.06.30.png
截屏2025-02-09 18.07.58.png

Now to power the robot, we need to build a pneumatics system controlled by electronics. The first image shows all the electronic components in my robot.

The PCB board will be connected to each of the respective components as labeled in the second image.

Notice in the schematic above that the biggest unlabelled box is the ESP32 board, the six unlabelled boxes in the relay section are the 7HA relays, and the unlabelled box in the LED section is the DC-DC 12V-to-3.3V buck converter (none of these are shown in the 3D image above).

The VCC and GND nets are already fully routed to the respective parts of the board; all you need to do is solder the other components into their places.

PS: The third 4-pin connector from the top on the right side of the board was meant for a distance sensor, but I removed it from this version of the robot because the path through the colorectum is fixed, so I figured it might not be needed.

Below is the 3D model of the PCB:

Downloads

Assemble the Electronics

8991739104797_.pic.jpg
9001739104814_.pic.jpg
9041739193233_.pic.jpg
7871735716177_.pic.jpg

After the PCB has been fabricated, we solder the components onto it.

As you can see in the image above, my PCB is slightly different from the design in the last step: it doesn't include the LED, since the demo version I built left that part out. Other than that, the electronics remain the same.

Also, please solder the relays in the orientation shown in the image above. Otherwise, the positive and negative IN and OUT terminals might be mixed up, causing the PCB to malfunction.

For the valves, face the holes away from the PCB and connect each valve to its relay.

There must be one pump for positive pressure and one for negative pressure (they should be set up as in the image above).

Prepare to Train the AI Module

截屏2025-02-09 21.31.56.png
7391735114594_.pic.jpg
9011739106824_.pic_hd.jpg
7731735713881_.pic_hd.jpg

This is just a proof of concept: I haven't done (and am currently unable to do) any clinical experiments on humans. But the robot can still detect anomalies in the tube-like testing environment I set up.

If you want to customize your AI module, you can collect training images by copying the Python file below onto the SD card of the Sentry2 module and renaming it to main.py.

Then power up your Sentry2. Once it finishes initializing, point the module at your target of interest and press the button labeled in red; it will take the photos automatically for you. I recommend taking at least 300 images for the best training results.

After you have taken all the images you need, retrieve them from the Sentry2 and label them using LabelImg. You can name your targets whatever you like.

For the label format, select PascalVOC.

Now enjoy drawing bounding boxes around your targets :)!
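
Before uploading anything for training, it can help to sanity-check the labels. Below is a small, optional Python 3 sketch (standard library only) that counts images, PascalVOC XML files, and bounding boxes per class; it assumes your images and their XML files sit together in a folder named dataset, which is just a placeholder path.

import os
import xml.etree.ElementTree as ET

DATASET_DIR = "dataset"  # placeholder: folder holding your images and their .xml labels

image_names = {os.path.splitext(f)[0] for f in os.listdir(DATASET_DIR)
               if f.lower().endswith((".jpg", ".jpeg", ".png"))}
xml_files = [f for f in os.listdir(DATASET_DIR) if f.lower().endswith(".xml")]

label_counts = {}
unlabeled = set(image_names)
for xml_file in xml_files:
    tree = ET.parse(os.path.join(DATASET_DIR, xml_file))
    unlabeled.discard(os.path.splitext(xml_file)[0])
    for obj in tree.getroot().iter("object"):
        name = obj.findtext("name")
        label_counts[name] = label_counts.get(name, 0) + 1

print("Images:", len(image_names))
print("Label files:", len(xml_files))
print("Bounding boxes per class:", label_counts)
if unlabeled:
    print("Images without a label file:", sorted(unlabeled))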

Train the AI Module

截屏2025-02-09 22.01.00.png
截屏2025-02-09 21.52.15.png
截屏2025-02-09 22.02.12.png

After you finish collecting the data, go to https://developer.canaan-creative.com/ and open "Model Training".

  1. Go to "Dataset" -> "Create Dataset"
  2. For annotation type, select "Image Classification"
  3. After you create your dataset, go to "Configuration"
  4. Then, upload your images and labels. For this step, I prefer to use the archive/compressed-package format (details about this format are in the image above).
  5. After you finish uploading the data, click "train". For "platform", select "K210". For iterations, use 240. Leave the other hyperparameters at their defaults; you can play around with these numbers if you like.
  6. After training finishes, click the "download" button next to your job entry. You will receive a zip package. Save the following files: "anchor.txt", "label.txt", and "det.kmodel".

Code the Robot to Move

flow.png

To control the motion of the robot, the ESP32 board sets the state of the valves, which drives the head/tail balloons to either inflate or deflate and the bellow pipes to either elongate or contract.

Technically, the ESP32 and the Sentry2 are two independent computational units (think of the distributed nervous system of an octopus!). The ESP32 keeps the robot moving forward continuously, but once the Sentry2 detects a polyp, it uses serial communication to signal the ESP32 to stop for a while and then switch to moving backward. While the robot is stopped, the Sentry2 module takes a few images of the anomaly.
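
To make the gait concrete, here is a minimal Python sketch of one possible inchworm cycle: anchor the tail, elongate, anchor the head, release the tail, contract, release the head. The actual timing and the valve-to-relay mapping live in the attached .ino file; the step names, states, and the 1.5-second pause below are only illustrative placeholders.

import time

# One possible forward gait cycle, written as (description, tail balloon, head balloon, bellows).
GAIT_CYCLE = [
    ("anchor tail: inflate tail balloon",   "inflated", "deflated", "contracted"),
    ("advance head: elongate bellow pipes", "inflated", "deflated", "elongated"),
    ("anchor head: inflate head balloon",   "inflated", "inflated", "elongated"),
    ("release tail: deflate tail balloon",  "deflated", "inflated", "elongated"),
    ("pull up tail: contract bellow pipes", "deflated", "inflated", "contracted"),
    ("release head: deflate head balloon",  "deflated", "deflated", "contracted"),
]

STEP_DURATION_S = 1.5  # placeholder pause between steps

def run_cycles(n_cycles=2):
    """Print the balloon and bellows states for n_cycles of the forward gait."""
    for cycle in range(n_cycles):
        for description, tail, head, bellows in GAIT_CYCLE:
            print("cycle %d: %-38s tail=%-9s head=%-9s bellows=%s"
                  % (cycle + 1, description, tail, head, bellows))
            time.sleep(STEP_DURATION_S)

if __name__ == "__main__":
    run_cycles()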

For the code below, upload the .ino file to the ESP32 directly. For the Python file, first adjust the following lines according to the comments:

labels = ["polyp"] # object type names, follow the order in label.txt
anchor = (1.67, 0.80, 1.61, 1.17, 1.92, 1.47, 3.08, 2.28, 4.42, 3.11) # anchors, use the value from the second line of anchor.txt

Where "label.txt" and "anchor.txt" are previously saved.

Then, copy the Python code onto the Sentry2 module's SD card together with "det.kmodel".

Warning: make sure that the Python file below and det.kmodel are the only files on the Sentry2's SD card at this point. If there are other Python files, for example, the module might not work properly.
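
For reference, here is a rough sketch of what a detection loop like the one in the attached Python file might look like. It assumes the Sentry2 runs a MaixPy-style (K210 MicroPython) firmware with the sensor, KPU, and machine.UART modules; the UART pins (10/11), the baud rate, the detection thresholds, and the "STOP" message sent to the ESP32 are all hypothetical placeholders, so treat the attached file as the authoritative version.

import sensor
import KPU as kpu
from machine import UART
from fpioa_manager import fm

labels = ["polyp"]  # follow the order in label.txt
anchor = (1.67, 0.80, 1.61, 1.17, 1.92, 1.47, 3.08, 2.28, 4.42, 3.11)  # from anchor.txt

# Map two GPIOs to UART1 so the Sentry2 can talk to the ESP32 over serial (pins are placeholders).
fm.register(10, fm.fpioa.UART1_TX, force=True)
fm.register(11, fm.fpioa.UART1_RX, force=True)
uart = UART(UART.UART1, 115200, 8, 0, 0, timeout=1000, read_buf_len=4096)

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.run(1)

task = kpu.load("/sd/det.kmodel")          # model saved from the training platform
kpu.init_yolo2(task, 0.5, 0.3, 5, anchor)  # probability threshold, NMS, anchor count, anchors

while True:
    img = sensor.snapshot()
    objects = kpu.run_yolo2(task, img)
    if objects:
        uart.write("STOP\n")  # tell the ESP32 to pause and reverse
        for obj in objects:
            img.draw_rectangle(obj.rect())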

Integrate the AI Module Into the Robot

le box.png
9021739110566_.pic.jpg
9031739110880_.pic.jpg
7511735350207_.pic_hd.jpg

Using the models below, we now print a box to hold the Sentry2 module.

  1. First, screw the lid of the box onto the head of the robot using the four screw holes.
  2. Then, connect the 4-pin Dupont wire to the Sentry2 module.
  3. Then, put the Sentry2 module into the box.
  4. Finally, once all the files you need are on the Sentry2, seal the box.

Now that we have finished assembling the Sentry2 module, the robot has the power to use AI to detect objects.

Create the Electronics Box

WechatIMG759.jpg

Now, we can build a box to contain all the electronics so that they are not scattered throughout the house.

The image above shows the box I built. I used a mix of 3D-printed parts and acrylic, assembled with M3 screws, but you don't have to do the same. You can use cheaper alternatives, such as a cardboard box.

Put the Electronics Into the Box

box_new.jpg
7651735659573_.pic_hd.jpg

After you finish building your electronics box, assemble the electronics and the PCB into the box.

Again, no matter how you design your box, be sure to leave openings so that the 4mm tubes and the power jack for the 12V power source can stick out of the box and connect to the robot outside.

The 4mm tubes and the 2mm tubes have different diameters, so you need a converter to connect the 2mm tubes on the robot to the 4mm tubes at the valves.

Below is the 3D model of the converter.

Downloads

Connect to the Power Supply (and Ready to Go)

9051739198669_.pic_hd.jpg
exp1-7.jpg
exp1-8.jpg
exp1-9.jpg
Inchworm Robot Demo Video

After you finish assembling your box, connect the power slot to your 12V power source.

The system takes a while to initialize (particularly the Sentry2), but after that it works properly.

Watch a demo of how it moves!

The images I attached above are the photos taken by the robot during the experiment shown in the demo video.

Potential Improvements and Final Thoughts

截屏2025-02-10 13.41.15.png

The robot shown in this Instructable is version 2. Currently, I am working on improving the design.

For example, I have redesigned the head and the tail so that instead of clamping the balloon with screws, they now lock the balloon in place with the elastic force of a buckle structure. This should make the robot much easier and faster to assemble, and more robust by further improving its air tightness.

I haven't printed the new version of the robot yet, but hopefully it will be out soon.

Although the robot is intended for colorectal cancer diagnosis, I can envision it being used in other tube-like environments, such as household water pipes. In cases like that, with some specific adjustments, it could detect anomalies such as pipe damage for further repair.

If you built one and used it for another situation or have some other suggestions about the project, please let me know in the comments!