Android Controlled Robot Spy Camera

by danionescu in Circuits > Remote Control


Phone controlled robot camera

This interesting but complicated project covers everything from designing and building a robot, to some advanced Linux configuration on the Raspberry Pi, to building an Android application and controlling the robot with it.

That said, it's more ambitious than the average project, but I think you'll learn a lot by examining some of the ideas here, or even by replicating the entire project.

First we'll build the robot using plexiglass, plastic sheets, DC motors with gearboxes and various electronic components. The device will be able to drive its two front wheels independently and switch its headlight on and off. Then we'll set up the Raspberry Pi powering the robot, configure the project and install its dependencies. Finally we'll build and install an Android app and use it to control the robot remotely over a wifi connection, using the camera feed for guidance.

The technologies and concepts we'll be exploring here:

Development platforms: Arduino, Raspberry Pi, Android

Electronics: H-bridge, using a transistor to drive a big load, infrared sensors

Linux: using Docker, docker-compose, configuring services with systemctl, video streaming

Programming: Android applications, Python, the Arduino language, serial communication, MQTT

Things Required


Parts:

1. Plexiglass sheet

2. Plastic sheet ( you can also use a plexiglass sheet here )

3. Glue

4. Tyre + DC motor with gearbox + bracket (eBay) 13$

5. Small nuts and bolts, hexagonal metal spacers

6. 2 x caster wheels (free to roll in any direction)

7. Small LED flashlight (it will be transformed into a headlight)

8. Arduino pro mini 328p (eBay) 2 $

9. 2 x infrared obstacle sensor

10. PCB

11. NPN transistor (to drive the flashlight)

12. L7805CV 5V regulator

13. L298 H-bridge

14. 220 Ohm resistor

15. Male usb connector

16. Male micro usb connector

17. Various wires

18. 3.3 V regulator (for the serial line between the Arduino and the Raspberry Pi)

19. Male & female PCB connectors

20. On / off switch

21. XT-60 female LiPo connector (eBay) 1.2$

22. 2S 1300 mAh LiPo battery with XT-60 connector

23. 5v battery pack

24. Raspberry Pi 3

25. Raspberry Pi SD card (at least 4 GB)

26. Raspberry Pi case

27. Raspberry Pi camera

Tools:
1. USB to serial FTDI adapter FT232RL to program the Arduino Pro Mini

2. Arduino IDE

3. Drill

4. Fine blade saw

5. Screwdrivers

6. Soldering iron

7. Wire cutter

Skills:

1. Soldering, check this tutorial

2. Basic arduino programming, this tutorial might be useful

3. Linux service configuration, package installation

Building the Robot Platform


The robot we're going to build is going to have the following specifications:

- It will have traction on the front wheels, driven by two separate DC motors

- the back wheels should be able to swivel freely through 360 degrees

- the direction will be controlled by varying the speed of the front wheels, so no separate steering mechanism is needed; the robot will also be able to rotate on the spot

- it will have a light on top

- it should have enough room for the electronics, the batteries, and a Raspberry Pi case with a camera

- a few cm of ground clearance is needed to overcome small obstacles

Don't forget to check the images for important details and building tips.

We're going to build the robot from plexiglass or hard plastic; I've used both, but you can choose whichever you prefer.

The base plate will be 18 x 13 cm. The DC motors will be attached to the base plate with metal brackets, nuts and bolts. The H-bridge will be mounted in the middle of the plate, facing the floor. The back wheels will be attached using 2 cm hexagonal metal spacers (one side male, one side female).

A big hole near the H-bridge is needed to route the wiring up to the electronics on the top side.

The top part of the robot will consist of two plates forming an "L" shape: one will be 12 x 13 cm and the other 6.5 x 13 cm. The plastic plates will be glued together. These plates will cover the electronics, provide a place to mount the headlight and support the Raspberry Pi case. The top part will be attached to the bottom part using 6 cm hexagonal metal spacers.

Building the Electronics

sketch.png
micto-usb-wireing.png
usba-tousbb.png

Pinout (arduino):


Led flashlight: D3

Left motor: PWM (D5), EN1, EN2(A4, A5)

Right motor: PWM (D6), EN1, EN2(A3, A2)

Infrared sensors: Front (A0), Back(A1)

Raspberry pi communication pins: Tx: D11, Rx: D10


Building the PCB, assembly

1. In the last step we already mounted the H-bridge on the floor side of the robot. We also need to install the two infrared sensors, one at the front and one at the back. We're going to mount them on the chassis using small L-shaped metal plates with two holes each, fixed with nuts and bolts. The sensors will sit in the middle of the chassis, one at the front and one at the back.

2. Next, the headlight. I used a 5 volt LED flashlight for this: I cut the flashlight down to just the "head" part and soldered two wires to power it. Then I glued the headlight to the middle of the robot's top plate, drilled a hole next to it, ran the cables through the hole and soldered a small two-wire female connector to them.

3. Assembling the Raspberry Pi case. You will need a Raspberry Pi, a Pi camera, a memory card of at least 4 GB and a Pi camera ribbon cable. Insert the card with the latest Raspbian installed, then carefully plug one end of the camera cable into the Raspberry Pi and the other into the camera, and close the case.

If you don't know how to install the Raspbian operating system, check this link.

For more information on how to install the camera and enable it check this official article.

4. Building the PCB with the main electronic components. I've attached the Fritzing schematic in .fzz format and as a picture. You can use it as a reference on how to build the electronics.

Soldering steps:

a. Cut the female PCB connectors: two 12-pin and two 5-pin connectors for the microcontroller, two 3-pin connectors for the IR sensors, a 6-pin connector for the H-bridge and a 3-pin connector for the Raspberry Pi communication (ground, TX, RX)

b. After all the connectors are cut, they must be soldered on the back of the PCB

c. Solder the KF301-2P connector

d. Solder the NPN transistor and the corresponding resistor to its base

e. Solder the L7805CV 5V regulator

f. Solder the 3.3 volt regulator on the Arduino-to-Raspberry Pi TX line

g. Solder the male pins to the Arduino Pro Mini

h. Solder all the red (+), black (-) and white (signal) thin wires according to the Fritzing schematic

i. Solder a male connector to the PCB for the flashlight

5. Connectors

a. Build a connector from the 5V USB battery pack to the Raspberry Pi and the Arduino. You will need a male USB type A connector, a male micro USB connector, black and red wires, heat shrink tubing and a female-to-female breadboard jumper. First cut the female-to-female jumper in two; these parts will go onto the Arduino's negative and positive male pins. The type A USB connector will feed power to the Arduino and also to the Raspberry Pi through the micro USB connector. Check the images for USB soldering tips.

b. Build the connector from the LiPo battery to the electronics board. You will need an XT-60 female LiPo connector, red and black wires, heat shrink tubing and a small switch capable of handling 10 A. The black wire is connected directly from the XT-60 to the electronics board (KF301-2P plug-in screw connector); the red wire is connected through the small switch.

c. Connect the two IR sensors of the robot to the corresponding female connectors on the PCB using female - male breadboard connectors

d. Connect the H-bridge to the 6-pin female connector on the PCB using male-to-female breadboard jumpers

e. Connect the motors positive and negative terminals to the H-bridge

f. Connect the H-bridge main power supply to the KF301-2P plug in screw connector on the PCB

6. Before placing the Arduino on the PCB, double check everything with a magnifying glass and a multimeter


Arduino Code

First I need to answer an important question: why does an intermediary Arduino layer have to exist at all, instead of connecting the Pi directly to the electronics?

1. It's more modular; you can reuse the Arduino robot in another project without the Pi

2. For safety, it's cheaper to replace a 3$ Arduino Pro Mini than a 35$ Pi

3. An Arduino isn't interrupted by an operating system the way the Pi is, so it's more efficient at implementing PWM control for the motors and polling the front and back sensors a few times per second

4. If an error occurs in the python script, the robot might run forever, draining the batteries and possibly damaging them or catching fire if unsupervised; a safeguard in the Arduino sketch is more reliable because it does not depend on an operating system

5. It's easier to debug a decoupled system


Ok, so with the "why" covered, I'll explain the Arduino sketch a bit. It basically does two things:

1. It receives motor and light commands over the serial line and drives the motors or toggles the light

For example:

* "M:-25:16;" means (-25 left), and (16 power), it wil translate to left motor 17% and right motor 32%, and direction forward
* "M:44:19;" means (44 to the right) and (19 power) it will translate to: left motor 38%, right motor 5% and direction forward

* "L:1;" means lights and "L:0" lights off

2. It polls the infrared sensors from the back and the front of the robot and sends data about distances through the serial line

First you'll need to download and install the TextMotorCommandsInterpretter library (see the link in the code comment below):

The main code is located in the github repository here, or you can copy and paste it from below.

Upload the code to the Arduino using the FTDI adapter. Now you can give the robot commands to see it work: connect to the second serial line and send motor or light commands through it. One way to do this is to use a bluetooth module like the HC-05, connect it to a phone with a bluetooth serial application, and send commands like "L:1;".
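If you have the FTDI adapter wired to that second serial line, a few lines of Python on the laptop can send the same commands. This is only a quick test sketch and not part of the project: it assumes the pyserial package is installed (pip install pyserial) and that the adapter shows up as /dev/ttyUSB0, so adjust the port for your system.

# hypothetical serial test script, not part of the repository
import time
import serial

ser = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)  # adjust the port to your FTDI adapter
time.sleep(2)                  # give the adapter a moment after opening the port
ser.write(b'L:1;')             # headlight on
time.sleep(1)
ser.write(b'L:0;')             # headlight off
ser.write(b'M:0:30;')          # drive straight ahead at 30 power
time.sleep(0.2)
ser.write(b'M:0:30;')          # resend before the 300 ms safeguard stops the motors
print(ser.read(64))            # should show distance reports like b'F=3:B=7;'
ser.close()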

// source for TextMotorCommandsInterpretter: "https://github.com/danionescu0/arduino/tree/master/libraries/TextMotorCommandsInterpretter"

#include <SoftwareSerial.h> 
#include <TextMotorCommandsInterpretter.h>

const char MOTOR_COMMAND = 'M';
const char LIGHT_COMMAND = 'L';
/**
* how long a motor command will take effect, in ms
* an incoming motor command will last for maxDurationForMottorCommand
* unless it is reset by another motor command
*/
const long maxDurationForMottorCommand = 300;
// adjust this value to limit robot speed
const byte maxPwmValue = 230;
// How long between successive distance transmissions in ms
const long transmitingInterval = 500;
const int maxObstacleDetection = 1000; // analog read max detection value
const int minObstacleDetection = 500; // analog read min detection value
const byte FLASH_PIN = 3;
const byte RIGHT_MOTOR_PWM_PIN = 5;
const byte RIGHT_MOTOR_EN1_PIN = A4;
const byte RIGHT_MOTOR_EN2_PIN = A5;
const byte LEFT_MOTOR_PWM_PIN = 6;
const byte LEFT_MOTOR_EN1_PIN = A3;
const byte LEFT_MOTOR_EN2_PIN = A2;
const byte FRONT_DISTANCE_SENSOR = A0;
const byte BACK_DISTANCE_SENSOR = A1;

SoftwareSerial masterComm(11, 10); // RX, TX
TextMotorCommandsInterpretter motorCommandsInterpretter(-50, 50, -50, 50);
String currentCommand;
long lastCheckedTime;
long lastTransmitTime;
boolean inMotion = false;

void setup() 
{
    Serial.begin(9600);
    masterComm.begin(9600);
    masterComm.setTimeout(10);  
    pinMode(FLASH_PIN, OUTPUT);
    pinMode(LEFT_MOTOR_PWM_PIN, OUTPUT);
    pinMode(LEFT_MOTOR_EN1_PIN, OUTPUT);
    pinMode(LEFT_MOTOR_EN2_PIN, OUTPUT);
    pinMode(RIGHT_MOTOR_PWM_PIN, OUTPUT);
    pinMode(RIGHT_MOTOR_EN1_PIN, OUTPUT);
    pinMode(RIGHT_MOTOR_EN2_PIN, OUTPUT);
    lastCheckedTime = millis();
    lastTransmitTime = millis();
}

void loop() 
{
    if (masterComm.available() > 0) {   
        currentCommand = masterComm.readString();
        processCommand();
    }
    if (inMotion && millis() - lastCheckedTime > maxDurationForMottorCommand) {
        stopMotors();
    }
    if (millis() - lastTransmitTime > transmitingInterval) {
        lastTransmitTime = millis();
        masterComm.print(getObstacleData());
        Serial.print(analogRead(BACK_DISTANCE_SENSOR));Serial.print("---");
        Serial.println(getObstacleData());
    }
    /* FOR DEBUG
    motorCommandsInterpretter.analizeText("M:-14:40;");
    Serial.write("Left==");Serial.println(motorCommandsInterpretter.getPercentLeft());
    Serial.write("Right==");Serial.println(motorCommandsInterpretter.getPercentRight());   
    delay(10000);*/
}

String getObstacleData()
{
    int frontDistance = analogRead(FRONT_DISTANCE_SENSOR);
    int backDistance = analogRead(BACK_DISTANCE_SENSOR);
    frontDistance = map(frontDistance, maxObstacleDetection, minObstacleDetection, 0, 10);
    backDistance = map(backDistance, maxObstacleDetection, minObstacleDetection, 0, 10);

    return String("F=" + String(frontDistance) + ":B=" + String(backDistance) + ";");
}

void processCommand() 
{
    switch (currentCommand.charAt(0)) {
        case (MOTOR_COMMAND):
            steerCar();
            break;
        case (LIGHT_COMMAND):
            toggleLight(currentCommand.charAt(2));
            break;
    }
}

void steerCar() 
{
    motorCommandsInterpretter.analizeText(currentCommand);
    float percentLeftMotor = motorCommandsInterpretter.getPercentLeft();
    float percentRightMotor = motorCommandsInterpretter.getPercentRight();
    Serial.write("Left=");Serial.println(percentLeftMotor);
    Serial.write("Right=");Serial.println(percentRightMotor);
    setMotorsDirection(motorCommandsInterpretter.getDirection());
    analogWrite(LEFT_MOTOR_PWM_PIN, percentLeftMotor * maxPwmValue);
    analogWrite(RIGHT_MOTOR_PWM_PIN, percentRightMotor * maxPwmValue);    
    inMotion = true;
    lastCheckedTime = millis();
}

void setMotorsDirection(boolean forward)
{
    if (forward) {
        digitalWrite(LEFT_MOTOR_EN1_PIN, HIGH);
        digitalWrite(LEFT_MOTOR_EN2_PIN, LOW);
        digitalWrite(RIGHT_MOTOR_EN1_PIN, HIGH);
        digitalWrite(RIGHT_MOTOR_EN2_PIN, LOW);
    } else {
        digitalWrite(LEFT_MOTOR_EN1_PIN, LOW);
        digitalWrite(LEFT_MOTOR_EN2_PIN, HIGH);
        digitalWrite(RIGHT_MOTOR_EN1_PIN, LOW);
        digitalWrite(RIGHT_MOTOR_EN2_PIN, HIGH);
    }
}

void stopMotors()
{
    Serial.println("Stopping motors");
    analogWrite(LEFT_MOTOR_PWM_PIN, 0);
    analogWrite(RIGHT_MOTOR_PWM_PIN, 0);
    inMotion = false;
}

void toggleLight(char command)
{
    Serial.println("Toggle light");
    if (command == '1') {
        digitalWrite(FLASH_PIN, HIGH);
    } else {
        digitalWrite(FLASH_PIN, LOW);
    }
}

Installing and Configuring the Raspberry Pi Project and Dependencies

flow-diagram.png

How does it work:

The attached flow diagram illustrates what's explained below.

a. The Android app shows the uv4l stream inside a webview. The uv4l process runs on the Raspberry Pi, captures video input from the camera and streams it. It's an awesome tool with many features

b. Using the controls inside the Android app, light and motor commands are published to the MQTT server

c. The python server inside the docker container on the Raspberry Pi listens for MQTT commands and passes them over the serial interface to the Arduino. The Arduino board drives the motors and the light.

d. The Arduino measures the distances in front of and behind the robot and sends the data over the serial interface to the python server; the python server forwards them over MQTT, and they get picked up by the Android app and shown to the user
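To make steps c and d more concrete, here is a heavily simplified sketch of the bridging logic. It is not the actual server from the repository; the topic names, credentials and serial port are placeholders (the real values live in config.py), and it assumes the paho-mqtt (1.x) and pyserial packages are installed.

# simplified MQTT <-> serial bridge sketch (illustrative only, not the repository's python server)
# topic names, credentials and serial port below are placeholders
import serial
import paho.mqtt.client as mqtt

ser = serial.Serial('/dev/ttyS0', 9600, timeout=0.1)

def on_connect(client, userdata, flags, rc):
    client.subscribe('robot/commands')          # e.g. "M:-25:16;" or "L:1;" coming from the app

def on_message(client, userdata, msg):
    ser.write(msg.payload)                      # forward the command unchanged to the Arduino

client = mqtt.Client()                          # paho-mqtt 1.x style callbacks
client.username_pw_set('user', 'your_password')
client.on_connect = on_connect
client.on_message = on_message
client.connect('localhost', 1883)
client.loop_start()

while True:
    report = ser.read_until(b';')               # the Arduino sends "F=3:B=7;" every 500 ms
    if report:
        client.publish('robot/distance', report)   # picked up and displayed by the Android app

The real server adds error handling and reads its settings from config.py, but the data flow is the same.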

First, you'll need a fully installed and configured Raspbian on the Raspberry Pi, with the camera physically connected and enabled. All the configuration will be done over ssh, so it's worth getting that set up as well.

We can't cover all the basic things here, but do try these links if necessary:

If you don't know how to install the Raspbian operating system check this link.
For more information on how to install the camera and enable it check this official article.

For how to configure ssh on Raspbian, check this.

If you wish to control your robot with the Android app from outside your wifi network, you should consider port forwarding on your router; otherwise you'll be restricted to using the local IP address inside your wifi network.

To find out your local ip address on the raspberry pi use "ifconfig":

ifconfig

.........

eth0      Link encap:Ethernet  HWaddr b8:27:eb:16:e7:ff  
          inet6 addr: fe80::ff00:f22f:9258:b92b/64 Scope:Link
          UP BROADCAST MULTICAST  MTU:1500  Metric:1
          RX packets:0 errors:0 dropped:0 overruns:0 frame:0
          TX packets:0 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000 
          RX bytes:0 (0.0 B)  TX bytes:0 (0.0 B)

........

wlan0     Link encap:Ethernet  HWaddr 00:c1:41:00:10:f6  
          inet addr:192.168.0.102  Bcast:192.168.0.255  Mask:255.255.255.0
          inet6 addr: fe80::e1f4:5112:9cb2:3839/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:1678396 errors:0 dropped:0 overruns:0 frame:0
          TX packets:381859 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000 
          RX bytes:259428527 (247.4 MiB)  TX bytes:187573084 (178.8 MiB)
.....

We're interested in the wlan0 inet addr, in our case "192.168.0.102".

The ports to be forwarded (by default) are 9090 for uv4l and 1883 for mosquitto. You can forward these ports to the same external ports (if they aren't blocked by your internet provider's firewall) or to different ones.

Port forwarding is done differently on every router; here is a tutorial, and you can also search google for "port forwarding your_router_model" for more relevant results.

Prerequisites:

a. install git using command line

b. clone the github project in the home folder:

The folder location is important because in docker-compose.yml the location is hard coded as: /home/pi/robot-camera-platform:/root/debug. If you need to change the location, please change the value in docker-compose.yml too.

git clone https://github.com/danionescu0/robot-camera-platform.git

c. disable the pi's serial console, if you don't know how to do that, check this link

Install uv4l streaming:

chmod +x uv4l/install.sh
chmod +x uv4l/start.sh
sh ./uv4l/install.sh

If this fails or you need to find out more details about uv4l, check this tutorial.

Configuration:

a. by editing uv4l/start.sh you can configure the following aspects of the video streaming: password, port, frame rate, width, height, rotation and some other minor aspects

b. edit config.py and replace password with your own password that you've set on the mosquitto server

c. edit docker-container/mosquitto/Dockerfile and replace this line

RUN mosquitto_passwd -b /etc/mosquitto/pwfile user your_password

with your own user and password for mosquitto

d. edit config.py and replace the serial port and baud rate with your own; I recommend keeping the baud rate though. If you do change it, don't forget to change it in the Arduino sketch too
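For orientation, the values edited in steps b and d end up looking roughly like the sketch below; the exact variable names in the repository's config.py may differ, so treat this only as an illustration of what to change:

# illustrative config.py values (check the real file for the exact variable names)
mqtt_user = 'user'                  # must match the user baked into the mosquitto Dockerfile
mqtt_password = 'your_password'     # must match the password set with mosquitto_passwd
serial_port = '/dev/ttyS0'          # the Pi's serial port, free once the serial console is disabled
serial_baud_rate = 9600             # keep 9600 unless you also change it in the Arduino sketch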

Test uv4l installation
a. Start it:

sh ./uv4l/start.sh 

b. Test it in the browser at the address: http://your_ip:9090/stream

c. Stop it

sudo pkill uv4l

Install docker and docker-compose

About docker installation: https://www.raspberrypi.org/blog/docker-comes-to-...

About docker-compose installation: https://www.raspberrypi.org/blog/docker-comes-to-...

Auto starting services on reboot/startup

By making the services start automatically you eliminate the need to log in manually over ssh and activate all the services by hand; we're going to do this using systemctl.

a. Copy the files from the project's systemctl folder to /etc/systemd/system/

b. Enable services

sudo systemctl enable robot-camera.service

sudo systemctl enable robot-camera-video.service

c. Reboot

d. Optional, check status:

sudo systemctl status robot-camera.service
sudo systemctl status robot-camera-video.service

Configuring and Building the Android Application

We're almost done; in this step we're going to install the Android application. These are the prerequisites:

1. Clone the github project:

git clone  https://github.com/danionescu0/android-robot-came...

The next steps involve setting up your environment; I'll just enumerate them and give a link to a specialized tutorial in case you don't know how to do it.

2. Enable developer options on your android phone. You can find out more here: https://developer.android.com/studio/debug/dev-opt...

3. Download and install Android studio: https://developer.android.com/studio/index.html?ut... and this https://www.javaworld.com/article/3095406/android/...

4. Import project : https://developer.android.com/studio/intro/migrate...

Now we're going to configure the streaming and MQTT credentials:

5. Edit ./app/src/main/values/passwords.xml and configure MQTT and streaming

The MQTT host should be something like: http://your_ip:1883

The streaming host should be something like: http://your_ip:9090/stream

6. Upload and run the application

Using the Robot and Debugging


Using the app

The application has a single main screen: the streaming image is displayed on the left, and the controls are on the right. The main control is a steering wheel; touch the steering wheel in the direction you wish the robot to move. Below the steering wheel there is a headlight button; touch it to toggle the light.

In the top right corner there is a text like : "- Batt Connected".

* The first dash means no obstacles; if there is an obstacle in front of or behind the robot, it will be signaled with a small arrow pointing forwards or backwards.

* The "Batt" status is not implemented yet.

* "Connected" means that MQTT server is connected so the robot can be used, the other possible value is "Disconnected"

Debugging can be done on multiple layers:

1. On the arduino layer

- Connect the FTDI adapter between the laptop and the second serial line (RX to pin 11 and TX to pin 10) and issue motor and light commands to see if the robot responds

- Double check the connections; if both motors move backwards, reverse both motors' wires; if only one motor moves backwards, reverse that motor's wires

- Check if the arduino is connected properly to the H-bridge, check this link for more information

2. On the Raspberry Pi layer

- Check docker is running the two containers (mosquitto and python server)

pi@raspberrypi:~ $ docker ps
CONTAINER ID   IMAGE                           COMMAND                  CREATED        STATUS       PORTS                    NAMES
473a56da2230   dockercontainer_python-server   "python /root/debu..."   9 months ago   Up 4 hours                            dockercontainer_python-server_1
3e0b1933d310   robot-camera-mosquitto          "/usr/bin/entry.sh..."   9 months ago   Up 4 hours   0.0.0.0:1883->1883/tcp   dockercontainer_mosquitto_1

- Check that the processes are running on the expected ports; you should look for 9090 (streaming) and 1883 (mosquitto)

pi@raspberrypi:~ $ netstat -nltp
(Not all processes could be identified, non-owned process info will not be shown, you would have to be root to see it all.)
Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address    Foreign Address    State    PID/Program name
tcp        0      0 0.0.0.0:9090     0.0.0.0:*          LISTEN   -
tcp        0      0 0.0.0.0:5900     0.0.0.0:*          LISTEN   -
tcp        0      0 0.0.0.0:8080     0.0.0.0:*          LISTEN   -
tcp        0      0 0.0.0.0:22       0.0.0.0:*          LISTEN   -
tcp6       0      0 :::1883          :::*               LISTEN   -
tcp6       0      0 :::5900          :::*               LISTEN   -
tcp6       0      0 :::22            :::*               LISTEN   -

- Check that the serial port exists, that it's the correct one, and that it's the one specified in the project's config.py

pi@raspberrypi:~ $ ls -l /dev/ttyS0
crw-rw---- 1 root dialout 4, 64 Jan 14 19:59 /dev/ttyS0

- Stop the docker process and manually connect to serial using picocom

Then issue motor and light commands directly to see if the robot responds

sudo systemctl stop robot-camera.service

picocom -b 9600 /dev/ttyS0

# now issue motor and light commands to test the robot

- Check if the serial console is deactivated on the raspberry pi

- Check that the streaming and MQTT are accessible from outside the Raspberry Pi, using a mosquitto client for MQTT and a web browser for the streaming (a short Python check is sketched below)
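If you don't have the mosquitto command line clients at hand, a short paho-mqtt script can serve as the connectivity check. This is just a sketch; the host and credentials are placeholders you should replace with your own:

# minimal MQTT connectivity check (paho-mqtt 1.x; host and credentials are placeholders)
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    # rc == 0 means the broker accepted the connection and the credentials
    print('connected' if rc == 0 else 'connection failed, rc=%d' % rc)

client = mqtt.Client()
client.username_pw_set('user', 'your_password')
client.on_connect = on_connect
client.connect('192.168.0.102', 1883)   # your Pi's local IP, or the public IP if port-forwarded
client.loop_forever()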

3. Android application layer

- Check all the necessary steps to enable the phone into debugging mode (check this out)

- Make sure you've set up the passwords and endpoints correctly in the passwords.xml

- Check that the streaming and MQTT are accessible from outside the Raspberry Pi, using a mosquitto client for MQTT (or the Python sketch above) and a web browser for the streaming

- See the top right corner of the app and check for "Connected"

- Use the android debugger to check for errors

Other Use Cases, Extending the Code

Robot object follower with computer vision

Another use case is an object-following robot. The robot will follow an object of a specific color and within a size threshold.

Check above for the video.

Because this is out of the scope of this tutorial, I'll just give you some hints:

- for this to work you won't need the video streaming, MQTT or docker installed

- make sure python 3.x is installed though

- install python requirements from requirements.txt

sudo pip3 install -r /home/pi/robot-camera-platform/navigation/requirements.txt

- In config_navigation.py you'll find:

hsv_bounds = (

    (24, 86, 6),
    (77, 255, 255)
)
object_size_threshold = (10, 100)

HSV means hue, saturation, value; for our color object detection to work it needs a lower and an upper bound, and the object's color will have to fall in this range to be detected. Here you can find a visual HSV object threshold detector.

The object size threshold means the smallest and largest object radius (as a percentage of the frame width) that will be considered a detection (see the OpenCV sketch after this list).

- install and configure VNC (more information of how to install VNC here)

- Run the object tracking script in VNC graphical interface in a terminal. This will enable you to view the video, with a circle drawn over it. The circle means that the object has been detected.

python3 object_tracking.py colored-object --show-video

OR Run the object tracking script with no video output:

python3 object_tracking.py colored-object
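As a rough illustration of how the two settings above are typically used with OpenCV, the detection logic boils down to something like the sketch below. This is not the repository's object_tracking.py, just a minimal example assuming OpenCV 4 and numpy are installed:

# sketch of HSV thresholding plus size check (illustrative, not the repo's object_tracking.py)
import cv2
import numpy as np

lower, upper = (24, 86, 6), (77, 255, 255)     # hsv_bounds from config_navigation.py
min_pct, max_pct = 10, 100                     # object_size_threshold, radius as % of frame width

def detect(frame):
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower), np.array(upper))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 signature
    if not contours:
        return None                            # nothing of that color in the frame
    (x, y), radius = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
    radius_pct = 100.0 * radius / frame.shape[1]
    if min_pct <= radius_pct <= max_pct:
        return int(x), int(y), int(radius)     # detection: circle centre and radius in pixels
    return None

When a detection is returned, the navigation code can steer towards the circle's centre, which is what the circle drawn in --show-video mode visualizes.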

I think the main advantage of this platform is its versatility; it can easily be adapted to other interesting uses, some of my ideas being:

- Following a person by face recognition

- Patrolling and reporting movement

- Replace the wifi with a 3G modem and the robot can be controlled outside, maybe exploring dangerous zones

This being a very complex project, I assume there will be errors; I would appreciate it if you ask me anything in the comments area.

If you like my project, please subscribe to my Instructables channel and to my YouTube channel for more interesting stuff.