Using Docker on the Raspberry Pi

by vipercmd in Circuits > Raspberry Pi


Our Makerspace recently introduced a keyless entry system that uses RFID fobs and is monitored by a Raspberry Pi. Putting the hardware together was a straightforward process, but it was the software controlling the authorisation that deserved proper documentation. This is why Docker was introduced into the project.

I will present the benefits of Docker, its installation on the Raspberry Pi, and some sample applications. Items required include:

What Is Docker?

Docker is to services what virtual machines are to physical hardware. A virtual machine contains a fully functional operating system that requires dedicated memory and disk space. A Docker image, however, contains just the binaries and configuration files for a single service or group of services. Docker only requires a recent Linux kernel; it does not need specialised hardware, huge amounts of RAM or disk space, or a high-powered CPU.

There is plenty of documentation, and there are many use cases, available for Docker, so I won't reiterate them here. Instead I will describe the benefits of Docker and its use on the Raspberry Pi.

Why Arch Linux?

Using Docker to contain your service adds very little overhead to the Pi's limited resources. As such, I don't need a bloated operating system to run these images. Arch Linux is a trimmed-down and easily extendable operating system. That said, there is nothing preventing your Docker images from using Raspbian. This is where Docker's portability begins.

Installation

Installing Arch Linux onto the Pi is pretty straightforward: download the image, partition your SD card, copy the image, and boot up the Pi. There are instructions for the original generations of the Pi as well as instructions for the new Pi 2.
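As a rough sketch, the SD card preparation from the Arch Linux ARM guide looks something like the commands below for an original Pi. The device name /dev/sdX and the tarball name are placeholders, so double check them against the official instructions for your model before running anything.

$ fdisk /dev/sdX            # create a small FAT boot partition and an ext4 root partition
$ mkfs.vfat /dev/sdX1
$ mkfs.ext4 /dev/sdX2
$ mkdir boot root
$ mount /dev/sdX1 boot
$ mount /dev/sdX2 root
$ bsdtar -xpf ArchLinuxARM-rpi-latest.tar.gz -C root
$ sync
$ mv root/boot/* boot       # the firmware and kernel live on the FAT partition
$ umount boot root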

Updating and Configuration

Log into your new installation using the username "root" and the password "root". If you're going to interface with other hardware on the GPIO pins using the I2C or SPI protocols, you'll need to enable these modules in Arch Linux.

First, upgrade your installation to the latest Arch Linux packages:

$ pacman -Syu

Now enable the I2C and SPI modules:

$ vi /boot/config.txt

Uncomment these two lines:

device_tree_param=i2c_arm=on
device_tree_param=spi=on

$ vi /etc/modules-load.d/raspberrypi.conf

Add the following line:

i2c-dev

And finally reboot to use your updated installation and new modules:

$ reboot
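
If you want to confirm the modules are available after the reboot, the standard device listings and lsmod should show them:

$ ls /dev/i2c-* /dev/spidev*
$ lsmod | grep -E 'i2c|spi'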

Installing Docker

Now that the operating system has been updated, it is time to install Docker. With Arch Linux it is simply the following command, followed by two more to enable Docker to start on boot and to start it now:

$ pacman -S docker
$ systemctl enable docker
$ systemctl start docker

Docker is installed and running!
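
If you want to confirm everything is wired up, the usual Docker commands should respond without errors:

$ docker version
$ docker info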

Create Our Docker Images

Let's create an image that is based upon the Raspbian Jessie distribution. Even though our Pi is configured with Arch Linux, we can use any other Raspberry Pi-based OS as the base image for our service. I will follow the instructions from Adafruit's NodeJS Embedded Development guide.

The idea behind Docker is to create images that can be reproduced at any time. Since the project uses NodeJS, let's create a NodeJS image just for the Raspberry Pi. Later we'll create the application in a separate Docker image that is based upon this one.

Node JS Image

Create a new directory named "rpi-nodejs" and change into it. Now create a file named "Dockerfile" with the following contents:

FROM resin/rpi-raspbian:jessie
MAINTAINER vipercmd

RUN echo "deb http://apt.adafruit.com/raspbian/ wheezy main" >> /etc/apt/sources.list && \
    apt-get update && \
    DEBIAN_FRONTEND=noninteractive apt-get install -y --force-yes node && \
    apt-get clean

Docker needs to be instructed to read this Dockerfile and create an image. This is done with the following command:

$ docker build -t rpi-nodejs .

This will build a NodeJS image based upon the contents of the current directory ("."). The image will be named "rpi-nodejs".

These instructions are the same as the Adafruit documentation, with some shortcuts. Instead of using curl to bring in the Adafruit repository, I simply added it to sources.list. And finally, as Docker containers run as root by default, the image does not need to include the GPIO-Admin project. You can see this image in your local Docker image repository:

$ docker images
REPOSITORY           TAG      IMAGE ID       CREATED         VIRTUAL SIZE
rpi-nodejs           latest   f9a415a3f49b   2 minutes ago   151.1 MB
resin/rpi-raspbian   jessie   f9b109c91ac9   3 months ago    119.7 MB
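
If you'd like to double check that Node actually made it into the image, you can start a throwaway container from it; this assumes the base image lets the command be overridden in the normal way:

$ docker run --rm rpi-nodejs node --version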

Application Image

Now let's create an image that uses our new NodeJS image for our application. Continuing with the instructions, we need to install an NPM package and add our source code.

Move back to the parent directory of "rpi-nodejs", create a new directory named "rpi-onoff", and change into it. First create a new file named "test.js" with the following contents:

// button is attached to pin 17, the LED to pin 18
var GPIO = require('onoff').Gpio,
    led = new GPIO(18, 'out'),
    button = new GPIO(17, 'in', 'both');

// define the callback function
function light(err, state) {

  // check the state of the button
  // 1 == pressed, 0 == not pressed
  if (state == 1) {
    // turn LED on
    led.writeSync(1);
  } else {
    // turn LED off
    led.writeSync(0);
  }

}

// pass the callback function as the
// first argument to watch()
button.watch(light);

And now create another file named "Dockerfile" with the following contents:

FROM rpi-nodejs
MAINTAINER vipercmd

RUN npm install onoff

WORKDIR /app
ENTRYPOINT ["node", "/app/test.js"]

COPY test.js /app/

The first line of the Dockerfile states that this image will be based upon our newly created rpi-nodejs image. This is an example of reusing common base images for different projects. The benefit is that all your application/service images are based upon the same parent.

The RUN line brings in the onoff dependency via the npm package manager.

Finally, a working directory is set in the image and the test.js script is copied into it. The ENTRYPOINT tells Docker that when the image is run as a container, it should execute the test.js script using node.

To build and run this image:

$ docker build -t rpi-onoff .
$ docker run --privileged -d rpi-onoff

The "privileged" option must be specified because this image requires R/W access to the GPIO pins. Now that the application is running, press the button and watch the LED turn on and off.

Wrap Up

These were very simple examples of using Docker. Docker begins to shine as your Raspberry Pi images become more complex. Each Dockerfile contains the complete set of instructions to build the image. This is much easier than trying to find the documentation on how to rebuild an image, the decisions behind why something was configured a certain way, and so on.

Docker images also stand the test of time. You can experiment with the latest Raspbian, Arch Linux, etc. as the base operating system for your Pi, and these images will never need to be updated; they will continue to use Raspbian Jessie. Right now I am pulling in the latest version of Node and the onoff npm package, but specific versions could be pinned in the Dockerfile.

Docker also supports remote image repositories. This allows you to build and test images locally, push them to a remote registry, and then update each Pi with the latest version of the application. You can also roll back to the previously working image should the newest release fail.
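
As a sketch, pushing a new build to a registry and rolling it out to a Pi might look like the following; registry.example.com/rpi-onoff is a placeholder for your own registry or Docker Hub repository, and the tag is up to you:

$ docker tag rpi-onoff registry.example.com/rpi-onoff:1.0
$ docker push registry.example.com/rpi-onoff:1.0

Then on each Pi:

$ docker pull registry.example.com/rpi-onoff:1.0
$ docker run --privileged -d registry.example.com/rpi-onoff:1.0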