Facial Recognition Using the “Person Sensor” and Pico Explorer

by CLClab306 in Circuits > Sensors



Facial recognition systems have been in use for decades, but the cost and complexity of the required hardware and software have typically put facial recognition beyond the reach of most Makers. However, that situation changed recently with the creation of the “Person Sensor” by Useful Sensors Inc. This novel sensor brings machine-learning capabilities like facial detection and facial recognition to a small (19.3 x 22 mm), relatively inexpensive (< $10) module that can be programmed with low-cost microcontrollers over an I2C interface. Using a small camera and an onboard microcontroller designed for machine-learning applications, the device can detect nearby people’s faces and report (a) how many there are, (b) where they are relative to the sensor, and (c) whether they match any of up to 8 faces it has been taught to recognize. Useful Sensors’ Person Sensor Developer Guide provides detailed information about the sensor as well as several code examples that demonstrate facial detection and facial location, but less information on facial recognition, which the creators say is “still experimental.”

This Instructable describes my approach to exploring facial recognition using the Person Sensor, with the broader goal of helping me (and other Makers) better understand how it might be used in future projects. In my case, the Person Sensor was connected to a Raspberry Pi Pico RP2040 microcontroller mounted on a Pimoroni Pico Explorer Base. I chose the Pico for its low cost, powerful chip, and large number of GPIO pins (25), and because it can be easily programmed with CircuitPython. I was also motivated by the fact that several code examples linked in the Developer Guide were written in CircuitPython, providing a great foundation for writing my code. I chose the Explorer Base for its built-in LCD display, pushbuttons (4), mini-breadboard, and easy access to GPIO pins, all of which made it ideally suited for testing the Person Sensor.

The video shows my best-case test of the sensor’s facial recognition abilities. Before recording the video, I used my program to “teach” the device what each of the last 8 U.S. Presidents looked like by showing it 8 different Presidential Portrait photos (downloaded from Wikipedia and printed on thick matte photo paper). The Person Sensor is barely visible in the upper left corner and the Pico and Explorer Base are in the lower left corner of this video. I used portrait photos because the person is typically looking directly at the camera and is well illuminated. The video depicts a flashcard-style recognition test that involved briefly showing the sensor each of the photos and waiting for the Pico’s answer, which appears in green on the top line of the text display. The second text line shows the sensor’s level of confidence in the identification using a number between 0 (no confidence) and 100 (high confidence). As can be seen, all 8 photos were correctly identified within a few seconds with a high degree of confidence (99%). You will also notice blinking yellow rectangles on the display (usually with a large X in the middle) soon after each photo is presented. The sensor looks for faces up to 5 times/sec and a rectangle appears each time the sensor sees a face. The X indicates the face was looking directly at the sensor. The size of the rectangle indicates proximity to the sensor (bigger = closer) and its x-y location reflects the two-dimensional spatial location of the face relative to the sensor (i.e., above vs. below and left vs. right). More about all of this later on.

Supplies

[Images: Sensor2.jpeg, Qwiic.jpeg, Pico.jpeg, basefront.jpeg]

The first 5 items are required to assemble the basic circuit described in Step 1. Two additional items are listed as "optional" in case you want to add a NeoPixel light ring to increase illumination.

1. Person Sensor V1.0 by Useful Sensors [$9.95]

2. STEMMA QT / Qwiic JST SH 4-pin to Premium Male Headers Cable [$0.95] 

3. Raspberry Pi Pico RP2040 [$4-5]

4. Pimoroni Pico Explorer Base [~$30]

5. Assorted breadboard jumper wires 

6. Optional (requires minor soldering): (a) NeoPixel Ring, (b) 5V DC, 1A power source for the NeoPixel Ring

Step 1: Wire the Circuit

[Images: CircuitDiagram.png, baseinbox.jpeg, baseback.jpeg]

The Explorer Base and Qwiic connector greatly simplify the wiring process (see Circuit Diagram).

  1. Insert the Pico into the female headers in the upper left corner of the Explorer Base with the USB connector oriented toward the outside edge of the base. Make sure the pins align correctly with the headers and apply downward pressure as evenly as possible to avoid bending the pins. [See picture]
  2. Attach the Qwiic cable connector to the connector on top of the Person Sensor. Make sure the holes on the cable connector align correctly with the pins on the sensor's connector. With the sensor oriented as shown in the Circuit Diagram, the black wire should be on the left and the yellow wire on the right.
  3. Insert the male headers on the other end of the Qwiic cable into the upper 2x6 female header in the middle of the Explorer Base as shown in the Circuit Diagram. These wires provide 3.3V power to the sensor and allow I2C communication between the Pico and sensor.
  4. Insert a jumper wire between AUDIO on the upper 2x6 header and GP0 on the lower 2x6 header. This wire (shown in green) allows program control over the piezo speaker on the Explorer Base. Note that there are 7 more GPIO pins available on the lower header (GP1-GP7). One of these pins will be used to control the optional Neopixel ring (see Step 7), but the others can be used for future extensions of this project (e.g., controlling LEDs, servos, etc.). Several other pins are not shown on the 2x6 headers (ADC, SPI, Motors). The backside of the Explorer Base lists how most of the GPIO pins are assigned. [See picture]

Since a consistent sensor orientation facilitates facial recognition testing, I initially mounted the sensor (with the Qwiic connector on top) on an adjustable desktop phone holder using Blu-Tack re-usable adhesive. This required extending the length of the Qwiic connector leads by using female-male jumper wires. A more durable solution for sensor mounting is shown later (see Step 7: Optional Neopixel Ring). Finally, to protect its backside, I placed the Explorer Base in a 3D-printed box (see picture).
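
Once everything is wired, a quick sanity check is to scan the I2C bus from the Pico and confirm the sensor answers at its fixed address (0x62). Below is a minimal sketch, assuming the Explorer Base's I2C header is routed to GP20 (SDA) and GP21 (SCL); it is not part of the project's code.py.

    # Minimal I2C scan to confirm the Person Sensor responds at 0x62.
    # Assumes the Explorer Base's I2C header uses GP20 (SDA) and GP21 (SCL).
    import board
    import busio

    i2c = busio.I2C(board.GP21, board.GP20)   # SCL, SDA
    while not i2c.try_lock():
        pass
    try:
        addresses = i2c.scan()
        print("I2C devices found:", [hex(a) for a in addresses])
        if 0x62 in addresses:
            print("Person Sensor detected at 0x62")
    finally:
        i2c.unlock()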

If your Pico came with loose headers, you can find a nice tutorial for soldering headers here.

Step 2: Install CircuitPython on Your Pico

[Image: Software.1.jpeg]

If CircuitPython is already loaded on your Pico, go to Step 3. If not, I encourage you to check out Adafruit's excellent tutorial for "Installing Circuit Python" on the Pico, especially if you need help troubleshooting this step.

  1. Download the latest version of the CircuitPython UF2 file for Pico here. I used "CircuitPython 8.0.0-beta.6" and did not run into any problems.
  2. Attach one end of a micro USB cable to your computer.
  3. With the Pico unplugged, push and hold down the BOOTSEL button. Don't release the button until you do #4.
  4. Attach the micro USB cable to the Pico and release the button. If everything's working, you should see a new disk drive appear named RPI-RP2. If that doesn't happen, repeat #3 and #4. If RPI-RP2 still does not appear, consult the Adafruit CircuitPython installation guide for troubleshooting suggestions.
  5. Drag the UF2 file downloaded earlier (#1) to RPI-RP2. After a few seconds, RPI-RP2 will disappear and a new disk drive named CIRCUITPY will appear, indicating successful installation of CircuitPython on your Pico.

CIRCUITPY will disappear when you disconnect the USB cable, and it will reappear when you re-connect the cable to your computer. If you eject the CIRCUITPY drive while the cable is connected, disconnect and re-connect the cable to make the drive re-appear.

Step 3: Install the Mu Editor on Your Computer

[Image: Software.2.jpeg]

If you have already installed the Mu Editor or another preferred Python editor on your computer, you can skip this step. CircuitPython coding and troubleshooting are greatly facilitated by using the Mu Editor, a simple Python code editor that works well with CircuitPython-compatible boards. This editor also has a built-in serial console that enables instant feedback from CircuitPython programs during their execution. Again, I encourage you to check out Adafruit's Mu Editor Installation Guide.

  1. Download the Mu Editor Installer and follow the installation instructions for your particular computer (Windows or MacOSX) here.
  2. Double-click the Mu Editor icon to start the program. If everything's working, and the CIRCUITPY drive is still available, you should see a 1-line Python program named "code.py" in the Editor window [print("Hello World!")]. The Mu Editor automatically loaded that file from the CIRCUITPY disk drive.
  3. Open the serial console by clicking the "Serial" icon in the Editor menu bar. The serial console will open at the bottom of the Editor window. If you then click the "Save" icon in the Editor menu bar, it will re-save the code.py file on the CIRCUITPY disk and run the "Hello" program. If everything's working, you will see "Hello World!" appear in the serial console portion of the Editor window (along with a bunch of other stuff).

If you need a little more time learning how the Mu Editor works, check out the Adafruit tutorial here.

Step 4: Install Library Files & Facial Recognition Program on CIRCUITPY

[Images: Software.3.jpeg, files.png]

The facial recognition program needs several different CircuitPython library files to work properly.

  1. Download the latest version of the CircuitPython Library Bundle here. You should select the version of the Library Bundle that matches the version of CircuitPython that you installed on your Pico in Step 2 above (i.e., either for Version 7.x or Version 8.x). I used the Version 8 bundle.
  2. Open the adafruit-circuitpython-bundle-X.x folder and then open the "lib" subfolder.
  3. Locate the following 5 folders/files and drag them to the folder labeled "lib" on your CIRCUITPY disk:
     - adafruit_display_text
     - adafruit_st7789.mpy
     - adafruit_ticks.mpy
     - asyncio
     - neopixel.mpy
  4. Open the Mu Editor and its serial console window.
  5. Download the code.py file (below) and drag it to the CIRCUITPY disk drive on your Pico. This new file will replace any older code.py file stored on the drive.
  6. Click the "Load" icon in the Mu Editor menu bar and load the code.py file from CIRCUITPY. You will need to close any code.py file already open in the Editor window before loading the new code.py file.

When you finish, the CIRCUITPY directory should contain everything shown in the screenshot.

Downloads

Step 5: What's Happening?

[Image: IniitialDisplay.jpeg]

The Facial Recognition program started as soon as code.py was installed on CIRCUITPY (Step 4). If everything’s working, you should see several things happen when you look at the sensor.

First, the green LED on the sensor will light up whenever the sensor detects a face within its 110 degree field of view. The LED, which is controlled by the sensor's microcontroller, becomes active as soon as 3.3V power is applied to the sensor. This LED is especially useful for adjusting the sensor's orientation relative to nearby faces.

Second, as was seen in the video, blinking yellow rectangles appear on the display whenever nearby faces are detected (see picture). The closer the face, the bigger the rectangle. Also, the rectangle contains a large X when the face is looking at the sensor. Although the sensor can sample the visual field 5-7 times/sec, the display's blinking rate is limited to about 2 samples/sec by my program.
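
The rectangle's size and position come directly from the face-box coordinates the sensor reports with each sample. As an illustration (not the project's actual drawing code), the box fields, assumed here to span 0-255 as single bytes per the Developer Guide, can be scaled to the Explorer's 240x240 display like this:

    # Hypothetical helper: scale a reported face box (0-255 range assumed per
    # the Developer Guide) onto the Explorer Base's 240x240 ST7789 display.
    DISPLAY_SIZE = 240

    def scale(value):
        return int(value * DISPLAY_SIZE / 256)

    # Example box reported by the sensor: left, top, right, bottom
    left, top, right, bottom = (scale(v) for v in (64, 80, 160, 200))
    width, height = right - left, bottom - top
    print("Rectangle at", left, top, "size", width, "x", height)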

Third, whenever a large X appears in the box, a green number (-1) appears on the display near Button A. "-1" is the default ID number assigned to an unrecognized face before the sensor has been trained. If an alphabetic character appears instead, press Button A for a second or two, which should return the "-1". More information about the ID label and the number below it will be provided in Step 6, which also explains the bottom display line ("Next ID# = 0").

Finally, the Mu Editor's serial console offers a text version of the data shown on the display. That is, the console shows the ID label and whether the face is looking at the sensor. It also shows the sensor's level of confidence in the x-y coordinates for the rectangle (Box%) on a scale from 0 to 100. The last item, ID%, will be 0 until the sensor has learned to recognize at least one face (see Step 6). The sensor actually collects data on the four largest faces during each sample. However, since facial recognition is only applied to the largest face, this program only shows data for that face in the serial console. Data from any other detected faces can still be shown in the console by uncommenting Line 207.
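
To make that console output more concrete, here is a sketch of one way to read and unpack a single sample over I2C. The result layout (header, face count, four 8-byte face records, checksum) follows the CircuitPython samples linked in the Developer Guide; treat it as an illustration rather than an excerpt from this project's code.py.

    # Hedged sketch: read one result packet from the Person Sensor and print
    # the largest face. Layout follows the Developer Guide's sample code.
    import struct
    import board
    import busio

    SENSOR_ADDRESS = 0x62
    HEADER_FORMAT = "BBH"        # reserved, reserved, data size
    FACE_FORMAT = "BBBBBBbB"     # box confidence, box (l,t,r,b), ID confidence, ID, is_facing
    MAX_FACES = 4
    RESULT_FORMAT = "<" + HEADER_FORMAT + "B" + FACE_FORMAT * MAX_FACES + "H"
    buffer = bytearray(struct.calcsize(RESULT_FORMAT))

    i2c = busio.I2C(board.GP21, board.GP20)   # SCL, SDA
    while not i2c.try_lock():
        pass
    try:
        i2c.readfrom_into(SENSOR_ADDRESS, buffer)
    finally:
        i2c.unlock()

    fields = struct.unpack(RESULT_FORMAT, buffer)
    num_faces = fields[3]
    if num_faces > 0:
        box_conf, left, top, right, bottom, id_conf, face_id, is_facing = fields[4:12]
        print("Faces:", num_faces, "ID:", face_id, "ID%:", id_conf, "Facing:", bool(is_facing))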

Step 6: Teaching the Sensor to Recognize Specific Faces

[Image: Labeled_Buttons.png]

As noted earlier, the Person Sensor can be trained to recognize up to 8 different faces. The sensor's creators call this training process "calibration." Calibration involves sending a command to the sensor to "memorize" the next face it sees and assign it an arbitrary ID number between 0 and 7. Later, if the largest face is recognized as a calibrated face, the sensor returns the assigned ID and it will appear on the display and in the serial console. In my program, the Pico Explorer's pushbuttons are used for calibration (see picture). More specifically, the process involves the following steps:

  1. Use the Y and B buttons as needed to increase or decrease the Next ID# to be assigned.
  2. Orient the sensor toward the face (or picture of a face) that you want to calibrate.
  3. When the display consistently shows a rectangle with an X across several consecutive samples, press Button X to calibrate that face.
  4. The next ID# is automatically increased after calibration to avoid re-assigning a previously calibrated ID#. However, if you want to re-assign a previously calibrated ID#, you can use the B or Y button to set that number as the ID for your new calibration.

Immediately after calibration, the sensor will start sending facial recognition information after each sample. The display will no longer show a -1 next to Button A (see picture). Instead, you will see one of the first 8 letters of the alphabet (A-H), corresponding to ID#'s 0-7, respectively (A = 0, B = 1, etc.). The alpha characters are temporary labels and can be changed to something more meaningful by editing Line 25 in the program. Pressing Button A will erase the sensor's memory of all previously calibrated faces (it's the "reset" button).
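
Under the hood, the X and A buttons simply write single-byte commands to the sensor over I2C. A minimal sketch of those two writes is shown below; the register addresses (0x04 to calibrate the next face under a given ID, 0x06 to erase all stored IDs) are taken from the Developer Guide, so verify them against your copy before relying on this.

    # Hedged sketch of the calibrate / erase commands (register addresses per
    # the Person Sensor Developer Guide -- double-check against your copy).
    import board
    import busio

    SENSOR_ADDRESS = 0x62
    REG_CALIBRATE_ID = 0x04   # the next face seen is stored under the ID written here
    REG_ERASE_IDS = 0x06      # any write erases all calibrated IDs

    i2c = busio.I2C(board.GP21, board.GP20)   # SCL, SDA

    def write_register(register, value):
        while not i2c.try_lock():
            pass
        try:
            i2c.writeto(SENSOR_ADDRESS, bytes([register, value]))
        finally:
            i2c.unlock()

    write_register(REG_CALIBRATE_ID, 0)   # e.g., teach the next face as ID 0 ("A")
    # write_register(REG_ERASE_IDS, 0)    # uncomment to forget all stored faces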

After one or more faces are calibrated, the program provides additional feedback by activating the Explorer's built-in piezo speaker whenever the sensor's confidence in recognizing the largest face is 97 or greater. In addition, the Pico's built-in LED is turned on whenever ID#0 (labeled "A") is recognized with that same high level of confidence. That feature was included here to illustrate how the sensor's facial recognition ability could be used to execute actions only when a particular calibrated face is recognized. The ID% criterion value for activating the built-in LED and piezo can be modified in Line 29.
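
As an illustration of that idea, the fragment below sketches how a recognition result could gate the Pico's onboard LED and beep the Explorer's piezo. The 97% threshold matches the value mentioned above, but the function name and structure are my own simplification, not code taken from the project's code.py.

    # Simplified sketch: act only on a particular calibrated face.
    import time
    import board
    import digitalio
    import pwmio

    led = digitalio.DigitalInOut(board.LED)
    led.direction = digitalio.Direction.OUTPUT

    def beep(frequency=880, duration=0.2):
        # piezo reached through the GP0 jumper wire added in Step 1
        piezo = pwmio.PWMOut(board.GP0, frequency=frequency, duty_cycle=2**15)
        time.sleep(duration)
        piezo.deinit()

    def handle_recognition(face_id, id_confidence):
        if id_confidence >= 97:               # confidence criterion from the text
            beep()
            led.value = (face_id == 0)        # only ID 0 ("A") lights the Pico's LED
        else:
            led.value = False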

At this point, you can experiment with the sensor by calibrating additional faces to see how it performs. Since I could not find 7 other people to participate in my testing, I trained the sensor to recognize multiple faces using portrait photos (see video). An interesting aspect of the sensor’s behavior became apparent when I showed the sensor a new picture after the first calibration. The novel face was initially recognized as the already calibrated face, albeit with a lower level of confidence (85-93%). However, immediately after calibrating the second face, the sensor correctly identified both photos with 99% confidence. The same thing happened as each novel face was introduced.

For privacy reasons, the sensor does not give users access to images used in calibration or to detailed facial biometrics. Rather, it is intended more as a plug-and-play device that provides the kind of facial detection/recognition capabilities described earlier for users who want those features in their applications, but do not want to become experts in vision AI and machine learning.

Step 7: Optional NeoPixel Ring

[Images: Sensor_Ring.png, no-plugs.jpeg, plugs.png, plugs.jpeg, Sensorbase.png, WholeThing.jpeg, NeopixelDIagram.png]

Illumination levels can have a big effect on facial recognition data. I realized this when I found that ID% confidence for my face was reduced at night if I had calibrated my face earlier in the day when daylight was streaming through the window next to my desk. To provide more consistent illumination, I mounted a NeoPixel Ring around the Person Sensor and designed 3D-printed parts to hold both the sensor and ring. The sensor-ring part (PersonSensor_PixelRing.stl--see design picture) was printed using Natural Translucent PLA (0.25 mm layer height, 20% infill). The sensor and ring were then placed into this part (see picture) and held firmly using two additional 3D parts (PixelRing_plug.stl & Sensor_plug.stl--see design picture) printed with a flexible filament (TPU: any color, 0.25 mm layer height, 100% infill). This assembly was then attached (with 6-32 machine screws and nuts) to a desktop base (Sensor_Base.stl--see design picture) printed with PLA (any color, 0.38 mm layer height, 20% infill) to create the final assembly (see picture). All parts were printed without supports on a TAZ5 printer with a 0.5 mm nozzle.

The additional wiring for the NeoPixel Ring is shown in the circuit diagram (see picture). The Pico Explorer's built-in mini-breadboard was used to make connections after soldering male-male header wires to the 3 contacts on the NeoPixel Ring. The positive (+) and negative (-) leads from a 5V-1A power supply were connected to two different columns on the breadboard via a terminal connector. The minus side of the supply was then connected via the breadboard to the second GND pin on the Pico's upper 2x6 header using a jumper wire. The GND and Vcc pins on the NeoPixel Ring were connected to the minus and plus sides of the power supply, respectively. Finally, the D-in pin from the ring was connected to pin GP1 on the Pico's lower 2x6 header. The program sets the NeoPixel default brightness to 0.5 (Line 38), but that value can be changed to make the ring brighter or dimmer. Because NeoPixels draw a lot of current, do not use the Pico's 3.3 V pin as a power source.
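
On the software side, driving the ring takes only a few lines once the neopixel library from Step 4 is installed. The sketch below mirrors the wiring above (data on GP1) and the 0.5 brightness default; the ring size of 16 pixels is an assumption, so set NUM_PIXELS to match your ring.

    # Minimal NeoPixel setup: data on GP1, external 5 V supply for power.
    # NUM_PIXELS = 16 is an assumption -- change it to your ring size.
    import board
    import neopixel

    NUM_PIXELS = 16
    ring = neopixel.NeoPixel(board.GP1, NUM_PIXELS, brightness=0.5, auto_write=True)
    ring.fill((255, 255, 255))   # steady white illumination for calibration and testing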

This is just one solution to the illumination problem and perhaps not the simplest. One obvious disadvantage is that users looking at the sensor are also exposed to a ring of bright lights. One way to reduce that exposure would be to add a switch to the breadboard so that power to the NeoPixel Ring could be supplied only when needed for calibration or facial recognition.

Step 8: Program Notes

A detailed explanation of the program is not provided here, but comments have been inserted throughout the code to facilitate interpretation. Nevertheless, two aspects of the code are worth noting since they may be less familiar.

First, the overall structure of the program involves cooperative-multitasking using the CircuitPython asyncio Library. Doing so allows button presses to be serviced only when they produce an interrupt and otherwise allows the program to spend most of its time sampling the person sensor. I had initially written the program to poll all four buttons on every sensor sampling loop, but found that really slowed down the sensor sampling rate. Since button presses occur relatively infrequently, it was much more efficient to use asyncio. In essence, the program sets up two tasks, one for collecting sensor data and one for monitoring the buttons. These tasks run "simultaneously," although the interrupt-task (catch_buttons) runs in the background. Meanwhile, the primary sensor sampling task (person_sensor) loops in the foreground. A more detailed description of asyncio for CircuitPython can be found here.
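
The overall shape of that structure is sketched below. This is a bare skeleton rather than the project's actual tasks; in particular, the real catch_buttons task waits on button interrupts, which is reduced to a simple sleep placeholder here.

    # Bare skeleton of the two-task asyncio structure described above.
    import asyncio

    async def person_sensor():
        while True:
            # read the Person Sensor over I2C and update the display here
            await asyncio.sleep(0.2)          # roughly 5 samples/sec

    async def catch_buttons():
        while True:
            # watch the Explorer's buttons and handle calibrate / erase / ID changes
            await asyncio.sleep(0.05)

    async def main():
        sensor_task = asyncio.create_task(person_sensor())
        button_task = asyncio.create_task(catch_buttons())
        await asyncio.gather(sensor_task, button_task)

    asyncio.run(main())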

Second, the Person Sensor was designed so that the user can make configuration changes by writing a byte value to a pre-defined register address over the I2C bus, using the sensor's peripheral address (0x62). That is exactly how the facial calibration process is triggered (Line 148) and how all stored IDs are erased (Line 139). Three additional configuration settings are made in Lines 93-95. These settings: (a) set the sensor's Mode to "continuous" (i.e., continuous sampling), (b) enable the ID model, allowing calibration, and (c) enable long-term storage of recognized IDs, even after power is removed from the sensor. That last setting means you can calibrate a Person Sensor with one program, remove power, and then use the trained sensor in a completely different application that retains the same configuration settings but does not allow more calibrations. More detailed information on the sensor's configuration options can be found in the Person Sensor Developer Guide.
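
For concreteness, those three configuration writes might look like the sketch below. The register addresses and values follow the Developer Guide (verify against your copy), and the line numbers cited above refer to the project's code.py, not to this fragment.

    # Sketch of the three configuration writes described above.
    import board
    import busio

    SENSOR_ADDRESS = 0x62
    REG_MODE = 0x01         # 0x01 = continuous sampling
    REG_ENABLE_ID = 0x02    # 0x01 = enable the recognition (ID) model
    REG_PERSIST_IDS = 0x05  # 0x01 = keep calibrated IDs across power cycles

    i2c = busio.I2C(board.GP21, board.GP20)   # SCL, SDA
    while not i2c.try_lock():
        pass
    try:
        for register, value in ((REG_MODE, 0x01), (REG_ENABLE_ID, 0x01), (REG_PERSIST_IDS, 0x01)):
            i2c.writeto(SENSOR_ADDRESS, bytes([register, value]))
    finally:
        i2c.unlock()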

Step 9: Lessons Learned

The introductory video demonstrates the remarkable facial recognition capabilities of the Person Sensor. After teaching the sensor 8 different pictures (i.e. calibrating), the sensor correctly recognized each picture with a 99% confidence level. Moreover, I have replicated this finding multiple times, even after erasing all stored IDs and re-training the sensor. To see whether this outcome would generalize, I tested the sensor with a completely different, more diverse group of portrait photos that included women, Asian-Americans and African-Americans (all are current or recent U.S. Senators--see Line 27 for names). These tests yielded an outcome identical to the initial test. That is, all 8 faces were correctly recognized within a second or two at a 99% confidence level, indicating the outcome of the video demonstration is not unique to that particular set of photos.

The success of facial recognition depends heavily on the similarity of training and testing conditions. For these tests, the NeoPixel ring was turned on and the pictures were presented 8 or 12 inches from the sensor during both calibration and testing. If I changed either the lighting condition or distance between training and testing, overall accuracy and ID% confidence were sometimes reduced. Although several pictures were correctly recognized with 98-99% confidence, others were either correctly identified at lower levels of confidence or completely misidentified. More testing is needed, both with three-dimensional (i.e., real) faces and a variety of testing conditions. At this point, it's safe to say the Person Sensor's facial recognition system will perform better in applications that maximize the similarity between the conditions during calibration and the application's primary use conditions (i.e., illumination level, distance from face, and angle of face). While I am no expert in the capabilities of more complex/costly facial recognition systems, I suspect they share some of the same limitations noted for the Person Sensor.

Although this Instructable has focused on the Person Sensor's facial recognition abilities, it is worth repeating that this device can also: (a) detect faces within its visual field, (b) determine whether the face is looking at the sensor, and (c) provide the face's x-y coordinates, all without any prior training (calibration), making it useful for an even broader range of applications that do not involve facial recognition.

I am interested to see what other Makers learn about the Person Sensor's facial detection, facial location, and facial recognition abilities and, especially, how they incorporate this powerful, yet inexpensive sensor into their own applications.