How I Made an AI to Watch My Cats Poop for Me

by estefannie in Circuits > Raspberry Pi



Why I took 50,000 pictures of my cats pooping

I made a poop monitor for my cats. Like a baby monitor but a poop monitor. Watch the video for context :]

Supplies

  • Pimoroni's thermal camera (MLX90640)
  • Raspberry Pi 4 (8GB)
  • NoIR camera
  • IR LEDs
  • 2N3906 transistor
  • Resistors
  • PLA filament
  • Custom Vision
  • PATIENCE

The Website

The first thing I did was set up my Pi as a server to host my website, because I wanted to use this as a monitor and be able to see the data and live feed from anywhere in my house.

The back-end

Installing Node.js on the Raspberry Pi:

1. Checked the ARM version of my Pi:

$ uname -m

2. Went to the Node.js downloads page, found the right version for my Pi (ARMv7), and copied the download link.

3. In the terminal I pasted the download link to download:

$ wget https://nodejs.org/dist/v12.19.0/node-v12.19.0-linux-armv7l.tar.xz

4. Extracted the archive:

$ tar -xvf node-v12.19.0-linux-armv7l.tar.xz

5. Copied Node to /usr/local:

$ cd node-v12.19.0-linux-armv7l/
$ sudo cp -R * /usr/local/

Done. You can run node -v or npm -v to check the version and verify that the install worked.

The front-end

The HTML is hosted on the same Raspberry Pi as the back-end, so any device connected to the same network as my Raspberry Pi can see it. I also used a bit of JavaScript to make the tables and filters.



Cat Detection

To determine which cat is using the litter box, I first tried object recognition with OpenCV. I didn't realize what a pain it would be to build and install on a Pi, and I didn't even end up using it, but I am including it in this Instructable in case someone wants to stick with the open-source option.

I followed these instructions to install it on a Pi. I don’t really recommend doing OpenCV on a Pi, but if you’re going to do this, do it with the Pi 4, not the 3B. The 3B took 7 hours to build OpenCV and the Pi 4 took 1 hour. Also, the Jetson Nano comes with it already installed and ready to use. It is more expensive, but it saves time.

My OpenCV models each took about a week to train on my personal computer and didn’t really work well.

The option that worked for me (and the fastest) was Custom Vision. It only took me a couple of afternoons (because of my learning curve) to train the models on their servers and get the predictions working. I did the free trial and really liked it: it was very easy to use, I liked their UI, and the Python on the Pi was simple to write once I had the trained models on their servers.
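Once the trained models are on their servers, the Pi just sends each photo to the prediction endpoint over HTTPS and reads back the tag probabilities. Here is a minimal sketch of that call using requests (the endpoint, project ID, published iteration name, and key are placeholders you get from the Custom Vision portal, and the API version in the URL is an assumption):

import requests

# Placeholders -- copy your own values from the Custom Vision portal
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
PROJECT_ID = "<project-id>"
PUBLISHED_NAME = "<published-iteration-name>"
PREDICTION_KEY = "<prediction-key>"

def classify_photo(image_path):
    """Send one litter box photo to Custom Vision and return the best tag."""
    url = (f"{ENDPOINT}/customvision/v3.0/Prediction/{PROJECT_ID}"
           f"/classify/iterations/{PUBLISHED_NAME}/image")
    headers = {
        "Prediction-Key": PREDICTION_KEY,
        "Content-Type": "application/octet-stream",
    }
    with open(image_path, "rb") as image:
        response = requests.post(url, headers=headers, data=image)
    response.raise_for_status()
    predictions = response.json()["predictions"]
    best = max(predictions, key=lambda p: p["probability"])
    return best["tagName"], best["probability"]

Whichever tag comes back with the highest probability is what decides whose stats to update.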

Training the models

To train the models I had to get thousands of pictures of my cats using the bathroom.

I needed positive and negative models. The Cat 1 positive model is trained with pictures of Cat 1 using the litter box. The Cat 1 negative model is trained with pictures of Cat 2 using the litter box, blurry pictures, an empty litter box, my hands cleaning the litter box, and the occasional foot pic lol. The same goes for the Cat 2 models.

IR Setup


To take the thousands of pictures of my cats using the litter box, I set up a script that triggered the camera with a motion sensor. But the litter box area was very dark, and I didn’t want to use a flash for the pictures because I didn’t want to scare my cats.

So I decided to use IR LEDs, which are invisible to the eye. I paired them with a NoIR camera (a camera without an IR filter) so I could see the IR light on the cats and take detailed pictures of them in the dark.
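The capture loop itself can be tiny. Here is a minimal sketch of the kind of script I mean, using gpiozero and picamera (the GPIO pin, resolution, and file naming are my assumptions; switching the IR panels on and off is shown in the next step):

from datetime import datetime
from gpiozero import MotionSensor
from picamera import PiCamera

pir = MotionSensor(4)                       # PIR motion sensor (pin is an assumption)
camera = PiCamera(resolution=(1280, 720))   # NoIR camera module (no IR filter)

while True:
    pir.wait_for_motion()                   # block until a cat steps into the litter box area
    filename = datetime.now().strftime("litterbox_%Y%m%d_%H%M%S.jpg")
    camera.capture(filename)                # the IR panels light the scene invisibly
    pir.wait_for_no_motion()                # wait until the box is vacant again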

To mount the LEDs I designed and 3D printed tiny stadium-light panels to hold the individual IR LEDs. I made five 4x4 panels, 80 IR LEDs in total. These panels rotate so I can adjust them as needed.

I turn the IR LEDs on and off with a transistor. I didn’t want to use a relay because of the clicking sound; I didn't want to train my cats to feel like pooping or peeing whenever they hear a relay click. So all the LEDs are switched through a 2N3906 transistor.
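Driving the panel from Python then looks something like this. It is just a sketch, not my exact wiring: the GPIO pin is an assumption, and active_high=False is there because a PNP transistor like the 2N3906 turns on when its base is pulled low.

from gpiozero import OutputDevice

# PNP transistor: pulling the base low turns the LEDs on,
# so active_high=False maps .on() to a low output. GPIO17 is an assumed pin.
ir_panel = OutputDevice(17, active_high=False)

ir_panel.on()    # IR panel on -- invisible to the cats, visible to the NoIR camera
ir_panel.off()   # IR panel off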

The Thermal View


I realized I still wanted a live feed of the cats, so I got a thermal camera to watch them go to the bathroom while still giving them their privacy. After all, I was just looking for temperature and health stats.

I got Pimoroni’s thermal camera, the MLX90640. Pimoroni also provides an experimental library for this camera that works with the Raspberry Pi. To visualize it on the website, I took the test.cpp example code from Pimoroni’s mlx90640 library and recreated it in JavaScript.

Here is what I did:

  • I modified test.cpp, which reads from the thermal camera, to build a 2D array of temperature values and output a string with the data in JSON format (a Python alternative for this step is sketched after this list)
  • The Node.js back-end reads the string and converts it into an object
  • This object is then used by the front-end (written in JavaScript) to draw the thermal view on the HTML page
  • I used WebSockets to continuously push the updated object from the back-end to the front-end and refresh the thermal view on the website, making it look like a live view from the camera
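If you would rather skip the C++ step entirely, Adafruit publishes a Python driver (adafruit-circuitpython-mlx90640) for the same MLX90640 sensor. This is a rough alternative sketch, not the route I took above, that prints each 24x32 frame as one JSON line for the back-end to read from stdout:

import json
import time

import board
import busio
import adafruit_mlx90640

# The MLX90640 talks I2C; 800 kHz is the speed Adafruit's examples use
i2c = busio.I2C(board.SCL, board.SDA, frequency=800000)
mlx = adafruit_mlx90640.MLX90640(i2c)
mlx.refresh_rate = adafruit_mlx90640.RefreshRate.REFRESH_2_HZ

frame = [0.0] * 768  # 24 x 32 grid of temperatures in Celsius

while True:
    try:
        mlx.getFrame(frame)
    except ValueError:
        continue  # frames get dropped occasionally; just retry
    print(json.dumps(frame), flush=True)  # one JSON line per frame for the back-end
    time.sleep(0.5)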


The Poop Threshold

I grabbed all the timestamps from the photos I took to train the models and calculated how long it took them to go to the bathroom.

The time difference between pee and poo was significant, not only because it takes them longer to poo but also because they spend more time trying to bury the poo and cover up the scent.

Anyway, I made a time threshold and basically it is this:

if (seconds > poopThreshold)
    return "poop";
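With the photo timestamps, the whole classification boils down to a few lines. Here is a sketch in Python (the 90-second threshold is only a placeholder, not the number I actually landed on):

from datetime import datetime

POOP_THRESHOLD_SECONDS = 90  # placeholder -- tune this from your own timestamp data

def classify_visit(first_photo, last_photo):
    """Classify a litter box visit by how long the cat stayed in it."""
    seconds = (last_photo - first_photo).total_seconds()
    return "poop" if seconds > POOP_THRESHOLD_SECONDS else "pee"

# Example: a visit that lasted just over three minutes counts as a poop
start = datetime(2022, 8, 29, 18, 38, 10)
end = datetime(2022, 8, 29, 18, 41, 25)
print(classify_visit(start, end))  # -> poop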

UI / UX


I made a couple of GIFs of my cats and a vacant sign. The main page shows the picture it took, the name of the cat that is currently using the bathroom, their temperature, the live feed of the thermal camera, and a timer, and when they are done, either the poo or the pee drawing highlights to show which one it was.

There is also a tab with tables that show all the historical data, and I can filter it to just Teddy’s or Luna’s entries by tapping their faces. It is a very human UX ;] You can see what they did, the date, how long it took them, and the photo that was taken when they went.
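Every visit ends up as one record that the tables and filters read from. Here is a sketch of the kind of record I mean, with a JSON file standing in for whatever storage you prefer (the field names and file path are placeholders):

import json
from datetime import datetime

LOG_FILE = "visits.json"  # placeholder path

def log_visit(cat, event, seconds, photo_path, temperature_c):
    """Append one litter box visit to the history behind the tables."""
    record = {
        "cat": cat,                      # "Teddy" or "Luna"
        "event": event,                  # "poop" or "pee"
        "seconds": seconds,              # how long the visit took
        "photo": photo_path,             # the picture taken during the visit
        "temperature_c": temperature_c,  # reading from the thermal camera
        "date": datetime.now().isoformat(),
    }
    try:
        with open(LOG_FILE) as f:
            history = json.load(f)
    except FileNotFoundError:
        history = []
    history.append(record)
    with open(LOG_FILE, "w") as f:
        json.dump(history, f, indent=2)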