BBQ Pi (With Data Visualization!)
by IsraelI8 in Circuits > Raspberry Pi
Introduction
Barbecuing most typically refers to the slow process of using indirect heat to cook your favorite meats. Though this method of cooking is hugely popular -- especially in the US -- it does have what some may consider a rather serious weakness: it requires hours of semi-lucid attention to be spent monitoring the temperature of your pit and food. Enter: Raspberry Pi.
The Original Project
The original source for this project can be found here: https://old.reddit.com/r/raspberry_pi/comments/a0... The gist of it is that Reddit user Produkt was able to relay food and pit temperature data from relatively cheap, commercially available wireless thermometers to a Raspberry Pi (which had a small RF module attached to its GPIO pins). In the original project (linked above), Produkt stored the data in a SQLite database and displayed it on a locally hosted Apache2 PHP website.
This solution already solves the original problem touched upon in the introduction of this blog: you can now monitor your food & pit temperature remotely with a web browser. But what if we wanted to expand on this? Enter: GridDB.
Supplies
Raspberry Pi 4
GridDB Web API & FluentD
Upon seeing this project, my first thought -- after the initial wave of excitement -- was of ways I could extend its functionality. Using GridDB and its Grafana plugin, I set out to visualize my food & pit data. Beyond that, I wanted to use Grafana annotations to flag any anomalous data points -- can't have any charred meat!
To get started, I needed to use the C code from the original project to read the data coming in from the wireless thermometer and post it to my GridDB server. To get this up and running, I spun up a GridDB server on Azure using a CentOS virtual machine. The easiest way to share data from our edge machine (the Raspberry Pi) to our cloud server was via the GridDB Web API. So, on that VM, I set up GridDB's Web API along with Fluentd and the accompanying GridDB connector.
Before actually sending data up to the cloud, I needed to create the basic schema for my BBQ Pi container. The dataset coming in is extremely simple: we have two temperature sensors, one cook id, and of course, the timestamp. So our schema looks like this:
timeseries = gridstore.put_container(
    "bbqpi",
    [
        ("time", griddb.GS_TYPE_TIMESTAMP),
        ("cookid", griddb.GS_TYPE_INT),
        ("probe1", griddb.GS_TYPE_INT),
        ("probe2", griddb.GS_TYPE_INT)
    ],
    griddb.GS_CONTAINER_TIME_SERIES)
To create this timeseries container, I simply used the WebAPI (port 8080):
curl -X POST --basic -u admin:admin \
  -H "Content-Type: application/json" \
  -d '{
    "container_name": "bbqpi",
    "container_type": "TIME_SERIES",
    "rowkey": true,
    "columns": [
      {"name": "time",   "type": "TIMESTAMP"},
      {"name": "cookid", "type": "INTEGER"},
      {"name": "probe1", "type": "INTEGER"},
      {"name": "probe2", "type": "INTEGER"}
    ]
  }' \
  http://localhost:8080/griddb/v2/defaultCluster/dbs/public/containers
With the container created, I needed to use Fluentd (port 8888) to post actual data into our container. Here's a curl command posting some dummy data:
curl -X POST -d 'json={"date":"2020-01-01T12:08:21.112Z","cookid":"1", "probe1":"150", "probe2":"140"}' http://localhost:8888/griddb
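The same payload can be built and posted from Python if you want to test the pipeline without curl. This is a minimal sketch assuming only the Fluentd HTTP endpoint shown above; the `make_payload` and `post_reading` helper names are my own:

```python
import json
from datetime import datetime, timezone
from urllib import request

def make_payload(cookid, probe1, probe2, when=None):
    """Format a reading the way the curl example does: json=<body>,
    with a millisecond-precision UTC timestamp."""
    when = when or datetime.now(timezone.utc)
    body = {
        "date": when.strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z",
        "cookid": str(cookid),
        "probe1": str(probe1),
        "probe2": str(probe2),
    }
    return "json=" + json.dumps(body)

def post_reading(cookid, probe1, probe2, url="http://localhost:8888/griddb"):
    """POST one reading to Fluentd's HTTP input."""
    data = make_payload(cookid, probe1, probe2).encode()
    req = request.Request(url, data=data, method="POST")
    with request.urlopen(req) as resp:
        return resp.status
```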
From there, I needed to extend the original code to send an HTTP POST request whenever our Pi read data from the pit (about once every 12 seconds).
As a side note: writing this code taught me to appreciate just how verbose the C language can be:
int postData(char time[], int cookid, int probe1, int probe2, char url[])
{
    CURL *curl;
    CURLcode res;
    char errbuf[CURL_ERROR_SIZE] = { 0, };
    char agent[1024] = { 0, };
    char json[1000];

    /* On Windows, this will init the winsock stuff */
    curl_global_init(CURL_GLOBAL_ALL);

    snprintf(json, sizeof json,
             "json={\"date\":\"%s.112Z\",\"cookid\":\"%d\", \"probe1\":\"%d\", \"probe2\":\"%d\"}",
             time, cookid, probe1, probe2);

    /* get a curl handle */
    curl = curl_easy_init();
    if(curl) {
        /* First set the URL that is about to receive our POST. This URL can
           just as well be an https:// URL if that is what should receive the data. */
        snprintf(agent, sizeof agent, "libcurl/%s",
                 curl_version_info(CURLVERSION_NOW)->version);
        agent[sizeof agent - 1] = 0;
        curl_easy_setopt(curl, CURLOPT_USERAGENT, agent);
        curl_easy_setopt(curl, CURLOPT_URL, url);
        curl_easy_setopt(curl, CURLOPT_USERNAME, "admin");
        curl_easy_setopt(curl, CURLOPT_PASSWORD, "admin");
        curl_easy_setopt(curl, CURLOPT_VERBOSE, 1L);
        curl_easy_setopt(curl, CURLOPT_ERRORBUFFER, errbuf);
        curl_easy_setopt(curl, CURLOPT_POSTFIELDS, json);

        /* Perform the request; res will get the return code */
        res = curl_easy_perform(curl);
        if(res != CURLE_OK) {
            size_t len = strlen(errbuf);
            fprintf(stderr, "\nlibcurl: (%d) ", res);
            if(len)
                fprintf(stderr, "%s%s", errbuf,
                        ((errbuf[len - 1] != '\n') ? "\n" : ""));
            fprintf(stderr, "%s\n\n", curl_easy_strerror(res));
        }
        curl_easy_cleanup(curl);
    }
    curl_global_cleanup();
    return 0;
}
With this function written, I just needed to have it run at the same time that the sqlite data was being posted:
if (goodData==1) {
    if (last_db_write==0 || (secs-last_db_write>=10)) {
        snprintf(sql, 100,
                 "INSERT INTO readings (cookid,time,probe1,probe2) VALUES (%d,'%s',%d,%d);",
                 cookID, buff, probe1, probe2);
        printf("%s\n", sql);
        rc = sqlite3_exec(db, sql, callback, 0, &zErrMsg);
        if (rc != SQLITE_OK) {
            printf("SQL error: %s\n", zErrMsg);
        } else {
            last_db_write = secs;
        }
        char url[] = "http://xx.xx.xx.xx:8888/griddb";
        postData(buff, cookID, probe1, probe2, url);
    }
}
To make sure your data is actually being inserted into your server, you can run the following command to query your database and view the results:
curl -X POST --basic -u admin:admin -H "Content-type:application/json" -d '{"limit":1000}' http://localhost:8080/griddb/v2/defaultCluster/dbs/public/containers/bbqpi/rows
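If you would rather pull rows programmatically than with curl, the same Web API call can be made from Python. A minimal sketch, assuming the endpoint and credentials from the curl command above; `rows_request` and `fetch_rows` are hypothetical helper names:

```python
import base64
import json
from urllib import request

def rows_request(host="localhost", container="bbqpi", limit=1000,
                 user="admin", password="admin"):
    """Build the same POST request the curl example issues against the Web API."""
    url = (f"http://{host}:8080/griddb/v2/defaultCluster"
           f"/dbs/public/containers/{container}/rows")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return request.Request(
        url,
        data=json.dumps({"limit": limit}).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": "Basic " + token},
        method="POST")

def fetch_rows(**kw):
    """Execute the query and decode the JSON response."""
    with request.urlopen(rows_request(**kw)) as resp:
        return json.load(resp)
```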
Grafana
With the code in place, now when we use the original web portal to start a “cook”, we will be simultaneously storing our temperature data into our GridDB server.
The next step was to visualize our data using Grafana. To do so, we followed the steps from this blog: here. The nice thing about this implementation is that it's extremely easy to get our data charted into a nice graph. It also supports annotations.
The annotations discussed in the blog make it extremely easy for us to monitor when something goes wrong with either our food or the pit itself. In my case, I was cooking beef short ribs. With those, I did not want the temperature in the pit to grow beyond 275 degrees Fahrenheit. If I saw the temperature go beyond that, I could turn off a burner and allow the heat to dip again:
I had a similar rule for the sensor keeping tabs on the food itself: once the food reached an internal temperature of 203 degrees Fahrenheit, the ribs were ready. You can see the lone annotation at the end of the cook here:
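These two threshold rules (pit above 275 °F, food done at 203 °F) can be sketched as a simple scan over the queried rows. The response shape (`{"columns": [...], "rows": [...]}`) reflects what the Web API returned in my setup, and which probe watches the pit versus the food is an assumption here, so treat both as such:

```python
def check_thresholds(result, pit_max=275, food_done=203):
    """Return alert strings for pit overheats and food doneness.

    `result` is assumed to look like a Web API rows response:
    {"columns": [{"name": ...}, ...], "rows": [[...], ...]}.
    probe1 is assumed to be the pit sensor and probe2 the food sensor.
    """
    cols = [c["name"] for c in result["columns"]]
    t, pit, food = cols.index("time"), cols.index("probe1"), cols.index("probe2")
    alerts = []
    for row in result["rows"]:
        if row[pit] > pit_max:
            alerts.append(f"{row[t]}: pit at {row[pit]}F, turn a burner off!")
        if row[food] >= food_done:
            alerts.append(f"{row[t]}: food at {row[food]}F, ribs are ready!")
    return alerts
```

Grafana's alerting can do the same job on the server side; this is just the logic spelled out.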
All in all, the cook only took about 4 hours, but this sort of setup would truly excel if I were cooking something that required even more time on the grill (think a low-and-slow smoke lasting ~12 hours). Despite that, I believe the value of this tool is easily apparent: being able to log your food's results and compare them to previous cooks means your BBQing will slowly get better over time, as you can use data to see what works and what doesn't.
The Food
This was the first time I had ever made beef short ribs; for seasoning, I simply used salt, black pepper, and garlic powder. Despite some issues with the burner running a bit too hot early on, the ribs came out fantastic. Please take a look:
Conclusion
In the end, the food came out terrific, the sensors, GridDB, and Grafana all worked in concert beautifully, and we got some valuable data on how to cook these things again for the next time we want to impress some friends.