Nutri-Manager: an Embedded System That Can Classify Different Foods to Map Them to Their Respective Nutritional Value

by shalok4813_be23 in Workshop > Science


This is part of an assignment submitted to Deakin University, School of IT, for the unit SIT210/730 - Embedded Systems Development.

As awareness of health and wellness grows, many individuals strive to track their daily nutritional intake to improve eating habits, address health concerns, or achieve fitness goals. Despite the plethora of programs and tools available, manually counting calories and nutrients for each meal may be laborious and time-consuming, leading to inconsistent data and incomplete records.

Despite the numerous advantages of monitoring meal consumption, doing so accurately can be challenging. Historically, individuals had to document their meals, estimate their food intake, and manually calculate caloric and nutritional content. Although some applications include databases and other digital tools that help with this process, it still takes considerable time and effort. Users frequently have to search through extensive lists of foods to find the right items, enter portion sizes accurately, and consistently record every meal in a uniform manner. This can result in missing records, which diminishes the reliability of the information these applications provide.

This unit has provided the opportunity to develop an embedded system against specific assessment criteria and fault testing, which helps ensure the resulting system is practical and viable to deploy. Fault tolerance is essential for keeping the project functional when individual components fail, improving the system's reliability, while the evaluation criteria provide a systematic examination of the system and its execution.

Why do we need this project?

Athletes and people with specific dietary objectives face challenges in precisely monitoring their nutritional intake in dining halls, buffets, and sporting facilities. Manually documenting calories and nutrients in high-traffic settings is labor-intensive, error-prone, and frequently inconsistent, resulting in imprecise data. This leads to inadequate dietary monitoring, which impedes health and performance. A dependable solution is needed to simplify nutrition tracking in these environments, delivering precise, real-time information with minimal effort.

Supplies

To accomplish the objective of this system, four fundamental capabilities are needed: machine learning, weight detection, communication, and data visualization. The following hardware satisfies these requirements:

  1. Machine Learning: Raspberry Pi 5, Raspberry Pi Camera Module 3
  2. Weight Detection: Full Bridge Load Cell, HX711 load cell amplifier module
  3. Visual Output: LCD Display
  4. Communication: Arduino Nano 33 IoT

Installing Dependencies

Before going any further, it is important to set up an environment in which our program can run properly. Many Raspberry Pi libraries are open-source projects that change frequently, which leads to compatibility issues that are easier to manage inside a virtual environment. So:

Raspberry Pi 5

Step 1.1. Create a virtual environment in the folder on the Raspberry Pi where you will be writing your code.
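For example, a minimal way to do this from a terminal (the environment name nutri-env is just a placeholder; use whatever name and location suit your setup):

python3 -m venv nutri-env
source nutri-env/bin/activate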

Step 1.2. Install the following libraries inside the activated environment:

pip install picamera2 numpy tensorflow opencv-python bluepy firebase-admin

Step 1.3. Make sure the virtual environment can also see system-wide packages: open the pyvenv.cfg file inside the environment folder and set include-system-site-packages = true (the camera stack, for example, is installed system-wide on Raspberry Pi OS).

Arduino Nano 33 IoT

Step 1.1: Open the Library Manager in the Arduino IDE.

Step 1.2: Install the following libraries in the IDE and include them in your sketch:

#include <ArduinoBLE.h>
#include "HX711.h"
#include "Firebase_Arduino_WiFiNINA.h"
#include <LiquidCrystal_I2C.h>

Step 1.3: Make sure you have selected the correct port for uploading the code and that the Arduino Nano is recognized on that port.

Connecting Hardware

(Image: hard.png, showing the hardware wiring layout)

Let's start with the Raspberry Pi 5.

Raspberry Pi 5:

Step 1: Gently lift the latch on the camera connector, insert the camera module's ribbon cable, and press the latch back down. Make sure the latch is fully closed; otherwise the Pi will not be able to read anything from the camera.
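To confirm the ribbon is seated correctly, you can run a short capture test from the virtual environment created earlier. This is just a quick sanity check using the picamera2 library installed in the previous step:

from picamera2 import Picamera2

# If the camera is detected, this prints a frame shape such as (480, 640, 4)
picam2 = Picamera2()
picam2.start()
frame = picam2.capture_array()
print("Captured frame with shape:", frame.shape)
picam2.stop()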

Load Cell and HX711 Module:

Step 1: Do the following wiring connections:

HX711 Module Pins

  1. VCC: Connect to Arduino 3.3V
  2. GND: Connect to Arduino GND
  3. DT (Data): Connect to Arduino D2
  4. SCK (Clock): Connect to Arduino D3

Step 2: Connect the load cell wires to the HX711 as follows:

  1. Red (E+): Connect to E+ on HX711
  2. Black (E-): Connect to E- on HX711
  3. White (S+): Connect to A+ on HX711
  4. Green (S-): Connect to A- on HX711

Arduino Nano 33 IoT:

Step 1: Connect the A4 and A5 pins of the Nano to the SDA and SCL pins of the LCD screen, respectively.

Step 2: Connect GND and VCC of the Nano to the GND and VCC pins of the LCD screen.

Note: If the content on the LCD screen is hard to read, try adjusting the potentiometer on the back of the LCD module to improve its contrast and brightness.


The image above gives a brief layout of what the hardware setup looks like.

Write Code for Raspberry Pi 5

Before starting on the code, first download the .tflite file of the model. This is a TensorFlow Lite version of the model, which is much faster and more resource-efficient. You can find the link to the model on GitHub here.

After downloading the model, you can move on to the next step. The original model it was converted from had nearly 94% accuracy, and some drop in accuracy is expected after the TensorFlow Lite conversion. The architecture used in this system is MobileNetV2, which is well suited to embedded systems and mobile applications.
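Before wiring the model into the main script, it can be worth checking that the downloaded file loads and has the input shape the preprocessing code below expects. A quick sketch, assuming the file is saved as class_model_optm.tflite next to your script:

import tensorflow as tf

# Load the TFLite model and print its input/output tensor shapes
interpreter = tf.lite.Interpreter(model_path="class_model_optm.tflite")
interpreter.allocate_tensors()
print("Input shape:", interpreter.get_input_details()[0]['shape'])    # expected: [1 224 224 3] for MobileNetV2
print("Output shape:", interpreter.get_output_details()[0]['shape'])  # one score per food class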

Step 1: Import all the required libraries (run this inside the previously created virtual environment).

from picamera2 import Picamera2
import numpy as np
import tensorflow as tf
import cv2
from bluepy import btle
import time
from collections import Counter
import firebase_admin
from firebase_admin import credentials
from firebase_admin import db
from datetime import datetime

Step 2: Write code to connect to the Arduino Nano using BLE

This function takes the MAC address of the device to connect to, along with the characteristic UUIDs (unique identifiers that distinguish the different BLE characteristics). It starts the machine learning model as soon as a weight is detected on the scale.

def connect_and_read_ble(device_mac, sensor_characteristic_uuid, class_characteristic_uuid):
    global sensor, mode, classified_food
    load_threshold = 1
    weight_detected = False
    device = None
    try:
        print(f"Connecting to {device_mac}...")
        device = btle.Peripheral(device_mac, btle.ADDR_TYPE_PUBLIC)
        print(f"Reading characteristics {sensor_characteristic_uuid}, {class_characteristic_uuid}")
        while True:
            # Read the current weight reported by the Arduino
            sensor_characteristic = device.getCharacteristics(uuid=sensor_characteristic_uuid)[0]
            val = sensor_characteristic.read()
            sensor = int.from_bytes(val, byteorder='big', signed=True)
            print(f"Sensor signal: {sensor}")

            class_characteristic = device.getCharacteristics(uuid=class_characteristic_uuid)[0]

            if sensor > load_threshold and not weight_detected:
                # A new item was placed on the tray: classify it and report the result
                weight_detected = True
                print(f"Weight Detected: {sensor}")
                classified_food = start_inference_loop(model_path="class_model_optm.tflite",
                                                       max_predictions=50, cam=picam2)
                integer_val = classified_food
                bytes_val = int(integer_val).to_bytes(1, 'big')
                print(f"Writing {integer_val} to the characteristic...")
                class_characteristic.write(bytes_val, withResponse=True)
                send_data(integer_val, sensor)
            elif sensor < load_threshold and weight_detected:
                # The item was removed from the tray
                weight_detected = False
                integer_val = classified_food
                bytes_val = int(integer_val).to_bytes(1, 'big')
                print(f"Writing {integer_val} to the characteristic...")
                class_characteristic.write(bytes_val, withResponse=True)

            time.sleep(0.5)
            read_value = class_characteristic.read()
            print(f"Identified class (read): {read_value}")
            num = int.from_bytes(read_value, byteorder='big')
            print(f"Identified class as integer: {num}")
    except btle.BTLEException as e:
        print(f"Failed to write or read characteristic: {e}")
    except KeyboardInterrupt:
        print("Disconnecting...")
    except Exception as e:
        print(f"Failed to connect or read from {device_mac}: {str(e)}")
    finally:
        if device is not None:
            device.disconnect()
            print("Disconnected")

Step 3: Set up inference from the ML model

This code runs the ML model whenever the Pi needs to classify the food present on the tray. It works by loading the model, capturing frames from the camera, preprocessing them, and running inference. The process is repeated 50 times inside start_inference_loop, and the most frequently predicted class is taken as the final result.

def load_model(model_path="class_model_optm.tflite"):
    # Load the TFLite model and allocate its tensors once, up front
    interpreter = tf.lite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()
    return interpreter, input_details, output_details

def preprocess_image(image):
    # Convert the frame to RGB, resize to the model's 224x224 input and normalise to [0, 1]
    image_rgb = cv2.cvtColor(image, cv2.COLOR_BGRA2RGB)
    image_resized = cv2.resize(image_rgb, (224, 224))
    image_array = np.array(image_resized, dtype=np.float32) / 255.0
    input_array = np.expand_dims(image_array, axis=0)
    return input_array

def start_camera():
    try:
        time.sleep(1)
        picam2 = Picamera2()
        picam2.start()
        print("Camera initialized successfully.")
        return picam2
    except Exception as e:
        print(f"Error initializing the camera: {e}")
        return None

def run_inference(interpreter, input_details, output_details, frame):
    input_array = preprocess_image(frame)
    interpreter.set_tensor(input_details[0]['index'], input_array)
    interpreter.invoke()
    output = interpreter.get_tensor(output_details[0]['index'])
    predicted_class = np.argmax(output)
    return predicted_class

def start_inference_loop(model_path="class_model_optm.tflite", max_predictions=50, cam=None):
    global classified_food
    interpreter, input_details, output_details = load_model(model_path)
    predictions = []

    while len(predictions) < max_predictions:
        frame = cam.capture_array()
        if frame is None:
            print("Failed to grab frame")
            break

        predicted_class = run_inference(interpreter, input_details, output_details, frame)
        predictions.append(predicted_class)
        frame_rgb = cv2.cvtColor(frame, cv2.COLOR_BGRA2RGB)
        cv2.imshow('Camera Feed', frame_rgb)

        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    # Take the class that was predicted most often across the loop
    most_common_class, count = Counter(predictions).most_common(1)[0]
    print(f"Most frequent predicted class after {max_predictions} predictions: {most_common_class} (Predicted {count} times)")

    classified_food = most_common_class

    return most_common_class

Step 4: Set up Firebase

This is another important part of the system: it stores the data predicted by the Raspberry Pi once a weight is detected on the tray. The stored data can later be used to build a responsive website with dashboards.

def send_data(class_val, wt):
    # Get the current time and format it as a string
    current_time = datetime.now()
    formatted_time = current_time.strftime("%Y-%m-%d %H:%M:%S")
    print("Current time:", formatted_time)

    # Reference the node for this meal, keyed by timestamp
    ref = db.reference('/meals/' + formatted_time)

    # Data to be sent to Firebase
    data = {
        'food': food_array[class_val],  # food_array maps class indices to food names (defined in the complete script)
        'weight': wt,
    }

    # Sending data to Firebase
    ref.set(data)
    print('Data sent to Firebase successfully!')
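Since these records are meant to feed a dashboard later on, here is a minimal sketch of how they could be read back using the same firebase_admin setup. The read_meals helper is only an illustration, not part of the project code:

def read_meals():
    # Fetch everything stored under /meals, keyed by the timestamp strings written above
    meals = db.reference('/meals').get() or {}
    for timestamp, entry in meals.items():
        print(timestamp, entry['food'], entry['weight'])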

Step 5: Put together the complete Raspberry Pi script.

The main parts of the Raspberry Pi code are discussed above; the rest of the complete code can be found in this step. Make sure you keep the database structure the same and replace the credentials with your own Firebase credentials.

if __name__ == "__main__":
    device_mac_address = "MAC address of your Arduino Nano"
    sensor_characteristic_uuid = "2A58"
    class_characteristic_uuid = "2A59"

    picam2 = start_camera()

    # Path to your Firebase credentials JSON file
    cred = credentials.Certificate('Enter your JSON credential certificate')

    # Initialize the app with a service account, granting admin privileges
    firebase_admin.initialize_app(cred, {
        'databaseURL': 'your database url'
    })

    while True:
        try:
            connect_and_read_ble(device_mac_address, sensor_characteristic_uuid, class_characteristic_uuid)
            break
        except KeyboardInterrupt:
            print("Exiting loop...")
            picam2.stop()
            cv2.destroyAllWindows()
            break


Write Code for Arduino IDE

For Arduino Nano 33 IoT:

Step 1: Let's start by creating the BLE setup.

This setup creates a BLE service and attaches two characteristics to it; these characteristics are used to exchange data between the two devices. The commService, sensorCharacteristic and classsCharacteristic objects are the service and characteristic definitions declared globally in the complete sketch.

void BLE_setup() {
  if (!BLE.begin()) {
    Serial.println("Bluetooth® Low Energy failed to start!");
  }
  BLE.setLocalName("Nano 33 IoT");
  BLE.setAdvertisedService(commService);
  commService.addCharacteristic(sensorCharacteristic);
  commService.addCharacteristic(classsCharacteristic);
  BLE.addService(commService);
  BLE.advertise();
  Serial.println("BLE Peripheral setup complete.");
}

Step 2: Measure the weight reliably and at the right time.

This code measures the weight and updates its value whenever a load is detected on the tray. (The scale, calibration_factor, container_wt, DOUT, CLK, weight and weightDetected variables are globals declared in the complete sketch.)


void load_setup() {
  scale.begin(DOUT, CLK);
  Serial.println("Taring the scale...");
  scale.tare();
  Serial.print("Calibration factor: ");
  Serial.println(calibration_factor);
}

void calibrate_weight() {
  delay(500);
  long rawData = scale.get_units(10);
  weight = rawData / calibration_factor - container_wt;
  Serial.print("Weight: ");
  Serial.print(weight);
  Serial.println(" g");
}

void get_weight() {
  calibrate_weight();
  if (weight > LOAD_THRESHOLD && !weightDetected) {
    weightDetected = true;
    Serial.print("Weight Detected: ");
    Serial.println(weight);
  }
  if (weight < LOAD_THRESHOLD && weightDetected) {
    weightDetected = false;
  }
}

Step 3: Get total calories from the weight and the detected food class.

Create a Food struct that stores calories-per-gram information, then write a get_total_calories function that looks up the food by name and multiplies its calories per gram by the measured weight.

struct Food {
  String name;
  float calories_per_gram;
};

Food foods[] = {
  {"Burger", 2.74},        // 2.74 kcal/g
  {"Butter_Naan", 3.10},   // 3.10 kcal/g
  {"Tea", 0.20},           // 0.20 kcal/g
  {"Chapati", 1.68},       // 1.68 kcal/g
  {"Chole_Bhature", 3.48}, // 3.48 kcal/g
  {"Dal_Makhani", 1.60},   // 1.60 kcal/g
  {"Dhokla", 1.25},        // 1.25 kcal/g
  {"Fried_Rice", 1.60},    // 1.60 kcal/g
  {"Idli", 1.04},          // 1.04 kcal/g
  {"Jalebi", 3.40},        // 3.40 kcal/g
  {"Kaathi_Rolls", 2.74},  // 2.74 kcal/g
  {"Kadai_Paneer", 2.22},  // 2.22 kcal/g
  {"Kulfi", 1.90},         // 1.90 kcal/g
  {"Masala_Dosa", 1.68},   // 1.68 kcal/g
  {"Momos", 1.75},         // 1.75 kcal/g
  {"Paani_Puri", 3.60},    // 3.60 kcal/g
  {"Pakode", 4.20},        // 4.20 kcal/g
  {"Pav_Bhaji", 2.85},     // 2.85 kcal/g
  {"Pizza", 2.95},         // 2.95 kcal/g
  {"Samosa", 3.00}         // 3.00 kcal/g
};

float get_total_calories(String name, int length, float wt) {
  for (int i = 0; i < length; i++) {
    if (foods[i].name == name) {
      return foods[i].calories_per_gram * wt;
    }
  }
  return 0;  // unknown food class
}

Step 4: Maintain the Bluetooth connection in loop()

Create a loop to maintain the Bluetooth connection with the Raspberry Pi. This code controls when and how the characteristic values are updated in the BLE service, and it uses MAC filtering to ensure that only the authorized Raspberry Pi can connect to the Arduino Nano 33 IoT. Note that food_array is a plain array of the twenty class names (declared in the complete sketch), while the foods array above holds the calorie data.

const String allowedMACs[] = {"Your raspberry pi MAC address"};

void loop() {
  BLEDevice central = BLE.central();
  if (central) {
    String centralMAC = central.address();
    Serial.print("Trying to connect central: ");
    Serial.println(centralMAC);
    bool isAllowed = false;
    for (int i = 0; i < sizeof(allowedMACs) / sizeof(allowedMACs[0]); i++) {
      if (centralMAC.equalsIgnoreCase(allowedMACs[i])) {
        isAllowed = true;
        break;
      }
    }

    if (isAllowed) {
      Serial.println("Allowed central is detected.");
      while (central.connected()) {
        get_weight();
        sensorCharacteristic.writeValue(weight);
        classsCharacteristic.readValue(&class_val, 2);
        Serial.print("Class = ");
        Serial.println((int)class_val);
        // Display the result on the LCD
        if (weightDetected && class_val < 20) {
          lcd.clear();
          lcd.setCursor(0, 0);
          lcd.print(food_array[class_val]);
          lcd.print(": ");
          lcd.print(weight, 1);
          lcd.print(" g");
          lcd.setCursor(0, 1);
          lcd.print("Calories: ");
          float total_calories = get_total_calories(food_array[class_val], 20, weight);
          lcd.print(total_calories);
          Serial.println(total_calories);
          delay(2000);
          lcd.clear();
        }
      }
      Serial.print(F("Disconnected from central: "));
      Serial.println(central.address());
      lcd.clear();
    } else {
      Serial.println("Connection rejected. Central not allowed.");
      BLE.disconnect();
    }
  }
}

Step 5: Save and verify the sketch. The complete code can be downloaded from NutriManager_prot.ino.

Save and Run It

Step 1: Save the code on both devices.

Step 2: First upload and run the sketch from the Arduino IDE, then run the Python script inside the virtual environment.

Step 3: Watch the Serial Monitor (and the Python console) to monitor the output and state of both devices.

Here are some things to consider while building this project:

  1. Should the LCD be attached to the Arduino Nano itself, or mounted on the tray where the calories will be displayed?
  2. An object detection model could identify multiple food items on a tray, whereas the simple classification model used in this project detects only one food item at a time (scope for modification).
  3. There is scope for better design ideas to make a real-world deployment of this project more feasible.


The Raspberry Pi 5, rather than the Arduino Nano, is used to send data to Firebase because on the Nano 33 IoT the same radio hardware handles both the WiFi connection and the Bluetooth connection. Using the Nano to push data to Firebase over WiFi would therefore interfere heavily with the BLE communication between the two devices.


Well, that was it! Hope it helps.

Thank you!