GestureChef: Smart Recipe Companion

by Mukesh_Sankhla in Circuits > Gadgets


GestureChef: Smart Recipe Companion

DSC00523.JPG
DSC00611.JPG
DSC00599.JPG
DSC00601.JPG
DSC00595.JPG
DSC00613.JPG

In today's fast-paced world, we all struggle with one daily question: "What should I cook today?" With endless options and limited time, choosing a meal can often be more frustrating than cooking itself! But what if your kitchen could suggest the perfect dish in real time, based on the time of day?

Introducing the Gesture-Controlled Kitchen Guide, an innovative touch-free cooking assistant designed to make meal selection effortless. Whether it's breakfast, brunch, lunch, snacks, dinner, or even one of your sneaky midnight cravings, this smart device instantly suggests delicious recipes suited to the moment, so you never have to second-guess your next meal again.

Built around a 4.7" E-Ink display, a gesture sensor, and a rechargeable 18650 battery, this sleek, modern device operates with simple hand gestures. No messy touchscreens, no cluttered recipe books, just wave right for a new meal idea, swipe up to reveal step-by-step instructions, and swipe down to return. Plus, with its custom 3D-printed case and magnetic mount, it blends seamlessly into any kitchen setup.

Cooking should be inspiring, not overwhelming. Whether you're a busy professional, a home chef, or someone just looking for meal inspiration, the Gesture-Controlled Kitchen Guide is your perfect kitchen companion. Let's dive in and revolutionize how we cook, one wave at a time!


Sponsored by NextPCB:

NextPCB has launched the ESP32-S3 Custom Design Accelerator, a program designed to support cutting-edge edge AI applications. Developers can receive two free PCBA prototypes (including PCB fabrication, components, assembly, and global shipping), valued at $500+.

The program focuses on smart home technology, environmental sensing, industrial predictive systems, and AI-driven fields like machine learning and computer vision. Only 20 original projects will be selected, open to both enterprises and individual creators.

Think your project has what it takes? Apply now: https://www.nextpcb.com/pcb-assembly-quote

1️⃣ Register your project with NextPCB.

2️⃣ Submit your PCBA inquiry through their quotation system and follow the instructions to claim your two free ESP32-S3-based PCBA units.

Supplies

DSC00384.JPG
DSC00395.JPG
DSC00396.JPG
DSC00389.JPG
DSC00392.JPG
DSC00394.JPG
DSC00390.JPG
DSC00391.JPG

1x 4.7" E-Ink Display

1x Gesture Sensor

1x 18650 Battery

6x 5mm Magnets

1x Screws Kit


Tools:

My 3D Printer

My Screwdriver Kit


About the LilyGO EPD 4.7" Display:

The LilyGO EPD 4.7 is a powerful and versatile E-Ink display module based on the ESP32-S3 microcontroller, making it an excellent choice for low-power applications like GestureChef.

Key Features:

  1. 4.7-inch E-Ink display with clear and crisp visibility even in bright light.
  2. ESP32-S3 processor, offering Wi-Fi and Bluetooth connectivity for IoT applications.
  3. Ample storage options:
    - 16MB Flash memory for firmware and applications.
    - 8MB PSRAM for handling complex tasks and graphics.
  4. Onboard BMS (Battery Management System) for safe and efficient 18650 lithium battery charging.
  5. USB Type-C interface for programming and power delivery.
  6. Low power consumption, perfect for always-on display applications.
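
To give a feel for how the panel is driven, here is a minimal "hello" sketch based on the LilyGo EPD47 library used later in this project. The function and font names follow the library's bundled examples and may differ slightly between library versions; treat it as a sketch, not the project's final firmware.

#include <Arduino.h>
#include "epd_driver.h"   // LilyGo EPD47 low-level driver
#include "firasans.h"     // Font header bundled with the library examples

void setup() {
  epd_init();        // Initialize the e-paper driver
  epd_poweron();     // Power up the panel
  epd_clear();       // Wipe whatever image the e-ink was still holding

  int cursor_x = 40;
  int cursor_y = 80;
  char msg[] = "GestureChef ready!";
  writeln((GFXfont *)&FiraSans, msg, &cursor_x, &cursor_y, NULL);

  epd_poweroff();    // Cut panel power; the image stays on screen
}

void loop() {
  // Nothing to do: e-ink keeps the last frame with zero power draw.
}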

About the GR10-30 Gesture Sensor:

The GR10-30 gesture sensor is an advanced non-contact sensing module capable of detecting 12 different hand gestures with high accuracy. This makes it perfect for applications like GestureChef, allowing intuitive control without physical touch.

Key Features:

  1. Recognizes 12 gestures, including:
    - Move up, down, left, right, forward, and backward
    - Rotate clockwise and counterclockwise (single and continuous)
    - Hover and wave for seamless interaction
  2. Customizable parameters, such as:
    - Gesture trigger distance
    - Hand rotation angle sensitivity
    - Hovering time recognition
    - Recognition window size for precise control
  3. Sensing range of up to 30 cm, providing stable and reliable performance.
  4. Dual interrupt pins to signal:
    - When a gesture is detected
    - When an object enters the recognition range
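
To show how gesture reads look in code, here is a rough polling sketch modelled on the DFRobot_GR10_30 library examples. The UART pins are placeholders, and the constructor arguments, gesture macros, and 9600 baud rate are assumptions taken from those examples, so adjust them to match your wiring and library version.

#include <Arduino.h>
#include "DFRobot_GR10_30.h"

// Placeholder ESP32 pins for the sensor's UART lines
#define SENSOR_RX_PIN 16   // ESP32 pin wired to the sensor's TX
#define SENSOR_TX_PIN 17   // ESP32 pin wired to the sensor's RX

// UART constructor as used in the library's examples (assumed)
DFRobot_GR10_30 gr10_30(/*addr=*/GR10_30_DEVICE_ADDR, /*s=*/&Serial1);

void setup() {
  Serial.begin(115200);
  Serial1.begin(9600, SERIAL_8N1, SENSOR_RX_PIN, SENSOR_TX_PIN);

  while (gr10_30.begin() != 0) {   // Retry until the sensor answers
    Serial.println("GR10-30 not found, retrying...");
    delay(1000);
  }
  // Enable only the gestures this project needs
  gr10_30.enGestures(GESTURE_UP | GESTURE_DOWN | GESTURE_LEFT | GESTURE_RIGHT);
}

void loop() {
  if (gr10_30.getDataReady()) {
    uint16_t g = gr10_30.getGesturesState();
    if (g & GESTURE_LEFT)  Serial.println("Wave left");
    if (g & GESTURE_RIGHT) Serial.println("Wave right");
    if (g & GESTURE_UP)    Serial.println("Wave up");
    if (g & GESTURE_DOWN)  Serial.println("Wave down");
  }
  delay(50);
}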

Applications:

The GR10-30 is ideal for hands-free control in various applications, including:

  1. Smart home devices (lighting, automation)
  2. Robot interaction and gesture-based HMI (Human-Machine Interface)
  3. Touchless gaming and interactive displays
  4. Gesture-based remote controllers

With E-Ink clarity, ESP32 processing power, gesture control, and onboard battery management working together, this hardware combination is a fantastic choice for smart home projects, dashboards, and hands-free interactive displays.

CAD & 3D Printing

DSC00402.JPG
DSC00406.JPG
DSC00408.JPG
DSC00404.JPG
DSC00410.JPG
DSC00411.JPG

We started by designing the GestureChef case in Fusion 360, ensuring precise dimensions for the 4.7" E-Ink display and gesture sensor. The case consists of two main parts: the housing and the cover.

To enable magnetic mounting, the cover design incorporates six 5mm magnets, allowing easy attachment and removal. Additionally, we designed five small button extensions to make the onboard display buttons more accessible.

Once the design was complete, we 3D-printed the parts. For the dual-color effect, we used the filament change technique at specific layers, giving it a sleek and visually appealing finish.

Download and 3D print:

1x Housing.stl

1x Cover.stl

5x Button.stl

1x Clip.stl

Circuit Connections

DSC00428.JPG
Add a little bit of body text.png
DSC00413.JPG
DSC00414.JPG
DSC00415.JPG
DSC00430.JPG
DSC00429.JPG

Follow the circuit diagram above to connect the gesture sensor to the display module.

I soldered the necessary wires directly to the ESP32 on the display module. To connect the gesture sensor, simply use the pre-attached connector that comes with it, no extra wiring hassle!

Here’s the pin mapping:

  1. Sensor VCC → Display VCC
  2. Sensor GND → Display GND
  3. Sensor RX → Display TX
  4. Sensor TX → Display RX
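
Because the data lines are crossed (the sensor's RX listens to the display's TX and vice versa), only the board-side pins show up in code. Here is a minimal sketch of how those pins can be handed to one of the ESP32's hardware UARTs, with placeholder GPIO numbers and an assumed 9600 baud rate; the driver library then talks over this port.

#include <Arduino.h>

#define SENSOR_RX_PIN 16   // ESP32 pin soldered to the sensor's TX wire
#define SENSOR_TX_PIN 17   // ESP32 pin soldered to the sensor's RX wire

void setup() {
  Serial.begin(115200);   // USB serial for debug messages
  // Open hardware UART 1 on the chosen pins for the gesture sensor;
  // the DFRobot_GR10_30 library is then given this port to talk over.
  Serial1.begin(9600, SERIAL_8N1, SENSOR_RX_PIN, SENSOR_TX_PIN);
  Serial.println("Sensor UART ready");
}

void loop() {
}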

Display Assembly

DSC00432as.JPG
DSC00434.JPG
DSC00436.JPG
DSC00437.JPG
DSC00438.JPG
DSC00439.JPG
DSC00443.JPG
DSC00440.JPG

Start by gathering the 3D-printed case, 5 buttons, and the display.

Insert the buttons into their respective slots in the housing.

Carefully place the display inside the housing, ensuring that the buttons align properly and the USB Type-C port is accessible.

Handle the display gently to avoid any damage.

With the display securely in place, the assembly is ready for the next step.

Sensor Assembly

DSC00447.JPG
DSC00449.JPG
DSC00454.JPG
DSC00455.JPG
DSC00457.JPG

Take the housing assembly and the gesture sensor.

Connect the sensor using the pre-soldered connector on the display.

Position the sensor in its slot, making sure the orientation is correct.

Use the 3D printed clip and two 2mm screws to secure the sensor in place. The clip should lock both the display and the sensor for a firm hold.

Battery Connection

DSC00461.JPG
DSC00464.JPG

Simply connect the 18650 battery to the display.

For power management, I am relying on the deep sleep mode of the ESP32, so there is no physical switch. However, if you prefer a manual control option, you can add a power switch to turn the device ON and OFF as needed.
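
For reference, that timeout behaviour uses the ESP32's standard deep-sleep API. A minimal sketch, assuming the wake button sits on GPIO 39 (the pin used later in this project) and pulls the pin low when pressed:

#include <Arduino.h>
#include "esp_sleep.h"

#define WAKE_BUTTON_PIN GPIO_NUM_39   // Wake button (assumed active-low)

void goToSleep() {
  // Wake the ESP32 when GPIO 39 is pulled low by a button press
  esp_sleep_enable_ext0_wakeup(WAKE_BUTTON_PIN, 0);
  esp_deep_sleep_start();             // Execution stops here until wake-up
}

void setup() {
  Serial.begin(115200);
  Serial.println("Awake! Sleeping again in 10 seconds...");
  delay(10000);                       // Stand-in for the real inactivity timeout
  goToSleep();
}

void loop() {
  // Never reached: after waking, the ESP32 restarts from setup().
}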

Magnetic Cover

DSC00419.JPG
DSC00417.JPG
DSC00420.JPG

Take the 3D-printed cover and six 5mm magnets, then press the magnets into the designated holes.

If the magnets feel loose, use a small amount of quick-drying glue to secure them in place.

Final Assembly

DSC00467.JPG
DSC00468.JPG
DSC00471.JPG
DSC00473.JPG
DSC00475.JPG

Take the housing assembly, cover, and four 2mm screws.

Align the cover properly on the housing and secure it in place using the screws.

Programming

DSC00480.JPG
Screenshot 2025-03-27 220250.png
Screenshot 2025-03-29 171517.png
Screenshot 2025-03-27 220305.png

Download and install the required libraries:

  1. LilyGo EPD47 Library
  2. DFRobot RTU Library
  3. DFRobot GR10_30 Library

Download the GestureChef repository, extract the files, and open GestureChef.ino in Arduino IDE.

If you haven’t already configured the ESP32 environment in Arduino IDE, follow this guide:

  1. Visit: Installing ESP32 Board in Arduino IDE.
  2. Add the ESP32 board URL in the Preferences window:
https://raw.githubusercontent.com/espressif/arduino-esp32/gh-pages/package_esp32_index.json
  3. Install the ESP32 board package via Tools > Board > Boards Manager.
  4. Install Arduino-ESP32 core version 2.0.5 or later, but below 3.0.


Enter your WiFi credentials:

const char* ssid = "*********";
const char* password = "********";

Change the time zone. I am in India (IST, UTC+5:30), so the offset is 5.5 x 60 x 60 = 19800 seconds:

const long gmtOffset_sec = 19800; // IST (UTC +5:30)
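
Those two values feed straight into the ESP32 core's standard SNTP helper. Here is a minimal sketch of how the time sync typically looks, assuming pool.ntp.org as the server and no daylight-saving offset (the actual firmware may name things differently):

#include <WiFi.h>
#include "time.h"

const char* ssid     = "*********";
const char* password = "********";

const long gmtOffset_sec      = 19800;   // IST (UTC +5:30) = 5.5 x 60 x 60
const int  daylightOffset_sec = 0;       // India does not observe DST

void setup() {
  Serial.begin(115200);
  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) delay(500);   // Wait for WiFi

  // Start SNTP with the configured offsets and fetch the current time
  configTime(gmtOffset_sec, daylightOffset_sec, "pool.ntp.org");

  struct tm timeinfo;
  if (getLocalTime(&timeinfo)) {
    Serial.println(&timeinfo, "%A, %B %d %Y %H:%M:%S");
  }
}

void loop() {}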

Select the settings in the Arduino IDE as shown in the image above.

Connect the display to your PC using a USB Type-C cable.

Click on "Upload" and wait for the process to complete.

Once uploaded, your GestureChef is ready to serve you meal ideas with just a swipe!

Personalization

Screenshot 2025-03-27 220323.png

How to update the recipe book:

Originally, I wanted to fetch recipes dynamically using AI APIs like ChatGPT or DeepSeek to provide a wider variety of meal suggestions. However, since most AI APIs are not free or have usage limits, I explored free recipe APIs instead. Unfortunately, these APIs had limited food options and didn’t cover all the meals I wanted to include. As a result, I decided to create a local database in recipe.h, which stores predefined meal names, ingredient lists, and step-by-step cooking instructions. This ensures that the recipe viewer works offline, without relying on external APIs, while still offering a diverse selection of meals for different times of the day.

To modify the recipe book, open the recipe.h file. Here, you can update:

  1. Meal Headings: Modify the names of meals.
  2. Ingredients List: Add or edit the list of ingredients for each meal.
  3. Recipe Steps: Update the step-by-step instructions for preparing each meal.
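
The exact layout of recipe.h comes from the repository, but a local database like this is usually just an array of structs. The sketch below is purely illustrative (the struct name, fields, and sample meal are mine, not the project's actual definitions) and shows the kind of entry you would be editing:

// Illustrative only: not the project's actual recipe.h definitions.
struct Recipe {
  const char* name;          // Meal heading shown on the display
  const char* ingredients;   // Ingredient list, items separated by newlines
  const char* steps;         // Step-by-step instructions, separated by newlines
};

const Recipe breakfastRecipes[] = {
  {
    "Masala Omelette",
    "2 eggs\n1 small onion, chopped\n1 green chilli\nSalt and pepper",
    "1. Whisk the eggs with the onion and chilli.\n"
    "2. Season, pour into a hot pan, and cook both sides."
  },
  // ...more breakfast entries, with similar arrays for the other meal types
};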


How to change the meal timings:

The function suggestMeal() determines meal suggestions based on the current time. It assigns the meal type as follows:

void suggestMeal() {
  int hour = getCurrentHour();

  if (hour >= 6 && hour < 10) currentMealType = 0;        // Breakfast
  else if (hour >= 10 && hour < 12) currentMealType = 1;  // Brunch
  else if (hour >= 12 && hour < 16) currentMealType = 2;  // Lunch
  else if (hour >= 16 && hour < 18) currentMealType = 3;  // Snacks
  else if (hour >= 18 && hour < 22) currentMealType = 4;  // Dinner
  else currentMealType = 5;                               // Midnight Meal

  // Randomly selects a meal from the available options
  currentMealIndex = random(0, 10);
}

To adjust the meal timings, modify the time ranges in this function. For example, changing the dinner condition to hour >= 18 && hour < 23 extends dinner to 11 PM and starts the midnight slot afterwards.

Code Explanation

DSC00568.JPG

1. Setup

  1. Initializes serial communication
  2. Configures wake-up button
  3. Sets up e-paper display
  4. Connects to WiFi
  5. Configures NTP for accurate time
  6. Initializes gesture sensor

2. Display Functions

  1. displayMainPage(): Shows current time, date, and instructions
  2. displayMealPage(): Displays ingredients for selected meal
  3. displayRecipePage(): Shows step-by-step cooking instructions

3. Core Logic

  1. handleGesture(): Processes swipe gestures to navigate between pages
  2. suggestMeal(): Recommends meals based on current time
  3. goToSleep(): Puts device into deep sleep after timeout
  4. Time/date functions: Get formatted time strings from NTP

How It Works

  1. On startup, the device shows the main page with current time/date
  2. A right swipe shows suggested meals based on time of day
  3. Users can swipe left/right to browse different meal options
  4. Swiping up on a meal shows its recipe steps
  5. Swiping down returns to ingredients view
  6. After 5 minutes of inactivity, the device sleeps until the button (Pin 39) is pressed.

Gesture Controls:

  1. Wave Right: Next meal and ingredient list.
  2. Wave Left: Previous meal and ingredient list.
  3. Wave Up: Enter recipe steps.
  4. Wave Down: Exit recipe view and return to meal selection.
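
To make that page flow concrete, here is a simplified sketch of how those four gestures can drive the three pages. The enums are illustrative; the real handleGesture() maps the DFRobot library's gesture codes onto the same kind of state machine.

// Simplified page navigation driven by gestures (illustrative only).
enum Page { MAIN_PAGE, MEAL_PAGE, RECIPE_PAGE };
enum Gesture { WAVE_LEFT, WAVE_RIGHT, WAVE_UP, WAVE_DOWN };

Page currentPage      = MAIN_PAGE;
int  currentMealIndex = 0;

void handleGesture(Gesture g) {
  switch (g) {
    case WAVE_RIGHT:                     // Next meal and ingredient list
      currentMealIndex = (currentMealIndex + 1) % 10;
      currentPage = MEAL_PAGE;
      break;
    case WAVE_LEFT:                      // Previous meal and ingredient list
      currentMealIndex = (currentMealIndex + 9) % 10;
      currentPage = MEAL_PAGE;
      break;
    case WAVE_UP:                        // Enter the recipe steps
      if (currentPage == MEAL_PAGE) currentPage = RECIPE_PAGE;
      break;
    case WAVE_DOWN:                      // Back to the ingredients view
      if (currentPage == RECIPE_PAGE) currentPage = MEAL_PAGE;
      break;
  }
  // After updating the state, redraw the matching page on the e-ink panel.
}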

Additional Gesture Actions:

  1. Forward Gesture
  2. Backward Gesture
  3. Clockwise Rotation Gesture
  4. Anticlockwise Rotation Gesture

These extra gestures can be programmed for custom actions as per user preference!

Conclusion

DSC00580.JPG
DSC00583.JPG
DSC00582.JPG
DSC00586.JPG

With GestureChef, meal selection and recipe navigation become effortless, all thanks to intuitive gesture control and a power-efficient E-Ink display. No more struggling with what to cook, just wave your hand and let GestureChef do the thinking!

But this is just the beginning. GestureChef isn't limited to meal suggestions; it's a versatile platform powered by the ESP32, an E-Ink display, and a gesture sensor. With a few tweaks, it can be transformed into a weather station, a smart clock, a grocery list manager, a calendar for office meetings, or even a daily reminders dashboard. The possibilities are endless!

This project combines innovation, practicality, and seamless interaction, proving that technology can truly simplify everyday tasks. Whether in the kitchen or beyond, GestureChef is your hands-free companion for a smarter living experience!

If you enjoyed this project, don't forget to hit the like button and leave a comment below. And if you've created your own, be sure to share your experience in the "I Made It" section.

Thank you! See you next time ;)