DecayDock – AI Smart Fridge Companion
by ptallthings93 in Circuits > Electronics
Introduction
I created DecayDock, an AI-powered smart fridge companion built using an ESP32-CAM and TFT display that helps people reduce household food waste through food recognition and freshness tracking.
The idea came from a very common real-life problem I noticed at home. Vegetables and leftovers were often forgotten behind other containers inside the refrigerator until they spoiled. Even though the food was bought with good intentions, busy schedules and lack of visibility caused unnecessary waste. I realized this is something many families, students, and working professionals experience daily.
While researching, I found that food waste is not only a household problem but also a global environmental issue. According to the UNEP Food Waste Index Report 2024, the world wasted around 1.05 billion tonnes of food in 2022, and households were responsible for nearly 60% of that waste. At the same time, around 783 million people globally faced hunger.
UNEP Food Waste Index Report 2024
Food waste also contributes heavily to climate change. Research from the Food and Agriculture Organization (FAO) states that food loss and waste generate approximately 8–10% of global greenhouse gas emissions, mainly because decomposing food in landfills releases methane gas.
FAO Food Waste and Climate Impact
This inspired me to build a simple and affordable system that could help people manage food more intelligently in everyday life. DecayDock uses Edge AI to recognize food items like vegetables, fruits, milk, and leftovers, then visually tracks freshness using color-based progress bars on a TFT display.
The goal of the project is not just technical innovation, but helping people build better habits, reduce waste, save money, and become more aware of food consumption in a practical and user-friendly way.
What This Project Does
DecayDock is an AI-powered smart fridge companion that helps users reduce household food waste by recognizing food items and tracking their freshness in real time.
Using an ESP32-CAM module and Edge AI, the system can identify common food items such as vegetables, fruits, milk, and leftovers when they are shown in front of the camera. After detection, the food item is automatically added to a simple inventory system displayed on the TFT screen.
The project then estimates the freshness of each item based on:
- food category
- storage duration
- expected shelf life
Freshness is displayed visually using color-based progress bars:
- Green → Fresh
- Yellow → Consume Soon
- Red → Expiring
This makes it easy for users to quickly understand which food should be consumed first before it gets wasted.
The project was created to solve a common real-life problem where food is forgotten inside refrigerators because of busy lifestyles and poor visibility. By making food tracking simple and visual, DecayDock encourages better food management habits, reduces unnecessary grocery waste, and promotes more sustainable living using affordable Edge AI technology.
This project was heavily inspired by the CircuitDigest object recognition tutorial: https://circuitdigest.com/microcontroller-projects/object-recognition-using-esp32-cam-and-edge-impulse (credit: CircuitDigest).
Supplies
To keep the project compact, practical, and easy to install on a refrigerator, all the electronics were designed inside a custom magnetic enclosure. The enclosure contains the ESP32-CAM module, TFT display, power section, and supporting components in a clean and organized layout.
The magnetic mounting allows the device to attach directly to the fridge door without any permanent installation, making it feel more like a real consumer product instead of a temporary prototype.
Electronics & Components
| Component | Quantity | Purpose |
| --- | --- | --- |
| ESP32-CAM Module | 1 | Edge AI food recognition and processing |
| 2.4” TFT Display | 1 | Display inventory and freshness UI |
| FTDI Programmer | 1 | Upload code to the ESP32-CAM |
| Magnetic Enclosure | 1 | Compact fridge-mounted housing |
| Neodymium Magnets | 2–4 | Attach enclosure to the refrigerator |
| Breadboard / PCB | 1 | Circuit connections |
| Jumper Wires | Few | Wiring connections |
| 5V USB Power Module | 1 | Device power supply |
| Push Buttons (Optional) | 2 | Menu and reset controls |
| RGB LED (Optional) | 1 | Visual freshness indicator |
Software & Platforms
| Software | Purpose |
| --- | --- |
| Arduino IDE | Programming and firmware upload |
| Edge Impulse | TinyML food recognition model |
| TFT_eSPI Library | TFT graphics and UI |
| ESP32 Camera Library | Camera interface and image capture |
Tools Used
| Tool | Purpose |
| --- | --- |
| Soldering Iron | Permanent electrical connections |
| Hot Glue Gun | Component mounting |
| Screwdriver Set | Enclosure assembly |
| Wire Cutter & Stripper | Cable preparation |
| Laptop / PC | Programming and AI training |
| 3D Printer (Optional) | Custom enclosure design |
Details
The enclosure was designed to be minimal, lightweight, and refrigerator-friendly. The front section contains:
- ESP32-CAM lens opening
- TFT touchscreen display
- status indicator area
while the internal section houses:
- ESP32-CAM module
- wiring
- power connections
- optional LEDs and buttons
Magnets are mounted behind the enclosure so the device can easily stick to the refrigerator surface without damaging it.
This design keeps the project:
- compact
- portable
- clean-looking
- practical for everyday use
and gives the prototype a more polished product-like appearance.
Understanding the System
Before building the hardware, I first planned how the complete system would work in a simple and practical way.
The main idea behind DecayDock was to create a compact smart fridge companion that could recognize food items and help reduce household food waste using Edge AI.
The system is built around the ESP32-CAM module because it combines:
- a microcontroller
- camera
- WiFi capability
in one small and affordable board.
Instead of using expensive cloud-based AI systems, the project uses Edge Impulse TinyML to run the food recognition model directly on the ESP32-CAM. This makes the system faster, lightweight, and suitable for embedded applications.
The complete workflow of the system is simple:
Working Flow
- The user places a food item in front of the camera.
- The ESP32-CAM captures the image.
- The TinyML model identifies the food item.
- The detected item is displayed on the TFT screen.
- A freshness percentage and color status are generated.
- The system visually reminds users which food should be consumed first.
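The working flow above corresponds roughly to the following Arduino-style outline. This is only a sketch: `esp_camera_fb_get()`/`esp_camera_fb_return()` are real calls from the ESP32 camera driver and `run_classifier()` comes from the Edge Impulse Arduino SDK, but `DecayDock_inferencing.h` and `updateInventoryScreen()` are hypothetical names used here for illustration.

```cpp
#include <esp_camera.h>
// #include <DecayDock_inferencing.h>  // hypothetical name of the generated EI library

void loop() {
  // 1. Capture a frame from the ESP32-CAM
  camera_fb_t *fb = esp_camera_fb_get();
  if (!fb) return;

  // 2. Wrap the frame in a signal_t and run the TinyML model
  //    (run_classifier() is provided by the Edge Impulse SDK)
  // ei_impulse_result_t result;
  // run_classifier(&signal, &result, false);

  // 3. Pick the highest-confidence label and refresh the TFT UI
  //    with the item name and its freshness bar
  // updateInventoryScreen(result);     // hypothetical UI helper

  // 4. Release the frame buffer and wait before the next scan
  esp_camera_fb_return(fb);
  delay(500);
}
```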
The project was designed to solve a real-life problem where vegetables and leftovers are often forgotten inside refrigerators until they spoil. By making food visibility easier and more interactive, the system encourages better food management habits and sustainable living.
At this stage, I also finalized the hardware structure:
- ESP32-CAM for AI and image processing
- TFT display for inventory and freshness UI
- Magnetic enclosure for fridge mounting
Wiring It Up
After understanding the system architecture, the next step was connecting the ESP32-CAM with the TFT display and preparing the hardware for AI food recognition.
The goal was to keep the circuit simple, compact, and reliable so it could easily fit inside the magnetic fridge enclosure.
The ESP32-CAM acts as the main controller for:
- camera image capture
- Edge AI processing
- display communication
- inventory logic
The TFT display is connected using SPI communication and is used to show:
- detected food name
- freshness percentage
- color freshness bars
- inventory information
Components Connected
- ESP32-CAM
- TFT Display
- FTDI Programmer
- Jumper wires
- 5V power supply
TFT Display Connections
| TFT Display Pin | ESP32-CAM Pin |
| --- | --- |
| VCC | 3.3V |
| GND | GND |
| SCK | GPIO14 |
| MOSI | GPIO15 |
| CS | GPIO13 |
| DC | GPIO2 |
| RST | GPIO12 |
FTDI Programmer Connections
The FTDI programmer is used to upload code to the ESP32-CAM.
| FTDI | ESP32-CAM |
| --- | --- |
| 5V | 5V |
| GND | GND |
| TX | U0R |
| RX | U0T |
For uploading code:
- GPIO0 must be connected to GND to enter flashing mode
- Press the reset button once the upload starts (while the IDE shows "Connecting...")
- After the upload finishes, disconnect GPIO0 and press reset again to run the new firmware
Hardware Testing
After completing the wiring, I tested:
- camera initialization
- TFT display communication
- power stability
- serial communication
This step was important because stable wiring is necessary before deploying the Edge Impulse TinyML model.
Once the display and camera worked correctly, the hardware was ready for AI model integration and real-time food recognition.
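For reference, the camera-initialization part of this test can be sketched as below. The pin numbers assume the common AI-Thinker ESP32-CAM board variant, and the config field spellings vary slightly between esp32-camera versions (`pin_sscb_sda` vs. `pin_sccb_sda`), so treat this as a starting point rather than drop-in firmware.

```cpp
#include <esp_camera.h>

// Returns true if the camera sensor starts up correctly.
bool initCamera() {
  camera_config_t config = {};
  // Pin mapping for the AI-Thinker ESP32-CAM (assumed board variant).
  config.pin_pwdn  = 32;  config.pin_reset = -1;  config.pin_xclk = 0;
  config.pin_sscb_sda = 26;  config.pin_sscb_scl = 27;  // I2C to the sensor
  config.pin_d7 = 35; config.pin_d6 = 34; config.pin_d5 = 39; config.pin_d4 = 36;
  config.pin_d3 = 21; config.pin_d2 = 19; config.pin_d1 = 18; config.pin_d0 = 5;
  config.pin_vsync = 25;  config.pin_href = 23;  config.pin_pclk = 22;
  config.xclk_freq_hz = 20000000;
  config.ledc_channel = LEDC_CHANNEL_0;
  config.ledc_timer   = LEDC_TIMER_0;
  config.pixel_format = PIXFORMAT_JPEG;
  config.frame_size   = FRAMESIZE_QVGA;  // small frames suit TinyML input
  config.jpeg_quality = 12;
  config.fb_count     = 1;
  return esp_camera_init(&config) == ESP_OK;
}
```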
Configuring TFT_eSPI
After wiring the TFT display, the next step was configuring the TFT_eSPI graphics library inside the Arduino IDE.
The TFT_eSPI library is used to:
- display food names
- draw freshness progress bars
- create the inventory interface
- render the smart fridge UI
This library is lightweight and optimized for ESP32-based displays, making it ideal for real-time embedded graphics.
Installing TFT_eSPI Library
Step 1
Open Arduino IDE.
Step 2
Go to: Sketch → Include Library → Manage Libraries.
Step 3
Search for: TFT_eSPI and install the library by Bodmer.
Editing User Setup File
To make the TFT display work correctly with the ESP32-CAM, the pin configuration inside TFT_eSPI must be edited.
Open `User_Setup.h` inside the TFT_eSPI library folder (for example, Documents/Arduino/libraries/TFT_eSPI/User_Setup.h).
Configure Display Driver
Uncomment the display driver according to your TFT module.
Example for ILI9341:
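Assuming an ILI9341 panel, the corresponding line to uncomment in `User_Setup.h` is:

```cpp
#define ILI9341_DRIVER
```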
Configure SPI Pins
Set the SPI pins according to the project wiring:
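Matching the wiring table from the previous step, the pin defines would look like this (macro names follow TFT_eSPI's `User_Setup.h` conventions; MISO is unused because the display is write-only here):

```cpp
#define TFT_MISO -1   // not connected in this wiring
#define TFT_SCLK 14
#define TFT_MOSI 15
#define TFT_CS   13
#define TFT_DC    2
#define TFT_RST  12
```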
Set Display Resolution
Example:
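For a 2.4″ 240×320 panel the resolution defines are typically the following (note that some drivers, including the ILI9341, hard-code the resolution and ignore these defines):

```cpp
#define TFT_WIDTH  240
#define TFT_HEIGHT 320
```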
Testing the Display
After configuration, upload a simple graphics test sketch to check:
- text rendering
- colors
- screen refresh
- SPI communication
Example test code:
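A minimal sketch along these lines, using only standard TFT_eSPI calls (the text and bar shown are placeholders, not the project's final UI):

```cpp
#include <TFT_eSPI.h>

TFT_eSPI tft = TFT_eSPI();  // uses the pins configured in User_Setup.h

void setup() {
  tft.init();
  tft.setRotation(1);                        // landscape orientation
  tft.fillScreen(TFT_BLACK);                 // clear the screen
  tft.setTextColor(TFT_GREEN, TFT_BLACK);
  tft.setCursor(10, 10);
  tft.print("DecayDock display OK");         // text rendering check
  tft.drawRect(10, 40, 200, 12, TFT_WHITE);  // bar outline
  tft.fillRect(10, 40, 120, 12, TFT_GREEN);  // sample freshness bar
}

void loop() {}
```

If the text and bar render cleanly, the SPI wiring and driver configuration are correct.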
Why This Step Is Important
Configuring TFT_eSPI correctly is important because the TFT display acts as the main user interface of DecayDock.
The display is responsible for showing:
- detected food items
- freshness status
- inventory list
- visual progress bars
- smart reminders
Setting Up Edge Impulse & Collecting the Dataset
After completing the hardware setup, the next step was training the AI model using Edge Impulse.
This is the most important part of the project because the food recognition system completely depends on how well the dataset is collected and trained.
For DecayDock, I wanted the AI model to recognize common refrigerator food items such as:
- tomatoes
- onions
- bananas
- milk packets
- spinach
- leftovers
The complete workflow was inspired by ESP32-CAM object recognition projects using Edge Impulse, but I customized the process specifically for food inventory and freshness tracking applications. (Circuit Digest)
Why I Chose Edge Impulse
I used Edge Impulse because it makes TinyML development easier for embedded systems like ESP32-CAM.
It provides:
- image dataset management
- image labeling
- model training
- testing
- Arduino library deployment
all in one platform.
Another important reason was that Edge Impulse models can run directly on ESP32-CAM without requiring cloud AI processing. This makes the system:
- faster
- low power
- offline capable
- more practical for daily use
Creating the Edge Impulse Project
Step 1 — Create Account
Go to: https://studio.edgeimpulse.com
Create an account and log in.
Step 2 — Create New Project
Click "Create new project" and give it a name (for example, DecayDock). The image and object-detection settings are configured later in the impulse design.
Preparing the Dataset
Instead of downloading random internet datasets, I collected my own images using the ESP32-CAM because I wanted the AI model to work in real refrigerator and kitchen conditions.
This improved:
- practical accuracy
- lighting adaptation
- real-world performance
Food Categories Used
To keep the model lightweight and optimized for ESP32-CAM, I trained only a few important food classes:
| Food Item | Images Collected |
| --- | --- |
| Tomato | 50+ |
| Onion | 50+ |
| Banana | 45+ |
| Milk Packet | 40+ |
| Spinach | 40+ |
| Leftovers | 35+ |
Keeping fewer classes helped improve:
- detection speed
- memory usage
- model accuracy
Capturing Images
Image Collection Process
I used the ESP32-CAM to capture images from:
- different angles
- different distances
- multiple lighting conditions
I intentionally collected images:
- inside kitchen lighting
- near refrigerators
- with cluttered backgrounds
instead of clean studio conditions.
This helps the AI work better in actual daily environments.
Research and Edge Impulse community discussions also suggest that using real device images and multiple viewing angles improves TinyML accuracy significantly. (Edge Impulse Forum)
Important Tips I Followed
1. Fixed Camera Position
I kept the camera angle consistent during testing because stable positioning improves recognition reliability.
2. Plain Background for Initial Training
For early model training, I used a simple background to reduce false detections.
3. Good Lighting
Proper lighting helped improve image clarity and model learning.
4. Smaller Image Resolution
I used a 96×96 image size because smaller resolutions work better for embedded TinyML systems and reduce training time. (Medium)
Uploading the Dataset
After collecting images:
Step 1
Open the Data acquisition tab inside Edge Impulse.
Step 2
Upload all collected images.
Split:
- 80% Training
- 20% Testing
Labeling Images
Next, I opened the Labeling queue and manually labeled each image according to the food category.
Example:
- Tomato
- Onion
- Banana
- Milk
This step teaches the AI model how to identify different food items.
Creating the Impulse
After labeling:
Open the Create impulse page.
Settings used:
| Setting | Value |
| --- | --- |
| Image Size | 96×96 |
| Processing Block | Image |
| Learning Block | Object Detection |
Training the AI Model
Inside the Object detection training block, I trained the TinyML model using the Edge Impulse FOMO architecture because it is optimized for constrained devices like the ESP32-CAM.
The model learns:
- shapes
- colors
- textures
- object outlines
to recognize food items.
Testing Model Accuracy
After training, I tested the model directly inside Edge Impulse.
The model successfully identified:
- tomatoes
- onions
- bananas
- milk packets
with good stability under normal indoor lighting.
Exporting the Arduino Library
After successful training:
Step 1
Open the Deployment page.
Step 2
Select: Arduino library.
Step 3
Download the generated ZIP library.
Installing the AI Library
Extract the downloaded ZIP file and move the Edge Impulse library folder into your Arduino libraries directory, or add it directly via Sketch → Include Library → Add .ZIP Library.
Restart Arduino IDE.
The AI model is now ready to run directly on the ESP32-CAM.
Why This Step Was Important
This step transformed DecayDock from a normal ESP32 camera project into an actual Edge AI system.
Instead of manually entering food items, the device can now:
- recognize food automatically
- process images locally
- work without cloud AI
- provide real-time smart inventory assistance
This is what makes the project:
- practical
- intelligent
- lightweight
- sustainability-focused
while still remaining affordable and maker-friendly. (Circuit Digest)
Testing and Demonstration
After successfully deploying the Edge Impulse model to the ESP32-CAM, the next step was testing the complete food recognition system in real-world conditions.
This was one of the most important stages because I wanted the project to work reliably inside an actual kitchen environment instead of only working under ideal lighting conditions.
The main goal during testing was to verify:
- food recognition accuracy
- display response
- real-time detection speed
- stability under different lighting conditions
Uploading the Final Code
The exported Edge Impulse Arduino library was integrated into the main ESP32-CAM firmware inside Arduino IDE.
The code handled:
- camera initialization
- TinyML inference
- food detection
- TFT display updates
- freshness bar rendering
After uploading the code, the ESP32-CAM started running the AI model directly on-device without cloud processing.
Real-Time Detection Testing
To test the system, I placed different food items in front of the camera one by one.
Examples tested:
- tomato
- onion
- banana
- milk packet
- spinach
The ESP32-CAM successfully identified the food item and displayed:
- item name
- freshness percentage
- color freshness status
on the TFT display.
Freshness UI Testing
The freshness system was tested using simulated storage durations.
Example:
- newly added tomato → green freshness bar
- older stored spinach → yellow warning bar
- expired milk → red status indicator
This helped create a simple and intuitive visual system that users can understand instantly.
Lighting Condition Testing
One challenge during testing was varying refrigerator and kitchen lighting.
The model performed best under:
- moderate indoor lighting
- stable camera positioning
- minimal reflections
To improve reliability, I:
- adjusted camera angle
- increased training image variety
- tested under different room conditions
This improved overall detection consistency.
Performance Observations
The system achieved:
- fast object detection
- smooth TFT updates
- stable Edge AI inference
- low hardware power consumption
Because the AI model runs directly on the ESP32-CAM, the project works offline without requiring internet connectivity.
Making the UI on the Display
After testing the AI food recognition system, the next step was creating a clean and user-friendly interface for the TFT display.
The main goal of the UI was to make the system feel like a real smart appliance instead of just a hardware prototype. I wanted users to instantly understand:
- which food item was detected
- how fresh it is
- which food should be consumed first
using simple visual elements.
UI Design Concept
The interface was designed with a minimal and modern layout inspired by:
- smart kitchen devices
- IoT dashboards
- food delivery applications
The TFT screen displays:
- food image
- detected food name
- AI confidence score
- freshness percentage
- animated progress bar
- freshness color indicator
Example:
🥬 Spinach
Freshness: 72%
🟩🟩🟩🟩🟩🟨⬜⬜⬜⬜
This creates a much more interactive and understandable user experience compared to plain text output.
Software & Libraries Used
| Software / Library | Purpose |
| --- | --- |
| Arduino IDE | Main programming environment |
| TFT_eSPI | TFT graphics rendering |
| TJpg_Decoder | Display food images |
| SPI Library | SPI communication |
| Edge Impulse Library | AI food recognition |
| ESP32 Camera Library | Camera interface |
Why I Used TFT_eSPI
The TFT_eSPI library was chosen because it is:
- lightweight
- fast
- optimized for ESP32
- ideal for embedded UI graphics
It allows:
- drawing shapes
- rendering text
- creating progress bars
- displaying images
- smooth screen updates
This helped make the interface look more polished and responsive.
Displaying Food Images
To improve the visual experience, I added small food images/icons on the TFT display.
Example:
- tomato image
- onion image
- banana image
The images were converted into JPEG byte arrays (C header files) and displayed using the TJpg_Decoder library.
The images are stored inside ESP32 flash memory and loaded dynamically after food detection.
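A hedged sketch of how that display path can work with TJpg_Decoder is shown below. The `TJpgDec` object, `setCallback()`, and `drawJpg()` are real library calls; the `jpg` byte array passed in is a hypothetical asset exported from an image converter.

```cpp
#include <TFT_eSPI.h>
#include <TJpg_Decoder.h>

TFT_eSPI tft = TFT_eSPI();

// Callback TJpg_Decoder uses to push each decoded block to the TFT.
bool tftOutput(int16_t x, int16_t y, uint16_t w, uint16_t h, uint16_t *bitmap) {
  tft.pushImage(x, y, w, h, bitmap);
  return true;  // continue decoding
}

// Draw a food icon stored in flash as a JPEG byte array.
void showFoodIcon(const uint8_t *jpg, uint32_t len) {
  TJpgDec.setJpgScale(1);        // no downscaling
  TJpgDec.setSwapBytes(true);    // match TFT_eSPI byte order
  TJpgDec.setCallback(tftOutput);
  TJpgDec.drawJpg(10, 60, jpg, len);  // icon position is arbitrary here
}
```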
Creating the Freshness Progress Bar
The progress bar is one of the main UI elements of DecayDock.
The bar visually represents freshness condition:
| Color | Meaning |
| --- | --- |
| Green | Fresh |
| Yellow | Consume Soon |
| Red | Expiring |
The progress value decreases over time based on:
- food type
- estimated shelf life
- storage duration
This creates a very intuitive user experience because users can understand freshness instantly without reading detailed information.
UI Layout Structure
The final UI layout contains:
Top Section
- AI detected food name
- confidence score
Middle Section
- food image/icon
Bottom Section
- freshness percentage
- progress bar
- consume reminder
Example:
“Use spinach today.”
Designing the UI
The UI was designed directly using TFT graphics functions inside Arduino IDE.
Main functions used:
- fillScreen()
- drawRect()
- fillRect()
- drawBitmap()
- setCursor()
- print()
These functions helped create:
- boxes
- labels
- progress bars
- image placeholders
- animated indicators
Example UI Code
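As an illustration of how these functions compose into the DecayDock layout, here is a hedged sketch (`drawFoodScreen()` is a hypothetical helper, and the coordinates and thresholds are assumptions, not the project's exact code):

```cpp
#include <TFT_eSPI.h>

// Illustrative screen layout: name + confidence on top, freshness bar below.
void drawFoodScreen(TFT_eSPI &tft, const char *name, int confidence, int percent) {
  tft.fillScreen(TFT_BLACK);

  // Top section: detected food name and AI confidence
  tft.setTextColor(TFT_WHITE, TFT_BLACK);
  tft.setCursor(10, 10);
  tft.print(name);
  tft.setCursor(10, 30);
  tft.print("Confidence: "); tft.print(confidence); tft.print("%");

  // Bottom section: freshness percentage and a color-coded bar
  uint16_t color = (percent >= 60) ? TFT_GREEN
                 : (percent >= 30) ? TFT_YELLOW
                                   : TFT_RED;
  tft.setCursor(10, 180);
  tft.print("Freshness: "); tft.print(percent); tft.print("%");
  tft.drawRect(10, 200, 200, 16, TFT_WHITE);              // bar outline
  tft.fillRect(12, 202, 196 * percent / 100, 12, color);  // filled portion
}
```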
Real-Time UI Updates
Whenever the AI model detects a new food item:
- the previous screen clears
- new food image loads
- progress bar updates
- freshness value changes
This creates a smooth smart-device style experience.
Why This Step Was Important
The UI transformed the project from:
a basic AI detection demo
into:
a practical smart kitchen product prototype.
Instead of showing complicated technical outputs, the system communicates information using:
- images
- colors
- progress bars
- simple reminders
which makes the device:
- easier to use
- visually appealing
- beginner-friendly
- more realistic for everyday users
Assembly in the Enclosure
After completing the hardware testing and TFT interface design, the final step was assembling all the components inside a compact magnetic enclosure.
The main goal during assembly was to make the project look and feel like a real smart home product instead of a temporary breadboard prototype.
I wanted the device to:
- mount easily on a refrigerator
- remain compact and lightweight
- protect the electronics
- keep wiring organized
- improve overall presentation quality
Enclosure Design
The enclosure was designed as a small rectangular fridge-mounted module with:
- front camera opening
- TFT display cutout
- internal space for ESP32-CAM
- cable management section
- rear magnetic mounting support
The design keeps the front side clean while hiding most wiring and electronics internally.
Components Mounted Inside
The following components were fixed inside the enclosure:
| Component | Placement |
| --- | --- |
| ESP32-CAM | Rear internal section |
| TFT Display | Front display opening |
| Wiring Connections | Side cable channels |
| USB Power Cable | Bottom exit slot |
| Magnets | Rear panel |
Mounting the TFT Display
The TFT display was aligned carefully with the front display window so the interface remained clearly visible.
To secure the display:
- hot glue
- double-sided foam tape
- small mounting supports
were used.
I intentionally left a slight bezel around the screen to give it a more realistic consumer-device appearance.
Positioning the ESP32-CAM
The ESP32-CAM was mounted behind the front panel with the camera aligned through a circular camera opening.
This positioning helped:
- improve image capture angle
- protect the lens
- reduce visible wiring
- maintain a cleaner design
The camera angle was adjusted slightly downward because most food items would be scanned from below during testing.
Magnetic Fridge Mount
To make installation simple and user-friendly, strong neodymium magnets were attached behind the enclosure.
This allowed the device to:
- stick directly to the refrigerator
- move easily when needed
- avoid drilling or permanent installation
The magnetic mounting system also made the prototype feel more like an actual smart kitchen accessory.
Cable Management
During assembly, special attention was given to cable management because exposed wires can make prototypes look unfinished.
To improve the appearance:
- wires were shortened
- cable ties were added
- internal routing was organized
- extra jumper wires were removed
This gave the project a cleaner and more professional hardware-maker look.
Final Power Setup
The system is powered using a 5V USB power module connected through the ESP32-CAM.
The USB cable exits through a small slot at the bottom of the enclosure to keep the front side minimal and uncluttered.
Final Testing After Assembly
After enclosure assembly, I performed one final system test to verify:
- camera visibility
- TFT display readability
- stable power connection
- food recognition performance
- enclosure heat management
The device successfully operated while mounted vertically on a refrigerator surface.
Code
Conclusion
Building DecayDock was much more than just making another electronics project. It started from a very simple real-life observation inside my home: seeing vegetables and leftovers get wasted because they were forgotten inside the refrigerator. At first, it looked like a small daily habit, but while researching, I realized how strongly household food waste is connected to larger global issues like hunger, climate change, methane emissions, and unnecessary resource consumption.
What made this journey special for me was the process of turning a small idea into a working Edge AI product using affordable maker hardware. From collecting food datasets using the ESP32-CAM, training TinyML models in Edge Impulse, debugging wiring problems, designing the TFT interface, and finally assembling everything inside the magnetic enclosure — every stage taught me something new.
There were many moments where things did not work properly:
- wrong detections
- unstable wiring
- display issues
- lighting problems during testing
but solving those challenges was also the most enjoyable part of the project. Watching the system finally recognize a tomato or onion in real time on the TFT screen genuinely felt exciting because it transformed the project from just code and circuits into something interactive and meaningful.
One thing I learned during this project is that sustainability does not always require large industrial systems or expensive technology. Sometimes even a small device placed on a refrigerator can help people become more aware of their daily habits and reduce waste little by little.
DecayDock represents the idea that technology should not only be smart, but also responsible and human-centered. The project combines:
- embedded AI
- sustainability
- everyday usability
- simple human behavior
into one practical solution.
Most importantly, this project reminded me why I enjoy hardware making so much — the ability to take a real-world problem, experiment creatively, learn through failures, and finally build something that could genuinely help people in daily life.