AI Powered RC Car

by waffle566 in Circuits > Electronics


This project is an AI-powered RC car that autonomously follows a person using computer vision, responds to hand gestures for control, and avoids obstacles using an ultrasonic sensor.

The car uses a Raspberry Pi 5 with a camera as its brain, running Google's MediaPipe library for real-time person detection and hand gesture recognition. An Arduino Nano handles the low-level motor control, receiving serial commands from the Pi and driving four TT gearbox motors through an L298N motor driver in a tank-style configuration.

When powered on, the car automatically locks onto the first person it sees and begins following them. The owner can control the car entirely through hand gestures — an open palm stops the car, a thumbs up resumes following, pointing left or right redirects the car to a new target, and waving brings it back to the owner. An ultrasonic sensor on the front acts as an emergency bumper, stopping the car if it gets within 30cm of any obstacle.

The entire system starts automatically on boot — no monitor, keyboard, or manual commands needed. Just flip the power switch and the car is ready within 30 seconds.

This guide covers everything from wiring the hardware and installing the software to testing and calibrating each subsystem and assembling it all into a 3D-printed enclosure.


(The enclosure in the photos may look different: the photos show my v1 print, while the STL linked here is an improved version.)

Supplies

Hardware:

  1. L298N Motor Driver
  2. Mini Breadboard
  3. TT Gearbox Motors (x4)
  4. 7.2V RC Battery, 5000mAh (any works fine, but 7.2V or more is preferred)
  5. Raspberry Pi and Raspberry Pi Camera
  6. Heatsink for Raspberry Pi (Optional)
  7. Arduino Nano
  8. Ultrasonic Sensor
  9. USB-C plug-to-bare-wire adapter
  10. Power Switch
  11. Jumper Wires (M to M and F to M)
  12. Some sort of storage device for the Raspberry Pi (MicroSD Card is preferred)
  13. Threaded inserts (RX-M3x5.7 & RX-M2x4)
  14. Raw Wire (Multicolor is preferred, at least 20AWG)
  15. Tamiya plug (male) to bare-wire adapter
  16. Buck Converter (any model works, but make sure it is rated for over 5A or your Raspberry Pi will crash while booting up the AI)
  17. Assorted screws in various lengths (M2 and M3 sizes are needed)
  18. Raspberry Pi 5 Adapter Cable

Tools:

  1. Soldering Iron
  2. Wire Stripper
  3. Screwdriver
  4. Wire Cutters
  5. Glue Gun

Connecting the Hardware

Overview

This AI-powered RC car is built with autonomous person-following, gesture-based control, and obstacle avoidance in mind. The system integrates computer vision, real-time motor control, and ultrasonic sensing to create a robot that can identify, lock onto, and follow a person — while responding to hand gestures and avoiding collisions.

The following are the main hardware blocks in the system.

Processing Unit

At the core of the robot is a Raspberry Pi 5 (8GB RAM) running Raspberry Pi OS Bookworm (64-bit). This quad-core ARM Cortex-A76 single-board computer was selected for its ability to run real-time AI inference using MediaPipe and OpenCV, handling person detection, pose estimation, and hand gesture recognition simultaneously through a camera feed. The Pi serves as the brain of the system — capturing frames, making tracking decisions, and sending motor commands over serial.

Motor Control Unit

An Arduino Nano microcontroller acts as the dedicated motor controller, receiving serial commands from the Raspberry Pi over USB. This separation isolates the time-sensitive PWM motor signals from the Linux operating system's non-deterministic timing, and protects the Pi from electrical noise generated by the motors. The Arduino handles all low-level motor logic including speed control via PWM, direction switching, and a 500ms safety timeout that automatically stops the motors if communication with the Pi is lost.
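The 500ms failsafe lives in the Arduino firmware's loop(), but the pattern is easy to see in Python. The class below is a hypothetical sketch of the idea (not the actual firmware): every received command resets a timestamp, and if no command arrives within the timeout the motors are forced to stop exactly once.

```python
import time

class MotorWatchdog:
    """Sketch of the Nano's 500 ms failsafe, expressed in Python.

    Hypothetical illustration only: the real check runs inside the
    Arduino firmware's loop(), but the logic is the same.
    """
    TIMEOUT_S = 0.5

    def __init__(self):
        self.last_command = time.monotonic()
        self.stopped = False

    def on_command(self):
        """Call whenever a serial command arrives from the Pi."""
        self.last_command = time.monotonic()
        self.stopped = False

    def tick(self):
        """Call every loop iteration; returns True when a stop is forced."""
        if not self.stopped and time.monotonic() - self.last_command > self.TIMEOUT_S:
            self.stopped = True  # serial link lost: kill the motors
            return True
        return False
```

Because the stop fires only once per dropout, the firmware can re-arm as soon as the Pi resumes sending commands.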

Vision System

A Raspberry Pi Camera Module 3 is connected via CSI ribbon cable to the Pi's CAM0 port, mounted at the front of the car and angled upward approximately 20–30 degrees. This upward tilt allows the low-mounted camera to capture a standing person's full body and hands from 1–2 meters away, which is essential for both person tracking via MediaPipe Pose and hand gesture recognition via MediaPipe Hands.

Motor Driver and Motors

For propulsion, the robot uses four AEDIKO TT DC gearbox motors (rated 3–6V, 200RPM, 1:48 gear ratio), arranged in a tank-style configuration — two on the left, two on the right. Each pair is wired in parallel and driven by one channel of an L298N dual H-bridge motor driver. The left pair connects to outputA (controlled by Arduino pins D3, D2, and D4) and the right pair connects to output B (controlled by Arduino pins D5, D7, and D8). The tank-style differential steering allows the car to turn by varying the speed between the left and right motor pairs, including spinning in place for sharp turns.
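The differential steering described above boils down to a small mixing function: a throttle value and a turn value combine into left and right PWM speeds. The helper below is a hypothetical sketch (the project's real mixing lives elsewhere in the code), shown to make the geometry concrete.

```python
def mix_tank_drive(throttle: float, turn: float, max_pwm: int = 255):
    """Convert throttle/turn in [-1, 1] to (left, right) PWM values.

    Hypothetical helper, not the project's actual API. Positive turn
    steers right by speeding up the left pair and slowing the right.
    """
    left = throttle + turn
    right = throttle - turn
    # Normalize so neither side exceeds 1.0, preserving the ratio.
    peak = max(1.0, abs(left), abs(right))
    left, right = left / peak, right / peak
    return int(left * max_pwm), int(right * max_pwm)

print(mix_tank_drive(1.0, 0.0))  # straight ahead: (255, 255)
print(mix_tank_drive(0.0, 1.0))  # spin in place:  (255, -255)
```

A negative PWM value here just means "reverse that side", which on the L298N translates to flipping the IN pin pair while keeping the same enable duty cycle.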

Obstacle Detection

An HC-SR04 ultrasonic distance sensor is mounted low on the front of the car, pointing straight ahead parallel to the ground. It provides a complementary layer of awareness to the camera — while the camera handles high-level person tracking and gesture recognition, the ultrasonic sensor acts as a low-level emergency bumper guard, triggering an immediate stop when any obstacle is detected within 30cm. The sensor is connected to Arduino pins D9 (trigger) and D10 (echo) and is polled continuously in a background thread.
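The HC-SR04 reports distance as the width of the echo pulse: sound travels out and back, so the one-way distance is pulse duration times the speed of sound, divided by two. The Arduino firmware does this conversion on-board; the Python version below is just a reference sketch of the same arithmetic.

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # at roughly 20 degrees C

def echo_us_to_cm(duration_us: float) -> float:
    """Convert an HC-SR04 echo pulse width to a one-way distance in cm.

    Reference-only sketch of the math the Arduino firmware performs.
    The pulse covers the round trip, so divide by 2.
    """
    return duration_us * SPEED_OF_SOUND_CM_PER_US / 2

# A 1750 us echo is about 30 cm, right at the emergency-stop threshold.
print(round(echo_us_to_cm(1750), 1))  # 30.0
```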

Power Management

The system is powered by a 7.2V 5000mAh battery pack, providing sufficient capacity for extended operation. Power is distributed to two separate subsystems. The battery feeds the L298N motor driver directly through its +12V power terminal (the L298N accepts 7–12V despite the label), which in turn delivers approximately 5.8V to the motors after its internal 1.4V voltage drop — safely within the motors' 3–6V rating. A separate buck converter (set to 5.1V output) steps the battery voltage down to power the Raspberry Pi 5 via USB-C, ensuring a clean and stable 5V at 3A+ supply. The Arduino Nano is powered through USB from the Pi. A physical toggle switch is wired inline with the battery's positive lead, allowing the entire system to be powered on and off with a single flip.

Auto-Start System

A systemd service is configured on the Raspberry Pi to automatically launch the AI controller on boot, eliminating the need for a keyboard, monitor, or manual commands. The complete startup flow is: flip the power switch, the Pi boots, and the AI begins tracking within approximately 30 seconds.




Step-by-Step Connection Instructions

The following instructions describe how to connect every component, starting from the Raspberry Pi and working outward. Complete each step in order before moving to the next.

Step 1.1 — Install the Camera on the Raspberry Pi

Locate the CAM0 port on the Raspberry Pi 5. It is a small flat connector with a black plastic latch. Gently lift the black latch upward with your fingernail. Take the Pi Camera Module 3's ribbon cable and slide it into the connector with the metal contacts facing the circuit board. Press the black latch back down until it clicks. The ribbon cable should be firmly seated and not pull out easily. Mount the camera at the front of the car, angled upward approximately 20–30 degrees so it can see a standing person from about 1.5 meters away.

Step 1.2 — Connect the Raspberry Pi to the Arduino Nano

Take a USB cable with a USB-A plug on one end and a mini-USB (or micro-USB, depending on your Nano clone) plug on the other end. Plug the USB-A end into any of the USB-A ports on the Raspberry Pi 5. Plug the other end into the Arduino Nano. This single cable carries both the 5V power to run the Arduino and the serial data used for motor commands. No additional wiring is needed between these two boards.

Step 1.3 — Prepare the L298N Motor Driver

Before connecting any wires to the L298N, locate the two small plastic jumper caps sitting on the A enable and B enable pins. These are small black or blue plastic clips bridging two pins. Pull both of them off and set them aside — you will not need them again. Removing these jumpers allows the Arduino to control motor speed through PWM. If they are left on, the motors will only run at full speed with no speed control. Leave the 5V enable jumper in place — this one stays on to power the L298N's internal logic from the battery.

Step 1.4 — Wire the Battery to the Power Switch

Take your 7.2V 5000mAh battery pack. Cut the red (+) wire and connect each cut end to one terminal of a toggle switch or rocker switch (rated for at least 3A at 12V). When the switch is flipped on, current flows from the battery to the rest of the system. When flipped off, everything loses power. The black (−) wire from the battery remains uncut.

Step 1.5 — Wire the Battery to the L298N Motor Driver

From the switch's output side, run the red (+) wire to the L298N's +12V power screw terminal (the leftmost terminal in the power section). Despite the label saying 12V, the L298N accepts 7–12V, so your 7.2V battery works fine. Connect the battery's black (−) wire to the power GND screw terminal, which is the terminal directly next to the +12V power terminal. Do not connect anything to the +5V power terminal — leave it empty.

Step 1.6 — Wire the Battery to the Buck Converter

You need to split the battery power so it also feeds the buck converter. Run a second red (+) wire from the switch's output side to the buck converter's IN+ terminal. Run a second black (−) wire from the battery's negative terminal (or the L298N's power GND terminal, since they share the same ground) to the buck converter's IN− terminal. Before connecting anything else, use a multimeter to measure the output side of the buck converter. Turn the small screw on the buck converter until the output reads exactly 5.1V. Getting this voltage right before connecting the Pi is important — too high could damage the Pi, too low will cause instability.

Step 1.7 — Wire the Buck Converter to the Raspberry Pi

Connect the buck converter's OUT+ to the Pi 5's USB-C power input. You can do this by either using a buck converter module that has a built-in USB-C output, or by cutting a USB-C cable, identifying the red (5V) and black (GND) wires inside, and connecting them to the buck converter's OUT+ and OUT− respectively. The Raspberry Pi 5 requires a solid 5V at 3A minimum. For development and testing, use a proper USB-C wall charger instead and save the buck converter for when the car is running untethered.

Step 1.8 — Wire the Arduino Nano to the L298N (Signal Wires)

Connect the following six signal wires from the Arduino Nano to the L298N motor driver. These are all 5V logic-level signals carried by standard jumper wires:

  1. Arduino pin D3 to the A enable pin on the L298N. This controls the left motor speed via PWM.
  2. Arduino pin D5 to the B enable pin on the L298N. This controls the right motor speed via PWM.
  3. Arduino pin D2 to IN1 on the L298N. This is the left motor forward signal.
  4. Arduino pin D4 to IN2 on the L298N. This is the left motor reverse signal.
  5. Arduino pin D7 to IN3 on the L298N. This is the right motor forward signal.
  6. Arduino pin D8 to IN4 on the L298N. This is the right motor reverse signal.
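Together, the four direction pins form a small truth table: each drive command is just a pattern of HIGH/LOW levels on IN1–IN4 plus a PWM duty cycle on the enable pins. The mapping below uses the usual L298N convention; the command names are illustrative, not the firmware's actual identifiers.

```python
# Pin assignments from the wiring steps above.
IN1, IN2, IN3, IN4 = "D2", "D4", "D7", "D8"  # left fwd/rev, right fwd/rev

# Direction-pin levels per drive command (1 = HIGH). Hypothetical
# command names; the real firmware may label these differently.
DRIVE_TABLE = {
    "forward":    (1, 0, 1, 0),
    "reverse":    (0, 1, 0, 1),
    "spin_left":  (0, 1, 1, 0),  # left pair reverse, right pair forward
    "spin_right": (1, 0, 0, 1),
    "stop":       (0, 0, 0, 0),
}

def pin_states(command: str) -> dict:
    """Map a drive command to {pin: level} for the four direction pins."""
    return dict(zip((IN1, IN2, IN3, IN4), DRIVE_TABLE[command]))

print(pin_states("spin_left"))  # {'D2': 0, 'D4': 1, 'D7': 1, 'D8': 0}
```

Speed is independent of this table: D3 (A enable) and D5 (B enable) carry the PWM duty cycle that scales whatever direction the table selects.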

Step 1.9 — Wire the Arduino Ground to the L298N

Run a wire from any GND pin on the Arduino Nano to the power GND screw terminal on the L298N (the same terminal where the battery's black wire is connected). This shared ground connection is essential — without it, the 5V signal wires will not be referenced correctly and the motor driver will not respond to commands.

Step 1.10 — Wire the Left Motors to the L298N

Take the two left-side TT motors (left front and left rear). Each motor has a red wire (+) and a black wire (−). Twist the two red wires together and screw them into the first slot of the outputA screw terminal (output 1). Twist the two black wires together and screw them into the second slot of the outputA screw terminal (output 2). This parallel wiring means both left motors receive the same voltage and spin together as a single unit.

Step 1.11 — Wire the Right Motors to the L298N

Take the two right-side TT motors (right front and right rear). Twist both red wires together and screw them into the first slot of the output B screw terminal (output 3). Twist both black wires together and screw them into the second slot of the output B screw terminal (output 4). If a motor spins in the wrong direction during testing, swap that side's red and black wires at the screw terminal — no code changes are needed.

Step 1.12 — Wire the HC-SR04 Ultrasonic Sensor

The HC-SR04 has four pins in a row labeled VCC, TRIG, ECHO, and GND. Connect them as follows:

  1. VCC to the Arduino Nano's 5V pin. This provides power to the sensor.
  2. TRIG to Arduino pin D9. The Arduino sends a short pulse on this pin to trigger a measurement.
  3. ECHO to Arduino pin D10. The sensor returns a pulse on this pin whose duration indicates the distance.
  4. GND to any GND pin on the Arduino Nano.

Mount the HC-SR04 on the front of the car, facing straight ahead, positioned low and parallel to the ground. It should be below the camera so it catches ground-level obstacles like chair legs, walls, and furniture.

Step 1.13 — Final Check Before Powering On

Before flipping the power switch, verify the following connections:

  1. The battery's red wire passes through the toggle switch before splitting to the buck converter and the L298N's +12V power terminal.
  2. The battery's black wire connects to the L298N's power GND terminal and the buck converter's IN−.
  3. The buck converter output is set to 5.1V and connected to the Pi via USB-C.
  4. The A enable and B enable jumper caps on the L298N have been removed.
  5. The 5V enable jumper on the L298N is still in place.
  6. Six signal wires run from the Arduino (D2, D3, D4, D5, D7, D8) to the L298N (IN1, A enable, IN2, B enable, IN3, IN4).
  7. One ground wire runs from the Arduino GND to the L298N power GND.
  8. The left motors' red and black wires are twisted together and screwed into outputA (outputs 1 and 2).
  9. The right motors' red and black wires are twisted together and screwed into output B (outputs 3 and 4).
  10. The HC-SR04 is connected to the Arduino's 5V, GND, D9, and D10 pins.
  11. The Pi Camera ribbon cable is seated in the CAM0 port with the latch pressed down.
  12. The USB cable connects the Pi's USB-A port to the Arduino Nano.

Once everything is verified, the hardware is fully connected and ready for software installation in the next step.

Uploading the Software

Software Overview

The software stack is split across two devices — the Arduino Nano and the Raspberry Pi 5 — each running code tailored to its role in the system.

The Arduino Nano runs a lightweight C firmware that listens for serial commands from the Pi and translates them into PWM motor signals and direction pin states on the L298N. It also reads the HC-SR04 ultrasonic sensor on demand and reports distance values back over serial. The firmware includes a 500ms safety timeout — if the Pi stops sending commands for any reason, the Arduino automatically stops all motors to prevent the car from driving uncontrolled.

The Raspberry Pi 5 runs the AI stack in Python. This consists of four modules working together: a person and object tracker using MediaPipe Pose and OpenCV, a hand gesture recognizer using MediaPipe Hands, a serial communication handler that sends drive commands to the Arduino, and a main controller that ties everything together into a real-time control loop. On each camera frame, the Pi identifies the target person, checks for hand gestures, reads the obstacle distance, decides what the car should do, and sends the appropriate motor command — all within a single frame cycle.
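The per-frame decision described above can be condensed into a single function: gestures and the obstacle distance override everything, then the tracker's output steers the car. The version below is a deliberately simplified, hypothetical sketch, not the project's actual main.py logic.

```python
def decide(gesture, distance_cm, target_offset, stop_cm=30):
    """Pick a drive action for one camera frame.

    Hypothetical simplification of the control loop described above.
    gesture: recognized gesture name, or None
    distance_cm: latest ultrasonic reading in centimetres
    target_offset: horizontal offset of the target in [-1, 1],
                   or None if no person is currently visible
    """
    if gesture == "STOP" or distance_cm < stop_cm:
        return "stop"       # gesture halt, or the emergency bumper fired
    if target_offset is None:
        return "search"     # lost the owner: spin to reacquire
    if target_offset < -0.2:
        return "turn_left"
    if target_offset > 0.2:
        return "turn_right"
    return "forward"

print(decide(None, 120.0, 0.05))  # clear path, centred target: forward
```

Keeping the decision pure like this (inputs in, action out) is also what makes the subsystem tests in the later steps possible without the car moving.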

The Pi runs Raspberry Pi OS Bookworm (64-bit), which was selected specifically because it ships with Python 3.11, the latest version supported by Google's MediaPipe library. The newer Trixie release ships with Python 3.13, which MediaPipe does not yet support.

A systemd service is configured to launch the AI controller automatically on boot, so the car begins operating within approximately 30 seconds of being powered on — no monitor, keyboard, or manual commands required.

Step-by-Step Software Installation

Step 2.1 — Flash Raspberry Pi OS Bookworm to the SD Card

On a separate computer (not the Pi), download and install Raspberry Pi Imager from https://www.raspberrypi.com/software/. Insert a microSD card (16GB or larger recommended) into the computer. Open Raspberry Pi Imager and select the following:

  1. Device: Raspberry Pi 5
  2. Operating System: Raspberry Pi OS (Legacy, 64-bit) — this is Bookworm
  3. Storage: your microSD card

Before flashing, click the settings icon and configure the following:

  1. Set a hostname (e.g. raspberrypi)
  2. Set a username and password (e.g. admin / your chosen password)
  3. Enable SSH under the Services tab
  4. Enter your Wi-Fi network name and password

Click Write and wait for the flash to complete. Insert the SD card into the Raspberry Pi 5 and power it on using a USB-C wall charger (not the buck converter — use a proper power supply for setup).

Step 2.2 — Connect to the Raspberry Pi

Wait approximately 60 seconds for the Pi to boot. Connect to it from another computer using SSH:



ssh admin@raspberrypi.local

Enter the password you set during flashing. If the hostname does not resolve, find the Pi's IP address from your router's admin page and use that instead:



ssh admin@192.168.x.x

Alternatively, connect a monitor, keyboard, and mouse directly to the Pi and open a terminal window.

Step 2.3 — Update the System

Run the following commands to ensure all system packages are up to date:



sudo apt update && sudo apt upgrade -y

This may take several minutes depending on your internet speed.

Step 2.4 — Install System Dependencies

Install the packages required by the AI software:



sudo apt install -y python3-pip python3-venv python3-opencv espeak-ng libcap-dev libatlas-base-dev

These provide Python package management, the OpenCV computer vision library, a text-to-speech engine (for future speaker support), and linear algebra libraries used by MediaPipe.

Step 2.5 — Add the User to the Serial Group

The Arduino communicates with the Pi over a USB serial connection. By default, the user does not have permission to access serial ports. Run the following command to grant access:



sudo usermod -a -G dialout admin

After running this command, log out and log back in for the change to take effect:



logout

Then reconnect via SSH or reopen the terminal.

Step 2.6 — Upload the Arduino Firmware

This step is done on any computer with the Arduino IDE installed — it does not need to be done on the Pi.

  1. Open the Arduino IDE.
  2. Open the file motor_controller.ino (attached at the bottom of this step as a .ino file) from the project files.
  3. Connect the Arduino Nano to the computer via USB.
  4. Go to Tools → Board and select "Arduino Nano".
  5. Go to Tools → Processor and select "ATmega328P (Old Bootloader)" if the Nano is a clone, or "ATmega328P" if it is genuine.
  6. Go to Tools → Port and select the port that appeared when the Nano was plugged in.
  7. Click Upload and wait for the upload to complete.

To verify the firmware is working, open the Serial Monitor (set to 9600 baud) and type S followed by Enter. The Arduino should respond with OK. Type D followed by Enter and it should respond with DIST, followed by a distance value in centimetres.
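The two reply formats are simple enough to handle with a few lines on the Pi side. The parser below is a hypothetical sketch matching the replies described above ("OK" for an acknowledged command, "DIST" plus a value for a distance report); the real motor_serial.py may structure this differently.

```python
def parse_response(line: str):
    """Parse one reply line from the Nano's 9600-baud serial protocol.

    Hypothetical parser for the replies described in this step:
    returns ("ack", None), ("distance", cm), or ("error", raw_line).
    """
    line = line.strip()
    if line == "OK":
        return ("ack", None)
    if line.startswith("DIST"):
        try:
            return ("distance", float(line.split()[1]))
        except (IndexError, ValueError):
            return ("error", line)
    return ("error", line)

print(parse_response("DIST 27.3"))  # ('distance', 27.3)
```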

After uploading, disconnect the Arduino from the computer. It will be connected to the Pi in a later step.

Step 2.7 — Copy the Project Files to the Raspberry Pi

Transfer the project folder rc-car-ai (attached at the bottom of this page) to the Raspberry Pi. There are several ways to do this:

Using SCP from another computer on the same network:



scp -r rc-car-ai admin@raspberrypi.local:/home/admin/Desktop/

Using a USB flash drive: copy the folder to a USB drive, plug it into the Pi, and copy the files to /home/admin/Desktop/rc-car-ai.

The project folder should contain the following files in /home/admin/Desktop/rc-car-ai/:

  1. main.py — the main AI controller
  2. tracker.py — person and object tracking module
  3. gesture.py — hand gesture recognition module
  4. motor_serial.py — serial communication with the Arduino
  5. speaker.py — voice alert module (for future use)
  6. requirements.txt — Python package list

Step 2.8 — Create the Python Environment and Install Packages

Navigate to the project folder and create a virtual environment with access to system-installed packages:



cd /home/admin/Desktop/rc-car-ai
python3 -m venv --system-site-packages env
source env/bin/activate
pip install mediapipe pyserial numpy

The --system-site-packages flag is important — it allows the virtual environment to use the system-installed OpenCV and picamera2 libraries, which are more reliable on the Pi than pip-installed versions. The MediaPipe installation may take a few minutes.

Step 2.9 — Test the Software

Plug the Arduino Nano into one of the Pi's USB-A ports. Verify the Pi can see it:



ls /dev/ttyUSB* /dev/ttyACM*

This should show a device like /dev/ttyUSB0 or /dev/ttyACM0. Then run the AI controller with the debug camera preview:



cd /home/admin/Desktop/rc-car-ai
source env/bin/activate
python3 main.py --debug

If a monitor is connected, a camera preview window will appear showing the live feed with tracking overlays. The terminal should display:



[MotorSerial] Connected to Arduino on /dev/ttyUSB0
[Main] Initializing tracker...
[Main] Initializing gesture recognizer...
[Main] Pi Camera initialized via picamera2
[Main] Running! Press Ctrl+C to stop.

Stand in front of the camera. When the system detects a person, it will print [Main] Owner locked! and begin following. Press Ctrl+C to stop.

If running without a monitor, omit the --debug flag:



python3 main.py
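One note on the serial device: depending on the Nano clone, it may enumerate as /dev/ttyUSB0 or /dev/ttyACM0, and the name can change between boots. A small helper like the one below (hypothetical, not part of the project files) can pick whichever node appears instead of hard-coding the path.

```python
import glob

def find_arduino_port(candidates=None):
    """Return the first serial device that looks like the Nano.

    Hypothetical helper: scans /dev/ttyUSB* and /dev/ttyACM*, the same
    nodes checked by the `ls` command above. Pass a `candidates` list
    to exercise the logic without hardware attached.
    """
    if candidates is None:
        candidates = glob.glob("/dev/ttyUSB*") + glob.glob("/dev/ttyACM*")
    return sorted(candidates)[0] if candidates else None

print(find_arduino_port(["/dev/ttyUSB0"]))  # /dev/ttyUSB0
```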

Step 2.10 — Configure Auto-Start on Boot

Create a systemd service file so the AI controller launches automatically every time the Pi boots:



sudo nano /etc/systemd/system/rc-car-ai.service

Paste the following content into the editor:



[Unit]
Description=AI RC Car Controller
After=network.target

[Service]
Type=simple
User=admin
WorkingDirectory=/home/admin/Desktop/rc-car-ai
ExecStart=/home/admin/Desktop/rc-car-ai/env/bin/python3 /home/admin/Desktop/rc-car-ai/main.py
Restart=on-failure
RestartSec=5
SupplementaryGroups=video dialout audio

[Install]
WantedBy=multi-user.target

Save the file by pressing Ctrl+O, then Enter, then Ctrl+X to exit the editor. Enable the service:



sudo systemctl daemon-reload
sudo systemctl enable rc-car-ai

Step 2.11 — Verify Auto-Start

Reboot the Pi to confirm the service starts automatically:



sudo reboot

Wait approximately 30 seconds, then reconnect via SSH and check the service status:



sudo systemctl status rc-car-ai

The output should show "active (running)" in green. To view the live AI logs:



journalctl -u rc-car-ai -f

Press Ctrl+C to stop watching the logs.

To temporarily stop the auto-start service for manual testing:



sudo systemctl stop rc-car-ai

To disable auto-start entirely:



sudo systemctl disable rc-car-ai

Step 2.12 — Software Installation Complete

At this point, both the Arduino and the Raspberry Pi are running their respective software. The Arduino is listening for serial motor commands, and the Pi is running the AI controller that captures camera frames, tracks people, recognizes gestures, reads obstacle distances, and sends drive commands — all starting automatically on boot with no user interaction required beyond flipping the power switch.

Testing and Calibration

Overview

With the hardware connected and software installed, this step verifies that every component works correctly before the system is assembled into the enclosure. Each subsystem is tested individually first, then everything is tested together as a complete unit.

Step-by-Step Testing


Step 3.1 — Power On

Connect the Pi to a USB-C wall charger (not the battery — use a reliable power source for testing). Plug the Arduino Nano into the Pi's USB port. Stop the auto-start service so you can run the software manually with visible output:



sudo systemctl stop rc-car-ai
cd /home/admin/Desktop/rc-car-ai
source env/bin/activate

Step 3.2 — Test the Motors

Open a Python shell to send motor commands directly:



python3 -c "
from motor_serial import MotorSerial
import time
m = MotorSerial()
m.connect()
time.sleep(2)
m.forward(150)
time.sleep(2)
m.stop()
m.spin_left(150)
time.sleep(1)
m.stop()
m.spin_right(150)
time.sleep(1)
m.stop()
m.close()
"

Watch the motors during this test. The car should drive forward for 2 seconds, spin left for 1 second, then spin right for 1 second. If a side drives backward when it should go forward, swap the red and black wires for that side at the L298N output terminal.

Step 3.3 — Test the Ultrasonic Sensor



python3 -c "
from motor_serial import MotorSerial
import time
m = MotorSerial()
m.connect()
time.sleep(2)
for i in range(10):
    d = m.request_distance()
    print(f'Distance: {d:.1f} cm')
    time.sleep(0.5)
m.close()
"

Place your hand in front of the HC-SR04 sensor at various distances. The readings should roughly match the actual distance. If it always reads 999.0, check the TRIG and ECHO wires are connected to the correct Arduino pins (D9 and D10).
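A single spurious echo should not slam the brakes. One common approach, sketched here as a hypothetical add-on (assume the project code may already do something similar), is to median-filter the last few readings and drop the 999.0 sentinel before comparing against the 30cm threshold.

```python
from statistics import median

SENTINEL = 999.0  # value reported for a failed reading

def filtered_distance(readings, window=5):
    """Median-filter recent distance readings, ignoring sentinels.

    Hypothetical smoothing step: drops 999.0 error values and takes
    the median of the rest, so one bad echo cannot trigger a false
    emergency stop. Returns None if every reading in the window failed.
    """
    valid = [r for r in readings[-window:] if r != SENTINEL]
    return median(valid) if valid else None

print(filtered_distance([31.0, 999.0, 30.5, 29.8, 999.0]))  # 30.5
```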

Step 3.4 — Test the Camera and Tracking

Run the AI controller with the debug preview (requires a monitor connected to the Pi):



python3 main.py --debug

Stand approximately 1.5 meters in front of the camera. A green bounding box should appear around you in the preview window, and the terminal should print [Main] Owner locked!. If the preview mostly shows floor, tilt the camera up. If it mostly shows ceiling, tilt it down.

Step 3.5 — Test the Gestures

While the debug preview is running, test each gesture:

  1. Open palm — hold your hand up with all fingers spread. The terminal should print [Main] GESTURE: STOP and the motors should stop.
  2. Thumbs up — the terminal should print [Main] GESTURE: RESUME and the car should start following again.
  3. Point left or right — extend only your index finger and point. The terminal should print [Main] GESTURE: POINT LEFT or POINT RIGHT.
  4. Wave — wave your hand side to side with fingers open. The terminal should print [Main] GESTURE: RETURN TO OWNER.

Hold each gesture steadily for about 1 second. If gestures are not detected, move closer to the camera or ensure your hand is well-lit.
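The one-second hold works as a per-frame debounce: a gesture only counts once it has been detected on enough consecutive frames. The class below is a hypothetical illustration of that pattern, assuming roughly 15 FPS (so 15 consecutive frames is about one second); the project's gesture.py may implement it differently.

```python
class GestureDebouncer:
    """Require a gesture to persist for N consecutive frames.

    Hypothetical sketch of the ~1-second hold described above. At
    15 FPS, hold_frames=15 corresponds to about one second.
    """
    def __init__(self, hold_frames=15):
        self.hold_frames = hold_frames
        self.current = None
        self.count = 0

    def update(self, gesture):
        """Feed one frame's detection; return the gesture once confirmed."""
        if gesture == self.current:
            self.count += 1
        else:
            self.current, self.count = gesture, 1
        if gesture is not None and self.count == self.hold_frames:
            return gesture  # fires exactly once per held gesture
        return None

deb = GestureDebouncer(hold_frames=3)
print([deb.update(g) for g in ["STOP", "STOP", "STOP", "STOP"]])
# [None, None, 'STOP', None]
```

Firing exactly once per hold (rather than on every frame past the threshold) prevents a single raised palm from spamming repeated stop commands.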

Step 3.6 — Adjust Speed

If the car drives too fast or aggressively, you can reduce the maximum speed. Stop the program with Ctrl+C and restart with a lower speed:



python3 main.py --debug --speed 120

The speed value ranges from 0 to 255. Start with 120 for indoor testing and increase once you are comfortable with the car's behaviour.

To make the speed change permanent, edit the systemd service file:



sudo nano /etc/systemd/system/rc-car-ai.service

Change the ExecStart line to include the speed flag:



ExecStart=/home/admin/Desktop/rc-car-ai/env/bin/python3 /home/admin/Desktop/rc-car-ai/main.py --speed 120

Save, then reload:



sudo systemctl daemon-reload

Step 3.7 — Test on Battery Power

Once everything works on wall power, disconnect the USB-C wall charger and power the Pi from the buck converter and battery. Flip the power switch and wait 30 seconds. SSH into the Pi and check:



sudo systemctl status rc-car-ai

If the Pi crashes or reboots under load, the buck converter cannot supply enough current. You will need a buck converter rated for at least 5V at 5A output, or a USB-C power bank that supports 5V/4A.



If your RC car feels a little slow, you can raise these speed constants.

Find these lines near the top of the `CarController` class (around line 30):

APPROACH_SPEED_FAST = 200
APPROACH_SPEED_SLOW = 120
TURN_SPEED = 160
SEARCH_SPIN_SPEED = 100

Change them to:

APPROACH_SPEED_FAST = 255
APPROACH_SPEED_SLOW = 200
TURN_SPEED = 220
SEARCH_SPIN_SPEED = 180


Also change:

inner_speed = max(inner_speed, 40)

to:

inner_speed = max(inner_speed, 140)

Assembling the Enclosure


Once you're done troubleshooting the raw components, you can finally move on to placing them into the enclosure. The file below takes about 20 hours to print and uses roughly 500g of filament. Once the print is finished, you can begin installing the threaded inserts into the allocated holes in the design.



Look inside the enclosure and you will see different sections for the different hardware parts. To start, grab your soldering iron and press 4 RX-M3x5.7 inserts into the 4 holes allocated for the motor driver; those are the only RX-M3x5.7 inserts you will need. For the rest of the holes, use RX-M2x4 inserts. Once finished, screw your components into the enclosure. The hole in the back of the enclosure is for the power switch, and the two holes at the front are for the camera (upper hole) and the ultrasonic sensor (lower hole). Both the power switch and the ultrasonic sensor will need to be hot-glued into place, as there are no screw holes for them.


Once you're finished inserting all the parts and screwing everything into place, it is time to test it. Plug the battery into the Tamiya plug converter and flick the switch. After about 30 seconds the AI should start and the RC car will begin following you. Feel free to use your hot glue gun to secure any components that are loose!

Have fun with this project!