Autonomous Search and Rescue Robot#

Overview#

A fully autonomous “search and rescue” robot integrating dual sensing (vision + IR reflectance), YOLO object detection, articulated arm with inverse kinematics, and a split system architecture (Raspberry Pi high‑level + ESP32 real‑time control).

Highlights#

  • Project Management + Advanced Problem Solving
  • Rapid Prototyping, Manufacturing + CAD
  • Tolerance stack-up analysis & precision fit validation
  • Design for Manufacturing (DFM) & Design for Assembly (DFA)
  • Center-of-mass balancing & stability optimization
  • Mechanical-electrical co-design for PCB integration
  • Soldering, Electronics + PCB Design
  • Troubleshooting (noise issues, signal integrity, software + hardware)

Full Mission#

Line Following#

Object Tracking#


Details#

Check out the project repository on GitHub!

Quick Start#

  1. Flash ESP32 firmware (PlatformIO project in esp32/).
  2. Connect ESP32 over USB; verify serial at 921600 baud.
  3. On the Pi / dev machine:
python3 -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install -r pi/requirements.txt
python pi/robot_controller.py --serial-port /dev/ttyUSB0 --enable-gui
  4. Confirm cameras enumerate (GUI or log output).
  5. Begin mission (line following, detection, pickup sequence).

Repository Layout#

  • cad/ – Laser-cut DXF plates, 3D-printed STL mounts, arm + camera hardware revisions. May be missing some components :(
  • esp32/ – Low-level real‑time firmware: motors, servos, reflectance sensors, OLED UI, command parser.
  • pi/ – High‑level Python subsystems: camera manager, line following (vision), object detection, arm + motor coordination, GUIs & diagnostics.
  • pi/*.json – Runtime configuration (cameras, serial, object detection, chunk configs).
  • pi/YOLO_MODELS_GUIDE.md – Model selection & notes.
  • pi/WINDOWS_SETUP.md – Windows-specific environment setup.
  • pi/DOCUMENTATION.md – Detailed vision + line following internals (imported into this README summary).

System Architecture#

	Raspberry Pi (Python)                               ESP32 (Firmware)
┌──────────────────────────────┐                  ┌──────────────────────────┐
│ camera_manager.py (threads)  │  serial cmds     │ motor.cpp                │
│ line_following_manager.py    │────────────────► │ linefollower.cpp (PID)   │
│ object_detection_manager.py  │                  │ arm.cpp (IK, servos)     │
│ motor_controller.py          │  serial cmds     │ pi.cpp (command parser)  │
│ arm_controller.py            │◄─────────────────│ input_display.cpp (UI)   │
│ robot_controller.py (or GUI) │  status / ACK    │ custom_servo.cpp         │
└──────────────────────────────┘                  └──────────────────────────┘

Closed-loop layering:

  • High-level (Pi): vision line extraction, object detection, mission state, arm trajectory requests.
  • Low-level (ESP32): deterministic PWM, reflectance PID, servo motion profiles, safety handling.

Firmware (ESP32)#

Location: esp32/

Build & flash:

pio run -t upload

Key classes:

  • MotorController – speed limiting, direction control, min/max enforcement.
  • LineFollower – FreeRTOS task computing line position from 4 analog sensors; PID (Kp,Ki,Kd,Ko) -> differential motor commands.
  • ServoController – Multi-servo abstraction: target angle, speed limiting, mirrored shoulder mapping, optional IK velocity mode, wrist lock.
  • PiComm – Parses ASCII commands starting with PI:; returns ESP:... responses.
  • InputDisplay – Debounced inputs + user feedback states (INIT, READY, RUNNING, ERROR).

Timing model:

  • Main loop (~100 Hz) services serial + updates servos.
  • Line following runs in its own task (consistent dt for PID deltaTime calculation).
  • Servo motion increments based on elapsed millis (no blocking delays).
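
The millis-based servo increment can be sketched in Python as follows (a minimal analogue of the non-blocking firmware logic; the `ServoMotion` class and step computation are illustrative, not the actual custom_servo.cpp implementation):

```python
class ServoMotion:
    """Moves a servo toward its target at a capped speed, without blocking."""

    def __init__(self, angle=90.0, max_speed_dps=120.0):
        self.angle = angle                  # current angle (degrees)
        self.target = angle                 # commanded target angle
        self.max_speed_dps = max_speed_dps  # speed cap (degrees/second)
        self._last_ms = 0

    def set_target(self, angle, now_ms):
        self.target = angle
        self._last_ms = now_ms

    def update(self, now_ms):
        """Call every loop iteration; advances by elapsed time only."""
        dt = (now_ms - self._last_ms) / 1000.0
        self._last_ms = now_ms
        max_step = self.max_speed_dps * dt
        delta = self.target - self.angle
        # Clamp the step so the servo never exceeds its speed cap.
        self.angle += max(-max_step, min(max_step, delta))
        return self.angle
```

Because `update()` only consumes elapsed time, the main loop stays free to service serial commands between calls.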

High-Level Software (Raspberry Pi / PC)#

Location: pi/

Install:

pip install -r pi/requirements.txt

Run main controller:

python pi/robot_controller.py --serial-port /dev/ttyUSB0 --enable-gui

Key modules:

  • camera_manager.py – Thread-per-camera continuous capture, non-blocking latest frame API, optional downsample.
  • line_following_manager.py – Vision line detector: multi-row scan, brightness threshold, brown rejection, adaptive search center.
  • object_detection_manager.py – YOLO (see object_detection_config_*.json), optional crop to ROI for speed.
  • motor_controller.py – Maps vision-derived line error to controller speed, then to raw motor speeds via serial (PI:MC), supports reflectance mode switching.
  • arm_controller.py – High-level turret & IK commands packaging to serial (PI:GP, PI:SP, etc.).
  • robot_gui.py / object_detection_ui.py / line_following_ui.py – Visualization & tuning.

Command Protocol (Pi <-> ESP32)#

ASCII lines terminated by \n. Requests start with PI:. Selected handlers (see pi.h):

  • Motors – PI:MC,left,right – Direct raw motor speeds (-255..255).
  • Motor base speed – PI:LBS,value – Set base speed used by reflectance PID.
  • Motor min speed – PI:LMS,value – Minimum PWM for movement.
  • Line follow toggle – PI:LF,1|0 – Enable/disable line following.
  • PID gains – PI:PID,kp,ki,kd,ko – Set Kp,Ki,Kd,Ko.
  • Target position – PI:TP,x – Set desired line position (sensor space).
  • Sensor thresholds – PI:ST,r1,l1,r2,l2 – Set analog voltage thresholds.
  • Reflectance sample – PI:REF – Request current sensor voltages.
  • Servo positions – PI:SP,base,shoulder,elbow – Direct joint targets (use '-' to skip).
  • Wrist position (unlock) – PI:WP,angle – Set wrist without lock.
  • Wrist lock toggle – PI:WLT,1|0 – Enable/disable wrist lock.
  • Wrist lock angle – PI:WLA,angle – Set locked angle.
  • Claw angle – PI:CP,angle – Open/close claw.
  • Global IK pos – PI:GP,x,y – Cartesian target (inverse kinematics).
  • Global IK vel – PI:GV,vx,vy – Cartesian velocity mode.
  • Servo speeds – PI:SS,base,shoulder,elbow,wrist – Per-joint speed commands.
  • Servo max speeds – PI:SMS,base,shoulder,elbow,wrist – Per-joint speed caps.
  • Base angle query – PI:BASE – Return base joint angle.
  • Mission complete – PI:COMPLETE – Signal mission termination.

Responses typically: ESP:OK:msg, ESP:ERROR:reason, status updates (e.g. ESP:EMERGENCY_STOP, ESP:FINISH).

Example exchange:

> PI:PID,30,0,0.5,2
< ESP:OK:PID Updated
> PI:LF,1
< ESP:OK:Line Following Started
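
On the Pi side, this framing can be sketched with a pair of helpers (function names are illustrative; the actual serial I/O in motor_controller.py / arm_controller.py is omitted):

```python
def build_command(name, *args):
    """Frame a request line, e.g. build_command("PID", 30, 0, 0.5, 2)."""
    parts = [name] + [str(a) for a in args]
    return "PI:" + ",".join(parts) + "\n"

def parse_response(line):
    """Split an ESP:... line into (status, payload)."""
    line = line.strip()
    if not line.startswith("ESP:"):
        raise ValueError(f"unexpected response: {line!r}")
    body = line[len("ESP:"):]
    # Status-only lines like ESP:EMERGENCY_STOP yield an empty payload.
    status, _, payload = body.partition(":")
    return status, payload
```

For example, `parse_response("ESP:OK:PID Updated")` yields `("OK", "PID Updated")`.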

Core Algorithms (Summary)#

Reflectance Line Following (Firmware)#

4 analog sensors -> position estimate -> PID (Kp,Ki,Kd,Ko) -> differential motor PWM (clamped).
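
A minimal Python sketch of this pipeline (the sensor weights and gains are illustrative assumptions, and the firmware's Ko output-scale term is omitted; this is not the linefollower.cpp implementation):

```python
def line_position(readings, weights=(-3, -1, 1, 3)):
    """Weighted centroid of 4 reflectance readings; 0 means centered."""
    total = sum(readings)
    if total == 0:
        return 0.0
    return sum(w * r for w, r in zip(weights, readings)) / total

class LinePID:
    def __init__(self, kp, ki, kd, base_speed, max_pwm=255):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.base, self.max_pwm = base_speed, max_pwm
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        """PID correction -> (left, right) PWM, clamped to max_pwm."""
        self.integral += error * dt
        deriv = (error - self.prev_error) / dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * deriv
        clamp = lambda v: max(-self.max_pwm, min(self.max_pwm, v))
        # Steering correction is applied differentially to both wheels.
        return clamp(self.base + u), clamp(self.base - u)
```

Running this in a dedicated task with a fixed dt (as the firmware does) keeps the derivative term stable.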

Vision Line Detection (Pi)#

Row sampling + brightness threshold + brown rejection + weighted lateral error -> optional camera-mode PID.

Curve Detection#

Angle difference lower vs upper line segments > threshold => phase transition cue.

Object Detection#

YOLO model (config JSON) with optional ROI cropping triggers pickup routine.

Arm IK#

Planar 2‑link solve with mechanical offsets; mirrored shoulder; wrist lock for end-effector orientation.

Mission Flow (Reference)#

  1. System init & servo home.
  2. Enable chosen line following mode.
  3. Track line; detect curve events for state changes.
  4. Object zone: detect & localize target.
  5. Execute pickup (IK + claw) and deposit.
  6. Resume or terminate mission.

Safety#

  • Emergency stop halts motors, stops tasks, reports ESP:EMERGENCY_STOP.
  • Reset button (long hold) returns to READY state.

Competition Rules#

Official competition rules for the robot can be found here. Apologies in advance if this link is no longer functional in the future! Send me a message and I may be able to email you a PDF or something…

Development#

TaskExample
Flash firmwarepio run -t upload
Serial monitorpio device monitor -b 921600
Run controllerpython pi/robot_controller.py --serial-port /dev/ttyUSB0 --enable-gui
Vision line UIpython pi/line_following_ui.py --camera 1
Object detection testpython pi/debug_object_detection.py

Contributors#

Made with love and Big Way by Ryan Cheng, David Oh, Zachary Xie, Bowen Yuan