# Autonomous Search and Rescue Robot

## Overview

A fully autonomous "search and rescue" robot integrating dual sensing (vision + IR reflectance), YOLO object detection, an articulated arm with inverse kinematics, and a split system architecture (Raspberry Pi high-level + ESP32 real-time control).
## Highlights
- Project Management + Advanced Problem Solving
- Rapid Prototyping, Manufacturing + CAD
- Tolerance stack-up analysis & precision fit validation
- Design for Manufacturing (DFM) & Design for Assembly (DFA)
- Center-of-mass balancing & stability optimization
- Mechanical-electrical co-design for PCB integration
- Soldering, Electronics + PCB Design
- Troubleshooting (noise issues, signal integrity, software + hardware)
## Full Mission

## Line Following

## Object Tracking

## Details
Check out the project repository on GitHub!
## Quick Start

- Flash the ESP32 firmware (PlatformIO project in `esp32/`).
- Connect the ESP32 over USB; verify serial at 921600 baud.
- On the Pi / dev machine:

```shell
python3 -m venv .venv
source .venv/bin/activate   # Windows: .venv\Scripts\activate
pip install -r pi/requirements.txt
python pi/robot_controller.py --serial-port /dev/ttyUSB0 --enable-gui
```

- Confirm cameras enumerate (GUI or log output).
- Begin the mission (line following, detection, pickup sequence).
## Repository Layout

| Path | Purpose |
|---|---|
| `cad/` | Laser-cut DXF plates, 3D-printed STL mounts, arm + camera hardware revisions. May be missing some components :( |
| `esp32/` | Low-level real-time firmware: motors, servos, reflectance sensors, OLED UI, command parser. |
| `pi/` | High-level Python subsystems: camera manager, line following (vision), object detection, arm + motor coordination, GUIs & diagnostics. |
| `pi/*.json` | Runtime configuration (cameras, serial, object detection, chunk configs). |
| `pi/YOLO_MODELS_GUIDE.md` | Model selection & notes. |
| `pi/WINDOWS_SETUP.md` | Windows-specific environment setup. |
| `pi/DOCUMENTATION.md` | Detailed vision + line-following internals (imported into this README summary). |
## System Architecture

```
   Raspberry Pi (Python)                           ESP32 (Firmware)
┌──────────────────────────────┐                ┌──────────────────────────┐
│ camera_manager.py (threads)  │  serial cmds   │ motor.cpp                │
│ line_following_manager.py    │───────────────►│ linefollower.cpp (PID)   │
│ object_detection_manager.py  │                │ arm.cpp (IK, servos)     │
│ motor_controller.py          │  serial cmds   │ pi.cpp (command parser)  │
│ arm_controller.py            │◄───────────────│ input_display.cpp (UI)   │
│ robot_controller.py (or GUI) │  status / ACK  │ custom_servo.cpp         │
└──────────────────────────────┘                └──────────────────────────┘
```

Closed-loop layering:

- High-level (Pi): vision line extraction, object detection, mission state, arm trajectory requests.
- Low-level (ESP32): deterministic PWM, reflectance PID, servo motion profiles, safety handling.
## Firmware (ESP32)

Location: `esp32/`

Build & flash:

```shell
pio run -t upload
```

Key classes:

- `MotorController` – speed limiting, direction control, min/max enforcement.
- `LineFollower` – FreeRTOS task computing line position from 4 analog sensors; PID (Kp, Ki, Kd, Ko) -> differential motor commands.
- `ServoController` – multi-servo abstraction: target angle, speed limiting, mirrored shoulder mapping, optional IK velocity mode, wrist lock.
- `PiComm` – parses ASCII commands starting with `PI:`; returns `ESP:...` responses.
- `InputDisplay` – debounced inputs + user feedback states (INIT, READY, RUNNING, ERROR).

Timing model:

- Main loop (~100 Hz) services serial + updates servos.
- Line following runs in its own task (consistent dt for the PID deltaTime calculation).
- Servo motion increments based on elapsed millis (no blocking delays).
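The millis-based servo increment can be sketched in Python (the real logic is C++ in `custom_servo.cpp`; the function name and units here are illustrative):

```python
def step_servo(current_deg, target_deg, speed_deg_per_s, elapsed_ms):
    """Advance a servo toward its target by at most speed * elapsed time.
    Called every loop iteration with the millis delta -- never a blocking
    delay -- so serial handling stays responsive. Illustrative sketch only."""
    max_step = speed_deg_per_s * (elapsed_ms / 1000.0)
    delta = target_deg - current_deg
    if abs(delta) <= max_step:
        return target_deg  # close enough: snap to target
    return current_deg + max_step if delta > 0 else current_deg - max_step
```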
## High-Level Software (Raspberry Pi / PC)

Location: `pi/`

Install:

```shell
pip install -r pi/requirements.txt
```

Run the main controller:

```shell
python pi/robot_controller.py --serial-port /dev/ttyUSB0 --enable-gui
```

Key modules:

- `camera_manager.py` – thread-per-camera continuous capture, non-blocking latest-frame API, optional downsample.
- `line_following_manager.py` – vision line detector: multi-row scan, brightness threshold, brown rejection, adaptive search center.
- `object_detection_manager.py` – YOLO (see `object_detection_config_*.json`), optional crop to ROI for speed.
- `motor_controller.py` – maps the vision-derived line error to controller speed, then to raw motor speeds via serial (`PI:MC`); supports reflectance-mode switching.
- `arm_controller.py` – packages high-level turret & IK commands for serial (`PI:GP`, `PI:SP`, etc.).
- `robot_gui.py` / `object_detection_ui.py` / `line_following_ui.py` – visualization & tuning.
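The non-blocking latest-frame pattern can be illustrated with a minimal single-slot buffer (class and method names are illustrative, not the project's API):

```python
import threading

class LatestFrameBuffer:
    """Single-slot frame buffer: a capture thread overwrites the slot,
    readers always get the freshest frame and never block on a growing
    queue. Sketch of the pattern, not the project's implementation."""

    def __init__(self):
        self._lock = threading.Lock()
        self._frame = None
        self._seq = 0  # incremented per publish so readers can detect staleness

    def publish(self, frame):
        with self._lock:
            self._frame = frame
            self._seq += 1

    def latest(self):
        """Non-blocking read: returns (seq, frame); frame is None before
        the first publish."""
        with self._lock:
            return self._seq, self._frame
```

Dropping stale frames (rather than queueing them) keeps vision latency bounded even when detection runs slower than capture.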
## Command Protocol (Pi <-> ESP32)

ASCII lines terminated by `\n`. Requests start with `PI:`. Selected handlers (see `pi.h`):
| Command | Format | Purpose |
|---|---|---|
| Motors | PI:MC,left,right | Direct raw motor speeds (-255..255). |
| Motor base speed | PI:LBS,value | Set base speed used by reflectance PID. |
| Motor min speed | PI:LMS,value | Minimum PWM for movement. |
| Line follow toggle | PI:LF,1\|0 | Enable (1) or disable (0) line following. |
| PID gains | PI:PID,kp,ki,kd,ko | Set Kp,Ki,Kd,Ko. |
| Target position | PI:TP,x | Set desired line position (sensor space). |
| Sensor thresholds | PI:ST,r1,l1,r2,l2 | Set analog voltage thresholds. |
| Reflectance sample | PI:REF | Request current sensor voltages. |
| Servo positions | PI:SP,base,shoulder,elbow | Direct joint targets (use ‘-’ to skip). |
| Wrist position (unlock) | PI:WP,angle | Set wrist without lock. |
| Wrist lock toggle | PI:WLT,1\|0 | Enable (1) or disable (0) wrist lock. |
| Wrist lock angle | PI:WLA,angle | Set locked angle. |
| Claw angle | PI:CP,angle | Open/close claw. |
| Global IK pos | PI:GP,x,y | Cartesian target (inverse kinematics). |
| Global IK vel | PI:GV,vx,vy | Cartesian velocity mode. |
| Servo speeds | PI:SS,base,shoulder,elbow,wrist | Per-joint speed commands. |
| Servo max speeds | PI:SMS,base,shoulder,elbow,wrist | Per-joint speed caps. |
| Base angle query | PI:BASE | Return base joint angle. |
| Mission complete | PI:COMPLETE | Signal mission termination. |
Responses are typically ESP:OK:msg, ESP:ERROR:reason, or status updates (e.g. ESP:EMERGENCY_STOP, ESP:FINISH).
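Since the protocol is plain newline-terminated ASCII, the framing can be sketched without hardware (helper names are illustrative; the real sender lives in the `pi/` modules):

```python
def encode_command(name, *args):
    """Frame a request per the table above: 'PI:' prefix, comma-separated
    arguments, newline terminator. Illustrative helper, not the project's API."""
    parts = [f"PI:{name}"] + [str(a) for a in args]
    return (",".join(parts) + "\n").encode("ascii")

def parse_response(raw):
    """Split an 'ESP:...' line into (kind, detail), e.g. ('OK', 'PID Updated').
    Bare status frames like ESP:FINISH yield an empty detail string."""
    line = raw.decode("ascii").strip()
    if not line.startswith("ESP:"):
        raise ValueError(f"unexpected frame: {line!r}")
    kind, _, detail = line[4:].partition(":")
    return kind, detail
```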
Example exchange:

```
> PI:PID,30,0,0.5,2
< ESP:OK:PID Updated
> PI:LF,1
< ESP:OK:Line Following Started
```

## Core Algorithms (Summary)
### Reflectance Line Following (Firmware)
4 analog sensors -> position estimate -> PID (Kp,Ki,Kd,Ko) -> differential motor PWM (clamped).
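A minimal sketch of this pipeline (sensor weights, gains, and helper names are illustrative, not the firmware's values):

```python
def line_position(voltages, weights=(-3, -1, 1, 3)):
    """Weighted centroid of 4 reflectance readings -> signed line position
    (negative = line left of center, positive = right)."""
    total = sum(voltages)
    return sum(w * v for w, v in zip(weights, voltages)) / total if total else 0.0

class PID:
    """Minimal PID with an overall output scale Ko, mirroring the Kp/Ki/Kd/Ko
    gains set via PI:PID. A sketch, not the firmware's implementation."""
    def __init__(self, kp, ki, kd, ko):
        self.kp, self.ki, self.kd, self.ko = kp, ki, kd, ko
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.ko * (self.kp * error + self.ki * self.integral
                          + self.kd * derivative)

def motor_commands(base_speed, correction):
    """Differential drive: steer by +/- correction, clamped to the PWM range."""
    clamp = lambda v: max(-255, min(255, v))
    return clamp(base_speed - correction), clamp(base_speed + correction)
```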
### Vision Line Detection (Pi)
Row sampling + brightness threshold + brown rejection + weighted lateral error -> optional camera-mode PID.
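The multi-row scan and weighted lateral error can be sketched as follows (illustrative; brown rejection and the adaptive search center are omitted for brevity):

```python
def lateral_error(gray, rows, threshold=200):
    """Scan several image rows for bright (line) pixels and average their
    horizontal offset from the image center, normalized to [-1, 1].
    `gray` is a 2D list of 0-255 brightness values; returns None if no
    row contains the line. Sketch of the approach, not the project's code."""
    width = len(gray[0])
    center = (width - 1) / 2.0
    offsets = []
    for r in rows:
        cols = [c for c, px in enumerate(gray[r]) if px >= threshold]
        if cols:  # centroid of bright pixels in this row
            offsets.append((sum(cols) / len(cols) - center) / center)
    return sum(offsets) / len(offsets) if offsets else None
```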
### Curve Detection
Angle difference lower vs upper line segments > threshold => phase transition cue.
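A sketch of the angle-difference test (the segment representation and threshold value are illustrative):

```python
import math

def curve_detected(lower_seg, upper_seg, threshold_deg=25.0):
    """Compare the headings of the lower and upper fitted line segments in
    image coordinates; a large difference flags an upcoming curve. Each
    segment is ((x0, y0), (x1, y1))."""
    def heading(seg):
        (x0, y0), (x1, y1) = seg
        return math.degrees(math.atan2(y1 - y0, x1 - x0))
    diff = abs(heading(lower_seg) - heading(upper_seg))
    diff = min(diff, 360.0 - diff)  # wrap so the result lies in [0, 180]
    return diff > threshold_deg
```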
### Object Detection
YOLO model (config JSON) with optional ROI cropping triggers pickup routine.
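The ROI bookkeeping looks roughly like this (helper names are illustrative; the actual YOLO inference is configured via the JSON files):

```python
def crop_roi(frame, roi):
    """Crop a 2D frame (list of rows) to roi = (x, y, w, h) so the detector
    processes fewer pixels. Illustrative helper, not the project's API."""
    x, y, w, h = roi
    return [row[x:x + w] for row in frame[y:y + h]]

def to_full_frame(box, roi):
    """Shift a detection box found in ROI coordinates back into full-frame
    pixel coordinates so the pickup routine can aim correctly."""
    bx, by, bw, bh = box
    rx, ry, _, _ = roi
    return (bx + rx, by + ry, bw, bh)
```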
### Arm IK
Planar 2‑link solve with mechanical offsets; mirrored shoulder; wrist lock for end-effector orientation.
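The underlying 2-link solve, stripped of the robot's mechanical offsets and mirrored-shoulder mapping, is the textbook law-of-cosines form:

```python
import math

def ik_2link(x, y, l1, l2):
    """Planar 2-link inverse kinematics (one elbow branch) via the law of
    cosines. Returns (shoulder, elbow) in radians, or None if (x, y) is
    unreachable. A generic sketch, not the firmware's arm.cpp solve."""
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # cos(elbow angle)
    if not -1.0 <= c2 <= 1.0:
        return None  # target outside the annulus the arm can reach
    elbow = math.acos(c2)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```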
## Mission Flow (Reference)
- System init & servo home.
- Enable chosen line following mode.
- Track line; detect curve events for state changes.
- Object zone: detect & localize target.
- Execute pickup (IK + claw) and deposit.
- Resume or terminate mission.
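The flow above can be summarized as a small state machine (states and event names are illustrative; the real handling is split across `robot_controller.py` and the firmware):

```python
from enum import Enum, auto

class MissionState(Enum):
    INIT = auto()
    LINE_FOLLOWING = auto()
    OBJECT_ZONE = auto()
    PICKUP = auto()
    DONE = auto()

# (state, event) -> next state; events correspond to the cues listed above.
TRANSITIONS = {
    (MissionState.INIT, "homed"): MissionState.LINE_FOLLOWING,
    (MissionState.LINE_FOLLOWING, "curve"): MissionState.OBJECT_ZONE,
    (MissionState.OBJECT_ZONE, "target_localized"): MissionState.PICKUP,
    (MissionState.PICKUP, "deposited"): MissionState.LINE_FOLLOWING,
    (MissionState.PICKUP, "mission_complete"): MissionState.DONE,
}

def step(state, event):
    """Advance the mission on an event; unknown events keep the current state."""
    return TRANSITIONS.get((state, event), state)
```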
## Safety

- Emergency stop halts motors, stops tasks, and reports `ESP:EMERGENCY_STOP`.
- Reset button (long hold) returns the robot to the READY state.
## Competition Rules
Official competition rules for the robot can be found here. Apologies in advance if this link is no longer functional in the future! Send me a message and I may be able to email you a PDF or something…
## Development

| Task | Example |
|---|---|
| Flash firmware | `pio run -t upload` |
| Serial monitor | `pio device monitor -b 921600` |
| Run controller | `python pi/robot_controller.py --serial-port /dev/ttyUSB0 --enable-gui` |
| Vision line UI | `python pi/line_following_ui.py --camera 1` |
| Object detection test | `python pi/debug_object_detection.py` |
## Contributors
Made with love and Big Way by Ryan Cheng, David Oh, Zachary Xie, Bowen Yuan