Pavan Velagaleti

Vision-Based Gesture-Controlled Robotic Finger

Vision-based human-in-the-loop control: MediaPipe gesture → reference mapping → encoder PD → tendon-driven actuation

2025 Tier 2

What I built

A single-DOF robotic finger that tracks a human finger gesture in real time. A monocular webcam feeds MediaPipe, which estimates finger flexion; the host filters the signal (low-pass + deadband), maps it to a motor reference angle, and streams commands over serial to an Arduino. The Arduino closes the loop with rotary encoder feedback and a PD controller, driving a DC motor through an H-bridge that actuates a tendon/spool mechanism; an elastic return provides extension.

Problem

Translate noisy vision-based finger motion into smooth, stable actuator commands for a tendon-driven finger, while maintaining real-time behavior, safe operating bounds, and reliable tracking from motor-side encoder feedback.

Approach

  • Used a webcam + MediaPipe to track 21 hand landmarks and estimate finger bend using inter-segment angles (dot-product geometry).
  • Applied signal conditioning (low-pass smoothing + deadband) to suppress jitter and prevent actuator chatter.
  • Mapped filtered finger angle to a motor reference angle with scaling + saturation to respect mechanical/electrical limits.
  • Implemented a lightweight Arduino firmware loop: receive the reference over UART (115200 baud), read the encoder, compute the position error, and set motor direction + PWM through an H-bridge.
  • Added embedded safety constraints: PWM saturation, reference bounds, and a serial timeout fail-safe.
  • Derived a SISO electromechanical plant model for the DC motor + transmission and designed a PD controller for stable tracking.
  • Validated control behavior in MATLAB/Simulink before hardware testing (closed-loop step response + Bode analysis).
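The bend-estimation and mapping steps above can be sketched as follows. This is a minimal illustration of the dot-product geometry and the scale + saturate mapping; the landmark triple, gain, and reference limits are placeholder assumptions, not the project's tuned values:

```python
import numpy as np

def bend_angle(mcp, pip, tip):
    """Flexion angle at a finger joint from three landmark positions,
    via the angle between adjacent segments (dot-product geometry).
    0 rad = finger straight, larger = more bent."""
    v1 = np.asarray(pip, dtype=float) - np.asarray(mcp, dtype=float)  # proximal segment
    v2 = np.asarray(tip, dtype=float) - np.asarray(pip, dtype=float)  # distal segment
    cos_t = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.arccos(np.clip(cos_t, -1.0, 1.0)))  # clip guards rounding error

def to_motor_ref(bend_rad, gain_deg_per_rad=120.0, ref_min=0.0, ref_max=180.0):
    """Scale the filtered bend angle to a motor reference (degrees)
    and saturate it to respect mechanical/electrical limits."""
    ref = gain_deg_per_rad * bend_rad
    return min(max(ref, ref_min), ref_max)
```

In the real pipeline the three points would come from MediaPipe's 21-landmark hand output for the tracked finger; here they are plain coordinate tuples so the geometry is easy to verify.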

Engineering decisions

DC motor + encoder over hobby servo
A servo was evaluated early but lacked torque for tendon friction/elastic load; the final design uses a DC motor with rotary encoder feedback for controllable torque and closed-loop position tracking.
Filter + deadband to stabilize vision control
Vision-based angle estimates are noisy; smoothing and a deadband prevent high-frequency jitter from becoming motor chatter.
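A minimal sketch of that conditioning stage: an exponential moving average for low-pass smoothing, and a deadband that refuses to move the output for sub-threshold changes. `alpha` and the deadband width here are illustrative, not the tuned values:

```python
class GestureFilter:
    """Low-pass (exponential moving average) + deadband, to keep
    vision jitter from becoming motor chatter."""
    def __init__(self, alpha=0.2, deadband=2.0):
        self.alpha = alpha        # smoothing factor, 0 < alpha <= 1
        self.deadband = deadband  # minimum change (same units as input) to pass through
        self.state = None         # last accepted output

    def update(self, raw):
        if self.state is None:          # first sample initializes the filter
            self.state = raw
            return self.state
        smoothed = self.alpha * raw + (1 - self.alpha) * self.state
        if abs(smoothed - self.state) > self.deadband:
            self.state = smoothed       # only commit changes above the deadband
        return self.state
```

Placing the deadband after the smoother means a single noisy frame cannot kick the motor, while a sustained gesture still walks the output to the new value.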
Partition compute: vision on host, control on MCU
Vision processing stays on the PC; Arduino handles deterministic low-level control for reliability under serial timing variability.
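For illustration, one iteration of that MCU-side loop might look like the following, sketched in Python rather than the actual Arduino C++ firmware; the gains, the 8-bit PWM range, and the encoder-count units are assumptions:

```python
def pd_step(ref_counts, enc_counts, prev_error, dt,
            kp=2.0, kd=0.05, pwm_max=255):
    """One control-loop iteration: position error from encoder counts
    -> PD effort -> H-bridge direction flag + saturated PWM duty."""
    error = ref_counts - enc_counts
    d_error = (error - prev_error) / dt          # discrete derivative
    effort = kp * error + kd * d_error           # PD control law
    direction = 1 if effort >= 0 else -1         # sign drives the H-bridge inputs
    pwm = min(int(abs(effort)), pwm_max)         # saturate the duty cycle
    return direction, pwm, error                 # error is carried to the next step
```

The returned `error` becomes `prev_error` on the next iteration; on the real board the same structure sits inside `loop()`, with a serial-timeout check that zeroes the PWM if the host stops sending references.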

Ownership

  • System integration: vision → reference generation → serial protocol → embedded control → electromechanical actuation
  • Encoder processing and motor position estimation from counts-per-rev conversion
  • PD controller implementation + stability-focused tuning for tendon compliance
  • Hardware iteration: tendon routing, spool/encoder alignment, mechanical guards + return mechanism

Results

  • Achieved stable, damped closed-loop tracking of gesture-derived reference commands with noise suppression and safety constraints.
  • Demonstrated a complete end-to-end pipeline: perception → reference → embedded control → mechanical motion.
  • Produced a full engineering report with modeling + Simulink validation prior to hardware execution.

Stack

Mechatronics · Control · Embedded · Computer Vision · Arduino · DC Motor · Encoder · MATLAB/Simulink