
Autonomous Robot

Autonomous real-time navigation system: an embedded autonomy stack combining computer vision, sensor fusion, and lightweight perception models with on-device control loops and path planning, optimized for low-power hardware.

Robotics · Computer Vision · Machine Learning · Deep Learning · Real-Time

99.5% image classification accuracy · 20 FPS on embedded hardware · Decision-making in <250 ms

ROS · Jetson Nano · Arduino · Stereo Cameras · PID Control · C++ · Python


Introduction

Designed and deployed an autonomous navigation system that perceives, plans, and executes reliably on constrained hardware. The stack integrates computer vision and sensor fusion with lightweight perception models, real-time control loops, and path planning tuned for low-power embedded devices.

The Challenge

The challenge was enabling autonomous navigation without prior maps and with limited resources. As a student, I had to absorb robotics fundamentals while solving real problems: real-time perception, planning, obstacle avoidance, and hardware integration — all on a student budget and constrained compute.

Solution & Approach

I developed an end-to-end autonomous system from scratch, iterating fast and prioritizing reliability:

Computer Vision & ML Pipeline

  • Object detection with lightweight networks on Jetson Nano.
  • Custom models for competition-specific obstacles.
  • Image classification (obstacles, background, line, signals) at 99.5% accuracy.
  • Depth estimation with stereo cameras for 3D perception (a minimal sketch follows this list).
  • 20 FPS processing under limited compute.
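
To make the stereo depth step concrete, here is a minimal sketch using OpenCV's semi-global block matcher. The focal length, baseline, and matcher settings are illustrative assumptions, not the project's calibrated values.

```python
import cv2
import numpy as np

# Hypothetical calibration values -- replace with the rig's actual parameters.
FOCAL_LENGTH_PX = 700.0   # focal length in pixels (assumed)
BASELINE_M = 0.06         # distance between the two cameras in meters (assumed)

# Semi-global block matching is a common speed/quality trade-off on Jetson-class boards.
stereo = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=64,     # must be divisible by 16
    blockSize=7,
)

def depth_map(left_gray, right_gray):
    """Compute a per-pixel depth map (meters) from a rectified grayscale pair."""
    # StereoSGBM returns fixed-point disparities scaled by 16.
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan        # mask invalid matches
    # Pinhole stereo relation: Z = f * B / d
    return FOCAL_LENGTH_PX * BASELINE_M / disparity
```

On embedded hardware, downscaling the input pair before matching is the usual lever for hitting a frame-rate budget like 20 FPS.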

Decision-Making System

  • ML-based decision trees for navigation choices.
  • Reinforcement learning to optimize routes.
  • Rule-based fallbacks for robust operation.
  • Real-time sensor fusion (vision, ultrasound, IMU); see the filter sketch after this list.
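
A fusion layer like this can be as simple as a complementary filter. The sketch below blends a fast-but-drifting gyro heading with a slower absolute heading estimate from vision; the gain and interface are assumptions for illustration, not the project's actual code.

```python
import math

class ComplementaryFilter:
    """Fuse gyro rate (fast, drifts) with a vision heading (slow, absolute)."""

    def __init__(self, alpha=0.98):
        self.alpha = alpha     # trust placed in the gyro integration (assumed gain)
        self.heading = 0.0     # fused heading estimate, radians

    def update(self, gyro_rate, vision_heading, dt):
        # Integrate the gyro, then pull gently toward the absolute measurement.
        predicted = self.heading + gyro_rate * dt
        # Wrap the correction so the filter behaves correctly near +/- pi.
        error = math.atan2(math.sin(vision_heading - predicted),
                           math.cos(vision_heading - predicted))
        fused = predicted + (1.0 - self.alpha) * error
        self.heading = math.atan2(math.sin(fused), math.cos(fused))
        return self.heading
```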

Control & Integration

  • ROS-based architecture connecting perception and control.
  • PID controllers for smooth motion and precise positioning (sketched below).
  • Custom drivers to integrate sensors.
  • Simulation environment for hardware-free testing.
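
For the motion-control layer, a discrete PID like the following is the standard pattern; the gains shown are placeholders, not the tuned competition values.

```python
class PID:
    """Discrete PID controller, e.g. for steering toward a line-following setpoint."""

    def __init__(self, kp, ki, kd, output_limit=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.output_limit = output_limit
        self.integral = 0.0
        self.prev_error = None

    def step(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp the output; undo the integral step at saturation (simple anti-windup).
        if abs(out) > self.output_limit:
            self.integral -= error * dt
            out = max(-self.output_limit, min(self.output_limit, out))
        return out

# Example: steer so the detected line offset (normalized image coordinates) goes to zero.
steering_pid = PID(kp=1.2, ki=0.05, kd=0.15)   # illustrative gains
# command = steering_pid.step(setpoint=0.0, measurement=line_offset, dt=0.05)
```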

Competition Performance

  • Autonomous navigation challenges completed reliably.
  • Line following and signal response driven by the 99.5%-accuracy classifier.
  • Obstacle avoidance and target identification.
  • Continuous operation during competition tests.
  • Key learnings about real-world robotics constraints.

Outcomes

The platform achieved stable navigation, robust obstacle avoidance, and smooth path tracking under hardware constraints, demonstrating autonomy primitives that transfer to other robotics, drone, and edge-AI applications.