

Autonomous Indoor Navigation Robot Using LIDAR-Based SLAM

4th Year · Robotics · Hard

Problem statement

Autonomous navigation in unknown indoor spaces is challenging because GPS is unavailable and the environment can change dynamically. Traditional robots rely on pre-mapped environments, which limits their flexibility. A robot capable of mapping and navigating in real time is essential for industrial automation and service robotics.

Abstract

This project implements an indoor autonomous navigation robot using LIDAR-based Simultaneous Localization and Mapping (SLAM). A Raspberry Pi or Jetson Nano collects LIDAR scans, builds a live map using SLAM algorithms, and calculates the robot's position. A path planning module identifies optimal routes while obstacle avoidance ensures safe travel. The robot can be deployed for industrial inspection, warehouse automation, and campus navigation.
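As a rough illustration of the mapping-and-localization loop, the sketch below shows a minimal ROS 1 node in Python that reads the live occupancy grid and the robot's estimated pose. It assumes a SLAM package such as hector_mapping or slam_gmapping is already running and publishing the standard /map topic and the map-to-base_link transform; these topic and frame names are common ROS defaults chosen for illustration, not values fixed by this project.

    #!/usr/bin/env python
    # Minimal sketch: read the live SLAM map and the robot's estimated pose.
    # Assumes a SLAM node (e.g. hector_mapping or slam_gmapping) already
    # publishes /map and the map -> base_link transform (common defaults).
    import rospy
    import tf
    from nav_msgs.msg import OccupancyGrid

    def on_map(grid):
        # Occupancy values: 0 = free, 100 = occupied, -1 = unknown
        occupied = sum(1 for cell in grid.data if cell > 50)
        rospy.loginfo("Map %dx%d at %.2f m/cell, %d occupied cells",
                      grid.info.width, grid.info.height,
                      grid.info.resolution, occupied)

    if __name__ == "__main__":
        rospy.init_node("slam_monitor")
        rospy.Subscriber("/map", OccupancyGrid, on_map)
        listener = tf.TransformListener()
        rate = rospy.Rate(1)
        while not rospy.is_shutdown():
            try:
                # Robot pose in the map frame, as estimated by SLAM
                (trans, _rot) = listener.lookupTransform("map", "base_link", rospy.Time(0))
                rospy.loginfo("Pose: x=%.2f m, y=%.2f m", trans[0], trans[1])
            except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
                pass  # transform not yet available
            rate.sleep()

A monitor of this kind is mainly useful for checking that the SLAM output is stable before the path planner is enabled.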

Components required

  • LIDAR Module (RPLidar A1/A2)
  • Raspberry Pi / Jetson Nano
  • Motor Driver (L298N)
  • DC Motors / Encoders
  • Battery Pack
  • ROS (Robot Operating System)
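For the motor driver and DC motors listed above, the following is a minimal sketch of how one channel of the L298N could be driven from Raspberry Pi GPIO using PWM. The BCM pin numbers (18, 23, 24) and the 1 kHz PWM frequency are illustrative placeholders, not wiring details taken from the project.

    # Sketch: driving one L298N channel (one DC motor) from Raspberry Pi GPIO.
    # Pin numbers (BCM) and PWM frequency are illustrative placeholders.
    import time
    import RPi.GPIO as GPIO

    ENA, IN1, IN2 = 18, 23, 24        # enable (PWM) and direction pins

    GPIO.setmode(GPIO.BCM)
    GPIO.setup([ENA, IN1, IN2], GPIO.OUT)
    pwm = GPIO.PWM(ENA, 1000)         # 1 kHz PWM on the enable pin
    pwm.start(0)

    def set_speed(duty, forward=True):
        """Set motor speed (0-100 % duty cycle) and direction."""
        GPIO.output(IN1, GPIO.HIGH if forward else GPIO.LOW)
        GPIO.output(IN2, GPIO.LOW if forward else GPIO.HIGH)
        pwm.ChangeDutyCycle(duty)

    try:
        set_speed(60)                 # forward at 60 % duty for two seconds
        time.sleep(2)
        set_speed(0)                  # stop
    finally:
        pwm.stop()
        GPIO.cleanup()

On a Jetson Nano, the Jetson.GPIO library provides a similar interface, though PWM is limited to specific pins.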

Block diagram

LIDAR Sensor → SLAM Algorithm → Map Generation → Path Planning → Motor Control Unit → Autonomous Movement

Working

The LIDAR continuously scans the area and sends distance measurements to the SLAM module. SLAM builds a map and simultaneously localizes the robot within it. The path planner computes the shortest obstacle-free path to the destination. The motor control unit adjusts wheel speeds to follow the generated path while avoiding dynamic obstacles.
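As a concrete sketch of this last step, the ROS 1 node below (Python) subscribes to the LIDAR scan on /scan and publishes velocity commands on /cmd_vel, turning in place whenever an obstacle enters a forward arc and otherwise driving ahead. The topic names, the 0.5 m threshold, the ±30° arc, and the assumption that 0 rad points straight ahead are illustrative defaults; in a full system, path following would typically be delegated to a planner such as move_base, with logic like this acting only as a reactive safety layer.

    #!/usr/bin/env python
    # Sketch: reactive obstacle check between the path planner and the motors.
    # Topic names, thresholds and speeds are illustrative defaults.
    import math
    import rospy
    from sensor_msgs.msg import LaserScan
    from geometry_msgs.msg import Twist

    STOP_DIST = 0.5      # metres; turn in place if anything is closer than this

    class ReactiveAvoider(object):
        def __init__(self):
            self.cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
            rospy.Subscriber("/scan", LaserScan, self.on_scan)

        def on_scan(self, scan):
            # Collect valid ranges in a +/-30 degree arc, assuming 0 rad = forward
            front = []
            for i, r in enumerate(scan.ranges):
                angle = scan.angle_min + i * scan.angle_increment
                if abs(angle) <= math.radians(30) and scan.range_min < r < scan.range_max:
                    front.append(r)
            cmd = Twist()
            if front and min(front) < STOP_DIST:
                cmd.angular.z = 0.5          # rotate until the way ahead is clear
            else:
                cmd.linear.x = 0.2           # follow the planned path forward
            self.cmd_pub.publish(cmd)

    if __name__ == "__main__":
        rospy.init_node("reactive_avoider")
        ReactiveAvoider()
        rospy.spin()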

Applications

  • Warehouse robots
  • Hospital delivery robots
  • Campus indoor navigation
  • Search and rescue operations
  • Industrial inspection