About the Project
This project implements a real-time LiDAR-based obstacle avoidance system inside a Gazebo simulation environment. The robot uses a 2D/3D LiDAR sensor to perceive its surroundings and autonomously navigates through a cluttered environment without collisions.
The core algorithm processes incoming point-cloud data from the LiDAR sensor and extracts distance information in multiple angular sectors. A reactive control layer translates the processed sensor readings directly into velocity commands, allowing the robot to steer away from obstacles with smooth, continuous motion.
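The sector-based processing and reactive steering described above can be sketched as follows. This is a minimal illustration, not the project's actual implementation: the function names, the three-sector split, and the threshold and gain values (`safe_dist`, `v_max`, `w_gain`) are all assumptions chosen for clarity.

```python
# Illustrative sketch of sector-based reactive obstacle avoidance.
# All names and tuning values here are hypothetical.

def sector_minima(ranges, num_sectors=3):
    """Split a scan (list of distances, ordered right-to-left) into
    equal angular sectors and return the minimum distance in each."""
    size = len(ranges) // num_sectors
    return [min(ranges[i * size:(i + 1) * size]) for i in range(num_sectors)]

def reactive_cmd(ranges, safe_dist=0.8, v_max=1.5, w_gain=1.2):
    """Map sector minima to a (linear, angular) velocity command.
    Forward speed scales down as the front sector fills up; the robot
    turns toward whichever side currently has more clearance."""
    right, front, left = sector_minima(ranges, 3)
    # Slow down proportionally when the front sector is obstructed.
    linear = v_max * min(1.0, front / safe_dist)
    # Steer away only when some sector is closer than the safe distance.
    angular = 0.0
    if min(left, front, right) < safe_dist:
        angular = w_gain * (1.0 if left > right else -1.0)
    return linear, angular
```

In a ROS 2 node, `reactive_cmd` would be called from the `LaserScan` subscription callback and its result published as a `geometry_msgs/Twist` on `/cmd_vel`, closing the sensor-to-actuator loop the text describes.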
The system was built on ROS 2 Humble, leveraging its DDS-based communication to achieve low-latency sensor-to-actuator loops. Launch files and parameterized config files make it straightforward to swap in different robot models or tune the avoidance thresholds.
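A parameterized setup like the one described might expose its tuning knobs in a ROS 2 parameter file. The node name, parameter names, and values below are hypothetical, shown only to illustrate the pattern:

```yaml
# Hypothetical parameter file; names and values are illustrative.
obstacle_avoider:
  ros__parameters:
    scan_topic: /scan
    safe_distance: 0.8      # meters; closer readings trigger avoidance
    max_linear_speed: 1.5   # m/s
    angular_gain: 1.2       # rad/s applied when steering away
```

Loading such a file from a launch file lets thresholds be retuned, or a different robot model substituted, without touching the node's code.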
Key results: the robot successfully navigated mazes and dynamic obstacle fields at up to 1.5 m/s without any collisions across extended test runs. The modular architecture also enables easy integration with higher-level path-planning stacks such as Nav2.