Posted on 2025-08-01, 00:00. Authored by Ragib Rownak.
This research develops an advanced autonomous navigation system that integrates Detectron2's Panoptic FPN model with multi-sensor fusion and simultaneous 3D mapping for the Clearpath Husky A200 unmanned ground vehicle (UGV) operating in complex outdoor environments. By combining panoptic segmentation for comprehensive scene understanding with the ROS2 Navigation Stack (Nav2) and RTAB-Map for real-time 3D mapping, the work builds a robust perception, mapping, and path planning framework that enables simultaneous localization, mapping, and terrain-aware obstacle avoidance beyond what traditional 2D LiDAR-based approaches can achieve. The system fuses data from a SICK LMS-111 2D LiDAR and an Intel RealSense D435i RGB-D camera: both object instances ("things") and "stuff" classes detected by the Panoptic FPN model are classified and converted to laser scan format for seamless integration with the navigation costmap, while detailed 3D environmental maps are built concurrently. The implementation addresses a critical limitation of outdoor robotic navigation, namely that traditional systems fail to detect terrain obstacles such as grass, earth, and vegetation. Experimental validation in both Gazebo Fortress simulation and real-world outdoor environments demonstrates that the proposed panoptic segmentation-based system with concurrent 3D mapping achieves collision-free navigation through complex scenarios. The framework operates at 8-10 FPS on the NVIDIA Jetson AGX Orin platform and shows significant improvements in obstacle detection accuracy (92%), navigation reliability, and environmental understanding compared to traditional LiDAR-only approaches.
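The conversion of panoptic detections into a planar scan can be sketched roughly as follows. This is a minimal illustration, not the author's exact implementation: it assumes Detectron2's standard panoptic output (a segment-ID mask plus a segments_info list, converted to NumPy by the caller) and a RealSense depth image already aligned to the RGB frame. Each image column is collapsed to the nearest obstacle depth and mapped to a bearing angle via the pinhole model. The obstacle class IDs, intrinsics, and the function name panoptic_to_laserscan are illustrative placeholders.

```python
import numpy as np
from sensor_msgs.msg import LaserScan

# Hypothetical set of panoptic "stuff" category IDs treated as obstacles
# (e.g., grass, earth, vegetation); the numeric values are placeholders.
OBSTACLE_CLASS_IDS = {21, 29, 37}


def panoptic_to_laserscan(panoptic_seg, segments_info, depth_m,
                          fx, cx, frame_id="camera_link",
                          range_min=0.3, range_max=10.0):
    """Collapse obstacle pixels from a panoptic mask into a planar LaserScan.

    panoptic_seg : (H, W) int array of segment IDs (Detectron2 Panoptic FPN output)
    segments_info: list of dicts with "id", "category_id", "isthing"
    depth_m      : (H, W) float array of aligned depth, in meters
    fx, cx       : horizontal focal length and principal point of the RGB-D camera
    """
    h, w = depth_m.shape

    # Boolean obstacle mask: every "thing" instance plus selected "stuff" classes.
    obstacle = np.zeros((h, w), dtype=bool)
    for seg in segments_info:
        if seg["isthing"] or seg["category_id"] in OBSTACLE_CLASS_IDS:
            obstacle |= (panoptic_seg == seg["id"])

    # Keep only valid obstacle depths, then take the nearest depth in each column.
    depth = np.where(obstacle & (depth_m > range_min), depth_m, np.inf)
    column_range = depth.min(axis=0)  # (W,) nearest obstacle per image column

    # Pinhole model: pixel column u maps to bearing theta = atan2(cx - u, fx),
    # so the left edge of the image gets a positive angle (ROS convention).
    u = np.arange(w)
    angles = np.arctan2(cx - u, fx)

    scan = LaserScan()
    scan.header.frame_id = frame_id
    scan.angle_min = float(angles.min())
    scan.angle_max = float(angles.max())
    # Approximate the slightly nonlinear angular spacing as uniform.
    scan.angle_increment = float((scan.angle_max - scan.angle_min) / max(w - 1, 1))
    scan.range_min = float(range_min)
    scan.range_max = float(range_max)
    # Reverse so ranges run from angle_min to angle_max, as LaserScan requires.
    ranges = column_range[::-1]
    scan.ranges = [float(r) if np.isfinite(r) else float("inf") for r in ranges]
    return scan
```

A synthetic scan produced this way can be published on its own topic and fed to Nav2's obstacle layer alongside the physical SICK LMS-111 scan, so that segmentation-derived terrain obstacles inflate the costmap in the same way ordinary LiDAR returns do.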