
Comparative study on Multi-sensor SLAM in Smart Factories

RAMESH, KRITHIGA
2024/2025

Abstract

Autonomous mobile robots are pivotal to Industry 4.0, enabling flexible, intelligent, and connected manufacturing systems. Their effectiveness, however, critically depends on the reliability of Simultaneous Localization and Mapping (SLAM): the ability to estimate a robot's position while constructing a map of its environment. This thesis investigates which SLAM paradigms (LiDAR-based, visual, or multi-sensor) offer the most dependable performance for indoor autonomy in dynamic smart factory settings. A reproducible experimental pipeline was developed on the Clearpath Jackal platform under ROS 2 Humble, encompassing controlled Gazebo simulations and real-world trials at SmartFactoryOWL. Multiple algorithms were benchmarked across varying environmental dynamics, including LiDAR-based methods (GMapping, Hector, Cartographer) and visual or multi-sensor frameworks (ORB-SLAM3, RTAB-Map). Performance was assessed using quantitative and qualitative indicators such as map completeness, localization accuracy, computational efficiency, and resilience to dynamic obstacles. Results revealed that the camera, LiDAR, and fused sensor configurations exhibited distinct strengths. LiDAR-only methods provided high geometric precision in static scenes but degraded under dynamic conditions, while the camera-based approach adapted better to appearance changes. The fused modality achieved the best overall performance, combining robustness with computational efficiency for reliable real-time operation in smart factory environments. To further improve reliability under limited resources, a MobileNet-augmented SLAM approach was introduced: a lightweight semantic pre-filter that removes dynamic objects and retains static features, ensuring real-time performance on a CPU-only Intel i7 platform.
By systematically analyzing the comparative performance of SLAM algorithms and introducing an efficient learning-based enhancement, this thesis provides actionable insights for deploying scalable, real-time, and resilient navigation systems in Industry 4.0 environments.
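The semantic pre-filter described above can be illustrated with a minimal, hypothetical sketch (not the thesis implementation): visual features that fall inside the bounding boxes of detected dynamic objects are discarded before the SLAM front end processes them. In the thesis, boxes would come from a lightweight MobileNet-based detector; here, `detect_dynamic_objects` is a stand-in that returns fixed boxes, and the class list `DYNAMIC_CLASSES` is assumed for illustration.

```python
# Illustrative sketch of a semantic pre-filter for visual SLAM features.
# Assumption: a MobileNet-style detector yields (label, x1, y1, x2, y2) boxes;
# here it is stubbed with a fixed result so the example is self-contained.

DYNAMIC_CLASSES = {"person", "forklift", "cart"}  # assumed dynamic-object labels


def detect_dynamic_objects(frame):
    """Stand-in for a MobileNet detector pass over one camera frame."""
    return [("person", 100, 50, 180, 220)]


def filter_static_keypoints(frame, keypoints):
    """Keep only keypoints that lie outside every dynamic-object box."""
    boxes = [b for b in detect_dynamic_objects(frame) if b[0] in DYNAMIC_CLASSES]
    static = []
    for (x, y) in keypoints:
        inside = any(x1 <= x <= x2 and y1 <= y <= y2
                     for (_, x1, y1, x2, y2) in boxes)
        if not inside:
            static.append((x, y))
    return static


kps = [(50, 60), (120, 100), (300, 200)]
print(filter_static_keypoints(None, kps))  # (120, 100) falls inside the person box
```

Only the surviving static keypoints would then be passed on for tracking and mapping, which is what keeps the pipeline robust to moving people and vehicles while remaining cheap enough for a CPU-only platform.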
Files in this item:
Master_thesis_Krithiga_pdfA.pdf (open access, Adobe PDF, 63.45 MB)
Users may download and share the full-text documents available in UNITESI UNIPV under the terms of the Creative Commons CC BY-NC-ND license.
For more information, or to check file availability, write to: unitesi@unipv.it.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14239/33933