This paper presents the implementation of localization algorithms for indoor autonomous mobile robots in known environments. The proposed implementation employs two sensors, an RGB-D camera and a 2D LiDAR, to perceive the environment and build an occupancy-grid map that allows the robot to perform autonomous or remote navigation throughout the environment while localizing itself. The data retrieved from the perception sensors and odometry are used to estimate the position of the robot through the Monte Carlo Localization algorithm. The system is built on the Robot Operating System (ROS) framework, running on an NVIDIA Jetson TX2 and the Turtlebot 2 platform. Experimental results were obtained using a physical implementation of the mobile robot in an indoor environment.
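As a rough illustration of the predict-update-resample cycle that Monte Carlo Localization performs with odometry and range measurements against a known occupancy grid, the following Python sketch may be helpful. It is not the authors' implementation: the map, noise parameters, sensor model, and all function names here are hypothetical placeholders chosen only to show the structure of the particle filter.

```python
# Minimal Monte Carlo Localization (particle filter) sketch.
# NOT the paper's code: map, noise levels, and the crude endpoint
# likelihood model below are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical known occupancy grid: 0 = free, 1 = occupied, 0.05 m/cell.
GRID = np.zeros((200, 200), dtype=np.uint8)
GRID[:, 0] = GRID[:, -1] = GRID[0, :] = GRID[-1, :] = 1   # boundary walls
RESOLUTION = 0.05

N_PARTICLES = 500
# Each particle is a pose hypothesis: [x (m), y (m), theta (rad)].
particles = np.column_stack([
    rng.uniform(0.5, 9.5, N_PARTICLES),
    rng.uniform(0.5, 9.5, N_PARTICLES),
    rng.uniform(-np.pi, np.pi, N_PARTICLES),
])
weights = np.full(N_PARTICLES, 1.0 / N_PARTICLES)


def predict(particles, d_trans, d_rot, trans_noise=0.02, rot_noise=0.01):
    """Motion update: apply the odometry increment plus Gaussian noise."""
    theta = particles[:, 2]
    noisy_trans = d_trans + rng.normal(0, trans_noise, len(particles))
    particles[:, 0] += noisy_trans * np.cos(theta)
    particles[:, 1] += noisy_trans * np.sin(theta)
    particles[:, 2] += d_rot + rng.normal(0, rot_noise, len(particles))
    return particles


def update(particles, weights, ranges, angles, sigma=0.2, max_range=4.0):
    """Measurement update: weight particles with a simple endpoint model.

    Each beam endpoint is projected from the particle pose; particles whose
    endpoints fall on occupied map cells keep their weight, others are
    penalized. Real implementations use richer beam/likelihood-field models.
    """
    for i, (x, y, theta) in enumerate(particles):
        score = 1.0
        for r, a in zip(ranges, angles):
            if r >= max_range:
                continue
            ex = x + r * np.cos(theta + a)
            ey = y + r * np.sin(theta + a)
            cx, cy = int(ex / RESOLUTION), int(ey / RESOLUTION)
            hit = (0 <= cx < GRID.shape[1] and 0 <= cy < GRID.shape[0]
                   and GRID[cy, cx] == 1)
            if not hit:
                score *= np.exp(-0.5 / sigma ** 2)
        weights[i] = score
    weights += 1e-300                      # avoid an all-zero weight vector
    weights /= weights.sum()
    return weights


def resample(particles, weights):
    """Low-variance (systematic) resampling."""
    positions = (np.arange(len(weights)) + rng.random()) / len(weights)
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[idx].copy(), np.full(len(weights), 1.0 / len(weights))


# One filter iteration with made-up odometry and scan readings:
particles = predict(particles, d_trans=0.10, d_rot=0.02)
angles = np.linspace(-np.pi / 2, np.pi / 2, 19)
ranges = np.full_like(angles, 2.0)         # placeholder 2 m range readings
weights = update(particles, weights, ranges, angles)
particles, weights = resample(particles, weights)
print("Pose estimate:", np.average(particles, axis=0, weights=weights))
```

In a ROS-based setup such as the one described in the paper, the odometry increment would come from wheel odometry messages and the range/angle arrays from the 2D LiDAR scan topic, with the known occupancy grid loaded from a map server; the sketch above only mimics those inputs with constants.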
Luis Rodolfo Macias, J. Enrique Aleman-Gallegos, Ulises Orozco-Rosas, Kenia Picos, "Map based localization using an RGB-D camera and a 2D LiDAR for autonomous mobile robot navigation," Proc. SPIE 12673, Optics and Photonics for Information Processing XVII, 126730F (4 October 2023); https://doi.org/10.1117/12.2676585