Indoor SLAM with a Laser Scanner
Note: This tutorial applies to the Kerloud Flying Rover equipped with a Raspberry Pi 4 or 3B+.
This tutorial illustrates the procedure to realize indoor SLAM with a laser scanner on the Kerloud Flying Rover.
📌 The tutorial code is for research purposes only and cannot be deployed directly in a product.
The required hardware setup for this tutorial is:
Kerloud Flying Rover Scorpion Laser
Indoor Laser Pi Localization Module (equipped with a Slamtec RPLIDAR A1, a Raspberry Pi 4 or 3B+, and a Benewake TF-Luna distance sensor)
The well-tested environments are ROS Noetic on Ubuntu 20.04 and ROS Kinetic on Ubuntu 16.04, and all necessary components are set up properly at the factory.
The indoor SLAM process is realized by integrating a 2D laser SLAM algorithm with the EKF state estimation module in the autopilot. The adopted 2D laser SLAM algorithm is Google Cartographer, a benchmark algorithm in the robotics community. The Cartographer package outputs a precise real-time location for the vehicle in the horizontal 2D plane, while the TF-Luna distance sensor provides the height measurement. The estimation process is depicted in the diagram below:
Remote visualization enables a user-friendly experience when accessing the onboard computer from a remote PC. We first have to connect the onboard Raspberry Pi to a local wifi network once, so that the wifi connection is set up automatically afterwards. The vehicle IP address can be found by logging into the administration page of the router. Taking a TP-Link router as an example, users can visit 192.168.0.1 or http://tplogin.cn/ to view all computers connected to the local network. The computer name for the vehicle is ubuntu by default. The IP address is then displayed as shown below:
After obtaining the IP address, we can set a host entry on both the remote PC and the vehicle with:
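As a sketch, the host entry can be appended to /etc/hosts on each machine. The IP address below is illustrative; substitute the address shown on your router's administration page:

```shell
# Append a host entry mapping the vehicle's hostname to its IP address.
# Run this on both the remote PC and the vehicle (IP is an example).
echo "192.168.0.105   ubuntu" | sudo tee -a /etc/hosts
```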
We then set the ROS environment variables on the remote PC, while those for the vehicle are already set in production:
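A typical configuration on the remote PC points the ROS master at the vehicle and advertises the PC's own address. The IP addresses below are assumptions; use the ones from your network:

```shell
# Add to ~/.bashrc on the remote PC (example IPs -- adjust to your network).
# The ROS master runs on the vehicle; this PC only connects to it.
export ROS_MASTER_URI=http://192.168.0.105:11311   # vehicle IP
export ROS_HOSTNAME=192.168.0.100                  # remote PC IP
```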
📌 You might also use a mobile phone as a hotspot alternatively to setup the required network;
You have to comment ROS environment settings in ~/.bashrc for the remote PC if you don't need to connect to the vehicle.
To validate the network connection, run the following commands on the remote PC:
If the network is set up correctly, the /rosout and /rosout_agg topics can be viewed in the second terminal above; otherwise an error message appears as below:
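A minimal check, assuming the ROS master is already running on the vehicle, is to list the topics and then subscribe to the aggregated log stream:

```shell
# Terminal 1: confirm the vehicle's ROS master is reachable.
rostopic list

# Terminal 2: subscribe to the aggregated log topic;
# log messages should stream in if the connection works.
rostopic echo /rosout_agg
```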
The ROS workspace for the indoor laser SLAM is located at ~/src/catkinws_laserslam and contains several packages:
rplidar_ros: the driver package for the RPLIDAR A1, providing the laser scan data.
robot_laserslam: the laser SLAM package consisting of configurations and launch files for Cartographer.
pose_converter: the package that fuses the SLAM output with the distance sensor measurement and feeds the 3D position to mavros.
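Conceptually, the fusion in pose_converter is simple: the planar (x, y) pose comes from Cartographer and the height comes from the TF-Luna rangefinder. The sketch below illustrates only this idea with a hypothetical helper; the actual package publishes a geometry_msgs/PoseStamped on /mavros/vision_pose/pose via rospy:

```python
# Minimal sketch of the pose-fusion idea (hypothetical helper, not the
# actual pose_converter implementation): combine the 2D SLAM pose with
# the rangefinder height into one 3D position.

def fuse_pose(slam_xy, slam_yaw, range_z):
    """Return a 3D pose dict from a 2D SLAM pose and a height reading.

    slam_xy  -- (x, y) position from the 2D laser SLAM, in metres
    slam_yaw -- heading from the SLAM output, in radians
    range_z  -- distance-sensor height above the floor, in metres
    """
    x, y = slam_xy
    return {"x": x, "y": y, "z": range_z, "yaw": slam_yaw}

# Example: Cartographer reports (1.2, -0.5) with heading 0.3 rad,
# and the TF-Luna reports 0.8 m above the floor.
pose = fuse_pose((1.2, -0.5), 0.3, 0.8)
```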
To build the workspace, simply run:
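A typical build sequence looks like the following. This is a sketch assuming a standard catkin workspace; note that upstream Cartographer packages often require catkin_make_isolated or catkin build instead of plain catkin_make, so use whichever tool the workspace was configured with:

```shell
cd ~/src/catkinws_laserslam
# Build all packages in the workspace, then source the result so
# ROS can locate the newly built packages.
catkin_make
source devel/setup.bash
```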
To enable remote visualization, the workspace must be copied to the remote PC as well and built with the same process. For the illustrations later on, we assume the workspace is located in the same directory on both machines.
To facilitate deployment of the indoor SLAM packages, we provide several laser scan datasets under ~/src/catkinws_laserslam/dataset/2D-laser-datasets. Users can launch the laser SLAM node on these datasets to become familiar with the software tools.
The commands to perform the simulation are shown as follows:
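As a sketch, the dataset playback can be launched as below. The launch file name and its bag_filename argument are illustrative assumptions; check the robot_laserslam package for the actual names:

```shell
# Source the workspace, then replay a recorded dataset through the
# SLAM pipeline (launch and argument names are illustrative).
source ~/src/catkinws_laserslam/devel/setup.bash
roslaunch robot_laserslam demo_2d.launch \
    bag_filename:=$HOME/src/catkinws_laserslam/dataset/2D-laser-datasets/<bag file>
```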
If you have followed the above procedures correctly, an rviz window will pop up and show the SLAM output as below:
To illustrate, the blue line is the trajectory of the laser frame, and the relationship between the laser frame and the odom frame reveals the motion of the laser scanner. The tf tree of these frames can be viewed with the standard ROS tf tools.
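Two standard ways to inspect the tf tree, both shipped with ROS, are:

```shell
# Live graphical view of the tf tree:
rosrun rqt_tf_tree rqt_tf_tree

# Or generate a PDF snapshot (frames.pdf) of the current frames:
rosrun tf view_frames
```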
The tf tree of frames for this simulation is given below for reference:
To start the laser SLAM in rover mode, it's advised to follow the procedures below:
Place the flying rover in an indoor environment with surrounding walls and good lighting conditions.
Power the vehicle and ensure that the vehicle mode switch is set to rover mode.
Set up a local wifi network with a router, and make sure that the onboard Raspberry Pi connects to it automatically.
Connect the vehicle to the QGroundControl station; it's recommended to calibrate the gyroscope and magnetometer carefully the first time.
Start the laser slam process remotely with commands below:
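For example, from the remote PC (the hostname follows the default computer name mentioned above; the username and the script's location in the workspace are assumptions, so adjust them to your setup):

```shell
# Log into the onboard Raspberry Pi over the local network.
ssh ubuntu@ubuntu

# On the Pi: start all nodes for the indoor laser SLAM.
cd ~/src/catkinws_laserslam
./run_2d.sh
```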
The run_2d.sh script will bring up all nodes in sequence for the indoor laser slam.
Optionally, start the remote visualization process on the local PC:
Note that remote visualization imposes an extra computational burden on the Raspberry Pi, so it's recommended to turn it off during experiments.
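A typical way to bring up the visualization remotely, assuming the workspace was copied and built on the remote PC, is to launch rviz from the robot_laserslam package. The launch file name here is an illustrative assumption:

```shell
# On the remote PC: source the workspace and start rviz
# (launch file name is illustrative).
source ~/src/catkinws_laserslam/devel/setup.bash
roslaunch robot_laserslam rviz.launch
```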
Start a new terminal on the remote Raspberry Pi, and validate the local position data with:
The topic /mavros/vision_pose/pose carries the fused position data from the pose_converter package, and /mavros/local_position/pose is the local position estimate from the autopilot. If both topics look correct, the SLAM output is confirmed.
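The two topics can be inspected directly with rostopic:

```shell
# Fused position fed to the autopilot by pose_converter:
rostopic echo /mavros/vision_pose/pose

# Local position estimate reported back by the autopilot:
rostopic echo /mavros/local_position/pose
```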
Users can then operate the rover in manual mode, or build their own applications on top of our SDK.
The demonstration for laser slam in the rover mode can be viewed below:
Navigation in multicopter mode follows the same fashion as rover mode, although sophisticated tuning of the low-level EKF is crucial for fusing the laser SLAM output with the IMU sensor data. The same commands as above apply here.
The demonstration for laser slam in the multicopter mode can be viewed in the video below, in which the vehicle performs its flight in semi-autonomous mode.
Map Comparison of Lidar-based 2D SLAM Algorithms Using Precise Ground Truth, https://ieeexplore.ieee.org/document/8581131
Google Cartographer ROS documentation, https://google-cartographer-ros.readthedocs.io/en/latest/