EKF-SLAM on Turtlebot3

Overview

The EKF-SLAM was implemented on a Turtlebot3 using ROS2 and C++. The implementation uses wheel odometry and LiDAR data for the robot's state estimation, along with a differential drive kinematics library for low-level control of the robot. The entire SLAM control pipeline was simulated using Rviz and can be tested on the physical Turtlebot3.
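As a rough illustration of what the kinematics library computes, here is a minimal sketch of the standard differential drive forward kinematics. This is not the project's actual library; the function name is hypothetical, and the wheel radius and track width defaults are nominal Turtlebot3 Burger values used only as placeholders.

```cpp
#include <cmath>

// Body twist: forward velocity v (m/s) and angular velocity w (rad/s).
struct Twist { double v; double w; };

// Sketch of differential-drive forward kinematics (not the actual library).
// wl, wr: left/right wheel angular speeds (rad/s).
// r: wheel radius, d: track width -- placeholder values in meters.
Twist forwardKinematics(double wl, double wr,
                        double r = 0.033, double d = 0.160) {
    // Linear velocity is the average of the two wheel rim speeds;
    // angular velocity comes from their difference across the track.
    return { r * (wr + wl) / 2.0, r * (wr - wl) / d };
}
```

Driving both wheels at the same speed yields a pure translation, while opposite speeds yield a pure rotation in place.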

A video showing the final results of the implementation can be seen below. The blue robot represents the odometry estimate, the green robot the SLAM estimate, and the red robot the simulated true position of the robot. The cylinders represent obstacles on the map; likewise, green marks where SLAM estimates each obstacle to be, while red marks its simulated true position.

SLAM Pipeline

The pipeline consists of two main processes: state estimation and obstacle perception. For state estimation, wheel odometry is used to predict the robot's state. This prediction is then updated and refined with every obstacle the robot perceives through the LiDAR data. Fusing the wheel odometry with the obstacle measurements yields a SLAM estimate of the robot's state and location on the map that is considerably more accurate than odometry alone.
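The prediction half of this process can be sketched as a standard EKF predict step. The snippet below propagates only the robot pose (x, y, theta) for brevity, whereas the full EKF-SLAM state also stacks the landmark positions; the names, the straight-line motion approximation, and the process-noise value are illustrative assumptions, not the project's actual code.

```cpp
#include <array>
#include <cmath>

using Mat3 = std::array<std::array<double, 3>, 3>;

// Pose belief: mean (x, y, theta) and 3x3 covariance P.
struct Belief {
    double x = 0, y = 0, theta = 0;
    Mat3 P{};
};

// Plain 3x3 matrix product (no linear-algebra library assumed).
Mat3 mul(const Mat3& A, const Mat3& B) {
    Mat3 C{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            for (int k = 0; k < 3; ++k)
                C[i][j] += A[i][k] * B[k][j];
    return C;
}

// EKF predict step driven by an odometry increment:
// dx = forward displacement, dtheta = heading change (illustrative model).
void predict(Belief& b, double dx, double dtheta, double q = 0.01) {
    // Jacobian A = df/dstate of the motion model at the current mean.
    Mat3 A = {{{1, 0, -dx * std::sin(b.theta)},
               {0, 1,  dx * std::cos(b.theta)},
               {0, 0,  1}}};
    // Propagate the mean through the nonlinear motion model.
    b.x += dx * std::cos(b.theta);
    b.y += dx * std::sin(b.theta);
    b.theta += dtheta;
    // Propagate the covariance: P <- A P A^T + Q, with Q = q * I here.
    Mat3 At{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) At[i][j] = A[j][i];
    b.P = mul(mul(A, b.P), At);
    for (int i = 0; i < 3; ++i) b.P[i][i] += q;
}
```

The LiDAR-based correction step would then shrink this covariance each time a known obstacle is re-observed.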

The obstacle perception for the SLAM updates uses real and simulated LiDAR data of the robot's surroundings. The first step in using this data was to cluster the LiDAR points belonging to the cylindrical obstacles. Then, I implemented a circle fitting algorithm to fit a circle of a certain radius to each cluster. A circle classification algorithm was then used to filter out the clusters that did not resemble circles. Lastly, I used data association to prevent already detected obstacles from being added to the map as new ones. These steps allow the robot to accurately detect the cylindrical obstacles in the environment and the SLAM algorithm to produce better state estimates and maps.
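The first of these steps, clustering, can be sketched with a simple distance-threshold rule over consecutive scan points: neighboring points closer than a threshold are assumed to belong to the same obstacle. The function name and threshold value below are illustrative assumptions, not the project's actual code.

```cpp
#include <cmath>
#include <vector>

struct Point { double x, y; };

// Sketch of LiDAR clustering (assumed approach, not the project's code):
// walk the scan in order and start a new cluster whenever the Euclidean
// distance to the previous point exceeds the threshold (placeholder 0.1 m).
std::vector<std::vector<Point>> clusterScan(const std::vector<Point>& scan,
                                            double threshold = 0.1) {
    std::vector<std::vector<Point>> clusters;
    for (const Point& p : scan) {
        if (!clusters.empty()) {
            const Point& last = clusters.back().back();
            if (std::hypot(p.x - last.x, p.y - last.y) < threshold) {
                clusters.back().push_back(p);  // close enough: same obstacle
                continue;
            }
        }
        clusters.push_back({p});               // distance jump: new cluster
    }
    return clusters;
}
```

A real 360-degree scan would also need the first and last clusters merged when they wrap around the same obstacle; each surviving cluster then goes on to the circle fitting and classification stages.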