What is SLAM?
SLAM is a solution to the chicken-and-egg problem of needing a map to find your location and needing to know your location to build a map.
Simultaneous Localization and Mapping describes the robotics problem of a robot that can see its surroundings but does not know where it is, and is tasked with both finding itself (localization) and remembering information about the areas it has previously occupied (mapping).
The General Solution:
For a robot, the solution to this problem takes the form of a SLAM algorithm that uses data collected by the robot and attempts to match this new data to data that has previously been collected. The previously collected data acts as the stationary map, and the physical displacement of the new data from the old map is the change in the robot’s position.
Most SLAM algorithms also allow the use of odometry data. Odometry is information collected from your robot’s wheels (i.e. how much they rotate) that helps it estimate how far it has traveled. The algorithm often uses the estimate provided by odometry as a starting guess for the robot’s new position, which the LIDAR data then refines to be more accurate.
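As a toy illustration of the matching idea (this is not BreezySLAM’s actual algorithm, and the corridor distances below are invented), here is a 1-D sketch of refining an odometry guess by aligning a new scan against old data:

```python
# Toy 1-D scan matching: the robot moves along a corridor and measures
# the distance to the wall at several points. We search for the shift
# (in scan indices) that best aligns the new scan with the old one,
# starting from the odometry guess.

def match_scan(old_scan, new_scan, odometry_guess, search=3):
    """Return the shift near odometry_guess that minimizes the mean
    squared difference over the overlapping region of the two scans."""
    best_shift, best_err = odometry_guess, float("inf")
    for shift in range(odometry_guess - search, odometry_guess + search + 1):
        err, n = 0.0, 0
        for i, d in enumerate(new_scan):
            j = i + shift
            if 0 <= j < len(old_scan):
                err += (old_scan[j] - d) ** 2
                n += 1
        if n and err / n < best_err:
            best_shift, best_err = shift, err / n
    return best_shift

# Old "map": distances to a wall with a doorway (the 400 mm dip).
old = [200, 200, 200, 200, 200, 400, 400, 200, 200, 200, 200, 200]
# New scan: the same wall seen after the robot moved 2 cells forward.
new = old[2:]
# Odometry says "about 1 cell"; matching refines that to 2.
print(match_scan(old, new, odometry_guess=1))  # → 2
```

Real SLAM does this in 2-D with rotation as well, but the principle is the same: odometry supplies the starting guess, and the scan data pins down the answer.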
The Specific Solution:
There are many SLAM algorithms out there, and in our search, we came across openslam.org and BreezySLAM, both of which are great resources for anybody interested in learning about the subject. We decided on BreezySLAM because of how easy it is to dive into and start using immediately.
How Do I Get Started?
[Photo: The RoboPeak LIDAR unit we're using to do SLAM. Check out our detailed review!]
In order for BreezySLAM to work, you need a LIDAR sensor. We’re using a RoboPeak LIDAR unit, a low-cost sensor based on technology in the Neato Robotics platform. Hokuyo sensors are also very popular in this field. To interface with an Arduino, your sensor needs to output serial data, ideally in point-by-point packets, and your Arduino needs a way to send this information to your computer (we use XBee radios as a wireless serial connection).
If you plan on using odometry data in conjunction with the LIDAR data, you will need at least one* shaft encoder on your robot. Your robot will also need to transmit this data to the base station. We send encoder data alongside LIDAR scan data to ensure they are in sync.
*Using one encoder on each side of your robot (two total) is highly recommended in nearly all mobile robotic systems because they generally have two degrees of freedom, meaning that two non-redundant sensors can fully define the robot’s position. If you are using a hardware platform with omni-directional drive capabilities, you will need three encoders, and you will also need to modify the BreezySLAM library as it does not support this hardware configuration.
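The standard differential-drive math for turning those two encoder readings into motion can be sketched as follows. The robot constants here (ticks per revolution, wheel diameter, axle track) are made-up placeholders; substitute your own hardware’s numbers:

```python
import math

# Made-up example constants -- replace with your robot's measurements.
TICKS_PER_REV = 1000
WHEEL_DIAMETER_MM = 60.0
AXLE_TRACK_MM = 150.0          # distance between the two wheels
MM_PER_TICK = math.pi * WHEEL_DIAMETER_MM / TICKS_PER_REV

def odometry(d_left_ticks, d_right_ticks):
    """Convert encoder tick deltas to (forward motion mm, turn deg).

    Forward motion is the average of the two wheels; the turn comes
    from their difference divided by the axle track.
    """
    left_mm = d_left_ticks * MM_PER_TICK
    right_mm = d_right_ticks * MM_PER_TICK
    dxy_mm = (left_mm + right_mm) / 2.0
    dtheta_deg = math.degrees((right_mm - left_mm) / AXLE_TRACK_MM)
    return dxy_mm, dtheta_deg

# Both wheels forward equally: straight-line motion, zero turn.
print(odometry(500, 500))
# Wheels in opposite directions: turn in place, zero forward motion.
print(odometry(-250, 250))
```

This is exactly why two non-redundant encoders suffice for a differential-drive robot: one equation recovers the forward motion and the other recovers the turn.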
BreezySLAM works in either Python or C++, with similar processing times thanks to the C extensions in the Python version. We chose Python because we wanted to do everything from a base station computer, and Arduinos don’t quite have enough processing power to run the software. The BreezySLAM package can be downloaded from their website (or from our GitHub page, under libraries), and Python installation is easy following the instructions on their website (or running sudo python setup.py install from their “python” folder). Note that BreezySLAM is under constant development, so check their website for the newest version! We’ll be documenting the changes as we come across them in our GitHub.
The package’s documentation does a good job explaining the function of various parts of the library, although it does not help a whole lot with integrating the library into your code. They have sample code on their website home page and also in a file called log2pgm.py in their examples folder.
There is a function called update within the library that does all the heavy lifting. You call it each time you have a new LIDAR scan, passing the scan and any odometry you have to the function. The scan data takes the format of a vector of distances (angle data is stored implicitly as the index of each distance in the vector, an inconvenience of this library). The odometry data is passed as a vector of timestamp, left wheel value, and right wheel value. For specifics, keep reading.
Yes, But What Do I Actually DO???
We spent some time figuring out how to use BreezySLAM, and we eventually adapted the functioning log2pgm.py file into our code. To make it easier for you, we’d recommend checking out our project’s code to see how we use it in Python. Here are some graphics we made to represent what our code does with respect to running BreezySLAM:
update(scan_mm, velocities) takes distance and odometry data in a specific format:
- scan_mm is a Python list of integer values [mm] with length scan_size defined in the initialization of a laser object of class Laser()
- velocities is a Python tuple containing forward change in position [mm], angular change in position [deg], and change in time [s] since the last call to update()
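Here is a minimal sketch of packaging data into the two formats just listed. The scan points and odometry numbers are fabricated, and the BreezySLAM calls themselves appear only in comments so the sketch runs stand-alone; check the library’s own examples for the exact constructor parameters of your version:

```python
# Packaging data for BreezySLAM's update(). In real use you would do
# something like (see BreezySLAM's examples for constructor details):
#
#   from breezyslam.algorithms import RMHC_SLAM
#   from breezyslam.sensors import Laser
#   slam.update(scan_mm, velocities)

SCAN_SIZE = 360   # must match the scan_size given to your Laser() object

def package_scan(raw_points):
    """Build scan_mm: a list of SCAN_SIZE integer distances [mm],
    indexed by angle. Angles with no reading are left at 0."""
    scan_mm = [0] * SCAN_SIZE
    for angle_deg, dist_mm in raw_points:
        scan_mm[int(angle_deg) % SCAN_SIZE] = int(dist_mm)
    return scan_mm

def package_velocities(dxy_mm, dtheta_deg, dt_s):
    """Build the odometry tuple update() expects:
    (forward change [mm], angular change [deg], elapsed time [s])."""
    return (dxy_mm, dtheta_deg, dt_s)

# Fabricated example readings: (angle in degrees, distance in mm).
scan_mm = package_scan([(0, 1200), (90, 850.4), (359, 2310)])
velocities = package_velocities(42.0, 1.5, 0.1)
print(len(scan_mm), scan_mm[90], velocities)
```

Note how the angle information survives only as the list index, which is the inconvenience mentioned above: if your LIDAR delivers (angle, distance) pairs, you must bin them into the fixed-size list yourself.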
getpos() returns a tuple of absolute robot position* (x, y, theta), measured relative to the bottom-left corner of the map and the robot’s initial straight-ahead direction. Please note that the x and y axes are inverted relative to a standard right-handed coordinate system; viewed upside-down, this coordinate system’s theta is defined as it is in polar coordinates. However, because a heading is usually defined to be positive in the clockwise direction (as it is by LIDAR units), the opposite of polar theta, we prefer to use x, y, and heading instead of BreezySLAM’s x, y, and polar theta.
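One way to implement our preferred x/y/heading convention is a simple sign flip plus normalization. This is our reading of the conventions described above, so double-check it against your own unit’s output:

```python
def theta_to_heading(theta_deg):
    """Convert BreezySLAM's counter-clockwise-positive polar theta to
    a clockwise-positive heading, normalized to [0, 360) degrees.
    The sign flip encodes the convention change described above."""
    return (-theta_deg) % 360.0

print(theta_to_heading(90.0))    # → 270.0
print(theta_to_heading(-45.0))   # → 45.0
```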
*We are uncertain whether getpos() is designed to return the center of the robot’s position (in which case it is wrong) or the LIDAR unit’s position (in which case offset_mm is inverted). As soon as we hear back from the creators of BreezySLAM, we will post new code with an update noted in changes.txt in our GitHub.