
RP LIDAR SLAM Robot

07 Jul, 2014


Meet the Robot


Our robot watching the Belgium-US World Cup Game with the rest of the office last week.

This is our robot, which has a lot of awesome capabilities for under $600, but still lacks a name and personality (let us know if you have any ideas for a good robot name!). In this post, we'll be giving some quick demos of its capabilities and opening up its source code. Before showing off the robot in action, let's go over a few concepts involved in SLAM!

SLAM

SLAM, or Simultaneous Localization and Mapping, is a challenge that comes up often in robotics. It's a chicken-and-egg problem: to find out where you are, you need a map; to make a map, you need to know where you are. In many applications, robots have neither a good map nor a good indicator of their location.

Sensors that give absolute position, e.g. GPS, are often not accurate enough for other sensors - like LIDAR - to build a reliable map, and are often impractical in places like caves or metal-filled office buildings. Sensors that measure only the robot's own motion, e.g. wheel encoders, are subject to drift (error that grows over time); sensors that measure the world, like distance sensors or LIDAR, can narrow down possible locations only so far - different sets of walls and corners can look identical to LIDAR, for example. By combining these two kinds of measurement algorithmically, it becomes possible to make a map and figure out where you are on it at the same time. To learn more about how SLAM and other robotics algorithms work, stay tuned next week!
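As a minimal illustration of the mapping half of the problem (a sketch, not our robot's actual algorithm - all names and parameters here are our own), each LIDAR reading can be projected from the odometry-estimated pose into an occupancy grid, raising the confidence of the cell it hits:

```python
import math
import numpy as np

# Illustrative occupancy-grid update: cell size, grid size, and the 0.1
# confidence increment are assumptions for this sketch.
GRID = 200      # 200 x 200 cells
CELL_MM = 50    # each cell covers 50 mm of the world

def world_to_cell(x_mm, y_mm):
    """Map world coordinates (mm) to grid indices, origin at grid center."""
    return (int(GRID / 2 + x_mm / CELL_MM), int(GRID / 2 + y_mm / CELL_MM))

def update_map(grid, pose, scan):
    """Raise the confidence of cells where the LIDAR saw an obstacle.

    pose: (x_mm, y_mm, heading_rad) estimated from odometry
    scan: list of (angle_rad, distance_mm) LIDAR readings
    """
    x, y, theta = pose
    for angle, dist in scan:
        # Project the reading from robot-relative polar to world coordinates.
        ox = x + dist * math.cos(theta + angle)
        oy = y + dist * math.sin(theta + angle)
        i, j = world_to_cell(ox, oy)
        if 0 <= i < GRID and 0 <= j < GRID:
            grid[i, j] = min(grid[i, j] + 0.1, 1.0)  # darker = more confident

grid = np.zeros((GRID, GRID))
pose = (0.0, 0.0, 0.0)
scan = [(0.0, 1000.0), (math.pi / 2, 500.0)]  # two fake readings
update_map(grid, pose, scan)
```

The localization half then runs the other way: candidate poses are scored by how well their projected scans line up with the confident cells already in the grid.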

Demonstration


The robot begins moving.

The robot (the reddish triangles in the center of the frame) is shown with detected obstacles represented around it. The darker a point, the higher the confidence of the algorithm and sensor that an obstacle is in that location. Because this is a fairly naive algorithm and some obstacles move, some ghosting is noticeable: the two faintly shadowed curves behind the robot, for example, are the legs of someone who walked toward the robot.

The robot is also driven manually and remotely from this same interface. Upper-case WASD gives continuous directional commands: any of these left in the serial communication text box will keep driving the robot in the corresponding direction. Lower-case wasd gives the robot measured commands: for example, w100 moves the robot forward almost exactly 100 mm, and a90 rotates it approximately 90 degrees. Slippage is more severe when turning, so turning control is less accurate; we hope to use feedback from SLAM to improve directional control in the future. The serial text box can also be used to send advanced commands for configuring the on-robot microcontroller and XBee. Full documentation of the commands is in the source readme.
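The WASD commands and the measured forms (w100, a90) above come straight from the protocol; to make the convention concrete, here is a small sketch of a command builder. The helper name and the serial settings in the commented-out lines are our own assumptions, not the project's actual code:

```python
def drive_command(direction, amount=None):
    """Build a command string for the robot's serial text box.

    direction: one of 'w', 'a', 's', 'd'
    amount:    None for a continuous (upper-case) command, or a distance
               in mm / angle in degrees for a measured (lower-case) one.
    """
    if direction not in "wasd":
        raise ValueError("direction must be one of w, a, s, d")
    if amount is None:
        return direction.upper()          # e.g. 'W': keep driving forward
    return f"{direction}{int(amount)}"    # e.g. 'w100': forward ~100 mm

# Sending commands over the XBee serial link might look like this
# (requires pyserial; port and baud rate are assumptions):
# import serial
# with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as link:
#     link.write(drive_command("w", 100).encode())

print(drive_command("w", 100))  # w100
print(drive_command("a", 90))   # a90
print(drive_command("w"))       # W
```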


The robot has finished travelling through the kitchen.

From this interface, a user can also change the zoom of the map or pan around at will. Here, the map has been zoomed out so we can see everywhere the robot has been. While some work is still needed to make the map fully usable by an autonomous path-planning algorithm, it already looks reasonably like a map we can understand! The robot was driven more quickly in the second half of its journey (the left side of the map), so there is more blurring and the positions are less certain (lighter gray). Now let's say the mission is over and we want to keep the map.


A saved image file of the map created by this mission. 

The map can easily be saved and then explored or refined with external image-processing software. This will also serve as a pipeline for data to be used in our future work on path planning and more formal map creation. We can only show so much in static images, so let's jump over to the actual record of this mission through the TechStars kitchen!
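As a sketch of what that export step can look like (our own illustrative code, not the project's actual export routine), a confidence grid can be written as a plain-text PGM image that any image editor can open, with higher confidence mapping to darker pixels as in the on-screen display:

```python
def save_map_pgm(grid, path):
    """Write a confidence grid (values 0.0-1.0) as a plain-text PGM image.

    Higher confidence -> darker pixel, matching the on-screen display.
    """
    rows = len(grid)
    cols = len(grid[0])
    with open(path, "w") as f:
        f.write(f"P2\n{cols} {rows}\n255\n")  # PGM header: magic, size, maxval
        for row in grid:
            # 0.0 confidence -> white (255), 1.0 confidence -> black (0)
            f.write(" ".join(str(int(255 * (1 - v))) for v in row) + "\n")

# A tiny 2x2 example grid with mixed confidences:
grid = [[0.0, 0.5],
        [1.0, 0.25]]
save_map_pgm(grid, "map.pgm")
```

PGM is a deliberately simple choice here because it needs no image library; a real pipeline would more likely use PNG via a package such as Pillow.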

Demonstration in Motion


The video quality is a little low, which we're trying to fix, but we wanted to show the robot and SLAM at the same time. Click on the links in the video to see robot and SLAM individually in higher quality.

Right now, we're working on making the interface a little cleaner and squashing a few bugs before moving on to autonomous navigation. However, since we've got SLAM working and are excited to see what other hobbyists, engineers, and inventors can do with a cheap LIDAR, we're publishing our initial source code and continuing development as an open-source project hosted on GitHub. While we may test some crazier algorithms and features locally first, you can watch us on GitHub to see the source code for our projects as they evolve this summer. We hope you're enjoying the blog posts, and remember to share a good robot name for our little SLAMbot or any other feedback you have!

Comments

Paco Gallardo

April 03, 2015

Thanks for your excellent projects.

I see there is a new source release since April 2nd, but I can't install it because it does not detect "numpy"… How can I solve this?

Thanks in advance,

Paco

Bill

June 21, 2015

Hi Paco! Sorry we didn't get back to you sooner. You'll need to install numpy in order to import it successfully. NumPy is an array mathematics package for Python that underpins the scientific computing package SciPy. For info on how to install SciPy, or just NumPy on its own, see http://www.scipy.org/install.html.
Thanks! Let us know if you have more questions.

Sean Roelofs

November 17, 2015

Hello, my name is Sean, and I am a Junior at Bohemia Manor High School. I plan on using RPLidar to create an autonomous robot (not SLAM) for my Honors STEM Capstone class (analogous to a senior research project).

What control system does this robot use?
What electronics does this robot use?
What kind of battery does it use?
What kind of software is needed to create a program for this controller/sensor setup?
Basically I am asking how to connect the RPLIDAR to a programmable central unit that can then also control several motors…

