SLAM with ROS Using Bittle and Raspberry Pi 4

If you think the second robot in the picture doesn't look stable enough, you'll be surprised.

The main topic of this article is SLAM and mapping with ROS. We'll use Bittle, an agile quadruped robot from Petoi that finished its Kickstarter campaign last month with huge success.

First let’s start with some theory.

What is SLAM?

SLAM stands for Simultaneous Localization and Mapping – it is a set of algorithms that allows a computer to create a 2D or 3D map of a space and determine its own location within it. While SLAM by itself is not navigation, having a map and knowing your position on it is, of course, a prerequisite for navigating from point A to point B.

We can use various sensors to receive data about the environment for mapping:

  • Laser scanners (one-dimensional and 2D sweeping laser rangefinders)
  • Cameras (monocular, stereo and RGB-D)
  • Sonar sensors
  • Tactile sensors
  • Others

In practice, a combination of sensors is often used, with a fusion algorithm – for example the extended Kalman filter – applied afterwards to obtain precise information.
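This kind of fusion doesn't have to be written from scratch – in ROS, the robot_localization package provides ready-made EKF and UKF nodes. A minimal sketch (in a real setup the node needs a parameter file describing which sensor topics to fuse):

# Install the robot_localization package, which implements EKF/UKF sensor fusion
sudo apt-get install ros-melodic-robot-localization
# Start the EKF node – in practice you would load a .yaml parameter file
# listing the odometry/IMU topics to fuse and the coordinate frames to use
rosrun robot_localization ekf_localization_node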

Coming back to basics: for most applications you will be dealing either with LIDAR-based SLAM or Visual SLAM. LIDAR-based SLAM is relatively easy to set up and quite precise – there is a reason Waymo uses LIDARs on its self-driving cars.

But of course, there is a reason Tesla doesn't – LIDARs are bulky, quite expensive and, since they have rotating parts, require maintenance when operated for longer periods of time. For Visual SLAM, RGB-D sensor approaches can also be quite robust, whereas simple stereo or monocular systems can be tricky to set up. Here are some more links to read about SLAM in detail:

What Is Simultaneous Localization and Mapping?

LSD-slam and ORB-slam2, a literature based explanation

RPLIDAR and ROS programming- The Best Way to Build Robot

In this article we'll try a monocular Visual SLAM algorithm called ORB-SLAM2 and the LIDAR-based Hector SLAM.

Visual SLAM with ORB-SLAM2

For ORB-SLAM2 we will use a regular, cheap web camera – it needs to be calibrated to determine the intrinsic parameters unique to each camera. I recommend doing the calibration with the built-in ROS camera calibration tools. To install them (you can do this on your Ubuntu PC), run:

sudo apt-get install ros-melodic-camera-calibration
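The calibration GUI also needs a camera driver publishing images. We use the usb_cam node below; if you don't have it yet, it can be installed the same way:

# Install the usb_cam driver that will publish the web camera images
sudo apt-get install ros-melodic-usb-cam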

Print the calibration checkerboard – you can download it from here.

Measure the side of one square in meters – note that the --square argument below is given in meters, not millimeters. Then enter the following commands to start the calibration:

roslaunch usb_cam usb_cam.launch
rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.108 image:=/camera/image_raw camera:=/camera

Change the --square parameter to match the size of the squares on your calibration board.

In order to get a good calibration you will need to move the checkerboard around in the camera frame such that:

  • Checkerboard on the camera's left, right, top and bottom of field of view
  • X bar – left/right in field of view
  • Y bar – top/bottom in field of view
  • Size bar – toward/away and tilt from the camera
  • Checkerboard filling the whole field of view
  • Checkerboard tilted to the left, right, top and bottom (Skew)

At each step, hold the checkerboard still until the image is highlighted in the calibration window.

When the application has gathered enough data, you will be able to press the Calibrate button. The calibration process might take a few minutes, so be patient. A successful calibration will result in real-world straight edges appearing straight in the corrected image. A failed calibration usually results in blank or unrecognizable images, or images that do not preserve straight edges.

After that you will need to convert the camera parameters to .yaml format with the help of this package, rename the file to head_camera.yaml and place it in the ~/.ros/camera_info/ folder.
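For reference, here is a sketch of what head_camera.yaml looks like for a 640x480 camera – all the numeric values below are placeholders, use the numbers from your own calibration:

# Write a skeleton calibration file (placeholder values – replace with your own)
mkdir -p ~/.ros/camera_info
cat > ~/.ros/camera_info/head_camera.yaml << 'EOF'
image_width: 640
image_height: 480
camera_name: head_camera
camera_matrix:
  rows: 3
  cols: 3
  data: [600.0, 0.0, 320.0, 0.0, 600.0, 240.0, 0.0, 0.0, 1.0]
distortion_model: plumb_bob
distortion_coefficients:
  rows: 1
  cols: 5
  data: [0.1, -0.2, 0.0, 0.0, 0.0]
rectification_matrix:
  rows: 3
  cols: 3
  data: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
projection_matrix:
  rows: 3
  cols: 4
  data: [600.0, 0.0, 320.0, 0.0, 0.0, 600.0, 240.0, 0.0, 0.0, 0.0, 1.0, 0.0]
EOF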

There is a package available that integrates ORB-SLAM2 into ROS and also publishes a 2D occupancy map. The installation process is quite complicated, so I recommend using the Ubuntu 18.04 image for Raspberry Pi as a starting point to avoid the need to compile many (many, many, many) additional packages.

Install ROS Desktop and the necessary dependencies:

sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
sudo apt-key adv --keyserver 'hkp://keyserver.ubuntu.com:80' --recv-key C1CF6E31E6BADE8868B172B4F42ED6FBAB17C654
sudo apt update
sudo apt install ros-melodic-desktop
echo "source /opt/ros/melodic/setup.bash" >> ~/.bashrc
source ~/.bashrc
sudo apt-get install ros-melodic-pcl-ros ros-melodic-image-geometry ros-melodic-octomap-ros ros-melodic-usb-cam

Create a catkin workspace, install the catkin build tools, and clone the ORB_SLAM2_ROS repository and the Bittle driver repository into your catkin_ws/src folder:

mkdir -p catkin_ws/src && cd catkin_ws/src
git clone https://github.com/rayvburn/ORB-SLAM2_ROS
git clone https://github.com/AIWintermuteAI/bittle_ROS
cd bittle_ROS && git checkout slam
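The catkin build tools themselves can be installed from apt; python-catkin-tools is the package name on Melodic-era Ubuntu:

# Install catkin build tools (provides the "catkin build" command)
sudo apt-get install python-catkin-tools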

Download the vocabulary file and place it in the ORB_SLAM2/orb_slam2_lib/Vocabulary folder:

wget https://github.com/raulmur/ORB_SLAM2/raw/master/Vocabulary/ORBvoc.txt.tar.gz
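The file comes as a tarball, so unpack it in place (path assuming the workspace layout above):

# Extract the ORB vocabulary next to the downloaded archive
cd ~/catkin_ws/src/ORB-SLAM2_ROS/ORB_SLAM2/orb_slam2_lib/Vocabulary
tar -xzf ORBvoc.txt.tar.gz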

Then, from the catkin workspace folder, do:

cd src/ORB-SLAM2_ROS/ORB_SLAM2
sudo chmod +x build*
./build_catkin.sh
echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc

If the compilation process freezes, try increasing the swap size to 2 GB:

sudo swapoff -a
sudo fallocate -l 2G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
grep SwapTotal /proc/meminfo
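Later, when you no longer need it, the swap file can be removed again:

# Disable and delete the temporary swap file
sudo swapoff /swapfile
sudo rm /swapfile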

After a successful installation, run an example to make sure everything works as expected:

roslaunch orb_slam2_ros raspicam_mono.launch
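If everything built correctly, ORB-SLAM2 should start advertising its topics. The exact names depend on the launch file, so the grep pattern below is just a guess – check the full list on your system:

# List active topics and filter for ORB-SLAM2 (pattern is an assumption)
rostopic list | grep -i orb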

An additional step is required because you're most likely running the Raspberry Pi (or other SBC) in headless mode, without a screen or keyboard – either that or your robot is really bulky. So we will need to configure ROS to work across multiple machines – have a look at my previous article in the Bittle series, where this process is described in detail.
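In short, both machines need to agree on where the ROS master lives. A minimal sketch, with placeholder IP addresses you must replace with your own:

# On the Raspberry Pi (where roscore runs); 192.168.1.10 is a placeholder IP
export ROS_MASTER_URI=http://192.168.1.10:11311
export ROS_IP=192.168.1.10
# On the desktop PC; 192.168.1.20 is a placeholder IP
export ROS_MASTER_URI=http://192.168.1.10:11311
export ROS_IP=192.168.1.20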

Since the Bittle driver is written in Python 3 and ROS still uses Python 2.7 by default, we'll need to install rospkg for Python 3 to make them play together:

pip3 install rospkg
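You can quickly verify that Python 3 can now import it:

# Should print the confirmation message without an ImportError
python3 -c "import rospkg; print('rospkg for Python 3 is available')"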

Once you have ORB-SLAM2, the packages for Bittle (or your robot base) and the web camera drivers installed, you can run:

roslaunch bittle_driver bittle_vslam_robot.launch

This will bring up the whole system – the robot driver, the web camera node and ORB-SLAM2. ORB-SLAM2 requires enough information about the environment to initialize, so manually move the robot around slowly, avoiding large changes in translation or orientation. After ORB-SLAM2 has initialized, it will start publishing the octomap. You can then use the controls to move your robot around.
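To watch the map being built, you can run RViz on your desktop PC (with the multi-machine setup above). The display types to add depend on the topic names the package publishes on your system:

# On the desktop PC, start the visualization tool
rosrun rviz rviz
# In RViz, add e.g. a Map or PointCloud2 display and select the occupancy map /
# point cloud topics advertised by ORB-SLAM2 (check with: rostopic list)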



