It's not 2020 if you can't build robots of doom out of scrap consumer electronics… (c) freenect GitHub issue
Last year I wrote an article about building and installing ROS Melodic on the (at that time) new Raspberry Pi 4 with Debian Buster OS. The article received a lot of attention both here on hackster.io and on other platforms. I'm very glad that I helped so many people to successfully install ROS on the Raspberry Pi. In the accompanying video I also briefly demonstrated getting a depth image from the Kinect 360. Later, numerous people contacted me on LinkedIn and asked how I managed to use the Kinect with the Raspberry Pi. I was somewhat surprised at the question, since getting the Kinect ready at that time took me about 3-4 hours and didn't seem extremely complicated. I shared my .bash_history files with everyone who inquired about the issue, and in April I finally found the time to write an article on how to install the Kinect drivers and perform RGB-D SLAM with RTAB-MAP ROS. A week of sleepless nights after starting to write the article, I now understand why so many people asked me this question 🙂
I will start with a brief explanation of which approaches worked and which didn't. Then I'll explain how to install the Kinect drivers for use with ROS Melodic, and finally how to set up your machine for RGB-D SLAM with RTAB-MAP ROS.
What Worked and What Didn't
There are a few drivers available for the Kinect on Raspberry Pi; two of them are supported by ROS.
OpenNI drivers: the openni_camera package for ROS
libfreenect drivers: the freenect_stack package for ROS
If you look at their respective GitHub repositories, you'll find that the OpenNI driver was last updated years ago and has in practice been EOL for a long time. libfreenect, on the other hand, receives timely updates. The same goes for their respective ROS packages: freenect_stack was released for ROS Melodic, while the latest distro openni_camera has listed support for is Fuerte…
It is possible to compile and install the OpenNI driver and the openni_camera package on Raspberry Pi for ROS Melodic, although it didn't work for me. To do that, follow steps 1, 2 and 3 of this guide; on steps 2 and 3, remove the -mfloat-abi=softfp flag from the Platform/Linux/Build/Common/Platform.ARM file (per advice in this GitHub issue). Then clone the openni_camera package to your catkin workspace and compile it with catkin_make. It didn't work for me though; the error was "creating depth generator failed. Reason: USB interface is not supported!"
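For reference, the flag removal can be done with a one-line sed. This is only a sketch: the repository branch and the RedistMaker.Arm build script are my assumptions about the OpenNI source layout, so double-check them against the guide:
# Hypothetical sketch: strip the soft-float flag before building OpenNI
git clone https://github.com/OpenNI/OpenNI.git -b unstable
cd OpenNI
sed -i 's/-mfloat-abi=softfp//g' Platform/Linux/Build/Common/Platform.ARM
cd Platform/Linux/CreateRedist && ./RedistMaker.Arm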
Using libfreenect and freenect_stack yielded success in the end, but there were quite a few problems to solve, and the solution was a bit hacky, albeit very stable once working (1+ hours of continuous operation).
Installing Freenect Drivers and Freenect_stack
I'll assume that you use my ROS Melodic Desktop image from this article. If you want to do the installation in a different environment, for example the ros_comm image or Ubuntu for Raspberry Pi, make sure that you have enough knowledge about ROS to solve the problems that might arise from that difference.
Let's start by building the libfreenect driver from source, since the pre-built version in the apt-get repository is too outdated.
sudo apt-get update
sudo apt-get install libusb-1.0-0-dev
git clone https://github.com/OpenKinect/libfreenect.git
cd libfreenect
mkdir build && cd build
cmake -L ..
make
sudo make install
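Before moving on to ROS, you can optionally sanity-check the driver on its own. libfreenect ships a small test viewer, freenect-glview, which is only built if the OpenGL/GLUT development packages were present during the cmake step, so treat this as an optional extra rather than a required part of the process:
# Optional: test the raw driver with libfreenect's bundled viewer
# (built only if OpenGL/GLUT dev packages were available at cmake time)
freenect-glview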
Hopefully the build process will be uneventful and full of friendly green messages. After you have installed the libfreenect driver, the next thing to do is to install the freenect_stack package for ROS. It depends on quite a few other packages, which we'll have to clone and build together with catkin_make. Before you start, make sure your catkin workspace is properly set up and sourced!
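If you don't have a workspace yet, a minimal setup looks like this (I'm assuming ~/catkin_ws as the workspace path; adjust if yours differs):
# Create and initialize an empty catkin workspace, then source it
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws
catkin_make
source devel/setup.bash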
From your catkin workspace src folder:
git clone https://github.com/ros-drivers/freenect_stack.git
git clone https://github.com/ros-perception/image_common.git
git clone https://github.com/ros-drivers/rgbd_launch.git
git clone https://github.com/ros-perception/vision_opencv.git
git clone https://github.com/ros-perception/image_pipeline.git
git clone https://github.com/ros/geometry2.git
Whooh, that was a lot of cloning.
cd ..
To check that the dependencies for all packages are in place, execute this command:
rosdep install --from-paths src --ignore-src
If you successfully cloned all the necessary packages, it will offer to download libfreenect with apt-get. Answer no, since we already installed it from source.
sudo apt-get install libbullet-dev libharfbuzz-dev libgtk2.0-dev libgtk-3-dev
catkin_make -j2
Tea time 🙂 or whatever your favorite drink is.
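If catkin_make gets killed partway through, the Pi has most likely run out of memory. A common workaround, assuming you are on Raspbian with its default dphys-swapfile swap manager, is to temporarily enlarge the swap file:
# Temporarily grow the swap file to survive heavy C++ compilation
sudo sed -i 's/^CONF_SWAPSIZE=.*/CONF_SWAPSIZE=1024/' /etc/dphys-swapfile
sudo dphys-swapfile swapoff
sudo dphys-swapfile setup
sudo dphys-swapfile swapon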
After the compilation process has finished, you can try launching the Kinect stack and check whether it outputs the depth and color images properly. I use my Raspberry Pi headless, so I need to run RViz on my desktop computer.
On the Raspberry Pi, do the following (change the IP address to the IP address of your Raspberry Pi!):
export ROS_MASTER_URI=http://192.168.0.108:11311
export ROS_IP=192.168.0.108
roslaunch freenect_launch freenect.launch depth_registration:=true
You will see output as in Screenshot 1. "Stopping device RGB and Depth stream flush." indicates that the Kinect is ready, but nothing is subscribed to its topics yet.
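You can also verify from the command line that the streams are actually being published. The topic names below are what I'd expect from the freenect_launch defaults with depth_registration enabled; check rostopic list if yours differ:
# List the camera topics the Kinect stack advertises
rostopic list | grep camera
# Measure the publish rate of the registered depth stream
rostopic hz /camera/depth_registered/image_raw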
On your desktop computer with ROS Melodic installed, do:
export ROS_MASTER_URI=http://192.168.0.108:11311
export ROS_IP=[your-desktop-computer-ip]
rviz
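Retyping these exports in every new terminal gets old quickly. Optionally, you can append them to ~/.bashrc on each machine (adjust the IP addresses to your own setup):
# Optional: persist the ROS networking variables across shell sessions
echo "export ROS_MASTER_URI=http://192.168.0.108:11311" >> ~/.bashrc
echo "export ROS_IP=192.168.0.108" >> ~/.bashrc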
Now you should be able to see the RGB and depth image streams in RViz as in Screenshot 2 above… but not at the same time.
Okay, here is where the hacky stuff starts. I spent 3 days trying different drivers and approaches, and nothing worked: as soon as I tried accessing two streams simultaneously, the Kinect would start timing out, as you can see in Screenshot 3. I tried everything: a better power supply, older commits of libfreenect and freenect_stack, stopping usb_autosuspend, injecting bleach into the USB ports (okay, not the last one! Don't do it, it's a joke and should not constitute technical advice 🙂). Then in one of the GitHub issues I saw an account of a person who said their Kinect was unstable until they "loaded the USB bus" by connecting a WiFi dongle. I tried that and it worked. On the one hand, I'm glad that it worked. On the other hand, somebody really ought to fix that. Well, having (sort of) fixed it in the meantime, let's move on to the next step.
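As a quick diagnostic while you experiment, you can check whether the Kinect is enumerating properly on the USB bus; a healthy Kinect 360 shows up as three separate devices (motor, audio and camera):
# A healthy Kinect 360 enumerates as three devices: NUI Motor, NUI Audio, NUI Camera
lsusb | grep -i xbox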
Installing Standalone RTAB-MAP
First, we have a bunch of dependencies to install:
Although there is a pre-built armhf package available for PCL, we'll need to compile it from source because of this issue. Consult the PCL GitHub repository to see how to compile it from source (a rough sketch follows after the dependency commands below).
sudo apt-get install libvtk6-dev libvtk6-qt-dev libvtk6-java libvtk6-jni
sudo apt-get install libopencv-dev cmake libopenni2-dev libsqlite3-dev
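For orientation, the PCL source build usually boils down to the standard CMake sequence below. This is only a sketch: the version to check out and any extra flags the linked issue calls for are things you should take from the PCL repository itself:
# Rough sketch of a PCL source build; consult the PCL repo for the exact version/flags
git clone https://github.com/PointCloudLibrary/pcl.git
cd pcl
mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Release ..
make -j2
sudo make install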
Now let's clone the RTAB-MAP standalone package's Git repository to our home folder and build it. I used the latest release (0.18.0).
git clone https://github.com/introlab/rtabmap.git
cd rtabmap/build
cmake ..
make -j2
sudo make install
sudo ldconfig rtabmap
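To confirm the installation went through, you can check that the library is registered with the linker and ask the binary for its version (to my knowledge recent rtabmap releases accept a --version flag; if yours does not, the ldconfig check alone is enough):
# Check that the shared library is visible to the linker
ldconfig -p | grep rtabmap
# Print the installed version and build options (flag availability may vary by release)
rtabmap --version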
Now that we have compiled standalone RTAB-MAP, we can move to the last step: compiling and installing the ROS wrapper for RTAB-MAP, rtabmap_ros.
Installing rtabmap_ros
If you got this far, you probably know the drill by now 🙂 Clone the rtabmap_ros repository to your catkin workspace src folder. (Execute the next command from your catkin workspace src folder!)
git clone https://github.com/introlab/rtabmap_ros.git
We'll also need these ROS packages that rtabmap_ros depends on:
git clone https://github.com/ros-perception/perception_pcl.git
git clone https://github.com/ros-perception/pcl_msgs.git
git clone https://github.com/ros-planning/navigation.git
git clone https://github.com/OctoMap/octomap_msgs.git
git clone https://github.com/introlab/find-object.git
Before you start the compilation, you can make sure you are not missing any dependencies with the following command:
rosdep install --from-paths src --ignore-src
Install one more dependency from apt-get (rosdep will not flag it, but without it you will get an error during compilation):
sudo apt-get install libsdl-image1.2-dev
Then move to your catkin workspace folder and start compiling:
cd ..
catkin_make -j2
I hope you didn't put your favorite compilation drink anywhere too far away. After the compilation is done, we're ready to do the mapping!
Show Time
Do that hacky trick of adding something like a WiFi or Bluetooth dongle to a USB port; I was using two USB 2.0 ports, one for the Kinect and the other for a WiFi dongle.
On the Raspberry Pi (change the IP address to the IP address of your Raspberry Pi!), in the 1st terminal:
export ROS_MASTER_URI=http://192.168.0.108:11311
export ROS_IP=192.168.0.108
roslaunch freenect_launch freenect.launch depth_registration:=true data_skip:=2
2nd terminal:
roslaunch rtabmap_ros rgbd_mapping.launch rtabmap_args:="--delete_db_on_start --Vis/MaxFeatures 500 --Mem/ImagePreDecimation 2 --Mem/ImagePostDecimation 2 --Kp/DetectorStrategy 6 --OdomF2M/MaxSize 1000 --Odom/ImageDecimation 2" rtabmapviz:=false
As before, you will see output as in Screenshot 1, and "Stopping device RGB and Depth stream flush." indicates that the Kinect is ready but nothing is subscribed to its topics yet. In the second terminal you should see messages about odometry quality.
If you move the Kinect too fast, the odometry quality will drop to 0 and you'll need to move back to a previously mapped location or start from a clean database.
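The map itself lives in a database file; with rtabmap_ros the default location is ~/.ros/rtabmap.db. The --delete_db_on_start argument above already wipes it on every launch, but you can also remove it by hand to guarantee a clean start:
# Remove the previous session's map database (rtabmap_ros default path)
rm ~/.ros/rtabmap.db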
On your desktop computer with ROS Melodic and the rtabmap_ros package installed (I recommend you use an Ubuntu computer for that, since pre-built packages are available for the amd64 architecture), do:
export ROS_MASTER_URI=http://192.168.0.108:11311
export ROS_IP=[your-desktop-computer-ip]
rviz
Add MapGraph and MapCloud displays to RViz and choose the corresponding topics coming from rtabmap.
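Once you are happy with a mapping session, you can also inspect the result offline with the database viewer that the standalone RTAB-MAP build installs (copy the database off the Pi first if that is where you mapped):
# Open the finished map in RTAB-Map's offline database viewer
rtabmap-databaseViewer ~/.ros/rtabmap.db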
Source: RGB-D SLAM With Kinect on Raspberry Pi 4 ROS Melodic