Use D435i to run ORB-SLAM2 in real time (1)

Article directory

    • 1. Install third-party libraries
      • 1.1 Install dependencies
      • 1.2 Install Pangolin
      • 1.3 Install Eigen3
    • 2. Install ORB-SLAM2
    • 3. Install D435i driver
      • 3.1 Install IntelRealSense SDK2.0
      • 3.2 Install realsense-ros
    • 4. ROS runs ORB-SLAM2 in real time
      • 4.1 Camera calibration
      • 4.2 Compile ORB-SLAM2
      • 4.3 Run ORB-SLAM2 in real time


Environment: Ubuntu 20.04 with ROS Noetic

1. Install third-party libraries

1.1 Install dependencies

sudo apt-get update
sudo apt-get install git gcc g++ vim make cmake

1.2 Install Pangolin

Pangolin compressed package

(1) Install dependencies

sudo apt-get install libglew-dev libboost-dev libboost-thread-dev libboost-filesystem-dev
sudo apt-get install ffmpeg libavcodec-dev libavutil-dev libavformat-dev libswscale-dev libpng-dev

(2) Configure and compile
If your extracted folder has a different name, adjust the cd path below accordingly.

cd Pangolin
mkdir build && cd build
cmake -DCPP11_NO_BOOST=1 ..
make
sudo make install

(3) Execute the command to check whether the installation is successful

cd ../examples/HelloPangolin
mkdir build && cd build
cmake ..
make
./HelloPangolin

If the installation was successful, a demo window will pop up.

1.3 Install Eigen3

Eigen3 download address
(1) Clone source code

git clone git@github.com:eigenteam/eigen-git-mirror.git

Then build (note that the cloned folder is named eigen-git-mirror):

cd eigen-git-mirror
mkdir build && cd build
cmake ..
make
sudo make install

(2) Copy the Eigen headers into /usr/include so they are found at the standard system include path

sudo cp -r /usr/local/include/eigen3 /usr/include

2. Install ORB-SLAM2

ORB-SLAM2 download address
(1) Clone source code

git clone https://github.com/raulmur/ORB_SLAM2.git

Then build:

cd ORB_SLAM2
chmod +x build.sh
./build.sh

Note: chmod +x grants execute permission to the script. If compilation fails or your machine freezes, lower the parallelism in build.sh, e.g. change make -j to make -j4 or make -j8, depending on your CPU and RAM.

(2) An error occurs during compilation

error: 'usleep' was not declared in this scope

Add the following header at the top of each file named in the error:

#include <unistd.h>

An error occurred:

error: static assertion failed: std::map must have the same value_type as its allocator

In the include/LoopClosing.h file of the ORB-SLAM2 source directory, change

typedef map<KeyFrame*,g2o::Sim3,std::less<KeyFrame*>,
        Eigen::aligned_allocator<std::pair<const KeyFrame*, g2o::Sim3> > > KeyFrameAndPose;

to:

typedef map<KeyFrame*,g2o::Sim3,std::less<KeyFrame*>,
        Eigen::aligned_allocator<std::pair<KeyFrame *const, g2o::Sim3> > > KeyFrameAndPose;

(3) Download the official TUM dataset
download link
Run the monocular example on the rgbd_dataset_freiburg1_xyz sequence of the TUM dataset.

Place the downloaded rgbd_dataset_freiburg1_xyz folder in the directory of ORB-SLAM2

(4) Run monocular

./Examples/Monocular/mono_tum Vocabulary/ORBvoc.txt Examples/Monocular/TUM1.yaml rgbd_dataset_freiburg1_xyz/

(freiburg1 sequences pair with TUM1.yaml; TUM2.yaml is for the freiburg2 sequences.)

3. Install D435i driver

3.1 Install IntelRealSense SDK2.0

(1) Register the server's public key (if the first command fails, try the second, which uses port 80):

sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-key F6E65AC044F831AC80A06380C8B3A55A6F3EFCDE
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-key F6E65AC044F831AC80A06380C8B3A55A6F3EFCDE

(2) Add the server to the repository list

sudo add-apt-repository "deb https://librealsense.intel.com/Debian/apt-repo $(lsb_release -cs) main" -u

(3) Install the libraries

sudo apt update
sudo apt install librealsense2-dkms
sudo apt install librealsense2-utils
sudo apt install librealsense2-dev
sudo apt install librealsense2-dbg

(4) Check whether the installation is successful

realsense-viewer

3.2 Install realsense-ros

(1) Create a workspace

mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src/

(2) Clone source code realsense-ros

git clone https://github.com/IntelRealSense/realsense-ros.git
cd realsense-ros/
git checkout `git tag | sort -V | grep -P "^2.\d+\.\d+" | tail -1`
cd ..
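The backtick expression above can look opaque: it version-sorts the tag list and keeps the newest 2.x release. Here is what it does, demonstrated on a made-up tag list (the real command runs against the output of `git tag`):

```shell
# Version-sort a sample tag list, keep 2.x tags, take the newest one.
printf '2.3.2\n3.2.3\n2.10.1\n2.9.0\n' | sort -V | grep -P '^2\.\d+\.\d+' | tail -1
# prints 2.10.1 (sort -V orders 2.9.0 before 2.10.1, unlike plain sort)
```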

(3) Install ros package ddynamic_reconfigure

sudo apt install ros-noetic-ddynamic-reconfigure

(4) Compile

catkin_init_workspace
cd ..
catkin_make clean
catkin_make -DCATKIN_ENABLE_TESTING=False -DCMAKE_BUILD_TYPE=Release
catkin_make install

(5) Clone the usb_cam package source (develop branch) into the src directory

cd src
git clone -b develop https://github.com/ros-drivers/usb_cam.git

(6) Recompile the workspace and configure the environment

cd ~/catkin_ws/
catkin_make
source ~/catkin_ws/devel/setup.bash

(7) Test camera

roslaunch usb_cam usb_cam-test.launch


Use rqt_graph to view messages

(8) If no image is displayed, the possible causes are as follows.
a. An error is reported:

RLException: [usb_cam-test.launch] is neither a launch file in package [usb_cam] nor is [usb_cam] a launch file name
The traceback for the exception was written to the log file

Cause: the workspace environment has not been sourced:

source ~/catkin_ws/devel/setup.bash

If you don't want to type it every time, add that line to your ~/.bashrc file:

sudo gedit ~/.bashrc

Save and exit, execute:

source ~/.bashrc
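If you prefer a one-liner to editing the file in gedit, the same append can be done from the shell. This version is idempotent (it only appends if the line is not already there) and assumes the ~/catkin_ws path used throughout this guide:

```shell
# Append the setup line to ~/.bashrc only if it is not already present.
grep -qxF 'source ~/catkin_ws/devel/setup.bash' ~/.bashrc 2>/dev/null || \
    echo 'source ~/catkin_ws/devel/setup.bash' >> ~/.bashrc
```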

b. The camera image is garbled
Solution: the camera's native pixel format is MJPEG, but the launch file sets the pixel format to yuyv (line 6); change it to mjpeg.

However, the terminal will now print a warning:

deprecated pixel format used, make sure you did set range correctly

Then, around line 447 of usb_cam.cpp in the usb_cam source src folder, add the following before the line video_sws_ = sws_getContext…:

{
    AVPixelFormat pixFormat;
    switch (avcodec_context_->pix_fmt) {
    case AV_PIX_FMT_YUVJ420P:
        pixFormat = AV_PIX_FMT_YUV420P;
        break;
    case AV_PIX_FMT_YUVJ422P:
        pixFormat = AV_PIX_FMT_YUV422P;
        break;
    case AV_PIX_FMT_YUVJ444P:
        pixFormat = AV_PIX_FMT_YUV444P;
        break;
    case AV_PIX_FMT_YUVJ440P:
        pixFormat = AV_PIX_FMT_YUV440P;
        break;
    default:
        pixFormat = avcodec_context_->pix_fmt;
        break;
    }
    avcodec_context_->pix_fmt = pixFormat;
}

Recompile the workspace and configure the environment

c. The camera image is black / the recognized camera is not the one we connected
Solution: the computer exposes several video devices. Replace video* in line 3 of usb_cam-test.launch with the device node of the USB camera.
1. Print the computer’s camera

ls /dev/video*

2. This prints a long list, even though my machine has only one built-in camera and one external USB camera:

/dev/video0 /dev/video2 /dev/video4 /dev/video6
/dev/video1 /dev/video3 /dev/video5 /dev/video7

3. To find out which of these devices is a real camera, install the v4l2-ctl tool:

sudo apt install v4l-utils

4. Then query each device in turn:

v4l2-ctl -d /dev/video0 --all

5. A lot of information is printed; check whether the output reports the camera as OK. If so, the device is a real camera; if not, it is a virtual one. To tell the built-in camera from the external one, change the number of video* in line 3 of usb_cam-test.launch and test again (v4l2-ctl --list-devices also groups the device nodes by physical camera, which helps).

4. ROS runs ORB-SLAM2 in real time

4.1 Camera calibration

1. Calibration board
2. Install the Noetic camera calibration tool

sudo apt-get install ros-noetic-camera-calibration

3. Start the camera

roslaunch usb_cam usb_cam-test.launch

4. Open a new terminal and start calibration.
Pay attention to the parameters (modify to match your actual board): --size 9x6 --square 0.016

rosrun camera_calibration cameracalibrator.py --size 9x6 --square 0.016 image:=/usb_cam/image_raw camera:=/usb_cam --k-coefficients=4

Move the chessboard along x (left/right), y (up/down), size (toward/away from the camera), and skew (tilt) until the X, Y, Size, and Skew progress bars all turn green.

When they all turn green, press the CALIBRATE button and wait for a while to complete the calibration.
After completion, click the SAVE button to save the data to the default directory:
The terminal will display the following information:

('Wrote calibration data to', '/tmp/calibrationdata.tar.gz')

The tmp folder is at the filesystem root (in the file manager: Other Locations → Computer). Inside it you will find the calibrationdata.tar.gz archive, which contains the captured images and the calibration data files:

Note: move the calibration archive out of the default /tmp folder promptly; /tmp is cleared on reboot, so the results may be gone if you come back later.
Click the COMMIT button to submit the data and exit the program. The terminal shows that the file head_camera.yaml has been written.
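One way to secure the results right after saving them (~/calibration is an arbitrary example destination, not a path mandated by the tools; the `|| true` keeps the commands safe to re-run even if the archive is absent):

```shell
# Copy the calibration archive out of /tmp (cleared on reboot) and unpack it.
mkdir -p ~/calibration
cp /tmp/calibrationdata.tar.gz ~/calibration/ 2>/dev/null || true
tar -xzf ~/calibration/calibrationdata.tar.gz -C ~/calibration 2>/dev/null || true
```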
5. Find the calibration result file and use its data to modify Asus.yaml in the Examples/ROS/ORB_SLAM2 directory.
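For orientation, ORB-SLAM2 camera YAML files take fx, fy, cx, cy from the calibration's camera_matrix and k1, k2, p1, p2, k3 from its distortion_coefficients. The numbers below are placeholders only; substitute your own calibration values:

```yaml
%YAML:1.0
# Placeholder values -- replace every number with your own calibration results.
Camera.fx: 600.0   # camera_matrix[0][0]
Camera.fy: 600.0   # camera_matrix[1][1]
Camera.cx: 320.0   # camera_matrix[0][2]
Camera.cy: 240.0   # camera_matrix[1][2]
Camera.k1: 0.05    # distortion_coefficients[0]
Camera.k2: -0.10   # distortion_coefficients[1]
Camera.p1: 0.0     # distortion_coefficients[2]
Camera.p2: 0.0     # distortion_coefficients[3]
Camera.k3: 0.0     # distortion_coefficients[4]
```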

4.2 Compile ORB-SLAM2

Copy the ORB_SLAM2 source code folder to the ~/catkin_ws/src folder, and open the terminal in the ORB_SLAM2 folder
(1) Modify the ros example source file
Add the following header to all .cc files under the Examples/ROS/ORB_SLAM2/src path:

#include <unistd.h>

In the monocular files ros_mono.cc and ros_mono_ar.cc, find the subscriber line:

ros::Subscriber sub = nodeHandler.subscribe("/camera/image_raw", 1, &ImageGrabber::GrabImage, &igb);

Replace with:

ros::Subscriber sub = nodeHandler.subscribe("/usb_cam/image_raw", 1, &ImageGrabber::GrabImage, &igb);
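If you prefer not to edit by hand, the same replacement can be scripted with sed. It is shown first on a sample line so you can verify the pattern; the in-place form (commented out) would be run from the Examples/ROS/ORB_SLAM2/src directory:

```shell
# Demonstrate the topic substitution on a sample line.
sample='ros::Subscriber sub = nodeHandler.subscribe("/camera/image_raw", 1, &ImageGrabber::GrabImage, &igb);'
echo "$sample" | sed 's#/camera/image_raw#/usb_cam/image_raw#'
# In-place form (creates .bak backups first):
# sed -i.bak 's#/camera/image_raw#/usb_cam/image_raw#' ros_mono.cc ros_mono_ar.cc
```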

(2) Compile

cd ~/catkin_ws/src/ORB_SLAM2
chmod +x build.sh build_ros.sh
./build.sh
./build_ros.sh

If build_ros.sh complains that the ROS package path is not set, add export ROS_PACKAGE_PATH=${ROS_PACKAGE_PATH}:~/catkin_ws/src/ORB_SLAM2/Examples/ROS to ~/.bashrc and source it again.

4.3 Run ORB-SLAM2 in real time

Start the camera and the SLAM node in two separate terminals (run the rosrun command from the ORB_SLAM2 directory so the relative paths resolve):

roslaunch usb_cam usb_cam-test.launch
rosrun ORB_SLAM2 Mono Vocabulary/ORBvoc.txt Examples/ROS/ORB_SLAM2/Asus.yaml