lio_sam mapping and ndt_matching localization in simulation

Table of contents

  • 1. Simulation environment
  • 2. lio_sam mapping
    • 1. Modify the configuration file
    • 2. Start mapping
  • 3. ndt_matching localization
    • 1. Create new launch files
    • 2. Start
  • Summary

1. Simulation environment

The simulation uses the existing open-source environment from the earlier article on building a ROS open-source mini unmanned vehicle from scratch. The car model and Gazebo world are already configured there, and the IMU frequency has been changed to 200 Hz. That article gives detailed setup instructions, so they are not repeated here.

Start the simulation:

roslaunch steer_mini_gazebo steer_mini_sim_sensors_VLP16_lio_sam.launch

The topics are as follows:

/ackermann_steering_controller/cmd_vel
/ackermann_steering_controller/odom
/clock
/gains/left_rear_joint/parameter_descriptions
/gains/left_rear_joint/parameter_updates
/gains/right_rear_joint/parameter_descriptions
/gains/right_rear_joint/parameter_updates
/gazebo/link_states
/gazebo/model_states
/gazebo/parameter_descriptions
/gazebo/parameter_updates
/gazebo/set_link_state
/gazebo/set_model_state
/imu/data
/joint_states
/rosout
/rosout_agg
/tf
/tf_static
/velodyne_points
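
The IMU was configured for 200 Hz; as a quick check before recording, the sensor rates can be verified with standard ROS tools:

rostopic hz /imu/data
rostopic hz /velodyne_points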

Drive the car around, then open a new terminal and record all topics (or only the ones you need; see the example after the command below). The bag file is created automatically in the terminal's current working directory, named after the current time.

rosbag record -a
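
If you only want the topics needed later (lio_sam consumes the point cloud and IMU, and ndt_matching additionally uses the wheel odometry), a minimal recording could look like the following; the bag name run.bag is just an example:

rosbag record -O run.bag /velodyne_points /imu/data /ackermann_steering_controller/odom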

2. lio_sam mapping

For installing and testing lio_sam, refer to the earlier article on using LeGO-LOAM and LIO-SAM and processing the resulting maps.

1. Modify the configuration file

In the params.yaml file under LIO-SAM/config:

Change the topics to match your own vehicle:

 # Topics
  pointCloudTopic: "/velodyne_points"
  imuTopic: "/imu/data"
  odomTopic: "/odometry/imu"
  gpsTopic: "odometry/gpsz"

Modify the save path:

 # Export settings
  savePCD: true
  savePCDDirectory: "/Downloads/"

Modify the IMU-to-lidar extrinsic transformation:

 # Extrinsics: T_lb (lidar -> imu)
  extrinsicTrans: [0.0, 0.0, 0.0]
  extrinsicRot: [1, 0, 0,
                  0, 1, 0,
                  0, 0, 1]
  extrinsicRPY: [1, 0, 0,
                  0, 1, 0,
                  0, 0, 1]

Leave the other parameters at their defaults.

2. Start mapping

roslaunch lio_sam run.launch
rosbag play your.bag


The saved PCD map: GlobalMap.pcd
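
The points_map_loader launch used later expects the map under ~/.autoware/data/map/pointcloud_map/, so copy the saved file there. This assumes lio_sam wrote it to ~/Downloads/ as set by savePCDDirectory above:

mkdir -p ~/.autoware/data/map/pointcloud_map
cp ~/Downloads/GlobalMap.pcd ~/.autoware/data/map/pointcloud_map/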

3. ndt_matching localization

1. Create new launch files

The localization uses the ndt_matching module from Autoware.ai. To adapt it to the simulation data above, the following modifications are needed.

Under the path autoware.ai/src/autoware/documentation/autoware_quickstart_examples/config:
Create a new headless_setup_steer_mini.yaml with the following contents:

tf_x: 0
tf_y: 0
tf_z: 0.115
tf_yaw: 0
tf_pitch: 0
tf_roll: 0

localizer: velodyne
use_sim_time: false

Under the path ~/.autoware/data/tf:
Create a new tf_steer_mini.launch with the following contents:

<launch>
<!-- tf from world to map -->
<node pkg="tf2_ros" type="static_transform_publisher" name="world_to_map" args="0 0 0 0 0 0 /world /map" />
<!-- tf from map to mobility -->
<!-- <node pkg="tf2_ros" type="static_transform_publisher" name="map_to_mobility" args="0 0 0 0 0 0 /map /mobility" />-->
<node pkg="tf2_ros" type="static_transform_publisher" name="base_link_to_velodyne" args="0 0 0.115 0 0 0 /base_link /velodyne" />
</launch>

Under the path autoware.ai/src/autoware/documentation/autoware_quickstart_examples/launch/rosbag_demo:
Create a new my_map_steer_mini.launch with the following contents:

<launch>
  <rosparam command="load" file="$(find autoware_quickstart_examples)/config/headless_setup_steer_mini.yaml" />

  <include file="$(env HOME)/.autoware/data/tf/tf_steer_mini.launch"/>
  
  <node pkg="map_file" type="points_map_loader" name="points_map_loader" args="noupdate $(env HOME)/.autoware/data/map/pointcloud_map/GlobalMap.pcd"/>


</launch>

Under the path autoware.ai/src/autoware/core_perception/lidar_localizer/launch:
Create a new ndt_matching_steer_mini.launch with the following contents:

<launch>

  <arg name="method_type" default="0" /> <!-- pcl_generic=0, pcl_anh=1, pcl_anh_gpu=2, pcl_openmp=3 -->
  <arg name="use_gnss" default="0" />
  <arg name="use_odom" default="true" />
  <arg name="use_imu" default="true" />
  <arg name="imu_upside_down" default="false" />
  <arg name="imu_topic" default="/imu/data" />
  <arg name="queue_size" default="1" />
  <arg name="offset" default="linear" />
  <arg name="get_height" default="false" />
  <arg name="use_local_transform" default="false" />
  <arg name="sync" default="false" />
  <arg name="output_log_data" default="false" />
  <arg name="gnss_reinit_fitness" default="500.0" />

  <node pkg="lidar_localizer" type="ndt_matching" name="ndt_matching" output="log">
    <param name="method_type" value="$(arg method_type)" />
    <param name="use_gnss" value="$(arg use_gnss)" />
    <param name="use_odom" value="$(arg use_odom)" />
    <param name="use_imu" value="$(arg use_imu)" />
    <param name="imu_upside_down" value="$(arg imu_upside_down)" />
    <param name="imu_topic" value="$(arg imu_topic)" />
    <param name="queue_size" value="$(arg queue_size)" />
    <param name="offset" value="$(arg offset)" />
    <param name="get_height" value="$(arg get_height)" />
    <param name="use_local_transform" value="$(arg use_local_transform)" />
    <param name="output_log_data" value="$(arg output_log_data)" />
    <param name="gnss_reinit_fitness" value="$(arg gnss_reinit_fitness)" />
    <remap from="/points_raw" to="/sync_drivers/points_raw" if="$(arg sync)" />
  </node>

</launch>

Also modify the odometry subscription topic in ndt_matching.cpp:

 // ros::Subscriber odom_sub = nh.subscribe("/vehicle/odom", _queue_size * 10, odom_callback); //default
  ros::Subscriber odom_sub = nh.subscribe("/ackermann_steering_controller/odom", _queue_size * 10, odom_callback);
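
After changing the source, rebuild the lidar_localizer package so the modified ndt_matching node takes effect. A typical Autoware.ai colcon build from the workspace root is sketched below; adjust the flags to your own build setup:

cd autoware.ai
colcon build --packages-select lidar_localizer --cmake-args -DCMAKE_BUILD_TYPE=Release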

Under the path autoware.ai/src/autoware/core_perception/points_downsampler/launch:
Create a new points_downsample_steer_mini.launch with the following contents:

<launch>
  <arg name="sync" default="false" />
  <arg name="node_name" default="voxel_grid_filter" />
  <arg name="points_topic" default="/velodyne_points" />
  <arg name="output_log" default="false" />
  <arg name="measurement_range" default="200" />

  <node pkg="points_downsampler" name="$(arg node_name)" type="$(arg node_name)">
    <param name="points_topic" value="$(arg points_topic)" />
    <remap from="/points_raw" to="/sync_drivers/points_raw" if="$(arg sync)" />
    <param name="output_log" value="$(arg output_log)" />
    <param name="measurement_range" value="$(arg measurement_range)" />
  </node>
</launch>

Under the path autoware.ai/src/autoware/documentation/autoware_quickstart_examples/launch/rosbag_demo:
Create a new my_localization_steer_mini.launch with the following contents:

<launch>

  <!-- setting path parameter -->
  <arg name="get_height" value="true" />

  <!-- Setup
  <include file="$(find runtime_manager)/launch_files/setup_tf.launch">
    <arg name="x" value="1.2" />
    <arg name="y" value="0.0" />
    <arg name="z" value="2.0" />
    <arg name="yaw" value="0.0" />
    <arg name="pitch" value="0.0" />
    <arg name="roll" value="0.0" />
    <arg name="frame_id" value="/base_link" />
    <arg name="child_frame_id" value="/velodyne" />
    <arg name="period_in_ms" value="10"/>
  </include>-->
  
<!-- <include file="$(find vehicle_description)/launch/vehicle_model.launch" /> -->

  <!-- points downsampler -->
  <include file="$(find points_downsampler)/launch/points_downsample_steer_mini.launch" />

  <!-- nmea2tfpose -->
  <!-- <include file="$(find gnss_localizer)/launch/nmea2tfpose.launch"/> -->

  <!-- ndt_matching -->
  <include file="$(find lidar_localizer)/launch/ndt_matching_steer_mini.launch">
    <arg name="get_height" value="$(arg get_height)" />
  </include>

</launch>

2. Start

roslaunch autoware_quickstart_examples my_map_steer_mini.launch
roslaunch autoware_quickstart_examples my_localization_steer_mini.launch
rviz
Set the initial pose in rviz (required; see the note after these commands)
rosbag play your.bag
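
The initial pose is normally set with the 2D Pose Estimate tool in rviz, which publishes to /initialpose; it can also be sent from the command line. The values below are placeholders for a vehicle starting near the map origin, so adjust them to the actual start pose:

rostopic pub -1 /initialpose geometry_msgs/PoseWithCovarianceStamped '{header: {frame_id: "map"}, pose: {pose: {position: {x: 0.0, y: 0.0, z: 0.0}, orientation: {w: 1.0}}}}'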

Result:

Summary

The above completes the workflow of lio_sam mapping and ndt_matching localization in a simulation scenario. It is intended only for testing; the limitations are that the simulation scene is small and that no GPS data is added to the robot model. Overall, it meets the localization requirements.