Processing point cloud noise from ghosting, rain, and snow

Point cloud noise processing in rain and snow weather

In rainy and snowy weather, point cloud noise can come from several sources:

  1. Interference from precipitation such as raindrops and snowflakes reflecting the lidar beam;
  2. The influence of meteorological factors such as humidity and temperature on the lidar;
  3. Multiple reflections or echoes caused by obstructions or reflective surfaces in the environment.

To reduce point cloud noise, the following processing algorithms can be considered:

  1. Point cloud filtering: remove noise points with Gaussian filtering, median filtering, etc.;
  2. Point cloud down-sampling: down-sample the dense point cloud into a sparser one, which also reduces the amount of noise;
  3. Reflection intensity filtering: set a reflection intensity threshold and filter out points with low reflection intensity (see the sketch after this list);
  4. Plane segmentation: segment the point cloud into planes and filter out non-planar noise points;
  5. Visual screening: manually filter out noise points with visualization tools.
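
As an illustration of item 3, here is a minimal sketch of reflection-intensity filtering using PCL's PassThrough filter. Raindrops and snowflakes usually return weak echoes, so discarding low-intensity points removes much of the precipitation noise. The file name input_cloud.pcd, the pcl::PointXYZI point type, and the threshold of 5.0 are assumptions that need to be adapted to the actual sensor and data.

#include <limits>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/filters/passthrough.h>

int main()
{
  // Load a cloud that carries per-point reflection intensity (XYZI)
  pcl::PointCloud<pcl::PointXYZI>::Ptr cloud (new pcl::PointCloud<pcl::PointXYZI>);
  if (pcl::io::loadPCDFile<pcl::PointXYZI>("input_cloud.pcd", *cloud) < 0)
    return -1;

  // Keep only points whose intensity exceeds the (assumed) threshold of 5.0;
  // precipitation returns are typically much weaker than returns from solid surfaces
  pcl::PointCloud<pcl::PointXYZI>::Ptr filtered (new pcl::PointCloud<pcl::PointXYZI>);
  pcl::PassThrough<pcl::PointXYZI> pass;
  pass.setInputCloud(cloud);
  pass.setFilterFieldName("intensity");
  pass.setFilterLimits(5.0, std::numeric_limits<float>::max());
  pass.filter(*filtered);

  pcl::io::savePCDFile("intensity_filtered.pcd", *filtered);
  return 0;
}

PassThrough is used here because intensity is an ordinary point field; the same filter appears later in this article for depth filtering along the z axis.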

The following is a simple algorithm, with corresponding C++ code, for removing noise such as raindrops and snowflakes from a point cloud:

Algorithm steps:

  1. Extract the normal vector of each point from the point cloud.
  2. For each point, compute the difference between its normal vector and the normals of its surrounding points. If the normals differ greatly, mark the point as noise.
  3. Delete the points marked as noise, or replace them with surrounding points.

C++ code implementation:

#include <iostream>
#include <cmath>
#include <vector>
#include <algorithm>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/features/normal_3d.h>
#include <pcl/search/kdtree.h>

int main (int argc, char** argv)
{
  // Read the input point cloud
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud (new pcl::PointCloud<pcl::PointXYZ>);
  pcl::PCDReader reader;
  reader.read("input_cloud.pcd", *cloud);

  // Down-sample with a voxel grid filter
  pcl::PointCloud<pcl::PointXYZ>::Ptr downsampled_cloud (new pcl::PointCloud<pcl::PointXYZ>);
  pcl::VoxelGrid<pcl::PointXYZ> voxel;
  voxel.setInputCloud(cloud);
  voxel.setLeafSize(0.1f, 0.1f, 0.1f);
  voxel.filter(*downsampled_cloud);

  // Estimate a normal vector for every point
  pcl::PointCloud<pcl::Normal>::Ptr normals (new pcl::PointCloud<pcl::Normal>);
  pcl::NormalEstimation<pcl::PointXYZ, pcl::Normal> ne;
  ne.setInputCloud(downsampled_cloud);
  pcl::search::KdTree<pcl::PointXYZ>::Ptr tree (new pcl::search::KdTree<pcl::PointXYZ> ());
  ne.setSearchMethod(tree);
  ne.setRadiusSearch(0.5);
  ne.compute(*normals);

  // Mark noise points: a point whose normal deviates strongly (on average)
  // from the normals of its spatial neighbors is treated as noise
  pcl::PointCloud<pcl::PointXYZ>::Ptr denoised_cloud (new pcl::PointCloud<pcl::PointXYZ>);
  const float threshold = 0.5f;      // maximum allowed mean angular difference (radians)
  const double search_radius = 0.5;  // neighborhood radius used for the comparison
  tree->setInputCloud(downsampled_cloud);

  for (std::size_t i = 0; i < downsampled_cloud->size(); ++i)
  {
    const pcl::PointXYZ& point = downsampled_cloud->points[i];
    const pcl::Normal& normal = normals->points[i];

    std::vector<int> indices;
    std::vector<float> sq_distances;
    if (tree->radiusSearch(point, search_radius, indices, sq_distances) < 2)
      continue;  // isolated point with no neighbors to compare against: drop it

    // Average angular difference between this point's normal and its neighbors' normals
    float diff_sum = 0.0f;
    int count = 0;
    for (const auto j : indices)
    {
      if (static_cast<std::size_t>(j) == i)
        continue;
      const pcl::Normal& neighbor_normal = normals->points[j];
      float dot = normal.normal_x * neighbor_normal.normal_x +
                  normal.normal_y * neighbor_normal.normal_y +
                  normal.normal_z * neighbor_normal.normal_z;
      dot = std::max(-1.0f, std::min(1.0f, dot));  // clamp before acos
      diff_sum += std::acos(dot);
      ++count;
    }

    // Keep the point only if its normal agrees with the neighborhood
    if (count > 0 && diff_sum / count <= threshold)
      denoised_cloud->push_back(point);
  }

  // Save the result
  pcl::PCDWriter writer;
  writer.write<pcl::PointXYZ>("denoised_cloud.pcd", *denoised_cloud, false);

  return 0;
}

The PCL library is used here; it must be installed and configured before the code can be compiled and run. The comments in the code explain how the algorithm is implemented.

In rainy and snowy weather, point cloud noise is usually caused by the impact and reflection of raindrops or snowflakes, which are then mistaken for valid points in the point cloud data.

One way to deal with point cloud noise is to use statistical filters, such as Gaussian, median, or average filters. These filters can filter out noise based on the statistical information of the surrounding point cloud data.

The following is a simple C++ example that uses the statistical outlier removal filter from the PCL library to filter noise out of a point cloud:

#include <iostream>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/filters/statistical_outlier_removal.h>

int main()
{
  // Create the point cloud objects
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud (new pcl::PointCloud<pcl::PointXYZ>);
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud_filtered (new pcl::PointCloud<pcl::PointXYZ>);

  // Read point cloud data from file
  pcl::io::loadPCDFile<pcl::PointXYZ> ("input_cloud.pcd", *cloud);

  // Create the statistical filter
  pcl::StatisticalOutlierRemoval<pcl::PointXYZ> sor;
  sor.setInputCloud(cloud);
  sor.setMeanK(50);             // number of neighbor points used to compute the statistics
  sor.setStddevMulThresh(1.0);  // multiplier threshold for the standard deviation
  sor.filter(*cloud_filtered);  // apply the filter

  // Save the filtered point cloud data to a file
  pcl::io::savePCDFile("output_cloud.pcd", *cloud_filtered);

  return 0;
}

In the above code, the input point cloud data is stored in a file named “input_cloud.pcd” and the filtered point cloud data is stored in a file named “output_cloud.pcd”. The parameters of the statistical filter can be modified as needed to achieve a better filtering effect.

Rain and snow can cause noise in the point cloud because water droplets or snowflakes interfere with the laser scanner's measurements. These interfering returns can be mistaken for objects in the point cloud, resulting in noisy points.

To deal with this noise, the following algorithms are usually employed:

  1. Spatial filtering: delete points that are too far from their adjacent points by distance-based filtering of the point cloud data. This method effectively removes isolated noise points.

  2. Temporal filtering: compare the similarity of two consecutive point cloud frames and remove points that change greatly between them (a sketch follows this list). This method works well for moving objects against a static background.

  3. Adaptive Gaussian filtering: apply a Gaussian filter to the point cloud, using a shorter filter window to remove small noise and a longer window to preserve large structures and objects in the point cloud.

  4. Statistical filtering: use statistical methods to remove points that deviate too much from the local average. This approach is suitable for very dense point cloud data.
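
To make item 2 concrete, here is a minimal sketch of temporal filtering between two consecutive frames. It assumes the sensor is approximately static between frames (no ego-motion compensation), that the frames are stored in frame_prev.pcd and frame_curr.pcd, and that a 5 cm correspondence radius is suitable; all of these are assumptions to tune for the actual setup. A point in the current frame is kept only if the previous frame also has a return nearby, since precipitation returns rarely reappear at the same location.

#include <vector>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/search/kdtree.h>

int main()
{
  // Two consecutive frames (file names are placeholders)
  pcl::PointCloud<pcl::PointXYZ>::Ptr prev (new pcl::PointCloud<pcl::PointXYZ>);
  pcl::PointCloud<pcl::PointXYZ>::Ptr curr (new pcl::PointCloud<pcl::PointXYZ>);
  pcl::io::loadPCDFile<pcl::PointXYZ>("frame_prev.pcd", *prev);
  pcl::io::loadPCDFile<pcl::PointXYZ>("frame_curr.pcd", *curr);

  // Index the previous frame for fast nearest-neighbor lookups
  pcl::search::KdTree<pcl::PointXYZ> tree;
  tree.setInputCloud(prev);

  // Keep a point of the current frame only if the previous frame has a
  // return within the correspondence radius
  const double radius = 0.05;  // assumed correspondence radius in meters
  pcl::PointCloud<pcl::PointXYZ>::Ptr stable (new pcl::PointCloud<pcl::PointXYZ>);
  std::vector<int> indices;
  std::vector<float> sq_dists;
  for (const auto& p : curr->points)
  {
    if (tree.radiusSearch(p, radius, indices, sq_dists, 1) > 0)
      stable->push_back(p);
  }

  pcl::io::savePCDFile("frame_curr_filtered.pcd", *stable);
  return 0;
}

If the vehicle is moving, the previous frame must first be transformed into the current frame's coordinate system (for example using the ego pose or a registration step) before the comparison.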

For a C++ implementation, you can use a point cloud processing library such as PCL (Point Cloud Library) or Open3D. These libraries contain many off-the-shelf point cloud filtering algorithms, and many related code samples can be found online. Here is a sample using the PCL library:

#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/filters/voxel_grid.h>                  // spatial (voxel) filtering
#include <pcl/filters/passthrough.h>                 // depth filtering
#include <pcl/filters/statistical_outlier_removal.h> // statistical filtering
#include <pcl/filters/radius_outlier_removal.h>      // radius filtering

int main()
{
  pcl::PointCloud<pcl::PointXYZ>::Ptr noisy_cloud (new pcl::PointCloud<pcl::PointXYZ>);
  pcl::PointCloud<pcl::PointXYZ>::Ptr filtered_cloud (new pcl::PointCloud<pcl::PointXYZ>);

  // Load point cloud data
  pcl::io::loadPCDFile("noisy_cloud.pcd", *noisy_cloud);

  // Spatial (voxel grid) filter
  pcl::VoxelGrid<pcl::PointXYZ> voxel;
  voxel.setInputCloud(noisy_cloud);
  voxel.setLeafSize(0.01f, 0.01f, 0.01f);
  voxel.filter(*filtered_cloud);

  // Depth (pass-through) filter
  pcl::PassThrough<pcl::PointXYZ> passthrough;
  passthrough.setInputCloud(filtered_cloud);
  passthrough.setFilterFieldName("z");
  passthrough.setFilterLimits(0.0, 1.0);
  passthrough.filter(*filtered_cloud);

  // Statistical outlier removal filter
  pcl::StatisticalOutlierRemoval<pcl::PointXYZ> sor;
  sor.setInputCloud(filtered_cloud);
  sor.setMeanK(50);
  sor.setStddevMulThresh(1.0);
  sor.filter(*filtered_cloud);

  // Radius outlier removal filter
  pcl::RadiusOutlierRemoval<pcl::PointXYZ> ror;
  ror.setInputCloud(filtered_cloud);
  ror.setRadiusSearch(0.1);
  ror.setMinNeighborsInRadius(10);
  ror.filter(*filtered_cloud);

  // Save the processed point cloud data
  pcl::io::savePCDFile("filtered_cloud.pcd", *filtered_cloud);

  return 0;
}

The above is a simple point cloud filtering example that combines spatial, depth, statistical, and radius filtering. You can modify and adjust it as needed to suit your application scenario.

Ghosting

Ghosting in point clouds usually refers to noise or artifacts caused by multiple reflections and intensity variations during a LiDAR scan. These artifacts degrade the quality and accuracy of the point cloud, so some preprocessing or filtering is required to remove them.

Common processing algorithms include:

  1. Point cloud denoising: use filters or clustering algorithms to remove noise or sparse data points;

  2. Point cloud smoothing: use smoothing algorithms to smooth out noise or artifacts in the point cloud (see the sketch after this list);

  3. Point cloud registration: use a registration algorithm to fuse multiple point clouds together and reduce noise and artifacts;

  4. Point cloud segmentation: segment and classify the different objects in a point cloud to reduce the impact of noise and artifacts.
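
As a concrete example of item 2, here is a minimal sketch of point cloud smoothing using PCL's MovingLeastSquares. The input file name ghosted_cloud.pcd, the polynomial order, and the 0.05 search radius are assumptions to adjust for the actual data; MLS is one possible smoothing choice, not the only one.

#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/search/kdtree.h>
#include <pcl/surface/mls.h>

int main()
{
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud (new pcl::PointCloud<pcl::PointXYZ>);
  pcl::io::loadPCDFile<pcl::PointXYZ>("ghosted_cloud.pcd", *cloud);

  // Moving Least Squares smoothing: each point is projected onto a local
  // polynomial surface fitted to its neighborhood, which damps small
  // artifacts near real surfaces
  pcl::search::KdTree<pcl::PointXYZ>::Ptr tree (new pcl::search::KdTree<pcl::PointXYZ>);
  pcl::PointCloud<pcl::PointNormal> smoothed;
  pcl::MovingLeastSquares<pcl::PointXYZ, pcl::PointNormal> mls;
  mls.setInputCloud(cloud);
  mls.setSearchMethod(tree);
  mls.setPolynomialOrder(2);
  mls.setSearchRadius(0.05);   // assumed neighborhood radius, tune per dataset
  mls.setComputeNormals(true);
  mls.process(smoothed);

  pcl::io::savePCDFile("smoothed_cloud.pcd", smoothed);
  return 0;
}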

For the C++ implementation of these algorithms, open-source libraries such as PCL (Point Cloud Library) or OpenCV can be used. They provide many ready-made filtering, clustering, smoothing, and registration algorithms, which makes it convenient to process point cloud data.

Ghosting in point clouds refers to duplicated or overlapping structures in the point cloud data. It may be caused by multiple reflections in overlapping areas of the lidar scans. This phenomenon makes the point cloud data confusing and affects subsequent analysis and processing, so the ghost points need to be removed.

A ghost removal algorithm typically involves the following steps:

  1. Segment point cloud data into different groups or subregions based on the characteristics of the LiDAR scan.

  2. Register the point cloud data within the same group or sub-region to remove ghost images, usually using the ICP (Iterative Closest Point) algorithm or another registration algorithm.

  3. Use point cloud filtering algorithms, such as Gaussian filtering or median filtering, to remove noise.

  4. Reassemble the processed point cloud data.

The following is a C++ sample code implemented based on the PCL library to remove ghosting:

#include <iostream>
#include <pcl/filters/passthrough.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/registration/icp.h>
#include <pcl/point_types.h>
#include <pcl/io/pcd_io.h>

int main(int argc, char** argv)
{
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud_in(new pcl::PointCloud<pcl::PointXYZ>);
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud_out(new pcl::PointCloud<pcl::PointXYZ>);

    pcl::io::loadPCDFile<pcl::PointXYZ>("input.pcd", *cloud_in);

    // segment the point cloud
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud_filtered(new pcl::PointCloud<pcl::PointXYZ>);
    pcl::PassThrough<pcl::PointXYZ> pass;
    pass.setInputCloud(cloud_in);
    pass.setFilterFieldName("z");
    pass.setFilterLimits(0.0, 1.0);
    pass.filter(*cloud_filtered);

    // Voxel downsampling
    pcl::VoxelGrid<pcl::PointXYZ> sor;
    sor.setInputCloud(cloud_filtered);
    sor.setLeafSize(0.01f, 0.01f, 0.01f);
    sor.filter(*cloud_out);

    // registration
    pcl::IterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> icp;
    icp.setInputSource(cloud_out);
    icp.setInputTarget(cloud_in);
    pcl::PointCloud<pcl::PointXYZ>::Ptr Final (new pcl::PointCloud<pcl::PointXYZ>);
    icp.align(*Final);

    pcl::io::savePCDFileASCII("output.pcd", *Final);

    return (0);
}

The above code keeps the part of the input point cloud with height (z) between 0 and 1, then applies voxel down-sampling and the ICP algorithm, and finally saves the processed point cloud to an output file. You can adjust the parameters as needed to obtain better results.