Which pcl filter to use to downsample a point cloud

tony497 · Apr 13, 2018

I am getting a point cloud from a lidar on an autonomous driving robot, but it's too much data to process.

I already implemented a passthrough filter.

I did get a very good result, and I was asking myself if there are other filters or methods I could dig into.

Of course I'm not looking for anything specific, just a direction or some advice, because I'm pretty new to the PCL library and it seems pretty huge.

Here is my filter now:

    pcl::PointCloud<PointXYZIR>::Ptr cloudInput;
    cloudInput.reset(new pcl::PointCloud<PointXYZIR> (cloud_in));

    pcl::PointCloud<PointXYZIR>::Ptr cloudFiltered;
    cloudFiltered.reset(new pcl::PointCloud<PointXYZIR>);

    // Create the filtering object: downsample the dataset using a leaf size
    pcl::VoxelGrid<PointXYZIR> avg;
    avg.setInputCloud(cloudInput);
    avg.setLeafSize(0.25f, 0.25f, 0.25f);
    avg.filter(*cloudFiltered);

    // Passthrough filter: keep only points within +/-100 m along x, then y
    pcl::PassThrough<PointXYZIR> filter;
    filter.setInputCloud(cloudFiltered);

    filter.setFilterFieldName("x");
    filter.setFilterLimits(-100, 100);
    filter.filter(*cloudFiltered);

    filter.setFilterFieldName("y");
    filter.setFilterLimits(-100, 100);
    filter.filter(*cloudFiltered);

    cloud_out = *cloudFiltered;

Answer

tony497 · Apr 17, 2018

Actually I did find a solution, but there is no general one: I think this problem is very specific to which point cloud you get and what you want to do with it.

The passthrough filter is a very efficient way to downsample without taking too much risk of losing interesting data.

http://pointclouds.org/documentation/tutorials/passthrough.php

Then I tested StatisticalOutlierRemoval; it is efficient, but it was not relevant in my case.

http://pointclouds.org/documentation/tutorials/statistical_outlier.php
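
For reference, a minimal sketch of how StatisticalOutlierRemoval can be wired into the same pipeline (the point type PointXYZIR and the cloud names match the snippets above; the meanK and stddev values are just placeholder assumptions to illustrate the API):

    #include <pcl/filters/statistical_outlier_removal.h>

    // Statistical outlier removal: drop points whose mean distance to their
    // k nearest neighbours is more than the threshold multiplier times the
    // standard deviation of the whole cloud
    pcl::StatisticalOutlierRemoval<PointXYZIR> sor;
    sor.setInputCloud(cloudFiltered);
    sor.setMeanK(50);             // number of neighbours to analyse (assumed value)
    sor.setStddevMulThresh(1.0);  // distance threshold multiplier (assumed value)
    sor.filter(*cloudFiltered);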

Now I downsample the point cloud with the VoxelGrid (leaf size) filter, and then build a kd-tree to keep only the points within a given radius of a search point. It is about the same amount of computation as the passthrough filter, but it makes more sense in my project to do it this way.

    // Create the filtering object: downsample the dataset using a leaf size
    pcl::VoxelGrid<PointXYZIR> avg;
    avg.setInputCloud(cloudInput);
    avg.setLeafSize(0.25f, 0.25f, 0.25f);
    avg.filter(*cloudFiltered);

    // searchPoint: here simply the first point of the downsampled cloud
    PointXYZIR searchPoint = cloudFiltered->at(0);

    // results from radiusSearch()
    std::vector<int> pointIdxRadiusSearch;
    std::vector<float> pointRadiusSquaredDistance;

    // kd-tree
    pcl::KdTreeFLANN<PointXYZIR> kdtree;
    kdtree.setInputCloud(cloudFiltered);
    kdtree.setSortedResults(true);

    if (kdtree.radiusSearch(searchPoint, 100, pointIdxRadiusSearch, pointRadiusSquaredDistance) > 0)
    {
        // keep only the points within the radius: copy them into the output cloud
        for (size_t j = 0; j < pointIdxRadiusSearch.size(); ++j)
        {
            cloud_out.push_back(cloudFiltered->points[pointIdxRadiusSearch[j]]);
        }
    }
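
As a side note, the loop does not erase anything from cloudFiltered; it just copies the inliers found by radiusSearch into cloud_out, which comes to the same result. If you want something more compact, the index vector returned by radiusSearch should also work directly with pcl::copyPointCloud (a sketch, not tested here):

    #include <pcl/common/io.h>  // pcl::copyPointCloud

    // same effect as the loop above: copy the points selected by radiusSearch
    pcl::copyPointCloud(*cloudFiltered, pointIdxRadiusSearch, cloud_out);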