PCL: Normal estimation or visualization not working - C++

I am trying to start working with PCL and do this basic example:
https://pcl.readthedocs.io/projects/tutorials/en/latest/how_features_work.html#how-3d-features-work
However, either the normal estimation or the visualization of the estimated normals doesn't work as it should.
As some test data, I generated this simple mesh in Blender:
I am using this test object because my real point cloud data, recorded with an actual 3D LIDAR sensor, is far more complex; just for the sake of isolating this issue, I am using the simpler object here.
I convert this mesh into a PCL point cloud and that also works well.
Here is the view of the built in PCL point cloud visualizer:
However, after doing a surface normal estimation and trying to visualize it, the result is very crude:
I would have expected a view similar to the plain point cloud visualization, since I passed both the original point cloud AND the normals. Instead I only get eleven normals (I counted them), some of which don't even seem to point in the right direction, and the original point cloud is not rendered at all.
The point cloud has 1024 points in it and the estimated normal set also has 1024 estimated normals in it so I am not sure what is going wrong here.
I ran this program with different values for the search radius (0.03, 0.06, 0.3, 0.6, 1, 3) but got similar results.
Here is my code:
int main()
{
    // Allocate the cloud behind the shared_ptr; wrapping the address of a
    // stack object in a Ptr would double-free when both go out of scope.
    PointCloud<PointXYZ>::Ptr cloud_ptr(new PointCloud<PointXYZ>());
    string filename = "C:\\Users\\ilmu0\\Desktop\\Blendertestobjects\\planeconcave.obj";
    int readresult = OBJReader::readPointCloudFromFile(filename, cloud_ptr); // my own class which reads Blender ".obj" files
    if (readresult != 0)
    {
        cout << "File could not be read" << endl;
        return 1;
    }
    cout << "Cloud size: " << cloud_ptr->size() << endl; // --> 1024
    NormalEstimation<PointXYZ, Normal> estimation;
    estimation.setInputCloud(cloud_ptr);
    KdTree<PointXYZ>::Ptr tree_ptr(new KdTree<PointXYZ>());
    estimation.setSearchMethod(tree_ptr);
    PointCloud<Normal>::Ptr normalCloud_ptr(new PointCloud<Normal>());
    estimation.setRadiusSearch(0.6);
    estimation.compute(*normalCloud_ptr);
    cout << "Normals: " << normalCloud_ptr->size() << endl; // --> 1024
    string result;
    cin >> result;
    // This is used to visualize the original point cloud only
    //CloudViewer viewer("Simple Cloud Viewer");
    //viewer.showCloud(cloud_ptr);
    //while (!viewer.wasStopped())
    //{
    //}
    PCLVisualizer viewer("PCL Viewer");
    viewer.setBackgroundColor(0.0, 0.0, 0.5);
    viewer.addPointCloudNormals<PointXYZ, Normal>(cloud_ptr, normalCloud_ptr);
    while (!viewer.wasStopped())
    {
        viewer.spinOnce();
    }
    return 0;
}
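For what it's worth, the behaviour of `addPointCloudNormals` by itself explains both symptoms: it only draws every level-th normal (the default is level = 100, which out of 1024 points leaves roughly the eleven counted above), and it does not render the underlying points at all; those need a separate `addPointCloud` call under a different id. A hedged sketch using the identifiers from the code above (untested):

```cpp
PCLVisualizer viewer("PCL Viewer");
viewer.setBackgroundColor(0.0, 0.0, 0.5);
// Render the points themselves under their own id...
viewer.addPointCloud<PointXYZ>(cloud_ptr, "cloud");
// ...then every normal (level = 1), drawn 5 cm long, under a second id.
viewer.addPointCloudNormals<PointXYZ, Normal>(cloud_ptr, normalCloud_ptr, 1, 0.05, "normals");
```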

Related

Computation of the matrix inverse using the Eigen C++ library introduces noise

I have a publish-subscribe type of node that receives pose information (position and orientation) from the subscribed data stream; it should compute the inverse and publish it.
To do so, I create a 4-by-4 homogeneous transformation matrix from the original pose data,
invert it using the Eigen C++ template library, convert the transformation matrix back to position-and-orientation form, and publish it.
When I plotted the published data stream I noticed some noise, so I ended up publishing the original data too for comparison. Here is what I did:
1. convert original_pose to a TF matrix, named original_TF
2. convert original_TF back to a pose, named original_pose_
3. publish original_pose_
4. invert original_TF and assign it to inverted_TF
5. convert inverted_TF to a pose, named inverted_pose_
6. publish inverted_pose_
When I plot the X, Y, Z position fields, I'm seeing a significant amount of noise (spikes and notches in the visual below) in the inverted pose data. Since I'm using the same functions to convert the original pose to TF and back, I know that those equations aren't the source of the noise.
Blue is the original, whereas red is the inverted.
Here is the code. Really nothing extraordinary.
bool inverse_matrix(std::vector<std::vector<double> > & input, std::vector<std::vector<double> > & output)
{
    // TODO: Currently only supports 4-by-4 matrices, I can make this configurable.
    // see https://eigen.tuxfamily.org/dox/group__TutorialMatrixClass.html
    Eigen::Matrix4d input_matrix;
    Eigen::Matrix4d output_matrix;
    Eigen::VectorXcd input_eivals;
    Eigen::VectorXcd output_eivals;
    input_matrix << input[0][0], input[0][1], input[0][2], input[0][3],
                    input[1][0], input[1][1], input[1][2], input[1][3],
                    input[2][0], input[2][1], input[2][2], input[2][3],
                    input[3][0], input[3][1], input[3][2], input[3][3];
    cout << "Here is the matrix input:\n" << input_matrix << endl;
    input_eivals = input_matrix.eigenvalues();
    cout << "The eigenvalues of the input_eivals are:" << endl << input_eivals << endl;
    // An exact floating-point comparison against 0 almost never triggers;
    // a tolerance on |det| is more robust.
    if (std::abs(input_matrix.determinant()) < 1e-12) { return false; }
    output_matrix = input_matrix.inverse();
    cout << "Here is the matrix output:\n" << output_matrix << endl;
    output_eivals = output_matrix.eigenvalues();
    cout << "The eigenvalues of the output_eivals are:" << endl << output_eivals << endl;
    // Copy output_matrix to output
    for (int i = 0; i < 16; ++i)
    {
        int in = i / 4;
        int im = i % 4;
        output[in][im] = output_matrix(in, im);
    }
    return true;
}
-- Edit 1 --
I printed out the eigenvalues of the input and output matrices of the inverse_matrix function.
Here is the matrix input:
0.99916 -0.00155684 -0.0409514 0.505506
0.00342358 -0.992614 0.121267 0.19625
-0.0408377 -0.121305 -0.991775 1.64257
0 0 0 1
The eigenvalues of the input_eivals are:
(1,0)
(-0.992614,0.121312)
(-0.992614,-0.121312)
(1,0)
Here is the matrix output:
0.99916 0.00342358 -0.0408377 -0.438674
-0.00155684 -0.992614 -0.121305 0.39484
-0.0409514 0.121267 -0.991775 1.62597
-0 -0 0 1
The eigenvalues of the output_eivals are:
(1,0)
(-0.992614,0.121312)
(-0.992614,-0.121312)
(1,0)
-- Edit 2 --
I don't quite understand what you are plotting. Is it original_pose.{X,Y,Z} and inverted_pose.{X,Y,Z}? Then the "spikes" will really depend on the orientation-part of the matrix.
I am plotting original_pose_{position.x, position.y, position.z} and inverted_pose_{position.x, position.y, position.z} where the complete data that's published is <variable_name>{position.x, position.y, position.z, orientation.w, orientation.x, orientation.y, orientation.z}.
Can you elaborate on "the "spikes" will really depend on the orientation-part of the matrix."?
Also, how is your description related to the code-snippet? (I don't see any matching variable names).
I've identified that the source of the noise is the inversion, which is the item number 4 in my description: inverse original_TF assign to inverted_TF. To relate one another, I'm calling the function as follows:
isSuccess = inverse_matrix(original_TF, inverted_TF);
How do you store "poses" (is that the vector<vector> in your snippet)?
Yes, I'm storing them in 2-dimensional vectors of type double.
At any point, do you use Eigen::Transform to store transformations, or just plain Eigen::Matrix4d?
No, I'm only using Eigen::Matrix4d locally in the inverse_matrix function to be able to make use of the Eigen library for computation.
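On the Eigen::Transform question: if the pose were stored in an Eigen::Isometry3d rather than a plain Matrix4d, Eigen would know the transform is rigid, and .inverse() would use the closed-form R^T / -R^T t inverse instead of a general matrix inversion. A hedged sketch (untested; pose_matrix is a placeholder for the assembled 4-by-4 Matrix4d):

```cpp
#include <Eigen/Geometry>

Eigen::Isometry3d pose = Eigen::Isometry3d::Identity();
pose.matrix() = pose_matrix;                 // the 4x4 homogeneous transform from above
Eigen::Isometry3d inverted = pose.inverse(); // exploits rigidity: R^T and -R^T t, no LU
```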

PCL: PCLvisualizer multiple point clouds (XYZ) in same viewport with different colours

I have a vector of point clouds of spatial points (XYZ only) which I am trying to display in the same window with different colours, using custom colour handlers in the PCLVisualizer.
Unfortunately the colours in the viewer are not different: all the points are displayed in the same colour (the second one). It seems as if the colour of all previously added points changes each time a new point cloud is added.
Here is my code:
pcl::visualization::PCLVisualizer viewer;
viewer.setBackgroundColor (0, 0, 0);
std::stringstream cloud_name;
int counter(0);
pcl::RGB rgb_color;
for (auto curr_cloud : clouds_vector)
{
    rgb_color = pcl::GlasbeyLUT::at(counter);
    ++counter;
    cloud_name.str("");
    cloud_name << "Cloud " << counter;
    if (counter < 5)
    {
        pcl::visualization::PointCloudColorHandlerCustom<pcl::PointXYZ> first_color(curr_cloud, rgb_color.r, rgb_color.g, rgb_color.b);
        viewer.addPointCloud<pcl::PointXYZ> (curr_cloud, first_color, cloud_name.str());
    }
    else
    {
        pcl::visualization::PointCloudColorHandlerCustom<pcl::PointXYZ> second_color(curr_cloud, rgb_color.r, rgb_color.g, rgb_color.b);
        viewer.addPointCloud<pcl::PointXYZ> (curr_cloud, second_color, cloud_name.str());
    }
    viewer.setPointCloudRenderingProperties (pcl::visualization::PCL_VISUALIZER_POINT_SIZE, 8, cloud_name.str());
}
Any ideas why this might be and what to change?
Many thanks to you all in advance!
EDIT: I have experimented a little, and the problem doesn't seem to be the way I'm visualising the point clouds itself (the above-described approach works with freshly created clouds holding random values).
In my use case I am extracting multiple point clouds from a single point cloud with the help of their indices. When I do that, they still all have the same colour upon visualisation. I can't figure out why that is and am happy for any suggestions. Here is the full code with the extraction (a randomly created mother point cloud, from which the extraction takes place, serves as a minimal example):
PointCloudT::Ptr cloud (new PointCloudT);
// 100 points in point cloud, random coordinates
cloud->resize(100);
for (PointCloudT::iterator cloud_it (cloud->begin()); cloud_it != cloud->end(); ++cloud_it) {
    cloud_it->x = 1024 * rand () / (RAND_MAX + 1.0f);
    cloud_it->y = 1024 * rand () / (RAND_MAX + 1.0f);
    cloud_it->z = 1024 * rand () / (RAND_MAX + 1.0f);
}
// Create index vectors containing 20 indices each
std::vector <pcl::PointIndices> point_indices_vector;
int offset(20);
pcl::PointIndices indices;
for (int i = 0; i < 5; ++i) {
    for (int j = 0; j < 20; ++j) {
        indices.indices.push_back(j + i * offset);
    }
    point_indices_vector.push_back(indices);
    indices.indices.clear(); // <----- THIS WAS MISSING
}
// Create extraction object to copy points from 100 strong cloud to 20-point clouds
pcl::ExtractIndices<PointT> extract;
extract.setInputCloud (cloud);
extract.setNegative (false);
// Extract points and copy them to clouds vector
std::vector<PointCloudT::Ptr, Eigen::aligned_allocator<PointCloudT::Ptr> > clouds_vector;
PointCloudT::Ptr curr_segment_cloud;
for (auto curr_index_vector : point_indices_vector) {
    curr_segment_cloud.reset (new PointCloudT);
    // Copy points of current line segment from source cloud into current segment cloud
    extract.setIndices(boost::make_shared<const pcl::PointIndices> (curr_index_vector));
    extract.filter (*curr_segment_cloud);
    // Push back point cloud into return vector
    clouds_vector.push_back(curr_segment_cloud);
}
// Create viewer
pcl::visualization::PCLVisualizer viewer;
viewer.setBackgroundColor (0, 0, 0);
// Visualize point clouds from clouds vector
std::stringstream cloud_name;
int counter(0);
pcl::RGB rgb;
for (auto curr_cloud : clouds_vector) {
    ++counter;
    cloud_name.str("");
    cloud_name << "Cloud " << counter;
    // Generate unique colour
    rgb = pcl::GlasbeyLUT::at(counter);
    // Create colour handle
    pcl::visualization::PointCloudColorHandlerCustom<pcl::PointXYZ> colour_handle(curr_cloud, rgb.r, rgb.g, rgb.b);
    // Add points to viewer and set parameters
    viewer.addPointCloud<pcl::PointXYZ> (curr_cloud, colour_handle, cloud_name.str());
    viewer.setPointCloudRenderingProperties (pcl::visualization::PCL_VISUALIZER_POINT_SIZE, 8, cloud_name.str());
}
// Keep viewer running
while (!viewer.wasStopped()) {
    viewer.spinOnce(100);
    boost::this_thread::sleep (boost::posix_time::microseconds (100000));
}
Problem solved!
So if anybody wants to use the code... both of the snippets above are now working. (I found the mistake and corrected it in the above code snippet).
I'm embarrassed to say that I simply forgot to clear my extraction indices vector before extracting the points. So the last cloud was the full cloud again, and it was plotted over all of the previously added points, making them invisible. That way all the points seemed to have the same colour. o.O
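The bug in miniature, independent of PCL: a container reused as a scratch buffer inside a loop keeps growing unless it is cleared on every pass, so the last "group" silently contains everything accumulated so far. A stdlib-only sketch (hypothetical names, mirroring the index-building loop above):

```cpp
#include <vector>

// Split [0, 100) into five groups of 20, reusing one scratch vector.
std::vector<std::vector<int>> make_groups()
{
    std::vector<std::vector<int>> groups;
    std::vector<int> indices;                 // scratch buffer, reused each pass
    for (int i = 0; i < 5; ++i) {
        for (int j = 0; j < 20; ++j)
            indices.push_back(i * 20 + j);
        groups.push_back(indices);
        indices.clear();                      // forgetting this makes each group contain all earlier ones
    }
    return groups;
}
```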

NaN values in normal point

I'm having trouble verifying the values contained in the coordinates of the normals. I estimate the normals of my cloud, but when I check the xyz coordinates of a normal, NaN is returned.
pcl::PointCloud<pcl::PointXYZ>::Ptr cloud (new pcl::PointCloud<pcl::PointXYZ>);
pcl::PointCloud<pcl::Normal>::Ptr normals (new pcl::PointCloud<pcl::Normal>);
// Object for normal estimation.
pcl::NormalEstimation<pcl::PointXYZ, pcl::Normal> normalEstimation;
normalEstimation.setInputCloud(cloud);
// For every point, use all neighbors in a radius of 3cm.
normalEstimation.setRadiusSearch(0.03);
// A kd-tree is a data structure that makes searches efficient. More about it later.
// The normal estimation object will use it to find nearest neighbors.
pcl::search::KdTree<pcl::PointXYZ>::Ptr kdtree(new pcl::search::KdTree<pcl::PointXYZ>);
normalEstimation.setSearchMethod(kdtree);
// Calculate the normals.
normalEstimation.compute(*normals);
for (size_t i = 0; i < cloud->points.size(); i++) {
    std::cout << "x: " << cloud->points[i].x << std::endl;
    std::cout << "normal x: " << normals->points[i].normal_x << std::endl; // -nan here
}
How can I solve it?
This seems like a problem with your input point cloud. Make sure your point cloud is not a flat plane, and make sure you don't have duplicate points. In my case, what was ailing me was the search radius, so try playing with that too.
I just encountered a similar problem with very similar code. The comments were very helpful, thanks. This is probably too late, but it might still help somebody.
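For context, NormalEstimation sets a normal to NaN when it cannot fit a plane, typically because fewer than three neighbours fall inside the search radius; PCL also ships pcl::removeNaNNormalsFromPointCloud for stripping such entries. Checking for them yourself is straightforward; a stdlib-only sketch with a hypothetical stand-in for pcl::Normal:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct Normal { float normal_x, normal_y, normal_z; };  // stand-in for pcl::Normal

// Indices of normals whose components are all finite; the matching
// XYZ points can then be kept in lockstep.
std::vector<std::size_t> finite_normal_indices(const std::vector<Normal>& normals)
{
    std::vector<std::size_t> keep;
    for (std::size_t i = 0; i < normals.size(); ++i)
        if (std::isfinite(normals[i].normal_x) &&
            std::isfinite(normals[i].normal_y) &&
            std::isfinite(normals[i].normal_z))
            keep.push_back(i);
    return keep;
}
```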

Storing and adding past point clouds from kinect using point cloud library and ROS

I am trying to build a local map by adding point clouds from a Kinect, using iterative closest point from the Point Cloud Library and ROS Hydro on Ubuntu 12.04. However, I am not able to add consecutive point clouds together to update the map: the aligned point cloud only gets added to the source point cloud of the current frame, and I am having trouble storing the previous point clouds. As seen from the code, I update the map with
Final+=*cloud_in;
However, a new Final is computed every time, so I lose the old Final value, and I need to retain it. I am a novice in C++ and ROS, so I would greatly appreciate help on this matter.
Listed below is the Code:
ros::Publisher _pub;
pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud_in (new pcl::PointCloud<pcl::PointXYZRGB>);

void cloud_cb2 (const sensor_msgs::PointCloud2ConstPtr& next_input)
{
    pcl::fromROSMsg (*next_input, *cloud_in);
    // remove NAN points from the cloud
    std::vector<int> indices;
    pcl::removeNaNFromPointCloud(*cloud_in, *cloud_in, indices);
    // Convert the sensor_msgs/PointCloud2 data to pcl::PointCloud
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud2_in (new pcl::PointCloud<pcl::PointXYZRGB>);
    pcl::fromROSMsg (*next_input, *cloud2_in);
    // remove NAN points
    std::vector<int> indices2;
    pcl::removeNaNFromPointCloud(*cloud2_in, *cloud2_in, indices2);
    pcl::IterativeClosestPoint<pcl::PointXYZRGB, pcl::PointXYZRGB> icp;
    icp.setInputSource(cloud2_in);
    icp.setInputTarget(cloud_in);
    pcl::PointCloud<pcl::PointXYZRGB> Final;
    icp.align(Final);
    std::cout << "has converged:" << icp.hasConverged() << " score: " <<
        icp.getFitnessScore() << std::endl;
    std::cout << icp.getFinalTransformation() << std::endl;
    Final += *cloud_in;
    // Convert the pcl::PointCloud to sensor_msgs/PointCloud2
    sensor_msgs::PointCloud2 output;
    pcl::toROSMsg( Final, output );
    // Publish the map
    _pub.publish(output);
}

int main (int argc, char** argv)
{
    ros::init (argc, argv, "my_pcl_tutorial");
    ros::NodeHandle nh;
    // ROS subscriber for /camera/depth_registered/points
    ros::Subscriber sub = nh.subscribe(
        "/camera/depth_registered/points",
        2,
        cloud_cb2
    );
    // Create ROS publisher for transformed pointcloud
    _pub = nh.advertise<sensor_msgs::PointCloud2>(
        "output",
        1
    );
    // Spin
    ros::spin ();
}
I think you are doing it wrong. The idea of cloud_cb2 is to be a callback (at least that is the convention in the examples that use a similar name and signature): every time this function is entered, it hands you a new cloud, which you should integrate into your previous cloud.
By doing pcl::fromROSMsg (*next_input, *cloud2_in); you are just reading the same message into a second cloud; as I said before, it should not be like that.
Then, to answer your question:
icp.align(Final);
If you read the PCL tutorial here, it tells you that this function receives as input a point cloud variable that will contain the ICP result.
Also, the result will be the alignment (or try) of
icp.setInputSource(cloud2_in);
to match
icp.setInputTarget(cloud_in);
So you are overwriting Final with the alignment of the two new clouds, and then adding the points of cloud_in, which are already in that point cloud.
I recommend you check your workflow again; it should be something like this:
ros::Publisher _pub;
pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud_in (new pcl::PointCloud<pcl::PointXYZRGB>);
pcl::PointCloud<pcl::PointXYZRGB>::Ptr Final (new pcl::PointCloud<pcl::PointXYZRGB>);

void cloud_cb2 (const sensor_msgs::PointCloud2ConstPtr& next_input)
{
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr tmp_cloud (new pcl::PointCloud<pcl::PointXYZRGB>);
    pcl::fromROSMsg (*next_input, *cloud_in);
    // remove NAN points from the cloud
    std::vector<int> indices;
    pcl::removeNaNFromPointCloud(*cloud_in, *cloud_in, indices);
    pcl::IterativeClosestPoint<pcl::PointXYZRGB, pcl::PointXYZRGB> icp;
    icp.setInputSource(cloud_in);
    icp.setInputTarget(Final);   // Final persists across callbacks (global above)
    icp.align(*tmp_cloud);       // align() takes a cloud, not a Ptr
    std::cout << "has converged:" << icp.hasConverged() << " score: " <<
        icp.getFitnessScore() << std::endl;
    std::cout << icp.getFinalTransformation() << std::endl;
    // Integrate the aligned cloud into the persistent map.
    // (Do NOT redeclare a local Final here, or the global map gets shadowed.)
    *Final += *tmp_cloud;
    // Convert the pcl::PointCloud to sensor_msgs/PointCloud2
    sensor_msgs::PointCloud2 output;
    pcl::toROSMsg( *Final, output );
    // Publish the map
    _pub.publish(output);
}

int main (int argc, char** argv)
{
    ros::init (argc, argv, "my_pcl_tutorial");
    ros::NodeHandle nh;
    // ROS subscriber for /camera/depth_registered/points
    ros::Subscriber sub = nh.subscribe(
        "/camera/depth_registered/points",
        2,
        cloud_cb2
    );
    // Create ROS publisher for transformed pointcloud
    _pub = nh.advertise<sensor_msgs::PointCloud2>(
        "output",
        1
    );
    // Spin
    ros::spin ();
}
I just made some changes to show how it should look, but I haven't tested it, so you will probably need to correct it further. I hope this answer helps you. Also, I do not know how the ICP algorithm will behave on the first callback, when the Final cloud is still empty. Finally, I recommend downsampling the data; otherwise it will use incredibly large amounts of memory and CPU after running for some frames.
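On the downsampling recommendation: a pcl::VoxelGrid filter on each incoming cloud (and periodically on the accumulated map) keeps the point count bounded. A hedged sketch (untested; the 1 cm leaf size is an arbitrary assumption to tune per scene):

```cpp
#include <pcl/filters/voxel_grid.h>

pcl::PointCloud<pcl::PointXYZRGB>::Ptr downsampled (new pcl::PointCloud<pcl::PointXYZRGB>);
pcl::VoxelGrid<pcl::PointXYZRGB> voxel;
voxel.setInputCloud(cloud_in);
voxel.setLeafSize(0.01f, 0.01f, 0.01f);  // 1 cm voxels
voxel.filter(*downsampled);              // feed this to ICP instead of cloud_in
```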

PCLVisualizer not staying open

I have the following code to estimate and show normal vectors for points in my point cloud:
int main(int argc, char* argv[])
{
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud (new pcl::PointCloud<pcl::PointXYZ>);
    if (pcl::io::loadPCDFile<pcl::PointXYZ> ("coffee_mug_1_1_1.pcd", *cloud) == -1) //* load the file
    {
        PCL_ERROR ("Couldn't read file coffee_mug_1_1_1.pcd \n");
        return (-1);
    }
    std::cout << "Loaded "
              << cloud->width * cloud->height
              << " data points from test_pcd.pcd with the following fields: "
              << std::endl;
    pcl::NormalEstimation<pcl::PointXYZ, pcl::Normal> ne;
    ne.setInputCloud (cloud);
    // Create an empty kdtree representation, and pass it to the normal estimation object.
    // Its content will be filled inside the object, based on the given input dataset (as no other search surface is given).
    pcl::search::KdTree<pcl::PointXYZ>::Ptr tree (new pcl::search::KdTree<pcl::PointXYZ> ());
    ne.setSearchMethod (tree);
    // Output datasets
    pcl::PointCloud<pcl::Normal>::Ptr cloud_normals (new pcl::PointCloud<pcl::Normal>);
    // Use all neighbors in a sphere of radius 3cm
    ne.setRadiusSearch (0.03);
    // Compute the features
    ne.compute (*cloud_normals);
    cout << "Computed normals " << cloud_normals->width * cloud_normals->height << cloud_normals->points[0] << endl;
    pcl::visualization::PCLVisualizer viewer("PCL Viewer");
    viewer.setBackgroundColor(0.0, 0.0, 0.5);
    viewer.addPointCloud(cloud);
    viewer.addPointCloudNormals<pcl::PointXYZ,pcl::Normal>(cloud, cloud_normals);
    while (!viewer.wasStopped ())
    {
        viewer.spinOnce ();
    }
}
However, when I run it the PCLVisualizer flashes on the screen and then the program terminates. I have no idea why it doesn't stay. If I use a CloudViewer just to display the point cloud (not the normals), this works fine and stays on the screen.
3 things you can check:
1. Check that you indeed read your file without error. You can put a cin.get() right after PCL_ERROR ("Couldn't read file coffee_mug_1_1_1.pcd \n"); so that it doesn't exit directly in case it couldn't read the file.
2. Give a different id to your normals if you want to visualize them, because right now both your cloud and the normals use the default id "cloud". Check the PCL example on normal visualization, e.g.:
viewer.addPointCloud(cloud, "cloud");
viewer.addPointCloudNormals<pcl::PointXYZ,pcl::Normal>(cloud, cloud_normals, 10, 0.05, "normals");
3. Make sure you get valid values for your normals.
If none of those works, could you maybe upload your coffee mug pcd somewhere and provide a link?
I have found the problem I had here: I was not linking against the same Boost libs I was compiling against, because I had /usr/lib in my link library path.