I'm trying to write a program that reads and writes PCD files without PCL (the Point Cloud Library).
I can read the position of each point without a problem,
but the RGB value is packed into a uint32_t stored as a float, and I do not know how to read this format and translate it into separate R, G, and B values.
# .PCD v0.7 - Point Cloud Data file format
VERSION 0.7
FIELDS x y z rgb
SIZE 4 4 4 4
TYPE F F F F
COUNT 1 1 1 1
WIDTH 100
HEIGHT 1
VIEWPOINT 0 0 0 1 0 0 0
POINTS 100
DATA ascii
-0.031568773 -0.99000001 0.99000013 2.3418052e-038
0.031568673 -0.98999995 0.99000013 2.3418052e-038
-0.031568974 -0.54999995 0.77000004 2.3418052e-038
0.031568889 -0.54999995 0.77000004 2.3418052e-038
How can I convert the last value (2.3418052e-038) to an RGB value?
Is there a way to do this without the Point Cloud Library?
Thank you.
Please read the PLY format description on Wikipedia:
https://en.wikipedia.org/wiki/PLY_(file_format)
Here is a sample snippet to write a PLY file:
std::string fname = "sample.ply";
std::ofstream out(fname);
out << "ply\n";
out << "format ascii 1.0\n";
out << "comment\n";
out << "element vertex " << cloud5->points.size() << "\n";
out << "property float" << sizeof(float) * 8 << " x\n";
out << "property float" << sizeof(float) * 8 << " y\n";
out << "property float" << sizeof(float) * 8 << " z\n";
out << "property uchar red\n";
out << "property uchar green\n";
out << "property uchar blue\n";
out << "end_header\n";
out.close();
out.open(fname, std::ios_base::app);
for (size_t i = 0; i < cloud5->points.size(); ++i)
{
out << cloud5->points[i].x << " " << cloud5->points[i].y << " " << cloud5->points[i].z << " " << (int)cloud5->points[i].r
<< " " << (int)cloud5->points[i].g << " " << (int)cloud5->points[i].b << "\n";
}
out.close();
I am attempting to read Parquet data from a binary stream (via API posts). For example: I have a rather large Parquet file on the other side of a REST API and need to fetch parts of the file. I have been attempting to follow the file spec here: https://github.com/apache/parquet-format; however, the parsing seems to be failing (or I am misunderstanding part of it).
For my test, I have moved a parquet file onto my local system and am reading in binary data from the file using ifstream. My steps are as follows:
Read in magic number from header
Read in magic number from footer
Read in FileMetaData length
Read in FileMetaData (from bottom of file)
Convert stream to FileMetaData Type using:
std::shared_ptr<parquet::FileMetaData> _metadata = parquet::FileMetaData::Make(metadataBuffer.data(), &metadataLength);
Read in RowGroup(0) and RowGroup(1) file_offset and total_byte_size from the FileMetaData like this:
_metadata->RowGroup(x)->file_offset();
_metadata->RowGroup(x)->total_byte_size();
After storing this data, I proceed to read in each RowGroup from the file using ifstream again. My start position is the file_offset from the beginning of the file.
Once my RowGroup data is read in to a vector of objects, I attempt to convert the buffered data into RowGroupMetaData
std::shared_ptr<parquet::RowGroupMetaData> _rowGroupMetaData = parquet::RowGroupMetaData::Make(rowGroupData[x].rowGroupBuffer.data(), rowGroupData[x].schema);
This is where I get stuck. When I try to access parts of the _rowGroupMetaData, I am getting junk back. It seems I must be skipping a step or overlooking part of the file spec.
I noticed that there is data between the magic number PAR1 at the top of the file and the file_offset of RowGroup(0). The magic number is 4 bytes long, but RowGroup(0)'s file_offset is 113. I am not sure what the data between bytes 4 and 113 is, and I cannot find information on it in the spec.
My parquet file is rather simple. 2 RowGroups with 2 columns. Total of 5 rows across both RowGroups.
Code:
ifstream inFile("parquet-arrow-example.parquet", std::ios::binary | std::ios::ate);
std::streamsize fileSize = inFile.tellg();
inFile.seekg(0, std::ios::beg);
std::vector<char> headBuffer;
std::vector<char> tailBuffer;
std::vector<uint8_t> metadataBuffer;
headBuffer.resize(4);
tailBuffer.resize(4);
struct RowGroupData {
int groupId;
int64_t byteLength;
int64_t offset;
const parquet::SchemaDescriptor* schema;
vector<uint8_t> rowGroupBuffer;
};
uint32_t metadataLength = 0;
string header;
string footer;
//Header
inFile.read((char*)&headBuffer[0], headBuffer.size()); //PAR1
header = string(headBuffer.begin(), headBuffer.end());
cout << header << endl;
//Footer
inFile.seekg(-4, std::ios::end);
inFile.read((char*)&tailBuffer[0], tailBuffer.size()); //PAR1
footer = string(tailBuffer.begin(), tailBuffer.end());
cout << footer << endl;
//Metadata Size
inFile.seekg(-8, std::ios::end);
inFile.read((char*)&metadataLength, 4);
cout << "Metadata Length: " << metadataLength << endl;
int len = -8 - static_cast<int>(metadataLength); // cast: mixing int and uint32_t here would force unsigned arithmetic
//Get MetaData
inFile.seekg(len, std::ios::end);
metadataBuffer.resize(metadataLength);
inFile.read((char*)&metadataBuffer[0], metadataBuffer.size());
cout << string(metadataBuffer.begin(), metadataBuffer.end()) << endl;
std::shared_ptr<parquet::FileMetaData> _metadata = parquet::FileMetaData::Make(metadataBuffer.data(), &metadataLength);
cout << "Num Rows: " << _metadata->num_rows() << endl;
cout << "Num Columns: " << _metadata->num_columns() << endl;
cout << "Num RowGroups: " << _metadata->num_row_groups() << endl;
vector<RowGroupData> rowGroupData;
//int seeqPos = 4;
for (int x = 0; x < _metadata->num_row_groups(); x++) {
cout << "RowGroup " << x << " Byte Size: " << _metadata->RowGroup(x)->total_byte_size() << endl;
cout << "RowGroup " << x << " File Offset: " << _metadata->RowGroup(x)->file_offset() << endl;
cout << "RowGroup " << x << " Column 0 File Offset: " << _metadata->RowGroup(x)->ColumnChunk(0)->file_offset() << endl;
cout << "RowGroup " << x << " Column 0 Byte Size: " << _metadata->RowGroup(x)->ColumnChunk(0)->total_compressed_size() << endl;
cout << "RowGroup " << x << " Column 1 File Offset: " << _metadata->RowGroup(x)->ColumnChunk(1)->file_offset() << endl;
cout << "RowGroup " << x << " Column 1 Byte Size: " << _metadata->RowGroup(x)->ColumnChunk(1)->total_compressed_size() << endl;
RowGroupData rgData;
rgData.groupId = x;
rgData.byteLength = _metadata->RowGroup(x)->total_byte_size();
rgData.offset = _metadata->RowGroup(x)->file_offset();
rgData.schema = _metadata->RowGroup(x)->schema();
rgData.rowGroupBuffer.resize(rgData.byteLength);
//Store rowGroup Length
//Store rowGroup Data
inFile.seekg(rgData.offset, std::ios::beg);
inFile.read((char*)&rgData.rowGroupBuffer[0], rgData.rowGroupBuffer.size());
rowGroupData.push_back(rgData);
//seeqPos = seeqPos + rgData.byteLength;
}
cout << endl;
for (int x = 0; x < rowGroupData.size(); x++) {
vector<uint8_t> rgBuffer;
//rgBuffer = rowGroupData[x].rowGroupBuffer;
cout << "RowGroupId: " << rowGroupData[x].groupId << endl;
cout << "RowGroupData: " << string(rowGroupData[x].rowGroupBuffer.begin(), rowGroupData[x].rowGroupBuffer.end()) << endl;
std::shared_ptr<parquet::RowGroupMetaData> _rowGroupMetaData = parquet::RowGroupMetaData::Make(rowGroupData[x].rowGroupBuffer.data(), rowGroupData[x].schema);
cout << "RowGroup Rows: " << _rowGroupMetaData->num_rows() << endl;
cout << "Byte Size: " << _rowGroupMetaData->total_byte_size() << endl;
}
The data between the file header and the file_offset is the column_chunk metadata for the first column.
The Parquet spec is a little confusing because there are two different file offsets: the one on the RowGroup is an offset to the first page of data in the row group, while the column-chunk file_offset points to the column chunk's metadata.
To my knowledge the first offset is mostly used for splitting files; I think most other readers use the latter offset for parsing columns.
Also note that in C++, at least, file_offset was written out incorrectly prior to the release of Arrow 6.0 (it pointed to the same byte offset as the column chunk offset did).
Last, Parquet is a non-trivial format and it is easy to introduce subtle bugs. I'd strongly recommend using a standard implementation that has been battle-tested rather than creating your own. If something is missing from the API, it might be simpler to contribute it to an existing implementation instead of building everything from scratch.
I'm fairly new to the point-cloud world, and I'm not very experienced in C++.
I need to read .las files and process them using the PCL library. This is a sample file from the dataset that I need to read. I followed this YouTube video. However, since the file I'm trying to read is version 1.3, I followed the corresponding spec to define the header fields, and I used 'Point Data Record Format 3', which is the record format named in the file header.
These are my definitions for the header and the point record format:
#pragma once
#include <string>
#include <vector>
struct float4
{
float x, y, z, intensity;
};
class PointCloud
{
public:
uint32_t getVertsCount();
float4* getVertsData();
template<typename PointT>
typename pcl::PointCloud<PointT>::Ptr read(const std::string& path);//void read(const std::string &path);
private:
std::vector<float4> verts;
#pragma pack(push, 1)
struct Header
{
char magic[4];
uint16_t fileSourceID;
uint16_t globalEncoding;
uint32_t guidData1;
uint16_t guidData2;
uint16_t guidData3;
uint8_t guidData4[8];
uint8_t versionMaj, versionMin;
char systemIdentifier[32];
char genSoftware[32];
uint16_t creationDay, creationYear;
uint16_t headerSize;
uint32_t pointDataOffset;
uint32_t numVarLenRecords;
uint8_t pointDataRecordFormat;
uint16_t pointDataRecordLen;
uint32_t numberOfPoints;
uint32_t numPointsByReturn[5];
double scaleX, scaleY, scaleZ;
double offsetX, offsetY, offsetZ;
double maxX, minX, maxY, minY, maxZ, minZ;
uint64_t waveform;
};
//#pragma pack(1)
struct PointRecord3
{
int32_t x, y, z;      // signed per the LAS spec
uint16_t intensity;
uint8_t flags;
uint8_t classification;
int8_t scanAngleRank; // signed, -90 to +90 per the LAS spec
uint8_t userData;
uint16_t pointSourceId;
double gpsTime;
uint16_t red;
uint16_t green;
uint16_t blue;
};
#pragma pack(pop)
};
I used the following code to read the point data, but I failed to get correct points:
template<typename PointT>
typename pcl::PointCloud<PointT>::Ptr PointCloud::read(const string& path)
{
ifstream inf(path, ios::binary);
typename pcl::PointCloud<PointT>::Ptr lasCloud(new pcl::PointCloud<PointT>);
if (inf.is_open())
{
Header header;
inf.read((char*)&header, sizeof(header));
cout << "Signature: " << header.magic << endl;
cout << "Source ID: " << int(header.fileSourceID) << endl;
cout << "Global Encoding: " << int(header.globalEncoding) << endl;
cout << "Guid 1: " << int(header.guidData1) << endl;
cout << "Guid 2: " << int(header.guidData2) << endl;
cout << "Guid 3: " << int(header.guidData3) << endl;
cout << "Guid 4: " << header.guidData4 << endl;
cout << (int)header.versionMaj << '.' << (int)header.versionMin << endl;
cout << "Sys Identifier: " << header.systemIdentifier << endl;
cout << "Gen Software: " << header.genSoftware << endl;
cout << "Creation Day: " << header.creationDay << endl;
cout << "Creation Year: " << header.creationYear << endl;
cout << header.headerSize << " == " << sizeof(header) << endl;
cout << "Point Data Offset: " << header.pointDataOffset << endl;
cout << "Number of Variable Len Records: " << header.numVarLenRecords << endl;
cout << "Point Data Record Format: " << (int)header.pointDataRecordFormat << endl; // cast: uint8_t streams as a character
cout << "Point Data Record Len: " << header.pointDataRecordLen << endl;
cout << "Number of Points: " << header.numberOfPoints << endl;
cout << "Number of Points by Return: "; for (uint32_t n : header.numPointsByReturn) cout << n << ' '; cout << endl; // arrays stream as a pointer otherwise
cout << "Scales: " << header.scaleX << ", " << header.scaleY << ", " << header.scaleZ << endl;
cout << "Offsets: " << header.offsetX << ", " << header.offsetY << ", " << header.offsetZ << endl;
cout << "Xmin = " << header.minX << ", Ymin = " << header.minY << ", Zmin = " << header.minZ << endl;
cout << "Xmax = " << header.maxX << ", Ymax = " << header.maxY << ", Zmax = " << header.maxZ << endl;
cout << "Waveform: "<<header.waveform << endl;
assert(header.versionMaj == 1 && header.versionMin == 3);
//assert(header.headerSize == sizeof(header));
assert(header.pointDataRecordFormat == 3);
//inf.seekg(header.pointDataOffset);
inf.seekg(sizeof(header));
//inf.seekg(header.pointDataOffset+sizeof(header.waveform));
for (uint32_t i = 0; i < header.numberOfPoints; i++)
{
//PointRecord1* points = new PointRecord1[header.numberOfPoints];
PointRecord3 point;
//inf.read((char*)(points + i), sizeof(PointRecord1));
//inf.read((char*)&point, sizeof(PointRecord1));
inf.read((char*)&point, sizeof(PointRecord3));
PointT cloudPoint;
cloudPoint.x = (float)(point.x * header.scaleX + header.offsetX); // add the offset before narrowing to float
cloudPoint.y = (float)(point.y * header.scaleY + header.offsetY);
cloudPoint.z = (float)(point.z * header.scaleZ + header.offsetZ);
cloudPoint.intensity = (float)(point.intensity) / 65535.0f;
lasCloud->points.push_back(cloudPoint);
}
if (!inf.good())
throw runtime_error("Reading went wrong!");
}
else
{
throw runtime_error("Can't find any!");
}
lasCloud->width = lasCloud->points.size();
lasCloud->height = 1;
lasCloud->is_dense = true;
std::cout << "Cloud size = " << lasCloud->points.size() << endl;
return lasCloud;
}
int main (int argc, char** argv)
{
std::cout << "starting environment" << std::endl;
pcl::visualization::PCLVisualizer::Ptr viewer (new pcl::visualization::PCLVisualizer ("3D Viewer"));
CameraAngle setAngle = FPS; //XY, FPS, Side, TopDown
initCamera(setAngle, viewer);
pcl::PointCloud<pcl::PointXYZI>::Ptr inputCloudI; //
PointCloud pcd;
inputCloudI=pcd.read<pcl::PointXYZI>("C:/Users/hedey/OneDrive/Documents/Research_papers/STDF/10_4231_MFQF-Q141/I-65/LiDAR/RoadSurface/NB/20180524_I65_NB_RoadSurface_1_50.5.las");
std::cout << "Cloud size = " << inputCloudI->points.size() << endl;
renderPointCloud(viewer, inputCloudI, "lasCloud");
while (!viewer->wasStopped())
{
viewer->spinOnce();
}
}
There is a problem that I noticed: the header size field says 227 bytes, but there is an 8-byte field named 'Start of Waveform Data Packet Record' at the end of the 1.3 header, which, if included in the header definition, makes the struct 235 bytes. Also, the pointDataOffset field, which the YouTube video used to seek to the point data, points to byte 227. When I used it to seek to the point data, I got unreasonable point values.
My target is to process this point cloud and display it using the pcl cloud, but I'm failing to read the points correctly.
I have no idea about Lidar and *.las files, but I've noticed that the input file was created with libLAS
> file 20180524_I65_NB_RoadSurface_1_50.5.las
20180524_I65_NB_RoadSurface_1_50.5.las: LIDAR point data records, version 1.3, SYSID libLAS, Generating Software libLAS 1.6.0
So, why don't you use libLAS to read the data in? https://liblas.org/
libLAS comes with convenient CLI utility programs to handle las files, e.g.:
lasinfo 20180524_I65_NB_RoadSurface_1_50.5.las
---------------------------------------------------------
Header Summary
---------------------------------------------------------
Version: 1.3
Source ID: 0
Reserved: 0
Project ID/GUID: '00000000-0000-0000-0000-000000000000'
System ID: 'libLAS'
Generating Software: 'libLAS 1.6.0'
File Creation Day/Year: 144/2018
Header Byte Size 227
Data Offset: 227
Header Padding: 0
Number Var. Length Records: None
Point Data Format: 3
Number of Point Records: 22017565
Compressed: False
Number of Points by Return: 0 0 0 0 0
Scale Factor X Y Z: 0.00100000000000 0.00100000000000 0.00100000000000
Offset X Y Z: 590284.000 4339456.000 157.000
Min X Y Z: 589879.772 4338728.975 149.667
Max X Y Z: 590334.248 4339568.021 178.397
Spatial Reference: None
---------------------------------------------------------
Schema Summary
---------------------------------------------------------
Point Format ID: 3
Number of dimensions: 16
Custom schema?: false
Size in bytes: 34
Dimensions
---------------------------------------------------------
'X' -- size: 32 offset: 0
'Y' -- size: 32 offset: 4
'Z' -- size: 32 offset: 8
'Intensity' -- size: 16 offset: 12
'Return Number' -- size: 3 offset: 14
'Number of Returns' -- size: 3 offset: 14
'Scan Direction' -- size: 1 offset: 14
'Flightline Edge' -- size: 1 offset: 14
'Classification' -- size: 8 offset: 15
'Scan Angle Rank' -- size: 8 offset: 16
'User Data' -- size: 8 offset: 17
'Point Source ID' -- size: 16 offset: 18
'Time' -- size: 64 offset: 20
'Red' -- size: 16 offset: 28
'Green' -- size: 16 offset: 30
'Blue' -- size: 16 offset: 32
---------------------------------------------------------
Point Inspection Summary
---------------------------------------------------------
Header Point Count: 22017565
Actual Point Count: 22017565
Minimum and Maximum Attributes (min,max)
---------------------------------------------------------
Min X, Y, Z: 600191.027, 4313816.564, 148.621
Max X, Y, Z: 600212.594, 4314678.007, 156.632
Bounding Box: 600191.027, 4313816.564, 600212.594, 4314678.007
Time: 55449082.421688, 55488872.904376
Return Number: 0, 0
Return Count: 0, 0
Flightline Edge: 0, 0
Intensity: 0, 255
Scan Direction Flag: 0, 0
Scan Angle Rank: 0, 0
Classification: 1, 3
Point Source Id: 0, 31
User Data: 0, 0
Minimum Color (RGB): 0 0 0
Maximum Color (RGB): 0 0 0
Number of Points by Return
---------------------------------------------------------
(1) 22017565
Number of Returns by Pulse
---------------------------------------------------------
(0) 22017565
Point Classifications
---------------------------------------------------------
7187055 Unclassified (1)
8128678 Ground (2)
6701832 Low Vegetation (3)
-------------------------------------------------------
0 withheld
0 keypoint
0 synthetic
-------------------------------------------------------
and
las2txt 20180524_I65_NB_RoadSurface_1_50.5.las qq.txt && head qq.txt
600209.243,4313837.086,155.155
600209.342,4313839.620,155.191
600209.232,4313836.806,155.154
600209.338,4313839.516,155.197
600209.221,4313836.523,155.165
600209.333,4313839.398,155.194
600209.206,4313836.177,155.158
600209.328,4313839.285,155.200
600209.189,4313835.778,155.145
600209.322,4313839.152,155.193
This means that the file is OK and the library works, and you'll save yourself a lot of time by learning basic library usage rather than trying to reimplement it yourself (think of the various data formats, error handling, testing, getting feedback when there's trouble, and your own time).
I have a point cloud and I want to get its RGB values. How can I do that?
To make my question clearer, please see the code.
// Load the first input file into a PointCloud<T> with an appropriate type :
pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud1 (new pcl::PointCloud<pcl::PointXYZRGB>);
if (pcl::io::loadPCDFile<pcl::PointXYZRGB> ("../data/station1.pcd", *cloud1) == -1)
{
std::cout << "Error reading PCD file !!!" << std::endl;
exit(-1);
}
I want to get each value separately:
std::cout << " x = " << cloud1->points[11].x << std::endl;
std::cout << " y = " << cloud1->points[11].y << std::endl;
std::cout << " z = " << cloud1->points[11].z << std::endl;
std::cout << " r = " << cloud1->points[11].r << std::endl;
std::cout << " g = " << cloud1->points[11].g << std::endl;
std::cout << " b = " << cloud1->points[11].b << std::endl;
But as a result I get something like that :
x = 2.33672
y = 3.8102
z = 8.86153
r = �
g = w
b = �
From the point cloud docs:
A point structure representing Euclidean xyz coordinates, and the RGB color.
Due to historical reasons (PCL was first developed as a ROS package), the RGB information is packed into an integer and casted to a float. This is something we wish to remove in the near future, but in the meantime, the following code snippet should help you pack and unpack RGB colors in your PointXYZRGB structure:
// pack r/g/b into rgb
uint8_t r = 255, g = 0, b = 0; // Example: Red color
uint32_t rgb = ((uint32_t)r << 16 | (uint32_t)g << 8 | (uint32_t)b);
p.rgb = *reinterpret_cast<float*>(&rgb);
To unpack the data into separate values, use:
PointXYZRGB p;
// unpack rgb into r/g/b
uint32_t rgb = *reinterpret_cast<uint32_t*>(&p.rgb);
uint8_t r = (rgb >> 16) & 0x0000ff;
uint8_t g = (rgb >> 8) & 0x0000ff;
uint8_t b = (rgb) & 0x0000ff;
Alternatively, from 1.1.0 onwards, you can use p.r, p.g, and p.b directly.
Definition at line 559 of file point_types.hpp.
I am doing some image processing using ITK and then using VTK to write the results in .png format; however, the output image is always black.
Currently, I am converting itk::Image to vtkImageData using itk::ImageToVTKImageFilter (typedef'ed to ITKtoVTKFilterType in my code).
ITKtoVTKFilterType::Pointer itk2vtkGray = ITKtoVTKFilterType::New();
itk2vtkGray->SetInput(grayBinary); //grayBinary is of type itk::Image<unsigned short, 2>
itk2vtkGray->Update();
vtkSmartPointer<vtkImageData> grayVTK = vtkSmartPointer<vtkImageData>::New();
grayVTK->SetExtent(extent);
grayVTK->SetSpacing(m_spacing);
grayVTK->SetScalarTypeToUnsignedShort();
grayVTK->SetNumberOfScalarComponents(1);
grayVTK->AllocateScalars();
grayVTK->DeepCopy(static_cast<vtkImageData*>(itk2vtkGray->GetOutput()));
//grayVTK = itk2vtkGray->GetOutput();
I have even confirmed that my VTK ImageData contains values of either 255 or 0 using the following code.
int *dims = grayVTK->GetDimensions();
std::cout << "Dims: " << " x: " << dims[0] << " y: " << dims[1] << " z: " << dims[2] << std::endl;
std::cout << "Number of points: " << grayVTK->GetNumberOfPoints() << std::endl;
std::cout << "Number of cells: " << grayVTK->GetNumberOfCells() << std::endl;
for (int y = 0; y < dims[1]; y++)
{
for (int x = 0; x < dims[0]; x++)
{
unsigned short *pixel = static_cast<unsigned short*>(grayVTK->GetScalarPointer(x,y,0));
std::cout << "PIXEL LOC/VAL "<< y*dims[0] + x << " " << pixel[0] <<std::endl;
}
std::cout << std::endl;
}
I then go on to do an ImageCast to ensure the type of the data is unsigned short.
vtkSmartPointer<vtkImageCast> cast2 = vtkSmartPointer<vtkImageCast>::New();
cast2->SetInput(grayVTK);
cast2->SetOutputScalarTypeToUnsignedShort();
cast2->ClampOverflowOn();
cast2->Update();
Then finally I use vtkPNGwriter to output the .png files. Notice that I have tried to output both the actual vtkImageData as well as output from the ImageCastFilter.
vtkSmartPointer<vtkPNGWriter> writer =
vtkSmartPointer<vtkPNGWriter>::New();
writer->SetFileName(filename.toStdString().c_str());
writer->SetInputConnection(cast2->GetOutputPort());
//writer->SetInput(grayVTK); I have tried to method as well but to no success
writer->Write();
However, the .png output is always black. Does anyone know what I am doing wrong?
For future reference: it seems that many PNG viewers do not display 16-bit data well. Hence the cast I was doing to unsigned short at the end should instead have been to unsigned char.
vtkSmartPointer<vtkImageCast> cast2 = vtkSmartPointer<vtkImageCast>::New();
cast2->SetInput(grayVTK);
cast2->SetOutputScalarTypeToUnsignedChar();
cast2->ClampOverflowOn();
cast2->Update();
Hi, I have the following piece of code:
void ViewPlane::colourPixel(const int u, const int v, const Colour c){
int step = img->widthStep;
cout << "U: " << u << ", V: " << v << ", step: " << step << "\n";
cout << "Colour: " << c.r << ", " << c.g << ", " << c.b << "\n";
cout << "Data: " << v*step+u*3 << ", " << v*step+u*3+1 << ", " << v*step+u*3+2 << "\n";
cout << "Data: " << data[v*step+u*3] << ", " << data[v*step+u*3+1] << ", " << data[v*step+u*3+2] << "\n";
//swap colours round as opencv default image is bgr, not rgb
data[v*step+u*3] = c.b;
cout << "done it\n";
data[v*step+u*3+1] = c.g;
cout << "done it\n";
data[v*step+u*3+2] = c.r;
cout <<"done it\n";
}
img is an IplImage pointer, and data is
data = (uchar *)img->imageData;
I am looping through lots of values of u and v (image coordinates) and assigning a colour to each pixel. However, the code just stops halfway through a pixel when it reaches u = 148.
Printing out the values reveals that the green channel at this point originally contains '*', and it is the only one up to that point that does so. Here is a copy of the output:
....
U: 146, V: 0, step: -1219934912
Colour: 0, 0, 0
Data: 438, 439, 440
Data:
done it
done it
done it
U: 147, V: 0, step: -1219934912
Colour: 0, 0, 0
Data: 441, 442, 443
Data:
done it
done it
done it
U: 148, V: 0, step: -1219934912
Colour: 0, 0, 0
Data: 444, 445, 446
Data: �, *, �
done it
So my question is: what does this star mean? And why is it stopping me from assigning data there?
EDIT: The image size is 400x400 px.
PS: I just tried commenting out the green-channel assignment, and the error happens again for the red channel further on. This time it held '#' before assignment, so I think the stored character has no bearing on what's going wrong. In that case, what on earth? I'm not out of bounds of the array, and the value I'm inserting is an integer 0.
Your program's output shows step being -1219934912, which makes it look like img is a stray (uninitialized or dangling) pointer. It's rather strange that each row of the image would take ~1 GB.