Iterating through the pixels of a .bmp in C++

NOTE: I changed the title from .png to .bmp due to a comment suggesting bitmaps instead.
I'm making a simple 2D grid-based CMD game, and I want to make .png levels and turn them into level data for my game.
So basically all I want to know is: how would I iterate through the pixels of a .bmp to parse it into some level data?
This is how I did it with a .txt
int x = 0;
int y = 0;
std::ifstream file(filename);
std::string str;
while (std::getline(file, str))
{
    x++;    // row counter
    y = 0;  // restart the column counter for each new line
    for (char& c : str) {
        y++;
        updateTile(coordinate(x), coordinate(y), c);
    }
}
I couldn't find any helpful threads, so I posted this new one. I hope I'm not breaking any rules.

I don't know if you still want to read PNG files, but if you do, check out this decoder:
http://lodev.org/lodepng/
It loads a PNG file into a vector where every 4 chars (bytes) make up one pixel (RGBA format). So by reading 4 chars at a time, you get one pixel.
I haven't used it before, but it looks easy to use.
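For example, a rough sketch of how that could feed your level loader (this assumes the lodepng C++ interface; colourToTileChar is a hypothetical mapping you would write yourself, and updateTile/coordinate are the names from your own code):

#include <vector>
#include "lodepng.h"

std::vector<unsigned char> image; // raw pixels, 4 bytes per pixel: R, G, B, A
unsigned width, height;
unsigned error = lodepng::decode(image, width, height, "level.png");
if (!error) {
    for (unsigned y = 0; y < height; ++y) {
        for (unsigned x = 0; x < width; ++x) {
            // index of the red byte of pixel (x, y); +1, +2, +3 are G, B, A
            unsigned i = 4 * (y * width + x);
            unsigned char r = image[i];
            unsigned char g = image[i + 1];
            unsigned char b = image[i + 2];
            // map the colour to a tile however you like, e.g.:
            // updateTile(coordinate(x), coordinate(y), colourToTileChar(r, g, b));
        }
    }
}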

Related

How to increase numbers in a file?

I need to read a file in parts (for example, 4 bytes at a time), increment each number in the file by one, and then write it back.
This part only fills the file with 1s; how do I increase each number by 1?
void Prepare()
{
    //ifstream fileRead("\\FILE", ios::in | ios::binary);
    ofstream fileOut("\\FILE.bin", ios::out | ios::binary); // note: "\F" is not a valid escape, so the backslash must be doubled
    int count = 10485760;
    for (int i = 0; i < count - 1; i++)
    {
        fileOut << 1; // operator<< writes the text "1" (a single character), not a 4-byte integer
    }
    fileOut.close();
}
If I understand your question, you need to read the file then write it out, changing the data. You can't really do it the way you've started.
There are two basic ways to do this. You can read the entire file into memory, then manipulate the memory, close the file, open it again for output this time (truncating it) and write it back out. This is easiest, but I don't think it's the approach you're looking for.
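For what it's worth, a minimal sketch of that first approach (assuming the whole file fits comfortably in memory):

#include <fstream>
#include <iterator>
#include <vector>

// Read the whole file into memory, change it there, then write it back out.
std::vector<char> buffer;
{
    std::ifstream in("file.dat", std::ios::binary);
    buffer.assign(std::istreambuf_iterator<char>(in),
                  std::istreambuf_iterator<char>());
}
// ... walk the buffer 4 bytes at a time and increment each number here ...
std::ofstream out("file.dat", std::ios::binary | std::ios::trunc);
out.write(buffer.data(), buffer.size());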
The other choice is to manipulate the file in place. That's trickier, but not that hard. You need to read about random access I/O (input/output). If you google for c++ random access file you'll get some good hits, but I'll show you a little bit.
// Open the file for both reading and writing, in binary mode.
std::fstream file{"file.dat", std::ios::in | std::ios::out | std::ios::binary};
// Jump to a particular location in the file. The beginning is 0.
file.seekg(128);
// Read 4 bytes
char bytes[4];
file.read(bytes, 4);
// Manipulate them (more below)
int number = bytesToInt(bytes);
++number;
intToBytes(number, bytes);
// Seek the write position back to the same spot and overwrite those 4 bytes
file.seekp(128);
file.write(bytes, 4);
So the only remaining trick is that you have to convert the bytes to a number and then back into bytes. Due to endianness, it's not safe to read directly into the number. You also need to know the endianness of the data in the file. That's a separate topic you can look up if you're not already familiar with it.
(Specifically, you need to implement those two methods after verifying how the data is stored in your file.)
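For example, assuming the file stores its numbers as 32-bit little-endian integers, the two helpers could look something like this (swap the byte order if your data turns out to be big-endian):

#include <cstdint>

int bytesToInt(const char bytes[4])
{
    const unsigned char* b = reinterpret_cast<const unsigned char*>(bytes);
    std::uint32_t v = std::uint32_t(b[0])
                    | std::uint32_t(b[1]) << 8
                    | std::uint32_t(b[2]) << 16
                    | std::uint32_t(b[3]) << 24;
    return static_cast<int>(v);
}

void intToBytes(int number, char bytes[4])
{
    std::uint32_t v = static_cast<std::uint32_t>(number);
    bytes[0] = static_cast<char>(v & 0xFF);
    bytes[1] = static_cast<char>((v >> 8) & 0xFF);
    bytes[2] = static_cast<char>((v >> 16) & 0xFF);
    bytes[3] = static_cast<char>((v >> 24) & 0xFF);
}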
There may be other ways to do this, but the key to this method is the random access file.

Make sense of PGM data and load PGM data into vector/array

Let me preface this by saying this is my first time working with the PGM file format in C++, so I have multiple questions.
I'm currently using P2 PGM files, but I read that P5 is much simpler to work with. How can I convert a P2 to a P5?
With the P2, I am trying to take the image and transfer the pixel values to a 2D vector or array or anything remotely index-able. I'm using a very basic image (white background with a black rectangle in the foreground). Here's my code so far:
fstream img;
img.open(PATH_NAME, ios::in | ios::binary | ios::out);

string line;
getline(img, line); // header part that says "P2"

// stores column and row values
// (note: the PGM header gives the width, i.e. columns, first and then the height)
getline(img, line);
istringstream iss(line);
string row_string, col_string;
iss >> row_string;
iss >> col_string;
int original_rows = stoi(row_string);
int original_cols = stoi(col_string);

getline(img, line); // collects maxval

// now I am collecting actual image/pixel data
getline(img, line);
cout << line;
The problem with that last part is that when I cout << line; , this is the output:
\377\377\377\377\377\377\377\377\377\377\377\377\
and on and on and on for much longer than a single line should be (there are 162 columns in my sample photo and this code outputs much more than 162 377s). The maxval is 255, so I'm not really sure what the problem is.
As for actually transferring these values into something indexable, how do I go about that? So far, my idea is to take each line as a string, look for the '\' characters, collect the individual values that way, convert each one to an int, and store them in an array or vector. That might be easier said than done, so I am open to more efficient options.
The problem is that you are confusing text I/O with binary I/O. As I understand it, the image data in a P5 file is held as binary byte values. Therefore you should be reading it into a byte vector (or similar) using read, instead of reading it with getline, which is for text I/O.
Like this
vector<unsigned char> raster(original_rows * original_cols);
img.read(reinterpret_cast<char*>(raster.data()), original_rows * original_cols);
raster is your something indexable.
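Indexing a pixel is then just a matter of flattening the coordinates yourself, for example (assuming original_cols is the image width, i.e. the number of samples per row):

// pixel at row r, column c (row-major layout, one byte per pixel in a P5 file)
unsigned char pixel = raster[r * original_cols + c];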

Binary file not holding data properly

I'm currently trying to replace a text-based file in my application with a binary one. I'm just doing some early tests, so the code isn't exactly safe, but I'm having problems with the data.
When trying to read the data back out, it gets about halfway before it starts coming back with incorrect results.
I'm creating the file in C++ and my client application is in C#. I think the problem is in my C++ (which I haven't used very much).
Where the problem is at the moment: I have a vector of a struct called DoubleVector3, which consists of 3 doubles.
struct DoubleVector3 {
    double x, y, z;
    DoubleVector3(std::string line);
};
I'm currently writing the variables individually to the file:
void ObjElement::WriteToFile(std::string file) {
    std::ofstream fileStream;
    fileStream.open(file); //, ios::out | ios::binary);
    // ^^ problem was this line. It should be
    // fileStream.open(file, std::ios_base::out | std::ios_base::binary);
    fileStream << this->name << '\0';
    fileStream << this->materialName << '\0';
    int size = this->vertices.size();
    fileStream.write((char*)&size, sizeof(size));
    // I have another int written here
    for (int i = 0; i < this->vertices.size(); i++) {
        fileStream.write((char*)&this->vertices[i].x, 8);
        fileStream.write((char*)&this->vertices[i].y, 8);
        fileStream.write((char*)&this->vertices[i].z, 8);
    }
    fileStream.close();
}
When I read the file in C#, the first 6 sets of 3 doubles are all correct, but then I start getting 0s and minus infinities.
Am I doing anything obviously wrong in my WriteToFile code?
I have the file uploaded on mega if anyone needs to look at it
https://mega.co.nz/#!XEpHTSYR!87ihtCfnGXJJNn13iE6GIpeRhlhbabQHFfN88kr_BAk
(I'm writing the name and material first, then the number of vertices, before the actual list of vertices.)
Small side question - should I delimit these doubles or just write them one after the other?
To store binary data in a stream, you must add std::ios_base::binary to the stream's flags when opening it. Without this, the stream is opened in text mode and line-ending conversions can happen.
On Windows, line-ending conversions mean inserting a byte 0x0D (ASCII for carriage-return) before each 0x0A byte (ASCII for line-feed). Needless to say, this corrupts binary data.
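So the important part is the corrected open call already shown in the question. A sketch of what the write loop then looks like, reusing the question's names (and, regarding the side question: fixed-size binary fields need no delimiter, so writing them back to back is fine):

std::ofstream fileStream(file, std::ios_base::out | std::ios_base::binary);
for (std::size_t i = 0; i < vertices.size(); ++i) {
    // sizeof(double) is clearer than the hard-coded 8
    fileStream.write(reinterpret_cast<const char*>(&vertices[i].x), sizeof(double));
    fileStream.write(reinterpret_cast<const char*>(&vertices[i].y), sizeof(double));
    fileStream.write(reinterpret_cast<const char*>(&vertices[i].z), sizeof(double));
}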

String being duplicated in vector array

Edit: Changed the title to better reflect the current issue.
Right, I know now where the source of the issue lies: it's in the text-splitting part of the function. I remember now what I did; I changed the splitting code because the tutorial's version was returning an error for me.
for (const char *c = text; *c; c++)
{
    if (*c == '\n') {
        string line;
        for (const char *n = start_line; n < c; n++) line.append(1, *n);
        lines.push_back(line);
        start_line = c + 1;
    }
}
if (start_line)
{
    string line;
    for (const char *n = start_line; n < c; n++) line.append(1, *n);
    lines.push_back(line);
}
The 'c' was coming back as undeclared, and there's no mention of any other c, so I guess it refers to the pointer in the for loop above. Bringing the "if (start_line)" into the first code block, though, kept returning each character of the text to me instead of just the whole thing.
So I changed the code to the following:
for (const char *c = text; *c; c++)
{
    if (*c == '\n')
    {
        string line;
        for (const char *n = start_line; n < c; n++) line.append(1, *n);
        lines.push_back(line);
        start_line = c + 1;
        if (start_line)
        {
            string line;
            for (const char *n = start_line; n < c; n++) line.append(1, *n);
            lines.push_back(line);
        }
    }
    else if (*c == *start_line)
    {
        lines.push_back(text);
    }
}
I'm pretty sure that the "else if (*c == *start_line)" comparison is what's causing me the issue. I'm unsure, though, what to replace it with. I guess that because I'm not using any newlines, and don't plan to, I can just go with:
for (const char *c = text; *c; c++)
{
    lines.push_back(text);
    break;
}
But it would still be nice to know where I was going wrong. *Note: the above code works fine now, with no issue and no doubling effect, so I'm sure it was my text-splitting code.
Here's an idea for you: in your text rendering method, add a static counter and use it to set the color of each string rendered. Since you don't seem to have that many strings per frame, you can use the 8 bits of one color component (e.g. red) for the counter and set the 2 other components to 255. If you had more than 255 strings, you could still encode the counter value over 2 or 3 color components.
With this little debug aid, you will be able to see in which order each piece of text is rendered. You can use pixie and/or zoomin to see the pixel values "live". Otherwise, just take a screenshot and examine the result.
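As a rough sketch of the idea (drawString stands in for whatever text-drawing routine you are actually using, and this assumes the old fixed-function OpenGL colour calls):

// Debug aid: tint each string by the order in which it is drawn.
static unsigned char drawIndex = 0;
glColor3ub(drawIndex, 255, 255);   // red channel encodes the draw order
drawString(x, y, text);            // hypothetical: your own text routine
++drawIndex;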
It looks like the erroneously drawn text in that capture is "50b" which I doubt is a string that would normally appear in your game. It looks like you're drawing something that's normally an empty string, but sometimes picks up junk values - in other words, undefined behavior.
I can't be sure, of course, because I simply don't have enough information to find your problem. Your glClear looks fine to me, so you can be assured that the extra text is being drawn in the same frame as your intended text.

How to read in a data file of unknown dimensions in C/C++

I have a data file which contains data in row/column form. I would like a way to read this data into a 2D array in C or C++ (whichever is easier), but I don't know how many rows or columns the file might have before I start reading it in.
At the top of the file is a commented line giving a series of numbers relating to what each column holds. Each row holds the data for each number at a point in time, so an example data file (a small one - the ones I'm using are much bigger!) could look like:
# 1 4 6 28
21.2 492.1 58201.5 586.2
182.4 1284.2 12059. 28195.2
.....
I am currently using Python to read in the data using numpy.loadtxt which conveniently splits the data in row/column form whatever the data array size, but this is getting quite slow. I want to be able to do this reliably in C or C++.
I can see some options:
1. Add a header tag with the dimensions from my extraction program:
# 1 4 6 28
# xdim, ydim
21.2 492.1 58201.5 586.2
182.4 1284.2 12059. 28195.2
.....
but this requires rewriting my extraction programs and the programs which use the extracted data, which is quite intensive.
2. Store the data in a database file, e.g. MySQL, SQLite etc. Then the data could be extracted on demand. This might be a requirement further along in the development process, so it might be good to look into anyway.
3. Use Python to read in the data and wrap C code for the analysis. This might be easiest in the short run.
4. Use wc on Linux to find the number of lines, and the number of words in the header, to find the dimensions:
echo $((`cat FILE | wc -l` - 1)) # get number of rows (-1 for header line)
echo $((`cat FILE | head -n 1 | wc -w` - 1)) # get number of columns (-1 for '#' character)
5. Use C/C++ code.
This question is mostly related to point 5 - whether there is an easy and reliable way to do this in C/C++. Otherwise, any other suggestions would be welcome.
Thanks
Create table as vector of vectors:
std::vector<std::vector<double> > table;
Inside an infinite (while (true)) loop:
Read a line:
std::string line;
std::getline(ifs, line);
If something went wrong (probably EOF), exit the loop:
if(!ifs)
break;
Skip that line if it's a comment:
if(line[0] == '#')
continue;
Parse the row's contents from that line into a vector (reading from the line, not from ifs, so you only consume one row at a time):
std::istringstream iss(line);
std::vector<double> row;
std::copy(std::istream_iterator<double>(iss),
          std::istream_iterator<double>(),
          std::back_inserter(row));
Add the row to the table:
table.push_back(row);
Once you're out of the loop, "table" contains the data:
table.size() is the number of rows
table[i] is row i
table[i].size() is the number of cols. in row i
table[i][j] is the element at the j-th col. of row i
How about:
Load the file.
Count the number of rows and columns.
Close the file.
Allocate the memory needed.
Load the file again.
Fill the array with data.
Every .obj (3D model file) loader I've seen uses this method. :)
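A sketch of that two-pass idea applied to this file format (first pass counts, second pass fills a flat buffer):

#include <cstddef>
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

std::ifstream in("data.txt");
std::string line;
std::size_t rows = 0, cols = 0;

// Pass 1: count rows and columns, skipping the '#' header line.
while (std::getline(in, line)) {
    if (line.empty() || line[0] == '#') continue;
    if (cols == 0) {
        std::istringstream iss(line);
        double v;
        while (iss >> v) ++cols;
    }
    ++rows;
}

// Allocate once, then pass 2: rewind and fill the buffer.
std::vector<double> data;
data.reserve(rows * cols);
in.clear();   // clear the EOF flag before seeking back
in.seekg(0);
while (std::getline(in, line)) {
    if (line.empty() || line[0] == '#') continue;
    std::istringstream iss(line);
    double v;
    while (iss >> v) data.push_back(v);
}
// element (r, c) is data[r * cols + c]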
Figured out a way to do this. Thanks go mostly to Manuel, as his was the most informative answer.
std::vector< std::vector<double> > readIn2dData(const char* filename)
{
    /* Function takes a char* filename argument and returns a
     * 2d dynamic array containing the data
     */
    std::vector< std::vector<double> > table;
    std::ifstream ifs;

    /* open file */
    ifs.open(filename);

    while (true)
    {
        std::string line;
        double buf;

        getline(ifs, line);
        if (!ifs)
            // mainly catch EOF
            break;
        if (line.empty() || line[0] == '#')
            // catch empty lines or comment lines
            // (check for empty first, before looking at line[0])
            continue;

        std::istringstream ss(line);
        std::vector<double> row;
        while (ss >> buf)
            row.push_back(buf);
        table.push_back(row);
    }
    ifs.close();
    return table;
}
Basically, create a vector of vectors. The only difficulty was splitting by whitespace, which is taken care of with the stringstream object. This may not be the most effective way of doing it, but it certainly works in the short term!
Also, I'm looking for a replacement for the deprecated atof function, but never mind. It just needs some memory-leak checking (it shouldn't have any, since most of the objects are standard library objects) and I'm done.
Thanks for all your help
Do you need a square or a ragged matrix? If the latter, create a structure like this:
std::vector< std::vector<double> > data;
Now read one line at a time into a:
std::vector<double> d;
and add the vector to the ragged matrix:
data.push_back( d );
All data structures involved are dynamic, and will grow as required.
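Putting those pieces together, the fill loop could look something like this (the filename is just a placeholder):

#include <fstream>
#include <iterator>
#include <sstream>
#include <string>
#include <vector>

std::vector< std::vector<double> > data;
std::ifstream in("data.txt");
std::string line;
while (std::getline(in, line)) {
    if (line.empty() || line[0] == '#') continue;   // skip the header comment
    std::istringstream iss(line);
    std::vector<double> d((std::istream_iterator<double>(iss)),
                          std::istream_iterator<double>());
    data.push_back(d);
}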
I've seen your answer, and while it's not bad, I don't think it's ideal either. At least as I understand your original question, the first comment basically specifies how many columns you'll have in each of the remaining rows. e.g. the one you've given ("1 4 6 28") contains four numbers, which can be interpreted as saying each succeeding line will contain 4 numbers.
Assuming that's correct, I'd use that data to optimize reading the data. In particular, after that, (again, as I understand it) the file just contains row after row of numbers. That being the case, I'd put all the numbers together into a single vector, and use the number of columns from the header to index into the rest:
class matrix {
    std::vector<double> data;
    int columns;
public:
    // a matrix is 2D, with fixed number of columns, and arbitrary number of rows.
    matrix(int cols) : columns(cols) {}

    // just read raw data from stream into vector:
    std::istream &read(std::istream &stream) {
        std::copy(std::istream_iterator<double>(stream),
                  std::istream_iterator<double>(),
                  std::back_inserter(data));
        return stream;
    }

    // Do 2D addressing by converting rows/columns to a linear address
    // If you want to check subscripts, use vector.at(x) instead of vector[x].
    double operator()(size_t row, size_t col) {
        return data[row*columns + col];
    }
};
This is all pretty straightforward -- the matrix knows how many columns it has, so you can do x,y indexing into the matrix, even though it stores all its data in a single vector. Reading the data from the stream just means copying that data from the stream into the vector. To deal with the header, and to simplify creating a matrix from the data in a stream, we can use a simple function like this:
matrix read_data(std::string name) {
    // read one line from the stream.
    std::ifstream in(name.c_str());
    std::string line;
    std::getline(in, line);

    // break that up into space-separated groups:
    std::istringstream temp(line);
    std::vector<std::string> counter;
    std::copy(std::istream_iterator<std::string>(temp),
              std::istream_iterator<std::string>(),
              std::back_inserter(counter));

    // the number of columns is the number of groups, -1 for the leading '#'.
    matrix m(counter.size()-1);

    // Read the remaining data into the matrix.
    m.read(in);
    return m;
}
As it's written right now, this depends on your compiler implementing the "Named Return Value Optimization" (NRVO). Without that, the compiler will copy the entire matrix (probably a couple of times) when it's returned from the function. With the optimization, the compiler pre-allocates space for a matrix, and has read_data() generate the matrix in place.
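Usage is then simply (the file name and indices are just placeholders):

matrix m = read_data("data.txt");
double value = m(2, 1);   // element at row 2, column 1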