Problems while trying to read a binary file in C++

I'm writing a simple console application in Visual Studio C++. I want to read a binary file with a .cer extension into a byte array.
ifstream inFile;
size_t size = 0;
char* oData = 0;

inFile.open(path, ios::in | ios::binary);
if (inFile.is_open())
{
    size = inFile.tellg();      // get the length of the file
    oData = new char[size + 1]; // for the '\0'
    inFile.read(oData, size);
    oData[size] = '\0';         // set '\0'
    inFile.close();
    buff.CryptoContext = (byte*)oData;
    delete[] oData;
}
But when I run it, every character in oData comes back as the same char (a different one on each run), for example:
oData = "##################################################...".
Then I tried another way:
std::ifstream in(path, std::ios::in | std::ios::binary);
if (in)
{
    std::string contents;
    in.seekg(0, std::ios::end);
    contents.resize(in.tellg());
    in.seekg(0, std::ios::beg);
    in.read(&contents[0], contents.size());
    in.close();
}
Now the contents hold very strange values: some of the values are correct, while others are negative or otherwise strange (maybe it is related to signed char versus unsigned char?).
Does anyone have any idea?
Thanks in advance!

Looking at the first version:
What makes you think that tellg gets the size of the stream? It does not; it returns the current read position. You then go on to hand a pointer to your data to buff.CryptoContext and promptly delete the data it points to! This is very dangerous practice; you need to copy the data, use a smart pointer, or otherwise ensure the data has the correct lifespan. If you are running a debug build, the deletion is likely stomping over your data with a marker that shows it has been freed, which is why you are seeing a stream of identical characters.
I suspect your guess about signed and unsigned chars may be correct for the second version, but I can't say without seeing your file and data.

You are setting CryptoContext to point to your data through a byte pointer, and right after that you delete that data!
buff.CryptoContext = (byte*)oData;
delete[] oData;
After these lines, CryptoContext points to released, invalid memory. Just keep the oData array alive longer and delete it only after you are done with the decoding (or whatever you are doing with it).
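Putting both answers together, here is a minimal sketch of the fix: open with ios::ate so tellg() really returns the file size, and let a std::vector own the bytes for as long as buff needs them. buff.CryptoContext and the byte type are taken from the question; everything else is illustrative.
#include <cstddef>
#include <fstream>
#include <string>
#include <vector>

std::vector<char> readAllBytes(const std::string& path)
{
    // Opening with ios::ate starts at the end, so tellg() gives the file size.
    std::ifstream in(path, std::ios::binary | std::ios::ate);
    std::vector<char> data;
    if (in)
    {
        std::streamsize size = in.tellg();
        data.resize(static_cast<std::size_t>(size));
        in.seekg(0, std::ios::beg);
        in.read(data.data(), size);
    }
    return data;
}

// Usage: keep 'bytes' alive for as long as buff uses the pointer.
// std::vector<char> bytes = readAllBytes(path);
// buff.CryptoContext = (byte*)bytes.data();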

bad_alloc error when creating a 4 GB std::string

I am using a multiparser to send an HTTP POST request and I need to send a big file (4 GB).
However, when I test my code it gives me a bad_alloc exception and crashes. I tried it with smaller files and it worked for files under 500 MB, but the bigger the file gets, the more often it crashes.
Can you help me?
Here is the code where the crash occurs. It is when it builds the body of the request:
std::ifstream ifile(file.second, std::ios::binary | std::ios::ate); //the file I am trying to send
std::streamsize size = ifile.tellg();
ifile.seekg(0, std::ios::beg);
char *buff = new char[size];
ifile.read(buff, size);
ifile.close();
std::string ret(buff, size); // this throws the bad_alloc exception
delete[] buff;
return ret; //return the body of the file in a string
Thanks
std::string ret(buff, size); creates a copy of buff, so you essentially double the memory consumption.
https://www.cplusplus.com/reference/string/string/string/
(5) from buffer
Copies the first n characters from the array of characters pointed by s.
Then the question becomes how much memory you actually have, and how much of it your OS allows you to allocate (e.g. ulimit on Linux).
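If the whole body really must end up in a single std::string, here is a minimal sketch that skips the intermediate char array and therefore the extra copy; it still needs the entire file to fit in memory at once, so it does not solve the 4 GB case by itself.
#include <cstddef>
#include <fstream>
#include <string>

std::string readWholeFile(const std::string& path)
{
    std::ifstream ifile(path, std::ios::binary | std::ios::ate);
    std::streamsize size = ifile.tellg();
    // Size the string up front and read straight into it:
    // no new char[] and no extra copy afterwards.
    std::string ret(static_cast<std::size_t>(size), '\0');
    ifile.seekg(0, std::ios::beg);
    ifile.read(&ret[0], size);
    return ret;
}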
As the comments say, you should chunk your read and send the individual chunks with multiple POST requests.
You can loop over the ifstream and check whether ifile.eof() has been reached as the exit criterion for your loop:
std::streamsize size = 10*1024*1024;  // read max 10 MB per chunk
std::vector<char> buff(size);         // buffer for one chunk
while (!ifile.eof())
{
    std::streamsize readBytes = ifile.readsome(buff.data(), size);
    // do your sending of the first readBytes bytes of buff here.
}
You need to consider error handling and such to not leak buff or leave ifile open.
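A different way to structure the same loop is sketched below with read() and gcount(): readsome() only returns characters that are already in the stream's internal buffer, and eof() only becomes true after a read has failed, so this variant tends to be more predictable. sendChunk() is a hypothetical stand-in for whatever performs one POST request.
#include <fstream>
#include <vector>

void sendInChunks(std::ifstream& ifile)
{
    const std::streamsize chunkSize = 10 * 1024 * 1024;  // at most 10 MB per POST
    std::vector<char> buff(chunkSize);

    while (ifile)
    {
        ifile.read(buff.data(), chunkSize);
        std::streamsize readBytes = ifile.gcount();      // bytes actually read this round
        if (readBytes > 0)
        {
            // sendChunk(buff.data(), readBytes);         // hypothetical: one POST per chunk
        }
    }
}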

Is readsome() appropriate to read binary data on Windows?

Context: I am trying to read the content of a PNG picture in C++ to send it later to my Android app. To do so, I open the file in binary mode, read its content in chunks of 512 bytes, then send the data to the app. I'm on Windows.
Issue: I use an ifstream instance and the readsome() function as shown below, and it returns 512, which is what I expected since I asked it to read 512 bytes. However, it seems that I am far from really having 512 bytes in my buffer, which confuses me. While I debug my program step by step, the number of chars in the buffer seems random, but is never 512 as expected.
Code:
int currentByteRead = 0;
std::ifstream fl(imgPath.toStdString().c_str(), ios_base::binary);
fl.seekg(0, std::ios::end);
int length = fl.tellg();
char *imgBytes = new char[512];
fl.seekg(0, std::ios::beg);

// Send the img content by blocks of 512 bytes
while (currentByteRead + 512 < length) {
    int nbRead = fl.readsome(imgBytes, 512); // nbRead is always set to 512 here
    if (fl.fail()) {
        qDebug() << "Error when reading file content";
    }
    sendMessage(...);
    currentByteRead += 512;
    imgBytes = new char[512];
}

// Send the remaining data
int nbRemainingBytes = length - currentByteRead;
fl.readsome(imgBytes, nbRemainingBytes);
sendMessage(...);
fl.close();
currentByteRead += nbRemainingBytes;
The length I get at the beginning is the correct one, and it seems there is no error. But it is as if not all the data was copied into the buffer during the readsome() call.
Questions: Did I misunderstand something about the readsome() function? Is there something related to Windows causing this behaviour? Is there a more appropriate way to proceed?
I finally found a way to do what I wanted, and as suggested by David Herring I will put my answer here.
My thoughts about the issue: if I use a std::ifstream::pos_type variable instead of an int, the correct number of bytes is read and put in the buffer. This was not the case when using an int, as if the chars were only written to the buffer up to some (random?) point. I am not sure I understand why this behavior occurred. My guess was that I had issues with '\n' characters, but the randomness of the final content of the buffer is still unclear to me.
Correction: this is the working code I eventually arrived at. Starting from this, I was able to do what I had in mind.
std::ifstream ifs(imgPath.toStdString().c_str(), std::ios::binary|std::ios::ate);
std::ifstream::pos_type pos = ifs.tellg();
int length = ifs.tellg();
std::vector<char> result(pos);
ifs.seekg(0, std::ios::beg);
ifs.read(result.data(), pos);
ifs.close();
I hope this will help others. Thank you David for your suggestions.
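For the original goal of sending the picture in 512-byte blocks, here is a minimal sketch using read() and gcount() instead of readsome(): read() keeps extracting until it has the requested bytes or hits end of file, and gcount() reports how many bytes actually landed in the buffer. sendMessage() stands in for the question's sender; the rest is illustrative.
#include <fstream>
#include <string>
#include <vector>

void sendImageInBlocks(const std::string& imgPath)
{
    std::ifstream fl(imgPath, std::ios::binary);
    std::vector<char> block(512);

    while (fl)
    {
        fl.read(block.data(), static_cast<std::streamsize>(block.size()));
        std::streamsize n = fl.gcount();      // bytes actually placed in block
        if (n > 0)
        {
            // sendMessage(block.data(), n);  // send exactly n bytes, not always 512
        }
    }
}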

How to check whether an ifstream has reached end of file in C++

I need to read all blocks of one large file (about 10 GB) sequentially. The file contains many floats and a few strings, like this (each item separated by '\n'):
6.292611
-1.078219E-266
-2.305673E+065
sod;eiwo
4.899747e-237
1.673940e+089
-4.515213
I read MAX_NUM_PER_FILE items each time, process them, and write them to another file, but I don't know when the ifstream has reached its end.
Here is my code:
ifstream file_input(path_input); //my file is a text file, but i tried both text and binary mode, both failed.
if (file_input)
{
    file_input.seekg(0, file_input.end);
    unsigned long long length = file_input.tellg(); //get file size
    file_input.seekg(0, file_input.beg);
    char * buffer = new char[MAX_NUM_PER_FILE + MAX_NUM_PER_LINE];
    int i = 1, j;
    char c, tmp[3];
    while (file_input.tellg() < length)
    {
        file_input.read(buffer, MAX_NUM_PER_FILE);
        j = MAX_NUM_PER_FILE;
        while (file_input.get(c) && c != '\n')
            buffer[j++] = c; //get a complete item
        //process with buffer...
        itoa(i++, tmp, 10); //int2char
        string out_name = "out" + string(tmp) + ".txt";
        ofstream file_output(out_name);
        file_output.write(buffer, j);
        file_output.close();
    }
    file_input.close();
    delete[] buffer;
}
My code goes wrong: length is bigger than the real file size. I have tried file_input.good() and !file_input.eof(), but they didn't work. getline(file_input, s) works, but it is much slower than read; I want to use read, but I don't know how to check whether the ifstream has reached end-of-file.
I am working on Windows 7 with VS2010.
I have searched, but there isn't any answer for this; the link How to open a file using ifstream and keep reading it until the end doesn't answer my question.
Update: problem solved
Hi everyone, I have figured out that it was my fault. Both while(file_input.tellg()<length) and while(file_input.peek()!=EOF) work fine! while(file_input.peek()!=EOF) is recommended.
The extra items written after end-of-file were the leftover items in the buffer from the previous iteration.
Here is the correct code:
ifstream file_input(path_input);
if (file_input)
{
    //file_input.seekg(0,file_input.end);
    //unsigned long long length = file_input.tellg(); //get file size
    //file_input.seekg(0,file_input.beg);
    char * buffer = new char[MAX_NUM_PER_FILE + MAX_NUM_PER_LINE];
    int i = 1, j;
    char c, tmp[3];
    while (file_input.peek() != EOF)
    {
        memset(buffer, 0, sizeof(char) * (MAX_NUM_PER_FILE + MAX_NUM_PER_LINE)); //clear first!
        file_input.read(buffer, MAX_NUM_PER_FILE);
        j = MAX_NUM_PER_FILE;
        while (file_input.get(c) && c != '\n')
            buffer[j++] = c;
        itoa(i++, tmp, 10); //int2char
        string out_name = "out" + string(tmp) + ".txt";
        ofstream file_output(out_name);
        file_output.write(buffer, strlen(buffer)); //use the correct buffer size instead of j
        file_output.close();
    }
    file_input.close();
    delete[] buffer;
}
while (file_input.peek() != EOF)
{
    // code
}
Basically peek() will read the next char without extracting it.
So you can simply compare it to EOF.
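Another common pattern is to drive the loop by the stream state after read(), take the byte count from gcount(), and finish the item that read() cut off with getline(). The sketch below reuses the question's MAX_NUM_PER_FILE naming; everything else is illustrative.
#include <cstddef>
#include <fstream>
#include <string>
#include <vector>

void splitFile(const std::string& path_input, std::size_t MAX_NUM_PER_FILE)
{
    std::ifstream file_input(path_input);
    std::vector<char> buffer(MAX_NUM_PER_FILE);
    int i = 1;

    while (file_input)
    {
        file_input.read(buffer.data(), static_cast<std::streamsize>(buffer.size()));
        std::streamsize j = file_input.gcount();   // bytes actually read this block
        std::string tail;
        std::getline(file_input, tail);            // finish the item cut off by read()

        if (j > 0 || !tail.empty())
        {
            std::ofstream file_output("out" + std::to_string(i++) + ".txt");
            file_output.write(buffer.data(), j);
            file_output.write(tail.data(), static_cast<std::streamsize>(tail.size()));
        }
    }
}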

SQLite C++: put and retrieve a blob

I finally figured out how to read a binary file into RAM and then write it somewhere with:
ifstream::pos_type size;
char * memblock;

/* ...bunch of stuff... */

ifstream infile(filename1.c_str(), ios::in | ios::binary | ios::ate);
ofstream outfile(filename2.c_str(), ios::out | ios::binary);
if (infile.is_open())
{
    size = infile.tellg();
    memblock = new char[size];   // request mem allocation of that size
    infile.seekg(0, ios::beg);   // move to beginning
    infile.read(memblock, size); // read file
    infile.close();              // close file
    outfile.write(memblock, size);
    outfile.close();
}
I have also tested that outfile.write(memblock, sizeof(memblock)); works totally fine.
So I applied this with SQLite and did it step by step, storing the data in the database with sqlite3_bind_blob(stmt,2,memblock,size,SQLITE_TRANSIENT); and everything else that goes along with it, like the step call and so on. I'm able to retrieve the data from the table with:
sqlite3_prepare_v2(db,"SELECT Data, Size FROM table1",-1,&stmt,0);
sqlite3_step(stmt);
char * result = (char * )sqlite3_column_blob(stmt,0);
outfile.write(result,size);
This also works fine.
However, when I use outfile.write(result, sizeof(result)); instead, it doesn't work.
Computing size - sizeof(result) gives me a large number (I was mostly just checking whether it was 0 or not).
So I don't really know where it's going wrong. Am I inserting the data into the database incorrectly?
Am I retrieving the data incorrectly?
I even tried adding another column, inserting the size into it and using that, but it didn't seem to work for some reason.
"I have also tested that outfile.write(memblock, sizeof(memblock)); works totally fine."
This is exceedingly unlikely. sizeof(memblock) == sizeof(char*) == 4 (or perhaps 8). It does not report the size of the block of data that memblock points to. sizeof(result) doesn't work for the same reason.
Note that you don't need to store the size of the blob as a separate column in the database. SQLite keeps track of the size; you can retrieve it with sqlite3_column_bytes.
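A minimal sketch of the retrieval side using sqlite3_column_bytes(); db is assumed to be the open connection and outfile the binary ofstream from the question, and error handling is omitted.
#include <sqlite3.h>
#include <fstream>

void dumpBlob(sqlite3* db, std::ofstream& outfile)
{
    sqlite3_stmt* stmt = nullptr;
    sqlite3_prepare_v2(db, "SELECT Data FROM table1", -1, &stmt, 0);
    if (sqlite3_step(stmt) == SQLITE_ROW)
    {
        const void* blob = sqlite3_column_blob(stmt, 0);
        int blobSize = sqlite3_column_bytes(stmt, 0);  // actual byte count of the blob
        outfile.write(static_cast<const char*>(blob), blobSize);
    }
    sqlite3_finalize(stmt);
}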

C++: use a buffer in memory instead of reading a file directly

I have this code that works fine:
FILE *fp;
fp = fopen(filename.c_str(), "rb");
char id[5];
fread(id,sizeof(char),4,fp);
Now I've changed something in my architecture: instead of the filename (the full path of the file), I have a char pointer that contains the data of the file. So I don't need to open and read the file (fopen, etc.); I only need to read from the char* buffer.
How can I do this?
Thanks in advance.
If I'm understanding your question correctly, you want to access a four-character ID somewhere in the middle of your buffer. The easiest way to do this is just to copy the data into a new buffer and add a null terminator.
size_t index = 0;
// ...
char id[5];
memcpy(id, &myData[index], 4);
id[4] = '\0';
index += 4;
You can then read through your buffer sequentially by updating the index value every time you read something.
char id[5];
strncpy(id,bfr,4);
id[4]='\0';
Where bfr is the buffer with your file data.
I would also strongly advise reading the chapter on pointers and strings in K&R, The C Programming Language.
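If several fields have to be pulled out of the buffer one after another, a small fread-style cursor keeps the bookkeeping in one place. This is only a sketch; bfr and its length are assumed to come from wherever the file data was loaded.
#include <cstring>
#include <cstddef>

// Tiny in-memory reader that mimics fread() over an existing buffer.
struct BufferReader
{
    const char* data;
    std::size_t size;
    std::size_t pos;

    // Copies up to 'count' bytes into 'dst', advances the cursor,
    // and returns the number of bytes actually copied (like fread).
    std::size_t read(void* dst, std::size_t count)
    {
        std::size_t n = (count < size - pos) ? count : size - pos;
        std::memcpy(dst, data + pos, n);
        pos += n;
        return n;
    }
};

// Usage, mirroring the original fread of the 4-byte id:
// BufferReader rd{bfr, bufferLength, 0};
// char id[5];
// rd.read(id, 4);
// id[4] = '\0';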