Saving a file with ofstream - C++

I'm new to C++. I have an image named "test.jpg", which I encode to base64 and then decode again like this:
std::ifstream inputFile;
inputFile.open("test.jpg",std::ios::binary);
std::filebuf* pbuf = inputFile.rdbuf();
inputFile.seekg (0, ios::end);
int length = inputFile.tellg();
// allocate memory to contain file data
char* buffer=new char[length];
// get file data
pbuf->sgetn (buffer,length);
inputFile.close();
CBase64 base64;
string encodedData = base64.base64_encode((unsigned char*)buffer,length);
delete[] buffer;
string decodedData = base64.base64_decode(encodedData);
ofstream outPutFile;
outPutFile.open("test2.jpg",ios::binary | ios::out);
outPutFile.write(decodedData.c_str(), decodedData.length());
outPutFile.close();
the "test2.jpg" has exact same size as "test.jpg"(the original file) but, i can't open it.
i couldn't find what is the problem.

I got it working. I just replaced:
outPutFile.open("test2.jpg",ios::binary | ios::out);
with
outPutFile.open("test2.jpg", ios_base::out | ios_base::binary);
(For what it's worth, these two calls are equivalent, since ios inherits these flags from ios_base, so the real fix was likely something else that changed at the same time.)

std::string path = "file.txt";
std::string cfgString = "data";
std::ofstream output(path.c_str(), std::ios_base::out | std::ios::binary);
if (output.is_open()) {
    output.write(cfgString.data(), cfgString.length());
    output.close();
}

On the surface there is no problem with your file-writing logic, even though there are some irregularities. The bigger problem is in your main logic.
The program is essentially copying a file: you read it, encode its data to a base64 string, and then decode the data back into a std::string. One pitfall here: decoded binary data cannot safely pass through a null-terminated C string, because any 0 byte in the decoded data would terminate the string prematurely; it has to be carried as a (pointer, length) pair or a std::string constructed with an explicit length. Also, when writing a std::string to a file opened in binary mode, make sure you write its data together with its length.
To solve this, either copy the file directly, or make sure you handle only raw binary data with care: read the bytes from the input file in binary mode and write the same buffer to the output file. No base64 encoding and decoding is required.

It looks like you forgot to write
inputFile.seekg (0, ios::beg);
after getting the file length. That means you try to read from the end of the file instead of from its beginning.

Related

Store protobuf byte type inside a binary file

I'm trying to split and combine a binary file. For a reason not related to this question, I'm using protobuf to store the file's char* in a protobuf bytes type.
The code to serialize char* looks like this:
char* buffer = new char[buffer_size];
ifstream fileStream;
fileStream.open(fileName,ios::in | ios::binary);
//stuff here
Data *data = new Data(); // Protobuf Constructor
fileStream.read(buffer, buffer_size);
data->set_payload(buffer);
data->set_piece_no(piece_no);
.proto file :
syntax = "proto3";
message Data {
    int32 piece_no = 1;
    bytes payload = 2;
}
Then I try to combine all the pieces like so:
ofstream fileOutput;
fileOutput.open("out.bin", ios::out | ios::binary);
fileOutput << data->payload();
Sadly, this doesn't work, and the binary file generated is substantially smaller than the original.
I suspect the bytes may contain null characters (\0), and that the data has been truncated as a result.
To test my hypothesis, I do the following:
Data *data = new Header();
data->set_payload("hel\0lo");
data->set_piece_no(piece_no);
ofstream fileOutput;
fileOutput.open("out.bin",ios::out | ios::binary);
fileOutput << data->payload();
Opening the binary file in a text editor (vscode) shows the following:
hel
But the following code:
string data("hel\0lo",6);
ofstream fileOutput;
fileOutput.open("out.bin", ios::out | ios::binary);
fileOutput << data;
Shows the following:
hel?lo
How can I output exactly what I put into protobuf, without any truncation caused by arbitrary null bytes?
If you pass a string literal, protobuf will treat it as a C string and read only up to the first null terminator.
Instead, you can pass a std::string directly, as in your last example.
See under "Singular String Fields (proto3)" in https://developers.google.com/protocol-buffers/docs/reference/cpp-generated#oneof

Load file data into memory and saving it back out

I'm working on a rough storage system for a small personal project. I have a struct that holds data for each stored file:
struct AssetTableRow {
    std::string id = "Unnamed"; // a unique name given by the user
    std::string guid = ""; // a guid generated based on the file data, used to detect duplicate files
    std::string data; // binary data of the file
};
I load a file into it like this:
std::streampos size;
char* memblock;
std::ifstream file(filePath, std::ios::in | std::ios::binary | std::ios::ate);
if (file.is_open()) {
    size = file.tellg();
    memblock = new char[size];
    file.seekg(0, std::ios::beg);
    file.read(memblock, size);
    file.close();
    AssetTableRow row = AssetTableRow();
    row.id = "myid";
    row.guid = "myguid";
    row.data = std::string(memblock);
    AssetTable.push_back(row);
}
And then to try to write it back out to a file I used this:
std::ofstream file(destPath, std::ios::out | std::ios::binary);
if (file.is_open()) {
    printf("Writing...\n", id.c_str());
    // I think this is where it might be messing up
    file.write((row.data.c_str(), row.data.c_str().size());
    file.close();
    printf("Done!\n", id.c_str());
}
Now when I try to open the file that got written out (.png sprite sheet) the photo viewer tells me it can't open a file of that type (but opening the original is fine).
If I open up the 2 files in Notepad++ (original on the left) I can see that they are indeed very different, there is almost no data in the output file!
I'm guessing this has something to do with the length on the write or read, but I've tried every different possible value I can think of for them and it doesn't seem to change anything.
If I print the data out to the console after it's read from the original file it appears as it does in the written file, so that leads me to believe the problem is with how I'm reading the file, but I fail to see any problems with that part of the code.
What is wrong with how I'm reading the file that it doesn't appear to be reading the whole file?
Also, please forgive any awful mistakes I've made in my code; I'm still learning C++ and don't fully understand some parts of it, so my code may not be the best.
EDIT:
As per Superman's advice about strings, I changed my code to use char* to hold the data instead.
struct AssetTableRow {
    std::string id = "Unnamed"; // a unique name given by the user
    std::string guid = ""; // a guid generated based on the file data, used to detect duplicate files
    char* data; // binary data of the file
};
And changed the read function so it's reading into the struct's data member:
std::ifstream file(actualPath, std::ios::in | std::ios::binary | std::ios::ate);
if (file.is_open()) {
    AssetTableRow row = AssetTableRow();
    size = file.tellg();
    row.data = new char[size];
    file.seekg(0, std::ios::beg);
    file.read(row.data, size);
    file.close();
    row.id = "myid";
    row.guid = "myguid";
    printf("%s\n", row.data);
}
However, I still see the same output as when I was using a string, so now I'm even more confused about why this is happening.
EDIT2:
Upon further investigation I found that the size variable for reading the file reports the correct size in bytes. So now my guess is that, for whatever reason, it's not reading in the whole file.

Reading a file and saving the same exact file c++

I am actually writing a C++ program that reads any kind of file and saves it as a bmp file, but first I need to read the file, and that's where the issue is:
char fileName[] = "test.jpg";
FILE * inFileForGettingSize; // This is for getting the file size
fopen_s(&inFileForGettingSize, fileName, "r");
fseek(inFileForGettingSize, 0L, SEEK_END);
int fileSize = ftell(inFileForGettingSize);
fclose(inFileForGettingSize);
ifstream inFile; // This is for reading the file
inFile.open(fileName);
if (inFile.fail()) {
    cerr << "Error Opening File" << endl;
}
char * data = new char[fileSize];
inFile.read(data, fileSize);
ofstream outFile; // Writing the file back again
outFile.open("out.jpg");
outFile.write(data, fileSize);
outFile.close();
cin.get();
But when I read the file, let's say it's a plain-text file, it always outputs some weird characters at the end; for example:
assdassaasd
sdaasddsa
sdadsa
turns into:
assdassaasd
sdaasddsa
sdadsaÍÍÍ
So when I do this with a jpg, exe, etc., it corrupts it.
I am not trying to COPY a file; I know there are other ways to do that. I'm just trying to read a complete file byte by byte. Thanks.
EDIT:
I found out that the number of 'Í' characters equals the number of line endings in the file, but this doesn't help me much.
This is caused by newline handling.
You open the files in text mode (because you use "r" instead of "rb" for fopen and because you don't pass ios::binary to your fstream open calls), and on Windows, text mode translates "\r\n" pairs to "\n" on reading and back to "\r\n" when writing. The result is that the in-memory size is going to be shorter than the on-disk size, so when you try to write using the on-disk size, you go past the end of your array and write whatever random stuff happens to reside in memory.
You need to open files in binary mode when working with binary data:
fopen_s(&inFileForGettingSize, fileName, "rb");
inFile.open(fileName, ios::binary);
outFile.open("out.jpg", ios::binary);
For future reference, your copy routine could be improved. Mixing FILE* I/O with iostream I/O is awkward; opening and closing the file twice is extra work; and (most importantly) if your routine is ever run on a large enough file, it will exhaust memory trying to load the entire file into RAM. Copying a block at a time would be better:
const int BUFFER_SIZE = 65536;
char buffer[BUFFER_SIZE];
while (source.good()) {
    source.read(buffer, BUFFER_SIZE);
    dest.write(buffer, source.gcount());
}
It's a binary file, so you need to read and write the file as binary; otherwise it's treated as text, and assumed to have newlines that need translation.
In your call to fopen(), you need add the "b" designator:
fopen_s(&inFileForGettingSize, fileName, "rb");
And in your fstream::open calls, you need to add std::fstream::binary:
inFile.open(fileName, std::fstream::binary);
// ...
outFile.open("out.jpg", std::fstream::binary);

C++ Read Entire Text File

I try to read an entire text file using VC++ with this code:
ifstream file (filePath, ios::in|ios::binary|ios::ate);
if (file.is_open())
{
    size = (long)file.tellg();
    char *contents = new char [size];
    file.seekg (0, ios::beg);
    file.read (contents, size);
    file.close();
    isInCharString("eat", contents);
    delete [] contents;
}
But it doesn't fetch the entire file. Why, and how do I handle this?
Note: the file size is 1.87 MB and 39,854 lines.
You are missing the following line:
file.seekg (0, file.end);
before:
size = file.tellg();
file.seekg (0, file.beg);
as described in this example: http://www.cplusplus.com/reference/istream/istream/read/
(Note that since the file was opened with ios::ate, the stream already starts positioned at the end, so tellg() should report the full size even without that line.)
Another way to do this is:
std::string s;
{
    std::ifstream file ("example.bin", std::ios::binary);
    if (file) {
        std::ostringstream os;
        os << file.rdbuf();
        s = os.str();
    }
    else {
        // error
    }
}
Alternatively, you can use the C library functions fopen, fseek, ftell, fread, and fclose. The C API can be faster in some cases, at the expense of a less idiomatic C++ interface.
You really should get into the habit of reading documentation. ifstream::read is documented to sometimes not read all the bytes, and:
The number of characters successfully read and stored by this function can be accessed by calling member gcount.
So you might debug your issue by looking at file.gcount() and file.rdstate(). Also, for such big reads, using the istream::readsome member function (in an explicit loop) might be more relevant. (I would suggest reading in chunks of e.g. 64 KB.)
PS it might be some implementation or system specific issue.
Thanks all, I found where the error was.
The code does read the entire file; the problem was in the VS watch window itself, which only displays a certain amount of data, not the full text file.

Input from xml file and parsing using rapidxml

I am trying to do something like this with rapidxml in C++:
xml_document<> doc;
ifstream myfile("map.osm");
doc.parse<0>(myfile);
and receive the following error:
Multiple markers at this line - Invalid arguments ' Candidates are: void parse(char *) ' - Symbol 'parse' could not be resolved
The file size can be up to a few megabytes.
Please help.
You have to load the file into a null-terminated char buffer first, as specified here in the official documentation:
http://rapidxml.sourceforge.net/manual.html#classrapidxml_1_1xml__document_8338ce6042e7b04d5a42144fb446b69c_18338ce6042e7b04d5a42144fb446b69c
Just read the contents of your file into a char array and use this array to pass to the xml_document::parse() function.
If you are using ifstream, you can use something like the following to read the entire file contents into a buffer (note the extra byte for the null terminator that rapidxml requires):
ifstream file ("test.xml");
if (file.is_open())
{
    file.seekg(0, ios::end);
    int size = file.tellg();
    file.seekg(0, ios::beg);
    char* buffer = new char [size + 1];
    file.read (buffer, size);
    buffer[size] = '\0'; // rapidxml requires a null-terminated buffer
    file.close();
    // your file should now be in the char buffer -
    // use it to parse your xml, e.g. doc.parse<0>(buffer);
    // keep the buffer alive for as long as you use doc,
    // since rapidxml points into it
    delete[] buffer;
}
Please note I have not compiled the above code, just writing it from memory, but this is the rough idea. Look at the documentation for ifstream for exact details. This should help you get started anyway.