I'm serializing data into a binary file using ofstream/ifstream. The data is split into two vectors of strings, one for data names and the other for data values: std::vector<std::string> dataNames and std::vector<std::string> dataValues.
I'm writing the data using this function:
void Data::SaveData(std::string path)
{
    std::ofstream outfile(path, std::ofstream::binary);
    outfile.write(reinterpret_cast<const char *>(&dataNames[0]), dataNames.size() * sizeof(std::string));
    outfile.write(reinterpret_cast<const char *>(&dataValues[0]), dataValues.size() * sizeof(std::string));
    outfile.close();
}
And reading it using:
bool Data::LoadData(std::string path)
{
    bool ret = false;
    std::ifstream file(path, std::ifstream::in | std::ifstream::binary);
    if (file.is_open())
    {
        // get length of file:
        file.seekg(0, file.end);
        int length = file.tellg();
        file.seekg(0, file.beg);
        char * buffer = new char[length];
        file.read(buffer, length);
        if (file)
        {
            char* cursor = buffer;
            uint32_t bytes = length / 2;
            dataNames.resize(bytes / sizeof(std::string));
            memcpy(dataNames.data(), cursor, bytes);
            cursor += bytes;
            dataValues.resize(bytes / sizeof(std::string));
            memcpy(dataValues.data(), cursor, bytes);
            delete[] buffer;
            buffer = nullptr;
        }
        file.close();
        ret = true;
    }
    return ret;
}
It works: I can write and read the data back correctly, except when any of the strings in dataNames or dataValues has 16 characters or more.
Example of data where all strings have fewer than 16 chars:
dataNames[0] = "Type"
dataNames[1] = "GameObjectCount"
dataValues[0] = "Scene"
dataValues[1] = "5"
[image: file contents, data with 15 chars]
Example of data where one string has 16 chars:
dataNames[0] = "Type"
dataNames[1] = "GameObjectsCount" // Added an "s"; now it has 16 chars
dataValues[0] = "Scene"
dataValues[1] = "5"
[image: file contents, data with 16 chars]
Here you can see that the word "GameObjectsCount" doesn't appear and strange characters are shown instead.
When reading this file back, the string is not valid. Sometimes it's empty, sometimes it shows "Error reading characters of string", sometimes it's a random letter...
Any idea?
Reinterpreting a vector in the manner you have above isn't correct.
outfile.write(reinterpret_cast<const char *>(&dataNames[0]), dataNames.size() * sizeof(std::string));
You don't know how the vector stores its data (on the heap, etc.), and you can't assume that you can blindly cast the pointer and write whatever you see out to a file as a way to serialize it. Furthermore, a std::string isn't necessarily an in-place character array the size of the input; it is more likely a pointer to a buffer on the heap. That is exactly where your 16-character threshold comes from: with the small string optimization, short strings are stored inline inside the std::string object, while longer ones live in a separately allocated heap buffer, so copying the raw bytes of the string objects only appears to work for short strings.
So, if you want to serialize the data in a vector or another standard library type, you'll need to write a function that does it manually, iterating over the items and writing them in a properly delimited way (for example, each string's length followed by its characters).
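For the two vectors in the question, a sketch of such a length-prefixed format could look like this (the helper names are just for illustration, not part of any existing API):

#include <cstdint>
#include <fstream>
#include <string>
#include <vector>

// Write a length-prefixed string: 4-byte size, then the raw characters.
static void WriteString(std::ofstream& out, const std::string& s)
{
    std::uint32_t size = static_cast<std::uint32_t>(s.size());
    out.write(reinterpret_cast<const char*>(&size), sizeof(size));
    out.write(s.data(), size);
}

static std::string ReadString(std::ifstream& in)
{
    std::uint32_t size = 0;
    in.read(reinterpret_cast<char*>(&size), sizeof(size));
    std::string s(size, '\0');
    in.read(&s[0], size);
    return s;
}

static void WriteStringVector(std::ofstream& out, const std::vector<std::string>& v)
{
    std::uint32_t count = static_cast<std::uint32_t>(v.size());
    out.write(reinterpret_cast<const char*>(&count), sizeof(count));
    for (const std::string& s : v)
        WriteString(out, s);
}

static std::vector<std::string> ReadStringVector(std::ifstream& in)
{
    std::uint32_t count = 0;
    in.read(reinterpret_cast<char*>(&count), sizeof(count));
    std::vector<std::string> v;
    v.reserve(count);
    for (std::uint32_t i = 0; i < count; ++i)
        v.push_back(ReadString(in));
    return v;
}

SaveData would then open the file in binary mode and call WriteStringVector for dataNames and dataValues, and LoadData would read them back in the same order; the stored counts and lengths replace the length / 2 guesswork entirely.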
Related
I serialize the file via the code below and send it over Winsock. This works fine with text files, but when I tried to send a jpg, the string contains '\0' among its character elements, so the sockets only send part of the string, treating '\0' as the end. I was considering replacing '\0' with something else, but say I replace it with 'xx' and replace it back on the other end: what if the file has natural occurrences of 'xx' that get lost? Sure, I could use a large, unlikely sequence, but that bloats the file.
Any help appreciated.
char* read_file(string path, int& len)
{
    std::ifstream infile(path);
    infile.seekg(0, infile.end);
    size_t length = infile.tellg();
    infile.seekg(0, infile.beg);
    len = length;
    char* buffer = new char[len]();
    infile.read(buffer, length);
    return buffer;
}
string load_to_buffer(string file)
{
    char* img;
    int ln;
    img = read_file(file, ln);
    string s = "";
    for (int i = 1; i <= ln; i++){
        char c = *(img + i);
        s += c;
    }
    return s;
}
Probably somewhere in your code (not shown in what you posted) you use strlen() to determine how many bytes to send, and/or you pass std::string::c_str() to something that treats it as a C string. This results in truncated data, because anything that relies on '\0' as a terminator stops at the first embedded null byte.
std::string is not a good fit for binary data here. Use std::vector<char> instead, and drop the new[] buffer handling.
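As a sketch under those assumptions (a connected Winsock SOCKET named sock is assumed, and the function names are made up for illustration), reading and sending could look like this:

#include <winsock2.h>

#include <cstdint>
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

// Read the whole file as raw bytes; '\0' is just another byte here.
std::vector<char> read_file(const std::string& path)
{
    std::ifstream infile(path, std::ios::binary);
    return std::vector<char>(std::istreambuf_iterator<char>(infile),
                             std::istreambuf_iterator<char>());
}

// Send the byte count first, then the raw bytes; never rely on a terminator.
bool send_buffer(SOCKET sock, const std::vector<char>& data)
{
    std::uint32_t size = static_cast<std::uint32_t>(data.size());
    if (send(sock, reinterpret_cast<const char*>(&size), sizeof(size), 0) == SOCKET_ERROR)
        return false;
    int sent = 0;
    while (sent < static_cast<int>(data.size()))
    {
        int n = send(sock, data.data() + sent, static_cast<int>(data.size()) - sent, 0);
        if (n == SOCKET_ERROR)
            return false;
        sent += n;
    }
    return true;
}

The receiver reads the 4-byte size first, then keeps calling recv until it has that many bytes, so a jpg with embedded '\0' bytes arrives intact and no sentinel sequence is needed.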
I have to write some data to a text file, and at the end of each output I have to append a terminating '\0' character. This is what I have come up with so far. It works well for some inputs, but for others it sometimes fills the whole text file with garbage values. Is there a better way to do this? In my program I have to write some data, store its location in the file, and use that location for some operations. The next write operation starts at address = address + 500.
long int address = get_address();

void write_to_file()
{
    fstream pFILE ("my file.txt");
    char * buffer = new char [500];
    cin.getline(buffer, 500);
    pFILE.seekp(address);
    pFILE << buffer;
    pFILE.seekp(address + strlen(buffer));
    pFILE << '\0';
    address += 500;
}
To write a '\0' to a file:
fstream output_file("output_file.txt", ios::binary);
output_file.put('\0');
Opening the file with ios::binary prevents the runtime or OS from applying any text-mode translation to the bytes you write.
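Applied to the fixed-offset writes from the question, a sketch could look like the following (the 500-byte slot size and the way address advances come from the question; everything else, including initializing address to 0 instead of get_address(), is an assumption):

#include <fstream>
#include <iostream>

long int address = 0; // start of the next 500-byte slot

void write_to_file()
{
    // binary mode: no text-mode translation of any byte, '\0' included;
    // note that in | out requires the file to already exist
    std::fstream pFILE("my file.txt",
                       std::ios::in | std::ios::out | std::ios::binary);
    char buffer[500] = {};                     // zero-filled slot
    std::cin.getline(buffer, sizeof(buffer));  // input is already '\0'-terminated
    pFILE.seekp(address);
    pFILE.write(buffer, sizeof(buffer));       // write the whole slot, padding and all
    address += 500;
}

Writing the full zero-filled slot means the terminating '\0' and the padding up to the next record come for free, with no strlen arithmetic on the text.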
I have a program that I need to read binary data into. I read the binary data via a redirection:
readData will be an executable made by my Makefile.
Example: readData < binaryText.txt
What I want to do is read the binary text and store each character from the file in a char array. The binary text is made up of 32 [...]. This is my attempt at doing so:
unsigned char * buffer;
char d;
cin.seekg(0, ios::end);
int length = cin.tellg();
cin.seekg(0, ios::beg);
buffer = new unsigned char [length];
while(cin.get(d))
{
    cin.read((char*)&buffer, length);
    cout << buffer[(int)d] << endl;
}
However, I keep getting a segmentation fault on this. Does anyone have any ideas on how to read binary text into a char array? Thanks!
I'm more of a C programmer than a C++ programmer, but I think you should have started your while loop with:
while(cin.get(&d)){
The easiest would be like this:
std::ostringstream oss;
oss << std::cin.rdbuf();
// now use oss.str()
Or, all in one line:
std::string data(static_cast<std::ostringstream&>(std::ostringstream() << std::cin.rdbuf()).str());
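Once all of stdin is in a std::string, getting the bytes into a char array (what the question asked for) is just a copy; a minimal, self-contained sketch:

#include <iostream>
#include <sstream>
#include <string>
#include <vector>

int main()
{
    std::ostringstream oss;
    oss << std::cin.rdbuf();                 // slurp all of stdin, '\0' bytes included
    const std::string data = oss.str();

    std::vector<char> buffer(data.begin(), data.end()); // the "char array"
    std::cout << "read " << buffer.size() << " bytes\n";
    return 0;
}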
Something like this should do the trick.
You retrieve the filename from the arguments and then read the whole file in one shot.
const char *filename = argv[1]; // argv[0] is the program name; the file is the first argument
std::vector<char> buffer;
// open the stream in binary mode
std::ifstream is(filename, std::ios_base::binary);
// determine the file length
is.seekg(0, std::ios_base::end);
std::size_t size = is.tellg();
is.seekg(0, std::ios_base::beg);
// make sure we have enough memory space
buffer.reserve(size);
buffer.resize(size, 0);
// load the data
is.read(&buffer[0], size);
// close the file
is.close();
You then just need to iterate over the vector to read characters.
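For example, continuing with the buffer filled above:

for (std::size_t i = 0; i < buffer.size(); ++i)
{
    char c = buffer[i];
    // ... do something with byte c ...
}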
The reason you are getting a segmentation fault is that you are indexing the array with a character value.
Problem:
buffer[(int)d] // d is an ASCII character value; if it falls outside the array's bounds, you get the segfault.
If what you want is a character array, you already have one from cin.read().
Solution:
cin.read(reinterpret_cast<char*>(buffer), length);
If you want to print it out, just use printf:
printf("%s", buffer);
I used reinterpret_cast because I thought it was safe to convert to a signed character pointer, since most characters in use range from 0 to 127. Be aware that byte values from 128 to 255 will come out as negative values when read through a signed char.
I'm doing a small exercise: read a file which contains one long string and load it into an array of strings. So far I have:
char* data[11];
char buf[15];
int i = 0;
FILE* indata;
indata = fopen( "somefile.txt", "r" );
while( i < 11)
{
    fgets(buf, 16, indata);
    data[i] = buf;
    i++;
}
fclose( indata );
somefile.txt: "aaaaaaaaaaaaaaaaaaaaaaaaabbbbbbbaahhhhhbbbbdddddddddddddbbbbb"
etc..
This reads in 15 characters, adds that string to the array, and gets the next 15. The problem is that every element of the array ends up equal to the last string: if the last string is "ccccv", then data[0] = "ccccv", data[1] = "ccccv", data[2] = "ccccv", and so on.
Does anyone know why this is happening and whether there is a better way to do it? Thanks
Each pointer in data will point to the same memory area, which is buf.
You need to use strcpy + malloc.
Also, it seems like you have a "minor" buffer overflow: buf is 15 bytes, but fgets(buf, 16, ...) may write up to 16 bytes (15 characters plus the terminating '\0').
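A sketch of the loop with both fixes applied (a fresh heap copy per slot, and a buffer that actually matches the size passed to fgets):

#include <cstdio>
#include <cstdlib>
#include <cstring>

int main()
{
    char* data[11] = {};
    char buf[16];                        // room for 15 chars plus '\0'
    int i = 0;
    FILE* indata = std::fopen("somefile.txt", "r");
    if (!indata)
        return 1;
    while (i < 11 && std::fgets(buf, sizeof(buf), indata) != nullptr)
    {
        // give each slot its own heap copy instead of aliasing buf
        data[i] = static_cast<char*>(std::malloc(std::strlen(buf) + 1));
        std::strcpy(data[i], buf);
        i++;
    }
    std::fclose(indata);
    // ... use data[0..i-1] ...
    for (int j = 0; j < i; j++)
        std::free(data[j]);
    return 0;
}

If C++ facilities are on the table, a std::vector<std::string> would of course handle both the copying and the memory management for you.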
None of the posted answers I've read work, so I'm asking again.
I'm trying to copy the string data pointed to by a char pointer into a char array.
I have a function that reads from an ifstream into a char array:
char* FileReader::getNextBytes(int numberOfBytes) {
    char *buf = new char[numberOfBytes];
    file.read(buf, numberOfBytes);
    return buf;
}
I then have a struct :
struct Packet {
    char data[MAX_DATA_SIZE]; // can hold file name or data
} packet;
I want to copy what is returned from getNextBytes(MAX_DATA_SIZE) into packet.data;
EDIT: Let me show you what I'm getting with all the answers given below (memcpy, strcpy, passing the destination as a parameter); I now think the error comes from somewhere else. I'm reading a file as binary (it's a png). I loop while the fstream is good() and read from the fstream into buf (which might be the data array). I want to see the length of what I've read:
cout << strlen(packet.data) << endl;
This returns different sizes every time:
8
529
60
46
358
66
156
After that, apparently there are no bytes left to read, although the file is 13K+ bytes long.
This can be done using the standard library function strcpy, which is declared in <string.h> / <cstring>:
strcpy(packet.data, buf);
This requires that file.read produces a character sequence that ends with '\0'. You might also want to ensure numberOfBytes is big enough to accommodate the whole string; otherwise you could get a segmentation fault.
// if buf is not properly null terminated, add a null char at the end
// (buf must be allocated with numberOfBytes + 1 elements for this to stay in bounds)
buf[numberOfBytes] = '\0';
// copy the string from buf to the struct
strcpy(packet.data, buf);
// or, limiting the copy to the destination size
strncpy(packet.data, buf, MAX_DATA_SIZE);
Edit:
Whether or not this is being handled as a string is a very important distinction. In your question, you referred to it as a "string", which is what got us all confused.
Without any library assistance:
char* result = reader.getNextBytes(MAX_DATA_SIZE);
for (int i = 0; i < MAX_DATA_SIZE; ++i) {
    packet.data[i] = result[i];
}
delete [] result;
Using #include <cstring>:
memcpy(packet.data, result, MAX_DATA_SIZE);
Or for extra credit, rewrite getNextBytes so it has an output parameter:
char* FileReader::getNextBytes(int numberOfBytes, char* buf) {
    file.read(buf, numberOfBytes);
    return buf;
}
Then it's just:
reader.getNextBytes(MAX_DATA_SIZE, packet.data);
Edit 2:
To get the length of a file:
file.seekg (0, ios::end);
int length = file.tellg();
file.seekg (0, ios::beg);
And with that in hand...
char* buffer = new char[length];
file.read(buffer, length);
Now you have the entire file in buffer.
strlen is not a valid way to determine the amount of binary data: strlen just reads until it finds a '\0', nothing more. If you want to read a chunk of binary data, use a std::vector<char>, resize it to the number of bytes you actually read from the file, and return it by value. Problem solved.
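A minimal sketch of that idea; the class wrapper and its constructor are assumed here, the read-into-vector part is the point:

#include <cstddef>
#include <fstream>
#include <vector>

class FileReader {
public:
    explicit FileReader(const char* path) : file(path, std::ios::binary) {}

    // Returns exactly the bytes that were read; size() replaces strlen.
    std::vector<char> getNextBytes(int numberOfBytes)
    {
        std::vector<char> buf(numberOfBytes);
        file.read(buf.data(), numberOfBytes);
        buf.resize(static_cast<std::size_t>(file.gcount()));
        return buf;
    }

private:
    std::ifstream file;
};

The caller can then copy the chunk with std::memcpy(packet.data, bytes.data(), bytes.size()) (assuming it fits in MAX_DATA_SIZE) and use bytes.size() rather than strlen to know how much data there really is.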