struct MyStruct {
    Items item[100][60];
    string Something;
    int X;
    int Y;
};
I have this struct "MyStruct" with a 2D array of 100 * 60.
If I want to save the struct's item[100][60] as a JSON array,
how can I do it using nlohmann json?
Could anyone help me please?
Or if there is a way to save it as a binary file without using Boost, I'll take that too.
void Save(std::string name, MyStruct test) {
    std::string filename = name + ".dat";
    std::ofstream out(filename);
    boost::archive::binary_oarchive binary_output_archive(out);
    binary_output_archive & test;
    out.close();
}

void Read(std::string filename) {
    std::ifstream in(filename + ".dat");
    boost::archive::binary_iarchive binary_input_archive(in);
    MyStruct test;
    binary_input_archive & test;
    in.close();
}
I tried this, but it also crashes sometimes, so I want a better way.
void Save(const std::string& name, const MyStruct& test) {
    auto result = nlohmann::json{
        {"item", nlohmann::json::array()},
        {"Something", test.Something},
        {"X", test.X},
        {"Y", test.Y},
    };
    for (auto i = 0u; i < 100u; ++i) {
        auto& outer = result["item"];
        for (auto j = 0u; j < 60u; ++j) {
            // You'll need to convert whatever Items is into a JSON type first
            outer[i].push_back(test.item[i][j]);
        }
    }
    std::ofstream out(name + ".dat");
    out << result;
}
Something like this will suffice for saving; you can work out the deserialisation from this and the docs.
I strongly advise that you do not use Items item[100][60]: the stack is not for objects that large. Use a vector of std::arrays to put the data on the heap while retaining the same memory layout.
I am trying to save and load data into a binary file so I can retrieve the contents later. The code I used worked when each data item was the same size, but with this class data it doesn't work. I made some adjustments to try to fix this by saving a data-size variable as well, but that didn't work either.
Here's my class:
class Player
{
public:
    Player();
    Player(int playerID, std::string name, std::string country, int startYear, std::vector<int> rankPerYear);
    ~Player();
    int getRank() { return rank; }

private:
    int playerID;
    std::string name;
    std::string country;
    int rankingPts;
    int rank;
    int startYear;
    std::vector<int> rankPerYear;
    Schedule schedule;
    PlayerStats playerStats;
};
And here are my load and save functions:
void GameData::saveGame(std::string fileName)
{
    std::ofstream file;
    fileName += ".bin";
    file.open(fileName.c_str(), std::ios::binary | std::ios::out);
    size_t tempSize = 0;
    for (int i = 0; i < players.size(); i++) {
        tempSize = sizeof(players[i]);
        file.write((char*)&tempSize, sizeof(size_t));
        file.write((char*)&players[i], tempSize);
    }
}
void GameData::loadGame(std::string fileName)
{
    players.clear();
    fileName += ".bin";
    if (std::ifstream(fileName)) {
        std::ifstream file;
        file.open(fileName.c_str(), std::ios::binary | std::ios::in);
        if (!file.is_open()) {
            std::cout << "Error while opening the file";
        }
        else {
            std::string temp;
            int location = 0;
            size_t tempSize = 0;
            Player player;
            while (std::getline(file, temp)) {
                file.seekg(location, std::ios::beg);
                file.read((char*)&tempSize, sizeof(size_t));
                file.seekg(sizeof(size_t) + location, std::ios::beg);
                file.read((char*)&player, tempSize);
                players.push_back(player);
                location += sizeof(size_t) + tempSize;
            }
        }
    }
    else {
        saveGame(fileName);
    }
}
I am saving the players to a vector of Players. When I try to run this I get the error:
Exception thrown at 0x0F485097 (vcruntime140d.dll) in Ping Pong Manager.exe: 0xC0000005: Access violation reading location 0x081BCE20.
When debugging this I found that it would save some of the data correctly, but would then crash before it finished, so I know it's partially working.
I have tried many different things but cannot figure out the problem. Thanks for the help.
You cannot serialize STL containers directly, since they contain pointers.
The common way is to write the fields one by one, using specialized methods for anything that holds pointers to the heap.
Please consider a serialization library, which generates the code for you. Here's a list:
Boost.Serialization
Cap'n'Proto
FlatBuffers
I really need your help. I have the following structs in my code:
struct Field {
    char name[20];
    int type;
    int length;
};

struct Record {
    vector<Field> structure;
    vector<string> info;
};
What I want to do is store a vector of my struct Record inside a binary file and successfully load it back. The problem is that my struct has two vectors inside of it, and they are causing me some trouble. Can you help me out?
You basically just write functions that write the structure to a stream. If the element type is POD, you write the vector's size followed by the raw element data; if it's not POD, you write the size of each element and then that element's data.
Below, writevecfield writes the size of the vector first, then writes the entire block of POD Field structures to the stream. writevecstring writes the size of the vector first, then goes through the vector and writes the size of each string followed by its contents.
To read, you do the opposite. readvecfield reads the size of the vector from the file, because it was the first thing written. After reading it, we resize the vector and read "size" Field structures into the new vector.
To read the strings, we also do the opposite. readvecstring reads the size of the vector from the file first and resizes the vector to the read size. It then loops "size" times, because that's how many strings are in the file:
read the size of the string, resize a string, read the contents into that string, add it to the vector, and move on to the next string.
#include <fstream>
#include <vector>
#include <iostream>
#include <sstream>

using namespace std;

struct Field
{
    char name[20];
    int type;
    int length;
};

struct Record
{
    vector<Field> structure;
    vector<string> info;
};

void writevecfield(ostream& os, const vector<Field> &vec)
{
    vector<Field>::size_type size = vec.size();
    os.write((char*)&size, sizeof(size));
    os.write((char*)&vec[0], vec.size() * sizeof(Field));
}

void readvecfield(istream& is, vector<Field> &vec)
{
    vector<Field>::size_type size = 0;
    is.read((char*)&size, sizeof(size));
    vec.resize(size);
    is.read((char*)&vec[0], vec.size() * sizeof(Field));
}

void writevecstring(ostream& os, const vector<string> &vec)
{
    vector<string>::size_type size = vec.size();
    os.write((char*)&size, sizeof(size));
    for (vector<string>::size_type i = 0; i < size; ++i)
    {
        vector<string>::size_type element_size = vec[i].size();
        os.write((char*)&element_size, sizeof(element_size));
        os.write(&vec[i][0], element_size);
    }
}

void readvecstring(istream& is, vector<string> &vec)
{
    vector<string>::size_type size = 0;
    is.read((char*)&size, sizeof(size));
    vec.resize(size);
    for (vector<string>::size_type i = 0; i < size; ++i)
    {
        vector<string>::size_type element_size = 0;
        is.read((char*)&element_size, sizeof(element_size));
        vec[i].resize(element_size);
        is.read(&vec[i][0], element_size);
    }
}

void WriteRecord(ostream& out, const Record& r)
{
    writevecfield(out, r.structure);
    writevecstring(out, r.info);
}

void ReadRecord(istream& in, Record& r)
{
    readvecfield(in, r.structure);
    readvecstring(in, r.info);
}

int main()
{
    Record R;
    Field first = {"HELLO", 1, 20};
    Field second = {"WORLD", 2, 40};
    R.structure.push_back(first);
    R.structure.push_back(second);
    R.info.push_back("INFO FOR HELLO");
    R.info.push_back("INFO FOR WORLD");

    std::ofstream out("C:/Users/***/Desktop/Test.bin", std::ios::out | std::ios::binary);
    WriteRecord(out, R);
    out.close();

    Record RR;
    std::ifstream in("C:/Users/***/Desktop/Test.bin", std::ios::in | std::ios::binary);
    ReadRecord(in, RR);
    in.close();

    for (size_t i = 0; i < RR.structure.size(); ++i)
    {
        std::cout << "Name: " << RR.structure[i].name << "\n";
        std::cout << "Type: " << RR.structure[i].type << "\n";
        std::cout << "Length: " << RR.structure[i].length << "\n";
        std::cout << "INFO: " << RR.info[i] << "\n\n";
    }
}
I used a C-style approach. The key is that you only need the address of the first element of the vector's data; since the elements are stored contiguously, you can write them to the file as one buffer.
bool storeStructVec(FILE *fpOut, const vector<Field> &vec)
{
    unsigned int nSize = vec.size();
    if (nSize != fwrite(&vec[0], sizeof(Field), nSize, fpOut))
        return false;
    else
        return true;
}
Before I start, consider this code:
A data transfer object, ObjectDTO:
class ObjectDTO {
public:
    int id;
    string string1;
    string string2;
    string string3;
    int code1;
    vector<string> stringList1;

private:
    friend class boost::serialization::access;

    template<class Archive>
    void serialize(Archive &archive, const unsigned int version) {
        archive & id;
        archive & string1;
        archive & string2;
        archive & string3;
        archive & code1;
        archive & stringList1;
    }
};
Serialization
void OutputStreamService::writeReportsToFile(vector<ObjectDTO> objects, int filename) {
    ofstream outputFileStream(to_string(filename));
    boost::archive::binary_oarchive outputArchive(outputFileStream);
    outputArchive << objects;
}
Deserialization
vector<ObjectDTO> InputStreamService::readObjects() {
    ifstream inputFileStream(to_string(fileNumber++));
    boost::archive::binary_iarchive inputArchive(inputFileStream);
    vector<ObjectDTO> objects;
    inputArchive >> objects;
    return objects;
}
I am using the Boost Serialization C++ library to serialize a vector of ObjectDTOs and read it back later.
Suppose I generated 30 GB of random ObjectDTOs and saved them all to the same file.
How can I read back only some of them, to avoid hitting the memory limit?
I am using Boost Serialization because it was the simplest way I found to solve the first problem, but I can change to any other approach if necessary!
Use Google Protocol Buffers instead; it has a CodedOutputStream class for serialization and CodedInputStream for deserialization.
One of CodedOutputStream's methods is WriteVarint32, which allows you to write a number that can be used as an index in the stream.
CodedInputStream has a corresponding ReadVarint32 method, e.g.:
Serialization:
char text[] = "Hello world!";
coded_output->WriteVarint32(strlen(text));
coded_output->WriteRaw(text, strlen(text));
Deserialization:
uint32 size;
coded_input->ReadVarint32(&size);
char* text = new char[size + 1];
coded_input->ReadRaw(text, size);
The last line allows you to read the contents of the serialized stream starting from the given index.
Here are my two methods to serialize/deserialize streams with the length given at the start:
template <class T>
void TProtoBufSerializer::SerializeImplementation(const T& protoBuf, std::vector<char>& buffer)
{
    int bufLength = protoBuf.ByteSize() + google::protobuf::io::CodedOutputStream::VarintSize32(protoBuf.ByteSize());
    buffer.resize(bufLength);
    google::protobuf::io::ArrayOutputStream arrayOutput(&buffer[0], bufLength);
    google::protobuf::io::CodedOutputStream codedOutput(&arrayOutput);
    codedOutput.WriteVarint32(protoBuf.ByteSize());
    protoBuf.SerializeToCodedStream(&codedOutput);
}

template <class T>
bool TProtoBufSerializer::DeSerializeImplementation(std::vector<char>& buffer, T& protoBuf)
{
    bool deserialized = false;
    google::protobuf::io::ArrayInputStream arrayInput(&buffer[0], buffer.size());
    google::protobuf::io::CodedInputStream codedInput(&arrayInput);
    unsigned int object_size;
    bool header_read = codedInput.ReadVarint32(&object_size);
    if (header_read && object_size > 0)
    {
        if (buffer.size() >= codedInput.CurrentPosition() + object_size)
        {
            google::protobuf::io::CodedInputStream::Limit limit = codedInput.PushLimit(object_size);
            if (protoBuf.ParseFromCodedStream(&codedInput))
            {
                // drop the consumed bytes from the front of the buffer
                std::vector<char>::iterator it = buffer.begin();
                std::advance(it, codedInput.CurrentPosition());
                std::move(it, buffer.end(), buffer.begin());
                buffer.resize(buffer.size() - codedInput.CurrentPosition());
                deserialized = true;
            }
            else
            {
                throw TProtoBufSerializerPayloadException();
            }
            codedInput.PopLimit(limit);
        }
    }
    else
    {
        // A varint32 header is at most 5 bytes long; if the buffer holds
        // 5 or more bytes and the header still cannot be decoded, raise.
        if (buffer.size() >= 5)
        {
            throw TProtoBufSerializerHeaderException();
        }
    }
    return deserialized;
}
I solved the problem by discarding Boost Serialization and vectors in favor of arrays with plain old C++ write and read on ofstream and ifstream respectively.
My OutputStreamService writeObjectsToFile ended up like this:
void OutputStreamService::writeObjectsToFile(ObjectDTO* objects, size_t count, int filename) {
    ofstream outputFileStream(to_string(filename), std::ios::binary);
    // Write the array contents, not the address of the pointer variable.
    // Raw writes like this are only safe for trivially copyable types.
    outputFileStream.write((char*)objects, count * sizeof(ObjectDTO));
}
And InputStreamService with readObjects:
ObjectDTO* InputStreamService::readObjects() {
    ifstream inputFileStream(to_string(fileNumber++), std::ios::binary);
    // Static storage so the array outlives the call;
    // returning a pointer to a plain local array would dangle.
    static ObjectDTO objects[10];
    inputFileStream.read((char*)objects, sizeof(objects));
    return objects;
}
This way I can define 10, or any other integer, as the number of objects I want to read in.
To solve the main problem, I can now calculate the approximate number of objects my memory can handle and then limit the number of reads!
Thanks!
I want to write/read data from a file. Is it possible to divide the file (inside the code) into multiple strings/sections? Or read data until a specific line?
Something like: "Read the data until line 32, put it inside a string, then read the next 32 lines and put them into another string."
I already know how to read and find data with seekp, but I don't really like it because my code always gets too long.
I already found some code, but I don't understand how it works:
dataset_t* DDS::readFile(std::string filename)
{
    dataset_t* dataset = NULL;
    std::stringstream ss;
    std::ifstream fs;
    uint8_t tmp_c;
    try
    {
        fs.open(filename.c_str(), std::ifstream::in);
        if (!fs)
        {
            std::cout << "File not found: " << filename << std::endl;
            return NULL;
        }
        // copy the whole file, byte by byte, into the stringstream
        while (fs.good())
        {
            fs.read((char*)&tmp_c, 1);
            if (fs.good()) ss.write((char*)&tmp_c, 1);
        }
        fs.close();
        dataset = new dataset_t();
        const uint32_t bufferSize = 32;
        char* buffer = new char[bufferSize];
        uint32_t count = 1;
        // pull out one line (up to 32 characters) at a time
        while (ss.good())
        {
            ss.getline(buffer, bufferSize);
            dataitem_t dataitem;
            dataitem.identifier = buffer;
            dataitem.count = count;
            dataset->push_back(dataitem);
            count++;
        }
        delete[] buffer;
        return dataset;
    }
    catch (const std::exception&)
    {
        cdelete(dataset);
        return NULL;
    }
}
The code edits a binary save file.
Or can someone link me a website where I can learn more about buffers and stringstreams?
You could create some classes to model your requirement: a take<N> for 'grab 32 lines', and a lines_from to iterate over lines.
Your lines_from class would take any std::istream: something encoded, something zipped, ... as long as it gives you a series of characters. The take<N> would convert that into array<string, N> chunks.
Here's a snippet that illustrates it:
int main() {
    auto lines = lines_from{std::cin};
    while (lines.good()) {
        auto chunk = take<3>(lines);
        std::cout << chunk[0][0] << chunk[1][0] << chunk[2][0] << std::endl;
    }
}
And here are the supporting classes and functions:
#include <iostream>
#include <array>
class lines_from {
public:
    std::istream &in;
    using value_type = std::string;
    std::string operator*() {
        std::string line;
        std::getline(in, line);
        return line;
    }
    bool good() const {
        return in.good();
    }
};
template<int N, class T>
auto take(T &range) {
    std::array<typename T::value_type, N> value;
    for (auto &e : value) { e = *range; }
    return value;
}
(demo on cpp.sh)
I am having problems trying to serialise a vector (std::vector) into a binary format and then correctly deserialise it and be able to read the data. This is my first time using a binary format (I was using ASCII, but that has become too hard to use now), so I am starting simple with just a vector of ints.
Whenever I read the data back, the vector always has the right length, but the data is either 0, undefined or random.
class Example
{
public:
std::vector<int> val;
};
WRITE:
Example example = Example();
example.val.push_back(10);
size_t size = sizeof(Example) + (sizeof(int) * example.val.size());
std::fstream file("Levels/example.sld", std::ios::out | std::ios::binary);
if (file.is_open())
{
    file.seekg(0);
    file.write((char*)&example, size);
    file.close();
}
READ:
Example example = Example();
std::ifstream::pos_type size;
std::ifstream file("Levels/example.sld", std::ios::in | std::ios::binary | std::ios::ate);
if (file.is_open())
{
    size = file.tellg();
    file.seekg(0, std::ios::beg);
    file.read((char*)&example, size);
    file.close();
}
Does anyone know what I am doing wrong, or can anyone point me in the direction I need to go?
You can't unserialise a non-POD class by overwriting an existing instance as you seem to be trying to do - you need to give the class a constructor that reads the data from the stream and constructs a new instance of the class with it.
In outline, given something like this:
class A {
public:
    A();
    A(istream &is);
    void serialise(ostream &os);
    vector<int> v;
};
then serialise() would write the length of the vector followed by the vector contents. The constructor would read the vector length, resize the vector using the length, then read the vector contents:
void A::serialise(ostream &os) {
    size_t vsize = v.size();
    os.write((char*)&vsize, sizeof(vsize));
    os.write((char*)&v[0], vsize * sizeof(int));
}

A::A(istream &is) {
    size_t vsize;
    is.read((char*)&vsize, sizeof(vsize));
    v.resize(vsize);
    is.read((char*)&v[0], vsize * sizeof(int));
}
You're using the address of the vector itself. What you need is the address of the data held by the vector. Writing, for example, would be something like:
size = example.size();
file.write((char *)&size, sizeof(size));
file.write((char *)&example[0], sizeof(example[0]) * size);
I would write in network byte order to ensure the file can be written and read on any platform. So:
#include <fstream>
#include <iostream>
#include <iomanip>
#include <vector>
#include <arpa/inet.h>

int main(void) {
    std::vector<int32_t> v = std::vector<int32_t>();
    v.push_back(111);
    v.push_back(222);
    v.push_back(333);
    {
        std::ofstream ofs;
        ofs.open("vecdmp.bin", std::ios::out | std::ios::binary);
        uint32_t sz = htonl(v.size());
        ofs.write((const char*)&sz, sizeof(uint32_t));
        for (uint32_t i = 0, end_i = v.size(); i < end_i; ++i) {
            int32_t val = htonl(v[i]);
            ofs.write((const char*)&val, sizeof(int32_t));
        }
        ofs.close();
    }
    {
        std::ifstream ifs;
        ifs.open("vecdmp.bin", std::ios::in | std::ios::binary);
        uint32_t sz = 0;
        ifs.read((char*)&sz, sizeof(uint32_t));
        sz = ntohl(sz);
        for (uint32_t i = 0; i < sz; ++i) {
            int32_t val = 0;
            ifs.read((char*)&val, sizeof(int32_t));
            val = ntohl(val);
            std::cout << i << '=' << val << '\n';
        }
    }
    return 0;
}
Read the other answers to see how you should read/write a binary structure.
I add this one because I believe your motivation for using a binary format is mistaken. A binary format won't be easier than an ASCII one; usually it's the other way around.
You have many options for saving and reading data for long-term use (ORM, databases, structured formats, configuration files, etc.). A flat binary file is usually the worst and the hardest to maintain, except for very simple structures.