Serializing structs - C++

I'm working on a homework project and I'm trying to store inventory data in a file.
The inventory data shouldn't be too large, because realistically no one is going to use it much.
I need to write these contents to a file:
• Item Description
• Quantity on Hand
• Wholesale Cost
• Retail Cost
• Date Added to Inventory
I am going to need to make an interface that allows me to do this:
• Add new records to the file
• Display any record in the file
• Change any record in the file
A struct would be the easiest way to go about this, in my opinion. If I can just figure out how to read/write structs to a file, this should be really easy.
If you could provide a small example of how to do this, I would really appreciate it.
Thanks!

Ask your teacher whether you may use the Boost library.
If yes, read the Boost.Serialization tutorial; it contains simple examples: http://www.boost.org/doc/libs/1_38_0/libs/serialization/doc/tutorial.html
But if you want to understand how to work with files, you should do this work yourself, without Boost.
If you want to work with std::[io]fstreams, you should decide what format you will support:
- text - in this case the best way is to define operator<< and operator>> and use them for writing the structure to a file and reading it back;
- binary - your structure should be POD (plain old data) and must not contain pointers - then you can use the streams' read and write methods.
An example for a binary file:
http://www.codeguru.com/forum/showthread.php?t=269648

If you don't mind going really low-level, you can just bit-copy the structs in and out by casting a pointer to the struct to void* and using sizeof() to get the struct length. (IIRC there is a way to dump/read a void buffer to/from a file.)
Note this ONLY works if the data has no pointers/references/etc.
I like C's I/O better than C++'s, so:
// needs: #include <fcntl.h> and <unistd.h>
typedef struct { int hi; int mon; char dat[35]; } S;
S s;
S arr[22];
int f;
// write
f = open("data.bin", O_WRONLY | O_CREAT | O_TRUNC, 0644); // any filename works
// one
if(sizeof(s) != write(f, &s, sizeof(s))) Error();
// many
if(sizeof(arr) != write(f, arr, sizeof(arr))) Error();
close(f);
// read
f = open("data.bin", O_RDONLY);
// one
if(sizeof(s) != read(f, &s, sizeof(s))) Error();
// many
if(sizeof(arr) != read(f, arr, sizeof(arr))) Error();
close(f);

The IOStream library does it.
The ofstream class provides the interface to write data to files as output streams.
The ifstream class provides the interface to read data from files as input streams.
Edit- Example

I would go with XML; it's structured, and it's text-based, so you can look at it with any text editor.

Related

How to write the content of an object into a file in C++

I have a code in this format:
srcSAXController control(input_filename.c_str());
std::string output_filename = input_filename;
output_filename = "c-" + output_filename.erase(input_filename.rfind(XML_STR));
std::ofstream myfile(output_filename.c_str());
coverage_handler handler(i == MAIN_POS ? true : false, output_filename);
control.parse(&handler);
myfile.write((char *)&control, sizeof(control));
myfile.close();
I want the content of the object 'control' to be written into my file. How can I fix the code above so that the content of the control object is written to the file?
In general you need much more than just writing the bytes of the object to be able to save and reload it.
The problem is named "serialization" and depending on a lot of factors there are several strategies.
For example, it's important to know whether you need to save and reload the object on the same system or possibly on a different system; it's also fundamental to know whether the object contains links to other objects, whether the link graph is a simple tree or possibly contains loops, whether you need to support versioning, etc.
Writing the bytes to disk the way the code is doing is not going to work even for something as simple as an object containing a std::string.

Pass binary string/file content from C++ to Node.js

I'm trying to pass the content of a binary file from C++ to Node using the node-gyp library. I have a process that creates a binary file in the .fit format, and I need to pass the content of the file to JS to process it. My first approach was to extract the content of the file into a string and try to pass it to Node like this:
char c;
std::string content = "";
while (file.get(c)) {
    content += c;
}
I'm using the following code to pass it to Node
v8::Local<v8::ArrayBuffer> ab = v8::ArrayBuffer::New(args.GetIsolate(), (void*)content.data(), content.size());
args.GetReturnValue().Set(ab);
In Node I get an ArrayBuffer, but when I print the content to a file it is different from what a C++ cout shows.
How can I pass the binary data successfully?
Thanks.
Probably the best approach is to write your data to a binary disk file. Write to disk in C++; read from disk in NodeJS.
Very importantly, make sure you specify BINARY MODE.
For example:
myFile.open ("data2.bin", ios::out | ios::binary);
Do not use "strings" (at least not unless you want to uuencode). Use buffers. Here is a good example:
How to read binary files byte by byte in Node.js
var fs = require('fs');
fs.open('file.txt', 'r', function(status, fd) {
    if (status) {
        console.log(status.message);
        return;
    }
    var buffer = Buffer.alloc(100); // new Buffer(n) is deprecated
    fs.read(fd, buffer, 0, 100, 0, function(err, num) {
        ...
    });
});
You might also find these links helpful:
https://nodejs.org/api/buffer.html
<= Has good examples for specific Node APIs
http://blog.paracode.com/2013/04/24/parsing-binary-data-with-node-dot-js/
<= Good discussion of some of the issues you might face, including "endianness" and "interpreting numbers"
ADDENDUM:
The OP clarified that he's considering using C++ as a Node.js add-on (not a standalone C++ program).
Consequently, using buffers is definitely an option. Here is a good tutorial:
https://community.risingstack.com/using-buffers-node-js-c-plus-plus/
If you choose to go this route, I would DEFINITELY download the example code and play with it first, before implementing buffers in your own application.
It depends, but you could, for example, use Redis:
Values can be strings (including binary data) of every kind, for
instance you can store a jpeg image inside a value. A value can't be
bigger than 512 MB.
If the file is bigger than 512 MB, you can store it in chunks.
But I wouldn't suggest that, since Redis is an in-memory data store.
It's easy to implement in both C++ and Node.js.

C++ Read specific parts of a file with start and endpoint

I am serializing multiple objects and want to save the given Strings to a file. The structure is the following:
A few string and long attributes, then a variable number of map<long, map<string, variant> >. My first idea was creating one valid JSON file, but this is very hard to do (all of the maps are very big and my temporary memory is not big enough). Since I can't serialize everything together, I have to do it piece by piece. I am planning on doing that, and I then want to save the received strings to a file. Here is how it will look:
{ "Name": "StackOverflow"}
{"map1": //map here}
{"map2": //map here}
As you can see, this is not one valid JSON object but three valid JSON objects in one file. Now I want to deserialize, and I need to give a valid JSON object to the deserializer. I already save tellp() every time I write a new JSON object to the file, so in this example I would have the following addresses saved: 26, endofmap1, endofmap2.
Here is what I want to do: I want to use these addresses to extract the strings from the file I wrote to. I need one string from 0 to (26-1), one string from 26 to (endofmap1-1), and one string from endofmap1 to (endofmap2-1). Since these strings would be valid JSON objects, I could deserialize them without problem.
How can I do this?
I would create a serialize and deserialize class that you can use as part of a hierarchy.
So for instance, in rough C++ pseudo-code:
class Object : public serialize, public deserialize {
public:
    int a;
    float b;
    Compound c;
    bool serialize(fstream& fs) {
        fs << a;
        fs << b;
        c.serialize(fs);
        fs.flush();
        return true;
    }
    // same for deserialize
};
class Compound : public serialize, public deserialize {
public:
    map<> things;
    bool serialize(fstream& fs) {
        for (auto& thing : things) {
            fs << thing;
        }
        fs.flush();
        return true;
    }
};
With this approach you can use JSON, as the file will be written as you walk the hierarchy.
Update:
To extract a specific string from a file you can use something like this:
// pass in an open stream (streams are good for unit testing!)
std::string extractString(std::fstream& fs) {
    int location = /* the offset of the start within the file */;
    int length = /* the length of the string you want to extract */;
    std::string str;
    str.resize(length);
    fs.seekg(location);        // seekg, not seekp: we are reading
    fs.read(&str[0], length);  // &str[0] is a writable char buffer
    return str;
}
Based on you saying "my temporary memory is not big enough", I'm going to assume two possibilities (though some kind of code example may help us help you!).
possibility one, the file is too big
The issue you would be facing here isn't a new one: a file too large for memory. This assumes your algorithm isn't buffering all the data, and that your stack can handle the recursion, of course.
On windows you can use the MapViewOfFile function, the MSDN has plenty of detail on that. This function will effectively grab a "view" of a section of a file - allowing you to load enough of the file to modify only what you need, before closing and opening a view at a later offset.
If you are on a different platform, there will be similar functions.
possibility two, you are doing too much at once
The other option is more of a "software engineering" issue: you have so much data that, when holding it in your std::maps, you run out of heap memory.
If this is the case, you are going to need to use some clever thinking - here are some ideas!
Don't load all your data into the maps. Wherever the data is coming from, take a CRC, index, or filename of the data source. Store that information in the map, and leave the actual "big strings" on the hard disk. This way you can load each item of data when you need it.
This works really well for data that needs to be sorted or correlated.
Process or load your data when you need to write it. If you don't need to sort or correlate the data, why load it into a map beforehand at all? Just load each "big string" in sequence, then write it to the file with an ofstream.

How do I use QuaZip to extract multiple files?

I have the code below to move through a list of the folders and files in a zip archive, creating them as it goes (also creating paths for files if they don't exist yet).
The application crashes when I use readData(char*, qint64) to extract an internal file's data and stream it into a QFile. I don't think this is the right thing to use, but it's all I've seen (in a very loose example on this site), and I also had to change QuaZipFile.h to make the function public so I could use it (also hinting I shouldn't be using it).
It doesn't crash on the first file, which has no contents, but does after that. Here is the necessary code (ask if you need to see more):
QFile newFile(fNames);
newFile.open(QIODevice::WriteOnly);
QTextStream outToFile(&newFile);
char * data;
int len = file.readData(data, 100000000);
if (len > 0) {
    outToFile << data;
}
newFile.close();
It doesn't get past the int len line. What should I be using here?
Note that the variable file is defined earlier, pretty much like this:
QuaZip zip("zip.zip");
QuaZipFile file(&zip);
...
zip.goToFirstFile();
...
zip.goToNextFile();
And the int passed to readData is a random number for the max data size.
The reason for the crash is that you have not allocated any memory for your buffer, named data.
Solved.
I tried using different reads (readData, read, readLine) and found that this line works with no need for a data buffer:
outToFile << file.readAll();

Parse config file in C/C++

I'm a newbie looking for a fast and easy way to parse a text file in C or C++ (wxWidgets).
The file will look something like this (a main category with "sub-objects"), which will appear in a list box:
[CategoryA]
[SubCat]
Str1 = Test
Str2 = Description
[SubCat] [End]
[SubCat]
Str1 = Othertest
...
[CategoryA] [End]
Any suggestions?
Sounds like you want to parse a file that's pretty close to an INI file.
There are at least a few INI parser libraries out there: minIni, iniParser, and libini, for instance.
It should be fairly easy to write your own parser for this if you use streams. You can read a file using an std::ifstream:
std::ifstream ifs("filename.ext");
if(!ifs.good()) throw my_exceptions("cannot open file");
read_file(ifs);
Since it seems line-oriented, you would then first read lines, and then process these:
void read_file(std::istream& is)
{
for(;;) {
std::string line;
std::getline(is, line);
if(!is) break;
std::istringstream iss(line);
// read from iss
}
if(!is.eof()) throw my_exceptions("error reading file");
}
For the actual parsing, you could first peek at the first character. If it's a [, pop it from the stream and use std::getline(is, identifier, ']') to read whatever is between '[' and ']'. If it isn't a [, use std::getline(is, key, '=') to read the left side of a key-value pair, and then std::getline(is, value) to read the right side.
Note: Stream input, unfortunately, is usually not exactly lightning fast. (This doesn't have to be that way, but in practice this often is.) However, it is really easy to do and it is fairly easy to do it right, once you know a very few patterns to work with its peculiarities (like if(strm.good()) not being the same as if(strm) and not being the opposite of if(strm.bad()) and a few other things you'll have to get used to). For something as performance-critical (har har!) as reading an ini file from disk, it should be fast enough in 999,999 out of 1,000,000 cases.
You may want to try Boost.Program_Options. However, it has slightly different formatting, closer to INI files. Subcategories are done like this:
[CategoryA]
Option = Data
[CategoryB.Subcategory1]
Option = Data
[CategoryB.Subcategory2]
Option = Data
It also has some other features, so it is actually very useful, in my opinion.
Try Configurator. It's an easy-to-use and flexible C++ library for configuration file parsing (from simple INI to complex files with arbitrary nesting and semantic checking). It is header-only and cross-platform, and uses the Boost C++ libraries.
See: http://opensource.dshevchenko.biz/configurator
It looks more straightforward to implement your own parser than to try to adapt an existing one you are unfamiliar with.
Your structure seems - from your example - to be line-based. This makes parsing it easy.
It generally makes sense to load your file into a tree, and then walk around it as necessary.
On Windows only, GetPrivateProfileSection does this. It's deprecated in favor of the registry but it's still here and it still works.
How about trying to make a simple XML file? There are plenty of libraries that can help you read it, and the added bonus is that a lot of other programs/languages can read it too.
If you're using wxWidgets I would consider wxFileConfig. I'm not using wxWidgets, but the class seems to support categories with sub-categories.
If you are using GTK, you are in luck:
you can use the GLib KeyFile functions save_to_file and load_from_file.
https://docs.gtk.org/glib/struct.KeyFile.html
Or when using Gtkmm (C++).
See: https://developer-old.gnome.org/glibmm/stable/classGlib_1_1KeyFile.html
Example in C++ with load_from_file:
#include <glibmm.h>
#include <string>
Glib::KeyFile keyfile;
keyfile.load_from_file(file_path);
std::string path = keyfile.get_string("General", "Path");
bool is_enabled = keyfile.get_boolean("General", "IsEnabled");
Saving is as easy as calling save_to_file:
Glib::KeyFile keyfile;
keyfile.set_string("General", "Path", path);
keyfile.set_boolean("General", "IsEnabled", is_enabled);
keyfile.save_to_file(file_path);