Writing and reading a binary file to fill a vector - C++

I'm working on a project that involves binary files.
I started researching binary files, but I'm still confused about how to write a binary file and then read it back to fill a vector.
Here's the code for writing:
void binario(){
    ofstream fout("./Binario/Data.AFe", ios::out | ios::binary);
    vector<int> enteros;
    enteros.push_back(1);
    enteros.push_back(2);
    enteros.push_back(3);
    enteros.push_back(4);
    enteros.push_back(5);
    //fout.open()
    //if (fout.is_open()) {
    std::cout << "Entre al if" << '\n';
    //while (!fout.eof()) {
    std::cout << "Entre al while" << '\n';
    std::cout << "Enteros size: " << enteros.size() << '\n';
    int size1 = enteros.size();
    for (int i = 0; i < enteros.size(); i++) {
        std::cout << "for " << i << '\n';
        fout.write((char*)&size1, 4);
        fout.write((char*)&enteros[i], size1 * sizeof(enteros));
        //cout<< fout.get(entero[i])<<endl;
    }
    //fout.close();
    //}
    fout.close();
    cout << "copiado con exito" << endl;
    //}
}
Here's the code for reading:
void leerBinario(){
    vector<int> list2;
    ifstream is("./Binario/Data.AFe", ios::binary);
    int size2;
    is.read((char*)&size2, 4);
    list2.resize(size2);
    is.read((char*)&list2[0], size2 * sizeof(list2));
    std::cout << "Size del vector: " << list2.size() << endl;
    for (int i = 0; i < list2.size(); i++) {
        std::cout << i << ". " << list2[i] << '\n';
    }
    std::cout << "Antes de cerrar" << '\n';
    is.close();
}
I don't know if I'm writing to the file correctly. This is just a test so I don't mess up my main file; instead of numbers, I actually need to save objects that are stored in a vector and load them every time the user runs the program.

Nope, you're a bit confused. You're writing the size in every iteration, and then you're doing something completely undefined when you try to write the value. You can actually do this without a loop at all, since you are using a vector.
fout.write(reinterpret_cast<const char*>(&size1), sizeof(size1));
fout.write(reinterpret_cast<const char*>(enteros.data()), size1 * sizeof(int));
And reading in:
is.read(reinterpret_cast<char*>(list2.data()), size2 * sizeof(int));
To be more portable you might want to use data types whose size won't change (for example when you switch from a 32-bit to a 64-bit build). In that case, use the fixed-width types from <cstdint> -- e.g. int32_t for both the size and the value data.
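For example, a minimal sketch of the whole round trip along those lines could look like this (the file path is yours; the function names and everything else are just illustrative):
#include <cstdint>
#include <fstream>
#include <iostream>
#include <vector>

void escribirBinario() {
    std::vector<int32_t> enteros = {1, 2, 3, 4, 5};
    std::ofstream fout("./Binario/Data.AFe", std::ios::binary);
    int32_t size1 = static_cast<int32_t>(enteros.size());
    // write the size once, then the whole buffer in one call
    fout.write(reinterpret_cast<const char*>(&size1), sizeof(size1));
    fout.write(reinterpret_cast<const char*>(enteros.data()), size1 * sizeof(int32_t));
}

void leerBinarioDeNuevo() {
    std::ifstream is("./Binario/Data.AFe", std::ios::binary);
    int32_t size2 = 0;
    is.read(reinterpret_cast<char*>(&size2), sizeof(size2));
    std::vector<int32_t> list2(size2);
    is.read(reinterpret_cast<char*>(list2.data()), size2 * sizeof(int32_t));
    for (std::size_t i = 0; i < list2.size(); ++i)
        std::cout << i << ". " << list2[i] << '\n';
}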

Related

std::fstream read block of data from file and write data back to file until EOF

I'm reading blocks of data from a file, but not all at once (e.g. 3 bytes per read/write), writing the same 3 bytes back to the very same position in the file, and looping until there are no more blocks to read.
In other words, I'm trying to rewrite the file with its very own contents.
However, the final output isn't the same as the original contents.
The following sample code reads 3 bytes per iteration from a file "sample.txt", whose contents are simply:
0123456789
After reading the data and writing it back, the contents are:
012345345345
As you can see, the data doesn't get rewritten correctly for some reason.
#include <cstring>   // for memset
#include <fstream>
#include <iostream>
using namespace std;

#define BLOCK_SIZE 3

int main()
{
    // open file
    fstream file;
    file.open("sample.txt", ios::binary | ios::out | ios::in);
    // determine size and number of blocks to read
    file.seekg(0, ios::end);
    streampos size = file.tellg();
    int blocks = size / BLOCK_SIZE;
    cout << "size:\t" << size << endl;
    if (size % BLOCK_SIZE != 0)
    {
        ++blocks;
    }
    cout << "blocks:\t" << blocks << endl;
    // return to beginning
    file.seekg(ios::beg);
    // we will read data here
    unsigned char* data = new unsigned char[BLOCK_SIZE];
    streampos pos;
    // read blocks of data and write data back
    for (int i = 0; i < blocks; ++i)
    {
        pos = file.tellg();
        cout << "before read:\t" << pos << endl;
        // read block
        file.read(reinterpret_cast<char*>(data), BLOCK_SIZE);
        cout << "after read:\t" << file.tellg() << endl;
        // write same block back to same position
        file.seekp(pos);
        cout << "before write:\t" << file.tellg() << endl;
        file.write(reinterpret_cast<char*>(data), BLOCK_SIZE);
        cout << "after write:\t" << file.tellg() << endl;
        // reset buffer
        memset(data, 0, BLOCK_SIZE);
    }
    file.close();
    delete[] data;
    cin.get();
    return 0;
}
Do you see what could be the reason for the bad overwrite?
EDIT:
Sorry, I can't see how the linked duplicate answers my question; I'm simply unable to apply the given answer to the code above.
Your code does not handle the EOF condition well, and leaves the stream in a bad state after trying to read past the end of the file. On my system, this results in all further calls to the stream having no effect. I bet that isn't the case on your system (which I suspect is a bug in its iostream implementation). I re-did your code to handle the EOF condition correctly, and also to be a lot cleaner in a few other ways:
#include <fstream>
#include <iostream>
using namespace std;

const int BLOCK_SIZE = 3;

int main()
{
    // open file
    fstream file;
    file.open("sample.txt", ios::binary | ios::out | ios::in);
    // we will read data here
    bool found_eof = false;
    // read blocks of data and write data back
    while (!found_eof)
    {
        unsigned char data[BLOCK_SIZE] = {0};
        char * const data_as_char = reinterpret_cast<char *>(data);
        streampos const pos = file.tellp();
        int count_to_write = BLOCK_SIZE;
        cout << "before read:\t" << file.tellg() << ' ' << pos << '\n';
        // read block
        if (!file.read(data_as_char, BLOCK_SIZE)) {
            found_eof = true;
            count_to_write = file.gcount();
            file.clear();
            cout << "Only " << count_to_write << " characters extracted.\n";
        }
        cout << "after read:\t" << file.tellg() << ' ' << file.tellp() << '\n';
        // write same block back to same position
        file.seekp(pos);
        cout << "before write:\t" << file.tellg() << ' ' << file.tellp() << '\n';
        file.write(data_as_char, count_to_write);
        cout << "after write:\t" << file.tellg() << ' ' << file.tellp() << '\n';
        file.seekp(file.tellp());
    }
    file.close();
    cin.get();
    return 0;
}
But, this is not fundamentally different. Both versions work for me just the same. I'm on Linux with g++.
From the linked possible dupe, I would also suggest adding this just before the closing } of your for loop:
file.seekp(file.tellp());
I've put that in my code in the appropriate place.
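For reference, here is a minimal sketch (not your code, just an illustration under the same assumptions: the file is "sample.txt" and the block size is 3) of the read-modify-write pattern with that repositioning applied after every write:
#include <fstream>
#include <iostream>

int main()
{
    const int BLOCK_SIZE = 3;
    std::fstream file("sample.txt", std::ios::binary | std::ios::in | std::ios::out);
    if (!file) {
        std::cerr << "could not open sample.txt\n";
        return 1;
    }
    char block[BLOCK_SIZE];
    while (true) {
        std::streampos pos = file.tellg();
        file.read(block, BLOCK_SIZE);      // may extract fewer bytes at the end
        std::streamsize got = file.gcount();
        if (got == 0)
            break;                         // nothing left to read
        file.clear();                      // clear eofbit/failbit from a short read
        file.seekp(pos);                   // reposition before switching to writing
        file.write(block, got);
        file.seekp(file.tellp());          // reposition again before the next read
    }
    return 0;
}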

Iteratively accessing and writing to multiple files using ofstream

I have a pair of header files. Within IsingModel.h, I publicly declare:
ofstream logfile1;
ofstream logfile2;
Then, to open the relevant files (logfile1 and logfile2 have different names), I use:
do {
    name2.str(""); //reset name stringstream
    n++;           //increase n value
    name2 << "output_" << gridSize << "_" << seed << "_" << n << "_eqmCalc.txt"; //stream created
} while (if_exist(name2.str())); //test if file already exists
logfile2.open(name2.str());
Which works in creating the file. Then, throughout the code I use the ofstreams to act on the files, for example:
logfile1 << counter << " " << calcM() << " " << calcE() << endl;
This is fine for actions that are independent for each file; however, when I call the destructor I want to write the same standard information to each file. To that end, I am experimenting with writing to the files iteratively, and it does not seem to work:
void IsingSystem::test() {
    for (int i = 1; i = 2; i++) {
        if (ofstream("logfile" + to_string(i)).is_open); {
            ofstream("logfile" + to_string(i)) << "success" << endl;
        }
    }
}
This instead creates files called logfile1 and logfile2. As an alternative, I tried to create an array of ofstreams:
void createFileHandles() {
    const int count = 2;
    std::ofstream logfile[count];
}
But, I could not work out how to pass this between functions properly.
What is the proper way of handling ofstreams so that I can have multiple files open, writing different instructions to them simultaneously but also have some actions that happen to both?
You can have a vector of ofstream:
vector<ofstream> ofstreams(2);
//fill vec
for (int i = 0; i < 2; i++)
{
    if (ofstreams[i].is_open())
    {
        ofstreams[i] << "success" << endl;
    }
}
You can then pass ofstreams to functions.
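A minimal, self-contained sketch of that idea (the file names and the writeToAll helper are made up for illustration; requires C++11, since streams in a vector must be movable):
#include <fstream>
#include <string>
#include <vector>

// write one line to every open stream in the vector
void writeToAll(std::vector<std::ofstream>& streams, const std::string& line) {
    for (auto& out : streams)
        if (out.is_open())
            out << line << '\n';
}

int main() {
    std::vector<std::ofstream> ofstreams(2);
    for (int i = 0; i < 2; ++i)
        ofstreams[i].open("logfile" + std::to_string(i + 1) + ".txt");

    ofstreams[0] << "only in logfile1\n";    // per-file output still works
    writeToAll(ofstreams, "written to both");
    return 0;
}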

Dynamic array of structs

I have the following piece of code:
typedef struct reader
{
    char name[50];
    char card_num[50];
    char title[100];
} reader_t;

int main()
{
    vector<reader> vec;
    ifstream input_file("D:\\lab.txt", ios::binary);
    reader_t master[1];
    input_file.read((char*)&master, sizeof(master));
    for (size_t idx = 0; idx < 1; idx++)
    {
        reader temp;
        strcpy(temp.name, master[idx].name);
        strcpy(temp.card_num, master[idx].card_num);
        strcpy(temp.title, master[idx].title);
        vec.push_back(temp);
        cout << "Name: " << master[idx].name << endl;
        cout << "Card num: " << master[idx].card_num << endl;
        cout << "Title: " << master[idx].title << endl;
    }
    cout << vec.size();
    getchar();
}
What it does: it reads structures from a binary file into an array of structures, copies them into a vector, and displays each structure. And yes, I do need to do it like this - I need to store the structures from the file in a vector, and this is the only working way I could find (if you can tell me how to read the structures into a vector directly from the file, you're welcome to).
So, everything works fine, but the problem is that I need to create a function that can do the same, but with a dynamic array. I wrote something like this:
void read_structs(int vec_size)
{
    ifstream input_file("D:\\lab.txt", ios::binary);
    //Here I commented 2 ways how I tried to create a dynamic array of structs
    //reader* master = new reader[vec_size];
    //reader* master = (reader*)malloc(sizeof(reader) * vec_size);
    input_file.read((char*)&master, sizeof(master));
    for (size_t idx = 0; idx < vec_size; idx++)
    {
        reader temp;
        strcpy(temp.name, master[idx].name);
        strcpy(temp.card_num, master[idx].card_num);
        strcpy(temp.title, master[idx].title);
        vec.push_back(temp);
        cout << "Name: " << master[idx].name << endl;
        cout << "Card num: " << master[idx].card_num << endl;
        cout << "Title: " << master[idx].title << endl;
    }
}
And that seemed fine too, until I tried to run it. VS wasn't highlighting any error in my code; it just threw an exception the moment the program tried to access master[0].name.
There is absolutely no point in the temp struct. See, the
vec.push_back(temp);
is already using the copy constructor, so the copy constructor must work, and the set of strcpy calls is not doing anything different from that, so just go with
vec.push_back(master[0]);
You can't read into the vector directly; you do need to read into a temporary, so that part is correct. But I suppose you want to read all entries from the file, no matter how many there are, so you also need to put the read itself inside the loop.
There is not much point in creating an array of one element.
reader_t master[1];
input_file.read((char*)master, sizeof(master));
// ^ you *don't* need & here, arrays decay to pointers automatically
and
reader_t master;
input_file.read((char *)&master, sizeof(master));
// ^ but you do need & here.
are equivalent. I would go with the latter.
So we are basically down to:
reader temp; // calling it temp; the master name makes no sense.
while (input_file.read((char*)&temp, sizeof(temp)))
// read returns input_file, and input_file converts to false if the last operation failed
{
    vec.push_back(temp);
    // verify the stored values by reading them back from vec.back()
    cout << "Name: " << vec.back().name << endl;
    cout << "Card num: " << vec.back().card_num << endl;
    cout << "Title: " << vec.back().title << endl;
}
In the second example, you didn't initialize master, so it obviously crashed.
There is a more C++ approach though. First, you define a read operator for the structure:
std::istream &operator>>(std::istream &in, reader &r) {
    return in.read((char *)&r, sizeof(r));
}
and then you simply read the vector using the istream_iterator:
vec.assign(std::istream_iterator<reader>(input_file),
           std::istream_iterator<reader>());
and the standard library will generate the above loop for you.
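Putting it together, a self-contained sketch might look like this (the struct and the path come from your question; the printing at the end is only there to show the records were read):
#include <fstream>
#include <iostream>
#include <iterator>
#include <vector>

struct reader {
    char name[50];
    char card_num[50];
    char title[100];
};

// raw, fixed-size binary read of one record
std::istream &operator>>(std::istream &in, reader &r) {
    return in.read(reinterpret_cast<char *>(&r), sizeof(r));
}

int main() {
    std::ifstream input_file("D:\\lab.txt", std::ios::binary);
    std::vector<reader> vec;
    vec.assign(std::istream_iterator<reader>(input_file),
               std::istream_iterator<reader>());
    std::cout << vec.size() << " records read\n";
    for (const reader &r : vec)
        std::cout << "Name: " << r.name << ", Card num: " << r.card_num
                  << ", Title: " << r.title << '\n';
}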

vector of structs with weird behavior c++

I have a problem with an assignment I'm working on. I have to read a .ts file, read the packets that are inside, and extract the header information from each packet.
I have created a struct Packet that holds all the info of the header, and I also have a vector into which I push_back each Packet.
The problem is that the for loop stops, for some reason, on the 163rd iteration. If I only loop until, say, i = 160, the loop finishes, but when I print vector.size() I get a really huge number that doesn't make sense; it should simply be an integer equal to the number of Packets pushed back. Here is the code that I have so far:
int main() {
    FILE *ts_file = NULL;
    ts_file = fopen64("/home/ddd/Desktop/Assignment/Streams/ddd.ts", "rb");
    if (ts_file == NULL){
        cout << "No file detected on this path, try again" << endl; // prints !!!Hello World!!!
    }
    TS_Analyzer *ts_analyzer;
    ts_analyzer->parse_file(ts_file);
    cout << "Finished main" << endl;
    return 0;
}

void TS_Analyzer::parse_file(FILE *ts_file){
    cout << "Inside parser" << endl;
    fseek(ts_file, 0, SEEK_END);
    long file_size = ftell(ts_file);
    rewind(ts_file);
    number_of_packets = file_size / PACKET_SIZE;
    unsigned int current_header_add = 0;
    unsigned int i = 0;
    for (unsigned int j = 1; i < number_of_packets; j++)
    {
        i++;
        unsigned char TS_raw_header[4];
        cout << "current position " << int(current_header_add) << endl;
        current_header_add = ftell(ts_file);
        fread(&TS_raw_header, sizeof(TS_raw_header), 1, ts_file);
        Packet current_packet;
        current_packet.sync_byte = TS_raw_header[0];
        current_packet.transport_error_indicator = (TS_raw_header[1] & 0x80) >> 7;
        current_packet.payload_start_indicator = (TS_raw_header[1] & 0x40) >> 6;
        current_packet.transport_priority = (TS_raw_header[1] & 0x20) >> 5;
        current_packet.PID = ((TS_raw_header[1] & 31) << 8) | TS_raw_header[2];
        current_packet.transport_scrambling_control = (TS_raw_header[3] & 0xC0);
        current_packet.adaption_field_control = (TS_raw_header[3] & 0x30) >> 4;
        current_packet.continuity_counter = (TS_raw_header[3] & 0xF);
        stream_packets.push_back(current_packet);
        //cout << hex << int(current_packet.PID) << endl;
        //cout << dec << "continuity counter " << int(current_packet.continuity_counter) << endl;
        cout << " i " << int(i) << endl;
        fseek(ts_file, 184, SEEK_CUR);
    }
    cout << "##" << endl;
    cout << stream_packets.size() << endl;
}

class TS_Analyzer: public Analyzer {
public:
    TS_Analyzer();
    ~TS_Analyzer();

    struct Packet {
        unsigned char sync_byte;
        unsigned char transport_error_indicator;
        unsigned char payload_start_indicator;
        unsigned char transport_priority;
        unsigned int PID;
        unsigned char transport_scrambling_control;
        unsigned char adaption_field_control;
        unsigned char continuity_counter;
    };

    std::vector<Packet> stream_packets;
    int number_of_packets = 0;

    void parse_file(FILE *);
};
Any ideas why the vector push_back breaks the for loop and why I cannot get a correct vector size?
If I put this code through the clang compiler, I get an error on the following code:
TS_Analyzer *ts_analyzer;
ts_analyzer->parse_file(ts_file);
>> variable 'ts_analyzer' is uninitialized when used here
I guess you are encountering undefined behavior: since ts_analyzer is a pointer holding a random value, the data in its members is also very random.
I'm actually surprised that this code runs at all without crashing, though you can always be lucky.
If you'd like to fix this, try avoiding pointers by creating the object on the stack:
TS_Analyzer ts_analyzer;
ts_analyzer.parse_file(ts_file);
or, if you really need dynamically allocated memory, at least initialize the pointer:
auto ts_analyzer = std::make_unique<TS_Analyzer>();
ts_analyzer->parse_file(ts_file);
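For example, the corrected main() could look like this (a sketch only; it assumes the TS_Analyzer class from your question and your original file path, and uses std::fopen instead of fopen64 for portability):
#include <cstdio>
#include <iostream>

int main() {
    FILE *ts_file = std::fopen("/home/ddd/Desktop/Assignment/Streams/ddd.ts", "rb");
    if (ts_file == NULL) {
        std::cout << "No file detected on this path, try again" << std::endl;
        return 1;                      // bail out instead of parsing a null FILE*
    }
    TS_Analyzer ts_analyzer;           // fully constructed object on the stack
    ts_analyzer.parse_file(ts_file);
    std::fclose(ts_file);
    std::cout << "Finished main" << std::endl;
    return 0;
}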

How to access images in sequence using a loop in OpenCV

I ran into a problem with accessing images in sequential order. I have images whose names contain an incrementing number, i.e. cube_0.jpg, cube_1.jpg, and so on. Now I want to access each image one by one and show it.
Below is the code that I have been playing with for two days; I don't know how to handle this situation or what is wrong with it.
ostringstream s;
for (int fileNumber = 0; fileNumber <= 40; fileNumber++)
{
    s << "\"cube_" << fileNumber << "\.jpg\"" << endl;
    string fullfileName(s.str());
    images[i] = fullfileName;
}
stringstream ss;
cout << "file name" << images[0] << endl;
for (int file = 0; file < 41; file++)
{
    string str = images[file];
    cout << "str " << str << endl;
    img_raw = imread(ss.str(), 1); // load as color image Error
    cout << "Done" << endl << "size" << img_raw.size();
    system("pause");
}
This code runs fine until it reaches "img_raw = imread(ss.str())"; this line is basically preventing me from accessing the file. Since imread requires a "string& filename", I used a stringstream, but nothing is working!
Any help would be greatly appreciated!
There are a few errors.
Your stringstream ss is empty. You declared it but did not fill it with any values. I am pretty sure you meant imread(str, 1); instead of imread(ss.str(), 1);
In the first for loop, you are continuously appending filenames to the same ostringstream, so it goes like this:
0: "cube_0.jpg\"
1: "cube_0.jpg\""cube_1.jpg\"
2: "cube_0.jpg\""cube_1.jpg\""cube_2.jpg\"
...
so the ostringstream just grows and grows. The stringstream needs to be declared inside the loop so that it is cleared on every iteration.
Edited code:
string images[41];
Mat img_raw;

for (int fileNumber = 0; fileNumber < 41; fileNumber++)
{
    stringstream ss;
    ss << "cube_" << fileNumber << ".jpg";
    string fullfileName;
    ss >> fullfileName;
    images[fileNumber] = fullfileName;
}

for (int file = 0; file < 41; file++)
{
    cout << "Loading " << images[file] << endl;
    img_raw = imread(images[file], 1);
    if (!img_raw.empty())
    {
        cout << "Successfully loaded " << images[file] << " with size " << img_raw.size() << endl;
    }
    else
    {
        cout << "Error loading file " << images[file] << endl;
    }
    system("pause");
}