I'm trying to save binary data from a file, metadata.mfs, into a Postgres database, then read that binary data back out of the database and write it to a file. For the purpose of illustration, let's give the output file the same name, metadata.mfs, but put it in a different directory. When I run md5sum metadata.mfs on both files I expect to see the same hash. (Essentially I want whatever is saved into the database from file 1 to be exactly the same as what I extract from the database into file 2.)
Currently I am not able to achieve that.
Below is what I have so far:
string readFile2(const string &fileName)
{
ifstream ifs(fileName.c_str(), ios::in | ios::binary | ios::ate);
ifs.seekg(0, ios::end);
ifstream::pos_type fileSize = ifs.tellg();
ifs.seekg(0, ios::beg);
vector<char> bytes(fileSize);
ifs.read(bytes.data(), fileSize);
cout.write(bytes.data(),bytes.size());
cout << "\n";
cout << fileSize;
cout << "\n";
// return bytes.data();
return string(bytes.data(), fileSize);
}
int main() {
string content;
string test = "h";
char test1 = 'C';
try {
cout << "A1 \n";;
content = readFile2("/var/opt/lizardfs/lib/lizardfs/metadata.mfs");
pqxx::connection c("postgresql://mark#localhost:26257/metadata");
pqxx::nontransaction w(c);
w.exec("CREATE TABLE IF NOT EXISTS binary (id INT PRIMARY KEY, meta bytea)");
w.exec("INSERT INTO binary (id,meta) VALUES (18, '"+w.esc_raw(content)+"')");
pqxx::result r = w.exec("SELECT meta FROM binary WHERE id='18'");
std::ofstream outfile("metadata.mfs");
for (auto row: r) {
cout << row[0] << endl;
outfile << row[0] << endl;
}
outfile.close();
w.commit();
} catch (const std::exception &e) {
cerr << e.what() << endl;
return 1;
}
return 0;
}
The problem is that cout.write(bytes.data(), bytes.size()); prints exactly what I would see in the Linux terminal if I ran cat metadata.mfs, but cout << row[0] << endl shows everything in hex, i.e. \x4c495a4d20322e39000021....
I suspect this is because I am using w.esc_raw() on the binary content before inserting it into Postgres. Does that mean I need to unescape it, e.g. using w.unesc_raw(), after extracting the binary data from the database? But how would I do that? I've been looking at the docs here: https://libpqxx.readthedocs.io/en/6.1.1/a00225.html
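In case it helps, here is a minimal sketch of one way to read the bytea column back as raw bytes rather than as hex-escaped text, assuming libpqxx 6.x, where pqxx::binarystring can be constructed from a result field (an alternative to calling unesc_raw directly). It reuses the table, id, and file names from the code above and would slot into the same try block:

pqxx::result r = w.exec("SELECT meta FROM binary WHERE id=18");
std::ofstream outfile("metadata.mfs", std::ios::out | std::ios::binary);
for (const auto &row : r) {
    pqxx::binarystring blob(row[0]);       // undoes the bytea escaping done by the server
    const std::string bytes = blob.str();  // the raw bytes as a std::string
    outfile.write(bytes.data(), static_cast<std::streamsize>(bytes.size()));
}

Writing with ios::binary and write() (rather than operator<< plus endl) keeps the output byte-for-byte identical to what was stored, which is what the md5sum comparison needs.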
I have a scenario where I store five different date-and-time stamps in a text file, then retrieve them and map them to different variables for processing.
For example, I have the following data which needs to be written to the file:
2018-07-16 12:32:12
2018-07-16 12:31:17
2018-07-16 12:30:45
In my application I need to retrieve them from the file and map them to three different variables for processing, like below:
std::string var1 = "2018-07-16 12:32:12";
std::string var2 = "2018-07-16 12:31:17";
std::string var3 = "2018-07-16 12:30:45";
I am able to read and write a single line using the code below:
void readFromFile(std::string& var)
{
std::fstream file(fileName_str, std::fstream::in | std::fstream::out |
std::fstream::app );
if( ! file ) {
cout << "Unable to open file:" << fileName_str << ";
return;
}
std::string line;
if (std::getline(file, line)) {
var = line;
}
file.close();
}
void writeToFile(std::string& timeString)
{
if( fileName_str.empty() ) {
cout << "File name is empty so returning from it.";
return;
}
std::ofstream file(fileName_str);
if( ! file ) {
cout << "Unable to open file:" << fileName_str << ", continuing WITHOUT using it.";
return;
}
file << timeString;
file.close();
}
But I need help to do the same for three different variables. Any suggestions on how to achieve this?
A good way to do this is to use std::getline, but more than once. Say you have three lines in your file. You could read those lines like this:
#include <vector>
void readFromFile(std::vector<std::string>& vector_of_lines)
{
std::fstream file(fileName_str, std::fstream::in | std::fstream::out |
std::fstream::app );
if (!file) {
cout << "Unable to open file:" << fileName_str << std::endl;
return;
}
std::string line;
while (std::getline(file, line)) {
vector_of_lines.push_back(line);
}
file.close();
}
And this should give you a vector filled with the lines in your file.
Then, if you want to store these retrieved values in variables, you would call the code like this:
std::vector<std::string> myvec;
readFromFile(myvec);
std::string str1 = myvec[0];
std::string str2 = myvec[1];
std::string str3 = myvec[2];
You could also choose to not even transfer myvec into other variables at all, and instead use myvec to store them for the time being. However, if you must store them elsewhere, then that is how you would do it.
I've been staring at this too long...
I have made a program that logs weather data from different sensors. It handles the data in a doubly linked list and saves it to a binary file. Different files are used to store different "compressions" of the data, e.g. no compression, hours, days, etc.
The main program first loads the content of the correct file (determined by the WeatherSeries constructor) and adds everything from the file to the linked list, then adds the new element and saves it all. The list is built from oldest to newest, and the file is saved the same way, so the newest record is the last one written to the file.
The error is that I seem to lose a couple of hours of recorded data. I have observed that the data did exist, i.e. I have seen data recorded between, say, 9 PM and 10 PM, and the next morning that data is gone.
The weird thing is the following:
The error occurs only intermittently, and only for the barometric sensor, which delivers values with six digits, compared to the humidity and temperature sensors, which deliver values with four digits.
It has only happened to the "no compression" series and never to any of the other compressions. This means the program that retrieves the data from the sensors works, and so does the function that adds data to the doubly linked list.
That leaves the functions that open the files and save the data to them.
Can you please see if you can find any errors in my code?
void weatherSeries::saveSeries()
{
ostringstream s;
s << "WLData/" << mSensorNbr << "_" << mSensorType << "_" << mTimeBase << ".dat";
ofstream file(s.str().c_str(), ios::out | ios::trunc | ios::binary);
if (!file)
{
file.clear();
file.open(s.str().c_str(), ios::out | ios::trunc | ios::binary);
}
if(file.is_open())
{
for (current = tail; current != NULL; current = current->prev)
{
file.write((char*)&current->time_stamp, sizeof(time_t));
file.write((char*)&current->val_avg, sizeof(double));
file.write((char*)&current->min, sizeof(double));
file.write((char*)&current->max, sizeof(double));
file.write((char*)&current->nbrOfValues, sizeof(unsigned long int));
}
}
else
{
cerr << "Unable to open for saving to " << mSensorNbr << "_" << mSensorType << "_" << mTimeBase << ".dat";
}
file.close();
}
void weatherSeries::openSeries()
{
deleteAll();
ostringstream s;
s << "WLData/" << mSensorNbr << "_" << mSensorType << "_" << mTimeBase << ".dat";
ifstream file(s.str().c_str(), ios::in | ios::binary);
if (!file)
{
file.clear();
file.open(s.str().c_str(), ios::in | ios::binary);
}
if(file.is_open())
{
time_t tmp_TS = 0;
double tmp_val_avg = 0;
double tmp_min = 0;
double tmp_max = 0;
unsigned long int tmp_nbrOfValues = 0;
while (file.read((char*)&tmp_TS, sizeof(time_t)))
{
file.read((char*)&tmp_val_avg, sizeof(double));
file.read((char*)&tmp_min, sizeof(double));
file.read((char*)&tmp_max, sizeof(double));
file.read((char*)&tmp_nbrOfValues, sizeof(unsigned long int));
addToSeries(tmp_TS, tmp_val_avg, tmp_min, tmp_max, tmp_nbrOfValues, true);
}
}
else
{
cerr << "Unable to open for opening from " << mSensorNbr << "_" << mSensorType << "_" << mTimeBase << ".dat";
}
file.close();
}
Note: deleteAll() clears the doubly linked list.
You were correct. The error was found in another part of the program once I started logging different things in the code.
Different mechanisms instantiate the program, and two of them happened to run at the same time, causing the file to be manipulated by two instances at once.
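Not part of the original code, but as a sketch of one way to keep two instances from touching the data files at the same time: hold a POSIX advisory lock (flock) around the open/save calls. The lock-file path and function names here are made up for illustration.

#include <sys/file.h>   // flock
#include <fcntl.h>      // open, O_CREAT, O_RDWR
#include <unistd.h>     // close

// Hypothetical guard: take an exclusive lock before openSeries()/saveSeries().
int lockSeriesFile()
{
    int fd = open("WLData/series.lock", O_CREAT | O_RDWR, 0644);
    if (fd != -1)
        flock(fd, LOCK_EX);   // blocks until no other instance holds the lock
    return fd;
}

void unlockSeriesFile(int fd)
{
    if (fd != -1) {
        flock(fd, LOCK_UN);
        close(fd);
    }
}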
Edit: changed my question to describe the situation more accurately.
I'm trying to open a text file (create it if it doesn't exist, open it if it does). The same file is used for both input and output.
ofstream oFile("goalsFile.txt");
fstream iFile("goalsFile.txt");
string goalsText;
string tempBuffer;
//int fileLength = 0;
bool empty = false;
if (oFile.is_open())
{
if (iFile.is_open())
{
iFile >> tempBuffer;
iFile.seekg(0, iFile.end);
size_t fileLength = iFile.tellg();
iFile.seekg(0, iFile.beg);
if (fileLength == 0)
{
cout << "Set a new goal\n" << "Goal Name:"; //if I end debugging her the file ends up being empty
getline(cin, goalSet);
oFile << goalSet;
oFile << ";";
cout << endl;
cout << "Goal Cost:";
getline(cin, tempBuffer);
goalCost = stoi(tempBuffer);
oFile << goalCost;
cout << endl;
}
}
}
A couple of issues. For one, if the file exists and has text in it, the code still enters the if branch that would normally ask me to set a new goal. I can't seem to figure out what's happening here.
The problem is simply that you are using buffered IO streams. Despite the fact that they reference the same file underneath, they have completely separate buffers.
// open the file for writing and erase existing contents.
std::ofstream out(filename);
// open the now empty file for reading.
std::ifstream in(filename);
// write to out's buffer
out << "hello";
At this point, "hello" may not have been written to disk, the only guarantee is that it's in the output buffer of out. To force it to be written to disk you could use
out << std::endl; // new line + flush
out << std::flush; // just a flush
That flush means we've committed our output to disk, but the input stream's buffer is still untouched at this point, so the file still appears to be empty to the reader.
In order for your input file to see what you've written to the output file, you'd need to use sync.
#include <iostream>
#include <fstream>
#include <string>
static const char* filename = "testfile.txt";
int main()
{
std::string hello;
{
std::ofstream out(filename);
std::ifstream in(filename);
out << "hello\n";
in >> hello;
std::cout << "unsync'd read got '" << hello << "'\n";
}
{
std::ofstream out(filename);
std::ifstream in(filename);
out << "hello\n";
out << std::flush;
in.sync();
in >> hello;
std::cout << "sync'd read got '" << hello << "'\n";
}
}
The next problem you'll run into trying to do this with buffered streams is the need to clear() the eof bit on the input stream every time more data is written to the file...
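Roughly, that part would look like this (a sketch only, reusing the stream and variable names from the example above):

in.clear();   // drop the eofbit/failbit left over from the previous read attempt
in.sync();    // pick up whatever has been flushed to the file since then
in >> hello;  // retry the read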
Try boost::filesystem::is_empty, which tests whether your file is empty. I read somewhere that using fstreams is not a good way to test for empty files.
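For example, a sketch assuming Boost.Filesystem is available and linked (the helper name is made up):

#include <boost/filesystem.hpp>

namespace fs = boost::filesystem;

// Treat a missing file the same as an empty one, since is_empty throws if the path does not exist.
bool goalsFileIsEmpty(const std::string &path)
{
    return !fs::exists(path) || fs::is_empty(path);
}

// Usage: if (goalsFileIsEmpty("goalsFile.txt")) { /* prompt for a new goal */ }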
I need to know whether you can easily get the number of data entries in another file and save that number in the original file. I need a program that will process the other file no matter how many entries are in it. Hope that makes sense.
Your question is very poorly worded, but I think you are looking for getline. This function can parse an input file based on the newline character (the default behaviour) or on a user-provided delimiter:
int entryCount = 0;
std::string currentLine;
std::ifstream inFile( "in.txt" );
std::ofstream outFile;
if (inFile) // true if the file opened successfully
{
while (std::getline( inFile, currentLine))
{
++entryCount;
// Do your processing
}
inFile.close();
outFile.open( "out.txt" );
outFile << "End of file. " << entryCount << " entries read." << std::endl;
outFile.close();
}
else
std::cout << "oops... error opening inFile" << std::endl;
I am quite new to C++ and am trying to work out how to write a record in the format of this structure below to a text file:
struct user {
int id;
char username [20];
char password [20];
char name [20];
char email [30];
int telephone;
char address [70];
int level;
};
So far I'm able to write to the file fine, but without an incremented id number, because I don't know how to work out the number of records. The file looks something like this after I've written the data:
1 Nick pass Nick email tele address 1
1 user pass name email tele address 1
1 test test test test test test 1
1 user pass Nick email tele addy 1
1 nbao pass Nick email tele 207 1
Using the following code:
ofstream outFile;
outFile.open("users.dat", ios::app);
// User input of data here
outFile << "\n" << 1 << " " << username << " " << password << " " << name << " "
<< email << " " << telephone << " " << address << " " << 1;
cout << "\nUser added successfully\n\n";
outFile.close();
So, how can I increment the id value for each record on insertion, and how can I then target a specific record in the file?
EDIT: I've got as far as being able to display each line:
if (inFile.is_open())
{
while(getline(inFile,line))
{
cout<<endl;
cout<<line<<endl;
}
inFile.close();
}
What you have so far is not bad, except that it cannot handle cases where there is a space in your strings (for example in the address!).
What you are trying to do is write a very basic database. You need three operations that should be implemented separately (although intertwining them may give better performance in certain cases, that's surely not your concern here).
Insert: You already have this implemented. The only thing you might want to change is the " " to "\n". That way every field of the struct is on its own line, and your problem with spaces is resolved. Later, when reading, you read line by line.
Search: To search, you open the file, read struct by struct (which itself consists of reading the lines corresponding to your struct fields), and identify the entities of interest. What to do with them is another issue, but the simplest approach is to return the list of matching entities in an array (or vector).
Delete: This is similar to search, except you have to rewrite the file. Again you read struct by struct and see which ones match your deletion criterion. You skip those that match and write the rest (as in the insert part) to another file. Afterwards, you replace the original file with the new file.
Here is a pseudo-code:
Write-entity(user &u, ofstream &fout)
fout << u.id << endl
<< u.username << endl
<< u.password << endl
<< ...
Read-entity(user &u, ifstream &fin)
char ignore_new_line
fin >> u.id >> ignore_new_line
fin.getline(u.username, 20);
fin.getline(u.password, 20);
...
if end of file
return fail
Insert(user &u)
ofstream fout("db.dat");
Write-entity(u, fout);
fout.close();
Search(char *username) /* for example */
ifstream fin("db.dat");
user u;
vector<user> results;
while (Read-entity(u))
if (strcmp(username, u.username) == 0)
results.push_back(u);
fin.close();
return results;
Delete(int level) /* for example */
ifstream fin("db.dat");
ofstream fout("db_temp.dat");
user u;
while (Read-entity(u))
if (level != u.level)
Write-entity(u, fout);
fin.close();
fout.close();
copy "db_temp.dat" to "db.dat"
Side note: it's a good idea to place the \n after the data has been written (so that your text file ends in a newline).
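If it helps, here is roughly what the Write-entity / Read-entity pair might look like in real C++ for the user struct from the question (a sketch only; the field widths match the declarations above):

#include <fstream>

void write_entity(const user &u, std::ofstream &fout)
{
    fout << u.id << '\n'
         << u.username << '\n'
         << u.password << '\n'
         << u.name << '\n'
         << u.email << '\n'
         << u.telephone << '\n'
         << u.address << '\n'
         << u.level << '\n';
}

bool read_entity(user &u, std::ifstream &fin)
{
    fin >> u.id;
    fin.ignore();                  // skip the newline after the id
    fin.getline(u.username, 20);
    fin.getline(u.password, 20);
    fin.getline(u.name, 20);
    fin.getline(u.email, 30);
    fin >> u.telephone;
    fin.ignore();
    fin.getline(u.address, 70);
    fin >> u.level;
    fin.ignore();
    return static_cast<bool>(fin); // false once the file is exhausted
}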
Using typical methods, you will at least need fixed-size records if you want random access when reading the file. Say you allow 5 characters for the name; it will be stored as
bob\0\0
or whatever else you use for padding. This way you can index with record number * record size.
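As a sketch of that indexing arithmetic, assuming every record is written as a fixed-size block of sizeof(user) bytes (the file name and helper are hypothetical):

#include <cstddef>
#include <fstream>

// Read record n directly; each record occupies exactly sizeof(user) bytes.
bool readRecord(std::size_t n, user &rec)
{
    std::ifstream in("users.dat", std::ios::binary);
    in.seekg(static_cast<std::streamoff>(n * sizeof(user)), std::ios::beg);
    in.read(reinterpret_cast<char*>(&rec), sizeof(user));
    return static_cast<bool>(in);
}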
To increment the index the way you are doing it, you will need to read the file to find the highest existing index and increment it. Or you can load the file into memory, append the new record, and write the file back:
std::vector<user> users=read_dat("file.dat");
user user_=get_from_input();
users.push_back(user_);
then write the file back
std::ofstream file("file.dat");
for(size_t i=0; i!=users.size(); ++i) {
file << users.at(i);
//you will need to implement a stream inserter (operator<<) for user to do this easily
}
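That inserter could be sketched like this for the question's user struct (one plausible formatting; the space-separated layout mirrors the original output):

#include <ostream>

std::ostream& operator<<(std::ostream &os, const user &u)
{
    // Note: a space-separated format breaks down if name or address themselves contain spaces.
    return os << u.id << ' ' << u.username << ' ' << u.password << ' '
              << u.name << ' ' << u.email << ' ' << u.telephone << ' '
              << u.address << ' ' << u.level << '\n';
}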
I suggest wrapping the file handling in a class and then overloading operator>> and operator<< for your struct; this way you control the input and output.
For instance
struct User{
...
};
typedef std::vector<User> UserConT;
struct MyDataFile
{
ofstream outFile;
UserConT User_container;
MyDataFile(std::string const&); //
MyDataFile& operator<< (User const& user); // Implement and/or process the record before to write
MyDataFile& operator>> (UserConT & user); // Implement the extraction/parse and insert into container
MyDataFile& operator<< (UserConT const & user); //Implement extraction/parse and insert into ofstream
};
MyDataFile& MyDataFile::operator<< (User const& user)
{
static unsigned myIdRecord=User_container.size();
myIdRecord++;
outFile << user.id+myIdRecord << ....;
return *this;
}
int main()
{
MydataFile file("data.dat");
UserConT myUser;
User a;
//... you could manage a single record
a.name="pepe";
...
file<<a;
..//
}
A .dat file is normally just a simple text file that can be opened with Notepad. So you can read the last line of the file, extract its first character, and convert it to an integer. Then increment the value and you are done.
Some sample code here :
#include <iostream>
#include <fstream>
#include <cstring>
using namespace std;
int main(int argc, char *argv[])
{
ifstream in("test.txt");
if(!in) {
cout << "Cannot open input file.\n";
return 1;
}
char line[255];
char last[255] = "";
while(in.getline(line, 255)) { // delim defaults to '\n'
strcpy(last, line); // remember the most recent complete line
}
// Now last contains the last line
if (last[0] >= '0' && last[0] <= '9')
{
int i = last[0] - '0';
i++;
// i contains the latest value, do your operation now
}
in.close();
return 0;
}
Assuming your file format doesn't need to be human readable, you can write the struct out to the file like this:
outFile.open("users.dat", ios::app | ios::binary);
user someValue = {};
outFile.write( (char*)&someValue, sizeof(user) );
int nIndex = 0;
user fetchValue = {};
ifstream inputFile("users.dat", ios::binary);
inputFile.seekg (0, ios::end);
int itemCount = inputFile.tellg() / sizeof(user);
inputFile.seekg (0, ios::beg);
if( nIndex > -1 && nIndex < itemCount){
inputFile.seekg ( sizeof(user) * nIndex , ios::beg);
inputFile.read( (char*)&fetchValue, sizeof(user) );
}
Is the code that writes to the file a member function of the user struct? Otherwise I see no connection between the output and the struct.
Possible things to do:
write the id member instead of 1
use a counter for id and increment it at each write (see the sketch below)
don't write the id and when reading use the line number as id
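For the counter option, a minimal sketch reusing the write statement from the question (hypothetical; nextId would have to be recovered from the file at startup, e.g. by counting existing lines):

static int nextId = 1; // hypothetical: reload this from the file when the program starts

outFile << "\n" << nextId++ << " " << username << " " << password << " " << name << " "
        << email << " " << telephone << " " << address << " " << 1;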