Storing the last value of a file from an SD card using Arduino - C++

I am making a device that moves back and forth and needs to store its last position, so that upon power-up the last stored value can be grabbed from the last line of the file on an SD card and the device can resume operation. This file will then be destroyed and re-written. For this particular application, homing and other methods cannot be used because the device must start in the spot where it last was. Because position is tracked via an encoder, there is no positional memory otherwise. The file is set up as a single data column separated by commas.
Currently I am successfully writing to the SD card as position changes, and reading the entire file to be printed on the Serial monitor. However, I really only need the last value. The length of the file will always be different due to system operation.
I have read a lot of different solutions but none of them seem to work for my application.
I can read the entire file using:
void read_file() {
  // open the file for reading:
  myFile = SD.open("test8.txt");
  if (myFile) {
    Serial.println("test8.txt:");
    // read from the file until there's nothing else in it:
    while (myFile.available()) {
      String a = "";
      for (int i = 0; i < 9; ++i)
      {
        int j;
        char temp = myFile.read();
        if (temp != ',' && temp != '\r')
        { //a=temp;
          a += temp;
        }
        else if (temp == ',' || temp == '\r') {
          j = a.toInt();
          // Serial.println(a);
          Serial.println(j);
          break;
        }
      }
    }
    // close the file:
    myFile.close();
  } else {
    // if the file didn't open, print an error:
    Serial.println("error opening test8.txt");
  }
}
This gives me a stream of the values separated by 0 like this:
20050
0
20071
0
20092
0
20113
0
20133
0
Ideally I just need 20133 to be grabbed and stored as an int.
I have also tried:
void read_file_3() {
  // open the file for reading:
  myFile = SD.open("test8.txt");
  if (myFile) {
    Serial.println("test8.txt:");
    // read from the file until there's nothing else in it:
    Serial.println(myFile.seek(myFile.size()));
    // close the file:
    myFile.close();
  } else {
    // if the file didn't open, print an error:
    Serial.println("error opening test.txt");
  }
}
This only returns "1", which does not make any sense to me.
Update:
I have found a sketch that does what I want, however it is very slow due to the use of string class. Per post #6 here: https://forum.arduino.cc/index.php?topic=379209.0
This does grab the last stored value, however it takes quite a while as the file gets bigger, and it may blow up memory.
How could this be done without the string class?
void read_file() {
  // open the file for reading:
  myFile = SD.open("test8.txt");
  if (myFile) {
    while (myFile.available())
    {
      String line_str = myFile.readStringUntil(','); // String value read from the stream, from comma to comma
      int line = line_str.toInt();
      if (line != 0) // checking for the last NON-zero value
      {
        line2 = line; // this really does the trick
      }
      // Serial.print(line2);
      // delay(100);
    }
    Serial.print("Last line = ");
    Serial.print(line2);
    // close the file:
    myFile.close();
    // SD.remove("test3.txt");
  } else {
    // if the file didn't open, print an error:
    Serial.println("error opening test.txt");
  }
}
Any help would be greatly appreciated!

seek returns true if it successfully goes to that position and false if it does not, for instance if the file isn't that big. It does not give you the value at that position. That's why you see a 1: seek is returning true because it was able to go to the position (myFile.size()), and that's what you're printing.
Beyond that, you don't want to go to the end of the file; that would be after your number. You want to go to a position 5 characters before the end of the file if your number is 5 digits long.
Either way, once you seek that position, then you still need to use read just like you did in your first code to actually read the number. seek doesn't do that, it just takes you to that position in the file.
EDIT: Since you edited the post, I'll edit the answer to go along. You're going backwards. You had it right the first time. Use the same read method you started with, just seek the end of the file before you start reading so you don't have to read all the way through. You almost had it. The only thing you did wrong the first time was printing what you got back from seek instead of seeking the right position and then reading the file.
That thing you looked up with the String class is going backward from where you were. Forget you ever saw that. It's doing the same thing you were already doing in the first place only it's also wasting a lot of memory and code space in the process.
Use your original code and just add a seek to skip to the end of the file.
This assumes that it's always a 5-digit number. If not, then you may need a little bit of tweaking:
void read_file() {
  // open the file for reading:
  myFile = SD.open("test8.txt");
  if (myFile) {
    Serial.println("test8.txt:");
    /// ADDED THIS ONE LINE TO SKIP MOST OF THE FILE************
    myFile.seek(myFile.size() - 5);
    // read from the file until there's nothing else in it:
    while (myFile.available()) {
      String a = "";
      for (int i = 0; i < 9; ++i)
      {
        int j;
        char temp = myFile.read();
        if (temp != ',' && temp != '\r')
        { //a=temp;
          a += temp;
        }
        else if (temp == ',' || temp == '\r') {
          j = a.toInt();
          // Serial.println(a);
          Serial.println(j);
          break;
        }
      }
    }
    // close the file:
    myFile.close();
  } else {
    // if the file didn't open, print an error:
    Serial.println("error opening test8.txt");
  }
}
See, all I've done is take your original function and add a line to it that seeks near the end.
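If the number of digits can vary, a variation on the same idea is to seek back a small fixed window from the end and keep the last complete value that can be parsed before end of file. This is only a sketch (untested; it assumes the same global myFile as above and that no value is longer than about 11 digits), so adjust the window to whatever maximum your values can reach:
long read_last_value() {
  long last = -1;                        // -1 signals "nothing found"
  myFile = SD.open("test8.txt");
  if (!myFile) {
    Serial.println("error opening test8.txt");
    return last;
  }
  unsigned long window = 12;             // larger than the longest possible value plus separators
  unsigned long start = (myFile.size() > window) ? myFile.size() - window : 0;
  myFile.seek(start);
  long value = 0;
  bool haveDigits = false;
  while (myFile.available()) {
    char c = myFile.read();
    if (c >= '0' && c <= '9') {          // build up the current number digit by digit
      value = value * 10 + (c - '0');
      haveDigits = true;
    } else {                             // separator (',' or CR/LF): remember the finished number
      if (haveDigits) last = value;
      value = 0;
      haveDigits = false;
    }
  }
  if (haveDigits) last = value;          // the last value ran right up to end of file
  myFile.close();
  return last;
}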

Related

seekg() seemingly skipping characters past intended position C++

I've been having an issue with parsing through a file and the use of seekg(). Whenever a certain character is reached in a file, I want to loop until a condition is met. The loop works fine for the first iteration but, when it loops back, the file seemingly skips a character and causes the loop to not behave as expected.
Specifically, the loop works fine if it is all contained in one line in the file, but fails when there is at least one newline within the loop in the file.
I should mention I am working on this on Windows, and I feel like the issue arises from how Windows ends lines with \r\n.
Using seekg(-2, std::ios::cur) after looping back fixes the issue when the beginning loop condition is immediately followed by a newline, but does not work for a loop contained in the same line.
The code is structured by having an Interpreter class hold the file pointer and relevant variables, such as the current line and column. This class also has a functional map defined like so:
// Define function type for command map
typedef void (Interpreter::*function)(void);

// Map for all the commands
std::map<char, function> command_map = {
    {'+', increment_cell},
    {'-', decrement_cell},
    {'>', increment_ptr},
    {'<', decrement_ptr},
    {'.', output},
    {',', input},
    {'[', begin_loop},
    {']', end_loop},
    {' ', next_col},
    {'\n', next_line}
};
It iterates through each character, deciding if it has functionality or not in the following function:
// Iterating through the file
void Interpreter::run() {
    char current_char;
    if (!this->file.eof() && this->file.good()) {
        while (this->file.get(current_char)) {
            // Make sure character is functional command (ie not a comment)
            if (this->command_map.find(current_char) != this->command_map.end()) {
                // Print the current command if in debug mode
                if (this->debug_mode && current_char != ' ' && current_char != '\n') {
                    std::cout << this->filename << ":" << this->line << ":"
                              << this->column << ": " << current_char << std::endl;
                }
                // Execute the command
                (this->*(command_map[current_char]))();
            }
            // If it is not a functional command, it is a comment. The rest of the line is ignored
            else {
                std::string temp_line = "";
                std::getline(file, temp_line);
                this->line++;
                this->column = 0;
            }
            this->temp_pos = file.tellg();
            this->column++;
        }
    }
    else {
        std::cout << "Unable to find file " << this->filename << "." << std::endl;
        exit(1);
    }
    file.close();
}
The beginning of the loop (signaled by a '[' char) sets the beginning loop position to this->temp_pos:
void Interpreter::begin_loop() {
    this->loop_begin_pointer = this->temp_pos;
    this->loop_begin_line = this->line;
    this->loop_begin_col = this->column;
    this->run();
}
When the end of the loop (signaled by a ']' char) is reached, if the condition for ending the loop is not met, the file cursor position is set back to the beginning of the loop:
void Interpreter::end_loop() {
    // If the cell's value is 0, we can end the loop
    if (this->char_array[this->char_ptr] == 0) {
        this->loop_begin_pointer = -1;
    }
    // Otherwise, go back to the beginning of the loop
    if (this->loop_begin_pointer > -1) {
        this->file.seekg(this->loop_begin_pointer, std::ios::beg);
        this->line = this->loop_begin_line;
        this->column = this->loop_begin_col;
    }
}
I was able to put in debugging information and can show stack traces for further clarity on the issue.
Stack trace with one line loop ( ++[->+<] ):
+ + [ - > + < ] [ - > + < ] done.
This works as intended.
Loop with multiple lines:
++[
-
>
+<]
Stack trace:
+ + [ - > + < ] > + < ] <- when it looped back, it "skipped" '[' and '-' characters.
This loops forever since the end condition is never met (ie the value of the first cell is never 0 since it never gets decremented).
Oddly enough, the following works:
++[
-
>+<]
It follows the same stack trace as the first example. This working and the last example not working is what has made this problem hard for me to solve.
Please let me know if more information is needed about how the program is supposed to work or its outputs. Sorry for the lengthy post, I just want to be as clear as possible.
Edit 1:
The class has the file object as std::ifstream file;.
In the constructor, it is opened with
this->file.open(filename), where filename is passed in as an argument.
For a file stream, seekg is ultimately defined in terms of fseek from the C standard library. The C standard has this to say:
7.21.9.2/4 For a text stream, either offset shall be zero, or offset shall be a value returned by an earlier successful call to the ftell function on a stream associated with the same file and whence shall be SEEK_SET.
So for a file opened in text mode, you can't do any arithmetic on offsets. You can rewind to the beginning, position at the end, or return to the position you were at previously and captured with tellg (which ultimately calls ftell). Anything else would exhibit undefined behavior.
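As a minimal sketch of the only portable pattern in text mode (the file name is hypothetical): bookmark positions with tellg() and only ever seek back to those bookmarks, never to a bookmark plus or minus a computed offset.
#include <fstream>
#include <string>

int main() {
    std::ifstream in("example.txt");
    std::string line;

    std::streampos mark = in.tellg();       // bookmark before reading
    std::getline(in, line);                 // read one line

    in.seekg(mark);                         // fine: returning to a position tellg() gave us
    // in.seekg(mark + std::streamoff(2));  // not portable in text mode: arithmetic on offsets
    std::getline(in, line);                 // re-reads the same line
    return 0;
}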

bool function doesn't work as a while condition

I am trying to check if a file has successfully opened, read from it, and output what I've read from it, all in one function, because I have 7 files to operate on in the same code and I want to avoid writing the same code over and over again.
So I have made a bool function and put it as a while condition.
If I succeed, the function returns true, and if I don't, it returns false. So a while(!function) should keep trying until it works, correct? And the answer is yes, it works as intended.
But if I change the condition of the while to while(function), one would expect it to repeat the function until it fails somehow (maybe it can't open the file). But it doesn't behave as expected. It only works correctly on the first while iteration.
This is the example:
#include <iostream>
#include <unistd.h>
#include <string.h>
#include <fstream>
#include <sstream>

bool readConfig(std::fstream& file, std::string (&str)[10], std::string identity) {
    if (file.is_open()) {
        if (file.seekg(0)) {
            std::cout << "Start from 0" << std::endl;
        }
        // Get content line by line of txt file
        int i = 0;
        while (getline(file, str[i++]));
        std::cout << "i= " << i << std::endl;
        for (int k = 0; k < i; k++) {
            std::cout << identity << " = " << str[k] << std::endl;
        }
        return true;
    } else {
        std::cout << "ERROR ! Could not open file." << std::endl;
        return false;
    }
}

int main() {
    char configFilePath[] = "test.txt";
    std::fstream configFile;
    configFile.open(configFilePath, std::fstream::in);
    std::string test[10];
    std::string id = "testing";
    while (!readConfig(configFile, test, id)) {
        usleep(1000 * 1000);
    };
    return 0;
}
This is the content of test.txt :
line 1
line 2
line 3
line 4
This is the output:
Start from 0
i= 5
testing = line 1
testing = line 2
testing = line 3
testing = line 4
testing =
i= 1
testing = line 1
i= 1
testing = line 1
and so on.
Why does it work on the first iteration but then stop at i=1? I am asking because I don't know if what I did is correct or not. while(!function) works, but maybe it won't work all the time; maybe my code is flawed.
Or maybe while(getline(configFile, string[i++])); is at fault here ?
This is the code I am trying to replace:
void readConfig(std::fstream& configFile, std::string (&str)[10], std::string identity) {
    if (configFile) {
        // Get content line by line of txt file
        int i = 0;
        while (getline(configFile, str[i++]));
        // for debug only
        if ((i - 1) == 0) {
            std::cout << identity << " = " << str[i - 1] << std::endl;
        } else {
            for (int k = 0; k < i - 1; k++) {
                std::cout << identity << " = " << str[k] << std::endl;
            }
        }
    } else {
        log("ERROR ! Could not get content from file.");
    }
}

int main() {
    file.open(file, std::fstream::in);
    if (file.is_open()) {
        std::cout << "Successfully opened URL Display Text file." << std::endl;
        std::string inputs[10];
        std::string id = "url_text";
        readConfig(file, inputs, id);
        file.close();
    } else {
        // Could not open file
        log("Error ! Could not open file.");
    }
}
I do this 7 times, instead of just calling a function 7 times, that does all of that.
But if I change the condition of the while to while(function) one would expect to repeat the function until it fails somehow (maybe it can't open the file.).
Your reasoning is off here. The function is not opening the file, so that is not something that can go wrong on the next iteration when it succeeded on the first.
What the function does is: it reads all the contents of the file, then returns true. On subsequent iterations there is nothing left to read, but the function still returns true.
You should check if the file is open only once, not in each iteration. If the function is supposed to read a single line then make it so; currently it reads everything.
Change the test from if (file.is_open()) to if (file). Failing to open the file is not the only way that a file stream can end up in a bad state. In particular, on the second call to this function, the stream is open, but it's in a failed state because the last read attempt failed.
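Here is a minimal sketch of that difference (hypothetical file name; the clear()/seekg() lines at the end are only needed if you actually want to re-read the same stream): is_open() only reports that the file was opened, while the stream's boolean conversion also reflects the failbit/eofbit left behind by a failed read.
#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::fstream file("test.txt", std::fstream::in);
    std::string line;
    while (std::getline(file, line)) {}           // read to the end; sets eofbit and failbit

    std::cout << file.is_open() << '\n';          // 1: the file is still open
    std::cout << static_cast<bool>(file) << '\n'; // 0: the stream is in a failed state

    file.clear();                                 // only clear() resets the error flags
    file.seekg(0);                                // then you can rewind and read again
    return 0;
}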
If you just want to read the file line by line, print the lines and store them, I'd do it like this.
Rather than using a C-style array, use a std::vector or std::array
Check if the file is open before you call the read function
#include <array>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

void readConfig(std::ifstream& configFile,
                std::vector<std::string>& lines,
                const unsigned int limit) {
    std::string line;
    while (std::getline(configFile, line)) {
        if (lines.size() >= limit) {
            break;
        }
        lines.push_back(line);
    }
}

int main() {
    const std::array<std::string, 3> fileNames = {"test1.txt",
                                                  "test2.txt",
                                                  "test3.txt"};
    // Iterate over all your files
    for (const auto& fileName : fileNames) {
        // Open the file
        std::ifstream configFile(fileName);
        if (!configFile.is_open()) {
            std::cout << "ERROR! Could not open file.\n";
            continue;
        }
        // Read the file
        std::vector<std::string> lines;
        constexpr unsigned int limit = 4;
        readConfig(configFile, lines, limit);
        if (configFile.is_open()) {
            configFile.close();
        }
        // Work with the file content
        std::cout << fileName << "\n";
        for (const auto& line : lines) {
            std::cout << "testing = " << line << "\n";
        }
        std::cout << "\n";
    }
    return 0;
}
Your output paints a fairly clear picture of what is going on. You have enough debugging output to identify what choices have been made. The key point I would focus on is the following sequence:
testing =
i= 1
The first of these lines is the fifth line read from your four-line file. Not surprisingly, there is nothing there. The next output line comes from the next invocation of readConfig, somewhere in the branch where file.is_open() is true. However, note that there is not a line saying "Start from 0" between these two. That means file converted to false after the call to file.seekg(0) (the value returned by that function is file, not directly a boolean). This indicates some sort of error state, and one should expect that error state to persist until cleared. And there is no attempt made to clear it.
The next bit of code is the getline loop. As with seekg, the getline function returns the stream (file) rather than a boolean. As expected, the error state has persisted, making the loop condition false, hence no iterations of the loop.
testing = line 1
The next line of output is ambiguous. It could indicate that the position was successfully changed to the start of the file and that the first line of input was successfully read. Or it could indicate that the call to getline returned before erasing the provided string, leaving the contents from the first call to readConfig. I'm thinking the latter, but you could check for yourself by manually erasing str[0] before the getline loop.
In general, reusing resources like this makes debugging harder because the results could be misleading. Debugging would be less confusing if str was a local variable instead of a parameter. Similarly for file: instead of a stream parameter, you could pass a string with the name of the file to open, as sketched below.
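A sketch of that refactoring (names are illustrative, not from the original code): the function owns both its stream and its storage, so no state can leak from one call into the next.
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

std::vector<std::string> readConfig(const std::string& filename, const std::string& identity) {
    std::vector<std::string> lines;
    std::ifstream file(filename);
    if (!file) {
        std::cout << "ERROR ! Could not open " << filename << "." << std::endl;
        return lines;                      // an empty vector signals failure
    }
    for (std::string line; std::getline(file, line); ) {
        std::cout << identity << " = " << line << std::endl;
        lines.push_back(line);
    }
    return lines;                          // the stream closes automatically here
}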

C++ file writer does not write characters that I give it

I'm currently working on a project for my Computer Organization class, and my professor decided to throw us headlong into a C++ project with no prior class time or experience besides bitwise operators and memory pointers. The goal of the project is to create a program that can compress or decompress files using Run Length Encoding, and we were given a framework of code to work with. I'm currently trying to write the Encode function, and this is what I have so far. Keep in mind, I have absolutely no prior experience in C or C++.
void compress( char* data, int count, FILE* outfile )
{
    // TODO: compress the data instead of just writing it out to the file
    char currentChar = data[0];
    int charCount;
    charCount = 0;
    for (int i = 0; i < count; ++i)
    {
        if (data[i] == currentChar)
        {
            charCount++;
        }
        else if (data[i] != currentChar)
        {
            if (charCount > 9)
            {
                while (charCount > 9)
                {
                    putc(currentChar, outfile); // write the current char to the file
                    putc(9, outfile);           // write 9 to the file
                    charCount -= 9;
                }
                putc( currentChar, outfile ); // write the current char to the file
                putc( charCount, outfile);    // write the number of currentChar to the file
            }
            else
            {
                putc( currentChar, outfile ); // write the current char to the file
                putc( charCount, outfile);    // write the number of currentChar to the file
            }
            // reset the currentChar and charCount variables
            currentChar = data[i];
            charCount = 1;
        }
    }
}
The output this code gives is as follows:
x x(unknown character)y(unknown character)
When it should be:
x9x1y4z3
What exactly am I doing wrong here? As far as my (extremely limited) knowledge goes, this should be correct. But again, I am completely new to C++ (my only other coding experience is in Python and Java).
EDIT: Ok, the numbers are writing correctly. The output is now: x9x1y4, which is almost correct. But now the compression code is still ignoring the three Z's I have at the end of the test file. I would run it through the debugging suite built into Eclipse, but for some reason it says the test file doesn't exist when I run it in debug mode.
Regarding the 3 z's not being counted: your code only outputs data while processing a character. After all characters have been processed by your for loop, you need to dump the currentChar and charCount again.
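Something like the following, added right after the for loop inside compress, would do it (a sketch only; it reuses your currentChar/charCount variables and the '0' + count conversion from your edited version):
    if (charCount > 0)
    {
        while (charCount > 9)
        {
            putc(currentChar, outfile);
            putc('0' + 9, outfile);      // write the digit character, not the raw value 9
            charCount -= 9;
        }
        putc(currentChar, outfile);
        putc('0' + charCount, outfile);  // flush the final run
    }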
If you want to write a number to the file you could do something like
putc (charCount + 48, fp)
This will only work for integers 0-9. For larger numbers you will need to get each digit and add 48 to it.
Though from looking at your code, charCount variable will always be 1. I think the last line of the code might be
charCount += 1
Another option is:
putc('0' + charCount, outfile);
or
putc('0' + 9, outfile);

how to discard from streams? .ignore() doesn't work for this purpose, any other methods?

I have a lack of understanding about streams. The idea is to read a file into an ifstream and then work with it: extract data from the stream into a string, and discard from the stream the part that is now in the string. Is that possible? Or how should problems like this be handled?
The following method reads a file that has already been opened with the ifstream. (It's a text file containing information about "Lost" episodes; it's an episode guide.) It works fine for one element of the Episode class. Every time I instantiate an Episode, I want to check the stream of that file, discard the information about one episode (its end is indicated by "****", then the next episode starts) and process the discarded information in a string. If I create a new Episode object, I want to discard the next block of episode information, from after "****" to the next "****", and so on.
void Episode::read(ifstream& in) {
    string contents((istreambuf_iterator<char>(in)), istreambuf_iterator<char>());
    size_t episodeEndPos = contents.find("****");
    if (episodeEndPos == -1) {
        in.ignore(numeric_limits<char>::max());
        in.clear(), in.sync();
        fullContent = contents;
    }
    else { // empty stream for next episode
        in.ignore(episodeEndPos + 4);
        fullContent = contents.substr(0, episodeEndPos);
    }
    // fill attributes
    setNrHelper();
    setTitelHelper();
    setFlashbackHelper();
    setDescriptionHelper();
}
I tried it with inFile >> words (to read the words; this is one way to get the words out of the stream). Another way I was thinking about is to use .ignore (to ignore a number of characters in the stream). But that doesn't work as intended. Sorry for my bad English; hopefully it's clear what I want to do.
If your goal is, at each call of Read(), to read the next episode and advance in the file, then the trick is to use tellg() and seekg() to bookmark the position and update it:
void Episode::Read(ifstream& in) {
    streampos pos = in.tellg(); // backup current position
    string fullContent;
    string contents((istreambuf_iterator<char>(in)), istreambuf_iterator<char>());
    size_t episodeEndPos = contents.find("****");
    if (episodeEndPos == -1) {
        in.ignore(numeric_limits<char>::max());
        in.clear(), in.sync();
        fullContent = contents;
    }
    else { // empty stream for next episode
        fullContent = contents.substr(0, episodeEndPos);
        in.seekg(pos + streamoff(episodeEndPos + 4)); // position file at next episode
    }
}
In this way, you can call your function several times, with every call reading the next episode.
However, please note that your approach is not optimised. When you construct your contents string from a stream iterator, you load the full rest of the file into memory, starting at the current position in the stream. So here you keep reading and re-reading big subparts of the file.
Edit: streamlined version adapted to your format
You just need to read the line, check if it's not a separator line and concatenate...
void Episode::Read(ifstream& in) {
    string line;
    string fullContent;
    while (getline(in, line) && line != "****") {
        fullContent += line + "\n";
    }
    cout << "DATENSATZ: " << fullContent << endl; // just to verify content
    // fill attributes
    //...
}
The code you got reads the entire stream in one go just to use some part of the read text to initialize an object. If you imagine a gigantic file, that is almost certainly a bad idea. The easier approach is to just read until the end marker is found. In an ideal world, the end marker is easily found. Based on comments it seems to be on a line of its own, which would make it quite easy:
void Episode::read(std::istream& in) {
    std::string text;
    for (std::string line; in >> line && line != "****"; ) {
        text += line + "\n";
    }
    fullContent = text;
}
If the separator isn't on a line of its own, you could use code like this instead:
void Episode::read(std::istream& in) {
    std::string text;
    for (std::istreambuf_iterator<char> it(in), end; it != end; ++it) {
        text.push_back(*it);
        if (*it == '*' && 4u <= text.size() && text.substr(text.size() - 4) == "****") {
            break;
        }
    }
    if (4u <= text.size() && text.substr(text.size() - 4u) == "****") {
        text.resize(text.size() - 4u);
    }
    fullContent = text;
}
Both of these approaches would simply read the file from start to end and consume the characters to be extracted in the process, stopping as soon as reading of one record is done.
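For completeness, a self-contained usage sketch (the Episode here is stubbed down to just the line-based read above, and the file name is assumed): each call to read() consumes exactly one record, so you can loop until the stream runs out of data.
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

struct Episode {
    std::string fullContent;
    void read(std::istream& in) {
        for (std::string line; std::getline(in, line) && line != "****"; )
            fullContent += line + "\n";
    }
};

int main() {
    std::ifstream in("episodes.txt");
    std::vector<Episode> episodes;
    while (in && in.peek() != std::char_traits<char>::eof()) {
        Episode e;
        e.read(in);                      // consumes one record up to the next "****"
        episodes.push_back(e);
        std::cout << "read a record of " << e.fullContent.size() << " characters\n";
    }
    return 0;
}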

Using seekg() in text mode

While trying to read in a simple ANSI-encoded text file in text mode (Windows), I came across some strange behaviour with seekg() and tellg(). Any time I tried to use tellg(), save its value (as a pos_type), and then seek to it later, I would always wind up further ahead in the stream than where I left off.
Eventually I did a sanity check; even if I just do this...
#include <fstream>
#include <string>

int main()
{
    std::ifstream dataFile("myfile.txt",
                           std::ifstream::in);
    if (dataFile.is_open() && !dataFile.fail())
    {
        while (dataFile.good())
        {
            std::string line;
            dataFile.seekg(dataFile.tellg());
            std::getline(dataFile, line);
        }
    }
}
...then eventually, further into the file, lines are half cut-off. Why exactly is this happening?
This issue is caused by libstdc++ using the difference between the current remaining buffer with lseek64 to determine the current offset.
The buffer is set using the return value of read, which for a text mode file on windows returns the number of bytes that have been put into the buffer after endline conversion (i.e. the 2 byte \r\n endline is converted to \n, windows also seems to append a spurious newline to the end of the file).
lseek64 however (which with mingw results in a call to _lseeki64) returns the current absolute file position, and once the two values are subtracted you end up with an offset that is off by 1 for each remaining newline in the text file (+1 for the extra newline).
The following code should display the issue, you can even use a file with a single character and no newlines due to the extra newline inserted by windows.
#include <iostream>
#include <fstream>

int main()
{
    std::ifstream f("myfile.txt");
    for (char c; f.get(c);)
        std::cout << f.tellg() << ' ';
}
For a file with a single 'a' character I get the following output
2 3
Clearly off by 1 for the first call to tellg. After the second call the file position is correct as the end has been reached after taking the extra newline into account.
Aside from opening the file in binary mode, you can circumvent the issue by disabling buffering
#include <iostream>
#include <fstream>

int main()
{
    std::ifstream f;
    f.rdbuf()->pubsetbuf(nullptr, 0);
    f.open("myfile.txt");
    for (char c; f.get(c);)
        std::cout << f.tellg() << ' ';
}
but this is far from ideal.
Hopefully mingw / mingw-w64 or gcc can fix this, but first we'll need to determine who would be responsible for fixing it. I suppose the base issue is with MS's implementation of lseek, which should return appropriate values according to how the file has been opened.
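For reference, a sketch of the other workaround mentioned above (binary mode): no endline conversion takes place, so tellg() values are plain byte offsets, and you strip the '\r' yourself after each getline. The file name is the same as in the earlier examples.
#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::ifstream f("myfile.txt", std::ios::binary);
    std::string line;
    while (true) {
        std::streampos pos = f.tellg();      // a byte offset you can safely seek back to later
        if (!std::getline(f, line)) break;
        if (!line.empty() && line.back() == '\r')
            line.pop_back();                 // remove the carriage return manually
        std::cout << pos << ": " << line << '\n';
    }
    return 0;
}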
Thanks for this, though it's a very old post. I was stuck on this problem for more than a week. Here are some code examples on my site (the menu versions 1 and 2). Version 1 uses the solution presented here, in case anyone wants to see it.
:)
void customerOrder::deleteOrder(char* argv[]) {
    std::fstream newinFile, newoutFile;
    newinFile.rdbuf()->pubsetbuf(nullptr, 0);
    newinFile.open(argv[1], std::ios_base::in);
    if (!(newinFile.is_open())) {
        throw "Could not open file to read customer order. ";
    }
    newoutFile.open("outfile.txt", std::ios_base::out);
    if (!(newoutFile.is_open())) {
        throw "Could not open file to write customer order. ";
    }
    newoutFile.seekp(0, std::ios::beg);
    std::string line;
    int skiplinesCount = 2;
    if (beginOffset != 0) {
        // write file from zero to beginOffset and from endOffset to eof if the record to delete is non-zero,
        // or write file from zero to beginOffset if the record to delete is non-zero and is the last record
        newinFile.seekg(0, std::ios::beg);
        // if primaryKey < largestKey, it's a middle record
        customerOrder order;
        long tempOffset(0);
        int largestKey = order.largestKey(argv);
        if (primaryKey < largestKey) {
            // stops right before "current..." next record.
            while (tempOffset < beginOffset) {
                std::getline(newinFile, line);
                newoutFile << line << std::endl;
                tempOffset = newinFile.tellg();
            }
            newinFile.seekg(endOffset);
            // skip two lines between records.
            for (int i = 0; i < skiplinesCount; ++i) {
                std::getline(newinFile, line);
            }
            while (std::getline(newinFile, line)) {
                newoutFile << line << std::endl;
            }
        } else if (primaryKey == largestKey) {
            // it's the last record.
            // write from zero to beginOffset.
            while ((tempOffset < beginOffset) && (std::getline(newinFile, line))) {
                newoutFile << line << std::endl;
                tempOffset = newinFile.tellg();
            }
        } else {
            throw "Error in delete key";
        }
    } else {
        // it's the first record.
        // write file from endOffset to eof
        // works with endOffset - 4 (but why??)
        newinFile.seekg(endOffset);
        // skip two lines between records.
        for (int i = 0; i < skiplinesCount; ++i) {
            std::getline(newinFile, line);
        }
        while (std::getline(newinFile, line)) {
            newoutFile << line << std::endl;
        }
    }
    newoutFile.close();
    newinFile.close();
}
beginOffset is a specific point in the file (the beginning of each record), and endOffset is the end of the record, calculated in another function (findFoodOrder) with tellg. I did not add that here as it may become very lengthy, but you can find it on my site (under the menu version 1 link):
http://www.buildincode.com