Arduino read last line from SD Card - c++

I am pretty new to the Arduino business. How do I read the last line from an SD card? With the following code snippet I can read the first line (all characters before '\n'). Now I would like to include a "backwards" statement (or something like it).
My code so far:
#include <SD.h>
#include <SPI.h>

File SD_File;
int pinCS = 10;
char cr;

void setup() {
  Serial.begin(9600);
  SD.begin();
  SD_File = SD.open("test.txt", FILE_WRITE);
  SD_File.println("hello");
  SD_File.close();
  SD_File = SD.open("test.txt");
  while (true) {
    cr = SD_File.read();
    if ((cr == '\n') && ("LAST LINE?"))
      break;
    Serial.print(cr);
  }
  SD_File.close();
}

void loop() {
}
Your help is much appreciated.

Since you are technically opening text files, you could use seekg to jump to the end of the file and read the last line, as described in this answer.
If this is not helpful, adding a bit more context and an example file would help us understand your question better.
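For illustration, one way that idea could look with the SD library's File class (seek(), size(), peek(), read(), available()): scan backwards from the end to the start of the last line, then print it forward. This is an untested sketch of the general approach, not the code from the linked answer:
// Untested sketch: find the start of the last line by scanning backwards,
// then read the line forward. Assumes the file is not empty.
File f = SD.open("test.txt");
unsigned long pos = f.size();
if (pos > 0) {                 // ignore a trailing '\n', if any
  f.seek(pos - 1);
  if (f.peek() == '\n') pos--;
}
while (pos > 0) {              // walk back until the previous '\n' or the file start
  f.seek(pos - 1);
  if (f.peek() == '\n') break;
  pos--;
}
f.seek(pos);                   // pos now indexes the first character of the last line
while (f.available()) {
  char c = f.read();
  if (c == '\n') break;
  Serial.print(c);
}
f.close();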

I am not sure I understood your question.
"How do I implement seekg?" There is no seekg in the SD library. There is, however, a seek.
This is the documentation page for the SD library. On the right side of the page there is a list of all File class methods (seek among others).
"How do I read the last line..." There is no line reading in your code. If you just want to go to the end of the file, use: SD_File.seek( SD_File.size() ); If you want to read the last line, the simplest way is to write a getline function and read the whole file line by line until the end. Assuming MAX_LINE is large enough and getline returns zero on success:
//...
char s[ MAX_LINE ];

while ( getline( f, s, MAX_LINE, '\n' ) == 0 )
    ;
// when reaching this point, s contains the last line
Serial.print( "This is the last line: " );
Serial.print( s );
Here's a getline idea (no warranty - not tested):
/*
    s     - destination
    count - maximum number of characters to write to s, including the null
            terminator. If the limit is reached, it returns -2.
    delim - delimiting character ('\n' in your case)
    returns:
         0 - no error
        -1 - eof reached
        -2 - full buffer
*/
int getline( File& f, char* s, int count, char delim )
{
    int ccount = 0;
    int result = 0;
    if ( 0 < count )
        while ( 1 )
        {
            int c = f.peek();   // int, so that -1 (eof) stays distinguishable from a data byte
            if ( c == -1 )
            {
                f.read();       // extract
                result = -1;
                break;          // eof reached
            }
            else if ( c == delim )
            {
                f.read();       // extract
                ++ccount;
                break;          // eol reached
            }
            else if ( --count <= 0 )
            {
                result = -2;
                break;          // end of buffer reached
            }
            else
            {
                f.read();       // extract
                *s++ = (char) c;
                ++ccount;
            }
        }
    *s = '\0';                  // end of string
    return ccount == 0 ? -1 : result;
}

Related

Alternating between reading and writing repeatedly

My objective is to read a file line by line, check if that line contains some number, and if so rewrite that line. Then continue reading the file.
I've successfully been able to do this for one line, but I can't figure out how to continue reading the rest of the file.
Here's how I replace one line (every line is a known fixed size):
while(getline(fs, line)){
    if(condition){
        pos = fs.tellg();         //gets current read position (end of the line I want to change)
        pos -= line.length()+1;   //position of the beginning of the line
        fs.clear();               //switch to write mode
        fs.seekp(pos);            //seek to beginning of line
        fs << new_data;           //overwrite old data with new data (also fixed size)
        fs.close();               //Done.
        continue;
    }
}
How do I switch back to read and continue the getline loop?
I had the same problem: TB-scale files where I wanted to modify some header information at the beginning of the file.
Obviously one has to leave enough room for any new content when the file is initially created, because there is no way to increase the file size (besides appending to it), and the new line has to have exactly the same length as the original one.
Here is a simplification of my code:
#include <iostream>
#include <fstream>
using namespace std;

bool CreateDummy()
{
  ofstream out;
  out.open("Dummy.txt");
  // skip: test if open
  out << "Some Header" << endl;
  out << "REPLACE1 12345678901234567890" << endl;
  out << "REPLACE2 12345678901234567890" << endl;
  out << "Now ~1 TB of data follows..." << endl;
  out.close();
  return true;
}

int main()
{
  CreateDummy(); // skip: test if successful

  fstream inout;
  inout.open("Dummy.txt", ios::in | ios::out);
  // skip: test if open

  bool FoundFirst = false;
  string FirstText = "REPLACE1";
  string FirstReplacement = "Replaced first!!!";

  bool FoundSecond = false;
  string SecondText = "REPLACE2";
  string SecondReplacement = "Replaced second!!!";

  string Line;
  size_t LastPos = inout.tellg();
  while (getline(inout, Line)) {
    if (FoundFirst == false && Line.compare(0, FirstText.size(), FirstText) == 0) {
      // skip: check if Line.size() >= FirstReplacement.size()
      while (FirstReplacement.size() < Line.size()) FirstReplacement += " ";
      FirstReplacement += '\n';
      inout.seekp(LastPos);
      inout.write(FirstReplacement.c_str(), FirstReplacement.size());
      FoundFirst = true;
    } else if (FoundSecond == false && Line.compare(0, SecondText.size(), SecondText) == 0) {
      // skip: check if Line.size() >= SecondReplacement.size()
      while (SecondReplacement.size() < Line.size()) SecondReplacement += " ";
      SecondReplacement += '\n';
      inout.seekp(LastPos);
      inout.write(SecondReplacement.c_str(), SecondReplacement.size());
      FoundSecond = true;
    }
    if (FoundFirst == true && FoundSecond == true) break;
    LastPos = inout.tellg();
  }
  inout.close();

  return 0;
}
The input is
Some Header
REPLACE1 12345678901234567890
REPLACE2 12345678901234567890
Now ~1 TB of data follows...
The output is:
Some Header
Replaced first!!!
Replaced second!!!
Now ~1 TB of data follows...

Segmentation Fault searching for End of Line

I'm writing code that counts the amount of lines and characters of a file.
#include <fstream>
#include <iostream>
#include <stdlib.h>
using namespace std;

int main(int argc, char* argv[])
{
    ifstream read(argv[1]);
    char line[256];
    int nLines=0, nChars=0, nTotalChars=0;
    read.getline(line, 256);
    while(read.good())
    {
        nChars=0;
        int i=0;
        while(line[i]!='\n')
        {
            if ((int)line[i]>32) {nChars++;}
            i++;
        }
        nLines++;
        nTotalChars = nTotalChars + nChars;
        read.getline(line, 256);
    }
    cout << "The number of lines is " << nLines << endl;
    cout << "The number of characters is " << nTotalChars << endl;
}
The line while(line[i]!='\n') seems to be the cause of the following error
Segmentation fault (core dumped)
I can't figure out what's wrong. The internet tells me that I'm checking for the end of a line correctly as far as I can tell.
Your code will not find '\n' because it is discarded from the input sequence. From the documentation of getline:
The delimiting character is the newline character [...]: when found in the input sequence, it is extracted from the input sequence, but discarded and not written to s.
You should be searching for '\0':
while(line[i])
{
    if ((int)line[i]>32) {nChars++;}
    i++;
}
Because getline does not store the '\n', the loop:
while(line[i]!='\n')
{
    if ((int)line[i]>32) {nChars++;}
    i++;
}
never finds a '\n' and keeps going until line[i] runs past the end of the array, which causes the segmentation fault.
You do not have an end-of-line character in the line, so you should be checking for the NUL character (end of string) instead of the end of line. Also make sure that you do not go past the size of your buffer (256 in your case).
I think a for loop would be safer:
for ( unsigned int i = 0; i < sizeof(line) && line[i] != '\0'; i++ ) {
    //whatever
}
There are several problems with your code, but for starters, you shouldn't be reading lines into a char[]. If you use std::string, then you don't have to worry about reading partial lines, etc.
Then there's the fact that getline extracts the '\n' from the file, but does not store it, so your code (even modified to use std::string) will never see a '\n' in the buffer. If you're using string, you iterate from line.begin() to line.end(); if you're using a char[], you iterate over the number of bytes returned by read.gcount(), called after the call to getline. (It's very difficult to get this code right using a char[] unless you assume that no text file in the world contains a '\0'.)
Finally, if the last line doesn't end with a '\n' (a frequent case under Windows), you won't process it. If you're using std::string, you can simply write:
std::getline( read, line );
while ( read ) {
    // ...
    std::getline( read, line );
}
or even:
while ( std::getline( read, line ) ) {
    ++ nLines;
    for ( std::string::const_iterator current = line.begin();
          current != line.end();
          ++ current ) {
        // process character *current in line...
    }
}
(The latter is ubiquitous, even if it is ugly.)
With char[], you have to replace this with:
while ( read.getline( buffer, sizeof(buffer) ) || read.gcount() != 0 ) {
    int l = read.gcount();
    if ( read ) {
        ++ nLines;
    } else {
        if ( read.eof() ) {
            ++ nLines;      // Last line did not end with a '\n'
        } else {
            read.clear();   // Line longer than buffer...
        }
    }
    for ( int i = 0; i != l; ++ i ) {
        // process character buffer[i] in line...
    }
}
One final question: what is (int)line[i] > 32 supposed to mean? Did you want !isspace( line[i] ) && !iscntrl( line[i] )? (That's not at all what it does, of course.)
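For reference, a check along those lines would look roughly like the sketch below; the cast to unsigned char matters because the <cctype> functions require a non-negative argument. This is only an illustration of the suggested test, not the original poster's code.
#include <cctype>

// Count a character only if it is neither whitespace nor a control character.
bool isCountableChar( char ch )
{
    unsigned char c = static_cast<unsigned char>( ch );
    return !std::isspace( c ) && !std::iscntrl( c );
}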

reading last n lines from file in c/c++

I have seen many posts but didn't find anything like what I want.
I am getting wrong output:
ÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿ...... // maybe this is the EOF character
and the program goes into an infinite loop.
My algorithm:
1. Go to the end of the file.
2. Decrease the position of the pointer by 1 and read character by character.
3. Exit if we have found our 10 lines or we reach the beginning of the file.
4. Now I will scan the full file till EOF and print them (not implemented in the code).
code:
#include<iostream>
#include<stdio.h>
#include<conio.h>
#include<stdlib.h>
#include<string.h>
using namespace std;

int main()
{
    FILE *f1=fopen("input.txt","r");
    FILE *f2=fopen("output.txt","w");
    int i,j,pos;
    int count=0;
    char ch;
    int begin=ftell(f1);
    // GO TO END OF FILE
    fseek(f1,0,SEEK_END);
    int end = ftell(f1);
    pos=ftell(f1);
    while(count<10)
    {
        pos=ftell(f1);
        // FILE IS LESS THAN 10 LINES
        if(pos<begin)
            break;
        ch=fgetc(f1);
        if(ch=='\n')
            count++;
        fputc(ch,f2);
        fseek(f1,pos-1,end);
    }
    return 0;
}
UPD 1:
Changed code: it has just one error now - if the input has lines like
3enil
2enil
1enil
it prints only these 10 lines:
line1
line2
line3ÿine1
line2
line3ÿine1
line2
line3ÿine1
line2
line3ÿine1
line2
PS:
1. I am working on Windows, in Notepad++.
2. This is not homework.
3. I also want to do it without using any more memory or the STL.
4. I am practicing to improve my basic knowledge, so please don't just point to ready-made tools (like tail -5 etc.).
Please help me improve my code.
Comments are in the code:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    FILE *in, *out;
    int count = 0;
    long int pos;
    char s[100];

    in = fopen("input.txt", "r");
    /* always check return of fopen */
    if (in == NULL) {
        perror("fopen");
        exit(EXIT_FAILURE);
    }
    out = fopen("output.txt", "w");
    if (out == NULL) {
        perror("fopen");
        exit(EXIT_FAILURE);
    }
    fseek(in, 0, SEEK_END);
    pos = ftell(in);
    /* Don't write each char on output.txt, just search for '\n' */
    while (pos) {
        fseek(in, --pos, SEEK_SET);  /* seek from begin */
        if (fgetc(in) == '\n') {
            if (count++ == 10) break;
        }
    }
    /* Write line by line, is faster than fputc for each char */
    while (fgets(s, sizeof(s), in) != NULL) {
        fprintf(out, "%s", s);
    }
    fclose(in);
    fclose(out);
    return 0;
}
There are a number of problems with your code. The most important one is that you never check that any of the functions succeeded. And saving the result of ftell in an int isn't a very good idea either. Then there's the test pos < begin; this can only occur if there was an error. And the fact that you're putting the results of fgetc in a char (which results in a loss of information). And the fact that the first read you do is at the end of file, so it will fail (and once a stream enters an error state, it stays there). And the fact that you can't reliably do arithmetic on the values returned by ftell (except under Unix) if the file was opened in text mode.
Oh, and there is no "EOF character"; 'ÿ' is a perfectly valid character (0xFF in Latin-1). Once you assign the return value of fgetc to a char, you've lost any possibility to test for end of file.
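For illustration, the usual pattern keeps the result of fgetc in an int and only narrows it after the EOF test; a minimal sketch, assuming f1 is the FILE* from the question:
int c;
while ((c = fgetc(f1)) != EOF) {
    char ch = (char) c;   /* safe to narrow only after comparing against EOF */
    /* ... use ch ... */
}
if (ferror(f1)) {
    /* a read error occurred, as opposed to a normal end of file */
}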
I might add that reading backwards one character at a time is extremely inefficient. The usual solution would be to allocate a sufficiently large buffer, then count the '\n' in it.
EDIT:
Just a quick bit of code to give the idea:
std::string
getLastLines( std::string const& filename, int lineCount )
{
    size_t const granularity = 100 * lineCount;
    std::ifstream source( filename.c_str(), std::ios_base::binary );
    source.seekg( 0, std::ios_base::end );
    size_t size = static_cast<size_t>( source.tellg() );
    std::vector<char> buffer;
    int newlineCount = 0;
    while ( source
            && buffer.size() != size
            && newlineCount < lineCount ) {
        buffer.resize( std::min( buffer.size() + granularity, size ) );
        source.seekg( -static_cast<std::streamoff>( buffer.size() ),
                      std::ios_base::end );
        source.read( buffer.data(), buffer.size() );
        newlineCount = std::count( buffer.begin(), buffer.end(), '\n' );
    }
    std::vector<char>::iterator start = buffer.begin();
    while ( newlineCount > lineCount ) {
        start = std::find( start, buffer.end(), '\n' ) + 1;
        -- newlineCount;
    }
    std::vector<char>::iterator end = std::remove( start, buffer.end(), '\r' );
    return std::string( start, end );
}
This is a bit weak in the error handling; in particular, you probably want to distinguish between the inability to open a file and any other errors. (No other errors should occur, but you never know.)
Also, this is purely Windows, and it supposes that the actual file contains pure text and doesn't contain any '\r' that aren't part of a CRLF. (For Unix, just drop the next-to-last line, the std::remove of '\r'.)
This can be done very efficiently using a circular array.
No additional buffer is required.
void printlast_n_lines(char* fileName, int n){

    const int k = n;
    ifstream file(fileName);
    string l[k];
    int size = 0;

    while (getline(file, l[size % k])) {     // the array is used as a circular buffer
        cout << l[size % k] << '\n';
        size++;
    }

    // start of the circular array and the number of lines it holds
    int start = size > k ? (size % k) : 0;   // index of the oldest of the last k lines
    int count = min(k, size);                // number of lines to print

    for (int i = 0; i < count; i++) {
        // start from in between and wrap around until all count lines are covered
        cout << l[(start + i) % k] << '\n';
    }
}
Please provide feedback.
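For completeness, a minimal way to call it, assuming the usual includes and using namespace std as in the snippet above (the file name is a placeholder; note the parameter is a non-const char*, so a string literal is not passed directly):
#include <iostream>
#include <fstream>
#include <string>
#include <algorithm>
using namespace std;

// printlast_n_lines as defined above...

int main() {
    char name[] = "input.txt";    // placeholder input file
    printlast_n_lines(name, 10);  // print the last 10 lines
    return 0;
}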
int end = ftell(f1);
pos=ftell(f1);
This tells you the last position in the file, i.e. EOF.
When you read there, you get the EOF error, and the pointer wants to move one position forward...
So I recommend decreasing the current position by one.
Or put fseek(f1, -2, SEEK_CUR) at the beginning of the while loop to make up for the fgetc by one position and go one position back...
I believe you are using fseek wrong. Check man fseek.
Try this:
fseek(f1, -2, SEEK_CUR);
// 1 to neutralize the change from fgetc
// and 1 to move backward
Also, you should set the position at the beginning to the last element:
fseek(f1, -1, SEEK_END).
You don't need the end variable.
You should check the return values of all functions (fgetc, fseek and ftell). It is good practice. I don't know if this code will work with empty files or similar cases.
Use fseek(f1, -2, SEEK_CUR); to go back.
I wrote this code; it works, you can try it:
#include "stdio.h"
int main()
{
int count = 0;
char * fileName = "count.c";
char * outFileName = "out11.txt";
FILE * fpIn;
FILE * fpOut;
if((fpIn = fopen(fileName,"r")) == NULL )
printf(" file %s open error\n",fileName);
if((fpOut = fopen(outFileName,"w")) == NULL )
printf(" file %s open error\n",outFileName);
fseek(fpIn,0,SEEK_END);
while(count < 10)
{
fseek(fpIn,-2,SEEK_CUR);
if(ftell(fpIn)<0L)
break;
char now = fgetc(fpIn);
printf("%c",now);
fputc(now,fpOut);
if(now == '\n')
++count;
}
fclose(fpIn);
fclose(fpOut);
}
I would use two streams to print the last n lines of the file:
This runs in O(lines) runtime and O(lines) space.
#include<bits/stdc++.h>
using namespace std;

int main(){
    // read last n lines of a file
    ifstream f("file.in");
    ifstream g("file.in");

    // move f stream n lines down.
    int n;
    cin >> n;
    string line;
    for(int i=0; i<n; ++i) getline(f,line);

    // move f and g stream at the same pace.
    for(; getline(f,line); ){
        getline(g, line);
    }

    // g now has to go through the last n lines.
    for(; getline(g,line); )
        cout << line << endl;
}
A solution with O(lines) runtime and O(N) space uses a queue:
ifstream fin("file.in");
int k;
cin >> k;
queue<string> Q;
string line;
for(; getline(fin, line); ){
    if(Q.size() == k){
        Q.pop();
    }
    Q.push(line);
}
while(!Q.empty()){
    cout << Q.front() << endl;
    Q.pop();
}
Here is the solution in C++.
#include <iostream>
#include <string>
#include <exception>
#include <cstdlib>

int main(int argc, char *argv[])
{
    auto& file = std::cin;
    int n = 5;
    if (argc > 1) {
        try {
            n = std::stoi(argv[1]);
        } catch (std::exception& e) {
            std::cout << "Error: argument must be an int" << std::endl;
            std::exit(EXIT_FAILURE);
        }
    }
    file.seekg(0, file.end);
    n = n + 1;                    // Add one so the loop stops at the newline above
    while (file.tellg() != 0 && n) {
        file.seekg(-1, file.cur);
        if (file.peek() == '\n')
            n--;
    }
    if (file.peek() == '\n')      // If we stop in the middle we will be at a newline
        file.seekg(1, file.cur);
    std::string line;
    while (std::getline(file, line))
        std::cout << line << std::endl;
    std::exit(EXIT_SUCCESS);
}
Build:
$ g++ <SOURCE_NAME> -o last_n_lines
Run:
$ ./last_n_lines 10 < <SOME_FILE>

c++ Reading numbers from text files, ignoring comments

So I've seen lots of solutions on this site and tutorials about reading in from a text file in C++, but have yet to figure out a solution to my problem. I'm new at C++ so I think I'm having trouble piecing together some of the documentation to make sense of it all.
What I am trying to do is read numbers from a text file while ignoring comments in the file, which are denoted by "#". So an example file would look like:
#here is my comment
20 30 40 50
#this is my last comment
60 70 80 90
My code can read numbers fine when there aren't any comments, but I don't understand parsing the stream well enough to ignore the comments. It's kind of a hack solution right now.
/////////////////////// Read the file ///////////////////////
std::string line;
if (input_file.is_open())
{
    //While we can still read the file
    while (std::getline(input_file, line))
    {
        std::istringstream iss(line);
        float num; // The number in the line

        //while the iss is a number
        while ((iss >> num))
        {
            //look at the number
        }
    }
}
else
{
    std::cout << "Unable to open file";
}
/////////////////////// done reading file /////////////////
Is there a way I can incorporate comment handling with this solution or do I need a different approach? Any advice would be great, thanks.
If your file always has the # in the first column, then just test whether the line starts with #, like this:
while (std::getline(input_file, line))
{
    if (line[0] != '#')
    {
        std::istringstream iss(line);
        float num; // The number in the line

        //while the iss is a number
        while ((iss >> num))
        {
            //look at the number
        }
    }
}
It is wise, though, to trim the line of leading and trailing whitespace, as shown, for example, in: Remove spaces from std::string in C++
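A minimal trim helper along those lines, as a sketch (the linked question covers more variants); after trimming, the comment test becomes trimmed.empty() || trimmed[0] == '#':
#include <string>

// Return a copy of s without leading and trailing whitespace.
std::string trim( const std::string& s )
{
    const std::string whitespace = " \t\r\n";
    const std::size_t first = s.find_first_not_of( whitespace );
    if ( first == std::string::npos )
        return "";                                   // the line is all whitespace
    const std::size_t last = s.find_last_not_of( whitespace );
    return s.substr( first, last - first + 1 );
}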
If this is just a one-off use, for line-oriented input like yours, the simplest solution is just to strip the comment from the line you just read:
line.erase( std::find( line.begin(), line.end(), '#' ), line.end() );
A more generic solution would be to use a filtering streambuf, something like:
class FilterCommentsStreambuf : public std::streambuf
{
    std::istream& myOwner;
    std::streambuf* mySource;
    char myCommentChar;
    char myBuffer;

protected:
    int underflow()
    {
        int const eof = traits_type::eof();
        int results = mySource->sbumpc();
        if ( results == myCommentChar ) {
            while ( results != eof && results != '\n' ) {
                results = mySource->sbumpc();
            }
        }
        if ( results != eof ) {
            myBuffer = results;
            setg( &myBuffer, &myBuffer, &myBuffer + 1 );
        }
        return results;
    }

public:
    FilterCommentsStreambuf( std::istream& source,
                             char comment = '#' )
        : myOwner( source )
        , mySource( source.rdbuf() )
        , myCommentChar( comment )
    {
        myOwner.rdbuf( this );
    }

    ~FilterCommentsStreambuf()
    {
        myOwner.rdbuf( mySource );
    }
};
In this case, you could even forgo getline:
FilterCommentsStreambuf filter( input_file );
double num;
while ( input_file >> num || !input_file.eof() ) {
    if ( ! input_file ) {
        // Formatting error: output error message, clear the
        // error, and resynchronize the input---probably by
        // ignore'ing until end of line.
    } else {
        // Do something with the number...
    }
}
(In such cases, I've found it useful to also track the line number in the FilterCommentsStreambuf. That way you have it for error messages.)
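One way that could look, as my own sketch rather than the original answer's code: count '\n' as characters pass through underflow, and expose the count for error messages.
#include <streambuf>
#include <istream>

// Variation on the filtering streambuf above, with a line counter added.
class CountingFilterStreambuf : public std::streambuf
{
    std::istream&   myOwner;
    std::streambuf* mySource;
    char            myCommentChar;
    char            myBuffer;
    int             myLineNumber;

protected:
    int underflow()
    {
        int const eof = traits_type::eof();
        int results = mySource->sbumpc();
        if ( results == myCommentChar ) {
            while ( results != eof && results != '\n' ) {
                results = mySource->sbumpc();
            }
        }
        if ( results == '\n' ) {
            ++ myLineNumber;            // one more complete line has gone by
        }
        if ( results != eof ) {
            myBuffer = results;
            setg( &myBuffer, &myBuffer, &myBuffer + 1 );
        }
        return results;
    }

public:
    CountingFilterStreambuf( std::istream& source, char comment = '#' )
        : myOwner( source )
        , mySource( source.rdbuf() )
        , myCommentChar( comment )
        , myLineNumber( 1 )
    {
        myOwner.rdbuf( this );
    }

    ~CountingFilterStreambuf()
    {
        myOwner.rdbuf( mySource );
    }

    int lineNumber() const { return myLineNumber; }  // current line, for error messages
};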
An alternative to the "read a line and parse it as a string" approach can be to use the stream itself as the incoming buffer:
while(input_file)
{
    int n = 0;
    char c;
    input_file >> c; // will skip spaces and read the first non-blank
    if(c == '#')
    {
        while(c != '\n' && input_file) input_file.get(c);
        continue; // maybe not so beautiful, but does not introduce useless dynamic memory
    }
    // c is part of something other than a comment, so give it back to parse it as a number
    input_file.unget(); //< this is what all the fuss is about!
    if(input_file >> n)
    {
        // look at the number
        continue;
    }
    // something else, but not an integer, is there ....
    // if you cannot recover, the loop will exit
}

Reading text file per line in C++, with unknown line length

I have a text file, that is formatted somewhat like this:
1 3 4 5 6
6 7 8
4 12 16 17 18 19 20
20
0
A line can contain 1 to 10000 integers. What I need to do, is read all of them line by line.
Pseudocode like this:
line=0;
i=0;
while(!file.eof()){
    while(!endLine){
        array[0][i++]=file.readChar();
    }
    line++; i=0;
}
So, I have an array into which I would like to read every line, and each line would consist of each of these integers.
The problem I'm having is how to check whether the end of a line has been reached.
Note, I can't use strings.
Yes, this is for homework, but the main task of the assignment is to build a tree and then transform it. I can do that, but I have no idea how to read the integers from the file.
Probably something like this:
After reading an int, I manually skip spaces, tabs, carriage returns and end-of-line characters (for this one you'll have to implement your own logic).
To read an int I read it directly using the C++ functions of ifstream. I don't read it character by character and then recompose it as a string :-)
Note that I skip '\r' as "spaces". The end of line for me is '\n'.
#include <iostream>
#include <fstream>
#include <vector>

int main()
{
    std::ifstream file("example.txt");
    std::vector<std::vector<int>> ints;
    bool insertNewLine = true;
    int oneInt;

    //The good() here is used to check the status of
    //the opening of file and for the failures of
    //peek() and read() (used later to skip characters).
    while (file.good() && file >> oneInt)
    {
        if (insertNewLine)
        {
            std::vector<int> vc;
            ints.push_back(vc);
            //With C++11 you can do this instead of the push_back
            //ints.emplace_back(std::vector<int>());
            insertNewLine = false;
        }
        ints.back().push_back(oneInt);
        std::cout << oneInt << " ";

        int ch;
        while ((ch = file.peek()) != std::char_traits<char>::eof())
        {
            if (ch == ' ' || ch == '\t' || ch == '\r' || ch == '\n')
            {
                char ch2;
                if (!file.read(&ch2, 1))
                {
                    break;
                }
                if (ch == '\n' && !insertNewLine)
                {
                    std::cout << std::endl;
                    insertNewLine = true;
                }
            }
            else
            {
                break;
            }
        }
    }
    //Here we should probably check if we exited for eof (good)
    //or for other file errors (bad! bad! bad!)
    return 0;
}
There is a function called getline() which will read a whole line.
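Since the question rules out std::string, the relevant overload is istream::getline into a char buffer; the integers can then be pulled out with strtol. A rough sketch under those assumptions (the file name and buffer size are placeholders):
#include <fstream>
#include <iostream>
#include <cstdlib>

int main()
{
    std::ifstream file("example.txt");        // placeholder file name
    static char buf[131072];                  // sized generously for many integers per line

    int lineNo = 0;
    while (file.getline(buf, sizeof(buf)))    // reads up to (but not including) '\n'
    {
        char* p = buf;
        char* end = 0;
        // strtol converts one integer and reports where it stopped;
        // when nothing was converted, end == p and the line is done.
        for (long v = std::strtol(p, &end, 10); p != end; v = std::strtol(p, &end, 10))
        {
            p = end;
            std::cout << "line " << lineNo << ": " << v << "\n";
        }
        ++lineNo;
    }
    return 0;
}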
You need a function that reads a value from the file or indicates an end-of-line or end-of-file condition, something like:
result_type GetNextValue (input_file, &value)
{
    if next thing in file is a number, set value and return number_type
    if next thing in file is an end of line, return end_of_line_type
    if end of file found, return end_of_file_type
}
and then your array building loop becomes:
line = 0
item = 0
eof = false
while (!eof)
{
    switch (GetNextValue (input_file, value))
    {
        case value_type:
            array [line][item++] = value
        case end_of_line_type:
            line++;
            item = 0;
        case end_of_file_type:
            eof = true
    }
}
I'll leave the details to you as it's homework.
You could read the input character by character into a char and check against the newline. A snippet that I just tried is given below:
ifstream ifile;
ifile.open("a.txt");
int ch;   // int, so the comparison against EOF works reliably
while((ch = ifile.get()) != EOF)
{
    std::cout << static_cast<char>(ch) << "\n";
    if (ch == '\n')
        std::cout << "Got New Line";
}
ifile.close();