I need to read a text file that's in this format:
n k
S1
S2
S3
.
.
.
Sn
n and k being integers, and the S's being strings. Now, as far as I've seen, a string cannot be read with the fscanf function; rather, an array of chars has to be used.
The problem is that I need to set the length of the character array even though I have no way of knowing how long a word will be:
in = fopen("01.in", "r");
int N, k;
fscanf(in, "%d %d", &N, &k);
for (int i = 0; i < N; i++) {
    char temp[100];
    fscanf(in, "%s", temp);
}
So is there a way to maybe use vectors or something?
Or maybe in the off case that this problem cannot be solved, can I convert a string of chars into a string, and then create a vector of strings?
Why not use std::ifstream and std::getline? Something like this:
std::ifstream in("01.in");
int N, k;
if(!(in >> N >> k))
{
    std::cerr << "Error reading file!" << '\n';
    return EXIT_FAILURE;
}
in.ignore(std::numeric_limits<std::streamsize>::max(), '\n'); // skip the rest of the first line before the first getline
std::string line; // read lines into this
int i = 0;
while(i < N && std::getline(in, line))
{
    // deal with line here
    ++i; // keep track
}
The first step towards code sanity here is to stop using char arrays and start using std::string instead. The big difference between the two is that an array's size is set in stone at compile time, whereas a std::string's initial size can be chosen at runtime, and it can also grow and shrink while the program runs.
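To make that concrete, here is a minimal sketch of that difference (reading the size from std::cin is just for illustration):

#include <cstddef>
#include <iostream>
#include <string>

int main()
{
    std::size_t n = 0;
    std::cin >> n;                 // size chosen at runtime, not at compile time
    std::string s(n, 'x');         // a string of n copies of 'x'
    std::cout << s.size() << '\n'; // prints n
    s += " and it can still grow"; // the string grows as needed
    std::cout << s.size() << '\n'; // prints n + 22
    return 0;
}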
Now, as far as I've seen a string cannot be read with fscanf function,
but rather an array of char's has to be used.
Since C++11, that's not strictly true. std::strings are in many ways compatible with C functions. For example, you can safely get a pointer to the underlying buffer with &s[0]. Therefore, you could technically do this:
std::string temp(100, '\0');
fscanf(in, "%s", &temp[0]);
But that has not gotten us far. Apart from some other bad things about this "solution" (unidiomatic, undefined behaviour if too many characters are read, wasteful if too few characters are read), as you can see, the original problem still persists; the number 100 is hard-coded in the program. This is the real problem, as you have also said in the comment you added:
What I mean is what if I get a string that's longer than 100 characters?
The answer to that is: Just don't use fscanf anymore. Use std::ifstream along with the std::getline function. std::getline reads a whole line, i.e. everything until the next line break, and stores the result in a std::string. Size and memory management are all handled automatically for you:
std::ifstream is("01.in");
std::string temp;
std::getline(is, temp);
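And since the question also asks about vectors: a sketch along these lines (reusing the 01.in file name and the n/k header from the question) collects the N strings into a std::vector<std::string>:

#include <fstream>
#include <iostream>
#include <limits>
#include <string>
#include <vector>

int main()
{
    std::ifstream is("01.in");
    int N, k;
    is >> N >> k;
    is.ignore(std::numeric_limits<std::streamsize>::max(), '\n'); // skip the rest of the first line

    std::vector<std::string> words;
    std::string temp;
    while (static_cast<int>(words.size()) < N && std::getline(is, temp))
        words.push_back(temp);

    std::cout << "read " << words.size() << " strings\n";
    return 0;
}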
char str1[10];
fscanf(fp , "%s", str1);
I want to know size of %s before assigning to str1 to avoid crashing in case of huge input data.
Since this is C++, you don't have to rely on fscanf. We can avoid buffer overflow altogether by using std::istream:
std::string readWord(std::istream& input) {
    std::string word;
    if(input >> word) {
        return word;
    } else {
        // Handle error, e.g. by throwing or returning an empty string;
        // falling off the end of a non-void function would be undefined behaviour.
        return std::string();
    }
}
This will read characters until the first whitespace character is encountered, and it will automatically allocate memory as needed.
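For instance, a small usage sketch (the file name here is just a placeholder, and the error case simply returns an empty string):

#include <fstream>
#include <iostream>
#include <string>

std::string readWord(std::istream& input) {
    std::string word;
    if (input >> word)
        return word;
    return std::string();   // error or EOF: return an empty word
}

int main()
{
    std::ifstream fp("input.txt");   // placeholder file name
    std::string word;
    while (!(word = readWord(fp)).empty())
        std::cout << word << " (" << word.size() << " chars)\n";
    return 0;
}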
You can specify the maximum number of characters to read:
fscanf(fp, "%9s", str1);
This will not write more than 10 chars into str1 including the null terminator.
But in C++ you should use streams and strings which are safe in this respect.
I have learned my lesson, so I will be short and to the point.
I need a function in my class that can read a file line by line and store the lines in an array/string so I can use them.
I have the following example (please don't laugh, I am a beginner):
int CMYCLASS::LoadLines(std::string Filename)
{
    std::ifstream input(Filename, std::ios::binary | ios::in);
    input.seekg(0, ios::end);
    char* title[1024];
    input.read((char*)title, sizeof(int));
    // here what ?? -_-
    input.close();
    for (int i = 0; i < sizeof(title); i++)
    {
        printf(" %.2X ", title[i]);
    }
    printf("\n");
    return 0;
}
I'm not sure exactly what you are asking.
However - below is some code that reads a file line-by-line and stores the lines in a vector. The code also prints the lines - both as text lines and the integer value of each character. Hope it helps.
#include <fstream>
#include <iomanip>
#include <iostream>
#include <string>
#include <vector>

using namespace std;

int main()
{
    std::string Filename = "somefile.bin";
    std::ifstream input(Filename, std::ios::binary | ios::in); // Open the file

    std::string line;               // Temp variable
    std::vector<std::string> lines; // Vector for holding all lines in the file

    while (std::getline(input, line)) // Read lines as long as the file stream is good
    {
        lines.push_back(line);        // Save the line in the vector
    }

    // Now the vector holds all lines from the file
    // and you can do whatever you want with them.
    // For instance we can print the lines -
    // both as a text line and as the hexadecimal value of every character.
    for (auto s : lines)              // For each line in the vector
    {
        cout << s;                    // Print it
        for (auto c : s)              // For each character in the line
        {
            cout << hex                            // switch to hexadecimal
                 << std::setw(2)                   // print it in two digits
                 << std::setfill('0')              // with a leading zero
                 << (unsigned int)(unsigned char)c // cast to get the integer value (via unsigned char to avoid sign extension)
                 << dec                            // back to decimal
                 << " ";                           // and a space
        }
        cout << endl;                 // new line
    }
    return 0;
}
I do not laugh at your original code - no way - I was also a beginner once. But your code is C-style code and contains a lot of bugs. So my advice is: please use C++ style instead. For instance: never use C-style strings (i.e. char arrays). They are so error prone...
As you are a beginner (your own words :) let me explain a few things about your code:
char* title[1024];
This is not a string. It is 1024 pointers to characters, which could also be 1024 pointers to C-style strings. However, you have not reserved any memory for holding the strings.
The correct way would be:
char title[1024][256]; // 1024 lines with a maximum of 256 chars per line
Here you must make sure that the input file has no more than 1024 lines and that each line is less than 256 chars.
Code like that is very bad. What to do if the input file has 1025 lines?
This is where C++ helps you. Using std::string you don't need to worry about the length of the string. The std::string container will just adjust to the size of whatever you put into it.
A std::vector is like an array, but without a fixed size: you can just keep adding to it and it will automatically adjust its size.
So C++ offers std::string and std::vector to help you handle the dynamic size of the input file. Use them...
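To make that concrete, a tiny sketch:

#include <iostream>
#include <string>
#include <vector>

int main()
{
    std::vector<std::string> lines;               // starts empty
    lines.push_back("first line");                // grows to size 1
    lines.push_back("a much longer second line"); // grows to size 2
    std::cout << lines.size() << " lines, the last one has "
              << lines.back().size() << " characters\n";
    return 0;
}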
Good luck.
I have a text file with numbers ranging from 0-255 separated by commas. I want to be able to store each of these numbers into an integer array. An example of what the text file might contain is:
"32,51,45,12,5,2,7,2,9,233,132,175,143,33..." etc
I have managed to get my program to store the data from the text file as a string and output them on the screen. What I need to do next is store the values of that string in an integer array, separating the numbers by the commas.
Here is the code I have written so far, which I am having problems getting to work:
int _tmain(int argc, _TCHAR* argv[])
{
    string line;
    ifstream myfile ("example.txt");
    if (myfile.is_open())
    {
        while ( myfile.good() )
        {
            getline (myfile,line);
            cout << line << endl;
        }
        myfile.close();
    }
    else cout << "Unable to open file";

    //STRING CONVERSION
    std::string str = line;
    std::vector<int> vect;
    std::stringstream ss(str);
    int i = 0;
    while (ss >> i)
    {
        vect.push_back(i);
        if (ss.peek() == ',')
            ss.ignore();
    }
    system("pause");
    return 0;
}
It looks like your code for tokenizing your string is a bit off. In particular, you need to make sure you call atoi() on each number's string to get an actual integer. I'll focus on the parsing of the string, though.
One thing you could use is C's strtok. I recommend this mainly because your case is rather simple, and this is probably the simplest way to go about it.
The code you'd look for is essentially this:
char* numStr = strtok(&str[0], ","); // strtok needs a modifiable buffer, so the const char* from c_str() won't do
while (numStr)
{
    vect.push_back(atoi(numStr));
    numStr = strtok(NULL, ",");
}
strtok() takes two arguments: a pointer to the C-style string (char*) you're tokenizing, and the string of delimiters (note that each character in the delimiter string is treated as its own delimiter).
I should mention that strtok is not thread-safe, and you also have to ensure that the string you extract from the file ends with a null character \0.
The answers to this question provide many alternatives to my solution. If you'd prefer to use std::stringstream then I suggest you look at the 5th answer on that page.
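If you do go the std::stringstream route, a comma-splitting helper could look something like this (a sketch using std::getline with a delimiter plus std::stoi, which requires C++11):

#include <sstream>
#include <string>
#include <vector>

std::vector<int> parseNumbers(const std::string& str)
{
    std::vector<int> vect;
    std::stringstream ss(str);
    std::string token;
    while (std::getline(ss, token, ','))   // split on commas
        vect.push_back(std::stoi(token));  // convert each token to an int
    return vect;
}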
Regarding your trouble with PDBs, what is the exact error you're getting?
I am trying to read from a text file and tokenize the input. I was getting a segmentation fault until I realized I forgot to close my ifstream. I added the close call and now it loops infinitely. I'm just trying to learn how to use strtok for now, that is why the code doesn't really look complete.
void loadInstructions(char* fileName)
{
    ifstream input;
    input.open(fileName);
    while(!input.eof());
    {
        string line;
        getline (input,line);
        char * lineChar = &line[0];
        //instruction cmd; //This will be used later to store instructions from the parse
        char * token;
        token = strtok (lineChar," ");
        // just trying to get the line number for now
        int lineNumber = atoi(token);
        cout << lineNumber << "\n";
    }
    input.close();
}
input file:(one line)
5 +8 0 0 25
This while(!input.eof()); is probably not what you intended - note the stray semicolon, which gives the loop an empty body, so the stream is never read and eof() is never reached...
Use this:
string line;
while(getline (input,line))
{
If the getline() works then the loop is entered.
If you try and read past the EOF then it will fail and the loop will exit.
So this should work as expected.
Rather than using strtok() (which modifies the string it tokenizes) and atoi() (which offers no error detection), use std::stringstream:
std::stringstream linestream(line);
int lineNumber;
linestream >> lineNumber; // reads a number from the line.
Don't explicitly close() the stream (unless you want to detect and correct for any problems). The file will be closed when the object goes out of scope at the end of the function.
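Putting those pieces together, a corrected loadInstructions might look roughly like this (a sketch based on the points above):

#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

void loadInstructions(const char* fileName)
{
    std::ifstream input(fileName);

    std::string line;
    while (std::getline(input, line))     // runs once per line, stops at EOF
    {
        std::stringstream linestream(line);
        int lineNumber;
        if (linestream >> lineNumber)     // read the leading number from the line
            std::cout << lineNumber << "\n";
        // parse the rest of the instruction from linestream here later
    }
    // no explicit close(): the ifstream closes itself when it goes out of scope
}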
You want to test whether the read succeeded rather than checking eof() or good() up front.
Avoid strtok. There are other ways to tokenize a string that do not require the called function to modify your string. The fact that it modifies the string it tokenizes could also be what causes the loop here.
But more likely, the loop condition (and the stray semicolon after it) is what's wrong. Try looping on the getline call itself or similar, depending on what you need.
While you've already gotten some answers to the question you asked, perhaps it's worth answering some you should have about the code that you didn't ask:
void loadInstructions(char* fileName)
Since the function isn't going to modify the file name, you almost certainly want to change this to:
void loadInstructions(char const *fileName)
or
void loadInstructions(std::string const &fileName)
ifstream input;
input.open(fileName);
It's much cleaner to combine these:
ifstream input(fileName);
or (if you passed a string instead):
ifstream input(fileName.c_str());
while(!input.eof());
This has already been covered.
string line;
getline (input,line);
char * lineChar = &line[0];
//instruction cmd; //This will be used later to store instructions from the parse
char * token;
token = strtok (lineChar," ");
// just trying to get the line number for now
int lineNumber = atoi(token);
Most of this is just extraneous. You can just let atoi convert directly from the original input:
string line;
getline(input, line);
int lineNumber = atoi(line.c_str());
If you're going to tokenize more later, you can use strtol instead:
char *end_ptr;
int lineNumber = strtol(line.c_str(), &end_ptr, 10);
This will set end_ptr to point just past the end of the part that strtol converted.
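For example, with the sample line from the question ("5 +8 0 0 25"), you could keep calling strtol from end_ptr to pull out the remaining numbers - a rough sketch:

#include <cstdlib>
#include <iostream>
#include <string>

int main()
{
    std::string line = "5 +8 0 0 25";   // sample input line from the question

    const char *pos = line.c_str();
    char *end_ptr;
    long lineNumber = strtol(pos, &end_ptr, 10);  // reads 5, end_ptr points just past it
    std::cout << "line number: " << lineNumber << "\n";

    // keep converting from where the previous call stopped
    for (;;) {
        pos = end_ptr;
        long value = strtol(pos, &end_ptr, 10);
        if (end_ptr == pos)            // nothing was converted: no more numbers
            break;
        std::cout << "operand: " << value << "\n";
    }
    return 0;
}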
I'd also consider an alternative though: moving your code for reading and parsing a line into a class, and define operator>> to read those:
struct line {
    int number;
    operator int() { return number; }
};

std::istream &operator>>(std::istream &is, line &l) {
    // Just for fun, we'll read the data in an alternative fashion.
    // Instead of reading a line into a buffer, then parsing out the first number,
    // we'll read a number from the stream, then ignore the rest of the line.
    // I usually prefer the other way, but this is worth knowing as well.
    is >> l.number;
    // When you're ready to parse more, add the extra parsing code here.
    is.ignore(std::numeric_limits<std::streamsize>::max(), '\n');
    return is;
}
With this in place, we can print out the line numbers pretty easily:
std::copy(std::istream_iterator<line>(input),
          std::istream_iterator<line>(),
          std::ostream_iterator<int>(std::cout, "\n"));
input.close();
I'd usually just let the stream close automatically when it goes out of scope.
How would you get the size of a string entered on the console, i.e. the number of valid characters in the buffer?
char buffer[100];
cin >> buffer;
I'm looking to put the '\0' where the input ends.
Prefer using std::string instead of char* or char[]. That makes such things easy! The problem with char buffer[100] is that if the input is 100 characters or longer, then cin >> buffer invokes undefined behavior, as it would write beyond the end of the array (there is no room left for the terminating '\0'). This problem is easily avoided if you use std::string.
std::string input;
cin >> input; //this can read string of any unknown size!
cout << "length of input string : " << input.size()<< endl;
You can also use input.length() instead of input.size(). They return the same value.
Online Demo : http://www.ideone.com/Wdo31
The question is moot. When the user types 100 or more characters, you have a buffer overrun. You may crash; if you don't, you have a security hole. You shouldn't do this. Read the input a character at a time, or use a safer string facility; gets_s comes to mind if it's supported on your platform.
But in answer to your question, this might be what you need:
char buffer[100] = {}; // zero-init the entire array
int length = 0;
cin >> buffer;
length = strlen(buffer); // length is the length of the string
You don't need to (and quite possibly, can't). Use a std::string instead of a char buffer.
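For example, a minimal sketch that reads a whole line (spaces included) into a std::string and reports its length:

#include <iostream>
#include <string>

int main()
{
    std::string buffer;
    std::getline(std::cin, buffer);          // reads the whole line, whatever its length
    std::cout << "length: " << buffer.size() << '\n';
    return 0;
}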