I really didn't find a satisfactory answer on Google, and I/O in C++ is a little tricky. I would like to read a text file in blocks into a vector, if possible. Alas, I couldn't figure out how. I am not even sure that my infinite loop will break in all cases, because I/O is tricky. The best way I was able to figure out is this:
char buffer[1025]; //let's say read by 1024 char block
buffer[1024] = '\0';
std::fstream fin("index.xml");
if (!fin) {
std::cerr << "Unable to open file";
} else {
while (true) {
fin.read(buffer, 1024);
std::cout << buffer;
if (fin.eof())
break;
}
}
Please note the second line with '\0'. Isn't it odd? Can I do something better? Can I read the data into a vector instead of a char array? Is it appropriate to read into a vector directly?
Thanks for your answers.
PS: Reading by chunks does make sense. The code is short, but I am storing the data in a cyclic buffer.
You should be fine doing the following:
vector<char> buffer (1024,0); // create vector of 1024 chars with value 0
fin.read(&buffer[0], buffer.size());
The elements in a vector are guaranteed to be stored contiguously, so this should work, but you should ensure that the vector is never empty. I asked a similar question here recently; check the answers to it for specific details from the standard: Can I call functions that take an array/pointer argument using a std::vector instead?
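To flesh that out into a full chunked read, here is a sketch; the 1024-byte block size and the helper name are my own choices, not from the question:

```cpp
#include <fstream>
#include <iostream>
#include <vector>

// Read a whole file in 1024-byte chunks into one vector<char>.
// gcount() reports how many bytes the last read() actually delivered,
// which handles the final partial chunk correctly.
std::vector<char> read_chunked(const char* filename) {
    std::ifstream fin(filename, std::ios::binary);
    std::vector<char> result;
    std::vector<char> buffer(1024, 0);   // one 1024-byte chunk
    while (fin.read(&buffer[0], buffer.size()) || fin.gcount() > 0) {
        // append only the bytes that were actually read
        result.insert(result.end(), buffer.begin(),
                      buffer.begin() + fin.gcount());
    }
    return result;
}
```

The loop condition covers both cases: a full block (read() succeeds) and the trailing partial block (read() fails but gcount() is still positive).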
std::ifstream fin("index.xml");
std::stringstream buffer;
buffer << fin.rdbuf();
std::string result = buffer.str();
Exactly what you need.
Recently, I encountered the same problem. I used the read and gcount functions to solve it, and it works well. Here is the code.
vector<string> ReadFileByBlocks(const char* filename)
{
vector<string> vecstr;
ifstream fin(filename, ios_base::in);
if (fin.is_open())
{
char* buffer = new char[1024];
while (fin.read(buffer, 1024))
{
// a full 1024-byte block was read; buffer is not NUL-terminated,
// so pass the length explicitly instead of string s(buffer)
string s(buffer, 1024);
vecstr.push_back(s);
}
// the last block may hold fewer than 1024 bytes;
// fin.gcount() returns how many bytes were actually read
string s(buffer, fin.gcount());
vecstr.push_back(s);
delete[] buffer;
fin.close();
}
else
{
cerr << "Cannot open file:" << filename << endl;
}
return vecstr;
}
I have learned my lesson, so I will be short and to the subject.
I need a function in my class that can read a file line by line and store the lines in an array/string so I can use them.
I have the following example (please don't laugh, I am a beginner):
int CMYCLASS::LoadLines(std::string Filename)
{
std::ifstream input(Filename, std::ios::binary | ios::in);
input.seekg(0, ios::end);
char* title[1024];
input.read((char*)title, sizeof(int));
// here what ?? -_-
input.close();
for (int i = 0; i < sizeof(title); i++)
{
printf(" %.2X ", title[i]);
}
printf("\n");
return 0;
}
I'm not sure exactly what you are asking.
However, below is some code that reads a file line by line and stores the lines in a vector. The code also prints the lines, both as text and as the integer value of each character. Hope it helps.
#include <fstream>
#include <iomanip>
#include <iostream>
#include <string>
#include <vector>

int main()
{
std::string Filename = "somefile.bin";
std::ifstream input(Filename, std::ios::binary | std::ios::in); // Open the file
std::string line; // Temp variable
std::vector<std::string> lines; // Vector for holding all lines in the file
while (std::getline(input, line)) // Read lines until the stream fails (e.g., at EOF)
{
lines.push_back(line); // Save the line in the vector
}
// Now the vector holds all lines from the file
// and you can do whatever you want with it
// For instance we can print the lines
// Both as a line and as the hexadecimal value of every character
for (const auto& s : lines) // For each line in the vector
{
std::cout << s; // Print it
for (auto c : s) // For each character in the line
{
std::cout << std::hex // switch to hexadecimal
<< std::setw(2) // print it in two digits
<< std::setfill('0') // with a leading zero
<< (unsigned int)(unsigned char)c // cast via unsigned char to avoid sign extension
<< std::dec // back to decimal
<< " "; // and a space
}
std::cout << std::endl; // new line
}
return 0;
}
I do not laugh at your original code, no way; I was also a beginner once. But your code is C-style and contains a lot of bugs. So my advice is: please use C++ style instead. For instance, never use C-style strings (i.e., char arrays); they are so error prone...
As you are a beginner (your own words :) let me explain a few things about your code:
char* title[1024];
This is not a string. It is an array of 1024 pointers to char, which could also be 1024 pointers to C-style strings. However, you have not reserved any memory to hold those strings.
The correct way would be:
char title[1024][256]; // 1024 lines with a maximum of 256 chars per line
Here you must make sure that the input file has fewer than 1024 lines and that each line is less than 256 chars.
Code like that is very fragile. What do you do if the input file has 1025 lines?
This is where C++ helps you. Using std::string you don't need to worry about the length of the string; the std::string container simply adjusts to the size you put into it.
A std::vector is like an array, but without a fixed size, so you can just keep adding to it and it will adjust its size automatically.
So C++ offers std::string and std::vector to handle the dynamic size of the input file. Use them...
Good luck.
I'm trying to make a program that will read a file, change a specified word to '#' symbols, and write it back to the same file. But I have problems with that.
1st question.
It seems like I need to store the file in a buffer before writing it back. How should I do that?
2nd question:
I can't understand why the loop in this code never ends. There are about 200 words in that file, but I always get a memory exception and i reaches 10075.
int main(int argc, char* argv[]){
char** temp = new char*[10000];
int i = 0;
fstream fTemp("D:\doc.txt", ios_base::in);
while (!fTemp.eof()){
temp[i] = new char[50];
fTemp >> temp[i];
temp[i][1] = '#';
cout << temp[i] << endl;
i++;
}
fTemp.open("D:\doc.txt", ios_base::trunc);
for (int i = 0; i < sizeof(*temp); i++){
fTemp << temp[i];
}
_getch();
}
First, you should use getline, as your usage of eof is incorrect (the eof bit is set only after a failed read).
Next, store the strings in a std::vector<string>. This will allow you to not care about memory management (current one is leaking) and provide a more flexible solution.
std::string buffer;
std::vector<string> data;
while(std::getline(fTemp,buffer)) {
data.push_back(buffer);
}
The problem you have is probably the incorrect eof() call, but you should check your cout output to pin down the problem with this code.
To store the file's data in a buffer, you can get the size of the file and then use read to pull in all the data at once. See this code:
// Load file in a buffer
ifstream fl(FileName);
fl.seekg(0, ios::end);
size_t len = fl.tellg();
char* fdata = new char[len];
fl.seekg(0, ios::beg);
fl.read(fdata, len);
fl.close();
In your case, the same fstream that you used for reading is reused for writing without closing the file before reopening it.
Your loop never ends because temp is a raw pointer whose size isn't tracked anywhere. A better way is to get the size of the file while it is open, which in this code is "size_t len".
To rewrite the file you can create another stream. See this code:
// Write File
ofstream flOut(FileName, ios_base::trunc);
flOut.write(fdata, len);
flOut.close();
Between these two pieces of code you can change the data in fdata. But what exactly do you want to do? Replace some word with '#' symbols? Which word?
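If the goal really is to blank out one word, the replacement itself could look like the sketch below; the word "secret" is only a placeholder, since the question doesn't say which word:

```cpp
#include <string>

// Replace every occurrence of `word` in `text` with the same number
// of '#' characters. The target word is supplied by the caller.
void mask_word(std::string& text, const std::string& word) {
    for (std::string::size_type pos = text.find(word);
         pos != std::string::npos;
         pos = text.find(word, pos + word.size())) {
        text.replace(pos, word.size(), std::string(word.size(), '#'));
    }
}
```

Applied to the fdata buffer (converted to a std::string first), this avoids all the manual char** bookkeeping.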
I'm trying to read character by character from a text file until EOF and put the characters into a character array so that I can manipulate them afterwards. It compiles with g++ without errors, but when run, I'm prompted for the input file name and then it just hangs.
int main (int argc, char *argv[]) {
string filename;
ifstream infile;
char *cp, c[1024];
memset (c, 0, sizeof(c));
cp = c;
cout << "Enter file name: " << endl;
cin >> filename;
//open file
infile.open( filename.c_str() );
//if file can't open
if(!infile) {
cerr << "Error: file could not be opened" << endl;
exit(1);
}
while (!infile.eof()); {
infile.get(c, sizeof(infile));
// get character from file and store in array c[]
}
}//end main
You should try the istream::read() method rather than get(). This will help avoid buffer overruns:
unsigned int chars_read = 0;
//...
// Read in the file.
if (!infile.read(c, sizeof(c)))
{
// Handle the read error here.
// Also check for EOF here too.
}
// Obtain the number of characters actually read.
chars_read = infile.gcount();
First off, you don't want to test for eof()! Somehow I start to feel like Don Quixote, having found my windmills. However, I do know that you need to check that the input was successful after trying to read it, because before attempting the read the stream can't know whether it will succeed.
Your program actually doesn't hang! It just waits for you to enter sizeof(infile) characters or end the input (e.g., using Ctrl-D on UNIXes and Ctrl-Z on Windows). Of course, this may look remarkably like a hanging program. You can verify that this is indeed the problem by using a smaller size, e.g., 4. Of course, sizeof(infile) is nearly as good as a small random number: it is the size of an object of type std::ifstream, and who can tell what that is? You probably meant to use sizeof(c) to make sure that the call to get(c, n) won't write more characters than can fit into c.
Try this:
int cont = 0;
int ch;
while (cont < (int)sizeof(c) &&
       (ch = infile.get()) != std::char_traits<char>::eof()) {
c[cont++] = (char)ch;
}
I have the following code:
std::string line;
std::ifstream myfile ("text.txt");
if (myfile.is_open())
{
while ( myfile.good() )
{
getline (myfile,line);
std::cout << line << std::endl;
}
myfile.close();
}
is there a way to do it, and use char* instead of string?
Yes, if you really insist. There's a version of getline that's a member of std::istream that will do it:
char buffer[1024];
std::ifstream myfile("text.txt");
while (myfile.getline(buffer, sizeof(buffer)))
std::cout << buffer << "\n";
myfile.close();
Note, however, that most C++ programmers would consider this obsolescent at best. Oh, and for the record, the loop in your question isn't really correct either. Using string, you'd typically want something like:
std::string line;
std::ifstream myfile("text.txt");
while (std::getline(myfile, line))
std::cout << line << "\n";
myfile.close();
or, you could use the line proxy from one of my previous answers, in which case it becomes simpler still:
std::copy(std::istream_iterator<line>(myfile),
std::istream_iterator<line>(),
std::ostream_iterator<std::string>(std::cout, "\n"));
So you're looking for a more "C-like" solution?
#include<cstdio>
#define ENOUGH 1000
int main() {
char buffer[ENOUGH];
FILE* f = fopen("text.txt", "r");
while (true) {
if (fgets(buffer, ENOUGH, f) == NULL) break;
puts(buffer);
}
fclose(f);
return 0;
}
...plus a check that the file was opened correctly. Here you use fgets() on the file f, reading into the char buffer. However, buffer only has ENOUGH bytes allocated, and this limit is also an important parameter to fgets(): it will stop after reading ENOUGH - 1 characters of a line, so you should make sure the ENOUGH constant is large enough.
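If a line might exceed the buffer, one common pattern is to keep calling fgets() and stitch the chunks together until a newline appears. This is my own sketch, not part of the answer above; the tiny 16-byte chunk is deliberate, to exercise the stitching:

```cpp
#include <cstdio>
#include <string>

// Accumulate fgets() chunks into a std::string until the chunk ends in
// '\n' (or the file ends), so lines longer than the buffer survive intact.
bool read_full_line(FILE* f, std::string& out) {
    char chunk[16]; // deliberately small to demonstrate the stitching
    out.clear();
    while (std::fgets(chunk, sizeof(chunk), f) != NULL) {
        out += chunk;
        if (!out.empty() && out[out.size() - 1] == '\n')
            return true;      // got a complete line
    }
    return !out.empty();      // last line without a trailing newline
}
```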
But if you didn't mean to solve this in a "C-like" way, but are still going to use <iostream>, then you probably just want to know that the c_str() method of std::string returns the char* representation of that std::string.
I want to put each byte in a char array and rewrite the text file removing the first 100,000 characters.
int fs=0;
ifstream nm,nm1;
nm1.open("C:\\Dev-Cpp\\DCS\\Decom\\a.txt");
if(nm1.is_open())
{
nm1.seekg(0, ios::end );
fs = nm1.tellg();
}
nm1.close();
char ss[500000];
nm.open("C:\\Dev-Cpp\\DCS\\Decom\\a.txt");
nm.read(ss,fs-1);
nm.close();
ofstream om;
om.open("C:\\Dev-Cpp\\DCS\\Decom\\a.txt");
for(int i=100000;i<fs-1;i++){
om >> ss[i];
}
om.close();
The problem is I can't make the character array 5 million in size. I also tried using a vector:
vector <char> ss (5000000);
int w=0;
ifstream in2("C:\\Dev-Cpp\\DCS\\Decom\\a.txt", ios::binary);
unsigned char c2;
while( in2.read((char *)&c2, 1) )
{
in2 >> ss[w];
w++;
}
Here w ends up almost half of fs, and a lot of characters are missing.
How do I do this?
In most implementations, char ss[5000000] tries allocating on the stack, and the size of the stack is limited as compared to the overall memory size. You can often allocate larger arrays on the heap than on the stack, like this:
char *ss = new char [5000000];
// Use ss as usual
delete[] ss; // Do not forget to delete
Note that if the file size fs is larger than 5000000, you will write past the end of the buffer. You should limit the amount of data that you read:
nm.read(ss,min(5000000,fs-1));
This part is not correct
while( in2.read((char *)&c2, 1) )
{
in2 >> ss[w];
w++;
}
because you first read one character into c2 and then, if that succeeds, read another character into ss[w].
I'm not at all surprised that you lose about half the characters here!
The best way to solve your problem is to use the facilities of the standard library. That way, you also don't have to care about buffer overflows.
The following code is untested.
std::fstream file("C:\\Dev-Cpp\\DCS\\Decom\\a.txt", std::ios_base::in);
if (!file)
{
std::cerr << "could not open file C:\\Dev-Cpp\\DCS\\Decom\\a.txt for reading\n";
exit(1);
}
std::vector<char> ss; // do *not* give a size here
ss.reserve(5000000); // *expected* size
// if the file is too large, the capacity will automatically be extended
std::copy(std::istreambuf_iterator<char>(file), std::istreambuf_iterator<char>(),
std::back_inserter(ss));
file.close();
file.open("C:\\Dev-Cpp\\DCS\\Decom\\a.txt", std::ios_base::out | std::ios_base::trunc);
if (!file)
{
std::cerr << "could not open C:\\Dev-Cpp\\DCS\\Decom\\a.txt for writing\n";
exit(1);
}
if (ss.size() > 100000) // only if the file actually contained more than 100000 characters
std::copy(ss.begin()+100000, ss.end(), std::ostreambuf_iterator<char>(file));
file.close();