I am trying to read an entire text file using VC++ with this code:
ifstream file (filePath, ios::in|ios::binary|ios::ate);
if (file.is_open())
{
    size = (long)file.tellg();
    char *contents = new char [size];
    file.seekg (0, ios::beg);
    file.read (contents, size);
    file.close();
    isInCharString("eat", contents);
    delete [] contents;
}
but it doesn't fetch the entire file. Why, and how can I handle this?
Note: the file size is 1.87 MB and it has 39,854 lines.
You are missing the following line
file.seekg (0, file.end);
before:
size = file.tellg();
file.seekg (0, file.beg);
As described in this example: http://www.cplusplus.com/reference/istream/istream/read/
Another way to do this is:
std::string s;
{
    std::ifstream file ("example.bin", std::ios::binary);
    if (file) {
        std::ostringstream os;
        os << file.rdbuf();
        s = os.str();
    }
    else {
        // error
    }
}
Alternatively, you can use the C library functions fopen, fseek, ftell, fread, and fclose. The C API can be faster in some cases, at the cost of a less convenient, non-STL interface.
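For illustration, a rough sketch of that C-style approach (the file name is a placeholder, and error checking is kept minimal):
#include <cstdio>

FILE *fp = std::fopen("example.bin", "rb");              // binary mode
if (fp) {
    std::fseek(fp, 0L, SEEK_END);
    long size = std::ftell(fp);                          // file size in bytes
    std::fseek(fp, 0L, SEEK_SET);
    char *contents = new char[size];
    std::size_t got = std::fread(contents, 1, size, fp); // got may be < size on error
    std::fclose(fp);
    // ... use contents[0 .. got-1] ...
    delete[] contents;
}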
You really should get into the habit of reading the documentation. ifstream::read is documented as sometimes not reading all the bytes, and:
The number of characters successfully read and stored by this function
can be accessed by calling member gcount.
So you might debug your issue by looking at file.gcount() and file.rdstate(). Also, for such big reads, using the istream::readsome member function in an explicit loop might be more relevant. (I would suggest reading in chunks of e.g. 64 KB.)
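For instance, a minimal sketch of such a chunked read using read plus gcount (reusing filePath from the question; the buffer size is just the suggested 64 KB):
#include <fstream>
#include <string>

std::ifstream file(filePath, std::ios::binary);
std::string contents;
char chunk[65536];                       // read 64 KB at a time
while (file) {
    file.read(chunk, sizeof chunk);
    std::streamsize n = file.gcount();   // how many bytes this read actually got
    if (n > 0)
        contents.append(chunk, static_cast<std::size_t>(n));
}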
PS: it might be some implementation- or system-specific issue.
Thanks all, I found where the error was:
the code in my question actually reads the entire file just fine;
the problem was in the Visual Studio watch window itself, which only displays a certain amount of data, not the full text file.
Related
I'm new to C++. I have an image named "test.jpg", and I convert it to base64 and then decode it again like this:
std::ifstream inputFile;
inputFile.open("test.jpg",std::ios::binary);
std::filebuf* pbuf = inputFile.rdbuf();
inputFile.seekg (0, ios::end);
int length = inputFile.tellg();
// allocate memory to contain file data
char* buffer=new char[length];
// get file data
pbuf->sgetn (buffer,length);
inputFile.close();
CBase64 base64;
string encodedData = base64.base64_encode((unsigned char*)buffer,length);
delete[] buffer;
string decodedData = base64.base64_decode(encodedData);
ofstream outPutFile;
outPutFile.open("test2.jpg",ios::binary | ios::out);
outPutFile.write(decodedData.c_str(), decodedData.length());
outPutFile.close();
the "test2.jpg" has exact same size as "test.jpg"(the original file) but, i can't open it.
i couldn't find what is the problem.
I got it working. I just replaced:
outPutFile.open("test2.jpg",ios::binary | ios::out);
with
outPutFile.open("test2.jpg", ios_base::out | ios_base::binary);
std::string path = "file.txt";
std::string cfgString = "data";
std::ofstream output(path.c_str(), ios_base::out | std::ios::binary);
if (output.is_open()) {
output.write(cfgString.data(), cfgString.length());
}
output.close();
There is no obvious problem with your file-writing logic, even though there are some irregularities. The biggest problem is in your main logic.
The program seems to be a simple file-copying program. What you are doing is reading a file, converting its data to a base64 string, and then decoding that data back into a std::string. One problem: decoding base64 into a null-terminated ANSI string cannot work reliably, for the obvious reason that any 0 byte in the decoded binary data will terminate the string prematurely. Secondly, you are opening the output file in binary mode but trying to write a std::string into it. That hardly matters, though, as your data has already been corrupted by the previous operation.
To solve this, you can simply copy the file directly, or make sure you carefully write only binary data to your output file, which means reading the input file in binary mode and writing that same buffer to the output file. No base64 encoding/decoding is required.
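As a minimal sketch of that direct binary copy (reusing the file names from the question):
#include <fstream>

std::ifstream in("test.jpg", std::ios::binary);
std::ofstream out("test2.jpg", std::ios::binary);
out << in.rdbuf();    // stream the raw bytes straight through; no base64 step involved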
It looks like you forgot to write
inputFile.seekg (0, ios::beg);
after getting the file length, which means you try to read from the end of the file instead of from its beginning.
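That is, the seek/read sequence from the question would become something like:
inputFile.seekg (0, ios::end);
int length = inputFile.tellg();
inputFile.seekg (0, ios::beg);    // rewind, otherwise sgetn reads from the end of the file
pbuf->sgetn (buffer, length);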
I am writing a C++ program that reads any kind of file and saves it as a bmp file, but first I need to read the file, and that's where the issue is:
char fileName[] = "test.jpg";
FILE * inFileForGettingSize;//This is for getting the file size
fopen_s(&inFileForGettingSize, fileName, "r");
fseek(inFileForGettingSize, 0L, SEEK_END);
int fileSize = ftell(inFileForGettingSize);
fclose(inFileForGettingSize);
ifstream inFile;//This is for reading the file
inFile.open(fileName);
if (inFile.fail()) {
cerr << "Error Opening File" << endl;
}
char * data = new char[fileSize];
inFile.read(data, fileSize);
ofstream outFile;//Writing the file back again
outFile.open("out.jpg");
outFile.write(data, fileSize);
outFile.close();
cin.get();
But when I read the file, let's say it's a plain-text file, it always outputs some weird characters at the end. For example:
assdassaasd
sdaasddsa
sdadsa
comes out as:
assdassaasd
sdaasddsa
sdadsaÍÍÍ
So when I do this with a jpg, exe, etc., it corrupts the file.
I am not trying to COPY a file (I know there are other ways to do that); I'm just trying to read a complete file byte by byte. Thanks.
EDIT:
I found out that the number of those 'Í' characters equals the number of line endings in the file, but this doesn't help me much.
This is caused by newline handling.
You open the files in text mode (because you use "r" instead of "rb" for fopen and because you don't pass ios::binary to your fstream open calls), and on Windows, text mode translates "\r\n" pairs to "\n" on reading and back to "\r\n" when writing. The result is that the in-memory size is going to be shorter than the on-disk size, so when you try to write using the on-disk size, you go past the end of your array and write whatever random stuff happens to reside in memory.
You need to open files in binary mode when working with binary data:
fopen_s(&inFileForGettingSize, fileName, "rb");
inFile.open(fileName, ios::binary);
outFile.open("out.jpg", ios::binary);
For future reference, your copy routine could be improved. Mixing FILE* I/O with iostream I/O is awkward, opening and closing the file twice is extra work, and (most importantly) if your routine is ever run on a large enough file, it will exhaust memory trying to load the entire file into RAM. Copying a block at a time would be better:
const int BUFFER_SIZE = 65536;
char buffer[BUFFER_SIZE];
while (source.good()) {
    source.read(buffer, BUFFER_SIZE);
    dest.write(buffer, source.gcount());
}
It's a binary file, so you need to read and write the file as binary; otherwise it's treated as text, and assumed to have newlines that need translation.
In your call to fopen(), you need to add the "b" designator:
fopen_s(&inFileForGettingSize, fileName, "rb");
And in your fstream::open calls, you need to add std::fstream::binary:
inFile.open(fileName, std::fstream::binary);
// ...
outFile.open("out.jpg", std::fstream::binary);
I'm writing a simple console application in Visual Studio C++. I want to read a binary file with .cer extension to a byte array.
ifstream inFile;
size_t size = 0;
char* oData = 0;
inFile.open(path, ios::in|ios::binary);
if (inFile.is_open())
{
    size = inFile.tellg(); // get the length of the file
    oData = new char[size+1]; // for the '\0'
    inFile.read( oData, size );
    oData[size] = '\0' ; // set '\0'
    inFile.close();
    buff.CryptoContext = (byte*)oData;
    delete[] oData;
}
But when I run it, every character in oData comes out as the same char, and it's a different char each time. For example:
oData = "##################################################...".
Then I tried another way:
std::ifstream in(path, std::ios::in | std::ios::binary);
if (in)
{
    std::string contents;
    in.seekg(0, std::ios::end);
    contents.resize(in.tellg());
    in.seekg(0, std::ios::beg);
    in.read(&contents[0], contents.size());
    in.close();
}
Now the content has very strange values: some of the values are correct, while others are negative or otherwise strange (maybe it is related to signed char versus unsigned char?).
Does anyone have any idea?
Thanks ahead!
Looking at the first version:
What makes you think that tellg gets the size of the stream? It does not; it returns the current read position. You then go on to give a pointer to your data to buff.CryptoContext and promptly delete the data it points to! This is very dangerous practice; you need to copy the data, use a smart pointer, or otherwise ensure the data has the correct lifespan. If you're running in debug mode, the deletion is likely stomping your data with a marker that shows it has been deleted, which is why you are getting the stream of identical characters.
I suspect your guess about signed and unsigned may be correct for the second version, but I can't say without seeing your file and data.
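For instance, one minimal sketch of a fix along those lines, assuming buff only needs the bytes while the reading scope is still alive, is to own the bytes in a std::vector instead of a raw new[] (error handling omitted):
#include <fstream>
#include <vector>

std::ifstream in(path, std::ios::binary | std::ios::ate);
std::vector<char> data(static_cast<std::size_t>(in.tellg()));
in.seekg(0, std::ios::beg);
in.read(data.data(), data.size());
buff.CryptoContext = (byte*)data.data();   // valid for as long as 'data' stays in scope
// ... use buff here; no manual delete, the vector cleans up when it goes out of scope ...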
You are setting CryptoContext to point to your data via a byte pointer, and right after that you delete that data!
buff.CryptoContext = (byte*)oData;
delete[] oData;
After these lines, CryptoContext is pointing to released, invalid data. Just keep the oData array in memory longer and delete it after you are done with the decoding or whatever else you are doing with it.
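In other words, roughly (UseCryptoContext here is a hypothetical stand-in for whatever consumes buff):
buff.CryptoContext = (byte*)oData;
UseCryptoContext(buff);   // hypothetical: do the decoding / whatever needs the data first
delete[] oData;           // only now release the buffer, once nothing points at it anymore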
I am trying to do something like this with rapidxml in C++:
xml_document<> doc;
ifstream myfile("map.osm");
doc.parse<0>(myfile);
and receive the following error
Multiple markers at this line - Invalid arguments ' Candidates are: void parse(char *) ' - Symbol 'parse' could not be resolved
The file size can be up to a few megabytes.
Please help.
You have to load the file into a null-terminated char buffer first, as specified in the official documentation:
http://rapidxml.sourceforge.net/manual.html#classrapidxml_1_1xml__document_8338ce6042e7b04d5a42144fb446b69c_18338ce6042e7b04d5a42144fb446b69c
Just read the contents of your file into a char array and pass that array to the xml_document::parse() function.
If you are using ifstream, you can use something like the following to read the entire file contents into a buffer:
ifstream file ("test.xml");
if (file.is_open())
{
    file.seekg(0, ios::end);
    int size = file.tellg();
    file.seekg(0, ios::beg);
    char* buffer = new char [size + 1];   // +1 for the null terminator rapidxml needs
    file.read (buffer, size);
    buffer[size] = '\0';                  // null-terminate the buffer
    file.close();
    // your file should now be in the char buffer -
    // use this to parse your xml
    delete[] buffer;
}
Please note I have not compiled the above code; I'm writing it from memory, but this is the rough idea. Look at the documentation for ifstream for the exact details. This should help you get started anyway.
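With the buffer filled that way, the parse call from the question would presumably go where the comment is, something like:
doc.parse<0>(buffer);   // note: rapidxml parses in place and keeps pointers into 'buffer',
                        // so the buffer must stay alive for as long as 'doc' is used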
Every time I read a file with fstream I get one extra character at the end. How can I avoid this?
EDIT:
ifstream readfile(inputFile);
ofstream writefile(outputFile);
char c;
while(!readfile.eof()){
    readfile >> c;
    //c = shiftChar(c, RIGHT, shift);
    writefile << c;
}
readfile.close();
writefile.close();
This typically results from testing for the end of file incorrectly. You normally want to do something like:
while (infile>>variable) ...
or:
while (std::getline(infile, whatever)) ...
but NOT:
while (infile.good()) ...
or:
while (!infile.eof()) ...
The first two do a read, check whether it failed, and if so exit the loop. The latter two attempt a read, process what's now in the variable, and then exit the loop on the next iteration if the previous attempt failed. On the last iteration, what's in the variable after the failed read will normally be whatever was in it previously, so loops like either of the second two will typically appear to process the last item in the file twice.
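Applied to the loop in the question, that would look something like this (get/put are used so whitespace is not skipped):
char c;
while (readfile.get(c)) {            // the read itself is the loop condition
    //c = shiftChar(c, RIGHT, shift);
    writefile.put(c);
}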
To copy one file to another easily, consider using something like this:
// open the files:
ifstream readfile(inputFile);
ofstream writefile(outputFile);
// do the copy:
writefile << readfile.rdbuf();
This works well for small files, but can slow down substantially for larger files. In such a case, you typically want to use a loop, reading from one file and writing to the other. This has possibilities for subtle errors as well. One way that's been tested and generally works reasonably well looks like this:
std::ifstream input(in_filename, std::ios::binary);
std::ofstream output(out_filename, std::ios::binary);
const size_t buffer_size = 512 * 1024;
char buffer[buffer_size];
std::size_t read_size;
while (input.read(buffer, buffer_size), (read_size = input.gcount()) > 0)
    output.write(buffer, read_size);
Based on the code, it appears what you're trying to do is copy the contents of one file to another?
If so, I'd try something like this:
ifstream fin(inputFile, ios::binary);
fin.seekg(0, ios::end);
long fileSize = fin.tellg();
fin.seekg(0, ios::beg);
char *pBuff = new char[fileSize];
fin.read(pBuff, fileSize);
fin.close();
ofstream fout(outputFile, ios::binary);
fout.write(pBuff, fileSize);
fout.close();
delete [] pBuff;