Invalid null pointer error when using ifstream - C++

So I have this strange "invalid null pointer" exception in this code (cut down to the core of the problem):
#include <fstream>
#include <iostream>
#include <iomanip>

int main() {
    std::ifstream input;
    std::ofstream output;
    unsigned __int16 currentWord = 0;
    output.open("log.txt");
    input.open("D:\\Work\\REC022M0007.asf", std::ios::binary);
    input.seekg(0, input.end);
    int length = input.tellg();
    input.seekg(0, input.beg);
    for (int i = 0; i < length;) {
        int numData = 0;
        input.read(reinterpret_cast<char *>(currentWord), sizeof(currentWord));
    }
    input.close();
    output.close();
    return 0;
}
The line that gets me the exception is
input.read(reinterpret_cast<char *>(currentWord), sizeof(currentWord));
and it does so on the very first pass, so it's not as though I were reading further into the file than it goes.
When I change the value of currentWord to 1, I get an exception like
trying to write to memory 0x0000001
or something along those lines (the number of zeroes may not be exact).
Searching the web told me it has something to do either with the file being empty (not the case), the file not being found (not the case either, since length gets a value), or some type-casting mumbo-jumbo. Any suggestions?

currentWord is ZERO (0), so when reinterpret_cast<char*>(currentWord) is executed it yields a null pointer; when you then try to write to the null address you get a memory write protection error.
Change it to reinterpret_cast<char*>(&currentWord) (note the &).

You are using the value of currentWord as though it were a pointer. It isn't. That's why you get a null pointer address when currentWord is zero, and a memory exception at address 1 when currentWord is 1. Use &currentWord as the first argument to input.read().
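For reference, a minimal corrected sketch of the read loop (keeping the question's MSVC-specific unsigned __int16; note the original loop also never advances i, so here it is advanced by the word size):
unsigned __int16 currentWord = 0;
for (int i = 0; i < length; i += sizeof(currentWord)) {
    // Pass the ADDRESS of currentWord, not its value
    input.read(reinterpret_cast<char *>(&currentWord), sizeof(currentWord));
    if (!input)
        break; // stop on a short or failed read
    // ... process currentWord here ...
}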

Related

C++ memory allocation (constructor)

I need to write a class that contains a char pointer to a line of text, and a constructor that takes the line from its argument list, dynamically allocates memory, and copies the text of the line into the class member.
The program I wrote doesn't work correctly; it produces an error.
What is wrong? Please help!
#include <iostream>
#include <cstring>
#include <cstdlib>
using namespace std;

class A
{
    char* text;
public:
    A(char *line);
};

A::A(char *line) {
    int length = strlen(line);
    text = new char[length];
    if (strlen(line) <= sizeof(text))
        strcpy_s(text, length, line);
    else
    {
        cout << text << endl;
        cout << "Too long string" << endl;
    }
}

int main()
{
    A ob("aaaaaa");
    system("PAUSE");
    return 0;
}
The sizeof operator returns the size of an object (not the length of a string). So in this case
sizeof(text)
returns the size of the object text. You declared text as
char* text;
so it returns the size of a char*. The exact size will depend on the system, but let's guess it's 4. So any string with a length greater than 4 will result in the output:
Too long string
If we look at the string: "aaaaaa" is longer than 4, so you get the expected output.
I expect you were trying to check that the previous line succeeded.
text = new char[length];
But in C++, new will either work or throw an exception (causing program termination in this program). So either that line works or the program will exit; there is no need to check the result of new (unlike C, where you should check the result of malloc()).
Also note: you should check the result of strcpy_s(), as it returns an error code on failure. Since you do not provide enough space in the destination, it will indicate an error (you don't provide space for the null terminator).
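Putting those points together, a minimal corrected sketch of the constructor (the + 1 for the null terminator and the return-value check are the key changes):
A::A(char *line) {
    size_t length = strlen(line) + 1;       // + 1 for the null terminator
    text = new char[length];                // throws std::bad_alloc on failure, no null check needed
    if (strcpy_s(text, length, line) != 0)  // strcpy_s returns 0 on success
        cout << "Copy failed" << endl;
}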

C++ reading large files part by part

I've been having a problem that I have not been able to solve yet. It is related to reading files; I've looked at threads, even on this website, and they do not seem to solve it. The problem is reading files that are larger than a computer's system memory. When I asked about this a while ago, I was referred to the following code.
string data("");
getline(cin, data);
std::ifstream is(data); //, std::ifstream::binary);
if (is)
{
    // get length of file:
    is.seekg(0, is.end);
    int length = is.tellg();
    is.seekg(0, is.beg);
    // allocate memory:
    char * buffer = new char[length];
    // read data as a block:
    is.read(buffer, length);
    is.close();
    // print content:
    std::cout.write(buffer, length);
    delete[] buffer;
}
system("pause");
This code works well, apart from the fact that it eats memory like a fat kid in a candy store.
So after a lot of ghetto and unrefined programming, I was able to figure out a way to sort of fix the problem. However, I more or less traded one problem for another in my quest.
#include <iostream>
#include <vector>
#include <string>
#include <fstream>
#include <stdio.h>
#include <stdlib.h>
#include <iomanip>
#include <windows.h>
#include <cstdlib>
#include <thread>

using namespace std;

/*======================================================*/
string *fileName = new string("tldr");
char data[36];
int filePos(0); // The pos of the file
int tmSize(0);  // The total size of the file
int split(32);
char buff;
int DNum(0);
/*======================================================*/

int getFileSize(std::string filename) // path to file
{
    FILE *p_file = NULL;
    p_file = fopen(filename.c_str(), "rb");
    fseek(p_file, 0, SEEK_END);
    int size = ftell(p_file);
    fclose(p_file);
    return size;
}

void fs()
{
    tmSize = getFileSize(*fileName);
    int AX(0);
    ifstream fileIn;
    fileIn.open(*fileName, ios::in | ios::binary);
    int n1, n2, n3;
    n1 = tmSize / 32;
    // Does the processing
    while (filePos != tmSize)
    {
        fileIn.seekg(filePos, ios_base::beg);
        buff = fileIn.get();
        // To take into account small files
        if (tmSize < 32)
        {
            int Count(0);
            char MT[40];
            if (Count != tmSize)
            {
                MT[Count] = buff;
                cout << MT[Count]; // << endl;
                Count++;
            }
        }
        // Anything larger than 32
        else
        {
            if (AX != split)
            {
                data[AX] = buff;
                AX++;
                if (AX == split)
                {
                    AX = 0;
                }
            }
        }
        filePos++;
    }
    int tz(0);
    filePos = filePos - 12;
    while (tz != 2)
    {
        fileIn.seekg(filePos, ios_base::beg);
        buff = fileIn.get();
        data[tz] = buff;
        tz++;
        filePos++;
    }
    fileIn.close();
}

int main()
{
    fs();
    cout << tmSize << endl;
    system("pause");
    return 0;
}
What I tried to do with this code is work around the memory issue. Rather than allocating memory for a file that simply does not fit in my system's memory, I tried to use the memory I have, which is about 8 GB, but I only wanted to use maybe a few kilobytes of it if at all possible.
To give you a layout of what I am talking about I am going to write a line of text.
"Hello my name is cake please give me cake"
Basically, what I did was read the text letter by letter. Then I put those letters into a box that could store 32 of them; from there I could apply something like XOR and then write them out to another file.
The idea sort of works, but it is horribly slow and leaves off parts of files.
So basically, how can I make something like this work without being slow or cutting off files? I would love to see how XOR works with very large files.
So if anyone has a better idea than what I have, then I would be very grateful for the help.
To read and process the file piece-by-piece, you can use the following snippet:
// Buffer size 1 megabyte (or any number you like)
size_t buffer_size = 1 << 20;
char *buffer = new char[buffer_size];
std::ifstream fin("input.dat");
while (fin)
{
    // Try to read next chunk of data
    fin.read(buffer, buffer_size);
    // Get the number of bytes actually read
    size_t count = fin.gcount();
    // If nothing has been read, break
    if (!count)
        break;
    // Do whatever you need with first count bytes in the buffer
    // ...
}
delete[] buffer;
The buffer size of 32 bytes that you are using is definitely too small. You make too many calls to library functions, and the library, in turn, makes calls to the OS (although probably not on every call), which are typically slow because they cause context switching. There is also no need for the tellg/seekg calls.
If you don't need the whole file content at once, reduce the working set first. Since XOR can be applied sequentially, a fixed-size working set such as 4 kilobytes is enough.
Now you have the option to call is.read() in a loop and process a small chunk of data on each iteration, or to use mmap() to map the file content to a memory pointer through which you can perform both read and write operations.
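As a rough sketch of how the chunked loop above combines with a sequential XOR (the file names and the one-byte key below are placeholders, not anything from the question):
#include <cstddef>
#include <fstream>

int main() {
    const std::size_t buffer_size = 1 << 12;            // 4 KB working set
    char buffer[buffer_size];
    std::ifstream fin("input.dat", std::ios::binary);   // placeholder input name
    std::ofstream fout("output.dat", std::ios::binary); // placeholder output name
    const char key = 0x5A;                              // placeholder XOR key
    while (fin) {
        fin.read(buffer, buffer_size);
        std::streamsize count = fin.gcount();           // bytes actually read (last chunk may be short)
        if (count == 0)
            break;
        for (std::streamsize i = 0; i < count; ++i)
            buffer[i] ^= key;                           // XOR is sequential, so chunking is safe
        fout.write(buffer, count);
    }
    return 0;
}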

C++ Pointer array and buffering

I have a problem while reading from a file.
The code below ends with a runtime error after about 100 iterations. After tracing, I found that mybuff doesn't get reinitialized by (mybuff = new char[1024];), because while debugging I can still see the previous message at the end of it.
The problem happens when I try to fill sendbuff, because of the same issue.
The error, "Access violation reading location", happens at this step: sprintf(sendbuff, mybuff).
Any idea how to solve this issue?
char sendbuff[1024];
char *mybuff = new char[];
while (....) {
    mybuff = new char[1024];
    myfile.read(mybuff, bufsize);
    sprintf(sendbuff, mybuff);
    ibytessent = 0;
    tmpCount = strlen(sendbuff);
    ibufferlen = strlen(sendbuff);
    ibytessent = send(s, sendbuff, ibufferlen, 0);
    delete[] mybuff;
}
I think the way you call ifstream::read() is wrong. read() does not add the null character at the end for you, and you need to check the eofbit and failbit.
Quote from the manual,
The number of characters successfully read and stored by this function
can be accessed by calling member gcount.
I also think the run-time error is caused by read() behaving as described above. Also, I don't think it's necessary to allocate a new 1024-byte buffer on each iteration; why not reuse the buffer?
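For example, a minimal sketch that reuses one buffer and uses gcount() to send exactly the bytes that were read (assuming the ifstream myfile and connected socket s from the question; this also sidesteps sprintf/strlen, which choke on embedded '\0' bytes):
char mybuff[1024];                       // one buffer, reused every iteration
while (myfile.read(mybuff, sizeof(mybuff)) || myfile.gcount() > 0) {
    std::streamsize n = myfile.gcount(); // bytes actually read this pass
    send(s, mybuff, static_cast<int>(n), 0); // send exactly n bytes, binary-safe
}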
By the way, I tried to reproduce your problem. I am not sure whether the code below is the same as yours, but I don't get any run-time error:
#include <cstdio>
#include <cstring>
#include <fstream>
using namespace std;

int bufsize = 1024;

int main() {
    char sendbuff[1024];
    char *mybuff = nullptr; // the question's "new char[]" is not valid C++
    std::ifstream ifs;
    ifs.open("test.txt", std::ifstream::in);
    while (1) {
        mybuff = new char[1024];
        ifs.read(mybuff, bufsize);
        sprintf(sendbuff, mybuff);
        int ibytessent = 0;
        int tmpCount = strlen(sendbuff);
        int ibufferlen = strlen(sendbuff);
        //ibytessent = send(s, sendbuff, ibufferlen, 0);
        delete[] mybuff;
    }
    return 0;
}

strcat error "Unhandled exception.."

My goal with my constructor is to:
open a file
read everything that exists between occurrences of a particular delimiter string ("%%%%%")
join the rows that were read into one variable (history)
add the final variable to a char double pointer (_stories)
close the file.
However, the program crashes when I'm using strcat, and I can't understand why; I have tried for many hours without result. :/
Here is the constructor code:
Texthandler::Texthandler(string fileName, int number)
    : _fileName(fileName), _number(number)
{
    char* history = new char[50];
    _stories = new char*[_number + 1]; // rows
    for (int j = 0; j < _number + 1; j++)
    {
        _stories[j] = new char[50];
    }
    _readBuf = new char[10000];
    ifstream file;
    int controlIndex = 0, whileIndex = 0, charCounter = 0;
    _storieIndex = 0;
    file.open("Historier.txt"); // filename
    while (file.getline(_readBuf, 10000))
    {
        // The "%%%%%" shouldn't be added to my variables
        if (strcmp(_readBuf, "%%%%%") == 0)
        {
            controlIndex++;
            if (controlIndex < 2)
            {
                continue;
            }
        }
        if (controlIndex == 1)
        {
            // Concatenate every line (_readBuf) to a complete history
            strcat(history, _readBuf);
            whileIndex++;
        }
        if (controlIndex == 2)
        {
            strcpy(_stories[_storieIndex], history);
            _storieIndex++;
            controlIndex = 1;
            whileIndex = 0;
            // Reset history variable
            history = new char[50];
        }
    }
    file.close();
}
I have also tried with stringstream, without results.
Edit: I forgot to post the error message:
"Unhandled exception at 0x6b6dd2e9 (msvcr100d.dll) in Step3_1.exe: 0xC00000005: Access violation writing location 0c20202d20."
Then a file named "strcat.asm" opens.
Best regards
Robert
You've had a buffer overflow somewhere on the stack, as evidenced by the fact that one of your pointers is 0c20202d20 (a few spaces and a - sign).
It's probably because:
char* history = new char[50];
is not big enough for what you're trying to put in there (or it's otherwise not set up correctly as a C string, terminated with a \0 character).
I'm not entirely certain why you think multiple buffers of up to 10K each can be concatenated into a 50-byte string :-)
strcat operates on null-terminated char arrays. In the line
strcat(history, _readBuf);
history is uninitialised, so it isn't guaranteed to contain a null terminator. Your program may read beyond the memory allocated while looking for a '\0' byte, and will try to copy _readBuf to wherever it finds one. Writing beyond the memory allocated for history invokes undefined behaviour, and a crash is very possible.
Even if you added a null terminator, the history buffer is much shorter than _readBuf. This makes memory overwrites very likely - you need to make history at least as big as _readBuf.
Alternatively, since this is C++, why don't you use std::string instead of C-style char arrays?
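A rough sketch of what the reading loop could look like with std::string (the %%%%% delimiter handling here is simplified compared with the question's controlIndex logic, and stories stands in for the _stories member):
#include <fstream>
#include <string>
#include <vector>

std::vector<std::string> readStories()
{
    std::vector<std::string> stories; // replaces the char** _stories member
    std::string history;              // grows as needed, no fixed 50-byte limit
    std::string line;
    std::ifstream file("Historier.txt");
    while (std::getline(file, line)) {
        if (line == "%%%%%") {        // delimiter: flush the accumulated story
            if (!history.empty()) {
                stories.push_back(history);
                history.clear();
            }
            continue;
        }
        history += line;              // concatenation can never overflow
    }
    if (!history.empty())
        stories.push_back(history);   // keep a final story with no trailing delimiter
    return stories;
}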

C++ ifstream::read() - corrupts ifstream get pointer?

Does anyone here know of a way a C++ ifstream's get pointer might get corrupted after a read() call? I'm seeing some truly bizarre behaviour that I'm at a loss to explain. For example (illustrative code, rather than what I'm actually running):
#include <fstream>
using namespace std;

int main()
{
    // datafile.bin is a 2MB binary file...
    std::ifstream ifs("datafile.bin", ios::binary);
    ifs.exceptions(ifstream::eofbit | ifstream::failbit | ifstream::badbit);
    int data[100];
    std::istream::pos_type current_pos = ifs.tellg();
    // current_pos = 0, as you'd expect...
    ifs.read(reinterpret_cast<char*>(data), 100 * sizeof(int));
    // throws no exception, so no error bits set...
    std::streamsize bytes_read = ifs.gcount();
    // gives 400, as you'd expect...
    current_pos = ifs.tellg();
    // current_pos = 0x1e1a or something similarly daft
    return 0;
}
My example shows an array read, but it has happened even when reading single values of built-in types; the get pointer before the read is correct, and the gcount() call reports the correct number of bytes read, but afterwards the get pointer is completely screwy. This doesn't happen with every read() call - sometimes I get through bunches of them before one stuffs up. What could possibly be monkeying with the get pointer? Am I doing something profoundly stupid?
Any and all help greatly appreciated...
Simon
pos_type isn't an integral type but a class; I'd not try to interpret its internal representation. It is implicitly convertible to an integral type, but if you are looking at it in the debugger, you'll see the internal representation.
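For example, to look at the position as a plain number, convert it explicitly rather than inspecting the object (a minimal sketch):
std::istream::pos_type current_pos = ifs.tellg();
std::streamoff offset = current_pos; // pos_type converts implicitly to an offset type
std::cout << offset << std::endl;    // prints the byte offset, e.g. 400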
I tried running your code in VS 2008 on a Vista machine but did not get any error. I have modified your code a bit to print to the console.
#include <iostream>
#include <fstream>
using namespace std;

int main()
{
    // datafile.bin is a 2MB binary file...
    std::ifstream ifs("H_Line.bmp", ios::binary);
    ifs.exceptions(ifstream::eofbit | ifstream::failbit | ifstream::badbit);
    int data[100];
    std::istream::pos_type current_pos = ifs.tellg();
    cout << current_pos << endl; // current_pos = 0, as mentioned
    ifs.read(reinterpret_cast<char*>(data), 100 * sizeof(int));
    // throws no exception, so no error bits set...
    std::streamsize bytes_read = ifs.gcount();
    cout << bytes_read << endl; // gives 400, as you have mentioned
    current_pos = ifs.tellg();
    cout << current_pos << endl; // FOR ME IT IS GIVING 400
    return 0;
}
I have tested this on a BMP image file larger than 20 MB.
Could you please elaborate on which machine/compiler you are using?
Thanks