C++ code is running in Windows 7 but not in Windows 10

I have C++ code that runs on Windows 7 but does not work on Windows 10. It works on macOS/Linux.
I am trying to parse a large hex file. My code loads the file into an array and then applies the business logic to generate a CSV.
The size of the file is 2.38 GB.
Below is the code.
bool readFile(string filename, char** buffer, unsigned int& sizeOfFile)
{
    ifstream inFile(filename.c_str(), ios::in | ios::binary);
    if (!inFile)
        return false;

    inFile.seekg(0, ios::end);
    size_t size = inFile.tellg();
    inFile.seekg(0, ios::beg);

    *buffer = new char[size];
    cout << "\n Length of the ARRAY= " << size;
    inFile.read(*buffer, size);
    inFile.close();

    sizeOfFile = size;
    cout << "File successfully read Press Any Key to Continue.. " << endl;
    //getch();
    return true;
}
It fails to load the file into the array when I run it on Windows 10, both under Visual Studio 2015 and under Dev-C++. It works perfectly on Windows 7.

Wouldn't it be easier to use file mapping?
https://msdn.microsoft.com/en-us/library/windows/desktop/aa366556(v=vs.85).aspx
EDIT:
You mention below that you are using the same code on macOS and Linux. I'm more familiar with these platforms, where I would use mmap(). Effectively this automates the loading of data from the file to memory.
Otherwise, as mentioned above, you need to provide more information on the nature of the problem (32- or 64-bit build, type of failure).
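For illustration, here is a minimal sketch of memory-mapping a file on Windows (the rough equivalent of mmap()), assuming a 64-bit build so a 2.38 GB view fits in the address space; the helper name MapWholeFile is just illustrative, not part of any API:
#include <windows.h>

// Illustrative helper: map an entire file read-only and return a pointer to its bytes.
// Returns nullptr on failure. The caller must UnmapViewOfFile() the returned pointer.
const char* MapWholeFile(const char* path, unsigned long long& size)
{
    HANDLE file = CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, nullptr,
                              OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, nullptr);
    if (file == INVALID_HANDLE_VALUE)
        return nullptr;

    LARGE_INTEGER li;
    if (!GetFileSizeEx(file, &li)) { CloseHandle(file); return nullptr; }
    size = static_cast<unsigned long long>(li.QuadPart);

    HANDLE mapping = CreateFileMappingA(file, nullptr, PAGE_READONLY, 0, 0, nullptr);
    CloseHandle(file);                       // the mapping object keeps the file alive
    if (!mapping)
        return nullptr;

    const char* view = static_cast<const char*>(
        MapViewOfFile(mapping, FILE_MAP_READ, 0, 0, 0));  // 0 length = map the whole file
    CloseHandle(mapping);                    // the view keeps the mapping alive
    return view;                             // nullptr if MapViewOfFile failed
}
The operating system then pages the file in on demand instead of your code copying it all into one new[] buffer.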

size_t size = inFile.tellg ();
...
*buffer = new char[size];
It looks like you're not allocating enough space for the whole file plus a terminator. If the file were 10 bytes, you'd need to allocate 11 bytes: a null-terminated string needs one extra character at the end to hold the terminating '\0'. The fact that it works on some systems but not others is pure luck; on those systems there happened to be a NUL byte at the right point in memory to terminate the string.
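A minimal sketch of that fix, assuming you keep the read-everything-into-one-buffer approach (the name readFileTerminated is just illustrative, and a file this large still requires a 64-bit build):
#include <fstream>
#include <string>
using namespace std;

// Variant of readFile() above: over-allocate by one byte and terminate the buffer.
bool readFileTerminated(const string& filename, char** buffer, size_t& sizeOfFile)
{
    ifstream inFile(filename.c_str(), ios::in | ios::binary);
    if (!inFile)
        return false;

    inFile.seekg(0, ios::end);
    size_t size = static_cast<size_t>(inFile.tellg());
    inFile.seekg(0, ios::beg);

    *buffer = new char[size + 1];      // +1 for the terminating '\0'
    inFile.read(*buffer, size);
    (*buffer)[size] = '\0';            // terminate explicitly instead of relying on luck

    sizeOfFile = size;
    return true;
}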

Related

Reading big content from a file and write that content into a new file

I have written code that reads the contents of a large file and writes them to a new file.
The code works fine with small and medium-sized files, but with a big file, approximately 1.8 GB and above, it does not work and gives me an unknown error/exception at runtime.
I have also tried to debug it; the debugging result is noted below the code.
The code:
char* myClass::getFileContent(const char* fileName) {
    std::ifstream file(fileName, std::ios::binary | std::ios::ate);
    if (!file.is_open()) {
        perror(strerror(errno));
        return "";
    }
    char* strBuffer = NULL;
    long long length = file.tellg();
    strBuffer = new char[length];
    file.seekg(0, std::ios::beg);
    file.read(strBuffer, length);
    return strBuffer;
}
// The implementation
char* fileName = "C:\\desktop-amd64.iso";
char* fileContent = myClass.getFileContent(fileName);
ofstream file("c:\\file.iso", ios::binary);
if (file.is_open()) {
    file.write(fileContent, myClass.getFileSize(fileName));
    file.close();
}
delete fileContent;
Note: I am using Visual Studio 2015 on Windows 7 x64.
Why does that problem happen with the big files?
The debugger shows that an std::bad_alloc exception occurs at the line
strBuffer = new char[length];
It seems that you are trying to allocate a block of memory whose size is the same as the file you are trying to read. As the file's size is around 1.8 GB, it is possible that the operating system simply cannot allocate a contiguous chunk of memory that large.
I recommend reading this answer about handling huge files without storing all their contents in the memory.
As far as I can see, you are trying to release memory with free while you allocated it with new. You cannot mix them.
To deallocate memory obtained with new, use delete (or delete[] for arrays).
If you need to free it with free, you must allocate the memory with malloc.
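As a rough illustration of the chunked approach recommended above, copying a huge file in fixed-size pieces avoids allocating a 1.8 GB buffer at all (the function name copyFileInChunks and the 4 MB chunk size are just placeholders):
#include <fstream>
#include <vector>

// Illustrative sketch: copy a file of any size using a small reusable buffer.
bool copyFileInChunks(const char* src, const char* dst)
{
    std::ifstream in(src, std::ios::binary);
    std::ofstream out(dst, std::ios::binary);
    if (!in || !out)
        return false;

    std::vector<char> chunk(4 * 1024 * 1024);   // 4 MB at a time, never the whole file
    while (in) {
        in.read(chunk.data(), chunk.size());
        std::streamsize got = in.gcount();       // bytes actually read; the last chunk may be short
        if (got > 0)
            out.write(chunk.data(), got);
    }
    return static_cast<bool>(out);
}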

Incorrect size of file found using Visual Studio C++

I am porting C++ code from Linux to Windows, currently using Visual Studio 2013.
I need to read a binary file and am using this portion of C++ code:
// Open the stream
std::ifstream is("myfile.bin");

// Determine the file length
is.seekg(0, std::ios_base::end);
std::size_t size = is.tellg();
is.seekg(0, std::ios_base::begin);

// Create a vector to store the data
int* Data = new int[size / sizeof(int)];

// Load the data
is.read((char*) &Data[0], size);

// Close the file
is.close();
On Linux, the size of my binary file is correctly found to be 744 MB. However, on Windows, the size is incorrectly reported as more than 4 GB. How can I correct this issue?
Change std::ifstream is("myfile.bin"); to std::ifstream is("myfile.bin", std::ios::binary);
With your current default open mode, the stream is opened in text mode. On Linux that makes no difference, but on Windows text mode translates "\r\n" sequences to "\n" and treats a Ctrl-Z byte (0x1A) as end of file, so reads and reported positions will not match the on-disk byte count. Binary data should always be read with std::ios::binary.
I finally had the time to run this myself, though I had to fix a couple of things, such as ios_base::beg instead of begin (a different function). Also, as mentioned, the array allocation should be: int* Data = new int[size / sizeof(int) + 1]; // At most one extra int
I found your problem: you're not in the right directory. Check whether you successfully opened the file. If you didn't, you get a huge garbage value for size (probably -1, but converted to an unsigned type, so massive).
Try this to find your current directory on Windows (you will probably need to include Windows.h):
char dirBuf[256];
GetCurrentDirectory(256, dirBuf);
cout << "Current directory is: " << dirBuf << endl;
See if that's where your file is and move it accordingly. Or specify the ENTIRE path in the constructor to ifstream.
Also, it has nothing to do with ios::binary or not. Works fine both ways, or fails if the file isn't there.
std::size_t size=is.tellg();
The standard doesn't require tellg to return the byte offset from the beginning of the file. In general, this may not be a reliable way to get the size of the file, though it probably does what you expect on Linux and Windows.
The return type of the tellg method is std::basic_stream::pos_type, so you're starting with an implicit conversion to std::size_t which may or may not be appropriate. In a 32-bit build, for example, it's conceivable that the size of a file could be larger than a std::size_t can represent.
But the root problem is that you're not checking for errors. If you have exceptions disabled, then tellg reports an error by returning pos_type(-1). When you cast that to an unsigned type (which std::size_t is), then you get a very large value. I suspect you failed to open the file, and since you didn't detect that error, the seekg and the tellg failed. You then coerced pos_type(-1) to a std::size_t, which made it look like the file was huge.
You also have the problems others have noted: failing to open the file in binary mode and computing the wrong size for the buffer when the file isn't a multiple of the size of an int.
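For comparison, a minimal corrected sketch of the original ifstream version, with binary mode, error checks, and the rounded-up buffer size (still assuming a 64-bit build for very large files):
#include <fstream>
#include <iostream>

int main()
{
    std::ifstream is("myfile.bin", std::ios::binary);    // binary mode, as noted above
    if (!is) {
        std::cerr << "Could not open myfile.bin\n";
        return 1;
    }

    is.seekg(0, std::ios_base::end);
    std::streamoff size = is.tellg();                    // check before converting to size_t
    if (size < 0) {
        std::cerr << "tellg() failed\n";
        return 1;
    }
    is.seekg(0, std::ios_base::beg);

    // Round the element count up so a file that isn't a multiple of sizeof(int) still fits.
    int* Data = new int[(static_cast<std::size_t>(size) + sizeof(int) - 1) / sizeof(int)];
    is.read(reinterpret_cast<char*>(Data), size);
    if (is.gcount() != size)
        std::cerr << "Short read: got " << is.gcount() << " of " << size << " bytes\n";

    // ... use Data ...
    delete[] Data;
}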
The most reliable way to get the file size is to use the OS's API. On Windows, you can do this instead:
// Open the file. [TODO: Get the file name in wide characters and use
// CreateFileW instead. If the file name contains characters not
// representable by the user's ANSI codepage, then CreateFileA will fail.]
HANDLE hfile = CreateFileA("myfile.bin", GENERIC_READ, FILE_SHARE_READ,
                           nullptr, OPEN_EXISTING,
                           FILE_ATTRIBUTE_NORMAL | FILE_FLAG_SEQUENTIAL_SCAN,
                           nullptr);
if (hfile == INVALID_HANDLE_VALUE) { /* error handling here */ }

// Figure out how big it is.
LARGE_INTEGER li_size;
if (!GetFileSizeEx(hfile, &li_size)) { /* error handling here */ }

// TODO: On a 32-bit build, this won't be able to handle huge files,
// so check that here.
std::size_t size = li_size.QuadPart;

// Create a buffer to store the data, being careful to round up to a
// multiple of sizeof(int). [TODO: Use a std::vector instead.]
int* Data = new int[(size + sizeof(int) - 1) / sizeof(int)];

// Load the data.
const DWORD BytesToRead = static_cast<DWORD>(size);
DWORD BytesRead = 0;
if (!ReadFile(hfile, Data, BytesToRead, &BytesRead, nullptr) || BytesRead < BytesToRead) {
    /* error handling here */
}

// Close the file.
CloseHandle(hfile);
int* Data = new int[size/sizeof(int)];
Why are you doing this? You're dividing the size by 4. You don't want to do this. It should just be int* Data = new int[size]
Also, it should be std::ifstream f("filename.bin", std::ios::binary);

How to check whether ifstream is end of file in C++

I need to read all blocks of one large file (about 10 GB) sequentially. The file contains many floats and a few strings, like this (each item separated by '\n'):
6.292611
-1.078219E-266
-2.305673E+065
sod;eiwo
4.899747e-237
1.673940e+089
-4.515213
I read MAX_NUM_PER_FILE items each time, process them, and write them to another file, but I don't know when the ifstream has reached the end.
Here is my code:
ifstream file_input(path_input); //my file is a text file, but i tried both text and binary mode, both failed.
if (file_input)
{
    file_input.seekg(0, file_input.end);
    unsigned long long length = file_input.tellg(); //get file size
    file_input.seekg(0, file_input.beg);
    char* buffer = new char[MAX_NUM_PER_FILE + MAX_NUM_PER_LINE];
    int i = 1, j;
    char c, tmp[3];
    while (file_input.tellg() < length)
    {
        file_input.read(buffer, MAX_NUM_PER_FILE);
        j = MAX_NUM_PER_FILE;
        while (file_input.get(c) && c != '\n')
            buffer[j++] = c; //get a complete item
        //process with buffer...
        itoa(i++, tmp, 10); //int2char
        string out_name = "out" + string(tmp) + ".txt";
        ofstream file_output(out_name);
        file_output.write(buffer, j);
        file_output.close();
    }
    file_input.close();
    delete[] buffer;
}
My code goes wrong: length is bigger than the real file size. I have tried file_input.good() and !file_input.eof(), but they didn't work. getline(file_input, s) works, but it is much slower than read; I want to use read, but I don't know how to check whether the ifstream has reached end-of-file.
I am working on Windows 7 with VS2010.
I have searched, but there isn't any answer about this; How to open a file using ifstream and keep reading it until the end doesn't answer my question.
Update: problem solved
Hi everyone, I have figured out that it was my fault. Both while(file_input.tellg()<length) and while(file_input.peek()!=EOF) work fine; while(file_input.peek()!=EOF) is recommended.
The extra items written after end-of-file were the leftover items in the buffer from the previous iteration.
Here is the correct code:
ifstream file_input(path_input);
if (file_input)
{
    //file_input.seekg(0,file_input.end);
    //unsigned long long length = file_input.tellg(); //get file size
    //file_input.seekg(0,file_input.beg);
    char* buffer = new char[MAX_NUM_PER_FILE + MAX_NUM_PER_LINE];
    int i = 1, j;
    char c, tmp[3];
    while (file_input.peek() != EOF)
    {
        memset(buffer, 0, sizeof(char) * (MAX_NUM_PER_FILE + MAX_NUM_PER_LINE)); //clear first!
        file_input.read(buffer, MAX_NUM_PER_FILE);
        j = MAX_NUM_PER_FILE;
        while (file_input.get(c) && c != '\n')
            buffer[j++] = c;
        itoa(i++, tmp, 10); //int2char
        string out_name = "out" + string(tmp) + ".txt";
        ofstream file_output(out_name);
        file_output.write(buffer, strlen(buffer)); //use the correct buffer size instead of j
        file_output.close();
    }
    file_input.close();
    delete[] buffer;
}
while (file_input.peek() != EOF)
{
    // code
}
Basically peek() will read the next char without extracting it.
So you can simply compare it to EOF.
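Another common pattern, shown here only as an illustration, is to check gcount() after each read() so the final, partial chunk is handled without ever consulting eof() or the file length. This sketch reuses the questioner's path_input and MAX_NUM_PER_FILE names and omits the item-boundary handling from the original code:
#include <fstream>
#include <vector>

std::ifstream in(path_input, std::ios::binary);      // same stream as in the question
std::vector<char> buf(MAX_NUM_PER_FILE);             // reusable chunk buffer
while (in.read(buf.data(), buf.size()) || in.gcount() > 0) {
    std::streamsize got = in.gcount();               // bytes this read actually produced
    // process buf[0 .. got) here
}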

Problems while trying to Read Binary File C++

I'm writing a simple console application in Visual Studio C++. I want to read a binary file with a .cer extension into a byte array.
ifstream inFile;
size_t size = 0;
char* oData = 0;

inFile.open(path, ios::in | ios::binary);
if (inFile.is_open())
{
    size = inFile.tellg(); // get the length of the file
    oData = new char[size + 1]; // for the '\0'
    inFile.read(oData, size);
    oData[size] = '\0'; // set '\0'
    inFile.close();
    buff.CryptoContext = (byte*)oData;
    delete[] oData;
}
But when I run it, every character in oData is the same char, a different one each run. For example:
oData = "##################################################...".
Then I tried another way:
std::ifstream in(path, std::ios::in | std::ios::binary);
if (in)
{
    std::string contents;
    in.seekg(0, std::ios::end);
    contents.resize(in.tellg());
    in.seekg(0, std::ios::beg);
    in.read(&contents[0], contents.size());
    in.close();
}
Now part of the content is correct, but the rest consists of negative and otherwise strange values (maybe this is related to signed char vs unsigned char?).
Does anyone have any idea?
Thanks ahead!
Looking at the first version:
What makes you think that tellg gets the size of the stream? It does not; it returns the current read position. You then give a pointer to your data to buff.CryptoContext and promptly delete the data it points to! This is very dangerous practice; you need to copy the data, use a smart pointer, or otherwise ensure the data has the correct lifetime. If you're running in debug mode, the deletion is likely stomping your data with a marker to show it has been deleted, which is why you are getting the stream of identical characters.
I suspect your suggestion about signed and unsigned may be correct for the second version, but I can't say without seeing your file and data.
You are setting CryptoContext to point to your data by byte pointer, and after that you delete that data!
buff.CryptoContext = (byte*)oData;
delete[] oData;
After these lines, CryptoContext points to released, invalid data. Keep the oData array alive longer and delete it only after you are done with the decoding (or whatever you are doing with it).
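One simple way to keep the bytes alive for as long as they are needed is to let a std::vector own them, sketched below; buff, CryptoContext, byte, and path are the questioner's names, and the scope in which the vector must stay alive depends on how buff is used:
#include <fstream>
#include <vector>

std::vector<char> oData;
std::ifstream inFile(path, std::ios::binary | std::ios::ate);
if (inFile) {
    oData.resize(static_cast<std::size_t>(inFile.tellg()));  // ios::ate: already positioned at the end
    inFile.seekg(0, std::ios::beg);
    inFile.read(oData.data(), oData.size());
}
buff.CryptoContext = reinterpret_cast<byte*>(oData.data());
// ... use buff while oData is still in scope; the vector frees the memory automatically later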

c++ how to run an .exe file whose contents are stored in a char array?

I'm making a specific program and I just wondered if I could do this:
run a file whose contents are stored in a char array, on Windows.
This is the code that reads the executable and stores it in a char array:
filetoopen.open("C:\\blahlbah.exe", ios::binary);
filetoopen.seekg(0, ios::end);
length = filetoopen.tellg();
filetoopen.seekg(0, ios::beg);
buffer = new char[length];
filetoopen.read(buffer, length);
filetoopen.close();
I heard something about RunPE and did some searching, but I haven't succeeded in finding any C++ code to use.
This shows how to Load an EXE File and Run It from Memory: http://www.codeproject.com/KB/cs/LoadExeIntoAssembly.aspx
Additional reading here: CreateProcess from memory buffer, and here: How to run unmanaged executable from memory rather than disc
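If truly running from memory (the RunPE approach in those links) is more than you need, a much simpler sketch is to write the buffer back to a temporary file and launch it with CreateProcess. This reuses buffer and length from the question's snippet; the temp path is just illustrative and error handling is omitted:
#include <windows.h>
#include <fstream>

// Sketch: dump the in-memory image to a temp file and start it as a normal process.
std::ofstream tmp("C:\\Temp\\dumped.exe", std::ios::binary);   // illustrative path
tmp.write(buffer, length);
tmp.close();

STARTUPINFOA si = { sizeof(si) };
PROCESS_INFORMATION pi = {};
if (CreateProcessA("C:\\Temp\\dumped.exe", nullptr, nullptr, nullptr,
                   FALSE, 0, nullptr, nullptr, &si, &pi)) {
    CloseHandle(pi.hThread);    // we don't need the handles, only the running process
    CloseHandle(pi.hProcess);
}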