My application returns 0x3 and crashes. I found that it may be that fstream cannot open the file. Where should the file be? I mean, in the same folder as the application .exe or elsewhere? I am using Code::Blocks.
EDIT
Code::Blocks is setting the working directory with the cb runner
GLuint sh;
int meret;
char * s;
std::ifstream fa1 ("vertex.vert",std::ios_base::binary);
fa1.seekg(0, fa1.end);
meret = fa1.tellg();
fa1.seekg(0, fa1.beg);
fa1.read(s,meret);
fa1.close();
//sh = glCreateShader(st);
const char * s1[1] = {s};
std::cout << s;
I think your problem is this line:
fa1.read(s,meret);
Since you've not allocated any space for the array s, attempting to write to it is undefined behavior.
s = new char[ meret ];
Just before the read might fix your issue, as long as the file isn't too big...
NB: you don't need to manually close the ifstream object - it will close when it goes out of scope.
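Putting the pieces together, a minimal sketch of the fixed read, with an open check that also covers the working-directory question above; the extra byte and terminator are my addition, needed because the buffer is later printed as a C string:
std::ifstream fa1("vertex.vert", std::ios_base::binary);
if (!fa1.is_open())
    std::cerr << "vertex.vert not found in the working directory\n";
fa1.seekg(0, fa1.end);
int meret = fa1.tellg();
fa1.seekg(0, fa1.beg);
char *s = new char[meret + 1]; // allocate before reading; +1 for the terminator
fa1.read(s, meret);
s[meret] = '\0';               // terminate so the buffer can be printed as a C string
std::cout << s;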
Related
I have written code to read a big file's content and write that content into a new file.
That code works fine with small and medium files, but with a big file, approximately 1.8 GB and above, it does not work and gives me an unknown error/exception at runtime.
Also, I have tried to debug; the debugger reports an std::bad_alloc exception at the allocation line.
The code:
char * myClass::getFileContent(const char * fileName) {
    std::ifstream file(fileName, std::ios::binary | std::ios::ate);
    if (!file.is_open()) {
        perror(strerror(errno));
        return "";
    }
    char * strBuffer = NULL;
    long long length = file.tellg();
    strBuffer = new char[length];
    file.seekg(0, std::ios::beg);
    file.read(strBuffer, length);
    return strBuffer;
}
// The implementation
char *fileName = "C:\\desktop-amd64.iso";
char *fileContent = myClass.getFileContent(fileName);
ofstream file("c:\\file.iso", ios::binary);
if (file.is_open()) {
    file.write(fileContent, myClass.getFileSize(fileName));
    file.close();
}
free(fileContent);
Note: I am using Visual Studio 2015 on Windows 7 x64.
Why does that problem happen with the big files?
The debugger shows that an std::bad_alloc exception occurs at the line
strBuffer = new char[length];
It seems that you are trying to allocate a block of memory whose size is the same as the file you are trying to read. As the file's size is around 1.8 GB, it is possible that the operating system just cannot allocate a contiguous chunk of memory that large.
I recommend reading this answer about handling huge files without storing all their contents in the memory.
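As a hedged illustration of that approach (the function name and the 1 MiB chunk size are my choices, not from the linked answer):
#include <fstream>
#include <vector>

// Copy a file in fixed-size chunks so memory use stays bounded
// regardless of the file size (sketch; error handling trimmed).
void copyFileChunked(const char *src, const char *dst) {
    std::ifstream in(src, std::ios::binary);
    std::ofstream out(dst, std::ios::binary);
    std::vector<char> chunk(1 << 20); // 1 MiB buffer
    while (in.read(chunk.data(), chunk.size()) || in.gcount() > 0)
        out.write(chunk.data(), in.gcount());
}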
As far as I can see, you are trying to release memory with free while you allocated it with new. You cannot mix them.
To deallocate memory allocated with new, use delete.
Now, if you need to free it with free, you must allocate the memory with malloc.
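A minimal sketch of the matching pairs:
char *a = new char[n];       // allocated with new[] ...
delete[] a;                  // ... must be released with delete[]

char *b = (char *)malloc(n); // allocated with malloc ...
free(b);                     // ... must be released with free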
I have C++ code that runs on Windows 7 but does not work on Windows 10. It works on macOS/Linux.
I am trying to parse a large hex file. My code loads it into an array and then applies the business logic to generate the CSV.
The size of the file is 2.38 GB.
Below is the code.
bool readFile (string filename, char ** buffer, unsigned int & sizeOfFile)
{
    ifstream inFile (filename.c_str (), ios::in | ios::binary);
    if (!inFile)
        return false;
    inFile.seekg (0, ios::end);
    size_t size = inFile.tellg ();
    inFile.seekg (0, ios::beg);
    *buffer = new char[size];
    cout << "\n Length of the ARRAY= " << size;
    inFile.read (*buffer, size);
    inFile.close ();
    sizeOfFile = size;
    cout << "File successfully read Press Any Key to Continue.. " << endl;
    //getch();
    return true;
}
It fails to load the file into the array when I execute it on Windows 10, under Visual Studio 2015 as well as under Dev-C++. It works perfectly on Windows 7.
Wouldn't it be easier to use file mapping?
https://msdn.microsoft.com/en-us/library/windows/desktop/aa366556(v=vs.85).aspx
EDIT:
You mention below that you are using the same code on macOS and Linux. I'm more familiar with those platforms, where I would use mmap(). Effectively this automates the loading of data from the file into memory.
Otherwise, as mentioned above, you need to provide more information on the nature of the problem (32- or 64-bit build, type of failure).
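A minimal sketch of the Windows file-mapping route the MSDN link describes (read-only mapping; error handling trimmed; the file name is illustrative):
#include <windows.h>

// Map the file into the address space instead of reading it into a new[] buffer.
HANDLE hFile = CreateFileA("input.bin", GENERIC_READ, FILE_SHARE_READ,
                           NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
LARGE_INTEGER size;
GetFileSizeEx(hFile, &size);
HANDLE hMap = CreateFileMappingA(hFile, NULL, PAGE_READONLY, 0, 0, NULL);
const char *data = static_cast<const char *>(
    MapViewOfFile(hMap, FILE_MAP_READ, 0, 0, 0)); // maps the whole file; a 2.38 GB view needs a 64-bit build
// ... parse data[0 .. size.QuadPart) ...
UnmapViewOfFile(data);
CloseHandle(hMap);
CloseHandle(hFile);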
size_t size = inFile.tellg ();
...
*buffer = new char[size];
It looks like you're not creating a large enough space for the whole file to be read in. If the file were 10 bytes in size, you'd need to allocate 11 bytes, as a null-terminated string needs an extra character at the end to store the '\0' that terminates it. The fact that it works on some systems but not others is pure luck: it means that on those systems there happened to be a NUL character at the right point in memory to properly terminate the string.
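A hedged sketch of that fix, reusing the question's variables: one extra byte plus explicit termination.
*buffer = new char[size + 1]; // one extra byte for the terminator
inFile.read (*buffer, size);
(*buffer)[size] = '\0';       // terminate so string functions can't run past the data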
I need to read all blocks of one large file (about 10 GB) sequentially. The file contains many floats with a few strings, like this (each item separated by '\n'):
6.292611
-1.078219E-266
-2.305673E+065
sod;eiwo
4.899747e-237
1.673940e+089
-4.515213
I read MAX_NUM_PER_FILE items each time, process them, and write them to another file, but I don't know when the ifstream has reached the end.
Here is my code:
ifstream file_input(path_input); // my file is a text file, but I tried both text and binary mode; both failed
if (file_input)
{
    file_input.seekg(0, file_input.end);
    unsigned long long length = file_input.tellg(); // get file size
    file_input.seekg(0, file_input.beg);
    char * buffer = new char [MAX_NUM_PER_FILE + MAX_NUM_PER_LINE];
    int i = 1, j;
    char c, tmp[3];
    while (file_input.tellg() < length)
    {
        file_input.read(buffer, MAX_NUM_PER_FILE);
        j = MAX_NUM_PER_FILE;
        while (file_input.get(c) && c != '\n')
            buffer[j++] = c; // get a complete item
        // process with buffer...
        itoa(i++, tmp, 10); // int to char
        string out_name = "out" + string(tmp) + ".txt";
        ofstream file_output(out_name);
        file_output.write(buffer, j);
        file_output.close();
    }
    file_input.close();
    delete[] buffer;
}
My code goes wrong: length is bigger than the real file size. I have tried file_input.good() and !file_input.eof(); they didn't work. getline(file_input, s) works, but it is much slower than read. I want to use read, but I don't know how to check whether the ifstream has reached end-of-file.
I do my work on Windows 7 with VS2010.
I have searched, but there is no answer about it; the link How to open a file using ifstream and keep reading it until the end can't answer my question.
Update: problem solved
Hi everyone, I have figured out that it was my fault. Both while(file_input.tellg()<length) and while(file_input.peek()!=EOF) work fine! while(file_input.peek()!=EOF) is recommended.
The extra items written past the end of the file were the leftover items in the buffer from the previous iteration.
Here is the correct code:
ifstream file_input(path_input);
if (file_input)
{
    //file_input.seekg(0, file_input.end);
    //unsigned long long length = file_input.tellg(); // get file size
    //file_input.seekg(0, file_input.beg);
    char * buffer = new char [MAX_NUM_PER_FILE + MAX_NUM_PER_LINE];
    int i = 1, j;
    char c, tmp[3];
    while (file_input.peek() != EOF)
    {
        memset(buffer, 0, sizeof(char) * (MAX_NUM_PER_FILE + MAX_NUM_PER_LINE)); // clear first!
        file_input.read(buffer, MAX_NUM_PER_FILE);
        j = MAX_NUM_PER_FILE;
        while (file_input.get(c) && c != '\n')
            buffer[j++] = c;
        itoa(i++, tmp, 10); // int to char
        string out_name = "out" + string(tmp) + ".txt";
        ofstream file_output(out_name);
        file_output.write(buffer, strlen(buffer)); // use the correct buffer size instead of j
        file_output.close();
    }
    file_input.close();
    delete[] buffer;
}
while (file_input.peek() != EOF)
{
    // code
}
Basically peek() will read the next char without extracting it.
So you can simply compare it to EOF.
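A hedged alternative sketch, not from the answer above: read() sets the stream's failbit when it hits end-of-file early, but gcount() still reports how many bytes were actually extracted, so a loop can drive itself off that count (path_input and MAX_NUM_PER_FILE reused from the question; std::vector assumed included):
std::ifstream in(path_input, std::ios::binary);
std::vector<char> buf(MAX_NUM_PER_FILE);
while (in.read(buf.data(), buf.size()) || in.gcount() > 0)
{
    std::streamsize got = in.gcount(); // bytes actually read this pass
    // process buf[0 .. got) ...
}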
I have a problem while reading from a file.
The code below ends with a runtime error after about 100 loops. After tracing, I found that mybuff doesn't get reinitialized by (mybuff = new char [1024];), because after debugging I still see the previous message at the end of it.
The problem also happens when I try to fill sendbuff, because of the same issue.
The error about "Access violation reading location" happens at this step (sprintf(sendbuff, mybuff)).
Any idea how to solve this issue?
char sendbuff[1024];
char * mybuff = NULL;
while (....) {
    mybuff = new char [1024];
    myfile.read(mybuff, bufsize);
    sprintf(sendbuff, mybuff);
    ibytessent = 0;
    tmpCount = strlen(sendbuff);
    ibufferlen = strlen(sendbuff);
    ibytessent = send(s, sendbuff, ibufferlen, 0);
    delete [] mybuff;
}
I think the way you call ifstream::read() is wrong. read() does not add the null character at the end for you, and you need to check the eofbit and failbit.
Quote from the manual,
The number of characters successfully read and stored by this function
can be accessed by calling member gcount.
I also think the run-time error is caused by read(), for the reason above. Also, I don't think it's necessary to re-new 1024 bytes of space at each iteration; why not reuse the buffer?
By the way, I tried to repro your problem. I am not sure if the code below is the same as yours, and I don't get any run-time error:
#include <cstdio>
#include <cstring>
#include <fstream>
using namespace std;

int bufsize = 1024;

int main() {
    char sendbuff[1024];
    char * mybuff = NULL;
    std::ifstream ifs;
    ifs.open("test.txt", std::ifstream::in);
    while (1) {
        mybuff = new char [1024];
        ifs.read(mybuff, bufsize);
        sprintf(sendbuff, mybuff);
        int ibytessent = 0;
        int tmpCount = strlen(sendbuff);
        int ibufferlen = strlen(sendbuff);
        //ibytessent = send(s, sendbuff, ibufferlen, 0);
        delete [] mybuff;
    }
    return 0;
}
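For reference, a hedged sketch of the fix the first answer implies, with myfile, s, and ibytessent assumed from the question: size the buffer one byte larger, terminate at gcount(), send the real byte count instead of relying on strlen, and stop when the stream runs dry.
char * mybuff = new char [1024 + 1];           // +1 for the terminator
while (myfile.read(mybuff, 1024) || myfile.gcount() > 0) {
    std::streamsize got = myfile.gcount();     // bytes actually read this pass
    mybuff[got] = '\0';                        // terminate before any string use
    ibytessent = send(s, mybuff, (int)got, 0); // send the real byte count, not strlen
}
delete [] mybuff;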
In a C/C++ program, is it correct for me to do this:
int i;
FILE **files = malloc(numFiles * sizeof(FILE *));
std::string file("foo"), ext(".bar");
char *num[10];
for (i = 0; i < numFiles; i++) {
    files[i] = fopen((file + itoa(i, num, 10) + ext).c_str(), "w");
}
This is basically what I am doing, but I am not getting anything written to the files. They're blank.
EDIT
I have fixed my problem. I thought I might be doing something wrong here, but it turned out to be elsewhere. Thanks for the responses, anyway.
Sure they are blank: you did not write anything, you simply opened the files in write mode!
You have to use the fwrite or fprintf function to write the data to a file and then close the file with fclose.
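A minimal sketch of that step (the payload string is made up for illustration):
for (i = 0; i < numFiles; i++) {
    fprintf(files[i], "contents of file %d\n", i); // actually write something
    fclose(files[i]);                              // flush and close
}
free(files);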
You have an array of pointers to char, but you need an array of char:
char *num[10]; --> char num[10]
I am wondering how it isn't crashing :)
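With that one-line fix applied, a hedged variant of the loop body; note that itoa is nonstandard, so snprintf is used here as a portable stand-in:
char num[12];                           // enough for any 32-bit int
for (i = 0; i < numFiles; i++) {
    snprintf(num, sizeof num, "%d", i); // portable alternative to itoa
    files[i] = fopen((file + num + ext).c_str(), "w");
}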