fopen mlock access violation - c++

I'm repeatedly getting the same exception from the following method.
Unhandled exception at 0x77a8f4e1 in AST.exe: 0xC0000005: Access violation reading location 0x29919ed9.
bool package::write(char * buf, size_t size, const char *fname)
{
    //makeDir(fname);
    FILE * output = fopen(fname, "wb"); // break point
    if (output == NULL)                 // break point
    {
        perror("ERROR: ");
        return false;
    }
    fwrite(buf, size, sizeof(char), output);
    fclose(output);
    return true;
}
It has something to do with fopen; I know that because of breakpoints. But it only throws the exception the fourth time it's used, no matter what I do. I've changed fname repeatedly, but it always crashes on the fourth call. And for some reason, after I click "Break", I end up at line 345 of mlock.c.
I'd really appreciate any help in fixing this really annoying headache of an error.

Here is a simple, complete example written in C (trivially convertible to C++) showing the correct way to write your function.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int write( const char * buffer, size_t bytes, const char * name )
{
    FILE * output = fopen( name, "wb" );
    if ( !output )
    {
        perror( "ERROR" );
        return 0;
    }
    fwrite( buffer, sizeof( char ), bytes, output );
    fclose( output );
    return 1;
}

int main( )
{
    write( "Hello world!", strlen( "Hello world!" ), "output.txt" );
    return 0;
}
You were calling fwrite() with its arguments in the wrong order. The 2nd parameter is the size in bytes of each element to be written; the 3rd parameter is the number of elements to write from your buffer. For reference, see fwrite().
You should specify an extension in the file name you pass to fopen() when creating a file. Otherwise the file is created with no extension, and systems that infer the file type from the extension won't know what to do with it.
It isn't necessary to add a colon and a space after the string passed as an argument to perror(); the function does this for you. For reference, see perror().

Break the problem down.
Write a program that calls your function four times in a row. If that does not fail then you know the problem is somewhere else.
You can either add more of your original program to the test program, or begin subtracting pieces of your original program. Back it up first, or commit it to source code control.
When a program misbehaves as strangely as this one seems to be doing, the root cause is very often somewhere else entirely. Your program might be writing into memory that it doesn't own. For example, if part of the program holds a pointer to an object, that object gets deleted, the C runtime then reuses that memory (the mlock.c you land in is CRT-internal locking code), and the program later uses the stale pointer, it would overwrite the runtime's bookkeeping, causing a delayed crash exactly like this.
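That first isolation step might look like this sketch; write_file stands in for package::write (a hypothetical free function, with the fwrite argument order already corrected):

```cpp
#include <cstdio>
#include <cstring>

// Stand-in for package::write with the fwrite arguments in the
// correct order: element size first, then element count.
bool write_file(const char *buf, size_t size, const char *fname)
{
    FILE *output = std::fopen(fname, "wb");
    if (output == NULL) {
        std::perror("ERROR");
        return false;
    }
    std::fwrite(buf, sizeof(char), size, output);
    std::fclose(output);
    return true;
}
```

If calling this four times in a row succeeds, the crash in the real program almost certainly comes from heap corruption elsewhere, not from fopen itself.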

Related

a write function throwing the exception of read access violation in visual studio c++

I tried writing to a file and I am unable to, due to "Access violation reading location 0x0000000F". I was able to isolate the problem; here is the sample code:
void test_1() {
    std::fstream fio{ "xyz.dat", std::ios::in | std::ios::out | std::ios::binary | std::ios::app };
    if (!fio) {
        std::cerr << "sorry no file";
        return;
    }
    std::string s_test{ "xyz hii \n workf" };
    fio.write( ( char* )(s_test.length()), sizeof( size_t ) ); // the write causing the issue
}
( char* )(s_test.length()) is treating the length of the string as though it's a pointer and passing address 15 into write. Since there is no valid character at address 15, this triggers Undefined Behaviour, and the behaviour in this case is the program crashes.
This is always a problem when forced to use such a wide cast to force re-interpretation of a type. You can screw up horribly and all the compiler's defenses have been turned off. I don't have a good solution for this.
You need to pass in a legitimate address containing the length for write to operate on. To get this, you'll need to create a variable you can take the address of. &s_test.length(); isn't good enough here because you cannot take the address of a prvalue returned by a function.
auto len = s_test.length();
fio.write( reinterpret_cast<const char*>(&len), sizeof( len ) );
Note that writing a variable of automatically deduced type or a variable of a type that can change between compiler implementations is risky. It's hard to be sure how many bytes you're going to need to read at the other side.
uint32_t len = x.length();
Would be safer, and probably more compact, but at the risk of overflow with strings longer than about 4.3 billion characters. That's a risk I'm willing to put up with.
Another concern is endianness. It's not as common a problem as it used to be, but both the writer and the reader need to agree on the byte order of the integer. htonl and ntohl can help mitigate this threat by guaranteeing a byte order.
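A sketch of that fixed-width, fixed-endianness idea, using manual byte packing so it works even where htonl isn't available (pack_be32/unpack_be32 are hypothetical helper names):

```cpp
#include <cstdint>

// Pack a 32-bit length into big-endian (network) byte order, so the
// reader does not need to share the writer's native endianness.
void pack_be32(uint32_t value, unsigned char out[4])
{
    out[0] = static_cast<unsigned char>(value >> 24);
    out[1] = static_cast<unsigned char>(value >> 16);
    out[2] = static_cast<unsigned char>(value >> 8);
    out[3] = static_cast<unsigned char>(value);
}

// The reader reverses the packing, byte by byte.
uint32_t unpack_be32(const unsigned char in[4])
{
    return (uint32_t(in[0]) << 24) | (uint32_t(in[1]) << 16) |
           (uint32_t(in[2]) << 8)  |  uint32_t(in[3]);
}
```

The four packed bytes can then be written with fio.write(reinterpret_cast<const char*>(bytes), 4), and both sides agree on the layout by construction.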
Assuming that you are trying to write the length of the string to your output file, you can do it like this:
size_t len = s_test.length();
fio.write( ( const char * ) &len, sizeof( size_t ) );

Converting file to char* Issue

I am a beginner at C++ programming, and I encountered an issue.
I want to be able to convert the contents of a file to a char*, and I used file and string streams. However, it's not working.
This is my function that does the work:
char* fileToChar(std::string const& file){
    std::ifstream in(file);
    if (!in){
        std::cout << "Error: file does not exist\n";
        exit(EXIT_FAILURE);
    }
    std::stringstream buffer;
    buffer << in.rdbuf() << std::flush;
    in.close();
    return const_cast<char *>(buffer.str().c_str());
}
However, when I test the method out by outputting its contents into another file like this:
std::ofstream file("test.txt");
file << fileToChar("fileTest.txt");
I just get tons of strange characters like this:
îþîþîþîþîþîþîþîþîþîþîþîþîþîþîþîþîþ[...etc]
What exactly is going on here? Is there anything I missed?
And if there's a better way to do this, I would be glad to know!
return const_cast<char *>(buffer.str().c_str());
returns a pointer to the internal char buffer of a temporary copy of the internal buffer of the local stringstream. Long story short: As soon as you exit the function, this pointer points to garbage.
Btw, even if that was not a problem, the const_cast would be dangerous nonsense, you are not allowed to write through the pointer std::string::c_str returns. Legitimate uses of const_cast are extremely rare.
And for the better way: The best and easiest way would be returning std::string. Only if this is not allowed, a std::vector<char> (preferred) or new char[somelength] (frowned on) would be viable solutions.
char* fileToChar(std::string const& file){
This line already shows that something is going into the wrong direction. You return a pointer to some string, and it's completely unclear to the user of the function who is responsible for releasing the allocated memory, if it has to be released at all, if nullptr can be returned, and so on.
If you want a string, then by all means use std::string!
std::string fileToChar(std::string const& file){
return const_cast<char *>(buffer.str().c_str());
Another line that should make all alarms go off. const_cast is always a workaround to some underlying problem (or some problem with external code).
There is usually a good reason why something is const. By forcing the compiler to turn off the security check and allowing it to attempt modifications of unmodifiable data, you typically turn compilation errors into hard-to-diagnose run-time errors.
Even if this function worked correctly, any attempt to modify the result would be undefined behaviour:
char* file_contents = fileToChar("foo.txt");
file_contents[0] = 'x'; // undefined behaviour
But it does not work correctly anyway. buffer.str() returns a temporary std::string object. c_str() returns a pointer to that temporary object's internally managed memory. The object's lifetime ends when the full expression return const_cast<char *>(buffer.str().c_str()) has been evaluated. Using the resulting pointer is therefore undefined behaviour, too.
The problems sound complicated, but the fix is easy. Make the function return std::string and turn the last statement into return buffer.str();.
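Put together, the fixed function might look like this sketch (renamed fileToString here, since it no longer returns a char*):

```cpp
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

// Returns the file's content by value; the std::string owns its
// memory, so there is no dangling pointer and no const_cast.
std::string fileToString(std::string const& file)
{
    std::ifstream in(file);
    if (!in) {
        std::cerr << "Error: cannot open " << file << '\n';
        return "";
    }
    std::stringstream buffer;
    buffer << in.rdbuf();
    return buffer.str();
}
```

The caller's side is unchanged except for the name: std::ofstream out("test.txt"); out << fileToString("fileTest.txt");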
If your question is how to read the content of a file into a buffer, consider the following suggestion. But take care that the buffer is big enough for the file content: a file-size check and preallocation of the memory are advised before calling fileToChar().
bool fileToChar(std::string const& file, char* buffer, unsigned int &buffer_size)
{
    FILE *f = fopen( file.c_str(), "rb" );
    if( f == nullptr )
    {
        return false;
    }
    fseek( f, 0, SEEK_END );
    const long size = ftell( f );
    rewind( f );
    const size_t bytes_read = fread( buffer, 1, size, f );
    fclose( f );
    buffer_size = static_cast<unsigned int>( bytes_read );
    return bytes_read == static_cast<size_t>( size );
}
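Along the same lines, here is a sketch that keeps the size check and the allocation together by letting a std::vector<char> own the buffer (readFile is a hypothetical name, not from the question):

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Reads the whole file into out; the vector is sized from ftell(),
// so the caller never has to guess a buffer size.
bool readFile(std::string const& file, std::vector<char>& out)
{
    FILE *f = std::fopen(file.c_str(), "rb");
    if (f == nullptr)
        return false;
    std::fseek(f, 0, SEEK_END);
    long size = std::ftell(f);
    std::rewind(f);
    out.resize(size > 0 ? static_cast<size_t>(size) : 0);
    size_t got = std::fread(out.data(), 1, out.size(), f);
    std::fclose(f);
    return got == out.size();
}
```

The vector frees its memory automatically, which sidesteps the ownership question that a raw char* raises.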

File loader problems

I have a text file which contains lists of authors and books, and I need to load it into my program. Here is the code of the method which should load it:
void Loader::loadFile(const char* path)
{
    FILE* file = fopen(path, "r");
    char* bufferString;
    while (feof(file) != 1) {
        fgets(bufferString, 1000, file);
        printf("%s", bufferString);
    }
}
I use it in my main file:
int main(int argc, char** argv) {
    Loader* loader = new Loader();
    loader->loadFile("/home/terayon/prog/parser/data.txt");
    return 0;
}
But the data.txt file is not completely printed.
What should I do to get the complete data?
fgets reads into the memory pointed to by the pointer passed as the first parameter, bufferString in your case.
But your bufferString is an uninitialised pointer (leading to undefined behaviour):
char * bufferString;
// not initialised,
// and definitely not pointing to valid memory
So you need to provide some memory to read into, e.g. by making it an array:
char bufferString[1000];
// that's a bit large to store on the stack
As a side note: Your code is not idiomatic C++. You're using the IO functions provided by the C standard library, which is possible, but using the facilities of the C++ STL would be more appropriate.
You have undefined behaviour: you declare the pointer bufferString but never actually make it point anywhere. Since it's not initialised, its value is indeterminate and will seem random, meaning you will write to unallocated memory in the fgets call.
It's easy to solve though, declare it as an array, and use the array size when calling fgets:
char bufferString[500];
...
fgets(bufferString, sizeof(bufferString), file);
Besides the problem detailed above, you should not do while (!feof(file)); it will not work as you expect. The reason is that the EOF flag is not set until you try to read beyond the end of the file, causing the loop to iterate once too many times.
You should instead do e.g. while (fgets(...) != NULL)
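Applying both fixes (a real array, and fgets as the loop condition) gives something like this sketch; returning the line count instead of void is an assumption added here to make the function easy to test:

```cpp
#include <cstdio>

// Fixed loader: real buffer storage, fgets drives the loop, and
// fopen's result is checked. Returns the number of lines printed,
// or -1 if the file could not be opened.
int loadFile(const char* path)
{
    FILE* file = std::fopen(path, "r");
    if (file == NULL) {
        std::perror("fopen");
        return -1;
    }
    char bufferString[1000];   // actual memory for fgets to fill
    int lines = 0;
    while (std::fgets(bufferString, sizeof(bufferString), file) != NULL) {
        std::printf("%s", bufferString);
        ++lines;
    }
    std::fclose(file);
    return lines;
}
```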
The code you have is not very C++-ish; it uses the old C functions for file handling. Instead I suggest you read more about the C++ standard I/O library and std::string, which is an auto-expanding string class that won't have the limits of C arrays and won't suffer from potential buffer overflows in the same way.
The code could then look something like this
std::ifstream input_file(path);
std::string input_buffer;
while (std::getline(input_file, input_buffer))
    std::cout << input_buffer << '\n';

Memory Managing C

I wrote a function, using the C header stdio.h, that returns the content of a file (text or HTML). Can anyone please go through it and check whether I have done the memory management efficiently? I'd be pleased to hear suggestions for improving my code.
char *getFileContent(const char *filePath)
{
    // Prepare read file
    FILE *pReadFile;
    long bufferReadSize;
    char *bufferReadFileHtml;
    size_t readFileSize;
    char readFilePath[50];
    sprintf_s(readFilePath, "%s", filePath);
    pReadFile = fopen(readFilePath, "rb");
    if (pReadFile != NULL)
    {
        // Get file size.
        fseek(pReadFile, 0, SEEK_END);
        bufferReadSize = ftell(pReadFile);
        rewind(pReadFile);
        // Allocate RAM to contain the whole file:
        bufferReadFileHtml = (char*) malloc(sizeof(char) * bufferReadSize);
        if (bufferReadFileHtml != NULL)
        {
            // Copy the file into the buffer:
            readFileSize = fread(bufferReadFileHtml, sizeof(char), bufferReadSize, pReadFile);
            if (readFileSize == bufferReadSize)
            {
                return bufferReadFileHtml;
            } else {
                char errorBuffer[50];
                sprintf_s(errorBuffer, "Error! Buffer overflow for file: %s", readFilePath);
            }
        } else {
            char errorBuffer[50];
            sprintf_s(errorBuffer, "Error! Insufficient RAM for file: %s", readFilePath);
        }
        fclose(pReadFile);
        free(bufferReadFileHtml);
    } else {
        char errorBuffer[50];
        sprintf_s(errorBuffer, "Error! Unable to open file: %s", readFilePath);
    }
}
This looks like a C program, not a C++ program. While it will compile using most C++ compilers, it doesn't take advantage of any C++ features (e.g. new/new[], delete/delete[], explicit casting, stream operators, strings, nullptr etc.)
Your code almost looks like a safe C function, although sprintf_s comes from the optional Annex K of C11, which in practice only Microsoft's toolchain ships, so the code probably won't compile with GCC, Clang, Intel, etc.
Your function should also return a value on every code path. Turn compiler warnings on to catch these kinds of things; they make debugging a lot easier :)
There's not much to be said without knowing how you will be using the buffer you've created. Here are some possible considerations:
1) When you are reading the file into a buffer, your processor is doing nothing else for this program. It might be better to read and begin analyzing the already read portion in parallel.
2) If you need really fast-efficient-low memory file IO, consider converting your program to a state machine and forget about the buffer altogether.
3) If you don't really have a very demanding application, you are killing yourself by writing in C. C#, python, etc-- almost any other language has better string manipulation libraries.
Btw, you should use snprintf for portability and safety, as others have pointed out.

c++, reading files, segmentation fault

I was working on this read function. The main I used has no problem with file I/O: it opens the files fine, closes them, and the files themselves are okay. However, I get a segmentation fault near the end of the reading. I have tried printing values for testing, and the error occurs while reading the last line: it finishes reading the last line into string a, then into x, and then in.good() becomes false. I have tried calling in.clear(), and also setting a = "" when in.good() becomes false. Nothing is working.
read(istream& in){
    string a;
    int x;
    in>>a;
    while( in.good() ){
        in>>x;
        char *ch;
        strcpy( ch, a.c_str() );
        Word cwd(ch);
        anObject.add(cwd,x);
    }
}
You see a segfault because you're not allocating space for ch, and then you're attempting to copy a string over it. ch is an uninitialized memory address that doesn't belong to you.
You'll need to allocate space for the string:
char *ch = new char[MAX_SIZE + 1];
But why do you need a char * here at all? Note that you can always pass around a and use a.c_str() if you must have a C string. I'm not sure what Word is, or whether it needs its own copy of the string, but can you use Word cwd(a.c_str())?
It seems you don't allocate memory for char *ch. The moment you define this variable, it holds a random value from the stack. Writing through a random pointer corrupts memory and can cause a segfault when that memory is later cleaned up (either manually or automatically at function return).