I have a problem while reading from a file.
The code below ends with a runtime error after roughly 100 iterations. After tracing, I found that mybuff does not seem to be reinitialized by (mybuff = new char [1024];), because while debugging I still see the previous message at the end of it. The same problem happens when I try to fill sendbuff.
The error, "Access violation reading location", happens at this step: sprintf(sendbuff, mybuff).
Any idea how to solve this issue?
char sendbuff[1024];
char *mybuff = new char[];

while (....) {
    mybuff = new char[1024];
    myfile.read(mybuff, bufsize);
    sprintf(sendbuff, mybuff);
    ibytessent = 0;
    tmpCount = strlen(sendbuff);
    ibufferlen = strlen(sendbuff);
    ibytessent = send(s, sendbuff, ibufferlen, 0);
    delete[] mybuff;
}
I think the way you call ifstream::read() is wrong. read() does not add the null character at the end for you, and you need to check the eofbit and failbit.
Quote from the manual:
The number of characters successfully read and stored by this function can be accessed by calling member gcount.
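For illustration, a minimal sketch of a read loop that uses gcount() to null-terminate the buffer before treating it as a C string (the file name and buffer size are placeholders). Note also that file data should never be passed to sprintf as the format string; a stray % in the data invokes undefined behavior:

#include <cstddef>
#include <fstream>
#include <iostream>

int main() {
    const std::size_t bufsize = 1024;
    char mybuff[bufsize + 1];                 // +1 leaves room for the terminator
    std::ifstream myfile("test.txt", std::ios::binary);

    while (myfile.read(mybuff, bufsize) || myfile.gcount() > 0) {
        std::streamsize n = myfile.gcount(); // bytes actually read
        mybuff[n] = '\0';                    // terminate before strlen & friends
        std::cout.write(mybuff, n);          // use the data; not sprintf(dst, data)
    }
    return 0;
}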
I also think the run-time error is caused by the read() issue described above. Also, I don't think it's necessary to allocate a new 1024-byte buffer on each iteration; why not reuse it?
By the way, I tried to reproduce your problem. I am not sure whether the code below is the same as yours, but I don't get any run-time error:
#include <cstdio>
#include <cstring>
#include <fstream>
using namespace std;

int bufsize = 1024;

int main() {
    char sendbuff[1024];
    char *mybuff = nullptr; // reassigned inside the loop
    std::ifstream ifs;
    ifs.open("test.txt", std::ifstream::in);
    while (1) {
        mybuff = new char[1024];
        ifs.read(mybuff, bufsize);
        sprintf(sendbuff, mybuff);
        int ibytessent = 0;
        int tmpCount = strlen(sendbuff);
        int ibufferlen = strlen(sendbuff);
        //ibytessent = send(s, sendbuff, ibufferlen, 0);
        delete[] mybuff;
    }
    return 0;
}
So I have this strange "invalid null pointer" exception in this code (cut down to the core of the problem)
#include <fstream>
#include <iostream>
#include <iomanip>

int main() {
    std::ifstream input;
    std::ofstream output;
    unsigned __int16 currentWord = 0;
    output.open("log.txt");
    input.open("D:\\Work\\REC022M0007.asf", std::ios::binary);
    input.seekg(0, input.end);
    int length = input.tellg();
    input.seekg(0, input.beg);
    for (int i = 0; i < length;) {
        int numData = 0;
        input.read(reinterpret_cast<char *>(currentWord), sizeof(currentWord));
    }
    input.close();
    output.close();
    return 0;
}
The line that gets me exception is
input.read(reinterpret_cast<char *>(currentWord), sizeof(currentWord));
and it does so on the very first pass, so it's not as though I were trying to read further into the file than it goes.
When I change the value of currentWord to 1, I get an exception like
trying to write to memory 0x0000001
or something along those lines (the number of zeroes may not be correct).
A web search told me it has something to do either with the file being empty (which is not the case), the file not being found (not the case either, since length gets a value), or something in the type-casting mumbo-jumbo. Any suggestions?
'currentWord' is ZERO (0), and when reinterpret_cast<char*>(currentWord) is executed it becomes a nullptr, so when you try to write to the null address you get a memory write-protection error.
Change it to reinterpret_cast<char*>(&currentWord) (note the &).
You are using the value of currentWord as though it were a pointer. It isn't. That's why you get a null pointer address when currentWord is zero, and a memory exception at address 1 when currentWord is 1. Use &currentWord as the first argument to input.read().
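For completeness, a minimal sketch of the fixed loop: it passes &currentWord and also advances i, which the original loop never does (unsigned short stands in for the MSVC-specific unsigned __int16):

#include <fstream>

int main() {
    std::ifstream input("D:\\Work\\REC022M0007.asf", std::ios::binary);
    input.seekg(0, input.end);
    int length = static_cast<int>(input.tellg());
    input.seekg(0, input.beg);

    unsigned short currentWord = 0;
    for (int i = 0; i < length; i += sizeof(currentWord)) {
        // pass the ADDRESS of currentWord, not its value
        input.read(reinterpret_cast<char *>(&currentWord), sizeof(currentWord));
        if (!input)
            break; // stop on a short or failed read
    }
    return 0;
}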
I've been having a problem that I have not been able to solve yet. The problem is related to reading files; I've looked at threads, even on this website, and they do not seem to solve it. The problem is reading files that are larger than a computer's system memory. When I asked this question a while ago, I was referred to the following code.
string data("");
getline(cin, data);
std::ifstream is(data); //, std::ifstream::binary);
if (is)
{
    // get length of file:
    is.seekg(0, is.end);
    int length = is.tellg();
    is.seekg(0, is.beg);

    // allocate memory:
    char *buffer = new char[length];

    // read data as a block:
    is.read(buffer, length);
    is.close();

    // print content:
    std::cout.write(buffer, length);
    delete[] buffer;
}
system("pause");
This code works well, apart from the fact that it eats memory like a fat kid in a candy store.
So after a lot of ghetto and unrefined programming, I was able to figure out a way to sort of fix the problem. However, I more or less traded one problem for another in my quest.
#include <iostream>
#include <vector>
#include <string>
#include <fstream>
#include <stdio.h>
#include <stdlib.h>
#include <iomanip>
#include <windows.h>
#include <cstdlib>
#include <thread>

using namespace std;

/*======================================================*/
string *fileName = new string("tldr");
char data[36];
int filePos(0); // The pos of the file
int tmSize(0);  // The total size of the file
int split(32);
char buff;
int DNum(0);
/*======================================================*/

int getFileSize(std::string filename) // path to file
{
    FILE *p_file = NULL;
    p_file = fopen(filename.c_str(), "rb");
    fseek(p_file, 0, SEEK_END);
    int size = ftell(p_file);
    fclose(p_file);
    return size;
}

void fs()
{
    tmSize = getFileSize(*fileName);
    int AX(0);
    ifstream fileIn;
    fileIn.open(*fileName, ios::in | ios::binary);
    int n1, n2, n3;
    n1 = tmSize / 32;

    // Does the processing
    while (filePos != tmSize)
    {
        fileIn.seekg(filePos, ios_base::beg);
        buff = fileIn.get();

        // To take into account small files
        if (tmSize < 32)
        {
            int Count(0);
            char MT[40];
            if (Count != tmSize)
            {
                MT[Count] = buff;
                cout << MT[Count]; // << endl;
                Count++;
            }
        }
        // Anything larger than 32
        else
        {
            if (AX != split)
            {
                data[AX] = buff;
                AX++;
                if (AX == split)
                {
                    AX = 0;
                }
            }
        }
        filePos++;
    }

    int tz(0);
    filePos = filePos - 12;
    while (tz != 2)
    {
        fileIn.seekg(filePos, ios_base::beg);
        buff = fileIn.get();
        data[tz] = buff;
        tz++;
        filePos++;
    }
    fileIn.close();
}

void main()
{
    fs();
    cout << tmSize << endl;
    system("pause");
}
What I tried to do with this code is work around the memory issue. Rather than allocating enough memory for a large file, memory that simply does not exist on my system, I tried to use the memory I had, which is about 8 GB, while only using maybe a few kilobytes of it if at all possible.
To give you a layout of what I am talking about I am going to write a line of text.
"Hello my name is cake please give me cake"
Basically, what I did was read that piece of text letter by letter. Then I put those letters into a box that could store 32 of them; from there I could use something like XOR and then write them out to another file.
The idea works, in a way, but it is horribly slow and leaves off parts of files.
So basically: how can I make something like this work without being slow or cutting off files? I would love to see how XOR works with very large files.
If anyone has a better idea than what I have, I would be very grateful for the help.
To read and process the file piece by piece, you can use the following snippet:

// Buffer size 1 megabyte (or any number you like)
size_t buffer_size = 1 << 20;
char *buffer = new char[buffer_size];

std::ifstream fin("input.dat");
while (fin)
{
    // Try to read next chunk of data
    fin.read(buffer, buffer_size);

    // Get the number of bytes actually read
    size_t count = fin.gcount();

    // If nothing has been read, break
    if (!count)
        break;

    // Do whatever you need with first count bytes in the buffer
    // ...
}
delete[] buffer;
The buffer size of 32 bytes that you are using is definitely too small. It causes too many calls into library functions (and the library, in turn, makes calls to the OS, although probably not on every call), which are typically slow because they cause context switches. There is also no need for the tell/seek calls.
If you don't need all of the file content at once, reduce the working set first, to something like a set of about 32 words; and since XOR can be applied sequentially, you can simplify further to a constant-size working set, such as 4 kilobytes.
Now you have the option of calling is.read() in a loop and processing a small block of data on each iteration, or of using mmap() (or your platform's memory-mapped file API) to map the file content into memory, where you can perform both read and write operations.
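To illustrate, here is a minimal sketch of sequential XOR over a file in fixed-size chunks; the file names, chunk size, and single-byte key are placeholder assumptions:

#include <cstddef>
#include <fstream>

int main() {
    const std::size_t chunk_size = 4096;   // constant-size working set
    char buffer[chunk_size];
    const char key = 0x5A;                 // placeholder single-byte XOR key

    std::ifstream in("input.dat", std::ios::binary);
    std::ofstream out("output.dat", std::ios::binary);

    while (in) {
        in.read(buffer, chunk_size);
        std::streamsize n = in.gcount();   // bytes actually read
        if (n == 0)
            break;
        for (std::streamsize i = 0; i < n; ++i)
            buffer[i] ^= key;              // XOR applied chunk by chunk
        out.write(buffer, n);
    }
    return 0;
}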
My goal with my constructor is to:
open a file
read everything that exists between a particular delimiter string ("%%%%%")
put each row that was read together into one variable (history)
add the finished variable to a double pointer of type char (_stories)
close the file.
However, the program crashes when I use strcat, and I can't understand why. I have tried for many hours without result. :/
Here is the constructor code:
Texthandler::Texthandler(string fileName, int number)
    : _fileName(fileName), _number(number)
{
    char *history = new char[50];
    _stories = new char *[_number + 1]; // rows
    for (int j = 0; j < _number + 1; j++)
    {
        _stories[j] = new char[50];
    }
    _readBuf = new char[10000];
    ifstream file;
    int controlIndex = 0, whileIndex = 0, charCounter = 0;
    _storieIndex = 0;
    file.open("Historier.txt"); // filename
    while (file.getline(_readBuf, 10000))
    {
        // The "%%%%%" shouldn't be added to my variables
        if (strcmp(_readBuf, "%%%%%") == 0)
        {
            controlIndex++;
            if (controlIndex < 2)
            {
                continue;
            }
        }
        if (controlIndex == 1)
        {
            // Concatenate every line (_readBuf) to a complete history
            strcat(history, _readBuf);
            whileIndex++;
        }
        if (controlIndex == 2)
        {
            strcpy(_stories[_storieIndex], history);
            _storieIndex++;
            controlIndex = 1;
            whileIndex = 0;
            // Reset history variable
            history = new char[50];
        }
    }
    file.close();
}
I have also tried with stringstream, without results.
Edit: I forgot to post the error message:
"Unhandled exception at 0x6b6dd2e9 (msvcr100d.dll) in Step3_1.exe: 0xC00000005: Access violation writing location 0c20202d20."
Then a file named "strcat.asm" opens.
Best regards
Robert
You've had a buffer overflow somewhere on the stack, as evidenced by the fact that one of your pointers is 0c20202d20 (a few spaces and a - sign).
It's probably because:
char* history = new char[50];
is not big enough for what you're trying to put in there (or it's otherwise not set up correctly as a C string, terminated with a \0 character).
I'm not entirely certain why you think multiple buffers of up to 10K each can be concatenated into a 50-byte string :-)
strcat operates on null-terminated char arrays. In the line
strcat(history, _readBuf);
history is uninitialised, so it isn't guaranteed to have a null terminator. Your program may read beyond the memory allocated while looking for a '\0' byte, and will try to copy _readBuf at that point. Writing beyond the memory allocated for history invokes undefined behaviour, and a crash is very possible.
Even if you added a null terminator, the history buffer is much shorter than _readBuf. This makes memory overwrites very likely; you need to make history at least as big as _readBuf.
Alternatively, since this is C++, why don't you use std::string instead of C-style char arrays?
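For illustration, here is a rough sketch of the same logic with std::string and std::vector; the readStories helper and the vector-based _stories are assumptions, not the original class layout:

#include <fstream>
#include <string>
#include <vector>

// Hypothetical reworking of the constructor body with std::string.
// Assumes _stories is declared as std::vector<std::string>.
void readStories(std::vector<std::string> &_stories)
{
    std::ifstream file("Historier.txt");
    std::string line, history;

    while (std::getline(file, line))
    {
        if (line == "%%%%%")          // delimiter: store the finished story
        {
            if (!history.empty())
            {
                _stories.push_back(history);
                history.clear();
            }
        }
        else
        {
            history += line;          // no fixed-size buffer to overflow
        }
    }
    if (!history.empty())             // last story may not end with a delimiter
        _stories.push_back(history);
}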
I am trying to constantly read data into a buffer of type unsigned char* from different files. However, I can't seem to set the buffer to NULL prior to reading in the next file.
Here is only the relevant code:
#include <stdio.h>
#include <stdlib.h>
#include <fstream>

int main(int argc, char **argv) {
    FILE *dataFile = fopen("C:\\File1.txt", "rb");
    unsigned char *buffer = NULL;
    buffer = (unsigned char *)malloc(1000);
    fread(buffer, 1, 1000, dataFile);
    fclose(dataFile);

    dataFile = fopen("C:\\File2.txt", "rb");
    buffer = NULL;
    fread(buffer, 1, 1000, dataFile);
    fclose(dataFile);

    system("pause");
    return 0;
}
The error I run into is at the second occurrence of this line: fread(buffer,1,1000,dataFile);
The error I get is:
Debug Assertion Failed!
Expression: (buffer != NULL)
It points me to Line 147 of fread.c which is basically:
/* validation */
_VALIDATE_RETURN((buffer != NULL), EINVAL, 0);
if (stream == NULL || num > (SIZE_MAX / elementSize))
{
    if (bufferSize != SIZE_MAX)
    {
        memset(buffer, _BUFFER_FILL_PATTERN, bufferSize);
    }
    _VALIDATE_RETURN((stream != NULL), EINVAL, 0);
    _VALIDATE_RETURN(num <= (SIZE_MAX / elementSize), EINVAL, 0);
}
I Googled for ways to set the buffer pointer to NULL and tried the various suggestions, but none seem to work. Can anyone clarify the right way to set it to NULL?
Your buffer is a pointer.
When you do this:
buffer = (unsigned char*)malloc(1000);
you allocate some space in memory and assign its starting address to buffer. Remember, buffer holds the address of the beginning of that space; that's all. When you do this:
buffer = NULL;
you have thrown away that address.
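If you want to stay with the C-style approach, a minimal sketch under those constraints: keep the address, let fread overwrite the same buffer for each file, and free it once at the end (the file names are taken from the question):

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    unsigned char *buffer = (unsigned char *)malloc(1000);
    const char *files[] = { "C:\\File1.txt", "C:\\File2.txt" };

    for (int i = 0; i < 2; i++) {
        FILE *dataFile = fopen(files[i], "rb");
        if (dataFile == NULL)
            continue;
        // fread simply overwrites the buffer; no need to clear it first
        size_t n = fread(buffer, 1, 1000, dataFile);
        (void)n; // process n bytes here
        fclose(dataFile);
    }
    free(buffer); // release the address exactly once
    return 0;
}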
EDIT:
C++ style, without dynamic memory:
#include <fstream>
using std::string;
using std::ifstream;

void readFromFile(string fname)
{
    char buffer[1000];
    ifstream fin(fname.c_str());
    fin.read(buffer, sizeof(buffer));
    // maybe do things with the data
}

int main()
{
    readFromFile("File1.txt");
    readFromFile("File2.txt");
    return 0;
}
There's no need to erase the contents of the buffer. If the cost of allocating and deallocating the buffer with each call is too much, just add static:
static char buffer[1000];
It will be overwritten each time.
You can't set buffer = NULL, because fread will try to dereference it, and dereferencing NULL is completely illegal in C++. In effect you're also losing the address you got from malloc. Perhaps you're looking for memset, to zero the buffer:
memset(buffer, 0, 1000);
However, you don't need to do this before calling fread. There's simply no reason to, since fread will write into the buffer anyway: it doesn't care whether it's zeroed or not.
As a side note: you're writing very C-ish code in what I suspect is C++ (given your fstream header). There are better-suited I/O options for C++.
I'm trying to write to a file, and I get a segmentation fault when I delete the allocated memory. I don't understand what the problem is; please help:
void writeToLog(string msg) {
    int len = msg.size() + 1;
    char *text = new char(len);
    strcpy(text, msg.c_str());
    char *p = text;
    for (int i = 0; i < len; i++) {
        fputc(*p, _log);
        p++;
    }
    delete[] text; // THIS IS WHERE IT CRASHES
}
I also tried without the [ ], but then I get
*** glibc detected *** ./s: free(): invalid next size (fast): 0x09ef7308 ***
So what is the problem?
Thanks!
This:
char *text = new char(len);
should be:
char *text = new char[len + 1];
And this is all unnecessary anyway. Why are you doing it?
Well, delete[] doesn't balance new char(N); it balances new char[N]. The former creates a pointer to a single char and gives it the value N; the latter creates a pointer to an array of N chars and leaves the values undefined.
Of course, to write a std::string to a FILE*, why not just do:
fwrite(msg.c_str(), sizeof(char), msg.size() + 1, _log);
Note that this preserves the trailing null character, as your original code does.
char *text = new char(len);
allocates just one char. Try with:
char *text = new char[len];
Try this:
char *text = new char[len];
Then:
delete[] text;
Although the technical issue has been answered (a mismatched new/delete pair), I still think you could benefit from some help here, and I thus propose to help you trim your code.
First: there would not be any issue if you simply did not perform a copy.
void writeToLog(string msg) {
    typedef std::string::const_iterator iterator;
    for (iterator it = msg.begin(), end = msg.end(); it != end; ++it) {
        fputc(*it, _log);
    }
}
Note how I reworked the code to use C++ iterators instead of a mix of pointers and indices.
Second: what is this fputc call?
You should not need to use a FILE* in your code. If you do, you are likely to get it wrong too: forgetting to close it, closing it twice, etc.
The Standard Library provides the Streams collection to handle input and output, and for a log file the ofstream class seems particularly adapted.
std::ofstream _log("myLogFile");

void writeToLog(std::string const &msg) { // by reference (no copy)
    _log << msg;
}
Note how much simpler that is? And you cannot forget to close the file either, because if you do forget, it will be closed when _log is destructed anyway.
Of course, at this point one might decide that it is superfluous to have a function at all. However, such a function allows you to prefix the message, typically with timestamps, the PID, a thread ID or other decorations, so it's still nice.
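For instance, a minimal sketch of such a decoration, assuming a timestamp prefix is wanted (the format string is just an example):

#include <ctime>
#include <fstream>
#include <string>

std::ofstream _log("myLogFile");

// Hypothetical decorated version: prefix each message with a timestamp.
void writeToLog(std::string const &msg) {
    std::time_t now = std::time(nullptr);
    char stamp[32];
    std::strftime(stamp, sizeof(stamp), "%Y-%m-%d %H:%M:%S", std::localtime(&now));
    _log << '[' << stamp << "] " << msg << '\n';
}

int main() {
    writeToLog("application started");
    return 0;
}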