I am getting a heap corruption error while trying to free memory with delete.
Here's the code:
char** split(char* inputstr, char delim, int& count){
    char** ostr=NULL;
    int numStr = 0;
    int i=0,j,index=0;
    while(inputstr[i]){
        if(inputstr[i++]==delim)
            numStr++;
    }
    if(inputstr[i-1]!=delim)
        numStr++;
    count= numStr;
    ostr = new char*[numStr];
    i=0;
    while(inputstr[i])
    {
        j=i;
        while(inputstr[j] && inputstr[j] != delim)
            j++;
        ostr[index] = new char[j-i+1];
        //istr[j] = 0;
        strncpy(ostr[index], inputstr+i,j-i);
        ostr[index++][j-i]=0;
        i=j+1;
    }
    return ostr;
}
for(int i=0,countStr;i<_numComp;i++){
    char** _str = split(str[1+i],':',countStr);
    message.lastTransList.cmpName[i] = new char[strlen(_str[0])+1];
    strcpy(message.lastTransList.cmpName[i],_str[0]);
    message.lastTransList.price[i] = atof(_str[1]);
    for(int i=0; i<countStr;i++)
    {
        delete[] _str[i]; //this is working fine
        _str[i] = 0;
    }
    delete[] _str; //exception is thrown at this line
}
I am not able to find the problem. Please help!
It's hard to see any error from the code alone; there could be something wrong with your indexing that causes a buffer overrun in the split function, which is only caught when you try to delete the char** array.
How about converting to std::string and std::vector, as carlpett recommends? It's a good recommendation.
Something like this:
void split(const std::string& str_, char delimiter_, std::vector<std::string>& result_)
{
    std::string token;
    std::stringstream stream(str_);
    while( std::getline(stream, token, delimiter_) )
        result_.push_back(token);
}
Then, you just call it with your string, delimiter and an empty std::vector and end up with a populated vector of substrings. You don't have to use new/delete and worry about the memory issues.
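For instance, a minimal usage sketch (assuming the usual <sstream>, <string>, <vector> and <iostream> headers are included, and reusing the split function above; the input string is just an example):

std::vector<std::string> parts;
split("192.168.1.1", '.', parts);        // any string and delimiter will do
for (const std::string& p : parts)       // parts now holds "192", "168", "1", "1"
    std::cout << p << '\n';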
Here is the problem: when I try to copy a std::string into a char array using strncpy_s, the array ends up with some kind of "trash data" from memory at the end of it, even when I first fill the buffer with '\0'. How do I do the conversion cleanly?
typedef class Ryadok {
private:
    int LengthOf = 0;
    char text[20];
    string* address;
public:
    Ryadok(string strin) {
        this->text[0] = '\0';
        memset(text, '\0', sizeof(text));
        strncpy_s(text, strin.c_str(), sizeof(text) - 1);
        this->address = &strin;
        for (int i = 0; i < sizeof(strin); i++) {
            cout << this->text[i];
        }
    }
    ~Ryadok() {
    }
}*cPtr;

int main()
{
    Ryadok example("sdsdfsdf");
}
The idea is to use the c_str() function to convert the std::string to a C-string, and then simply call strcpy() to copy the C-string into the char array:
std::string s = "Hello World!";
char cstr[s.size() + 1]; // note: a variable-length array like this is a compiler extension, not standard C++
strcpy(cstr, s.c_str()); // or pass &s[0]
std::cout << cstr << '\n';
return 0;
When you call strncpy_s you tell it to copy at most sizeof(text) - 1 characters into your buffer "text", but it stops at the source string's terminating '\0', so the copy itself is not where the garbage comes from. The problem is the print loop: sizeof(strin) is the size of the std::string object itself (typically around 28 to 40 bytes), not the length of the text it holds, so the loop keeps reading after the end of the characters you actually copied and even past the end of the 20-byte text array.
That is where your garbage comes from. Even worse, you risk a segmentation fault this way: your code might access parts of the RAM it is not allowed to read, and that will cause it to crash.
You are right, though, to copy the data pointed to by the return value of c_str(). The pointer returned by c_str() points to data that belongs to the std::string object and might be changed or even invalidated by that object.
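A small sketch of why you cannot just hold on to the pointer from c_str() (illustrative only):

std::string s = "hello";
const char* p = s.c_str();  // p points into s's internal buffer
s += ", world";             // the string may reallocate its buffer here
// std::cout << p;          // p may now dangle; reading it would be undefined behavior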
Here's a modified version of your code that should avoid the garbage:
typedef class Ryadok {
private:
    int LengthOf = 0;
    char text[20];
    string* address;
public:
    Ryadok(string strin) {
        this->text[0] = '\0';
        memset(text, '\0', sizeof(text));
        if (strin.length() + 1 <= sizeof(text)) {
            strncpy_s(text, strin.c_str(), strin.length() + 1);
        } else {
            //some error handling needed since our buffer is too small
        }
        this->address = &strin; // careful: strin is a by-value parameter, so this pointer dangles once the constructor returns
        for (size_t i = 0; i < strin.length(); i++) { // loop over the string's length, not sizeof(strin)
            cout << this->text[i];
        }
    }
    ~Ryadok() {
    }
}*cPtr;

int main()
{
    Ryadok example("sdsdfsdf");
}
I'm studying C/C++ and the exercise I'm doing is to create a program which evaluates an arithmetic expression.
To complete the exercise, I need a general purpose function which is able to tokenize a string.
As the size of the string to parse is not known at compile time, I have to allocate dynamically some data in the heap.
After the work is done, the memory in the heap can be released.
My question is simple: am I releasing the memory correctly? See the questions in the comments.
Tokenize function
char** Tokenize(const char delimiter, const char* string)
{
    const char* pString = string;
    char** tokens = new char*[strlen(string)];
    char* buffer = new char[strlen(string)];
    char* pointer = buffer;
    int c = 0;
    for (int k = 0; k < strlen(string); k++)
    {
        if (string[k] == delimiter)
        {
            buffer[k] = '\0';
            tokens[c] = pointer;
            pointer = buffer + k + 1;
            c++;
            continue;
        }
        buffer[k] = string[k];
    }
    tokens[c] = nullptr;
    return tokens;
}
The main function, which tests the Tokenize function and releases the heap memory:
int main()
{
    char** tokens = Tokenize('.', "192.168.1.1");
    char** startTokensPointer = tokens;
    char* startBufferPointer = *tokens;
    while (*tokens != nullptr)
    {
        cout << *tokens << endl;
        tokens++;
    }
    delete[] startTokensPointer; //Releases tokens??
    delete[] startBufferPointer; //Releases buffer??
    system("PAUSE");
}
You are not deallocating buffer correctly in all cases. If none of the chars in string is equal to delimiter, the code in this if statement:
if (string[k] == delimiter)
will never be executed and c will remain 0. Then this line:
tokens[c] = nullptr;
will set the first element of tokens (the one main stores in startBufferPointer) to nullptr. In that case you are leaking buffer, because main never receives a pointer to it and delete[] startBufferPointer just deletes a null pointer.
tokens is deallocated correctly in all cases.
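To illustrate the leaking case, imagine (hypothetically) calling Tokenize with an input that contains no delimiter at all:

char** tokens = Tokenize('.', "localhost");  // no '.' in the input, so c stays 0
// tokens[0] == nullptr, so main's startBufferPointer becomes nullptr
// delete[] startBufferPointer then deletes a null pointer (a no-op) and buffer is never freed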
Yes, there is no memory leak, but why not use a type that makes it guaranteed?
#include <cstring>
#include <iostream>
#include <memory>

using std::cout;
using std::endl;

struct Tokens
{
    explicit Tokens(size_t len) : tokens(new char*[len]), buffer(new char[len])
    { }
    std::unique_ptr<char*[]> tokens;
    std::unique_ptr<char[]> buffer;
};

Tokens Tokenize(const char delimiter, const char* string)
{
    auto len = strlen(string);
    Tokens result(len);
    char* pointer = result.buffer.get();
    int c = 0;
    for (size_t k = 0; k < len; k++)
    {
        if (string[k] == delimiter)
        {
            result.buffer[k] = '\0';
            result.tokens[c] = pointer;
            pointer = result.buffer.get() + k + 1;
            c++;
            continue;
        }
        result.buffer[k] = string[k];
    }
    result.tokens[c] = nullptr;
    return result;
}

int main()
{
    auto tok = Tokenize('.', "192.168.1.1");
    char** tokens = tok.tokens.get();
    while (*tokens != nullptr)
    {
        cout << *tokens << endl;
        tokens++;
    }
}
Now all the memory is managed automatically and it's almost impossible to leak.
I have a problem with my code and I don't know what's wrong :(
I want to convert a string to a char* array and print it out at the end.
The output is currently three times the last word of the sentence I entered.
void parse(std::string &s, char **argv)
{
    std::istringstream iss(s);
    std::string tmp;
    while(iss >> tmp)
    {
        *argv++ = (char*) tmp.c_str();
    }
}

int main()
{
    std::string input;
    while (1)
    {
        std::getline(std::cin, input);
        int argCount = countArgs(input);
        char *argv[argCount];
        parse(input, argv);
        for(int i=0; i<argCount; ++i)
        {
            std::cout << argv[i] << std::endl;
        }
    }
    return 0;
}
I won't go into the details of why your current code does not work, because what you are doing is basically not a safe, good, or sane thing to do. Please rethink your approach.
Why use char* if you can use string?
If possible, instead of
char *argv[argCount];
...
void parse(std::string &s, char **argv)
...
*argv++ = (char*) tmp.c_str();
use
string argv[argCount];
...
void parse(std::string &s, string* argv)
...
*argv++ = tmp;
it will work almost identically to what you have now, "just" using a different datatype.
However, note that this all will still not be really C++'y, as you are using string argv[argCount]; with a non-constant argCount. I'm surprised your compiler actually compiles it; you must be using some C-compliance compilation flag. You should be using std::vector or std::list to keep an "array of variable length".
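As a rough sketch of what the std::vector version could look like (names are just illustrative, and countArgs is no longer needed because the vector grows by itself):

#include <sstream>
#include <string>
#include <vector>

std::vector<std::string> parse(const std::string& s)
{
    std::istringstream iss(s);
    std::vector<std::string> args;
    std::string tmp;
    while (iss >> tmp)
        args.push_back(tmp);   // copies each whitespace-separated word into the vector
    return args;
}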
You can avoid the issues mentioned in the comments by calling strdup() before saving your char *:
void parse(std::string &s, char **argv)
{
    std::istringstream iss(s);
    std::string tmp;
    while(iss >> tmp)
    {
        *argv++ = strdup( tmp.c_str() );
    }
}
You'll need to free() (not delete) these copies later.
for (int i = 0; i < argCount; ++i)
{
    std::cout << argv[i] << std::endl;
    free( argv[i] );
}
I am trying to replace arrays with vectors, but I can't figure out how.
The task is to change this function to dynamically allocate memory for vectors:
string readFile(string filename, string** list, int size){
    *list = new string[size];
    ifstream file(filename);
    string line;
    for (int i = 0; i < size; i++){
        getline(file, line);
        *(*list + i) = line;
    }
    file.close();
    return **list;
}
And here's my attempt to change it to vectors with no luck. Any feedback is greatly appreciated:
string processFile(string filename, vector<string>** list, int size){
    *list = new vector<string>(size);
    ifstream file(filename);
    string line;
    for (int i = 0; i < size; i++){
        getline(file, line);
        *list[i] = line; // error
    }
    file.close();
    return **list; // error
}
You will need some proper error handling, but basically, you need neither pointers nor fixed sizes if you use containers:
std::vector<std::string> readLinesFromFile(const std::string& filename)
{
    std::vector<std::string> result;
    std::ifstream file(filename);
    for (std::string line; std::getline(file, line); )
    {
        result.push_back(line);
    }
    return result;
}
There are several problems:
You don't need to use vector<string>**; the vector itself plays the role of list from the previous code.
The return type is string, but return **list; gives you a vector<string>.
This code should work, not tested though:
void processFile(string filename, vector<string>& list, int size){
    //list = new vector<string>(size); // no need if you get a vector reference
    ifstream file(filename);
    string line;
    for (int i = 0; i < size; i++){
        getline(file, line);
        list.push_back(line); //the error was because you were assigning a string to a vector<string>*
    }
    file.close();
    // you don't have to return anything, as the vector is passed by reference
}
If you still need to use a pointer to a vector:
void processFile(string filename, vector<string>** list, int size){
    *list = new vector<string>(); // bad practice; note that new vector<string>(size) would pre-fill the vector with size empty strings before the push_backs below
    ifstream file(filename);
    string line;
    for (int i = 0; i < size; i++){
        getline(file, line);
        (*list)->push_back(line);
    }
    file.close();
    // you don't have to return anything, as the vector is passed by pointer
}
Change *list[i] = line to (*list)->push_back(line) and you should be okay for the first error.
The second error is going to depend on what your intent is for the return value. I think return (*list)->front(); will give the same result as the first example, but if you are planning on returning more than just the first line then you will need to do some concatenation. You can just create a local string and append each line as you read them.
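If you go the concatenation route, something along these lines should work (untested sketch, keeping the vector<string>** parameter from the question):

string contents;
for (size_t i = 0; i < (*list)->size(); i++)
    contents += (**list)[i] + "\n";   // append each stored line, newline-separated
return contents;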
Hopefully your teacher knows using new vector is almost always a code smell in C++ and is using this for a specific reason with a plan to fix it later.
Here is a working example, enjoy :)
BTW, you don't need to pass the length; just create the vector and use the push_back method.
#include <vector>
#include <fstream>
#include <string>

using namespace std;

void processFile(string filename, vector<string>** list, int size);

int main()
{
    vector<string>* list = NULL;
    processFile("C:\\temp.txt", &list, 13);
    int i = 1;
}

void processFile(string filename, vector<string>** list, int size){
    *list = new vector<string>();
    ifstream file(filename);
    string line;
    for (int i = 0; i < size; i++){
        getline(file, line);
        (**list).push_back(line);
    }
    file.close();
}
Why am I getting "Access violation error reading " in the following program?
The error is thrown in the while loop that reads the file.
#include <iostream>

class fileReader
{
public:
    FILE *fp;
    char** lines;

    fileReader()
    {
        fp = NULL;
    }

    fileReader(const char* path)
    {
        int i=0;
        fp = fopen(path,"r");
        while ( fgets(lines[i], 100, fp) )
            i++;
    }
};

int main(int argv, char** argc)
{
    const char* path = "D:\\PS4263-2.txt";
    fileReader *p = new fileReader(path);
    for (int i=0; i<2; i++)
        std::cout<<p->lines[i];
    return 0;
}
EDIT
As suggested in the answers, I changed my code to the version below, but I am still getting the same error.
#include <iostream>

class fileReader
{
public:
    FILE *fp;
    char** lines;

    fileReader()
    {
        fp = NULL;
    }

    fileReader(char* path)
    {
        int j=0;
        fp = fopen(path,"r");
        if (fp == NULL)
            return;
        else
        {
            lines = (char**) malloc(sizeof(char *)*56000);
            for (int i=0; i<56000; i++)
                lines[i] = (char*)malloc(sizeof(char)*1440);
            while ( fgets(lines[j], 1440, fp) )
                j++;
            fclose(fp);
        }
    }
};

int main(int argv, char** argc)
{
    char* path = "D:\\testfile.txt";
    fileReader *p = new fileReader(path);
    for (int i=0; i<2; i++)
        std::cout<<p->lines[i];
    return 0;
}
There are a number of problems with this code. But primarily, the problem is that you're writing some evil C/C++ hybrid. Pick one of the two languages, and use that.
Here's a revised version of your code:
#include <iostream>

class fileReader
{
public:
    FILE *fp;
    char** lines;

    fileReader() : fp(NULL) // initialization of members happens here
    {
        //fp = NULL; // anything here happens *after* initialization
        lines = new char*[100]; // let's just assume max 100 lines. We have to allocate space for them
        for (int i = 0; i < 100; ++i) {
            lines[i] = new char[100]; // allocate space for the contents of each individual line
        }
    }

    fileReader(const char* path)
    {
        lines = new char*[100]; // let's just assume max 100 lines. We have to allocate space for them
        for (int i = 0; i < 100; ++i) {
            lines[i] = new char[100]; // allocate space for the contents of each individual line
        }
        int i=0;
        fp = fopen(path,"r");
        while ( fgets(lines[i], 100, fp) )
            i++;
    }

    ~fileReader() {
        // deallocate and close our members:
        fclose(fp);
        for (int i = 0; i < 100; ++i) {
            delete[] lines[i]; // delete the contents of each line
        }
        delete[] lines; // delete the lines array
    }
};

int main(int argv, char** argc)
{
    const char* path = "D:\\PS4263-2.txt";
    fileReader p(path); // don't use new unless you really really have to
    for (int i=0; i<2; i++)
        std::cout<<p.lines[i];
    return 0;
}
Now at least it works, if each line contains less than 100 characters and there are fewer than 100 lines and the file exists and a dozen other conditions that we really should protect against. In particular, we spend a lot of effort on memory management: allocating and deallocating space for all the line data.
But we can do a lot better with just a few changes, if we actually start writing C++.
#include <iostream> // we need this for the standard streams (cout)
#include <fstream>  // we need proper C++ file streams too
#include <string>   // C++ has strings. Don't waste your time on char pointers
#include <vector>   // C++ has a dynamic array class. Don't use pointers as ad-hoc arrays

class fileReader
{
public:
    // FILE* fp; // no point in making this a class member, when it's only used in one function
    std::vector<std::string> lines; // use a vector of strings. Much easier to manage

    fileReader() // vectors are automatically initialized, no need to do anything
    {
    }

    fileReader(std::string path)
    {
        std::ifstream fp(path); // create an input file stream
        std::string result; // store the contents of the current line here
        while (std::getline(fp, result)) {
            lines.push_back(result); // append the resulting line to the end of the vector
        }
    }
};

int main(int argv, char** argc)
{
    std::string path = "blah.txt";
    fileReader p(path); // don't use new unless you absolutely have to
    for (int i=0; i<2; i++)
        std::cout<<p.lines[i];
    return 0;
}
Note that we no longer have to manage our array memory. Vectors and strings automatically clean up after themselves when they go out of scope. And because we no longer use new to allocate the fileReader, it automatically gets deleted when it goes out of scope. This effectively starts a chain reaction where its members start cleaning up after themselves: the file stream closes, the vectors deallocates its memory after asking its stored strings to clean up and shut down. And the entire program folds over and closes down without us having to write a single line of code to handle it.
char** lines;
was never allocated any memory!
To be able to do anything meaningful with it, you need to allocate enough memory to hold the contents you intend to store in it.
Also, on a side note:
You never deallocate the dynamic memory allocated for p by calling delete p; once you are done with it, so that memory is leaked.
You never check the return values of standard library functions; you should always do so.
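For example (a minimal sketch), the fopen call could be checked before the file is used:

FILE* fp = fopen(path, "r");
if (fp == NULL)
{
    std::cerr << "could not open file\n"; // report the failure instead of reading through a null pointer
    return;
}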
There are many problems in your code:
char** lines has not been allocated. You need to allocate lines and lines[i].
You never check if the file is really open. Check fp before using it.
You forgot to close the file pointer at the end. Call fclose(fp).
EDIT:
You are not deallocating lines, lines[i], or p. Be careful: you must use free() for lines and lines[i], and delete for p.
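For completeness, a rough sketch of the matching cleanup in main, based on the edited code above which mallocs 56000 line buffers (and assuming the constructor actually allocated them):

for (int i = 0; i < 56000; i++)
    free(p->lines[i]);   // each line buffer came from malloc
free(p->lines);          // the array of pointers also came from malloc
delete p;                // the fileReader object itself came from new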