I have been doing some research and still can't find a solution to my problem.
As far as I know, when we declare variables outside of a function, they are allocated on the heap, and that memory is not freed until the end of execution unless we specifically free it with the delete operator. I've tried the following to free the arrays declared at the beginning of the code, and none of them worked (I get a debug error in dbgdel.cpp): delete, delete [], free().
What am I doing wrong?
I will paste underneath a summarized version of the code. Any help is appreciated. Thank you!
(I know global variables are not usually desirable in proper programming, but it is not my code; I am just trying to fix it as it is.)
#include <stdio.h>
#include <conio.h>
#include <cv.h>
#include <highgui.h>
#include <cxcore.h>
#include "Viewer.h"
....
// Arrays
float z_base [5201][5201];
....
uchar maskThreshold [5200][5200];
...
void main(){
.....
delete [] z_base;
//free (z_base);
//delete z_base;
//free (&z_base);
}
As far as I know, when I declare variables outside of every function they are allocated within the heap
This is not true. In general, you only have to call delete or delete[] if you have allocated memory with new or new [] respectively.
There is no heap (in C++).
All memory is released at the end of execution.
delete what you new, delete[] what you new[].
void main is wrong (main must return int).
That is all.
You don't. The run-time will do it for you. As a rule of thumb, you only need to delete/delete[] that which you allocated with new/new[].
Also, note that globals are not allocated on the heap, but in static memory.
Related
I tried to allocate dynamic memory for an array of empty queues using the malloc function, as the code shows. However, the output of (*pq).size() is incorrect -128, and calling (*pq).push() will cause an error. Where did I go wrong? How to allocate the memory correctly?
#include <stdio.h>
#include <stdlib.h>
#include <queue>
typedef std::queue<int> iq;
iq *pq;
int main() {
pq = (iq *) malloc(16 * sizeof(iq));
printf("%d\n", (*pq).size());
printf("%d\n", (*pq).empty());
// (*pq).push(12);
(*pq).pop();
printf("%d\n", (*pq).size());
free(pq);
return 0;
}
How to allocate memory for an array of queues by malloc?
Just like you did in your example. Technically, it's not the allocation that's wrong. Although, see †.
Where did I go wrong?
You only allocated some amount of memory for the queues. You never constructed any queue object into that block of memory. Dynamic objects are constructed using a new-expression. To construct an object into a block of memory allocated by malloc, you could use the placement-new syntax.
How to allocate the memory correctly?
By not using malloc.
† There is no good reason to use malloc in C++.
Here is a correct way to allocate dynamic memory for an array of 16 queues, as well as construct those queues:
std::vector<std::queue<int> > pq(16);
The vector will take care of numerous problems that you would shoot yourself in the foot otherwise. It will take care of constructing the elements as well as destroying them, avoids memory leaks and double removes, as well as more subtle issues like exception safety.
It's not clear whether malloc is a requirement of your solution. A native C++ solution avoids most of these issues and has more readable semantics. See below.
In the code below, I've switched to iostream and vector, which frees you to reason about everything at the same level of abstraction. malloc is a low-level C routine for allocating raw dynamic memory; you're already using std::queue, so it doesn't make much sense to mix in malloc when a vector will do just fine.
Solution
#include <queue>
#include <iostream>
#include <vector>
using iq = std::queue<int>;
using vec = std::vector<iq>;
int main()
{
using namespace std;
vec pq;
pq.resize(16);
pq[0].empty();
cout << pq[0].size() << endl;
pq[0].push(12);
pq[0].push(13);
pq[0].push(11);
pq[0].pop();
cout << pq[0].size() << endl;
return 0;
}
Example Output
$main
0
2
I'm trying to hunt down memory leaks using Xcode instruments (5.0.1) for a C++ project. Consider this:
#include <iostream>
#include <chrono>
#include <thread>
class Person
{
public:
int _age;
};
int main(int argc, const char * argv[])
{
Person* pers1 = new Person();
pers1->_age = 25;
std::cout << "Pers1 age is " << pers1->_age << std::endl;
std::this_thread::sleep_for(std::chrono::milliseconds(5000));
return 0;
}
I'm not expecting the "Leaks" instrument to report any leaks, since
Leaks [...] doesn't know what you're gonna do with your pointers. All it knows is that every block of allocated memory is still referenced somewhere.
as Zneak points out in his answer to a similar question. *
However, I'm quite surprised that Allocations doesn't hint at a Person object being allocated, and subsequently leaked.
What would I have to do (except for pers1 = NULL;) in order to get Instruments to report that this program was leaking? Or is there no leak here at all, and have I misunderstood something regarding the memory allocation model of C++? Doesn't every new call have to be matched with a delete call?
*) In the same answer, the claim is being made that
For a leak to occur, you have to lose the reference you have to allocated memory.
I have no better word for heap memory not being freed by a program that allocates said memory than "leak". So technically what I'm asking about may not be a "leak", but I think it is clear what phenomena I'm thinking about.
Your pers1 variable is still in scope until the end of the main function, so the memory isn't ever going to be reported as leaked. As soon as the memory would leak, the program terminates anyway.
If you wanted to make the memory leak, you could put the first three lines of the body of main in another function, then call that function from main before you call sleep. It should be able to report that the memory leaked, because the program is still running, but the pers1 variable would no longer be in scope.
This question already has answers here:
Stack overflow visual C++, potentially array size?
(2 answers)
Closed 9 years ago.
Why is this code giving segmentation fault? I am using code::blocks.
#include <iostream>
#include <cstdio>
using namespace std;
int main()
{
int a[555555];
}
This is what is called a stack overflow.
Generally, the stack is small, and you can't allocate such a large amount of memory on it.
For this purpose, programmers allocate it on the heap (using dynamic allocation). In C, you can use the malloc family of functions:
int *a = malloc(sizeof(int) * 555555); // Use free(a) to deallocate
In C++, you can use the new operator:
int *b = new int[555555]; // Use delete [] to deallocate
Because you are trying to allocate a bit over 2 MB of memory (555555 ints at 4 bytes each) on the stack, which then blows the stack.
Note: on Windows the default stack size for a thread is 1 MB, and on GNU/Linux you can find out the stack size limit using the ulimit -s command.
You've come to the right place to ask the question. ;)
The array is large and lives on the stack. The code crashes because it runs out of the limited stack space.
If you allocate a on the heap, the problem will likely disappear.
As others have already told you, you are trying to allocate a large amount of memory on the stack, whose space is usually very limited.
See for instance:
#include <iostream>
#include <cstdio>
using namespace std;
int main()
{
int a[555555];
int* b = new int[555555];
delete [] b;
}
In that snippet, you have two arrays of integers: b is allocated on the heap and a on the stack.
Here you can find some explanations about which are the differences between the heap and the stack:
What and where are the stack and heap?
http://gribblelab.org/CBootcamp/7_Memory_Stack_vs_Heap.html
I've got some considerations on your code.
First, modern compilers may recognize that a is unused and optimize it away entirely, in which case nothing crashes.
But if you write a value into some position of the array, a must actually be allocated on the stack, even though it's bigger than the stack size. The kernel will not allow that: that's why you get a SIGSEGV.
Finally, you should rely on std::vector (or std::array, for sizes that fit on the stack) rather than plain C arrays.
I'm using some dynamically allocated arrays of multiprecision variables (from the mpc library) and wonder whether it is necessary both to clear the variables and to delete the arrays to avoid memory leaks and the like. In other words, is all the housekeeping in the snippet below necessary?
using namespace std;
#include <gmp.h>
#include <mpfr.h>
#include <mpc.h>
int main() {
int i;
mpc_t *mpcarray;
mpcarray=new mpc_t[3];
for(i=0;i<3;i++) mpc_init2(mpcarray[i], 64);
// Manipulations
for(i=0;i<3;i++) mpc_clear(mpcarray[i]);
delete [] mpcarray;
return 0;
}
Yes, it is necessary.
The general rule of life applies here:
"You should dispose what you use!"
If you don't, you get a memory leak: each mpc_init2 allocates internal storage that only mpc_clear releases, and delete [] alone will not run that cleanup for you.
Dynamic memory is a feature that gives you explicit memory management in your program, and if you use it (by calling new or new []) then it is your responsibility to ensure its proper release (by calling delete or delete [] respectively).
Note that you are much better off using automatic (local) variables instead of dynamically allocated ones where you can.
And if you must allocate dynamically, use smart pointers instead of raw pointers. They give you the advantages of dynamic memory minus the explicit memory management overhead.
Why am I getting memory leak errors without allocating anything or adding any elements to the list below? Should I just ignore them?
#define CRTDBG_MAP_ALLOC
#include <crtdbg.h>
#include <list>
using std::list;
int main()
{
list <char*> roots;
_CrtDumpMemoryLeaks();
}
You are not giving the roots variable a chance to be destroyed before checking for memory leaks. If roots is destroyed first, you should notice that everything is cleaned up. Try this instead.
#define CRTDBG_MAP_ALLOC
#include <crtdbg.h>
#include <list>
using std::list;
int main()
{
{
list <char*> roots;
}
_CrtDumpMemoryLeaks();
}
The list hasn't been destructed yet when you call _CrtDumpMemoryLeaks, so any allocation it has performed is treated as a memory leak. This has nothing to do with the char*: the same thing would happen with list<int>.
_CrtDumpMemoryLeaks simply reports any allocations that haven't been freed yet. It has no way of knowing that the list destructor is yet to run and perform deallocations.
If you do C++, then using std::string instead of char* might be a better practice.
Anyway, you must understand that the container holds pointers to chars, not the chars themselves. So, on destruction, it will free the memory occupied by the pointers, but not the pointed memory.
In short, it is up to you to free every char* before destroying or clearing the list.
In general, it's better to call
_CrtSetDbgFlag(_CRTDBG_ALLOC_MEM_DF | _CRTDBG_LEAK_CHECK_DF); // you might need other flags too
which causes the runtime to dump memory leaks just before the program exits, than to call _CrtDumpMemoryLeaks explicitly. That way you can be sure that any local variables still in scope, as well as any globals, will already have been destroyed, so any reported memory leaks are "real".