This question already has answers here:
Undefined, unspecified and implementation-defined behavior
(9 answers)
Closed 6 years ago.
I was solving an online exercise about de-allocating memory pointed to by a pointer using the delete keyword in C++. The following is my code.
#include <iostream>
#include <string>
#include <conio.h>

int main()
{
    double *ptrDouble = new double;
    *ptrDouble = 22;
    std::cout << "\nValue of ptrDouble = " << *ptrDouble << std::endl;
    delete ptrDouble;   // de-allocate the memory
    std::cout << "Value of ptrDouble = " << *ptrDouble << std::endl;   // read after delete
    getch();
}
So according to the online site where I am solving this exercise,
If you use the delete keyword on the pointer , the memory will be
deallocated and therefore the contents will not be available to your
application any longer. Attempting to access the contents will cause
your application to crash due to a memory violation.
But when I try to print the value of ptrDouble after deallocating the memory, the program doesn't crash; instead, a garbage value is printed on the console.
The question is: am I doing something wrong, or is the online site wrong about whether the program should crash?
P.S. I am using Visual Studio 2015 Community.
Accessing freed memory leads to undefined behavior. Crashing or reading garbage both fall into that category. Whether or not the program will actually crash depends on whether that particular block of memory was returned to the OS or merely made available for reuse.
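A minimal sketch of a version that never touches freed memory (the variable name value is mine, not from the exercise): copy the value out before the delete and null the pointer afterwards, so nothing is ever read through a dangling pointer.

#include <iostream>

int main()
{
    double *ptrDouble = new double;
    *ptrDouble = 22;

    double value = *ptrDouble;   // copy while the allocation is still alive

    delete ptrDouble;
    ptrDouble = nullptr;         // a stray dereference later is easier to catch in a debugger

    std::cout << "Value = " << value << std::endl;   // only the copy is used after delete
}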
This question already has answers here:
No out of bounds error
(7 answers)
Closed 1 year ago.
I'm working on dynamic arrays for my C++ course, but I'm confused about their behavior. For example, if I run this code:
int* myDynamicArr = new int[3];
for (int i = 0; i < 10; i++)
{
    myDynamicArr[i] = i + 1;
    cout << myDynamicArr[i] << endl;
}
I would expect this to fail, since I only allocated it with size 3. But when I run it, it prints 1 through 10. The same thing happens if I do this:
char* myCharArr = new char[2];
strcpy(myCharArr, "ThisIsALongString");
cout << myCharArr;
It prints the full string even though it seems like it should fail. Can anyone explain what I'm doing wrong here? Thanks!
C++ does not perform bounds checking on arrays. So when you read or write past the bounds of an array you trigger undefined behavior.
With undefined behavior, your program may crash, it may output strange results, or it may (as in your case) appear to work properly.
Just because it could crash doesn't mean it will.
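If you want an out-of-bounds access to fail loudly rather than silently, one option (a minimal sketch, not part of the original exercise) is std::vector with its checked at() accessor, which throws std::out_of_range instead of invoking undefined behavior:

#include <iostream>
#include <stdexcept>
#include <vector>

int main()
{
    std::vector<int> myVec(3);              // three elements, like new int[3]
    try {
        for (int i = 0; i < 10; i++) {
            myVec.at(i) = i + 1;            // at() checks the index on every access
            std::cout << myVec.at(i) << std::endl;
        }
    } catch (const std::out_of_range& e) {
        std::cout << "caught: " << e.what() << std::endl;   // thrown once i reaches 3
    }
}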
This question already has answers here:
Why doesn't vector::clear remove elements from a vector?
(10 answers)
Closed 6 years ago.
Here's the code:
#include <iostream>
#include <vector>
using namespace std;

vector<double> samples;

int main()
{
    samples.resize(100);
    for (int i = 0; i < 100; i++) {
        samples[i] = i / 100.0;
    }
    samples.clear();
    cout << "vector size: " << samples.size() << endl;
    cout << "... but samples[9]=" << samples[9] << endl;
}
And the output is:
vector size: 0
... but samples[9]=0.09
After clearing the vector (its size is 0), I can still access its data. Is that normal?
The reference says the elements will be "destroyed", but it seems that doesn't mean "default/empty" values.
In other languages I would get an "out of range" error at runtime...
it seems that doesn't mean "default/empty" values.
Yes, it's just UB.
Note that std::vector::operator[] doesn't perform bounds checking, while std::vector::at() does, and an exception of type std::out_of_range will be thrown in that case.
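A minimal sketch of that difference using a vector like the one in the question (the exact text printed by what() is implementation-defined):

#include <iostream>
#include <stdexcept>
#include <vector>

int main()
{
    std::vector<double> samples(100, 0.5);
    samples.clear();

    // samples[9] would compile and "work", but it is undefined behaviour.
    // samples.at(9) performs a bounds check and throws instead:
    try {
        std::cout << samples.at(9) << std::endl;
    } catch (const std::out_of_range& e) {
        std::cout << "out_of_range: " << e.what() << std::endl;
    }
}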
C++ differs from languages like Java, C# or Python in that many things are left as undefined behavior instead of producing an error, especially where detecting them would incur a runtime cost. Checking out-of-bounds accesses on arrays is such an example. Being undefined behavior, this also gives the compiler a great degree of freedom to optimize your code.
In your specific case, an out-of-bounds access via std::vector::operator[] is undefined behavior. The compiler is free to generate any behavior it wants; the common case is that the access simply reads that memory location and returns whatever happens to be there.
This question already has answers here:
Can a local variable's memory be accessed outside its scope?
(20 answers)
Closed 7 years ago.
So I'm learning C++ (coming from a Java background). I thought I understood how memory works on a high level (stack vs heap and pointers). To experiment, I wrote the following two toy functions:
int* pntrToHeap(int val) {
return new int(val);
}
and
int* pntrToStack(int val) {
return &val;
}
At first I thought pntrToStack just wouldn't work, because the local variable val is on the stack, which is "deleted" after the function exits. But after the following code ran without errors (with one warning, however), I reconsidered:
int main()
{
    int val1 = *pntrToHeap(3);
    int val2 = *pntrToStack(4);
    cout << val1 << endl;
    cout << val2 << endl;
    return 0;
}
Both 3 and 4 were printed to the screen. It seems as though the stack isn't actually deleted, but the program just loses the ability to access local variables on it -- is this correct? If so, in a case like this, which function should we prefer?
Lastly, since val1 is a local variable of main, is pntrToHeap creating a memory leak, since I can't delete the value it created on the heap?
I know these concepts have been asked about before, but I couldn't quite find the answers. Thanks!
Definitely the first one! If you want something to live after the stack frame expires, you should heap-allocate it.
And yes, the value pointed to by the pointer returned from pntrToStack will be overwritten the next time you allocate a new stack frame, i.e. call a function. When you exit a scope, the memory is not erased; it is merely marked as free to allocate.
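A small sketch of that effect (the helper clobber() is mine, and since every read through the returned pointer is still undefined behaviour, it may equally well crash or print something else on your compiler): the old value often survives until the next function call reuses the stack slot.

#include <iostream>

int* pntrToStack(int val) {
    return &val;                 // address of a parameter that dies when the function returns
}

void clobber() {
    int noise[16] = {7, 7, 7};   // locals here are likely to land on the reused stack region
    (void)noise;
}

int main()
{
    int* p = pntrToStack(4);
    int first = *p;              // undefined behaviour, but often still reads 4
    clobber();                   // another call likely overwrites that part of the stack
    int second = *p;             // undefined behaviour; may now be a different value
    std::cout << first << " " << second << std::endl;
    return 0;
}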
This question already has answers here:
Can a local variable's memory be accessed outside its scope?
(20 answers)
Closed 8 years ago.
Please consider this simple example:
#include <iostream>

const int CALLS_N = 3;

int * hackPointer;

void test()
{
    static int callCounter = 0;
    int local = callCounter++;
    hackPointer = &local;
}

int main()
{
    for(int i = 0; i < CALLS_N; i++)
    {
        test();
        std::cout << *hackPointer << "(" << hackPointer << ")";
        std::cout << *hackPointer << "(" << hackPointer << ")";
        std::cout << std::endl;
    }
}
The output (VS2010, MinGW without optimization) has the same structure:
0(X) Y(X)
1(X) Y(X)
2(X) Y(X)
...
[CALLS_N-1](X) Y(X)
where X is some address in memory and Y is some rubbish number.
What is done here is a case of undefined behaviour. However, I want to understand why the behaviour looks like this under these conditions (and why it is rather stable across two compilers).
It seems that after the test() call, the first read through hackPointer reaches the valid value, but the second, immediately following read yields rubbish. Also, the address of local is the same on every call. I always thought that memory for a stack variable is allocated on every function call and released after return, but I can't explain the program's output from this point of view.
"Releasing" automatic storage doesn't make the memory go away, or change the pattern of bits stored there. It just makes it available for reuse, and causes undefined behaviour if you try to access the object that used to be there.
Immediately after returning from the function, the memory occupied by the local probably hasn't been overwritten, so reading it will probably give the value that was assigned within the function.
After calling another function (in this case, operator<<()), the memory is likely to have been reused for a variable within that function, so probably has a different value.
You are quite right that this is undefined behaviour.
That aside, what's happening is that std::cout << *hackPointer involves a function call: operator<<() gets called after the value of *hackPointer has been read. In all likelihood, operator<<() uses its own local variables that end up on the stack where local was, wiping out the latter.
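A sketch of how that plays out in the loop (reading *hackPointer after test() returns is still undefined behaviour, so this only makes the "expected" result more likely, it doesn't make the code correct): take one copy of the value immediately after the call, before any other function has had a chance to reuse the stack slot, and print the copy.

for (int i = 0; i < CALLS_N; i++)
{
    test();
    int copy = *hackPointer;   // still UB, but nothing has reused local's slot yet
    std::cout << copy << "(" << hackPointer << ")";
    std::cout << copy << "(" << hackPointer << ")";   // the copy, not the dead object
    std::cout << std::endl;
}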
This question already has answers here:
Can a local variable's memory be accessed outside its scope?
(20 answers)
Closed 8 years ago.
Why does this run fine? (And several times in a row..)
double* p(nullptr);
cout << p << endl; // "00000000"
{
    double d(82.);
    p = &d;
}
cout << p << endl; // "0029FD98"
// Naughty, dirty, sneaky..
// .. but rather *pure* curiosity after all.. u_u
cout << *p << endl; // "82", first surprise
*p = 83.; // (getting further down the hole..)
cout << *p << endl; // "83", and I almost feel disappointed. :(
Isn't d supposed to be out of scope and 0029FD98 deallocated? Why isn't my OS mad at me? Am I just super lucky?
You are invoking undefined behavior. According to the C++ specification, anything might happen here. Undefined behavior is a very bad thing, because it means you cannot know what your program might do. Avoid it at all costs.
On your particular platform with your particular compiler, this probably works because the variable was allocated on the stack, and the stack memory is not (usually) deallocated while the program is running. As a result, 0029FD98 refers to an address within an allocated region of memory (in this case, the stack). As soon as you call a function, this location is likely to be overwritten with whatever that function needs the stack space for.
On other systems and/or compilers, where local variables and/or the stack might behave or be implemented differently, this could output some random number, or it might crash, or it might output the collective works of Shakespeare.
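A small sketch of that last point (the helper useSomeStack() is mine; every read of *p is undefined behaviour, so on another compiler this may crash or print something entirely different): calling any function after the block ends is likely to reuse the stack slot that held d.

#include <iostream>

void useSomeStack()
{
    double locals[8] = {1, 2, 3, 4, 5, 6, 7, 8};   // likely lands where d used to live
    (void)locals;
}

int main()
{
    double* p = nullptr;
    {
        double d = 82.0;
        p = &d;
    }
    std::cout << *p << std::endl;   // UB: often still prints 82

    useSomeStack();                 // reuses the stack region
    std::cout << *p << std::endl;   // UB: may now print one of the array values, or garbage
    return 0;
}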