Pointer Error Debugging Help - C++

I am new to C++, and I am trying to program a replica of Pong using freeglut with Visual Studio 2010. My code was working fine until I made some revisions to make it more object-oriented. When I ran it again, it worked a couple of times, until I made some minor edit (to be honest, I can't remember what I changed, since I only built the code the next morning), and I received this runtime error as soon as I ran the program:
Unhandled exception at 0x1000bbae in Pong.exe: 0xC0000005: Access violation writing location 0x000000a8.
I'm not particularly good at debugging, but I believe the error occurs at the glutDisplayFunc() call in my main function. I think it has something to do with null pointers, but I have no idea what. Could somebody help me find the problem?
Below is the applicable portion of my program (Main.cpp):
include "header.h"
using namespace std;
int main(int argc, char** argv) {
    //Initialize GLUT
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(WIDTH, HEIGHT); //Set the window size
    //Create the window
    //modesetup(2);
    //Set handler functions for drawing, keypresses, and window resizes
    glutDisplayFunc(mainDraw);
    glutKeyboardFunc(handleKeypress);
    glutKeyboardUpFunc(handleKeyup);
    glutSpecialFunc(handleSpecial);
    glutSpecialUpFunc(handleSpecialUp);
    glutReshapeFunc(handleResize);
    glutTimerFunc(25, update, 0);
    glutMainLoop(); //Start the main loop. glutMainLoop doesn't return.
    return 0; //This line is never reached
}
Thanks in advance!
EDIT: I just made a new program that uses very standard functions for everything, but I still get this error. I'm beginning to think that there may be something wrong with my freeglut installation.

Use F5 to run the program with debugging in Visual Studio. When you get the error, the debugger should place you right on the line with the illegal access. If it doesn't have source code for that location, check up the call stack.
Cheers & hth.

Run your program under valgrind. It will give you a lot more information about the problem, and might even point out memory errors other than the immediate cause of the crash.

You probably already solved the problem. But since I had the same problem and couldn't find an answer to it anywhere, I will answer the question:
You simply forgot to write the
glutCreateWindow("Windowname");
function before the
glutDisplayFunc(mainDraw);
glutKeyboardFunc(handleKeypress);
glutKeyboardUpFunc(handleKeyup);
glutSpecialFunc(handleSpecial);
glutSpecialUpFunc(handleSpecialUp);
glutReshapeFunc(handleResize);
glutTimerFunc(25, update, 0);
part.
This leads to the exception you get.
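For reference, here is a minimal sketch of the corrected ordering in main. The handler names, the WIDTH/HEIGHT constants, and header.h come from the question above; the window title is arbitrary. freeglut associates these callbacks with the current window, so registering them before glutCreateWindow makes it write through a null pointer, which matches the access violation at a small address.
#include "header.h"
int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(WIDTH, HEIGHT);
    glutCreateWindow("Pong");          // create the window BEFORE registering any callbacks
    glutDisplayFunc(mainDraw);
    glutKeyboardFunc(handleKeypress);
    glutKeyboardUpFunc(handleKeyup);
    glutSpecialFunc(handleSpecial);
    glutSpecialUpFunc(handleSpecialUp);
    glutReshapeFunc(handleResize);
    glutTimerFunc(25, update, 0);
    glutMainLoop();                    // never returns
    return 0;
}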

Related

How do I fix this error when executing my program?

My simple SDL 1.2 code compiles successfully every time, but when I try to run it from a terminal (Alt+T in Ubuntu):
./game
Segmentation fault (core dumped)
I get this error. Can you help, please? This is the code:
#include <SDL/SDL.h>
int main(int argc, char** args)
{
    SDL_Init(SDL_INIT_EVERYTHING);
    SDL_Surface* screen;
    screen = SDL_SetVideoMode(640, 480, 32, SDL_HWSURFACE);
    SDL_Flip(screen);
    SDL_Delay(5000);
    SDL_FreeSurface(screen);
    SDL_Quit();
    return 0;
}
SDL_SetVideoMode returns NULL on error which you do not check for.
Since you're running this via a terminal, I suspect you may have forgotten to tell Xorg to allow running from it. In fact, if this is really the problem it'll prevent any program from running when started that way.
To fix the problem, enter this into the terminal (this only needs to be done once per session):
xhost +
You should get a message that it was successful. I cannot recall the exact message, but it is something like this:
Clients are now allowed to connect from any host.
What was happening (assuming I was correct regarding xhost) was that the SDL_SetVideoMode() call was failing and returning NULL, because Xorg rejected the connection. Since you're not checking for that, SDL_Flip() ended up dereferencing a NULL pointer, hence the segfault.
SIDE NOTE: There is an error in your code, however: you should not call SDL_FreeSurface(screen); that particular surface is special and is freed automatically by SDL_Quit(). Source (see the "Return Value" section): http://www.libsdl.org/release/SDL-1.2.15/docs/html/sdlsetvideomode.html
Check if SDL_SetVideoMode() failed!
screen = SDL_SetVideoMode(640, 480, 32, SDL_HWSURFACE);
if (screen == NULL) /* error */;
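A minimal sketch of that check inside the main above, assuming you want to print the SDL error string and exit (the exact handling is up to you, and <stdio.h> is assumed to be included):
screen = SDL_SetVideoMode(640, 480, 32, SDL_HWSURFACE);
if (screen == NULL) {
    fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError()); // report why the mode could not be set
    SDL_Quit();
    return 1;
}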
Run it under valgrind. Or GDB. Or some other debugger of your choice.
Make sure that screen is actually being allocated successfully before you use it.

Basic OpenGL program crashes, but works with gdb

I copied the first program in the OpenGL Reference Guide for an incredibly basic GLUT OpenGL program.
I am using the Code::Blocks IDE and running on Ubuntu 12.10.
I am using an ATI Mobile Radeon 4670 with the fglrx driver.
I am using this code to make sure my environment was working properly.
Here is the code:
#include <GL/glut.h>
#include <stdlib.h>

void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_POLYGON);
    glVertex3f(0.25, 0.25, 0.0);
    glVertex3f(0.75, 0.25, 0.0);
    glVertex3f(0.75, 0.75, 0.0);
    glVertex3f(0.25, 0.75, 0.0);
    glEnd();
    glFlush();
}

void init(void)
{
    glClearColor(0.0, 0.0, 0.0, 0.0);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0, 1.0, 0.0, 1.0, -1.0, 1.0);
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutInitWindowSize(250, 250);
    glutInitWindowPosition(100, 100);
    glutCreateWindow("hello");
    init();
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
I figured this was the simplest code I could run and hope would compile. After some finagling with the include/ and lib/ directories (/usr/include and /usr/lib/x86_64-linux-gnu), I managed to get it to compile with no errors.
When run either from within Code::Blocks or from the terminal, I get a segmentation fault. A shadow of a window appears, but then it is destroyed and the program exits.
The really strange part is when I try to debug it. Selecting "Debug" from within Code::Blocks or using gdb myself (running gdb <program> and then run) on the command line, it runs just fine. No errors or issues are encountered whatsoever and it executes as expected.
This makes it extremely difficult for me to figure out what the problem is. I had gdb check a generated core file from executing normally, but all it said was
Program terminated with signal 11, Segmentation fault.
#0 0x00007f9ee3a5815c in ?? ()
Real big help. Any ideas? I might have something wrong with my configuration, so ask away.
I'm not sure, since I've never used GLUT, but this site seems to suggest calling glutInitDisplayMode like this:
glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
While you call it like this
glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
Comparing the two makes me wonder if you should maybe switch your two parameters?
edit:
Never mind me, I wasn't thinking straight. Those are not two parameters, of course; they are simply OR'd together. It's 9 AM here and I haven't slept, so forgive my mistake.

freeglut fails to open display with valgrind

I'm running the Eclipse IDE on Ubuntu 12.04 with the Valgrind plugin. I've been working on a game in C++ for quite some time, and I seem to have a memory error somewhere. Normally, I trace these with Valgrind. However, GLUT fails to initialize when I run the program under Valgrind. I should note that the game initializes without a problem when I'm not using Valgrind. The code for my main function is as follows:
int main(int argc, char** argv) {
    char windowTitle[12] = "Game Window";
    printf("Initializing Glut...\n");
    glutInit(&argc, argv);
    printf("Glut initialized!\n");
    alutInit(&argc, argv);
    Game_Object* game = new Game_Object(windowTitle, 1200, 675, argc, argv);
    delete game;
    printf("game ended\n");
    return 0;
}
The resulting output to the console is:
Initializing Glut
freeglut (/home/dsnettleton/Documents/Programming/Eclipse/workspace/Plutoids/Debug/Plutoids): failed to open display ''
Obviously, the program isn't getting very far with valgrind running.
It's really disheartening to be in such a final stage of my development, only to get stuck trying to weed out a memory error. What might be keeping glut from initializing, and what can I do to fix the problem?
This is my guess: your IDE is probably missing the $DISPLAY environment variable. Somewhere you have to configure the environment to set $DISPLAY before launching Valgrind.
Launch a terminal and echo $DISPLAY. Its value is probably :0.0.
In the worst case, I'd try using setenv() inside the C code or set DISPLAY in the command line that launches Valgrind (none of these cases was tested, they may not work).
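A minimal sketch of the setenv() idea (untested; the :0.0 value is just the usual default, so adjust it to whatever echo $DISPLAY prints on your machine):
#include <cstdlib>    // setenv (POSIX)
#include <GL/glut.h>

int main(int argc, char** argv) {
    setenv("DISPLAY", ":0.0", 0);  // last argument 0 = only set DISPLAY if it isn't already set
    glutInit(&argc, argv);         // freeglut can now find the X display even under Valgrind
    // ... rest of the program ...
    return 0;
}
Equivalently, you could set DISPLAY in the environment of whatever command launches Valgrind.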
Also, you have to set this environment variable, DISPLAY=:0.0, inside Eclipse.
In the launch configuration of your executable, add a DISPLAY variable to the Environment tab, or select the option to inherit the environment:
Run -> Run Configurations... -> Environment
Click New and add a variable named DISPLAY with the value :0.

Why doesn't cvWaitKey(0) work?

I'm not sure why, but for some mysterious reason my C++ application no longer waits when it reaches cvWaitKey(0); it just passes over this line, as if the function doesn't do anything!
I also tried cvWaitKey(100000); it doesn't work either...
int main() {
    cvWaitKey(0);
    return 0;
}
My project is a little complex. I'm using Visual Studio 2010, and it includes OpenCV, FFmpeg, pthreads, Winsock, and some other libraries.
Can you guess why this happens?
Have you called cvNamedWindow yet? It will not work without cvNamedWindow.
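A minimal sketch of what I mean (the include path may differ depending on your OpenCV version, and the window name is arbitrary):
#include <opencv/highgui.h>  // old C API header; adjust the path for your OpenCV install

int main() {
    cvNamedWindow("debug", CV_WINDOW_AUTOSIZE);  // cvWaitKey pumps events for HighGUI windows,
                                                 // so it needs at least one window to exist
    cvWaitKey(0);                                // now blocks until a key is pressed while the window has focus
    cvDestroyWindow("debug");
    return 0;
}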
I've had the issue myself a few times, but I can only speculate on what causes this. I can offer a work-around though:
while (1) {
    int key = cvWaitKey(10);
    if (key == 27) break;
}
This will block until ESC is pressed.

Corruption of the heap & F12 Problem

I'm trying to draw a line using GLUT with C++ (the IDE is VS 2008), but an error message occurred:
Windows has triggered a breakpoint in Graphics.exe.
This may be due to a corruption of the heap, which indicates a bug in Graphics.exe or any of the DLLs it has loaded.
This may also be due to the user pressing F12 while Graphics.exe has focus.
The output window may have more diagnostic information.
Of course, I don't have any breakpoints in my code. This is my code:
#include <glut.h>

void init(void)
{
    glClearColor(1.0, 1.0, 1.0, 0.0);
    glMatrixMode(GL_PROJECTION);
    gluOrtho2D(0.0, 200.0, 0.0, 15.0);
} //end of the function init

void lineSegment(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glColor3f(1.0, 0.0, 0.0);
    // Draw a line
    glBegin(GL_LINES);
    glVertex2i(180, 15);
    glVertex2i(10, 145);
    glEnd();
    glFlush();
} //end of the function "lineSegment"

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutInitWindowPosition(50, 100);
    glutInitWindowSize(400, 300);
    glutCreateWindow("N.S");
    init();
    glutDisplayFunc(lineSegment);
    glutMainLoop();
    return 0;
} //end of the "main" function
Anyone know the problem?
A little googling produced some results. It looks like F12 is reserved by the OS when you are running in the debugger. Here is a good thread about the subject. There is a workaround available from MSFT in this Connect article. The gist of it is that when a debugger is active, the OS responds to F12 by entering the debugger at exactly the line of code that is currently executing.
If you are not in a debugger, then this is probably a heap corruption problem. Your code snippet looks pretty simple, but I do not know GL well enough to know whether you are missing a required call or breaking some other procedural rule.
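For illustration only (a hypothetical example, not taken from the question), this is the kind of heap corruption that typically trips that breakpoint:
#include <cstring>

int main() {
    char* buf = new char[8];
    std::strcpy(buf, "far more than eight bytes of text");  // writes past the end of the allocation,
                                                            // trampling the heap's bookkeeping data
    delete[] buf;  // the damaged heap is usually detected here, which triggers the breakpoint
    return 0;
}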
Just to add to what D.Shawley's written: The F12 key is quite handy, once you know about it.
It's worth stressing that the F12 key is only active while a debugger is attached and this key acts normally when there's no debugger. Still, it's safer to avoid mapping the F12 shortcut to anything useful in your app, for those times when somebody needs to debug.
I got this same error message when programming in C in Visual Studio, totally unrelated to the F12 key. For anyone else programming in C who found this post via Google: my error was caused by a dangling pointer in my code.
Check all of your free() calls and make sure you don't have any pointers left that refer to memory you have already deallocated.
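A hypothetical example of the kind of dangling pointer meant here (the names are invented for illustration):
#include <stdlib.h>
#include <string.h>

int main(void) {
    char* name = (char*)malloc(16);
    char* alias = name;           // a second pointer to the same block
    strcpy(name, "pong");
    free(name);
    name = NULL;                  // nulling out the freed pointer is not enough...
    strcpy(alias, "still here");  // ...alias still dangles, and writing through it corrupts the heap
    return 0;
}
Every pointer that referred to the freed block has to stop being used, not just the one you passed to free().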