Basic OpenGL program crashes, but works with gdb - c++

I copied the first program in the OpenGL Reference Guide, an incredibly basic GLUT OpenGL program. I am using the Code::Blocks IDE, running on Ubuntu 12.10, with an ATI Mobility Radeon 4670 on the fglrx driver. I am using this code to make sure my environment is working properly.
Here is the code:
#include <GL/glut.h>
#include <stdlib.h>

void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_POLYGON);
    glVertex3f(0.25, 0.25, 0.0);
    glVertex3f(0.75, 0.25, 0.0);
    glVertex3f(0.75, 0.75, 0.0);
    glVertex3f(0.25, 0.75, 0.0);
    glEnd();
    glFlush();
}

void init(void)
{
    glClearColor(0.0, 0.0, 0.0, 0.0);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0, 1.0, 0.0, 1.0, -1.0, 1.0);
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutInitWindowSize(250, 250);
    glutInitWindowPosition(100, 100);
    glutCreateWindow("hello");
    init();
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
I figured this was the simplest code I could run and hope to compile. After some finagling with the include/ and lib/ directories (/usr/include and /usr/lib/x86_64-linux-gnu), I managed to get it to compile with no errors.
When run, either from within Code::Blocks or from the terminal, I get a segmentation fault. A shadow of a window appears, but then it is destroyed and the program exits.
The really strange part is when I try to debug it. Whether I select "Debug" from within Code::Blocks or use gdb myself on the command line (running gdb <program> and then run), it runs just fine. No errors or issues are encountered whatsoever, and it executes as expected.
This makes it extremely difficult for me to figure out what the problem is. I had gdb examine a core file generated by a normal run, but all it said was
Program terminated with signal 11, Segmentation fault.
#0 0x00007f9ee3a5815c in ?? ()
Real big help. Any ideas? I might have something wrong with my configuration, so ask away.

I'm not sure, since I've never used glut, but this site seems to suggest calling glutInitDisplayMode like this:
glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
While you call it like this
glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
Comparing the two makes me wonder whether you should swap your two parameters?
edit:
Never mind me; those are not two parameters, of course, they are simply OR'd together. It's 9 AM here and I haven't slept, so forgive the confusion.
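One experiment that may be worth trying, given that fglrx is involved: request a double-buffered visual and swap buffers instead of flushing. This is only a sketch of that experiment, not a confirmed fix for this driver; it is the same program as above with the display mode and the end-of-frame call changed:

```cpp
#include <GL/glut.h>

void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_POLYGON);
    glVertex3f(0.25, 0.25, 0.0);
    glVertex3f(0.75, 0.25, 0.0);
    glVertex3f(0.75, 0.75, 0.0);
    glVertex3f(0.25, 0.75, 0.0);
    glEnd();
    glutSwapBuffers();  /* swap instead of glFlush() for a double-buffered context */
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);  /* double- instead of single-buffered */
    glutInitWindowSize(250, 250);
    glutInitWindowPosition(100, 100);
    glutCreateWindow("hello");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```

If the crash disappears, the driver's single-buffered path is the likely culprit.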

Related

D3DERR_INVALIDCALL in d3d->CreateDevice causing window flicker on startup in Allegro 5 D3D program

I'm working on debugging a window-creation flicker when creating an Allegro 5 Direct3D window with multisampling enabled. I've narrowed the problem down to window creation in allegro's d3d_disp.cpp source file. However, I can't get any debug output from DirectX. The flickering only happens in D3D mode (not OpenGL) and only when multisampling is enabled. Also of note, this only happens when running the program on NVIDIA GPUs, not on my integrated Intel.
I'm running Windows 10.
I've tried debugging this in Visual Studio 2017, but it doesn't capture debug output from DX. I installed the DirectX debug symbols when installing the DirectX SDK from June 2010.
I've tried rebuilding allegro and linking to libd3dx9d.a in gcc, but I still can't step into DirectX function calls, and the symbols aren't loaded. There's no libd3d9d.a (note the d for debugging) library available, either in MinGW-W64 GCC 8.1 or in the DirectX SDK from June 2010.
I tried running my program through PIX, but it gives me an incompatibility error that I can't solve.
Here is a testable Allegro 5 example:
#include <allegro5/allegro.h>
#include <allegro5/allegro_color.h>
#include <allegro5/allegro_primitives.h>
#include <allegro5/allegro_direct3d.h>
#include <cstdio>
#include <climits>

int main(int argc, char **argv) {
    if (!al_init()) { return 1; }
    al_init_primitives_addon();
    al_install_keyboard();

    ALLEGRO_EVENT_QUEUE* queue = al_create_event_queue();
    if (!queue) { return 2; }
    al_register_event_source(queue, al_get_keyboard_event_source());

    al_set_new_display_option(ALLEGRO_SAMPLE_BUFFERS, 1, ALLEGRO_REQUIRE);
    al_set_new_display_option(ALLEGRO_SAMPLES, 8, ALLEGRO_SUGGEST);

    bool use_opengl = false;
    if (use_opengl) {
        al_set_new_display_flags(ALLEGRO_OPENGL);
    }
    else {
        al_set_new_display_flags(ALLEGRO_DIRECT3D);
    }

    ALLEGRO_DISPLAY *display = al_create_display(1024, 600);
    if (!display) { return 2; }

    if (use_opengl) {
        al_set_window_title(display, "OpenGL window");
    }
    else {
        al_set_window_title(display, "Direct3D window");
    }

    al_register_event_source(queue, al_get_display_event_source(display));

    al_clear_to_color(al_color_name("black"));
    al_draw_circle(500, 300, 200, al_color_name("white"), 5.0);
    al_draw_line(200, 200, 700, 300, al_color_name("white"), 5.0);
    al_flip_display();

    bool quit = false;
    while (!quit) {
        ALLEGRO_EVENT ev;
        al_wait_for_event(queue, &ev);
        if (ev.type == ALLEGRO_EVENT_KEY_DOWN && ev.keyboard.keycode == ALLEGRO_KEY_ESCAPE) { quit = true; }
        if (ev.type == ALLEGRO_EVENT_DISPLAY_CLOSE) { quit = true; }
    }
    return 0;
}
Ideally, the window wouldn't flicker upon creation; many programs that use Direct3D don't.
I've narrowed down the problem to failed calls to d3d->CreateDevice returning D3DERR_INVALIDCALL on these lines in src\win\d3d_disp.cpp in allegro's source code: https://github.com/liballeg/allegro5/blob/master/src/win/d3d_disp.cpp#L812-L837
I need help getting the debug output from DirectX, and nothing has worked so far. Any tips on debugging DirectX 9 with VS 2017 and/or MinGW-W64 GCC 8.1 and GDB, or other methods, are appreciated.
EDIT
An update on the things I've tried.
Defining D3D_DEBUG_INFO before including d3d9.h didn't seem to do anything when rebuilding allegro.
Enabling DirectX debugging output in the DirectX Control Panel (dxcpl) did nothing.
Trying to run my app through PIX results in an incompatibility error. It says the DirectX subversions don't match between the app and the PIX runtime. How do I build for a specific version of the DirectX DLL?
I found that when multisampling is enabled through the D3DPRESENT_PARAMETERS, the swap effect must be D3DSWAPEFFECT_DISCARD. Fixed that; nothing changed.
Still getting D3DERR_INVALIDCALL. I can't see anything in the presentation parameters that isn't initialized.
If I can't enable DirectX debug output, I really can't tell why this error is occurring.
Debugging tips welcome. I can see that window creation fails multiple times before it succeeds, and that is why the window flickers.
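For reference, the swap-effect constraint looks roughly like this when filling in the present parameters. This is a sketch, not allegro's actual code; the back-buffer format and sample count are assumptions:

```cpp
D3DPRESENT_PARAMETERS pp = {};
pp.Windowed           = TRUE;
pp.SwapEffect         = D3DSWAPEFFECT_DISCARD;    // required when multisampling is enabled
pp.BackBufferFormat   = D3DFMT_X8R8G8B8;          // assumed format
pp.MultiSampleType    = D3DMULTISAMPLE_8_SAMPLES; // assumed sample count
pp.MultiSampleQuality = 0;                        // must be less than the level count
                                                  // reported by CheckDeviceMultiSampleType
```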
EDIT2
It seems to be a problem with the BackBufferFormat specified, as that is the only difference between successful window creation and failure.
EDIT3
The BackBufferFormat is fine. The real difference was the quality level of multisampling being attempted. As per
https://github.com/liballeg/allegro5/blob/master/src/win/d3d_display_formats.cpp#L95
and
CheckDeviceMultiSampleType
there was an off-by-one error that made it attempt to set an invalid quality level: the quality level count indicates the number of levels, not the maximum index.
The flicker is gone but more testing needs to be done.
As for the supplementary question: how can I enable debug info with DirectX? Nothing I've done has worked, as per the above. I will award the answer to anyone who can help me get debug output from D3D and DX.
#Gull_Code If you like, you can clone my testing fork of allegro here :
https://github.com/EdgarReynaldo/allegro5/tree/test
Bugsquasher
Each search I made on the topic returned the same three solutions:
- Bad driver: update or downgrade it.
- Bad DirectX install: on the Microsoft forums they asked the user to uninstall DirectX, delete its root registry entry, and reinstall it.
- Many pointed out that it happens when the D3D struct is only partially initialized, rather than fully initialized.
I'll come back if I have more information.

How do I fix this runtime error?

Every time I compile my simple SDL 1.2 code, it compiles successfully, but when I try to run it via the terminal (Alt+T in Ubuntu):
./game
Segmentation fault (core dumped)
I get this error. Can you help, please? This is the code:
#include <SDL/SDL.h>

int main(int argc, char* args[])
{
    SDL_Init(SDL_INIT_EVERYTHING);
    SDL_Surface* screen;
    screen = SDL_SetVideoMode(640, 480, 32, SDL_HWSURFACE);
    SDL_Flip(screen);
    SDL_Delay(5000);
    SDL_FreeSurface(screen);
    SDL_Quit();
}
SDL_SetVideoMode returns NULL on error, which you do not check for.
Since you're running this via a terminal, I suspect you may have forgotten to tell Xorg to allow running from it. In fact, if this is really the problem it'll prevent any program from running when started that way.
To fix the problem, enter this into the terminal (this only needs to be done once per session):
xhost +
You should get a message that it was successful. I cannot recall the exact message, but it is something like this:
Clients are now allowed to connect from any host.
What was happening (assuming I was correct regarding xhost) was that the SDL_SetVideoMode() call was failing and returning NULL, because Xorg rejected the connection. Since you're not checking for that, SDL_Flip() ended up dereferencing a NULL pointer, hence the segfault.
SIDE NOTE: There is another error in your code: you should not call SDL_FreeSurface(screen);. That particular surface is special and is freed by SDL_Quit() automatically. Source (see the "Return Value" section): http://www.libsdl.org/release/SDL-1.2.15/docs/html/sdlsetvideomode.html
Check whether SDL_SetVideoMode() failed!
screen = SDL_SetVideoMode(640, 480, 32, SDL_HWSURFACE);
if (screen == NULL) {
    fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
    return 1;
}
Run it under valgrind. Or GDB. Or some other debugger of your choice.
And make sure screen was actually assigned a valid surface before you use it.

freeglut fails to open display with valgrind

I'm running the Eclipse IDE on Ubuntu 12.04 with the Valgrind plugin. I've been working on a game in C++ for quite some time, and I seem to have a memory error somewhere. Normally I trace these with Valgrind. However, glut fails to initialize when I run the program under valgrind. I should note that the game initializes without a problem when I'm not using valgrind. The code for my main function is as follows:
int main(int argc, char** argv) {
    char windowTitle[12] = "Game Window";
    printf("Initializing Glut...\n");
    glutInit(&argc, argv);
    printf("Glut initialized!\n");
    alutInit(&argc, argv);

    Game_Object* game = new Game_Object(windowTitle, 1200, 675, argc, argv);
    delete game;
    printf("game ended\n");
    return 0;
}
The resulting output to the console is:
Initializing Glut
freeglut (/home/dsnettleton/Documents/Programming/Eclipse/workspace/Plutoids/Debug/Plutoids): failed to open display ''
Obviously, the program isn't getting very far with valgrind running.
It's really disheartening to be in such a final stage of my development, only to get stuck trying to weed out a memory error. What might be keeping glut from initializing, and what can I do to fix the problem?
This is my guess: your IDE is probably missing the $DISPLAY environment variable. Somewhere you have to configure the environment to set $DISPLAY before launching Valgrind.
Launch a terminal and echo $DISPLAY. Its value is probably :0.0.
In the worst case, I'd try using setenv() inside the C code or set DISPLAY in the command line that launches Valgrind (none of these cases was tested, they may not work).
Also, you have to set this environment variable, DISPLAY=:0.0, inside Eclipse.
In the launch configuration of your executable, add a DISPLAY variable to the Environment tab, or select to inherit the environment:
Run -> Run Configurations... -> Environment
Now click New and add a variable named DISPLAY with the value :0.

Pointer Error Debugging Help

I am new to C++, and I am trying to program a replica of Pong using freeglut with Visual Studio 2010. My code was working fine until I made some revisions to it in order to make it more object oriented. When I ran it again, it worked a couple times, until I made some minor edit (to be honest, I can't remember what it was that I changed, since I only built my code the next morning), and I received this runtime error as soon as I ran the program:
Unhandled exception at 0x1000bbae in Pong.exe: 0xC0000005: Access violation writing location 0x000000a8.
I'm not particularly good at debugging, but I believe this is coming from the glutDisplayFunc() line in my main function. I think this has something to do with null pointers, but I have no idea what. Could somebody help me find my problem?
Below is the applicable portion of my program (Main.cpp):
#include "header.h"

using namespace std;

int main(int argc, char** argv) {
    //Initialize GLUT
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(WIDTH, HEIGHT); //Set the window size

    //Create the window
    //modesetup(2);

    //Set handler functions for drawing, keypresses, and window resizes
    glutDisplayFunc(mainDraw);
    glutKeyboardFunc(handleKeypress);
    glutKeyboardUpFunc(handleKeyup);
    glutSpecialFunc(handleSpecial);
    glutSpecialUpFunc(handleSpecialUp);
    glutReshapeFunc(handleResize);
    glutTimerFunc(25, update, 0);

    glutMainLoop(); //Start the main loop. glutMainLoop doesn't return.
    return 0; //This line is never reached
}
Thanks in advance!
EDIT: I just made a new program that uses very standard functions for everything, but I still get this error. I'm beginning to think that there may be something wrong with my freeglut installation.
Use F5 to run the program with debugging in Visual Studio. When you get the error, the debugger should place you right on the line with the illegal access. If it doesn't have source code for the location, then check up the call stack.
Cheers & hth.
Run your program under valgrind. It will give you a lot more information about the problem, and might even point out memory errors other than the immediate cause of the crash.
You probably already solved the problem, but since I had the same problem and couldn't find an answer to it anywhere, I will answer the question:
You simply forgot to write the
glutCreateWindow("Windowname");
function before the
glutDisplayFunc(mainDraw);
glutKeyboardFunc(handleKeypress);
glutKeyboardUpFunc(handleKeyup);
glutSpecialFunc(handleSpecial);
glutSpecialUpFunc(handleSpecialUp);
glutReshapeFunc(handleResize);
glutTimerFunc(25, update, 0);
part.
This leads to the exception you get.

Corruption of the heap & F12 Problem

I'm trying to draw a line using GLUT with C++ (the IDE is VS 2008), but an error message occurred:
Windows has triggered a breakpoint in Graphics.exe.
This may be due to a corruption of the heap, which indicates a bug in Graphics.exe or any of the DLLs it has loaded.
This may also be due to the user pressing F12 while Graphics.exe has focus.
The output window may have more diagnostic information.
Of course, I don't have any breakpoints in my code. This is my code:
#include <glut.h>

void init(void)
{
    glClearColor(1.0, 1.0, 1.0, 0.0);
    glMatrixMode(GL_PROJECTION);
    gluOrtho2D(0.0, 200.0, 0.0, 15.0);
} //end of the function init

void lineSegment(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glColor3f(1.0, 0.0, 0.0);

    // Draw a line
    glBegin(GL_LINES);
    glVertex2i(180, 15);
    glVertex2i(10, 145);
    glEnd();

    glFlush();
} //end of the function "lineSegment"

void main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutInitWindowPosition(50, 100);
    glutInitWindowSize(400, 300);
    glutCreateWindow("N.S");
    init();
    glutDisplayFunc(lineSegment);
    glutMainLoop();
} //end of the "Main" function
Anyone know the problem?
A little googling produced some results. It looks like F12 is reserved by the OS when you are running in the debugger. Here is a good thread about the subject, and there is a workaround available from Microsoft in this Connect article. The gist of it is that when a debugger is active, the OS responds to F12 by entering the debugger at exactly the line of code that is currently executing.
If you are not in a debugger, then this is probably a stack corruption problem. Your code snippet looks pretty simple, but I do not know GL well enough to know if you are missing a required call or breaking some other procedural rule.
Just to add to what D.Shawley's written: The F12 key is quite handy, once you know about it.
It's worth stressing that the F12 key is only active while a debugger is attached and this key acts normally when there's no debugger. Still, it's safer to avoid mapping the F12 shortcut to anything useful in your app, for those times when somebody needs to debug.
I got this same error message when programming in Visual Studio in C, totally unrelated to the F12 key. For anyone else programming in C who found this post via Google: my error was caused by a dangling pointer in my code.
Check all of your "free" statements and make sure you don't have any pointers left that refer to the memory you are deallocating.