gl calls end in EXC_BAD_ACCESS - bad OpenGL context?

I have the following program:
void allocVars(){
    m_window = new GLWindow(); // glGenTextures() is called //CRASH!
    m_window->Init(m_cam.w, m_cam.h, "Window Name");
}
void glInit()
{
    glutReshapeFunc(reshape);
    glutIdleFunc(idle);
    glutKeyboardFunc(keyboard);
    glutMouseFunc(mouse);
    glutDisplayFunc(display);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE); //CRASH!
}
int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glInit();       // CRASH HERE
    camInit();      // ok
    allocVars();    // CRASH HERE
    trackingInit();
    glutMainLoop();
    return 0;
}
According to other posts, in order to make gl calls I have to have a valid OpenGL context first. (For strange reasons it works on Windows even if the context is not valid yet.)
That is why I moved everything after the glutInit and glInit functions, but this application always crashes in gl functions like
glGenTextures() inside of allocVars(), in GLWindow(), or in
glBlendFunc() inside of glInit().
I wonder what I am missing here, and/or how I can check that I have a valid OpenGL context?
Thanks in advance

The answer is in the comments.
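For what it's worth, the usual cause of this kind of crash is that glutInit() alone does not create an OpenGL context; GLUT only creates one when glutCreateWindow() is called. One cheap sanity check is that glutGetWindow() returns 0 while no GLUT window (and therefore no context) is current. Below is a minimal sketch of an ordering that avoids the crash, assuming GLWindow ends up calling glutCreateWindow() somewhere; the display callback is the one registered in glInit() above.
int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutCreateWindow("Window Name");    // a context exists from here on
    // Only now is it safe to touch GL state:
    glBlendFunc(GL_SRC_ALPHA, GL_ONE);  // no longer crashes
    GLuint tex = 0;
    glGenTextures(1, &tex);             // would crash before the window existed
    glutDisplayFunc(display);           // register callbacks as in glInit()
    glutMainLoop();
    return 0;
}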

Related

OpenGL Optimization or...?

I'm making an OpenGL game, and I have a problem with optimization. When I start it, it does not respond. If in Update() I just put a for loop and _time += 0.1f, I get a blank screen.
void Update(){
    for(; ;){
        _time += 0.1f;
        Render();
    }
}
void Render() {
    glClearDepth(1.0);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    _colorProgram.use();
    GLuint timeLocation = _colorProgram.getUniformLocation("time");
    glUniform1f(timeLocation, _time);
    _sprite.Render();
    _colorProgram.unuse();
    glutSwapBuffers();
}
int main(int argc, char** argv) {
    std::printf("OpenGL version is %s",glGetString(GL_VERSION));
    // Window
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowPosition(520, 200);
    glutInitWindowSize(800, 600);
    glutInitDisplayMode(GLUT_DOUBLE);
    glutCreateWindow("OpenGL [ Shader #1 error pt 3 ]");
    // Setup GLEW
    if( GLEW_OK != glewInit()){
        return 1;
    } while( GL_NO_ERROR != glGetError() );
    // After creating window
    Init();
    glutDisplayFunc(Render);
    Update();
    glutMainLoop();
}
The infinite loop in Update() never lets GLUT pump the event queue.
Use glutTimerFunc() or glutIdleFunc() to call Update() instead. That way execution flow periodically returns to GLUT and GLUT can do what it needs to keep the OS happy.
The proper way to run an animation with GLUT is to use a timer function. This way, GLUT can get back to its main loop, and call your display function. You're not supposed to call the display function directly from your own code.
For example, register a timer function during initialization (the first argument is a time in milliseconds):
glutTimerFunc(10, timerFunc, 0);
Then in the timer function:
void timerFunc(int value) {
    _time += 0.1f;
    glutPostRedisplay();
    glutTimerFunc(10, timerFunc, 0);
}
There are two critical pieces in the code fragment above:
You do not call your Render() function directly. Instead, you call glutPostRedisplay() to tell GLUT that a redisplay is needed. It will then call your Render() function because you registered it as the display function with glutDisplayFunc().
You have to register the timer again. glutTimerFunc() fires the timer only once, not periodically, so you have to re-register it every time it fires.
There is one other problem in your code. You have these calls in your main():
glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
...
glutInitDisplayMode(GLUT_DOUBLE);
The flags passed in the second call will override the ones from the first call. While GLUT_RGBA is the default anyway, you will not get a depth buffer because GLUT_DEPTH is missing in the second call. You can simply remove the second call, since the first one is most likely what you want.
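In other words, the window setup keeps only the first call:
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);  // keep: double buffer, RGBA, depth buffer
    // glutInitDisplayMode(GLUT_DOUBLE);                        // remove: this would override the line above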

using a complex display function prevents glutKeyboardFunc from working

Using glutKeyboardFunc in a very simple example, I could get it to work very easily:
void special(int key, int x, int y) {
    printf("key %d\n", key);
}
void keyboard(unsigned char key, int x, int y) {
    printf("key %d\n", key);
}
void display() {}
int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);
    glutInitWindowSize(640, 480);
    glutCreateWindow("GLUT Program");
    glutDisplayFunc(display);
    glutKeyboardFunc(keyboard);
    glutSpecialFunc(special);
    glutMainLoop();
    return EXIT_SUCCESS;
}
But when using it with a very complex display function, the keyboard routine is never called. I am not sure which part prevents this call from working, so I am just asking here for ideas about what could be the cause; I can't really post the code because that would be the entire project... I am using freeglut with an OpenGL context and glm for math computation.
Nevertheless, here is the new call:
static void loop_function() {
    glutSetWindow(win);
    Scene::unique_scene->mainloop();
}
I am a bit lost about what to do to find the error; if someone could enlighten me, I would be grateful.
Having an infinite loop in the glutDisplayFunc is not the way to go; the display function is called again automatically at the end of the glutKeyboardFunc callback if you added the call
glutPostRedisplay();
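For example, a keyboard callback along these lines lets GLUT invoke the registered display function by itself (handle_key() is just a placeholder for whatever the project actually does with the key):
void keyboard(unsigned char key, int x, int y) {
    handle_key(key);        // hypothetical: update whatever state the key changes
    glutPostRedisplay();    // ask GLUT to run the display callback once
}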

GLUT displaying part of desktop instead of what's in my code

Last week, I tried a few examples I found while reading GLUT tutorials and everything was working great.
Now, when I retry those same examples, I get a weird behaviour: my GLUT window displays the part of the desktop that is at the place the window is (so if the GLUT window starts at (100,100) and is sized 500px by 500px, it displays the part of the desktop that starts at (100,100) and ends at (600,600)).
The following example that I tested last week should show a wire teapot on a black background. I'd like to know if there's something wrong with my code (since I don't recall changing anything in it), or if the problem comes from the library not being linked correctly, or if it's something else that I'm not aware of.
#include <GL/glut.h>
void displayFunc(void);
int glutWindow;
int main(int argc, char** argv){
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGBA | GLUT_SINGLE);
    // define the size
    glutInitWindowSize(500,500);
    // the position where the window will appear
    glutInitWindowPosition(100,100);
    glutWindow = glutCreateWindow("UITest");
    glClearColor(0.0, 0.0, 0.0, 0.0);
    glutDisplayFunc(displayFunc);
    glutMainLoop();
    return EXIT_SUCCESS;
}
void displayFunc(void){
    glClear(GL_COLOR_BUFFER_BIT);
    glutWireTeapot(0.5);
}
You need to add a call to glFlush() at the end of displayFunc(). With a single-buffered window (GLUT_SINGLE) there is no buffer swap to implicitly flush the pipeline, so nothing forces the queued commands to be executed otherwise.
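A minimal sketch of the corrected display callback, following that suggestion:
void displayFunc(void){
    glClear(GL_COLOR_BUFFER_BIT);
    glutWireTeapot(0.5);
    glFlush();  // force the queued GL commands to be executed
}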

glutCreateWindow(...) resetting my enabled flags?

Here's my initialization code:
const int WIN_HEIGHT = 640;
const int WIN_WIDTH = 640;
void main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    /* lines in question */
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_NORMALIZE);
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glShadeModel(GL_SMOOTH);
    glutInitWindowSize(WIN_WIDTH, WIN_HEIGHT);
    glutCreateWindow("OpenGL");
    glutDisplayFunc(Draw);
    glutKeyboardFunc(HandleInput);
    Initialize();
    glutMainLoop();
}
So, most of that code is pretty boilerplate for a basic 3D program. The problem is, if I put all the glEnable() lines before glutCreateWindow() they are reset. It's an easy enough fix to move them after creating the window (I moved them to my own Initialize() function), but why would glutCreateWindow() disable these?
I would say it's because no context has been created yet at the time of those calls; one only exists after your call to glutCreateWindow(). You are then able to set the flags once it is created, as can be interpreted from the following text:
In order for any OpenGL commands to work, a context must be current; all OpenGL commands affect the state of whichever context is current.
Since there was no context (or only an old, invalid one), your calls to glEnable() don't affect the current window's context.
Because glEnable is being done to the current window. Once you call glutCreateWindow you've made a new window and replaced your current one with it. This new window has a new OpenGL context. After creating the new window you can go ahead and enable and modify its context as you want.
Reference: http://www.opengl.org/documentation/specs/glut/spec3/node16.html
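Putting both answers together, a minimal sketch of the reordered initialization (Draw, HandleInput and Initialize are the names from the question; the enables simply move to after the window, and therefore the context, exists):
int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(WIN_WIDTH, WIN_HEIGHT);
    glutCreateWindow("OpenGL");      // the context exists from this point on
    glEnable(GL_DEPTH_TEST);         // these now affect the new window's context
    glEnable(GL_NORMALIZE);
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glShadeModel(GL_SMOOTH);
    glutDisplayFunc(Draw);
    glutKeyboardFunc(HandleInput);
    Initialize();
    glutMainLoop();
    return 0;
}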

Win7 GLUT window doesn't receive events

I have created a simple Visual Studio Express 2010 C++ project using GLUT and OpenGL.
It compiles and runs OK, except that the window it creates receives no events:
the close/minimize buttons don't do anything (not even mouseover), there is no context menu on the task bar on right click, and the window doesn't come to the foreground when clicked if partially covered.
The project is set up as a console application; I can close the program by closing the console.
I have this in main:
int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitWindowSize(window_width, window_height);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    glutCreateWindow("MyApp");
    glutIdleFunc(main_loop_function);
    GLenum err = glewInit();
    if (GLEW_OK != err)
    {
        /* Problem: glewInit failed, something is seriously wrong. */
        fprintf(stderr, "Error: %s\n", glewGetErrorString(err));
    }
    if (GLEW_VERSION_1_3)
    {
        fprintf(stdout, "OpenGL 1.3 is supported \n");
    }
    fprintf(stdout, "Status: Using GLEW %s\n", glewGetString(GLEW_VERSION));
    GL_Setup(window_width, window_height);
    glutMainLoop();
}
You're missing a display callback. Could you try:
void display();
int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitWindowSize(window_width, window_height);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    glutCreateWindow("MyApp");
    /// glutIdleFunc(main_loop_function);
    glutDisplayFunc(display);
    // ...
    glutMainLoop();
}
void display(){
    glClear(GL_COLOR_BUFFER_BIT|GL_DEPTH_BUFFER_BIT);
    glutSwapBuffers();
}
Also, your idle function seems problematic if you effectively loop inside that function. GLUT is callback-oriented: you don't need to create your own loop, but should instead rely on the idle, mouse, keyboard, display and resize callbacks. If you don't, you are going to miss window manager events.
EDIT:
If you want to create an animation, you could use:
glutIdleFunc(idleFunc);
void idleFunc(){
    glutPostRedisplay();
}