Win7 GLUT window doesn't receive events - c++

I have created a simple Visual Studio Express 2010 C++ project using GLUT and OpenGL.
It compiles and runs OK, except that the window it creates receives no events:
the close/minimize buttons don't do anything (not even a mouseover effect), there is no context menu when I right-click the taskbar entry, and the window doesn't come to the foreground when clicked if it is partially covered.
The project is set up as a console application, so I can close the program by closing the console.
I have this in main:
int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitWindowSize(window_width, window_height);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    glutCreateWindow("MyApp");
    glutIdleFunc(main_loop_function);

    GLenum err = glewInit();
    if (GLEW_OK != err)
    {
        /* Problem: glewInit failed, something is seriously wrong. */
        fprintf(stderr, "Error: %s\n", glewGetErrorString(err));
    }
    if (GLEW_VERSION_1_3)
    {
        fprintf(stdout, "OpenGL 1.3 is supported\n");
    }
    fprintf(stdout, "Status: Using GLEW %s\n", glewGetString(GLEW_VERSION));

    GL_Setup(window_width, window_height);
    glutMainLoop();
}

You're missing a display callback. Could you try:
void display();

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitWindowSize(window_width, window_height);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    glutCreateWindow("MyApp");
    // glutIdleFunc(main_loop_function);
    glutDisplayFunc(display);
    // ...
    glutMainLoop();
}

void display() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glutSwapBuffers();
}
Also, your idle function looks suspect if you effectively loop inside it. GLUT is callback-oriented: you don't create your own loop, you rely on the idle, mouse, keyboard, display and resize callbacks. If you loop inside a callback yourself, you are going to miss window manager events.
EDIT:
If you want to create an animation, you could use:
glutIdleFunc(idleFunc);
void idleFunc() {
    glutPostRedisplay();
}
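Alternatively, if you just want a fixed update rate, a timer callback avoids burning CPU in the idle function. A minimal sketch (the timer name and the 16 ms interval are only illustrative):

void timer(int /*value*/) {
    glutPostRedisplay();          // ask GLUT to call the display callback again
    glutTimerFunc(16, timer, 0);  // re-arm the timer (roughly 60 updates per second)
}

// in main(), after glutCreateWindow(...):
// glutTimerFunc(16, timer, 0);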

Related

How to show two different windows (not at the same time) in OpenGL

I want to create two different windows in OpenGL, but they should not appear together.
When I run my code I currently see both windows at the same time.
My aim instead is to show the first window at the beginning; then the first window should be closed and the second one should appear. My pseudocode is the following (it currently shows the two windows together):
int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    initRendering();
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(500, 500);

    //window1 =
    glutCreateWindow("First window");
    glutDisplayFunc(drawScene1);
    glutKeyboardFunc(handleKeypress);
    glutReshapeFunc(handleResize);

    cout << "Before inserting 'choice' I want to visualize the first window.\n"
            "After the insert of 'choice' I want to visualize the second window\n"
            "and close the first one!";
    cin >> choice;

    // create the second window
    //window2 =
    glutCreateWindow("Second Window");
    // define a window position for the second window
    glutPositionWindow(540, 40);
    // register callbacks for the second window, which is now current
    glutReshapeFunc(handleResize);
    glutDisplayFunc(drawScene2);
    glutKeyboardFunc(handleKeypress);

    glutMainLoop();
    return 0;
}
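No answer is quoted here, but the usual approach is not to block on cin before glutMainLoop(): create only the first window up front and switch windows from inside a callback. A rough sketch under that assumption, reusing the drawScene1/drawScene2/handleResize/handleKeypress names from the pseudocode above (the space-bar trigger is arbitrary):

int window1 = 0;

void handleKeypress(unsigned char key, int x, int y) {
    if (key == ' ' && window1 != 0) {
        int oldWindow = window1;
        window1 = 0;
        // create the second window first, then get rid of the first one
        glutCreateWindow("Second Window");   // becomes the current window
        glutPositionWindow(540, 40);
        glutReshapeFunc(handleResize);       // register callbacks for the new window
        glutDisplayFunc(drawScene2);
        glutDestroyWindow(oldWindow);
    }
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(500, 500);
    window1 = glutCreateWindow("First window");
    glutDisplayFunc(drawScene1);
    glutKeyboardFunc(handleKeypress);
    glutMainLoop();
}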

GLEW/GLUT: After calling init and creating a window, deinitialize and re-init?

Essentially, I am trying to understand how to control the GLEW/GLUT setup. The first objective here is the "hello-world" case where a window is initialized:
int argc = 0;
char** argv = (char**) calloc(1, sizeof(char*));
argv[0] = (char*) calloc(1, sizeof(char));
argv[0][0] = '\0';

glutInit(&argc, argv);
glutInitWindowSize(500, 500);
glutInitWindowPosition(100, 100);
glutCreateWindow("foo");

std::this_thread::sleep_for(std::chrono::seconds(1));  // needs <thread> and <chrono>

glutDestroyWindow(glutGetWindow());
free(argv[0]);
free(argv);
From a "controls" perspective, it seems as if I cannot get back to the runtime state that existed prior to calling glutInit, in the sense that I cannot "re-initialize" GLUT without getting a segmentation fault.
So, once GLUT is initialized, it appears to be initialized forever. This seems strange.
How does one tear everything down after GLUT is initialized, so that it can be re-initialized? Every method and setting I have tried leaves a window that will not close until the code exits.
There must be some sort of "GLEW/GLUT teardown and exit everything" function...?
Or is every GLEW/GLUT window a one-way ticket?
Firstly, freeglut does not cause a segmentation fault on re-initialization: it will simply notify you of the attempt and terminate the program, which is not a segmentation fault.
Secondly, yes, it is possible to deinitialize and reinitialize freeglut. The fgDeinitialize() function should do this.
Something like this:
void fgDeinitialize(void);  // declare it above main(); the definition lives inside freeglut

int main(int argc, char** argv, char** env)
{
    int Window, i = 0;

    glutInit(&i, NULL);  // I don't want to pass any arguments
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH | GLUT_ALPHA);
    Window = glutCreateWindow("frog");  // create the window
    glutDestroyWindow(Window);          // destroy the window

    printf("reinitialize freeglut\n");
    fgDeinitialize();                   // tear freeglut down

    glutInit(&i, NULL);                 // initialize it again
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH | GLUT_ALPHA);
    glutInitWindowSize(1024, 768);
    glutInitWindowPosition(100, 100);
    Window = glutCreateWindow("Second Window!");
    if (!Window) exit(-1);              // no window returned means an error

    ... some other code ...
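Note that fgDeinitialize() is an internal freeglut symbol rather than part of the documented API, so this depends on the freeglut implementation you link against. If the goal is just to get control back after the window is gone, the documented freeglut extensions may already be enough; a minimal sketch, assuming a freeglut build that ships <GL/freeglut.h>:

#include <GL/freeglut.h>   // freeglut header: declares glutSetOption and glutLeaveMainLoop

void display() {
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

void keyboard(unsigned char key, int /*x*/, int /*y*/) {
    if (key == 27)             // Esc: leave the main loop instead of exiting the process
        glutLeaveMainLoop();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    // Make glutMainLoop() return instead of calling exit() when the loop is left.
    glutSetOption(GLUT_ACTION_ON_WINDOW_CLOSE, GLUT_ACTION_CONTINUE_EXECUTION);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutCreateWindow("First window");
    glutDisplayFunc(display);
    glutKeyboardFunc(keyboard);

    glutMainLoop();            // returns once glutLeaveMainLoop() runs or the window closes

    // The process is still alive here, so further glut* calls (e.g. creating a second
    // window) are possible without tearing everything down first.
    return 0;
}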

C++ update OpenGL/Glut window

I've been working through some OpenGL C++ training.
But I have a logic problem: how can I update my OpenGL window on Windows?
It should draw text 1, then wait 1-2 seconds, then draw text 2, but right now both are drawn at the same time. Can anyone help or give a hint?
void text() {
    wait(1);
    Sleep(1000);

    std::string text_one;
    text_one = "Text 1";
    glColor3f(1, 01, 0);
    drawText(text_one.data(), text_one.size(), 050, 150);

    glutPostRedisplay();
    wait(1);

    std::string text_two;
    text_two = "Text 2";
    glColor3f(1, 0, 0);
    drawText(text_two.data(), text_two.size(), 250, 150);
}
and here is main:
int main(int argc, char **argv) {
    // init GLUT and create the window
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowPosition(100, 100);
    glutInitWindowSize(640, 640);
    glutCreateWindow("Test 001");

    // register callbacks
    glutDisplayFunc(renderScene);
    glutIdleFunc(text);

    // enter the GLUT event processing cycle
    glutMainLoop();
    return 1;
}
You should render in the renderScene callback; it will be called automatically at your screen's refresh rate. If you want a delay, you need to implement it inside this callback (or in functions called from it). If you don't, you will miss window manager events.
So basically you need to re-render everything every 1/60 of a second.
An easy delay can look something like this:
void renderScene() {
    time += deltaTime;        // accumulate frame time (time/deltaTime/delayTime kept elsewhere)
    RenderText1();            // the first text is always drawn
    if (time > delayTime)
        RenderText2();        // the second text appears once the delay has elapsed
    glutSwapBuffers();
}
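A self-contained variant of the same idea, using glutGet(GLUT_ELAPSED_TIME) (milliseconds since glutInit()) so that no global deltaTime bookkeeping is needed; RenderText1/RenderText2 stand in for the drawText calls from the question:

void renderScene() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    int elapsedMs = glutGet(GLUT_ELAPSED_TIME);   // milliseconds since glutInit()

    RenderText1();                 // "Text 1" is visible immediately
    if (elapsedMs > 1000)          // after roughly one second...
        RenderText2();             // ..."Text 2" appears as well

    glutSwapBuffers();
    glutPostRedisplay();           // keep redrawing so the change actually shows up
}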

Setting up GLUT with VisualStudio 2012 on Windows 8

I'm running into trouble setting up GLUT (3.7.6 binaries obtained from Nate Robins) on Windows 8 64-bit with VS2012. glut32.dll is copied to the SysWOW64 directory, both the include and lib paths are set in my project files, and the libraries are set in the Linker->Input settings ("...;glut32.lib;glu32.lib;opengl32.lib;...").
My code looks like this:
#include <GL/glut.h>

void display()
{
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutDisplayFunc(display);
    glutMainLoop();
}
The build process is successful but the application crashes with the following error message:
Unhandled exception at 0x1000BBAE (glut32.dll) in HelloOpenGL.exe: 0xC0000005: Access violation writing location 0x000000A8.
The setup seems fairly simple. Any ideas what I'm missing?
The call to glutDisplayFunc() without opening a window caused the crash. Here is the updated code, which creates a window before registering the display function:
#include <GL/glut.h>

void display()
{
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    // Set the display mode
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    // Set the window size
    glutInitWindowSize(250, 250);
    // Set the window position
    glutInitWindowPosition(100, 100);
    // Create the window
    glutCreateWindow("Hello OpenGL");
    // Register the display function
    glutDisplayFunc(display);
    // Enter the main loop
    glutMainLoop();
}

gl calls end in EXC_BAD_ACCESS - bad OpenGL context?

I have the following program
void allocVars() {
    m_window = new GLWindow();   // glGenTextures() is called here -- CRASH!
    m_window->Init(m_cam.w, m_cam.h, "Window Name");
}

void glInit()
{
    glutReshapeFunc(reshape);
    glutIdleFunc(idle);
    glutKeyboardFunc(keyboard);
    glutMouseFunc(mouse);
    glutDisplayFunc(display);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE);   // CRASH!
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glInit();        // CRASH HERE
    camInit();       // ok
    allocVars();     // CRASH HERE
    trackingInit();
    glutMainLoop();
    return 0;
}
According to other posts, in order to make gl calls I have to have a valid OpenGL context first. (For strange reasons, on Windows it works even if the context is not valid yet.)
That is why I moved everything after the glutInit and glInit calls, but this application always crashes in gl functions like
glGenTextures() inside allocVars() (in GLWindow()), or
glBlendFunc() inside glInit().
I wonder what I am missing here and/or how I can check that I have a valid OpenGL context?
Thanks in advance.
The answer is in the comments above.
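For reference, the fix those comments point to is presumably the one the question already suspects: a GL context must exist, which with GLUT means glutCreateWindow() must have been called, before any gl* call such as glGenTextures() or glBlendFunc(). A minimal reordering sketch of the main() above (allocVars/camInit/trackingInit/glInit are the question's own helpers; the window title is made up):

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);
    glutCreateWindow("Window Name");   // a valid GL context exists from this point on

    glInit();        // registering callbacks and calling glBlendFunc() is now safe
    camInit();
    allocVars();     // glGenTextures() inside GLWindow() now has a context
    trackingInit();

    glutMainLoop();
    return 0;
}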