Here's my initialization code:
#include <GL/glut.h>

const int WIN_HEIGHT = 640;
const int WIN_WIDTH = 640;

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    /* lines in question */
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_NORMALIZE);
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glShadeModel(GL_SMOOTH);
    glutInitWindowSize(WIN_WIDTH, WIN_HEIGHT);
    glutCreateWindow("OpenGL");
    glutDisplayFunc(Draw);
    glutKeyboardFunc(HandleInput);
    Initialize();
    glutMainLoop();
    return 0; // never reached; glutMainLoop() does not return
}
Most of that code is pretty standard boilerplate for a basic 3D program. The problem is that if I put all the glEnable() calls before glutCreateWindow(), they are reset. It's an easy enough fix to move them after creating the window (I moved them into my own Initialize() function), but why would glutCreateWindow() disable them?
I would say it's because a context has not yet been created before your call to glutCreateWindow(). You are then able to set them after one is created, as can be seen from the following text:
In order for any OpenGL commands to work, a context must be current; all OpenGL commands affect the state of whichever context is current.
Since there was no context yet (or only an old, invalid one), your calls to glEnable() didn't affect the current window's context.
Because glEnable() applies to the current window's context. Once you call glutCreateWindow(), you've made a new window and replaced your current one with it. This new window has a new OpenGL context. After creating the new window, you can go ahead and enable and modify its context as you want.
Reference: http://www.opengl.org/documentation/specs/glut/spec3/node16.html
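To illustrate, a minimal sketch of the fix the question already describes, assuming the glEnable() calls are moved into the Initialize() that runs after glutCreateWindow():

void Initialize(void) {
    // Safe here: glutCreateWindow() has already made a context current.
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_NORMALIZE);
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glShadeModel(GL_SMOOTH);
}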
I've been trying some OpenGL C++ training, but I have a logic problem: how can I update my OpenGL window over time?
It should draw the first text, then wait 1-2 seconds, then draw the second text; right now both are drawn at the same time. Can anyone help or give a hint?
void text() {
    wait(1);      // wait() is my own delay helper
    Sleep(1000);
    std::string text_one = "Text 1";
    glColor3f(1, 1, 0);
    drawText(text_one.data(), text_one.size(), 50, 150);
    glutPostRedisplay();
    wait(1);      // wait() is my own delay helper
    std::string text_two = "Text 2";
    glColor3f(1, 0, 0);
    drawText(text_two.data(), text_two.size(), 250, 150);
}
and here is the main:
int main(int argc, char **argv) {
    // init GLUT and create window
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowPosition(100, 100);
    glutInitWindowSize(640, 640);
    glutCreateWindow("Test 001");
    // register callbacks
    glutDisplayFunc(renderScene);
    glutIdleFunc(text);
    // enter GLUT event processing cycle
    glutMainLoop();
    return 1;
}
You should render in the renderScene callback; it will be called automatically, at roughly your screen's refresh rate. If you want a delay, you need to implement it inside this callback (or in functions called from it).
So basically you need to re-render everything every 1/60 of a second.
If you want to implement a simple delay, you can do something like this:
void renderScene() {
    time += deltaTime;   // time, deltaTime, delayTime: globals you maintain yourself
    RenderText1();
    if (time > delayTime)
        RenderText2();
    glutSwapBuffers();
}
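For a more concrete version of that idea, here is a minimal sketch that uses glutGet(GLUT_ELAPSED_TIME), so no extra timekeeping globals are needed. The drawText() helper, colors, and coordinates are the ones from the question; the 2000 ms threshold is an assumption. With this as the display callback, the glutIdleFunc(text) registration can be dropped:

void renderScene() {
    glClear(GL_COLOR_BUFFER_BIT);
    std::string text_one = "Text 1";
    glColor3f(1, 1, 0);
    drawText(text_one.data(), text_one.size(), 50, 150);
    // glutGet(GLUT_ELAPSED_TIME) returns milliseconds since glutInit().
    if (glutGet(GLUT_ELAPSED_TIME) > 2000) {
        std::string text_two = "Text 2";
        glColor3f(1, 0, 0);
        drawText(text_two.data(), text_two.size(), 250, 150);
    }
    glutSwapBuffers();
    glutPostRedisplay(); // keep re-rendering so the second text eventually appears
}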
I'm making an OpenGL game, and I have a problem with optimization: when I start the program, it does not respond. If in Update() I just put a for loop that does _time += 0.1f and calls Render(), I get a blank screen.
void Update() {
    for (;;) {
        _time += 0.1f;
        Render();
    }
}

void Render() {
    glClearDepth(1.0);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    _colorProgram.use();
    GLuint timeLocation = _colorProgram.getUniformLocation("time");
    glUniform1f(timeLocation, _time);
    _sprite.Render();
    _colorProgram.unuse();
    glutSwapBuffers();
}
int main(int argc, char** argv) {
    std::printf("OpenGL version is %s", glGetString(GL_VERSION));
    // Window
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowPosition(520, 200);
    glutInitWindowSize(800, 600);
    glutInitDisplayMode(GLUT_DOUBLE);
    glutCreateWindow("OpenGL [ Shader #1 error pt 3 ]");
    // Setup GLEW
    if (GLEW_OK != glewInit()) {
        return 1;
    }
    // Drain any errors left over from GLEW initialization
    while (GL_NO_ERROR != glGetError());
    // After creating window
    Init();
    glutDisplayFunc(Render);
    Update();
    glutMainLoop();
}
The infinite loop in Update() never lets GLUT pump the event queue.
Use glutTimerFunc() or glutIdleFunc() to call Update() instead. That way execution flow periodically returns to GLUT and GLUT can do what it needs to keep the OS happy.
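For instance, a minimal sketch of the idle-function variant, reusing the question's _time and its Render() registered as the display callback:

void Update() {
    _time += 0.1f;
    glutPostRedisplay(); // ask GLUT to call Render() when it gets back to its loop
}

// in main(), instead of calling Update() directly:
glutIdleFunc(Update);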
The proper way to run an animation with GLUT is to use a timer function. This way, GLUT can get back to its main loop, and call your display function. You're not supposed to call the display function directly from your own code.
For example, register a timer function during initialization (the first argument is a time in milliseconds):
glutTimerFunc(10, timerFunc, 0);
Then in the timer function:
void timerFunc(int value) {
    _time += 0.1f;
    glutPostRedisplay();
    glutTimerFunc(10, timerFunc, 0);
}
There are two critical pieces in the code fragment above:
You do not call your Render() function directly. Instead, you call glutPostRedisplay() to tell GLUT that a redisplay is needed. It will then call your Render() function because you registered it as the display function with glutDisplayFunc().
You have to register the timer again. glutTimerFunc() fires the timer only once, not periodically, so you have to re-register it each time it fires.
There is one other problem in your code. You have these calls in your main():
glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
...
glutInitDisplayMode(GLUT_DOUBLE);
The flags passed in the second call override the ones from the first call. While GLUT_RGBA is the default anyway, you will not get a depth buffer because GLUT_DEPTH is missing from the second call. You can simply remove the second call, since the first one is most likely what you want.
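Putting it together, a minimal sketch of a corrected main(), keeping the question's Init() and Render() and the timerFunc() shown above:

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA); // one call, with depth
    glutInitWindowPosition(520, 200);
    glutInitWindowSize(800, 600);
    glutCreateWindow("OpenGL [ Shader #1 error pt 3 ]");
    if (GLEW_OK != glewInit())
        return 1;
    Init();
    glutDisplayFunc(Render);
    glutTimerFunc(10, timerFunc, 0); // drives the animation instead of Update()
    glutMainLoop();
    return 0;
}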
I'm using Sumanta Guha's code sample, and I'm trying to create two windows using the following code:
int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);

    // First top-level window definition.
    glutInitWindowSize(250, 500);
    glutInitWindowPosition(100, 100);

    // Create the first window and return id.
    id1 = glutCreateWindow("windows.cpp - window 1");

    // Initialization, display, and other routines of the first window.
    setup1();
    glutDisplayFunc(drawScene1);
    glutReshapeFunc(resize1);
    glutKeyboardFunc(keyInput); // Routine is shared by both windows.

    // Second top-level window definition.
    glutInitWindowSize(250, 500);
    glutInitWindowPosition(400, 100);

    // Create the second window and return id.
    id2 = glutCreateWindow("windows.cpp - window 2");

    // Initialization, display, and other routines of the second window.
    setup2();
    glutDisplayFunc(drawScene2);
    glutReshapeFunc(resize2);
    glutKeyboardFunc(keyInput); // Routine is shared by both windows.

    glutMainLoop();
    return 0;
}
I'm using Windows 7, and this should normally display two windows. But only one window displays properly; the other one doesn't seem to work at all. Are there additional steps that I have to take other than GLUT_DOUBLE and the buffer swap?
Are there additional steps that I have to take other than GLUT_DOUBLE and buffer swap?
Since you are creating multiple windows, you have to call glutSetWindow() to select which window your GL and GLUT calls affect.
freeglut has an extension (which doesn't work) for creating a shared OpenGL context, but the original GLUT doesn't support it.
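As an illustration, a minimal sketch of the shared keyboard callback, assuming the id1/id2 globals from the question: select each window explicitly before issuing commands intended for it.

void keyInput(unsigned char key, int x, int y) {
    // Redraw both windows, selecting each one first.
    glutSetWindow(id1);
    glutPostRedisplay();
    glutSetWindow(id2);
    glutPostRedisplay();
}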
Last week, I tried a few examples I found while reading GLUT tutorials and everything was working great.
Now, when I retry those same examples, I get weird behaviour: my GLUT window displays the part of the desktop that it covers (so if the GLUT window starts at (100,100) and is 500px by 500px, it displays the part of the desktop that starts at (100,100) and ends at (600,600)).
The following example, which worked last week, should show a wire teapot on a black background. I'd like to know if there's something wrong with my code (I don't recall changing anything in it), if the library is not linked correctly, or if it's something else that I'm not aware of.
#include <GL/glut.h>
#include <cstdlib> // for EXIT_SUCCESS

void displayFunc(void);

int glutWindow;

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGBA | GLUT_SINGLE);
    // define the size
    glutInitWindowSize(500, 500);
    // the position where the window will appear
    glutInitWindowPosition(100, 100);
    glutWindow = glutCreateWindow("UITest");
    glClearColor(0.0, 0.0, 0.0, 0.0);
    glutDisplayFunc(displayFunc);
    glutMainLoop();
    return EXIT_SUCCESS;
}

void displayFunc(void) {
    glClear(GL_COLOR_BUFFER_BIT);
    glutWireTeapot(0.5);
}
You need to add a call to glFlush() at the end of displayFunc(). With a single-buffered window (GLUT_SINGLE), issued rendering commands may sit in the driver's queue indefinitely; glFlush() forces them to execute, and without it nothing is ever drawn into the window.
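A minimal sketch of the fixed callback:

void displayFunc(void) {
    glClear(GL_COLOR_BUFFER_BIT);
    glutWireTeapot(0.5);
    glFlush(); // force queued commands to execute; required in single-buffered mode
}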
I have the following program
void allocVars() {
    m_window = new GLWindow(); // glGenTextures() is called here // CRASH!
    m_window->Init(m_cam.w, m_cam.h, "Window Name");
}

void glInit()
{
    glutReshapeFunc(reshape);
    glutIdleFunc(idle);
    glutKeyboardFunc(keyboard);
    glutMouseFunc(mouse);
    glutDisplayFunc(display);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE); // CRASH!
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glInit();      // CRASH HERE
    camInit();     // ok
    allocVars();   // CRASH HERE
    trackingInit();
    glutMainLoop();
    return 0;
}
According to other posts, in order to make GL calls I have to have a valid OpenGL context first. (For strange reasons, on Windows it sometimes works even when the context is not valid yet.)
That is why I moved everything after the glutInit() and glInit() calls, but this application always crashes in GL functions like
glGenTextures() inside allocVars(), in GLWindow(), or
glBlendFunc() inside glInit().
I wonder what I am missing here, and how can I check that I have a valid OpenGL context?
Thanks in advance.
The answer is in the comments above: glutInit() alone does not create an OpenGL context. This code never calls glutCreateWindow(), so no context is ever made current, and every GL call, such as glBlendFunc() and glGenTextures(), crashes.
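A minimal sketch of a reordered main(), where the helper functions are the question's own and the display-mode flags and window size are assumptions:

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH); // assumed flags
    glutInitWindowSize(640, 480);                              // assumed size
    glutCreateWindow("Window Name"); // creates the context; GL calls are safe from here on
    glInit();      // glBlendFunc() now has a context
    camInit();
    allocVars();   // glGenTextures() now has a context
    trackingInit();
    glutMainLoop();
    return 0;
}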