SDL_GL_SwapBuffers() is intermittently slow - c++

I have an SDL/OpenGL game I am working on for fun. I get a decent FPS on average, but movement is really choppy because SDL_GL_SwapBuffers() will randomly take a crazy long time to return. With textures loaded and written to the buffer it sometimes takes over 100 ms! I cut out a lot of my code to try to figure out whether it was something I did wrong, but I haven't had much luck. Even this bare-bones program will still block for up to 70 ms at times.
Main:
// Don't forget to link to opengl32, glu32, SDL_image.lib
// includes
#include <stdio.h>
// SDL
#include <cstdlib>
#include <SDL/SDL.h>
// Video
#include "videoengine.h"
int main(int argc, char *argv[])
{
// begin SDL
if ( SDL_Init(SDL_INIT_VIDEO) != 0 )
{
printf("Unable to initialize SDL: %s\n", SDL_GetError());
}
// begin video class
VideoEngine videoEngine;
// BEGIN MAIN LOOP
bool done = false;
while (!done)
{
int loopStart = SDL_GetTicks();
printf("STARTING SWAP BUFFER : %d\n", SDL_GetTicks() - loopStart);
SDL_GL_SwapBuffers();
int total = SDL_GetTicks() - loopStart;
if (total > 6)
printf("END LOOP : %d ------------------------------------------------------------>\n", total);
else
printf("END LOOP : %d\n", total);
}
// END MAIN LOOP
return 0;
}
My "VideoEngine" constructor:
VideoEngine::VideoEngine()
{
UNIT = 16;
SCREEN_X = 320;
SCREEN_Y = 240;
SCALE = 1;
// Begin Initalization
SDL_Surface *screen;
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 ); // [!] SDL_GL_SetAttribute must be called BEFORE SDL_SetVideoMode
screen = SDL_SetVideoMode( SCALE*SCREEN_X, SCALE*SCREEN_Y, 16, SDL_OPENGL ); // Set screen to the window with opengl
if ( !screen ) // make sure the window was created
{
printf("Unable to set video mode: %s\n", SDL_GetError());
}
// set opengl state
opengl_init();
// End Initalization
}
void VideoEngine::opengl_init()
{
// Set the OpenGL state after creating the context with SDL_SetVideoMode
//glClearColor( 0, 0, 0, 0 ); // sets screen buffer to black
//glClearDepth(1.0f); // Tells OpenGL what value to reset the depth buffer when it is cleared
glViewport( 0, 0, SCALE*SCREEN_X, SCALE*SCREEN_Y ); // sets the viewport to the default resolution (SCREEN_X x SCREEN_Y) multiplied by SCALE. (x,y,w,h)
glMatrixMode( GL_PROJECTION ); // Applies subsequent matrix operations to the projection matrix stack.
glLoadIdentity(); // Replaces the current matrix with the identity matrix
glOrtho( 0, SCALE*SCREEN_X, SCALE*SCREEN_Y, 0, -1, 1 ); //describes a transformation that produces a parallel projection
glMatrixMode( GL_MODELVIEW ); // Applies subsequent matrix operations to the modelview matrix stack.
glEnable(GL_TEXTURE_2D); // Need this to display a texture
glLoadIdentity(); // Replaces the current matrix with the identity matrix
glEnable(GL_BLEND); // Enable blending for transparency
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); // Specifies pixel arithmetic
//glDisable( GL_LIGHTING ); // Disable lighting
//glDisable( GL_DITHER ); // Disable dithering
//glDisable( GL_DEPTH_TEST ); // Disable depth testing
//Check for error
GLenum error = glGetError();
if( error != GL_NO_ERROR )
{
printf( "Error initializing OpenGL! %s\n", gluErrorString( error ) );
}
return;
}
I'm starting to think I might have a hardware issue? I've never had this problem with a game before, though.

SDL does use the SwapIntervalEXT extension, so you can make sure that buffer swaps are as fast as possible (vsync disabled). Also, a buffer swap is not a simple operation: by default OpenGL may need to copy the contents of the back buffer to the front buffer, in case you want to glReadPixels() from it afterwards. On Windows this behavior can be controlled via WGL_ARB_pixel_format, using WGL_SWAP_EXCHANGE_ARB (you can read about all this in the extension specs; I'm not sure offhand whether there is an equivalent for Linux).
And then on top of all that, there is the windowing system, which can cause a lot of trouble of its own. Also, if some errors are generated ...
This behavior is probably OK if you're running on a small mobile GPU.
SDL_GL_SwapBuffers() itself only contains a call to glXSwapBuffers() / wglSwapBuffers(), so no time is spent inside SDL itself.
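If you want to rule vsync in or out on the SDL side, here is a minimal sketch (assuming SDL 1.2.10 or newer, where the SDL_GL_SWAP_CONTROL attribute exists; like the other GL attributes it must be set before SDL_SetVideoMode):
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_GL_SetAttribute(SDL_GL_SWAP_CONTROL, 0); // 0 = swap immediately, 1 = wait for vblank
SDL_Surface *screen = SDL_SetVideoMode(320, 240, 16, SDL_OPENGL);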

Related

ImGui with the glad openGL loader throws segmentation fault (core dumped)

I am new to the ImGui library and recently I've been trying out the included examples. Everything worked like a charm until I changed the include (and functions) from gl3w to glad (the loader I would like to use). The moment I swapped between the two loaders I got a segmentation fault inside the imgui_impl_glfw_gl3.cpp file. I found a post suggesting that this may happen because some functions fail to "bind" and produce null pointers.
I have located the error at line 216 of imgui_impl_glfw_gl3.cpp. This is the code on line 216:
glGetIntegerv(GL_TEXTURE_BINDING_2D, &last_texture);
I have also changed the include file in imgui_impl_glfw_gl3.cpp from gl3w to glad with no results.
This is the main function i am executing (it's the basic opengl3 example of imgui using glad):
#include "gui/imgui.h"
#include "gui/imgui_impl_glfw_gl3.h"
#include <stdio.h>
#include <glad/glad.h> // This example is using gl3w to access OpenGL functions (because it is small). You may use glew/glad/glLoadGen/etc. whatever already works for you.
#include <GLFW/glfw3.h>
static void error_callback(int error, const char* description)
{
fprintf(stderr, "Error %d: %s\n", error, description);
}
int main(int, char**)
{
// Setup window
glfwSetErrorCallback(error_callback);
if (!glfwInit())
return 1;
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
GLFWwindow* window = glfwCreateWindow(1280, 720, "ImGui OpenGL3 example", NULL, NULL);
glfwMakeContextCurrent(window);
glfwSwapInterval(1); // Enable vsync
glfwInit();
// Setup ImGui binding
ImGui_ImplGlfwGL3_Init(window, true);
// Setup style
//ImGui::StyleColorsDark();
ImGui::StyleColorsClassic();
bool show_demo_window = true;
bool show_another_window = false;
bool algo = true;
ImVec4 clear_color = ImVec4(0.45f, 0.55f, 0.60f, 1.00f);
// Main loop
while (!glfwWindowShouldClose(window))
{
glfwPollEvents();
ImGui_ImplGlfwGL3_NewFrame();
// 1. Show a simple window.
// Tip: if we don't call ImGui::Begin()/ImGui::End() the widgets automatically appears in a window called "Debug".
{
static float f = 0.0f;
static int counter = 0;
ImGui::Text("Hello, world!"); // Display some text (you can use a format string too)
ImGui::SliderFloat("float", &f, 0.0f, 1.0f); // Edit 1 float using a slider from 0.0f to 1.0f
ImGui::ColorEdit3("COLORINES", (float*)&clear_color); // Edit 3 floats representing a color
ImGui::Checkbox("Demo Window", &show_demo_window); // Edit bools storing our windows open/close state
ImGui::Checkbox("Booleanooooo", &algo);
ImGui::Checkbox("Another Window", &show_another_window);
if (ImGui::Button("Button")) // Buttons return true when clicked (NB: most widgets return true when edited/activated)
counter++;
ImGui::SameLine();
ImGui::Text("counter = %d", counter);
ImGui::Text("pues se ve que hay texto: %d", algo);
ImGui::Text("Application average %.3f ms/frame (%.1f FPS)", 1000.0f / ImGui::GetIO().Framerate, ImGui::GetIO().Framerate);
}
{
ImGui::Begin("VENTANA WAPA");
ImGui::Text("POS SA QUEDAO BUENA VENTANA");
static float yee = 0.0f;
ImGui::SliderFloat("lel", &yee,1.0f,0.5f);
ImGui::End();
}
// 2. Show another simple window. In most cases you will use an explicit Begin/End pair to name your windows.
if (show_another_window)
{
ImGui::Begin("Another Window", &show_another_window);
ImGui::Text("Hello from another window!");
if (ImGui::Button("Close Me"))
show_another_window = false;
ImGui::End();
}
// 3. Show the ImGui demo window. Most of the sample code is in ImGui::ShowDemoWindow(). Read its code to learn more about Dear ImGui!
if (show_demo_window)
{
ImGui::SetNextWindowPos(ImVec2(650, 20), ImGuiCond_FirstUseEver); // Normally user code doesn't need/want to call this because positions are saved in .ini file anyway. Here we just want to make the demo initial state a bit more friendly!
ImGui::ShowDemoWindow(&show_demo_window);
}
// Rendering
int display_w, display_h;
glfwGetFramebufferSize(window, &display_w, &display_h);
glViewport(0, 0, display_w, display_h);
glClearColor(clear_color.x, clear_color.y, clear_color.z, clear_color.w);
glClear(GL_COLOR_BUFFER_BIT);
ImGui::Render();
glfwSwapBuffers(window);
}
// Cleanup
//ImGui_ImplGlfwGL3_Shutdown();
glfwTerminate();
return 0;
}
I have no clue why this is happening, and I'm pretty new to OpenGL and ImGui, so, any ideas? :(
Glad & gl3w are both extension loader libraries. They generally need to be initialized on a current GL context before use.
The original code called gl3wInit(). Yours is missing any sort of glad init.
Make sure you initialize glad (gladLoadGLLoader((GLADloadproc) glfwGetProcAddress)) after glfwMakeContextCurrent() and before you call any OpenGL functions.
Otherwise all the OpenGL function pointers glad declares will remain NULL. Trying to call NULL function pointers generally doesn't go well for a process.
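In terms of the code above, the fix would look something like this (a minimal sketch; the exact error handling is my own assumption):
glfwMakeContextCurrent(window);
glfwSwapInterval(1); // Enable vsync
// Initialize glad after the context is current and before any GL call:
if (!gladLoadGLLoader((GLADloadproc) glfwGetProcAddress))
{
    fprintf(stderr, "Failed to initialize glad\n");
    return 1;
}
// Setup ImGui binding
ImGui_ImplGlfwGL3_Init(window, true);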

glfw openGL c++ window background and title

This is my source code from a series of tutorials I'm looking at regarding OpenGL 3+.
//#include <stdio.h>
//#include <stdlib.h>
#include <GL/glew.h>
#include <GL/glfw.h>
#include <glm/glm.hpp>
using namespace glm;
#include <iostream>
using namespace std;
int main()
{
if( !glfwInit() )
{
fprintf( stderr, "Failed to initialize GLFW\n" );
return -1;
}
glfwOpenWindowHint(GLFW_FSAA_SAMPLES, 4); // 4x antialiasing
glfwOpenWindowHint(GLFW_OPENGL_VERSION_MAJOR, 3); // We want OpenGL 3.3
glfwOpenWindowHint(GLFW_OPENGL_VERSION_MINOR, 2);
glfwOpenWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE); //We don't want the old OpenGL
// Open a window and create its OpenGL context
if( !glfwOpenWindow( 1024, 768, 0,0,0,0, 32,0, GLFW_WINDOW ) )
{
fprintf( stderr, "Failed to open GLFW window\n" );
glfwTerminate();
return -1;
}
else
{
glfwSetWindowTitle( "Tutorial 01" );
}
// Initialize GLEW
glewExperimental=true; // Needed in core profile
if (glewInit() != GLEW_OK) {
fprintf(stderr, "Failed to initialize GLEW\n");
return -1;
}
glfwEnable( GLFW_STICKY_KEYS );
do{
// Draw nothing, see you in tutorial 2 !
// Swap buffers
glfwSwapBuffers();
} // Check if the ESC key was pressed or the window was closed
while( glfwGetKey( GLFW_KEY_ESC ) != GLFW_PRESS &&
glfwGetWindowParam( GLFW_OPENED ) );
return 0;
}
Everything works great, except that when the window initializes it has a white background and the title 'GLFW WINDOW'; after 1-2 seconds the title changes to 'Tutorial 01' and the background becomes black, as they both should have been in the first place.
In my previous studies of OpenGL (2.x) with GLUT a couple of years back I didn't have issues like that. Could someone explain what is wrong here?
I get the same behaviour (even running a release exe outside of the IDE) on an ATI FirePro V5700. If you're really bothered by it, download the GLFW source and change line 764 of carbon_window.c, line 1210 of win32_window.c, and line 962 of x11_window.c.
.\lib\carbon\carbon_window.c
(void)SetWindowTitleWithCFString( _glfwWin.window, CFSTR( "GLFW Window" ) );
.\lib\win32\win32_window.c
_glfwWin.window = CreateWindowEx( _glfwWin.dwExStyle, // Extended style
_GLFW_WNDCLASSNAME, // Class name
"GLFW Window", // Window title
_glfwWin.dwStyle, // Defined window style
wa.left, wa.top, // Window position
fullWidth, // Decorated window width
fullHeight, // Decorated window height
NULL, // No parent window
NULL, // No menu
_glfwLibrary.instance, // Instance
NULL ); // Nothing to WM_CREATE
.\lib\x11\x11_window.c
_glfwPlatformSetWindowTitle( "GLFW Window" );
I'm not experiencing any of the problems you describe. I copied and pasted, compiled, then ran your code as you posted it (commenting out references to GLM, as I don't have that library installed). The title changes instantaneously for me -- I never even see the window having the title "GLFW WINDOW" -- and the color of the graphics area is immediately black. Could it be that your computer simply isn't very fast?
What happens if you do the following?
do{
// Draw nothing, see you in tutorial 2 !
glClear(GL_COLOR_BUFFER_BIT);
// Swap buffers
glfwSwapBuffers();
glfwSleep(0.016);
} // Check if the ESC key was pressed or the window was closed
while( glfwGetKey( GLFW_KEY_ESC ) != GLFW_PRESS && glfwGetWindowParam( GLFW_OPENED ) );
Edit: My GPU is an Nvidia GTX 580 (capable of at least OpenGL 4.3).
The background color of the window is configured with the OpenGL function glClearColor() and applied each frame by glClear().
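For example (a minimal sketch; the color values here are arbitrary):
glClearColor(0.0f, 0.0f, 0.0f, 1.0f); // set the clear color once (black)
glClear(GL_COLOR_BUFFER_BIT); // fill the color buffer with it every frame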
The delay before the window title changes may have something to do with the fact that to create an OpenGL 3.x context, the application first has to create a standard OpenGL context (version 1.x or 2.x) and then opt in to an OpenGL 3.x context. A 1-2 second delay does seem like a lot, though.
Well, as it turns out, it has to do with the IDE: if you run the actual .exe it works as intended, with no delay at all.

Use OpenGL Bitmap Fonts to put text onto the screen

I am currently learning OpenGL from the NeHe productions. When I came to Lesson 13, Bitmap Fonts, I ran into a problem. I wrote my code using GLUT, and my PC runs Windows 7. The code builds on Microsoft Visual Studio 2008 without any errors, but nothing appears in the window. I don't know what is wrong. What can cause this problem in general? Did I miss some settings?
Here is my code:
#pragma comment(lib,"GLAUX.LIB")
#include <GL/glut.h>
#include <windows.h>
#include <GL/glaux.h>
#include <stdio.h>
#include <stdarg.h>
#include <string.h> // for strlen() used in glPrint()
#include <math.h>
HDC hDC = NULL;
GLuint base;//the first display list we create
GLfloat cnt1,cnt2;//move on the screen or set color
GLvoid buildFont() // Build Our Bitmap Font
{
HFONT font; // Windows Font ID
HFONT oldfont; // Used For Good House Keeping
base = glGenLists(96); // Storage For 96 Characters
font = CreateFont(
-24, // Height Of Font
0, // Width Of Font
0, // Angle Of Escapement
0, // Orientation Angle
FW_BOLD, // Font Weight
FALSE, // Italic
FALSE, // Underline
FALSE, // Strikeout
ANSI_CHARSET, // Character Set Identifier
OUT_TT_PRECIS, // Output Precision
CLIP_DEFAULT_PRECIS, // Clipping Precision
ANTIALIASED_QUALITY, // Output Quality
FF_DONTCARE|DEFAULT_PITCH, // Family And Pitch
"Times New Roman"); // Font Name
oldfont = (HFONT)SelectObject(hDC, font); // Selects The Font We Want
wglUseFontBitmaps(hDC, 32, 96, base); // Builds 96 Characters Starting At Character 32
SelectObject(hDC, oldfont); // Restores The Old Font
DeleteObject(font); // Delete The Font
}
void killFont()
{
glDeleteLists(base,96);
}
void glPrint(const char *fmt, ...) // Custom GL "Print" Routine
{
char text[256]; // Holds Our String
va_list ap; // Pointer To List Of Arguments
if (fmt == NULL) // If There's No Text
{
printf("the string to print is NULL!\n");
return; // Do Nothing
}
va_start(ap, fmt); // Parses The String For Variables
vsprintf(text, fmt, ap); // And Converts Symbols To Actual Numbers
va_end(ap); // Results Are Stored In Text
glPushAttrib(GL_LIST_BIT); // Pushes The Display List Bits
glListBase(base - 32); // Sets The Base Character to 32
glCallLists(strlen(text), GL_UNSIGNED_BYTE, text); // Draws The Display List Text
glPopAttrib(); // Pops The Display List Bits
}
int init(GLvoid) // All Setup For OpenGL Goes Here
{
glShadeModel(GL_SMOOTH); // Enable Smooth Shading
glClearColor(0.0f, 0.0f, 0.0f, 0.5f); // Black Background
glClearDepth(1.0f); // Depth Buffer Setup
glEnable(GL_DEPTH_TEST); // Enables Depth Testing
glDepthFunc(GL_LEQUAL); // The Type Of Depth Testing To Do
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST); // Really Nice Perspective Calculations
buildFont(); // Build The Font
return TRUE; // Initialization Went OK
}
void display() // Here's Where We Do All The Drawing
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // Clear Screen And Depth Buffer
glLoadIdentity(); // Reset The Current Modelview Matrix
glTranslatef(0.0f,0.0f,-1.0f); // Move One Unit Into The Screen
// Pulsing Colors Based On Text Position
glColor3f(1.0f*float(cos(cnt1)),1.0f*float(sin(cnt2)),1.0f-0.5f*float(cos(cnt1+cnt2)));
// Position The Text On The Screen
glRasterPos2f(-0.45f+0.05f*float(cos(cnt1)), 0.32f*float(sin(cnt2)));
glPrint("Active OpenGL Text With NeHe - %7.2f", cnt1); // Print GL Text To The Screen
glutSwapBuffers();// Everything Went OK
}
void spinDisplay()
{
cnt1 += 0.051f;
cnt2 += 0.005f;
printf("cnt1: %f\n",cnt1);
printf("cnt2: %f\n",cnt2);
}
void reshape(int w,int h)
{
if (0 == h)
h = 1;
glViewport(0,0,(GLsizei)w,(GLsizei)h);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(60.0f,(GLfloat)w / (GLfloat)h,1,100);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
}
int main(int argc,char** argv)
{
glutInit(&argc,argv);
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
glutInitWindowSize(600,600);
glutInitWindowPosition(100,100);
glutCreateWindow("Bitmap Fonts");
init();
glutDisplayFunc(display);
glutReshapeFunc(reshape);
glutIdleFunc(spinDisplay);
//glutKeyboardFunc(keyboard);
glutMainLoop();
killFont();
return 0;
}
This is the result on Visual Studio 2008:
You can use glRasterPos and glutBitmapCharacter, which are present in GLUT:
glRasterPos3f( 30.0f , 25.0f ,0.0f );
glutBitmapCharacter( GLUT_BITMAP_HELVETICA_18 , 'A');
Or use glutBitmapString (supported in current freeglut):
glRasterPos3f(30.0f , 20.0f ,0.0f);
glutBitmapString( GLUT_BITMAP_HELVETICA_18 , "Hello World!" );
If you can't use glutBitmapString to print a string, you can use a loop:
const char *a = "Hello World!";
glRasterPos3f( 30.0f , 25.0f , 0.0f );
for (int i = 0; a[i] != '\0'; i++)
    glutBitmapCharacter( GLUT_BITMAP_HELVETICA_18 , a[i]);
There isn't anything wrong with your code.
I'm taking a class in OpenGL using GLUT. We have encountered a problem in class where the lab computers correctly display the characters in the correct colors, but a few of the students' laptops will only display the characters in black. All the machines are running Windows 7, so we suspect it has to do with which version of OpenGL is on the machine.
Anyway, change your background color to white (or something that will easily show black text). You should see your text if your positioning is correct.
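In the init code above that would be, for example (a one-line sketch):
glClearColor(1.0f, 1.0f, 1.0f, 1.0f); // white background, so black text is visible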
GLUT is outdated and no longer maintained; maybe that is the reason for the problem on Windows 7. The last GLUT version (3.7) dates back to August 1998.
You can try freeglut, a fully compatible alternative to GLUT, to get a 100% replacement without changing anything in the source.
I've just tried the NeHe Lesson 13 project (based on GLUT) on:
Vista x64 SP2 with MS Visual Studio 2005 SP2
Windows 7 (64-bit) with MS Visual Studio 2010 SP1 (32-bit debug app)
Both of them work fine! But under Win7 with MS VS2010 the 64-bit debug version cannot be built because of an unresolved external.
Did you build a 32-bit or a 64-bit version?
Have you already tried other NeHe downloads without GLUT? (http://nehe.gamedev.net/data/lessons/vc/lesson13.zip)
You can also try updating your graphics driver and switching the Windows Aero theme on or off; that often helps, because it results in a different pixel format descriptor.
I hope that helps.

C++ Semi-Transparent Window SDL

I wish to have a semi-transparent SDL background (nothing to do with sub-surfaces or images), such that instead of having a black background the window is actually transparent, but the other things I draw are not; similar to how various applications have rounded borders or odd shapes besides rectangles. My current code is a slightly modified copy of Code::Blocks' SDL project template.
#ifdef __cplusplus
#include <cstdlib>
#else
#include <stdlib.h>
#endif
#ifdef __APPLE__
#include <SDL/SDL.h>
#else
#include <SDL.h>
#endif
int main ( int argc, char** argv )
{
putenv("SDL_VIDEO_WINDOW_POS");
putenv("SDL_VIDEO_CENTERED=1");
// initialize SDL video
if ( SDL_Init( SDL_INIT_VIDEO ) < 0 )
{
printf( "Unable to init SDL: %s\n", SDL_GetError() );
return 1;
}
// make sure SDL cleans up before exit
atexit(SDL_Quit);
// create a new window
SDL_Surface* screen = SDL_SetVideoMode(640, 480, 16,
SDL_HWSURFACE|SDL_DOUBLEBUF|SDL_NOFRAME);
if ( !screen )
{
printf("Unable to set 640x480 video: %s\n", SDL_GetError());
return 1;
}
// load an image
SDL_Surface* bmp = SDL_LoadBMP("cb.bmp");
if (!bmp)
{
printf("Unable to load bitmap: %s\n", SDL_GetError());
return 1;
}
// centre the bitmap on screen
SDL_Rect dstrect;
dstrect.x = (screen->w - bmp->w) / 2;
dstrect.y = (screen->h - bmp->h) / 2;
// program main loop
bool done = false;
while (!done)
{
// message processing loop
SDL_Event event;
while (SDL_PollEvent(&event))
{
// check for messages
switch (event.type)
{
// exit if the window is closed
case SDL_QUIT:
done = true;
break;
// check for keypresses
case SDL_KEYDOWN:
{
// exit if ESCAPE is pressed
if (event.key.keysym.sym == SDLK_ESCAPE)
done = true;
break;
}
} // end switch
} // end of message processing
// DRAWING STARTS HERE
// clear screen
SDL_FillRect(screen, 0, SDL_MapRGB(screen->format, 0, 0, 0));
// draw bitmap
SDL_BlitSurface(bmp, 0, screen, &dstrect);
// DRAWING ENDS HERE
// finally, update the screen :)
SDL_Flip(screen);
} // end main loop
// free loaded bitmap
SDL_FreeSurface(bmp);
// all is well ;)
printf("Exited cleanly\n");
return 0;
}
I think what you're trying to do is in fact a shaped window (parts of the window are transparent depending on a mask you provide). It seems there's no way to do that with SDL 1.2; however, there is an SDL_SetWindowShape function just for this in SDL 1.3, for which you can find a pre-release snapshot here, but it's not even in beta yet, so I suggest waiting until it's officially released :)
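For reference, here is roughly what that API looks like (a hedged sketch based on SDL_shape.h as it later shipped in SDL2; mask is assumed to be an RGBA SDL_Surface whose opaque pixels mark the visible region):
SDL_Window *window = SDL_CreateShapedWindow("Shaped", 0, 0, 640, 480, 0);
SDL_WindowShapeMode mode;
mode.mode = ShapeModeBinarizeAlpha; // pixels with alpha >= cutoff stay visible
mode.parameters.binarizationCutoff = 255; // only fully opaque pixels
SDL_SetWindowShape(window, mask, &mode); // apply the shape mask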
This is a link to a pretty neat article about the development of an older application for Mac OS 9, which did not have support for shaped windows either.
It's actually a neat article about software development in general.
But the idea seems pretty smart, and I wonder if you might be able to get it working here, too. Instead of trying to make a transparent background, they take a screenshot of the desktop right where their window is going to go, and then use that screenshot for their background. When the user drags the window around on the screen, they keep updating the background with new screenshots. I think this might be more complicated than you were hoping for, but it's certainly an interesting idea.

SDL_GL_SwapBuffers() in a loop = Freeze?

I read a few tutorials about OpenGL and now I'm trying to use it with SDL. The thing is that when I use SDL_GL_SwapBuffers() in a while loop the window just freezes. Here's some code:
#include "SDL.h"
#include "system.h"
#include "SDL_opengl.h"
System Sys(800, 600, 32);
SDL_Event kpress;
int main( int argc, char* args[] )
{
Sys.init();
bool quit = false;
while (!quit)
{
while (SDL_PollEvent(&kpress)) if(kpress.type == SDL_QUIT) quit = true;
glClear(GL_COLOR_BUFFER_BIT);
SDL_GL_SwapBuffers();
}
SDL_Quit();
return 0;
}
------------------------------These are in system.h, class System----------------
bool System::init()
{
if (SDL_Init(SDL_INIT_EVERYTHING) < 0)
{
errorCode = 1;
return false;
}
if (SDL_SetVideoMode(screen_h, screen_w, bpp, SDL_OPENGL) == 0)
{
errorCode = 2;
return false;
}
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
if (!init_GL())
{
errorCode = 3;
return false;
}
SDL_WM_SetCaption("Engine", 0);
return true;
}
bool System::init_GL()
{
glClearColor(1, 0, 0, 0);
glClear(GL_COLOR_BUFFER_BIT);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, screen_h, screen_w, 0, -1, 1);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
if (glGetError() != GL_NO_ERROR) return false;
return true;
}
If I draw some shapes or use a timer to limit the FPS, nothing changes.
Do you have any ideas?
My first advice: get rid of that bogus System class. So far all the tasks it performs are purely sequential/procedural, and that should be reflected in the program's outline. People tend to put everything into classes just because they're taught to see everything in terms of object models. But this System class would have to follow the singleton pattern, which, in my opinion, is an anti-pattern.
All the stuff you placed in init_GL belongs in the rendering loop. OpenGL initialization ends with creating a render context. OpenGL state is not initialized, it is set on demand; OpenGL objects are initialized too, but also on demand.
Also, you're not using glGetError correctly. It needs to be called in a loop until no more errors are reported. It thus also makes little sense to bail out if a GL error is reported; OpenGL errors should be considered diagnostic.
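A minimal sketch of such a drain loop (my own illustration, not from the question):
// glGetError() pops one error from the queue at a time,
// so keep calling it until GL_NO_ERROR comes back.
GLenum err;
while ((err = glGetError()) != GL_NO_ERROR)
    fprintf(stderr, "GL error: 0x%04X\n", err);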
SDL_GL_SetAttribute must be called before SDL_SetVideoMode, so you're probably not actually getting a double-buffered context.
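In other words (a minimal sketch; width, height and bpp stand in for your own values):
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1); // set GL attributes first...
if (SDL_SetVideoMode(width, height, bpp, SDL_OPENGL) == 0) // ...then create the window
    return false;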
Hey, you aren't calling the function which initializes OpenGL: init_GL()
You're not checking the result of System::init; it might be failing somewhere in that function and not setting up your initial state correctly.
SDL #defines main() to be SDL_main() so that it can do some extra initialization before program start, which you seem to be bypassing via the statically initialized class.
Try constructing your System object in main().
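Something along these lines (a sketch; I'm assuming the constructor only stores the dimensions, as in the code above):
int main(int argc, char *args[])
{
    System sys(800, 600, 32); // constructed after SDL's startup code has run
    if (!sys.init()) // and the result actually checked
        return 1;
    // ... event/render loop as before ...
    SDL_Quit();
    return 0;
}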