I initialize SDL with this code:
SDL_Init(SDL_INIT_VIDEO);
SDL_Window* win = SDL_CreateWindow(
    "SDL Window",
    SDL_WINDOWPOS_UNDEFINED,
    SDL_WINDOWPOS_UNDEFINED,
    WIDTH,
    HEIGHT,
    SDL_WINDOW_SHOWN
);
SDL_Renderer* ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_SOFTWARE);
SDL_GL_SetSwapInterval(1); // probably has no effect since it's not using GL
I then render with a non-SDL software renderer and present it to the window with this code:
SDL_UpdateTexture(screenBuffer, NULL, screenData, WIDTH*4);
SDL_RenderClear(ren);
SDL_RenderCopy(ren, screenBuffer, NULL, NULL);
SDL_RenderPresent(ren);
The framerate is completely uncapped, and I have no idea why.
Adding | SDL_RENDERER_PRESENTVSYNC to the SDL_CreateRenderer flags simply makes the window a blank white screen. However, I need to use the SDL_RENDERER_SOFTWARE flag (even though I'm not using SDL's software renderer for anything other than presenting the screen), or else SDL_RenderPresent() stalls for a very, very long time, resulting in about 1 frame per second.
How can I make SDL_RenderPresent() wait for vsync, or wait for it (accurately) myself?
VSync is a hardware feature. When VSync is enabled, the video card stops the renderer from presenting a frame until a signal from the monitor indicating vertical synchronization arrives (which means it has finished displaying the last frame).
If you are using a software renderer there's no way to detect this signal since you're not using the video card to render. It's up to you to set a framerate and wait for the next frame.
An example for 60 frames per second:
#define TICKS_FOR_NEXT_FRAME (1000 / 60)

Uint32 lastTime = 0;

void update() {
    // Busy-wait in 1 ms steps until a full frame interval has elapsed.
    // Unsigned subtraction also handles SDL_GetTicks() wraparound.
    while (SDL_GetTicks() - lastTime < TICKS_FOR_NEXT_FRAME) {
        SDL_Delay(1);
    }
    ... // SDL_RenderCopy...
    SDL_RenderPresent(renderer);
    lastTime = SDL_GetTicks();
}
A call to update will only present the frame if at least 16 ms have passed since the last one: 1000 / 60 truncates to 16, which is roughly the frame time needed for 60 FPS.
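If the 1 ms polling loop is too coarse, a variant is to compute the remaining time and sleep it off in one call. A minimal sketch of that idea, reusing lastTime and TICKS_FOR_NEXT_FRAME from above (keep in mind SDL_Delay only guarantees at least the requested wait, so some jitter remains):
void update() {
    Uint32 elapsed = SDL_GetTicks() - lastTime;
    if (elapsed < TICKS_FOR_NEXT_FRAME) {
        // Sleep the remainder of the frame in a single call.
        SDL_Delay(TICKS_FOR_NEXT_FRAME - elapsed);
    }
    ... // SDL_RenderCopy...
    SDL_RenderPresent(renderer);
    lastTime = SDL_GetTicks();
}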
I know this question is old, but I decided to post the answer for other people who might need it.
EDIT
As stated in the comments, this will not prevent tearing since you are not syncing to the vblank signal. This code only helps on capping the framerate. As far as I know there is no way to prevent tearing when using a software renderer (because you can't detect the vblank signal).
Related
I've got an OpenGL application using the Win32 API (no GLUT etc.), and I've run into a problem with screen tearing in fullscreen.
Basically, I have set WS_POPUP as the window style and my monitor's resolution as the window size.
I'm running on an AMD Radeon HD 7770 and I see terrible tearing!
When I use the WS_POPUPWINDOW style instead of WS_POPUP, the tearing is gone; however, I get an unwanted border around my scene.
Another thing I noticed is that the tearing disappears when the resolution is NOT native.
So when I pass my_screen_resolution + 1 as the size parameter, the tearing is gone.
RESx = 1920;
RESy = 1080;
hwnd = CreateWindowEx(NULL, NAME, NAME, WS_POPUP, 0, 0, RESx, RESy, NULL, NULL, hInstance, NULL);
SetWindowPos(hwnd, 0, -1, -1, RESx + 1, RESy + 1, 0); // With this function call, the tearing disappears!
What can I do to get rid of the tearing without having to run on not native resolution?
EDIT: (Hint: It's not V-sync)
Yes, it is V-Sync.
When you make a fullscreen window, it bypasses the DWM compositor.
If the window is not covering the full screen, its contents go through the DWM compositor. The DWM compositor makes a copy of the window's contents whenever something indicates that drawing is done (returning from the WM_PAINT handler, or calling EndPaint or SwapBuffers). The composition itself happens V-synced.
Thanks for your advice, but I want to avoid the tearing without vsync. With vsync I have terrible input lag.
Then you're doing something wrong in your input processing. Most likely your event loop only processes one input event at a time and then does a redraw. If that's the case, then as your scene complexity goes up you get lag proportional to your scene's drawing complexity. You don't want this to happen.
What you should do is accumulate all the input events that piled up between redraws and coalesce them into a single new drawing state, as in the sketch below. Ideally, input events are collected until just before the scene is set up for drawing, so the frame reflects the most recent state. If you want to get fancy, you may add a Kalman filter to predict the input state at the moment the frame is shown to the user and use that for drawing the scene.
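A minimal sketch of that pattern with the Win32 API; UpdateSceneFromInput and DrawScene are placeholder names for your own code, and hdc is assumed to be the window's device context:
MSG msg;
bool running = true;
while (running) {
    // Drain the entire message queue first, so every pending input
    // event is folded into the current input state before drawing.
    while (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE)) {
        if (msg.message == WM_QUIT) {
            running = false;
            break;
        }
        TranslateMessage(&msg);
        DispatchMessage(&msg); // WM_* handlers update the input state
    }
    UpdateSceneFromInput(); // placeholder: coalesced input -> scene state
    DrawScene();            // placeholder: render exactly once per batch
    SwapBuffers(hdc);       // present a single frame for the whole batch
}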
To remove OpenGL tearing, you should enable vsync. Follow this link for details: how to enable vertical sync in opengl?
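On Windows this is typically done through the WGL_EXT_swap_control extension. A hedged sketch, assuming the extension is available and a GL context is already current:
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

// Look the function up at runtime; it is only usable with a current context.
PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
    (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
if (wglSwapIntervalEXT)
    wglSwapIntervalEXT(1); // 1 = wait for one vblank per buffer swap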
I made a short program to test out SDL2, though there are some things I don't understand about how they work.
So I have created a window and a surface:
SDL_Window *window = nullptr;
SDL_Surface *windowSurface = nullptr;
Now I have this (the part I don't get):
window = SDL_CreateWindow("Window name", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, SDL_WINDOW_SHOWN);
windowSurface = SDL_GetWindowSurface(window);
So the first line: I use the SDL_CreateWindow() function to create a window, which I assume is what window now points to. As for the second line, I have no idea what's going on; can someone explain?
Finally I have this:
SDL_BlitSurface(currentImage, NULL, windowSurface, NULL);
SDL_UpdateWindowSurface(window);
followed by some clean up code to set the pointers back to nullptr and exit the program/destroy windows etc.
The code you have pasted does the following things: it creates an SDL window called "Window name", sets its horizontal and vertical positions to centered, sets the window size to 640 x 480, and marks it as shown.
The second line acquires the SDL surface bound to this window.
What this means is: SDL_CreateWindow actually sets up an OpenGL window and a GPU texture (the surface, although SDL2 has a separate class for textures) to which it is going to draw. Modifying the surface acquired with SDL_GetWindowSurface will modify the pixels of the window you have just created.
Blitting is applying an array of pixels to a target surface, as in: "hey, I've got this image/prerendered frame and I want to apply it to this surface so I can show it. Blit it."
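Putting those pieces together, a minimal self-contained sketch; the file name image.bmp is just an assumption, and error handling is omitted for brevity:
#include <SDL.h>

int main(int argc, char *argv[]) {
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *window = SDL_CreateWindow("Blit demo",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        640, 480, SDL_WINDOW_SHOWN);
    SDL_Surface *windowSurface = SDL_GetWindowSurface(window);
    SDL_Surface *image = SDL_LoadBMP("image.bmp"); // assumed asset

    // Copy the image's pixels onto the window surface, then push
    // the surface to the screen.
    SDL_BlitSurface(image, NULL, windowSurface, NULL);
    SDL_UpdateWindowSurface(window);
    SDL_Delay(2000); // keep the window visible briefly

    SDL_FreeSurface(image);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}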
I hope this is helpful : >
You can find more information for SDL here
Official SDL wiki
LazyFoo
LazyFoo provides a full tutorial and explanations of everything for the old SDL, but a lot of the things are the same in SDL2
My program starts with a loading window while it compiles shaders, loads textures, etc. I then want to launch a fullscreen application and use these resources. My understanding is that the OpenGL context must be the same before and after. I tried two methods: first, making a second fullscreen window and using the SDL_GL_MakeCurrent command on that window to 'transfer' the context across (I couldn't find where I read about this method); and second, just fullscreening the loading window. Both of these methods resulted in the loading screen being moved to the top-left corner of the screen. However, OpenGL commands no longer ran properly in fullscreen, including clearing buffers, which meant that the window contained the contents of my desktop/background applications.
Is there a proper way of doing this? Or is this a strange bug in sdl/opengl drivers?
Code to fullscreen original window:
//opengl commands work fine up to here
//now to fullscreen
SDL_SetWindowFullscreen(window, SDL_WINDOW_FULLSCREEN_DESKTOP);
SDL_SetWindowSize(window, 1366, 768); //tried this on either side of line above and without either line
glViewport(0, 0, 1366, 768); //update viewport
glClearColor(1, 1, 1, 1);
glClear(GL_COLOR_BUFFER_BIT);
//window should be whited, other draw commands tried and all fail or distort
SDL_GL_SwapWindow(window);
Creating a new window and using previous context:
//Fine up to here
window2 = SDL_CreateWindow("Window", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 1366, 768, SDL_WINDOW_OPENGL | SDL_WINDOW_FULLSCREEN_DESKTOP | SDL_WINDOW_SHOWN);
SDL_GL_MakeCurrent(window2, glContext); //created with SDL_GL_CreateContext(oldwindow);
//draw commands dont work
PS: running Ubuntu
Update: In the second code path (reusing the context in a new window), it returns an 'invalid window' error when it fails, which is most of the time but not always. When it fails, the screen ends up completely corrupted (black with weird white squares and patterns); ending the program does not clear the screen of this (although screenshots are perfectly fine?), but it can be restored by pressing Ctrl+F1 to switch to a terminal and then Ctrl+F7 to switch back.
I don't really know if it's a bug. I experienced the same issue with SDL2 and OpenGL:
Create a regular window,
attach the OpenGL context,
go fullscreen,
BOOM: black screen and crashed window.
I only noticed this issue on Ubuntu.
Through some tests I found a quick way to fix it:
Uint32 flags = 0;
flags |= SDL_WINDOW_RESIZABLE;
// ...plus whatever other flags you need
flags |= SDL_WINDOW_OPENGL;
m_window = SDL_CreateWindow( "hello gl", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, m_screen.x, m_screen.y,flags);
m_glContext = SDL_GL_CreateContext(m_window);
// Immediately set fullscreen back to false (windowed) after creating the context
SDL_SetWindowFullscreen(m_window, SDL_FALSE);
Now fullscreen seems to work without problems.
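If you need actual fullscreen afterwards, it appears safe to request it once the context is created and current; for example (SDL_WINDOW_FULLSCREEN_DESKTOP avoids a display mode change, which some drivers seem to handle better):
// Later, once the GL context is current and stable:
SDL_SetWindowFullscreen(m_window, SDL_WINDOW_FULLSCREEN_DESKTOP);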
I'm experiencing some weird graphical corruption in my SDL2 + OpenGL GUI application. I apologize if I'm not using the appropriate terms to describe the situation. It seems that the front buffer is being invalidated and other applications are being drawn into it?
Important note: this corruption only occurs in Windows XP or when using the Windows Basic theme in Windows 7.
The corruption can be reproduced in a few ways:
by moving the window around. It seems that the front buffer gets corrupted with the image of other applications in front of or behind it.
if there's an application behind, it'll occasionally flash into the front buffer.
I can load up a Google webpage in Chrome above the OpenGL application, close the Chrome process, and then move the OpenGL application around, and I'll see the Google homepage covering the whole inside of the window.
by moving another application over the OpenGL window.
It seems that I also see the corruption when I'm moving other windows around, even if they don't overlap the OpenGL window, as long as I've already triggered one of the cases listed above.
inline int UI::RenderingThread()
{
    UI::Window::Instance().RenderingThreadEnter();
    while( UI::Window::Instance().RenderingThreadActive() )
    {
        unsigned int ticks = SDL_GetTicks();
        UI::Window::Instance().RenderingThreadUpdate();
        UI::Window::Instance().RenderingThreadRender();
        ticks = SDL_GetTicks() - ticks;
        if( ticks < 33 ) // ~33 ms per iteration caps the thread at roughly 30 FPS
        {
            SDL_Delay( 33 - ticks );
        }
    }
    UI::Window::Instance().RenderingThreadExit();
    return 0;
}
As you probably guessed, I'm using multithreading and I have my rendering in a separate thread. Within the RenderingThreadRender() function, I only redraw if there's a change in content, or I've requested a redraw.
case SDL_WINDOWEVENT:
    switch( sdl_event.window.event )
    {
        default:
            UI::Window::Instance().Redraw();
            break;
    }
This is done to allow all SDL_WINDOWEVENT events to trigger a redraw, in the hope that one of them would fix the problem. Unfortunately, that has not solved the issue.
I'm hesitant to simply redraw my application constantly at 30 or 60 FPS, since I've noticed that doing so makes other applications sluggish when they are moved around.
Initialization:
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
this->window = SDL_CreateWindow( this->caption.c_str(), SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
UI::WindowSize::Instance().GetWindowWidth(), UI::WindowSize::Instance().GetWindowHeight(),
SDL_WINDOW_OPENGL | SDL_WINDOW_RESIZABLE );
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
glViewport(0, 0, UI::WindowSize::Instance().GetWindowWidth(), UI::WindowSize::Instance().GetWindowHeight());
// Set the OpenGL view
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho( 0, UI::WindowSize::Instance().GetWindowWidth(), UI::WindowSize::Instance().GetWindowHeight(), 0, -1, 1 );
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_FASTEST);
glDisable( GL_DITHER );
Rendering:
if( this->redraw )
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    // a few DrawArrays
    SDL_GL_SwapWindow( this->window );
}
Does anyone have any idea what could be causing this issue, and how to fix it?
EDIT: Some more research has pointed me to the following two questions:
https://superuser.com/questions/316738/what-happens-with-the-off-screen-front-buffers-in-windows-7-when-dwm-is-disabled
C++ OpenGL window only tracks background
However, this doesn't solve my problem since I'm already using Double Buffering.
I've found the solution to my problem. It's unlikely that anyone will experience the same issue I was having, but you never know. Essentially, a C# window of a size equal to or larger than my application's was being created beforehand. This window was hidden; however, because XP has a global framebuffer instead of composition, there was a conflict causing the corruption.
All in all, if you're having a similar issue, make sure there isn't another context.
I am trying to change the size of the window of my app with:
mysurface = SDL_SetVideoMode(width, height, 32, SDL_OPENGL);
Although I am using vsync'd swap buffers (in the xorg-video-ati driver), I can see flickering when the window size changes (I guess one or more black frames):
void Video::draw()
{
    if (videoChanged){
        mysurface = SDL_SetVideoMode(width, height, 32, SDL_OPENGL);
        scene->init(); // update glFrustum & glViewport
    }
    scene->draw();
    SDL_GL_SwapBuffers();
}
So please, does someone know if...
Is SDL_SetVideoMode not vsync'ed the way SDL_GL_SwapBuffers() is?
Or is it destroying the window and creating another one, with the buffer black in the meantime?
Does someone know of working code to do this? Maybe in freeglut?
In SDL-1, when you're using a windowed video mode, the window is completely torn down and a new one created when changing the video mode. Of course there's some undefined data in between, which is perceived as flicker. This issue has been addressed in SDL-2. Either use that, or use a different OpenGL framework that resizes windows without going through a full window recreation.
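For illustration, a rough SDL-2 equivalent of the resize path, assuming the window was created with SDL_WINDOW_OPENGL and a context is current; the window is resized in place and the GL context survives, so there is no recreation flicker:
// SDL2: resize without recreating the window or losing the GL context.
SDL_SetWindowSize(window, width, height);
glViewport(0, 0, width, height); // match GL to the new drawable size
scene->init();                   // update glFrustum etc., as in draw()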
If you're using a FULLSCREEN video mode then something different happens additionally:
A change of the video mode actually changes the video signal timings going from the graphics card to the display. After such a change, the display has to synchronize with the new settings, and that takes some time. This of course comes with some flickering, as the display may try to show a frame using the old settings until it detects that the timings no longer match. It's a physical effect, and there's nothing you can do in software to fix it, other than not changing the video mode at all.