When I write my SDL2 OpenGL program somewhat like this (using VSync):
SDL_GL_SetSwapInterval(1);
while (isRunning)
{
    while (SDL_PollEvent(&e))
    {
        if (e.type == SDL_QUIT)
        {
            isRunning = false;
        }
    }
    SDL_GL_SwapWindow(window);
}
my CPU usage goes as high as 39%-50% for this single program, which actually does nothing,
whereas when I calculate the frame time difference and pass the remaining sleep time to SDL_Delay(), my program freezes completely with a 'not responding'.
I do not want to use SDL_WaitEvent(), because my program will show an animation that runs regardless of input events.
Is there any equivalent to the following pygame approach, which neither blocks input nor the video thread?
fpsClock = pygame.time.Clock()
while True:
    pygame.display.update()
    fpsClock.tick(60)
Have you checked that the call to SDL_GL_SetSwapInterval(1) was successful? Your FPS should be capped at your refresh rate if that call succeeded.
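For reference, SDL_GL_SetSwapInterval() returns 0 on success and -1 if setting the swap interval is not supported, so a minimal check looks like this:

if (SDL_GL_SetSwapInterval(1) != 0)
{
    // VSync could not be enabled; SDL_GetError() explains why
    fprintf(stderr, "Unable to enable VSync: %s\n", SDL_GetError());
}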
Did you properly initialize OpenGL in SDL for double buffering? Like this, for example:
SDL_Window *window;
SDL_GLContext context;

SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 5);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 16);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

window = SDL_CreateWindow("OpenGL Window", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, SDL_WINDOW_OPENGL);
if (!window) {
    fprintf(stderr, "Couldn't create window: %s\n", SDL_GetError());
    return;
}

context = SDL_GL_CreateContext(window);
if (!context) {
    fprintf(stderr, "Couldn't create context: %s\n", SDL_GetError());
    return;
}

int r, g, b;
SDL_GL_GetAttribute(SDL_GL_RED_SIZE, &r);
SDL_GL_GetAttribute(SDL_GL_GREEN_SIZE, &g);
SDL_GL_GetAttribute(SDL_GL_BLUE_SIZE, &b);
printf("Red size: %d, Green size: %d, Blue size: %d\n", r, g, b);
SDL is not a framework; there are no high-level functions such as framerate control. However, you should get a framerate cap with VSync; if there is no capping, it probably means VSync is not working.
Please consider checking more in-depth tutorials on how to set up OpenGL apps in SDL.
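If VSync turns out to be unavailable, you can approximate pygame's fpsClock.tick(60) by hand with SDL_GetTicks() and SDL_Delay(). Here is a minimal sketch, assuming a 60 FPS target and reusing the isRunning/e/window names from the question. Note that events are still polled every frame; calling SDL_Delay() without pumping events is what typically produces the 'not responding' state on Windows.

const Uint32 FRAME_MS = 1000 / 60;         // target frame duration, ~16 ms

while (isRunning)
{
    Uint32 frameStart = SDL_GetTicks();    // milliseconds since SDL was initialized

    while (SDL_PollEvent(&e))              // keep pumping events so the window stays responsive
    {
        if (e.type == SDL_QUIT)
        {
            isRunning = false;
        }
    }

    // ... draw the frame here ...
    SDL_GL_SwapWindow(window);

    Uint32 elapsed = SDL_GetTicks() - frameStart;
    if (elapsed < FRAME_MS)
    {
        SDL_Delay(FRAME_MS - elapsed);     // sleep off the rest of the frame budget
    }
}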
I've been trying to learn how to use SDL2, and am trying to follow the tutorials here for that. From example 7 (texture loading / hardware rendering), I've put together this stripped-down example. As far as I can tell, the code only calls SDL_RenderClear() and SDL_RenderPresent() in the render loop, VSync'd to my monitor (100 Hz in this case). However, Windows reports 40% GPU utilization when doing this, which seems high for doing nothing. My GPU sits at 1-2% utilization when the SDL2 program is not running.
Removing just the SDL_RenderPresent() call brings the usage all the way down to 1-2% again, but then nothing gets drawn, which makes me think the issue is either in SDL2's rendering or (much more likely) in how I've configured it.
Is this high GPU utilization (when doing nothing) expected? Is there anything I can do to minimize the GPU utilization (while still using hardware rendering)?
===================================
Machine details:
Windows 10
Threadripper 3970x CPU (3.7 GHz 32 core)
256 GB RAM DDR4 3200 MHz
1x RTX 3090 GPU
3x 1440p 100Hz monitors
GPU utilization was observed via Task Manager and the Xbox Game Bar (screenshots not included here).
Code:
#define SDL_MAIN_HANDLED
/* This source code copyrighted by Lazy Foo' Productions (2004-2020)
   and may not be redistributed without written permission. */

// Using SDL, SDL_image, standard IO, and strings
#include <SDL.h>
#include <stdio.h>
#include <string>

// Screen dimension constants
const int SCREEN_WIDTH = 1920;
const int SCREEN_HEIGHT = 1080;

// Starts up SDL and creates window
bool init();

// Frees media and shuts down SDL
void close();

// The window we'll be rendering to
SDL_Window* gWindow = NULL;

// The window renderer
SDL_Renderer* gRenderer = NULL;

bool init() {
    // Initialization flag
    bool success = true;

    // Initialize SDL
    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        printf("SDL could not initialize! SDL Error: %s\n", SDL_GetError());
        success = false;
    } else {
        // Create window
        gWindow = SDL_CreateWindow(
            "SDL Tutorial",
            SDL_WINDOWPOS_UNDEFINED,
            SDL_WINDOWPOS_UNDEFINED,
            SCREEN_WIDTH,
            SCREEN_HEIGHT,
            SDL_WINDOW_SHOWN);
        if (gWindow == NULL) {
            printf("Window could not be created! SDL Error: %s\n", SDL_GetError());
            success = false;
        } else {
            // Create renderer for window
            gRenderer =
                SDL_CreateRenderer(gWindow, -1, SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);
            if (gRenderer == NULL) {
                printf("Renderer could not be created! SDL Error: %s\n", SDL_GetError());
                success = false;
            } else {
                // Initialize renderer color
                SDL_SetRenderDrawColor(gRenderer, 0xFF, 0x00, 0x00, 0x00);
            }
        }
    }
    return success;
}

void close() {
    // Destroy window
    SDL_DestroyRenderer(gRenderer);
    SDL_DestroyWindow(gWindow);
    gWindow = NULL;
    gRenderer = NULL;

    // Quit SDL subsystems
    SDL_Quit();
}

int main(int argc, char* args[]) {
    // Start up SDL and create window
    if (!init()) {
        printf("Failed to initialize!\n");
    } else {
        // Main loop flag
        bool quit = false;

        // Event handler
        SDL_Event e;

        // While application is running
        while (!quit) {
            // Handle events on queue
            while (SDL_PollEvent(&e) != 0) {
                // User requests quit
                if (e.type == SDL_QUIT) {
                    quit = true;
                }
            }

            // Clear screen
            SDL_RenderClear(gRenderer);

            // Update screen
            SDL_RenderPresent(gRenderer);
        }
    }

    // Free resources and close SDL
    close();

    return 0;
}
I'm doing Lazy Foo's tutorial on SDL (I'm using SDL2-2.0.9), and at the texture rendering part I encountered the following problem: the program compiles and runs as expected, no issues there, but when I close the window, the console doesn't close and the process keeps running, so I have to close the console separately.
When I tried to debug it, I found that the program does leave the main loop and reaches the "return 0" line in the main function successfully, but then it just hangs until I close the console.
The issue is only present when I use the SDL renderer with any option other than SDL_RENDERER_SOFTWARE. If I use SDL_RENDERER_SOFTWARE, the program closes as expected. With other options it stays at "return 0" with other threads still running (crypt32.dll, ntdll.dll and nvd3dum, in this order in the thread view, meaning the process is stuck in crypt32).
I'm aware that my main function is not the "real main", as it has been hijacked by SDL, so exit(0) works fine as an ad-hoc solution. But I want to know why exactly this happens, and whether there is any other way to fix it so that I don't have to use exit(0).
Here is simplified example code that demonstrates the issue for me:
#include "SDL.h"
#include <stdio.h>
int main(int argc, char *argv[]) {
SDL_Window *win = NULL;
SDL_Renderer *renderer = NULL;
SDL_Texture *bitmapTex = NULL;
SDL_Surface *bitmapSurface = NULL;
int width = 640, height = 480;
if (SDL_Init(SDL_INIT_VIDEO) < 0)
{
printf("Could not initialize SDL");
return 1;
}
win = SDL_CreateWindow("Hello World", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, width, height, 0);
renderer = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);
bitmapSurface = SDL_LoadBMP("res/x.bmp");
bitmapTex = SDL_CreateTextureFromSurface(renderer, bitmapSurface);
SDL_FreeSurface(bitmapSurface);
bool quit = false;
while (!quit) {
SDL_Event e;
while (SDL_PollEvent(&e) != 0) {
if (e.type == SDL_QUIT) {
quit = true;
}
}
SDL_RenderClear(renderer);
SDL_RenderCopy(renderer, bitmapTex, NULL, NULL);
SDL_RenderPresent(renderer);
}
SDL_DestroyTexture(bitmapTex);
SDL_DestroyRenderer(renderer);
SDL_DestroyWindow(win);
SDL_Quit();
printf("REACHED RETURN 0");
return 0;
}
Works as intended, but after closing the window I see "REACHED RETURN 0" printed in the console and that's it; the console stays there. The code can be simplified further; the issue is present as long as an instance of SDL_Renderer has been created.
UPD: The callstack during the hanging:
> ntdll.dll!_NtWaitForMultipleObjects@20()
KernelBase.dll!_WaitForMultipleObjectsEx@20()
crypt32.dll!ILS_WaitForThreadProc()
kernel32.dll!@BaseThreadInitThunk@12()
ntdll.dll!__RtlUserThreadStart()
ntdll.dll!__RtlUserThreadStart@8()
UPD2: The problem is not with the loop at all. I created the simplest application, where I just create a window and a renderer and then return 0, and it still gives me a hanging console. Like this:
#include <SDL.h>

int main(int argc, char* args[])
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0) return 1;
    SDL_Window* window = SDL_CreateWindow("Test", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, 640, 480, 0);
    SDL_Renderer* renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED);
    return 0;
}
Same thing when I destroy them properly. The problem is in the renderer.
UPD3: Here is the Parallel Stacks window during the hang (screenshot not included). There is no "main" thread, since it exits successfully; these are the threads that keep the program from closing properly. Other than that, it doesn't give me any better understanding of the problem.
I am just now learning SDL. I have downloaded the libraries and added them to my linker, etc., with MinGW, and I am trying to run a simple demo program to display a window, but it will not show up at all. I get no errors; the window just doesn't appear.
#include "SDL.h"
#include <stdio.h>
int main(int argc, char* argv[]) {
SDL_Window *window; // Declare a pointer
SDL_Init(SDL_INIT_VIDEO); // Initialize SDL2
// Create an application window with the following settings:
window = SDL_CreateWindow(
"An SDL2 window", // window title
SDL_WINDOWPOS_UNDEFINED, // initial x position
SDL_WINDOWPOS_UNDEFINED, // initial y position
640, // width, in pixels
480, // height, in pixels
SDL_WINDOW_OPENGL // flags - see below
);
// Check that the window was successfully created
if (window == NULL) {
// In the case that the window could not be made...
printf("Could not create window: %s\n", SDL_GetError());
return 1;
}
// The window is open: could enter program loop here (see SDL_PollEvent())
SDL_Delay(3000); // Pause execution for 3000 milliseconds, for example
// Close and destroy the window
SDL_DestroyWindow(window);
// Clean up
SDL_Quit();
return 0;
}
I just tested this on Linux and with MinGW. It may be a problem with SDL_Delay() blocking before the window gets a chance to show. Try adding a basic main loop to see if it works. This will create an empty window:
#include "SDL.h"
#include <stdio.h>
int main(int argc, char* argv[]) {
SDL_Window *window; // Declare a pointer
SDL_Init(SDL_INIT_VIDEO); // Initialize SDL2
// Create an application window with the following settings:
window = SDL_CreateWindow(
"An SDL2 window", // window title
SDL_WINDOWPOS_UNDEFINED, // initial x position
SDL_WINDOWPOS_UNDEFINED, // initial y position
640, // width, in pixels
480, // height, in pixels
SDL_WINDOW_OPENGL // flags - see below
);
// Check that the window was successfully created
if (window == NULL) {
// In the case that the window could not be made...
printf("Could not create window: %s\n", SDL_GetError());
return 1;
}
// A basic main loop to prevent blocking
bool is_running = true;
SDL_Event event;
while (is_running) {
while (SDL_PollEvent(&event)) {
if (event.type == SDL_QUIT) {
is_running = false;
}
}
SDL_Delay(16);
}
// Close and destroy the window
SDL_DestroyWindow(window);
// Clean up
SDL_Quit();
return 0;
}
I've been trying to start a new SDL + GLEW + OpenGL project, and the setup has been difficult (MinGW-w64-32 on Windows 8 64-bit, with an Intel i7-4700MQ CPU and an NVIDIA GTX 745M GPU).
If I set the GL attributes used for context creation to request OpenGL 4.2, the color and depth bit sizes get set to 0. However, if I request a 2.1 context (which is also the default), I get the requested bit depths (8 bits for each color channel, 24 bits for depth). In either case, however, glClearColor has no effect (just a black background).
In both cases, the result of a few glGetString calls is the same: a 4.2 context, suggesting SDL's output is far from correct.
The entirety of the code can be found here; it's mostly boilerplate for a larger project for now. The relevant sections are most likely
if (SDL_Init(SDL_INIT_EVERYTHING) != 0) {
    std::cerr << "Error initializing SDL.\n";
    exit(1);
}

SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 2);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);
//SDL_GL_SetAttribute(SDL_GL_CONTEXT_FLAGS, SDL_GL_CONTEXT_FORWARD_COMPATIBLE_FLAG);
SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

SDL_Window* window = SDL_CreateWindow("RenderSystem", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, SDL_WINDOW_OPENGL|SDL_WINDOW_RESIZABLE);
if (!window) {
    std::cerr << "Window creation failed.\n";
    SDL_Quit();
    exit(2);
}

SDL_GLContext context = SDL_GL_CreateContext(window);
if (!context) {
    std::cerr << "OpenGL Context creation failed.\n";
    SDL_DestroyWindow(window);
    SDL_Quit();
    exit(3);
}

SDL_GL_MakeCurrent(window, context);
and
glClearColor(1.0f, 1.0f, 1.0f, 1.0f);

SDL_Event evt;
bool run = true;

while (run) {
    SDL_PollEvent(&evt);
    switch (evt.type) {
        case SDL_KEYDOWN:
            if (evt.key.keysym.sym == SDLK_ESCAPE) {
                run = false;
            }
            break;
        case SDL_QUIT:
            run = false;
            break;
    }
    SDL_GL_SwapWindow(window);
}
I'm using SDL's built-in function SDL_GL_GetAttribute which returns the values that SDL uses to create the context, not the actual context attributes (as I understand it).
This is incorrect. I took a look at the SDL implementation of SDL_GL_GetAttribute(...) (see src/video/SDL_video.c), and it does what I described: you cannot query these values on a core profile context, because they are not defined for the default framebuffer.
Here is where the problem comes from:
int
SDL_GL_GetAttribute(SDL_GLattr attr, int *value)
{
    // ...
    switch (attr) {
    case SDL_GL_RED_SIZE:
        attrib = GL_RED_BITS;
        break;
    case SDL_GL_BLUE_SIZE:
        attrib = GL_BLUE_BITS;
        break;
    case SDL_GL_GREEN_SIZE:
        attrib = GL_GREEN_BITS;
        break;
    case SDL_GL_ALPHA_SIZE:
        attrib = GL_ALPHA_BITS;
        break;
    }
    // ...
    glGetIntegervFunc(attrib, (GLint *) value);
    error = glGetErrorFunc();
}
That code actually generates a GL_INVALID_ENUM error on a core profile, and the return value of SDL_GL_GetAttribute(...) will be non-zero as a result.
If you must get meaningful values from SDL_GL_GetAttribute(...) for bit depths, that means you must use a compatibility profile. SDL2 does not extract this information from the pixel format it selected (smarter frameworks like GLFW do this); it naively tries to query it from GL.
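To see this failure mode in practice, here is a minimal sketch, assuming a core-profile context is current; the call should report failure because the underlying glGetIntegerv(GL_RED_BITS, ...) raises GL_INVALID_ENUM:

int redBits = 0;
if (SDL_GL_GetAttribute(SDL_GL_RED_SIZE, &redBits) != 0)
{
    // Fails on a core profile because GL_RED_BITS is not queryable there
    fprintf(stderr, "Could not query red size: %s\n", SDL_GetError());
}
else
{
    printf("Red size: %d\n", redBits);
}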
If memory serves, SDL_GL_DEPTH_SIZE has to be the sum of all color channels.
Using four color channels:
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 32);
If you were using 3 color channels, then:
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);
I've already had some problems with this; it might be the problem here.
Sorry for my English.
I wish to have a semi-transparent SDL background (nothing to do with sub-surfaces or images), such that instead of having a black background the window is actually transparent, but the other things I draw are not, similar to how various applications have rounded borders or odd shapes besides rectangles. My current code is a slightly modified copy of Code::Blocks' SDL project template.
#ifdef __cplusplus
#include <cstdlib>
#include <cstdio>
#else
#include <stdlib.h>
#include <stdio.h>
#include <stdbool.h> // for bool in C
#endif

#ifdef __APPLE__
#include <SDL/SDL.h>
#else
#include <SDL.h>
#endif

int main ( int argc, char** argv )
{
    putenv("SDL_VIDEO_WINDOW_POS");
    putenv("SDL_VIDEO_CENTERED=1");

    // initialize SDL video
    if ( SDL_Init( SDL_INIT_VIDEO ) < 0 )
    {
        printf( "Unable to init SDL: %s\n", SDL_GetError() );
        return 1;
    }

    // make sure SDL cleans up before exit
    atexit(SDL_Quit);

    // create a new window
    SDL_Surface* screen = SDL_SetVideoMode(640, 480, 16,
                                           SDL_HWSURFACE|SDL_DOUBLEBUF|SDL_NOFRAME);
    if ( !screen )
    {
        printf("Unable to set 640x480 video: %s\n", SDL_GetError());
        return 1;
    }

    // load an image
    SDL_Surface* bmp = SDL_LoadBMP("cb.bmp");
    if (!bmp)
    {
        printf("Unable to load bitmap: %s\n", SDL_GetError());
        return 1;
    }

    // centre the bitmap on screen
    SDL_Rect dstrect;
    dstrect.x = (screen->w - bmp->w) / 2;
    dstrect.y = (screen->h - bmp->h) / 2;

    // program main loop
    bool done = false;
    while (!done)
    {
        // message processing loop
        SDL_Event event;
        while (SDL_PollEvent(&event))
        {
            // check for messages
            switch (event.type)
            {
                // exit if the window is closed
            case SDL_QUIT:
                done = true;
                break;

                // check for keypresses
            case SDL_KEYDOWN:
                {
                    // exit if ESCAPE is pressed
                    if (event.key.keysym.sym == SDLK_ESCAPE)
                        done = true;
                    break;
                }
            } // end switch
        } // end of message processing

        // DRAWING STARTS HERE

        // clear screen
        SDL_FillRect(screen, 0, SDL_MapRGB(screen->format, 0, 0, 0));

        // draw bitmap
        SDL_BlitSurface(bmp, 0, screen, &dstrect);

        // DRAWING ENDS HERE

        // finally, update the screen :)
        SDL_Flip(screen);
    } // end main loop

    // free loaded bitmap
    SDL_FreeSurface(bmp);

    // all is well ;)
    printf("Exited cleanly\n");
    return 0;
}
I think what you're trying to do is in fact a shaped window (parts of the window are transparent depending on a mask that you provide). It seems there's no way to do that with SDL 1.2; however, there is an SDL_SetWindowShape function just for this in SDL 1.3, for which you can find a pre-release snapshot here. It's not even in beta yet, though, so I suggest waiting until it's officially released :)
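For when you can move to a snapshot that has it, usage looks roughly like this minimal sketch, assuming the SDL_shape.h API as it later shipped in SDL2; "mask.bmp" is a hypothetical mask image, and magenta is an arbitrary choice of cut-out color:

#include <SDL.h>
#include <SDL_shape.h>  // shaped-window API (SDL 1.3 / SDL2)
#include <stdio.h>

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0) return 1;

    // Shaped windows are created with a dedicated constructor, not SDL_CreateWindow
    SDL_Window *win = SDL_CreateShapedWindow("shaped", 0, 0, 640, 480, 0);
    SDL_Surface *mask = SDL_LoadBMP("mask.bmp");  // hypothetical mask image

    SDL_WindowShapeMode mode;
    mode.mode = ShapeModeColorKey;                // pixels matching the key are cut out
    SDL_Color key = { 255, 0, 255, 255 };         // magenta marks the transparent region
    mode.parameters.colorKey = key;

    if (SDL_SetWindowShape(win, mask, &mode) != 0) {
        printf("Could not set window shape: %s\n", SDL_GetError());
    }

    SDL_Delay(3000);                              // keep the window up briefly
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}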
This is a link to a pretty neat article about the development of an older application for Mac OS 9, which did not have support for shaped windows either.
It's actually a neat article about software development in general.
But the idea seems pretty smart, and I wonder if you might be able to get it working here, too. Instead of trying to make a transparent background, they take a screenshot of the desktop right where their window is going to go, and then use that screenshot as their background. When the user drags the window around the screen, they keep updating the background with fresh screenshots. This might be more complicated than you were hoping for, but it's certainly an interesting idea.