I found out that my program, which only draws two pictures on the screen, runs at 200 FPS. 200 FPS is a really low number at this point of development! It does not matter if I remove all textures and other data.
The slow parts of the code are SDL_RenderPresent and SDL_RenderClear. Without them, the program runs at about 300K FPS. Could you tell me where the problem could be?
Window and Renderer
_window = SDL_CreateWindow( "Window", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 800, 600, 0 );
if ( _window == NULL ) {
std::cout << "CreateWindow problem: " << SDL_GetError() << std::endl;
}
_renderer = SDL_CreateRenderer( _window, -1, SDL_RENDERER_ACCELERATED );
if ( _renderer == NULL ) {
std::cout << "CreateRenderer problem: " << SDL_GetError() << std::endl;
}
According to SDL_RendererInfo, only two flags are used: SDL_RENDERER_ACCELERATED and SDL_RENDERER_TARGETTEXTURE.
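A minimal sketch of the kind of render loop involved, with hypothetical texture names and a simple once-per-second FPS counter (not the exact code), looks roughly like this:
#include <SDL.h>
#include <iostream>

//Minimal FPS-measuring loop sketch (hypothetical names, not the original code)
void runLoop( SDL_Renderer* renderer, SDL_Texture* picture1, SDL_Texture* picture2 )
{
    Uint32 frames = 0;
    Uint32 start = SDL_GetTicks();
    bool running = true;
    while ( running ) {
        SDL_Event e;
        while ( SDL_PollEvent( &e ) ) {
            if ( e.type == SDL_QUIT ) running = false;
        }
        SDL_RenderClear( renderer );
        SDL_RenderCopy( renderer, picture1, NULL, NULL );
        SDL_RenderCopy( renderer, picture2, NULL, NULL );
        //SDL_RenderPresent is where the finished frame is actually handed to
        //the GPU and swapped to the screen, so it does the bulk of the work
        SDL_RenderPresent( renderer );
        ++frames;
        Uint32 elapsed = SDL_GetTicks() - start;
        if ( elapsed >= 1000 ) {
            std::cout << "FPS: " << frames * 1000.0 / elapsed << std::endl;
            frames = 0;
            start = SDL_GetTicks();
        }
    }
}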
Related
I have just set up the SDL2 framework on my Mac, but although the program compiles and runs successfully, the window is not responding (I copied code that creates a rectangle).
I use Xcode and I followed the tutorial from here http://lazyfoo.net/tutorials/SDL/01_hello_SDL/mac/xcode/index.php step by step.
SDL_Window* window = NULL;
//The surface contained by the window
SDL_Surface* screenSurface = NULL;
//Initialize SDL
if( SDL_Init( SDL_INIT_VIDEO ) < 0 )
{
printf( "SDL could not initialize! SDL_Error: %s\n", SDL_GetError() );
}
else
{
//Create window
window = SDL_CreateWindow( "SDL Tutorial", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, SCREEN_WIDTH, SCREEN_HEIGHT, SDL_WINDOW_SHOWN );
if( window == NULL )
{
printf( "Window could not be created! SDL_Error: %s\n", SDL_GetError() );
}
else
{
//Get window surface
screenSurface = SDL_GetWindowSurface( window );
//Fill the surface white
SDL_FillRect( screenSurface, NULL, SDL_MapRGB( screenSurface->format, 0xFF, 0xFF, 0xFF ) );
//Update the surface
SDL_UpdateWindowSurface( window );
cout << " Ok" << endl;
//Wait twenty seconds
SDL_Delay( 20000 );
}
}
//Destroy window
SDL_DestroyWindow( window );
//Quit SDL subsystems
SDL_Quit();
return 0;
Why could that problem happen?
Thank you in advance
In order for a program written with SDL to "respond" to the operating system, you should give control back to SDL so it can process system messages and hand them back to you as SDL events (mouse events, keyboard events, and so on).
To do that, you have to add a loop that uses SDL_PollEvent; it should look something like this:
while(true)
{
SDL_Event e;
while (SDL_PollEvent(&e))
{
// Decide what to do with events here
}
// Put the code that is executed every "frame".
// Under "frame" I mean any logic that is run every time there is no app events to process
}
There are some special events, such as SDL_QuitEvent, that you need to handle to have a way to close your application. If you want to handle it, you should modify your code to look something like this:
bool running = true;
while(running)
{
SDL_Event e;
while (SDL_PollEvent(&e))
{
if(e.type == SDL_QUIT)
{
// A plain break would only leave the inner polling loop,
// so use a flag to exit the outer loop as well
running = false;
}
// Handle other events
}
// "Frame" logic
}
I have recently been breaking up a large source file into headers and smaller source files. I have never done this before, and largely because of scope, I had to rewrite some of the functions to accept pointers and then call them from main().
This is still a work in progress; however, I've run into a problem that I have been trying for hours to fix. The code is developed from Lazy Foo's excellent SDL tutorial series.
I have moved this function into its own files, ingeniously titled misc.cpp/misc.h.
bool init(SDL_Window *gWindow, SDL_Renderer *gRenderer)
{
//Initialization flag
bool success = true;
//Initialize SDL
if( SDL_Init( SDL_INIT_VIDEO ) < 0 )
{
printf( "SDL could not initialize! SDL Error: %s\n", SDL_GetError() );
success = false;
}
else
{
//Set texture filtering to linear
if( !SDL_SetHint( SDL_HINT_RENDER_SCALE_QUALITY, "1" ) )
{
printf( "Warning: Linear texture filtering not enabled!" );
}
//Create window
gWindow = SDL_CreateWindow( "Roulette Testing", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, SCREEN_WIDTH, SCREEN_HEIGHT, SDL_WINDOW_SHOWN );
if( gWindow == NULL )
{
printf( "Window could not be created! SDL Error: %s\n", SDL_GetError() );
success = false;
}
else
{
//Create renderer for window
gRenderer = SDL_CreateRenderer( gWindow, -1, SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);
if( gRenderer == NULL )
{
printf( "Renderer could not be created! SDL Error: %s\n", SDL_GetError() );
success = false;
}
else
{
//Initialize renderer color
SDL_SetRenderDrawColor( gRenderer, 0xFF, 0xFF, 0xFF, 0xFF );
//Initialize PNG loading
int imgFlags = IMG_INIT_PNG;
if( !( IMG_Init( imgFlags ) & imgFlags ) )
{
printf( "SDL_image could not initialize! SDL_image Error: %s\n", IMG_GetError() );
success = false;
}
std::cout << gRenderer << "renderer not NULL" << std::endl; //This is where I am checking the value of gRenderer within the function.
}
}
}
std::cout << "initialization success" << std::endl;
return success;
}
Now I am trying to call this function from my main() after having included the header and compiled the .cpp into object code. This worked fine for the texture header file.
Unfortunately, now when I'm passing an SDL_Renderer pointer to this init function, it becomes local, and I'm left with a NULL pointer when I return. I have tried the "&" address-of operator, but it won't accept it!
init(gWindow, gRenderer);
I'm unsure why, but I'm guessing it's to do with the type.
Any ideas, guys?
In response to comments... In my main I fire off a
cout << gRenderer << endl;
and it comes back NULL both before and after the function call.
I have highlighted in the function where I fire off another cout.
Make the parameter a reference to a pointer, and it will carry the value returned from SDL_CreateRenderer back to the calling function (assuming gRenderer is a pointer when passed in):
bool init(SDL_Window *gWindow, SDL_Renderer *&gRenderer)
Looks like you should also pass gWindow in the same way.
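As a rough sketch of how the calling side then looks (assuming gWindow and gRenderer start out as null pointers in main, and that both parameters are taken by reference to pointer):
#include <SDL.h>
#include <iostream>

//Sketch of the calling side, assuming this adjusted init signature
bool init( SDL_Window *&gWindow, SDL_Renderer *&gRenderer );

int main( int argc, char* argv[] )
{
    SDL_Window*   gWindow   = NULL;
    SDL_Renderer* gRenderer = NULL;

    if ( !init( gWindow, gRenderer ) ) {
        return 1;
    }

    //the pointers filled in by init() are now visible here
    std::cout << gRenderer << std::endl;

    //... rest of the program ...
    SDL_DestroyRenderer( gRenderer );
    SDL_DestroyWindow( gWindow );
    SDL_Quit();
    return 0;
}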
I have the following piece of code where, among a lot of other stuff (which I didn't include in this topic), I'm trying to start up SDL, create a renderer and load some sprites.
Everything compiles just fine, but when I run my application a break is triggered saying: Unhandled exception at 0x681252D5 (SDL.dll) in Carribean World SDL.exe: 0xC0000005: Access violation reading location 0x16161804
The break occurs at the point where I use the SDL_ConvertSurface() function.
Can anyone help me out? I can't see what's wrong.
Declarations:
SDL_Texture* background = NULL;
SDL_Surface* tmp = NULL;
SDL_Surface* surface = NULL;
SDL_Window *window = SDL_CreateWindow("Carribean World",
SDL_WINDOWPOS_UNDEFINED,
SDL_WINDOWPOS_UNDEFINED,
1360, 768,
SDL_WINDOW_RESIZABLE);
SDL_Surface* screen = SDL_GetWindowSurface(window);
SDL_Renderer* renderer = SDL_CreateRenderer(window, -1, 0);
SDL_PixelFormat* fmt = screen->format;
IN MAIN:
Initialize all SDL subsystems
if (SDL_Init(SDL_INIT_EVERYTHING) == -1)
{
return 0;
}
Load images to surfaces
if ((tmp = IMG_Load("images/water.jpg")) == NULL)
{
cout << "SDL_SetVideoMode() Failed: " << SDL_GetError() << endl;
return 0;
}
Right here the break is caused:
if ((surface = SDL_ConvertSurface(tmp, fmt, 0)) == NULL)
{
cout << "SDL_ConvertSurface() Failed: " << SDL_GetError() << endl;
}
background = SDL_CreateTextureFromSurface(renderer, tmp);
You haven't checked the return value of SDL_GetWindowSurface. But anyway, the SDL documentation for this function says 'You may not combine this with 3D or the rendering API on this window.' So either use the SDL_Renderer API exclusively, or use SDL_BlitSurface and the like followed by SDL_UpdateWindowSurface, but you can't use both.
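A rough sketch of the renderer-only path, assuming SDL_Init and IMG_Init have already succeeded and using the same image file, would be something along these lines:
//Renderer-only sketch: no SDL_GetWindowSurface; the loaded surface is
//converted straight into a texture for the renderer
SDL_Window* window = SDL_CreateWindow("Carribean World",
                                      SDL_WINDOWPOS_UNDEFINED,
                                      SDL_WINDOWPOS_UNDEFINED,
                                      1360, 768,
                                      SDL_WINDOW_RESIZABLE);
SDL_Renderer* renderer = SDL_CreateRenderer(window, -1, 0);

SDL_Surface* tmp = IMG_Load("images/water.jpg");
if (tmp == NULL)
{
    cout << "IMG_Load() Failed: " << SDL_GetError() << endl;
    return 0;
}
SDL_Texture* background = SDL_CreateTextureFromSurface(renderer, tmp);
SDL_FreeSurface(tmp); //the surface is no longer needed once the texture exists

SDL_RenderClear(renderer);
SDL_RenderCopy(renderer, background, NULL, NULL);
SDL_RenderPresent(renderer);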
I ran into a strange problem recently. I wrote an application class which uses a very simple renderer to draw some models on screen. The camera is movable.
I ran the program on my laptop. Initially I noticed that nothing was being drawn on screen (the screen was being cleared to the correct color, however). Then I noticed that the screen would update itself IF I clicked on the decoration frame and moved the window: this way, the models became visible, but would not move unless I clicked and moved the decoration frame again.
I tested my program on a desktop computer, and everything worked fine; the camera moved smoothly.
Eventually, I got the program to work on my laptop, but I have to set SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 0 ); and disable buffer swapping.
Below is the main application class. In the execution loop, I call the application state stack to loop & render (the application state actually owns the renderer).
In case it is of any consequence, my laptop has Intel HD 4000 graphics, and the desktop has a GTX 670.
App::App() : _running( false ),
_deltaTime( 0u ),
_elapsedTime( 0u ),
_mainWindow( nullptr ),
_glContext(),
_stack() {
//ctor
}
App::~App() {
SDL_GL_DeleteContext( _glContext );
SDL_DestroyWindow( _mainWindow );
SDL_Quit();
}
void App::execute() {
_initialize();
static const float millisecondsPerFrame = 17;
while ( _running ) {
//get the delta time & update elapsed time
uint32_t oldTime = _elapsedTime;
_elapsedTime = SDL_GetTicks();
_deltaTime = _elapsedTime - oldTime;
_processEvents();
_loop( _deltaTime / 1000.0f );
_render();
//apply possible state changes made to the stack
_stack.applyPendingChanges();
int usedTime = SDL_GetTicks() - int ( _elapsedTime );
//sleep the remainder of the cycle if we didn't use the entire update cycle
if ( millisecondsPerFrame - usedTime > 0 ) {
SDL_Delay( uint32_t ( millisecondsPerFrame - usedTime ) );
}
}
}
void App::_initialize() {
//initialize random number generator
nge::srand();
_running = true;
_initializeSDL();
_initializeOpenGL();
SDL_GL_MakeCurrent( _mainWindow, _glContext );
//attempt to set late swap tearing
int res = SDL_GL_SetSwapInterval( -1 );
//returns 0 on success
//returns -1 if swap interval is not supported
if ( res == -1 ) {
std::cout << "App::_initializeSDL> " << SDL_GetError() << "\n\n";
SDL_GL_SetSwapInterval( 1 );
}
_stack.registerState<GameState>( AppStateID::Game );
_stack.pushState( AppStateID::Game );
_stack.applyPendingChanges();
}
void App::_initializeSDL() {
SDL_Init( SDL_INIT_VIDEO );
SDL_Init( SDL_INIT_TIMER );
SDL_GL_SetAttribute( SDL_GL_CONTEXT_MAJOR_VERSION, 3 );
SDL_GL_SetAttribute( SDL_GL_CONTEXT_MINOR_VERSION, 3 );
SDL_GL_SetAttribute( SDL_GL_CONTEXT_PROFILE_MASK,
SDL_GL_CONTEXT_PROFILE_CORE );
SDL_GL_SetAttribute( SDL_GL_ACCELERATED_VISUAL, 1 );
/**
For some reason, on my Samsung Series 9, double buffering does not
work.
*/
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 0 );
SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 16 );
SDL_GL_SetAttribute( SDL_GL_STENCIL_SIZE, 8 );
//anti-aliasing
SDL_GL_SetAttribute( SDL_GL_MULTISAMPLEBUFFERS, 1 );
SDL_GL_SetAttribute( SDL_GL_MULTISAMPLESAMPLES, 4 );
_mainWindow = SDL_CreateWindow( "window",
SDL_WINDOWPOS_UNDEFINED,
SDL_WINDOWPOS_UNDEFINED,
800,
600,
SDL_WINDOW_OPENGL |
SDL_WINDOW_RESIZABLE |
SDL_WINDOW_MAXIMIZED |
SDL_WINDOW_SHOWN );
_glContext = SDL_GL_CreateContext( _mainWindow );
}
void App::_initializeOpenGL() {
//initialize GLEW
glewExperimental = GL_TRUE;
if ( glewInit() != GLEW_OK ) {
std::cerr << "glewInit failed." << std::endl;
std::exit( EXIT_FAILURE );
}
glEnable( GL_DEPTH_TEST );
//enable culling
glEnable( GL_CULL_FACE );
glCullFace( GL_BACK );
glDepthFunc( GL_LEQUAL );
glEnable( GL_TEXTURE_CUBE_MAP_SEAMLESS );
std::cout << "OpenGL version: " << glGetString( GL_VERSION ) << std::endl;
std::cout << "GLSL version: " << glGetString( GL_SHADING_LANGUAGE_VERSION ) << std::endl;
std::cout << "Vendor: " << glGetString( GL_VENDOR ) << std::endl;
std::cout << "Renderer: " << glGetString( GL_RENDERER ) << std::endl << std::endl;
//make sure OpenGL 3.3 is available
ASSERT( GLEW_VERSION_3_3, "OpenGL 3.3 API is not available" );
}
void App::_processEvents() {
SDL_Event event;
while ( SDL_PollEvent( &event ) ) {
if ( event.type == SDL_QUIT ) {
_running = false;
}
}
}
void App::_loop( float delta ) {
_stack.loop( delta );
}
void App::_render() {
_stack.render();
//SDL_GL_SwapWindow( _mainWindow );
}
The first thing I would check is the GPU drivers on the laptop. Make sure that the driver version matches the driver version on the desktop.
The second thing is to add error printing. From here:
window = SDL_CreateWindow("OpenGL Window", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, SDL_WINDOW_OPENGL);
if (!window) {
fprintf(stderr, "Couldn't create window: %s\n", SDL_GetError());
return;
}
context = SDL_GL_CreateContext(window);
if (!context) {
fprintf(stderr, "Couldn't create context: %s\n", SDL_GetError());
return;
}
The third thing to check is the requested buffers. Maybe the GPU or drivers do not support double buffering, a 16-bit depth buffer, or some other parameter that you requested. So play with the parameters in the _initializeSDL() function, and find the combination that works on your laptop.
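For example, after creating the context you could compare what you asked for with what you were actually given, roughly like this (a sketch; the attributes that matter on your hardware may differ):
//sketch: query the attributes actually granted after SDL_GL_CreateContext,
//to see whether the driver silently downgraded the request
int doubleBuffered = 0, depthSize = 0, msaaSamples = 0;
SDL_GL_GetAttribute( SDL_GL_DOUBLEBUFFER, &doubleBuffered );
SDL_GL_GetAttribute( SDL_GL_DEPTH_SIZE, &depthSize );
SDL_GL_GetAttribute( SDL_GL_MULTISAMPLESAMPLES, &msaaSamples );
std::cout << "double buffer: " << doubleBuffered
          << ", depth bits: " << depthSize
          << ", MSAA samples: " << msaaSamples << std::endl;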
This tutorial on SDL 2.0 uses code that returns from main without first destroying any of the resource pointers:
int main(int argc, char** argv){
if (SDL_Init(SDL_INIT_EVERYTHING) == -1){
std::cout << SDL_GetError() << std::endl;
return 1;
}
window = SDL_CreateWindow("Lesson 2", SDL_WINDOWPOS_CENTERED,
SDL_WINDOWPOS_CENTERED, SCREEN_WIDTH, SCREEN_HEIGHT, SDL_WINDOW_SHOWN);
if (window == nullptr){
std::cout << SDL_GetError() << std::endl;
return 2; //this
}
renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED
| SDL_RENDERER_PRESENTVSYNC);
if (renderer == nullptr){
std::cout << SDL_GetError() << std::endl;
return 3; //and this too
}
Should I tell my terminate function to DestroyRenderer, DestroyWindow, DestroyTexture, etc. before exiting?
It's the same question as 'should I free memory that I've allocated before quitting a program?'. Yes, if there are no bugs in the SDL/X11/GL/etc. finalization code, everything will be freed anyway. But I see no reason why you wouldn't want to do that yourself.
Of course, if you crash rather than exit, there is a good chance some of those things won't be done and, for example, you won't return the display to the native desktop resolution.
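A terminate function for the code above could be as simple as destroying things in the reverse order they were created, for example (a sketch; the helper name is hypothetical):
//hypothetical cleanup helper: release resources in reverse creation order
void terminate(SDL_Texture* texture, SDL_Renderer* renderer, SDL_Window* window)
{
    if (texture)  SDL_DestroyTexture(texture);
    if (renderer) SDL_DestroyRenderer(renderer);
    if (window)   SDL_DestroyWindow(window);
    SDL_Quit();
}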
I've personally had problems with an SDL_Texture that caused a memory leak while the program was running; the display of pictures just stopped after the program had leaked about 2 GB of RAM, when normally it uses 37 MB.
SDL_DestroyTexture(texture);
I just called this after every time I used the renderer to display a different picture, and the memory leak was gone.