I ran into a strange problem recently. I wrote an application class which uses a very simple renderer to draw some models on screen. The camera is movable.
I ran the program on my laptop. At first, nothing was drawn on screen (the screen was being cleared to the correct color, however). Then I noticed that the screen would update itself if I clicked the decoration frame and moved the window: the models became visible, but would not move again until I clicked and dragged the decoration frame once more.
I tested my program on a desktop computer, and everything worked fine; the camera moved smoothly.
Eventually, I got the program to work on my laptop, but only by calling SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 0 ); and disabling buffer swapping.
Below is the main application class. In the execution loop, I call the application state stack to loop & render (the application state actually owns the renderer).
In case it is of any consequence, my laptop has Intel HD 4000 graphics, and the desktop has a GTX 670.
App::App() : _running( false ),
             _deltaTime( 0u ),
             _elapsedTime( 0u ),
             _mainWindow( nullptr ),
             _glContext(),
             _stack() {
    //ctor
}
App::~App() {
    SDL_GL_DeleteContext( _glContext );
    SDL_DestroyWindow( _mainWindow );
    SDL_Quit();
}
void App::execute() {
    _initialize();
    //17 ms per frame caps the loop at roughly 60 FPS
    static const float millisecondsPerFrame = 17;
    while ( _running ) {
        //get the delta time & update elapsed time
        uint32_t oldTime = _elapsedTime;
        _elapsedTime = SDL_GetTicks();
        _deltaTime = _elapsedTime - oldTime;
        _processEvents();
        _loop( _deltaTime / 1000.0f );
        _render();
        //apply possible state changes made to the stack
        _stack.applyPendingChanges();
        int usedTime = SDL_GetTicks() - int( _elapsedTime );
        //sleep the remainder of the cycle if we didn't use the entire update cycle
        if ( millisecondsPerFrame - usedTime > 0 ) {
            SDL_Delay( uint32_t( millisecondsPerFrame - usedTime ) );
        }
    }
}
void App::_initialize() {
    //initialize random number generator
    nge::srand();
    _running = true;
    _initializeSDL();
    _initializeOpenGL();
    SDL_GL_MakeCurrent( _mainWindow, _glContext );
    //attempt to set late swap tearing
    int res = SDL_GL_SetSwapInterval( -1 );
    //returns 0 on success
    //returns -1 if swap interval is not supported
    if ( res == -1 ) {
        std::cout << "App::_initialize> " << SDL_GetError() << "\n\n";
        SDL_GL_SetSwapInterval( 1 );
    }
    _stack.registerState<GameState>( AppStateID::Game );
    _stack.pushState( AppStateID::Game );
    _stack.applyPendingChanges();
}
void App::_initializeSDL() {
    SDL_Init( SDL_INIT_VIDEO | SDL_INIT_TIMER );
    SDL_GL_SetAttribute( SDL_GL_CONTEXT_MAJOR_VERSION, 3 );
    SDL_GL_SetAttribute( SDL_GL_CONTEXT_MINOR_VERSION, 3 );
    SDL_GL_SetAttribute( SDL_GL_CONTEXT_PROFILE_MASK,
                         SDL_GL_CONTEXT_PROFILE_CORE );
    SDL_GL_SetAttribute( SDL_GL_ACCELERATED_VISUAL, 1 );
    /**
        For some reason, on my Samsung Series 9, double buffering does not
        work.
    */
    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 0 );
    SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 16 );
    SDL_GL_SetAttribute( SDL_GL_STENCIL_SIZE, 8 );
    //anti-aliasing
    SDL_GL_SetAttribute( SDL_GL_MULTISAMPLEBUFFERS, 1 );
    SDL_GL_SetAttribute( SDL_GL_MULTISAMPLESAMPLES, 4 );
    _mainWindow = SDL_CreateWindow( "window",
                                    SDL_WINDOWPOS_UNDEFINED,
                                    SDL_WINDOWPOS_UNDEFINED,
                                    800,
                                    600,
                                    SDL_WINDOW_OPENGL |
                                    SDL_WINDOW_RESIZABLE |
                                    SDL_WINDOW_MAXIMIZED |
                                    SDL_WINDOW_SHOWN );
    _glContext = SDL_GL_CreateContext( _mainWindow );
}
void App::_initializeOpenGL() {
    //initialize GLEW
    glewExperimental = GL_TRUE;
    if ( glewInit() != GLEW_OK ) {
        std::cerr << "glewInit failed." << std::endl;
        std::exit( EXIT_FAILURE );
    }
    glEnable( GL_DEPTH_TEST );
    //enable culling
    glEnable( GL_CULL_FACE );
    glCullFace( GL_BACK );
    glDepthFunc( GL_LEQUAL );
    glEnable( GL_TEXTURE_CUBE_MAP_SEAMLESS );
    std::cout << "OpenGL version: " << glGetString( GL_VERSION ) << std::endl;
    std::cout << "GLSL version: " << glGetString( GL_SHADING_LANGUAGE_VERSION ) << std::endl;
    std::cout << "Vendor: " << glGetString( GL_VENDOR ) << std::endl;
    std::cout << "Renderer: " << glGetString( GL_RENDERER ) << std::endl << std::endl;
    //make sure OpenGL 3.3 is available
    ASSERT( GLEW_VERSION_3_3, "OpenGL 3.3 API is not available" );
}
void App::_processEvents() {
    SDL_Event event;
    while ( SDL_PollEvent( &event ) ) {
        if ( event.type == SDL_QUIT ) {
            _running = false;
        }
    }
}
void App::_loop( float delta ) {
    _stack.loop( delta );
}
void App::_render() {
    _stack.render();
    //SDL_GL_SwapWindow( _mainWindow );
}
The first thing I would check is the GPU drivers on the laptop. Make sure the driver version matches the driver version on the desktop.
The second thing is to add error printing. From here:
window = SDL_CreateWindow("OpenGL Window", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, SDL_WINDOW_OPENGL);
if (!window) {
    fprintf(stderr, "Couldn't create window: %s\n", SDL_GetError());
    return;
}
context = SDL_GL_CreateContext(window);
if (!context) {
    fprintf(stderr, "Couldn't create context: %s\n", SDL_GetError());
    return;
}
The third thing to check is the requested buffers. Maybe the GPU or driver does not support double buffering, a 16-bit depth buffer, or some other parameter that you requested. So play with the parameters in the _initializeSDL() function and find the combination that works on your laptop; you can also query what you were actually given, as in the sketch below.
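To see what the driver actually granted, SDL_GL_GetAttribute can be queried once the context exists. Here is a minimal sketch (illustrative variable names, not the asker's code) that could go at the end of _initialize():
//query what was actually granted; the values may differ from those
//requested with SDL_GL_SetAttribute before window/context creation
int doubleBuffered = 0, depthSize = 0, msaaSamples = 0;
SDL_GL_GetAttribute( SDL_GL_DOUBLEBUFFER, &doubleBuffered );
SDL_GL_GetAttribute( SDL_GL_DEPTH_SIZE, &depthSize );
SDL_GL_GetAttribute( SDL_GL_MULTISAMPLESAMPLES, &msaaSamples );
std::cout << "double buffer: " << doubleBuffered
          << ", depth bits: " << depthSize
          << ", MSAA samples: " << msaaSamples << std::endl;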
I have just set up the SDL2 framework on my Mac, and although the program compiles and runs successfully, the window is not responding (I copied code that creates a rectangle). I use Xcode and followed the tutorial from http://lazyfoo.net/tutorials/SDL/01_hello_SDL/mac/xcode/index.php step by step.
SDL_Window* window = NULL;
//The surface contained by the window
SDL_Surface* screenSurface = NULL;
//Initialize SDL
if( SDL_Init( SDL_INIT_VIDEO ) < 0 )
{
    printf( "SDL could not initialize! SDL_Error: %s\n", SDL_GetError() );
}
else
{
    //Create window
    window = SDL_CreateWindow( "SDL Tutorial", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, SCREEN_WIDTH, SCREEN_HEIGHT, SDL_WINDOW_SHOWN );
    if( window == NULL )
    {
        printf( "Window could not be created! SDL_Error: %s\n", SDL_GetError() );
    }
    else
    {
        //Get window surface
        screenSurface = SDL_GetWindowSurface( window );
        //Fill the surface white
        SDL_FillRect( screenSurface, NULL, SDL_MapRGB( screenSurface->format, 0xFF, 0xFF, 0xFF ) );
        //Update the surface
        SDL_UpdateWindowSurface( window );
        cout << " Ok" << endl;
        //Wait twenty seconds
        SDL_Delay( 20000 );
    }
}
//Destroy window
SDL_DestroyWindow( window );
//Quit SDL subsystems
SDL_Quit();
return 0;
Why could this problem be happening?
Thank you in advance.
In order for a program written with SDL to "respond" to the operating system, you need to give control back to SDL so it can process system messages and hand them back to you as SDL events (mouse events, keyboard events, and so on).
To do that, add a loop that uses SDL_PollEvent; it should look something like this:
while(true)
{
    SDL_Event e;
    while (SDL_PollEvent(&e))
    {
        // Decide what to do with events here
    }
    // Put the code that is executed every "frame" here.
    // By "frame" I mean any logic that runs whenever there are no app events left to process.
}
There are some special events, such as SDL_QuitEvent, that you need to handle to have a way to close your application. If you want to handle it, modify your code to look something like this:
bool running = true;
while(running)
{
    SDL_Event e;
    while (SDL_PollEvent(&e))
    {
        if(e.type == SDL_QUIT)
        {
            running = false; //a bare break here would only exit the inner poll loop
        }
        // Handle events
    }
    // "Frame" logic
}
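As a side note: if an application has no per-frame work and only needs to stay responsive, SDL_WaitEvent blocks until the next event instead of spinning. A minimal sketch of that variant:
SDL_Event e;
bool running = true;
//SDL_WaitEvent blocks until an event arrives (and returns 0 on error),
//so the loop body runs only when there is something to handle
while ( running && SDL_WaitEvent( &e ) )
{
    if ( e.type == SDL_QUIT )
    {
        running = false;
    }
}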
I have recently been breaking up a large source file into headers and smaller source files. I have never done this before, and largely because of scope, I had to rewrite some of the functions to accept pointers and then call them from main().
This is still a work in progress; however, I've run into a problem that I have been trying for hours to fix. The code is developed from Lazy Foo's excellent SDL tutorial series.
I have moved this function into a header/source pair, ingeniously titled misc.cpp/misc.h.
bool init(SDL_Window *gWindow, SDL_Renderer *gRenderer)
{
    //Initialization flag
    bool success = true;
    //Initialize SDL
    if( SDL_Init( SDL_INIT_VIDEO ) < 0 )
    {
        printf( "SDL could not initialize! SDL Error: %s\n", SDL_GetError() );
        success = false;
    }
    else
    {
        //Set texture filtering to linear
        if( !SDL_SetHint( SDL_HINT_RENDER_SCALE_QUALITY, "1" ) )
        {
            printf( "Warning: Linear texture filtering not enabled!" );
        }
        //Create window
        gWindow = SDL_CreateWindow( "Roulette Testing", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, SCREEN_WIDTH, SCREEN_HEIGHT, SDL_WINDOW_SHOWN );
        if( gWindow == NULL )
        {
            printf( "Window could not be created! SDL Error: %s\n", SDL_GetError() );
            success = false;
        }
        else
        {
            //Create renderer for window
            gRenderer = SDL_CreateRenderer( gWindow, -1, SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC );
            if( gRenderer == NULL )
            {
                printf( "Renderer could not be created! SDL Error: %s\n", SDL_GetError() );
                success = false;
            }
            else
            {
                //Initialize renderer color
                SDL_SetRenderDrawColor( gRenderer, 0xFF, 0xFF, 0xFF, 0xFF );
                //Initialize PNG loading
                int imgFlags = IMG_INIT_PNG;
                if( !( IMG_Init( imgFlags ) & imgFlags ) )
                {
                    printf( "SDL_image could not initialize! SDL_image Error: %s\n", IMG_GetError() );
                    success = false;
                }
                std::cout << gRenderer << "renderer not NULL" << std::endl; //This is where I am checking the value of gRenderer within the function.
            }
        }
    }
    std::cout << "initialization success" << std::endl;
    return success;
}
Now I am trying to call this function from my main() after having included the header and compiled the cpp into object code. This worked fine for the texture header file.
Unfortunately, when I pass an SDL_Renderer pointer to this init function, the pointer becomes local, and I'm left with a NULL pointer when the function returns. I have tried the "&" address-of operator, but it won't accept it!
init(gWindow, gRenderer);
I'm unsure why, but I'm guessing it's to do with the type.
Any ideas, guys?
In response to the comments: in my main I fire off a
cout << gRenderer << endl;
and it comes back NULL both before and after the function call.
I have highlighted in the function where I fire off another cout.
Make the function signature take a reference to a pointer, and the value returned from SDL_CreateRenderer will make it back to the calling function (assuming gRenderer is a pointer when passed in):
bool init(SDL_Window *gWindow, SDL_Renderer *&gRenderer)
It looks like you should also pass gWindow in the same way.
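To see why the original signature loses the value, here is a self-contained sketch (illustrative names, nothing to do with SDL) contrasting a pointer passed by value with a pointer passed by reference:
#include <cstdio>

static int value = 42;

//the pointer itself is copied; assigning to p changes only the local copy
void byValue( int *p )      { p = &value; }
//a reference to the pointer aliases the caller's variable,
//so the assignment is visible after the call - which is what init() needs
void byReference( int *&p ) { p = &value; }

int main()
{
    int *a = nullptr;
    int *b = nullptr;
    byValue( a );      //a is still nullptr
    byReference( b );  //b now points at value
    std::printf( "a=%p b=%p\n", (void*)a, (void*)b );
}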
Maybe it's just me being dumb, but I'm sure this should work.
#include <SDL2/SDL.h>
#include <GL/glew.h>
struct Display
{
    SDL_Window* window;
    SDL_GLContext context;
};

Display* init()
{
    SDL_Init( SDL_INIT_EVERYTHING );
    SDL_GL_SetAttribute( SDL_GL_CONTEXT_MAJOR_VERSION, 3 );
    SDL_GL_SetAttribute( SDL_GL_CONTEXT_MINOR_VERSION, 3 );
    SDL_GL_SetAttribute( SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE );
    SDL_Window* window = SDL_CreateWindow( "Ice Engine",
                                           800, 600,
                                           SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOW_OPENGL );
    if ( !window )
    {
        printf( "%s\n", SDL_GetError() );
        return nullptr;
    }
    SDL_GLContext context = SDL_GL_CreateContext( window );
    if ( !context )
    {
        printf( "%s\n", SDL_GetError() );
        return nullptr;
    }
    glewExperimental = GL_TRUE;
    if ( glewInit() != GLEW_OK )
        return nullptr;
    return new Display{ window, context };
}

int main( int argc, char** argv )
{
    Display* display = init();
    bool running = true;
    SDL_Event e;
    while( running )
    {
        while( SDL_PollEvent( &e ) )
            if ( e.type == SDL_QUIT )
                running = false;
        SDL_GL_SwapWindow( display->window );
    }
    delete display;
    SDL_Quit();
}
I probably shouldn't be using new and delete like this, but it was just a quick setup to get my project going. The problem is that it compiles just fine, but when I run it I get this error:
X Error of failed request: BadValue (integer parameter out of range for operation)
Major opcode of failed request: 1 (X_CreateWindow)
Value in failed request: 0x0
Serial number of failed request: 155
Current serial number in output stream: 168
I've tried without setting the OpenGL context version, but I just get the same error.
I tried switching to GLFW3 and it all works just fine: it creates a window and an OpenGL 3.3 core profile context. So it seems to be a problem with SDL2. I'm running Ubuntu 15.10, and I installed SDL2 from the command line with: sudo apt install libsdl2-dev.
You are calling SDL_CreateWindow incorrectly: you have mixed up the x, y and width, height arguments. SDL_WINDOWPOS_CENTERED expands to a special encoded flag value rather than a real coordinate, so passing it where a window size belongs is what makes X11 reject the request with BadValue. The right way would be:
SDL_Window* window = SDL_CreateWindow("Ice Engine",
SDL_WINDOWPOS_CENTERED,
SDL_WINDOWPOS_CENTERED,
800,
600,
SDL_WINDOW_OPENGL );
See the SDL_CreateWindow reference. Other than that, your code looks fine.
I found out that my program - which only draws two pictures on the screen - runs at 200 FPS. 200 FPS is a really low number at this point of development! It does not matter if I remove all textures and other data.
The slow parts of the code are SDL_RenderPresent and SDL_RenderClear. Without them, the program runs at about 300K FPS. Could you tell me where the problem could be?
Window and Renderer
_window = SDL_CreateWindow( "Window", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 800, 600, 0 );
if ( _window == NULL ) {
    std::cout << "CreateWindow problem: " << SDL_GetError() << std::endl;
}
_renderer = SDL_CreateRenderer( _window, -1, SDL_RENDERER_ACCELERATED );
if ( _renderer == NULL ) {
    std::cout << "CreateRenderer problem: " << SDL_GetError() << std::endl;
}
According to SDL_RendererInfo, only two flags are used - SDL_RENDERER_ACCELERATED and SDL_RENDERER_TARGETTEXTURE (see the query sketch below).
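For reference, a small sketch of how those flags can be queried (assuming _renderer was created as above):
SDL_RendererInfo info;
if ( SDL_GetRendererInfo( _renderer, &info ) == 0 ) {
    std::cout << "renderer: " << info.name
              << ", accelerated: " << ( ( info.flags & SDL_RENDERER_ACCELERATED ) != 0 )
              << ", vsync: " << ( ( info.flags & SDL_RENDERER_PRESENTVSYNC ) != 0 )
              << std::endl;
}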
I am writing a program that displays an animation that depends on the size of the display. In order to get this to work with multiple displays, I have an array of window_data objects:
struct window_data
{
    SDL_Rect bounds;
    SDL_Window *window;
};
and initialize these for each display:
int numdisplays = SDL_GetNumVideoDisplays();
std::vector< window_data > screens( numdisplays );
for( int i = 0; i < numdisplays; ++i )
{
    SDL_GetDisplayBounds( i, &( screens[ i ].bounds ) );
    screens[ i ].window
        = SDL_CreateWindow( "Display", screens[ i ].bounds.x,
                            screens[ i ].bounds.y, screens[ i ].bounds.w,
                            screens[ i ].bounds.h, SDL_WINDOW_FULLSCREEN );
}
This works fine as long as my mouse cursor is in the primary display, but if I start the program with the cursor in the secondary display, it will draw both windows in the secondary display, resulting in only the second window being visible. This behavior seems to depend only on the location of the cursor and not the terminal window from which I run the program.
I have verified that the same display numbers and bounds are found regardless of the cursor location, so I am perplexed by the variation in the program behavior. Is this the intended behavior of SDL2, or a bug? In either case, could anyone suggest a workaround?
EDIT: The mouse dependency shows up on Debian with XFCE. I have tried this on Windows as well, and there both windows are output on the second monitor regardless of the mouse position.
You can use the SDL_WINDOWPOS_UNDEFINED_DISPLAY macro in the position arguments to SDL_CreateWindow, in combination with the SDL_WINDOW_FULLSCREEN parameter.
Something like:
SDL_CreateWindow(
    "Window Name",
    SDL_WINDOWPOS_UNDEFINED_DISPLAY(display),
    SDL_WINDOWPOS_UNDEFINED_DISPLAY(display),
    0,
    0,
    SDL_WINDOW_FULLSCREEN );
The macro is not well documented, but you can see how it works clearly by reading the source code.
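At the time of writing, the relevant definitions in SDL_video.h look like this - the display index is simply packed into the low bits of a magic "undefined position" constant:
#define SDL_WINDOWPOS_UNDEFINED_MASK        0x1FFF0000u
#define SDL_WINDOWPOS_UNDEFINED_DISPLAY(X)  (SDL_WINDOWPOS_UNDEFINED_MASK|(X))
#define SDL_WINDOWPOS_UNDEFINED             SDL_WINDOWPOS_UNDEFINED_DISPLAY(0)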
Try SDL_WINDOW_BORDERLESS instead of SDL_WINDOW_FULLSCREEN.
I'm not sure if multiple fullscreen windows can meaningfully coexist, especially once you factor in input grabs.
Try something like this:
#include <SDL2/SDL.h>
#include <vector>
struct window_data
{
SDL_Rect bounds;
SDL_Window *window;
SDL_Renderer* renderer;
};
int main( int argc, char **argv )
{
if( SDL_Init( SDL_INIT_EVERYTHING ) < 0 )
return EXIT_FAILURE;
std::vector< window_data > screens( SDL_GetNumVideoDisplays() );
for( size_t i = 0; i < screens.size(); ++i )
{
window_data& screen = screens[ i ];
SDL_GetDisplayBounds( i, &screen.bounds );
screen.window = SDL_CreateWindow
(
"Display",
screen.bounds.x, screen.bounds.y,
screen.bounds.w, screen.bounds.h,
SDL_WINDOW_BORDERLESS
);
screen.renderer = SDL_CreateRenderer( screen.window, 0, SDL_RENDERER_ACCELERATED );
SDL_ShowWindow( screen.window );
}
bool running = true;
while( running )
{
SDL_Event ev;
while( SDL_PollEvent( &ev ) )
{
if( ev.type == SDL_QUIT ) running = false;
if( ev.type == SDL_KEYUP &&
ev.key.keysym.sym == SDLK_ESCAPE ) running = false;
}
for( size_t i = 0; i < screens.size(); ++i )
{
window_data& screen = screens[ i ];
SDL_SetRenderDrawColor( screen.renderer, 255, 0, 0, 255 );
SDL_RenderFillRect( screen.renderer, NULL );
SDL_RenderPresent( screen.renderer );
}
SDL_Delay( 33 );
}
for( size_t i = 0; i < screens.size(); ++i )
{
window_data& screen = screens[ i ];
SDL_DestroyRenderer( screen.renderer );
SDL_DestroyWindow( screen.window );
}
SDL_Quit();
return EXIT_SUCCESS;
}