SDL_GetWindowFlags() returns seemingly random values [closed] - c++

I need my SDL2 program to know whether a window is fullscreen, and I thought I could get that information using SDL_GetWindowFlags(). By default I initialize my window with two flags, SDL_WINDOW_SHOWN and SDL_WINDOW_BORDERLESS, which are equal to 4 and 16 respectively. So I expected the function to return 20, but instead I get 532, and sometimes 1556, which even changes to 532 at runtime after reinitializing the window a few times. 532 never changes to 1556 at runtime, however.
How do these flags work?
bool init( int windowflags )
{
    bool success = true;
    if( SDL_Init( SDL_INIT_VIDEO ) < 0 )
    {
        printf( "Video initialization failed: %s\n", SDL_GetError() );
        success = false;
    }
    else
    {
        gWindow = SDL_CreateWindow( "VIRGULE", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, WIN_W, WIN_H, SDL_WINDOW_SHOWN | windowflags );
        if( gWindow == NULL )
        {
            printf( "Window could not be created: %s\n", SDL_GetError() );
            success = false;
        }
        else
        {
            gRenderer = SDL_CreateRenderer( gWindow, -1, SDL_RENDERER_ACCELERATED | SDL_RENDERER_TARGETTEXTURE );
            if( gRenderer == NULL )
            {
                printf( "Renderer could not be created: %s\n", SDL_GetError() );
                success = false;
            }
            else
            {
                gTexture = SDL_CreateTexture( gRenderer, SDL_PIXELFORMAT_UNKNOWN, SDL_TEXTUREACCESS_TARGET, SCR_W, SCR_H );
                if( gTexture == NULL )
                {
                    printf( "Texture creation failed: %s\n", SDL_GetError() );
                    success = false;
                }
            }
        }
    }
    printf( "%u\n", SDL_GetWindowFlags( gWindow ) ); // this prints either 1556 or 532
    return success;
}

It looks like your flag value is changing based on the states of SDL_WINDOW_INPUT_FOCUS and SDL_WINDOW_MOUSE_FOCUS. But that doesn't matter: flag values change all the time, and you shouldn't care about the combined value. You only need to test the specific flag bit you are watching. The SDL_WINDOW_SHOWN and SDL_WINDOW_BORDERLESS flags are still set when the values are 532 and 1556, as you can see if you look at them in binary.
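For instance, decomposing the two values you observed into SDL2's flag bits (values taken from SDL_video.h):
 532 = 0x214 = SDL_WINDOW_SHOWN        (0x004)
             | SDL_WINDOW_BORDERLESS   (0x010)
             | SDL_WINDOW_INPUT_FOCUS  (0x200)
1556 = 0x614 = SDL_WINDOW_SHOWN        (0x004)
             | SDL_WINDOW_BORDERLESS   (0x010)
             | SDL_WINDOW_INPUT_FOCUS  (0x200)
             | SDL_WINDOW_MOUSE_FOCUS  (0x400)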
Just test the value of the flag bit you care about (SDL_GetWindowFlags returns a Uint32):
Uint32 flags = SDL_GetWindowFlags( gWindow );
bool window_shown      = ( flags & SDL_WINDOW_SHOWN )      != 0;
bool window_borderless = ( flags & SDL_WINDOW_BORDERLESS ) != 0;
bool window_fullscreen = ( flags & SDL_WINDOW_FULLSCREEN ) != 0;
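One caveat when checking for fullscreen (worth noting, though it goes beyond the question as asked): SDL_WINDOW_FULLSCREEN_DESKTOP is defined as SDL_WINDOW_FULLSCREEN with an extra bit set, so a plain & against SDL_WINDOW_FULLSCREEN also matches desktop-fullscreen windows. A small sketch that distinguishes the two modes:
Uint32 flags = SDL_GetWindowFlags( gWindow );
// True exclusive fullscreen: only the SDL_WINDOW_FULLSCREEN bit is set.
bool fullscreen_exclusive = ( flags & SDL_WINDOW_FULLSCREEN_DESKTOP ) == SDL_WINDOW_FULLSCREEN;
// Desktop ("fake") fullscreen: the whole SDL_WINDOW_FULLSCREEN_DESKTOP mask is set.
bool fullscreen_desktop   = ( flags & SDL_WINDOW_FULLSCREEN_DESKTOP ) == SDL_WINDOW_FULLSCREEN_DESKTOP;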
Here's a function you can use to see what flags are set based on the value:
#include <stdio.h>
#include <SDL.h>

void show_flags( Uint32 flags );

int main()
{
    show_flags( 20 );
    show_flags( 532 );
    show_flags( 1556 );
    return 0;
}

void show_flags( Uint32 flags )
{
    printf( "\nFLAGS ENABLED: ( %u )\n", flags );
    printf( "=======================\n" );
    if( flags & SDL_WINDOW_FULLSCREEN )         printf( "SDL_WINDOW_FULLSCREEN\n" );
    if( flags & SDL_WINDOW_OPENGL )             printf( "SDL_WINDOW_OPENGL\n" );
    if( flags & SDL_WINDOW_SHOWN )              printf( "SDL_WINDOW_SHOWN\n" );
    if( flags & SDL_WINDOW_HIDDEN )             printf( "SDL_WINDOW_HIDDEN\n" );
    if( flags & SDL_WINDOW_BORDERLESS )         printf( "SDL_WINDOW_BORDERLESS\n" );
    if( flags & SDL_WINDOW_RESIZABLE )          printf( "SDL_WINDOW_RESIZABLE\n" );
    if( flags & SDL_WINDOW_MINIMIZED )          printf( "SDL_WINDOW_MINIMIZED\n" );
    if( flags & SDL_WINDOW_MAXIMIZED )          printf( "SDL_WINDOW_MAXIMIZED\n" );
    if( flags & SDL_WINDOW_INPUT_GRABBED )      printf( "SDL_WINDOW_INPUT_GRABBED\n" );
    if( flags & SDL_WINDOW_INPUT_FOCUS )        printf( "SDL_WINDOW_INPUT_FOCUS\n" );
    if( flags & SDL_WINDOW_MOUSE_FOCUS )        printf( "SDL_WINDOW_MOUSE_FOCUS\n" );
    if( flags & SDL_WINDOW_FULLSCREEN_DESKTOP ) printf( "SDL_WINDOW_FULLSCREEN_DESKTOP\n" );
    if( flags & SDL_WINDOW_FOREIGN )            printf( "SDL_WINDOW_FOREIGN\n" );
}
More flags can be found here: https://wiki.libsdl.org/SDL_WindowFlags.
Output:
FLAGS ENABLED: ( 20 )
=======================
SDL_WINDOW_SHOWN
SDL_WINDOW_BORDERLESS

FLAGS ENABLED: ( 532 )
=======================
SDL_WINDOW_SHOWN
SDL_WINDOW_BORDERLESS
SDL_WINDOW_INPUT_FOCUS

FLAGS ENABLED: ( 1556 )
=======================
SDL_WINDOW_SHOWN
SDL_WINDOW_BORDERLESS
SDL_WINDOW_INPUT_FOCUS
SDL_WINDOW_MOUSE_FOCUS

Related

SDL terminates when creating renderer

When I run this code:
#include <iostream>
#include <SDL.h>
#include <stdexcept>
#include <GL/gl3w.h>

int main() try {
    if ( SDL_Init( SDL_INIT_VIDEO | SDL_INIT_TIMER | SDL_INIT_GAMECONTROLLER ) != 0 )
        throw std::runtime_error{ "Could not initialize sdl" };

    SDL_GL_SetAttribute( SDL_GL_CONTEXT_FLAGS, 0 );
    SDL_GL_SetAttribute( SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE );
    SDL_GL_SetAttribute( SDL_GL_CONTEXT_MAJOR_VERSION, 3 );
    SDL_GL_SetAttribute( SDL_GL_CONTEXT_MINOR_VERSION, 0 );

    // Create window with graphics context
    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
    SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 24 );
    SDL_GL_SetAttribute( SDL_GL_STENCIL_SIZE, 8 );

    auto window_flags = (SDL_WindowFlags) ( SDL_WINDOW_OPENGL | SDL_WINDOW_RESIZABLE | SDL_WINDOW_ALLOW_HIGHDPI );
    auto window = SDL_CreateWindow( "window", SDL_WINDOWPOS_CENTERED,
                                    SDL_WINDOWPOS_CENTERED, 1280, 720, window_flags );
    auto gl_context = SDL_GL_CreateContext( window );
    SDL_GL_MakeCurrent( window, gl_context );
    SDL_GL_SetSwapInterval( 1 ); // Enable vsync

    if ( gl3wInit() != 0 )
        throw std::runtime_error{ "Unable to initialize OpenGL loader" };

    auto renderer = SDL_CreateRenderer( window, -1, 0 );
    return 0;
}
catch ( std::exception& e ) {
    std::cerr << e.what() << "\n";
    return -1;
}
It produces the following output:
X Error of failed request: GLXBadDrawable
Major opcode of failed request: 151 (GLX)
Minor opcode of failed request: 5 (X_GLXMakeCurrent)
Serial number of failed request: 259
Current serial number in output stream: 259
Process finished with exit code 1
When I comment out the lines that set the OpenGL version, it runs fine.
What is causing this error and why does SDL_CreateRenderer terminate instead of returning a null pointer?
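Worth checking before anything else (an observation about the snippet, not a confirmed diagnosis): none of the return values of SDL_CreateWindow, SDL_GL_CreateContext, or SDL_GL_MakeCurrent are verified, so a failed window or context creation surfaces later as an X error instead of at the call site. A minimal defensive sketch, reusing the names from the snippet:
auto window = SDL_CreateWindow( "window", SDL_WINDOWPOS_CENTERED,
                                SDL_WINDOWPOS_CENTERED, 1280, 720, window_flags );
if ( window == nullptr )
    throw std::runtime_error{ SDL_GetError() };   // window creation failed
auto gl_context = SDL_GL_CreateContext( window );
if ( gl_context == nullptr )
    throw std::runtime_error{ SDL_GetError() };   // context creation failed
if ( SDL_GL_MakeCurrent( window, gl_context ) != 0 )
    throw std::runtime_error{ SDL_GetError() };   // could not make context current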

SDL TTF_OpenFont returns NULL without error

When I try to open a font, I always get NULL in return, and TTF_GetError doesn't provide any error message.
I use SDL2 (v2.0.9.0) and SDL_ttf (v2.0.15) under Windows.
Code Snippets:
SDL/TTF Init:
bool Graphics::Init()
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0)
    {
        printf("[SDL]\tInitialization error: %s\n", SDL_GetError());
        return false;
    }

    mWindowFlags |= SDL_WINDOW_SHOWN;
    if (SCREEN_FULLSCREEN) mWindowFlags |= SDL_WINDOW_FULLSCREEN;
    if (SCREEN_BORDERLESS) mWindowFlags |= SDL_WINDOW_BORDERLESS;

    mWindow = SDL_CreateWindow("Titel", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, SCREEN_WIDTH, SCREEN_HEIGHT, mWindowFlags);
    if (mWindow == NULL)
    {
        printf("[SDL]\tWindow creation error: %s\n", SDL_GetError());
        return false;
    }

    mRenderer = SDL_CreateRenderer(mWindow, -1, SDL_RENDERER_ACCELERATED);
    if (mRenderer == NULL)
    {
        printf("[SDL]\tRenderer creation error: %s\n", SDL_GetError());
        return false;
    }
    SDL_SetRenderDrawColor(mRenderer, 0xFF, 0xFF, 0xFF, 0xFF);

    // Flags for image format handle
    int flags = IMG_INIT_PNG;
    if (!(IMG_Init(flags) & flags))
    {
        printf("[SDL IMG]\tInitialization error: %s\n", IMG_GetError());
        return false;
    }

    if (TTF_Init() != 0)
    {
        printf("[SDL TTF]\tInitialization error: %s\n", TTF_GetError());
        return false;
    }

    mBackBuffer = SDL_GetWindowSurface(mWindow);
    return true;
}
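A side note on Init (an observation, not necessarily the cause of the font problem): SDL2 does not support mixing the renderer API with the window-surface API on the same window, so fetching mBackBuffer via SDL_GetWindowSurface after creating mRenderer is fragile. A sketch of staying on the renderer path for text, assuming a TTF_Font* named font and the mRenderer from this class:
SDL_Color white = { 0xFF, 0xFF, 0xFF, 0xFF };
SDL_Surface* textSurf = TTF_RenderText_Blended( font, "hello", white );
SDL_Texture* textTex  = SDL_CreateTextureFromSurface( mRenderer, textSurf );
SDL_FreeSurface( textSurf );                       // the surface is no longer needed
SDL_RenderCopy( mRenderer, textTex, NULL, NULL );  // draw the text via the renderer
SDL_DestroyTexture( textTex );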
TTF Font:
TTF_Font * AssetManager::GetFont(std::string filename, int size)
{
    std::string fullpath = SDL_GetBasePath();
    fullpath.append("Assets\\Fonts\\" + filename);
    std::string key = fullpath + (char)size;

    if (mFonts[key] == nullptr)
    {
        mFonts[key] == TTF_OpenFont(fullpath.c_str(), size);
        if (mFonts[key] == nullptr)
        {
            printf("[SDL TTF]\tFont loading error: Font:(%s) | FPath:(%s) | Error:(%s)\n", filename.c_str(), fullpath.c_str(), TTF_GetError());
        }
    }
    return mFonts[key];
}
I have already tried different TTF fonts, with no success.
If I pass a non-existent font, I get a normal error saying it couldn't be loaded.
Does anybody have an idea, or has anyone had a similar problem?
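One detail in GetFont stands out as a likely culprit (an observation from reading the snippet, not confirmed by the post): the line mFonts[key] == TTF_OpenFont(...) uses == (comparison) where = (assignment) was presumably intended, so the loaded font is discarded, the map entry stays nullptr, and TTF_GetError stays empty because TTF_OpenFont itself succeeded. The corrected line:
mFonts[key] = TTF_OpenFont(fullpath.c_str(), size);  // '=' assigns; '==' only compares and discards the font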

SDL automatically closes window in MinGW when using IMG_Load

So I have been following this set of tutorials to get some general knowledge of SDL before going on to make a game engine with it. However, once I reached this part of the tutorial, I started having issues.
Whenever I try to use IMG_Load or any related functions, the program instantly closes. I was doing just fine when I was using SDL_LoadBMP; the problem only started once I switched to SDL_image. Even when I copy the code from the tutorial exactly, it still wants to be disagreeable.
bool init()
{
    //Initialization flags
    bool success = true;

    //Initialize SDL
    if(SDL_Init(SDL_INIT_VIDEO) < 0)
    {
        printf("SDL failed to initialize! Details: %s\n", SDL_GetError());
        success = false;
    }
    else
    {
        //Set texture filtering to linear
        if( !SDL_SetHint( SDL_HINT_RENDER_SCALE_QUALITY, "1" ) )
        {
            printf( "Warning: Linear texture filtering not enabled!" );
        }

        //Create a window
        gWindow = SDL_CreateWindow("densipoint", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, SCREEN_WIDTH, SCREEN_HEIGHT, SDL_WINDOW_SHOWN);
        if(gWindow == 0)
        {
            printf("Window failed to display! Details: %s\n", SDL_GetError());
            success = false;
        }
        else
        {
            //Create renderer for window
            gRenderer = SDL_CreateRenderer(gWindow, -1, SDL_RENDERER_ACCELERATED);
            if(gRenderer == 0)
            {
                printf("Renderer failed to be created! Details: %s\n", SDL_GetError());
                success = false;
            }
            else
            {
                //Initialize renderer color
                SDL_SetRenderDrawColor(gRenderer, 0xff, 0xff, 0xff, 0xff);

                //Initialize PNG loading
                int imgFlags = IMG_INIT_PNG;
                if( !( IMG_Init( imgFlags ) & imgFlags ))
                {
                    printf("SDL_image failed to initialize! Details: %s\n", IMG_GetError());
                    success = false;
                }
            }
        }
    }
    return success;
}
Any ideas/advice? I'm using command-line MinGW with the g++ compiler.
EDIT: I have narrowed the problem down to issues with initializing SDL_image.
int imgFlags = IMG_INIT_PNG;
if( !( IMG_Init( imgFlags ) & imgFlags ))
{
    printf("SDL_image failed to initialize! Details: %s\n", IMG_GetError());
    success = false;
}
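A common cause of this on command-line MinGW (an assumption based on the symptoms, not confirmed by the post) is that SDL2_image.dll loads its codec dependencies at IMG_Init time; if libpng16-16.dll or zlib1.dll is not next to the executable, PNG support fails even though linking succeeded. IMG_Init returns the subset of the requested flags that actually initialized, so a per-flag check narrows it down:
int imgFlags = IMG_INIT_PNG;
int initted = IMG_Init( imgFlags );   // returns the flags that really initialized
if( ( initted & IMG_INIT_PNG ) == 0 )
{
    // On MinGW/Windows this often points to missing DLLs next to the exe,
    // e.g. libpng16-16.dll and zlib1.dll shipped with SDL2_image.
    printf( "PNG support failed to initialize! Details: %s\n", IMG_GetError() );
}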

Creating Separate Context for Each GPU while having one display monitor

I want to create one GL context for each GPU on Linux using GLX. As the NVIDIA slides show, it should be simple: just use ":0.0" for the first GPU and ":0.1" for the second one in the XOpenDisplay function. I have tried it, but it only works with ":0.0", not with ":0.1". I have two GPUs, a GTX 980 and a GTX 970, and as my xorg.conf shows, Xinerama is disabled. Furthermore, I only have one display monitor, and it is connected to the GTX 980.
Do you have any idea how to fix this, or what is missing?
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <GL/gl.h>
#include <GL/glx.h>

#define GLX_CONTEXT_MAJOR_VERSION_ARB 0x2091
#define GLX_CONTEXT_MINOR_VERSION_ARB 0x2092

typedef GLXContext (*glXCreateContextAttribsARBProc)(Display*, GLXFBConfig, GLXContext, Bool, const int*);

// Helper to check for extension string presence. Adapted from:
// http://www.opengl.org/resources/features/OGLextensions/
static bool isExtensionSupported(const char *extList, const char *extension)
{
    const char *start;
    const char *where, *terminator;

    /* Extension names should not have spaces. */
    where = strchr(extension, ' ');
    if (where || *extension == '\0')
        return false;

    /* It takes a bit of care to be fool-proof about parsing the
       OpenGL extensions string. Don't be fooled by sub-strings,
       etc. */
    for (start=extList;;) {
        where = strstr(start, extension);
        if (!where)
            break;
        terminator = where + strlen(extension);
        if ( where == start || *(where - 1) == ' ' )
            if ( *terminator == ' ' || *terminator == '\0' )
                return true;
        start = terminator;
    }
    return false;
}

static bool ctxErrorOccurred = false;
static int ctxErrorHandler( Display *dpy, XErrorEvent *ev )
{
    ctxErrorOccurred = true;
    return 0;
}

int main(int argc, char* argv[])
{
    Display *display = XOpenDisplay(":0.1");
    if (!display)
    {
        printf("Failed to open X display\n");
        exit(1);
    }

    // Get a matching FB config
    static int visual_attribs[] =
    {
        GLX_X_RENDERABLE    , True,
        GLX_DRAWABLE_TYPE   , GLX_WINDOW_BIT,
        GLX_RENDER_TYPE     , GLX_RGBA_BIT,
        GLX_X_VISUAL_TYPE   , GLX_TRUE_COLOR,
        GLX_RED_SIZE        , 8,
        GLX_GREEN_SIZE      , 8,
        GLX_BLUE_SIZE       , 8,
        GLX_ALPHA_SIZE      , 8,
        GLX_DEPTH_SIZE      , 24,
        GLX_STENCIL_SIZE    , 8,
        GLX_DOUBLEBUFFER    , True,
        //GLX_SAMPLE_BUFFERS  , 1,
        //GLX_SAMPLES         , 4,
        None
    };

    int glx_major, glx_minor;

    // FBConfigs were added in GLX version 1.3.
    if ( !glXQueryVersion( display, &glx_major, &glx_minor ) ||
         ( ( glx_major == 1 ) && ( glx_minor < 3 ) ) || ( glx_major < 1 ) )
    {
        printf("Invalid GLX version");
        exit(1);
    }

    printf( "Getting matching framebuffer configs\n" );
    int fbcount;
    GLXFBConfig* fbc = glXChooseFBConfig(display, DefaultScreen(display), visual_attribs, &fbcount);
    if (!fbc)
    {
        printf( "Failed to retrieve a framebuffer config\n" );
        exit(1);
    }
    printf( "Found %d matching FB configs.\n", fbcount );

    // Pick the FB config/visual with the most samples per pixel
    printf( "Getting XVisualInfos\n" );
    int best_fbc = -1, worst_fbc = -1, best_num_samp = -1, worst_num_samp = 999;

    int i;
    for (i=0; i<fbcount; ++i)
    {
        XVisualInfo *vi = glXGetVisualFromFBConfig( display, fbc[i] );
        if ( vi )
        {
            int samp_buf, samples;
            glXGetFBConfigAttrib( display, fbc[i], GLX_SAMPLE_BUFFERS, &samp_buf );
            glXGetFBConfigAttrib( display, fbc[i], GLX_SAMPLES       , &samples  );

            printf( "  Matching fbconfig %d, visual ID 0x%2x: SAMPLE_BUFFERS = %d,"
                    " SAMPLES = %d\n",
                    i, vi -> visualid, samp_buf, samples );

            if ( best_fbc < 0 || ( samp_buf && samples > best_num_samp ) )
                best_fbc = i, best_num_samp = samples;
            if ( worst_fbc < 0 || !samp_buf || samples < worst_num_samp )
                worst_fbc = i, worst_num_samp = samples;
        }
        XFree( vi );
    }

    GLXFBConfig bestFbc = fbc[ best_fbc ];

    // Be sure to free the FBConfig list allocated by glXChooseFBConfig()
    XFree( fbc );

    // Get a visual
    XVisualInfo *vi = glXGetVisualFromFBConfig( display, bestFbc );
    printf( "Chosen visual ID = 0x%x\n", vi->visualid );

    printf( "Creating colormap\n" );
    XSetWindowAttributes swa;
    Colormap cmap;
    swa.colormap = cmap = XCreateColormap( display,
                                           RootWindow( display, vi->screen ),
                                           vi->visual, AllocNone );
    swa.background_pixmap = None ;
    swa.border_pixel      = 0;
    swa.event_mask        = StructureNotifyMask;

    printf( "Creating window\n" );
    Window win = XCreateWindow( display, RootWindow( display, vi->screen ),
                                0, 0, 100, 100, 0, vi->depth, InputOutput,
                                vi->visual,
                                CWBorderPixel|CWColormap|CWEventMask, &swa );
    if ( !win )
    {
        printf( "Failed to create window.\n" );
        exit(1);
    }

    // Done with the visual info data
    XFree( vi );

    XStoreName( display, win, "GL 3.0 Window" );

    printf( "Mapping window\n" );
    XMapWindow( display, win );

    // Get the default screen's GLX extension list
    const char *glxExts = glXQueryExtensionsString( display,
                                                    DefaultScreen( display ) );

    // NOTE: It is not necessary to create or make current to a context before
    // calling glXGetProcAddressARB
    glXCreateContextAttribsARBProc glXCreateContextAttribsARB = 0;
    glXCreateContextAttribsARB = (glXCreateContextAttribsARBProc)
        glXGetProcAddressARB( (const GLubyte *) "glXCreateContextAttribsARB" );

    GLXContext ctx = 0;

    // Install an X error handler so the application won't exit if GL 3.0
    // context allocation fails.
    //
    // Note this error handler is global. All display connections in all threads
    // of a process use the same error handler, so be sure to guard against other
    // threads issuing X commands while this code is running.
    ctxErrorOccurred = false;
    int (*oldHandler)(Display*, XErrorEvent*) =
        XSetErrorHandler(&ctxErrorHandler);

    // Check for the GLX_ARB_create_context extension string and the function.
    // If either is not present, use GLX 1.3 context creation method.
    if ( !isExtensionSupported( glxExts, "GLX_ARB_create_context" ) ||
         !glXCreateContextAttribsARB )
    {
        printf( "glXCreateContextAttribsARB() not found"
                " ... using old-style GLX context\n" );
        ctx = glXCreateNewContext( display, bestFbc, GLX_RGBA_TYPE, 0, True );
    }
    // If it does, try to get a GL 3.0 context!
    else
    {
        int context_attribs[] =
        {
            GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
            GLX_CONTEXT_MINOR_VERSION_ARB, 0,
            //GLX_CONTEXT_FLAGS_ARB        , GLX_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
            None
        };

        printf( "Creating context\n" );
        ctx = glXCreateContextAttribsARB( display, bestFbc, 0,
                                          True, context_attribs );

        // Sync to ensure any errors generated are processed.
        XSync( display, False );
        if ( !ctxErrorOccurred && ctx )
            printf( "Created GL 3.0 context\n" );
        else
        {
            // Couldn't create GL 3.0 context. Fall back to old-style 2.x context.
            // When a context version below 3.0 is requested, implementations will
            // return the newest context version compatible with OpenGL versions less
            // than version 3.0.
            // GLX_CONTEXT_MAJOR_VERSION_ARB = 1
            context_attribs[1] = 1;
            // GLX_CONTEXT_MINOR_VERSION_ARB = 0
            context_attribs[3] = 0;

            ctxErrorOccurred = false;

            printf( "Failed to create GL 3.0 context"
                    " ... using old-style GLX context\n" );
            ctx = glXCreateContextAttribsARB( display, bestFbc, 0,
                                              True, context_attribs );
        }
    }

    // Sync to ensure any errors generated are processed.
    XSync( display, False );

    // Restore the original error handler
    XSetErrorHandler( oldHandler );

    if ( ctxErrorOccurred || !ctx )
    {
        printf( "Failed to create an OpenGL context\n" );
        exit(1);
    }

    // Verifying that context is a direct context
    if ( ! glXIsDirect ( display, ctx ) )
    {
        printf( "Indirect GLX rendering context obtained\n" );
    }
    else
    {
        printf( "Direct GLX rendering context obtained\n" );
    }

    printf( "Making context current\n" );
    glXMakeCurrent( display, win, ctx );

    glClearColor( 0, 0.5, 1, 1 );
    glClear( GL_COLOR_BUFFER_BIT );
    glXSwapBuffers ( display, win );

    sleep( 1 );

    glClearColor ( 1, 0.5, 0, 1 );
    glClear ( GL_COLOR_BUFFER_BIT );
    glXSwapBuffers ( display, win );

    sleep( 1 );

    glXMakeCurrent( display, 0, 0 );
    glXDestroyContext( display, ctx );

    XDestroyWindow( display, win );
    XFreeColormap( display, cmap );
    XCloseDisplay( display );

    return 0;
}
The reason it works with ":0.0" but not with ":0.1" is that these are the X display and screen numbers: ":0.0" means the first screen on the first display, and ":0.1" means the second screen on the first display.
These numbers select which monitor you wish to display the window on, not which GPU you wish to use. As you have only one monitor attached, you only have one screen, so ":0.1" fails.
I believe the slides expect you to have two or more monitors attached, each driven by a different GPU.
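A quick way to confirm this (a small sketch, not part of the original answer) is to ask the X server how many screens it actually exposes before hard-coding a screen number; ScreenCount is a standard Xlib macro:
Display *dpy = XOpenDisplay(":0");
if (dpy)
{
    // With one monitor and no extra X screens configured, this prints 1,
    // which is why ":0.1" (screen index 1) fails to open.
    printf("Display :0 exposes %d screen(s)\n", ScreenCount(dpy));
    XCloseDisplay(dpy);
}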

Images and text not showing in SDL under Mac OS X

I managed to compile, bundle, and load resources under Xcode 4.3 with SDL 1.2.15.
I know the resources are loading correctly, because the file handles are not null and no error is thrown.
I successfully load PNGs and TTFs, obtain and crop surfaces, and blit them.
But when I flip, the only thing I see are the lines I drew using SDL_Draw.
I will put in some bits of code; as I'm trying to keep an engine-ish structure, the code is anything but together.
Initialization:
void CEngine::Init() {
    // Register SDL_Quit to be called at exit; makes sure things are cleaned up when we quit.
    atexit( SDL_Quit );

    // Initialize SDL's subsystems - in this case, only video.
    if ( SDL_Init( SDL_INIT_EVERYTHING ) < 0 ) {
        fprintf( stderr, "Unable to init SDL: %s\n", SDL_GetError() );
        exit( 1 );
    }

    // Attempt to create a window with the specified height and width.
    SetSize( m_iWidth, m_iHeight );

    // If we fail, return error.
    if ( m_pScreen == NULL ) {
        fprintf( stderr, "Unable to set up video: %s\n", SDL_GetError() );
        exit( 1 );
    }

    AdditionalInit();
}
and
void CTileEngine::AdditionalInit() {
    SetTitle( "TileEngine - Loading..." );

    PrintDebug("Initializing SDL_Image");
    int flags = IMG_INIT_PNG;
    int initted = IMG_Init( flags );
    if( ( initted & flags ) != flags ) {
        PrintDebug("IMG_Init: Failed to init required image support!");
        PrintDebug(IMG_GetError());
        // handle error
    }

    PrintDebug("Initializing SDL_TTF");
    if( TTF_Init() == -1 ) {
        PrintDebug("TTF_Init: Failed to init required ttf support!");
        PrintDebug(TTF_GetError());
    }

    PrintDebug("Loading fonts");
    font = TTF_OpenFont( OSXFileManager::GetResourcePath("Roboto-Regular.ttf"), 28 );
    if( !font ) {
        PrintDebug("Error loading fonts");
        PrintDebug(TTF_GetError());
    }

    g_pGame = new CGame;
    LoadGame( OSXFileManager::GetResourcePath( "test", "tmx") );

    SetTitle( "TileEngine" );
    PrintDebug("Finished AdditionalInit()");
}
Main draw method
void CEngine::DoRender(){
    ++m_iFPSCounter;
    if ( m_iFPSTickCounter >= 1000 ) {
        m_iCurrentFPS = m_iFPSCounter;
        m_iFPSCounter = 0;
        m_iFPSTickCounter = 0;
    }

    SDL_FillRect( m_pScreen, 0, SDL_MapRGB( m_pScreen->format, 0, 0, 0 ) );

    // Lock surface if needed
    if ( SDL_MUSTLOCK( m_pScreen ) ){
        if ( SDL_LockSurface( m_pScreen ) < 0 ){
            return;
        }
    }

    Render( GetSurface() );

    // Render FPS
    SDL_Color fpsColor = { 255, 255, 255 };
    string fpsMessage = "FPS: ";
    fpsMessage.append( SSTR(m_iCurrentFPS) );
    SDL_Surface* fps = TTF_RenderText_Solid(font, fpsMessage.c_str(), fpsColor);
    if( fps ) {
        SDL_Rect destRect;
        destRect.x = pDestSurface->w - fps->w;
        destRect.y = pDestSurface->h - fps->h;
        destRect.w = fps->w;
        destRect.h = fps->h;
        SDL_BlitSurface(fps, &fps->clip_rect, pDestSurface, &destRect);
        SDL_FreeSurface(fps);
    }

    // Unlock if needed
    if ( SDL_MUSTLOCK( m_pScreen ) )
        SDL_UnlockSurface( m_pScreen );

    // Tell SDL to update the whole gScreen
    SDL_Flip( m_pScreen );
}
Image file loading
bool CEntity::VLoadImageFromFile( const string& sFile) {
    if ( m_pSurface != 0 ){
        SDL_FreeSurface( m_pSurface );
    }

    string nFile = string(OSXFileManager::APPNAME) + OSXFileManager::RESOURCEDIR + sFile;

    SDL_Surface *pTempSurface;
    pTempSurface = IMG_Load( nFile.c_str() );
    m_sImage = sFile;

    if ( pTempSurface == 0 ){
        char czError[256];
        sprintf( czError, "Image '%s' could not be opened. Reason: %s", nFile.c_str(), IMG_GetError() );
        fprintf( stderr, "\nERROR: %s", czError );
        return false;
    } else {
        pTempSurface = SDL_DisplayFormatAlpha(pTempSurface);
    }

    m_pSurface = pTempSurface;
    return true;
}
Entity draw method
void CEntity::VRender( SDL_Surface *pDestSurface ) {
    if ( ( m_pSurface == 0 ) || ( m_bVisible == false) || ( m_iAlpha == 0 ) ){
        // If the surface is invalid or it's 100% transparent.
        return;
    }

    SDL_Rect SDestRect;
    SDestRect.x = m_iPosX;
    SDestRect.y = m_iPosY;
    SDestRect.w = m_pSurface->w;
    SDestRect.h = m_pSurface->h;

    if ( m_iAlpha != 255 )
        SDL_SetAlpha( m_pSurface, SDL_SRCALPHA, m_iAlpha );

    SDL_BlitSurface( m_pSurface, &m_pSurface->clip_rect, pDestSurface, &SDestRect );
}
I have checked and debugged this a million times and I don't get what's wrong here. As I said before, file loading seems to be OK.
But this part
void CTile::RenderGrid( SDL_Surface* pDestSurface ) {
    Uint32 m_GridColor = SDL_MapRGB( pDestSurface->format, 0xFF, 0xFF, 0xFF );
    Draw_Rect(pDestSurface, GetPosX(), GetPosY(), GetWidth(), GetHeight(), m_GridColor);
}
works like a charm.
I found out what was happening. It turns out that, as of SDL 1.1.8, SDL_LockSurface calls are recursive, so each lock must be paired with its own unlock. That was not the case the last time I used SDL, so I was not aware of it. Simply matching every lock with an unlock did the job.
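A minimal sketch of the pairing described above (assuming a surface pointer named screen; since locks nest, one unmatched SDL_LockSurface anywhere leaves the surface locked for good):
if ( SDL_MUSTLOCK( screen ) ) {
    if ( SDL_LockSurface( screen ) < 0 )
        return;                      // could not lock; skip this frame
}
// ... direct pixel access goes here ...
if ( SDL_MUSTLOCK( screen ) )
    SDL_UnlockSurface( screen );     // every successful lock needs exactly one unlock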