SDL + SDL_ttf: Transparent blended text? - c++

I want to render an anti-aliased string on an SDL_Surface with a given alpha channel.
I figured out it is possible to render:
an anti-aliased string with the Blended variant of the string render method (i.e. TTF_RenderText_Blended), but then I can't make it transparent;
an anti-aliased string with the Shaded method, but then there is a solid background; the drawn surface as a whole can be made transparent, but the solid background remains, and passing a transparent background color is not possible;
a non-anti-aliased string with the Solid variant, which I can make transparent like I want, but it is not anti-aliased.
Thanks

I know I'm a bit late on this one :/
According to SDL documentation on SDL_SetAlpha:
Note that per-pixel and per-surface alpha cannot be combined; the per-pixel alpha is always used if available.
So regular SDL_BlitSurface/SDL_SetAlpha won't work here. But it can be done:
The only ways I can think of alpha-blending TTF_RenderText_Blended output are to use OpenGL, or adjust the alpha values of each pixel in the surface.
By adjusting per-pixel alpha values
You can do this by scaling the per-pixel alpha values from [0, 255] to a new range [0, alpha]:
// Changes a surface's alpha value, by altering per-pixel alpha if necessary.
void SetSurfaceAlpha (SDL_Surface *surface, Uint8 alpha)
{
    SDL_PixelFormat* fmt = surface->format;

    // If surface has no alpha channel, just set the surface alpha.
    if( fmt->Amask == 0 ) {
        SDL_SetAlpha( surface, SDL_SRCALPHA, alpha );
    }
    // Else change the alpha of each pixel.
    else {
        unsigned bpp = fmt->BytesPerPixel;
        // Scaling factor to map the per-pixel alpha from [0, 255] to [0, alpha].
        float scale = alpha / 255.0f;

        SDL_LockSurface(surface);
        for (int y = 0; y < surface->h; ++y) {
            for (int x = 0; x < surface->w; ++x) {
                // Get a pointer to the current pixel.
                Uint32* pixel_ptr = (Uint32 *)(
                    (Uint8 *)surface->pixels
                    + y * surface->pitch
                    + x * bpp
                );

                // Get the old pixel components.
                Uint8 r, g, b, a;
                SDL_GetRGBA( *pixel_ptr, fmt, &r, &g, &b, &a );

                // Set the pixel with the new, scaled alpha.
                *pixel_ptr = SDL_MapRGBA( fmt, r, g, b, (Uint8)(scale * a) );
            }
        }
        SDL_UnlockSurface(surface);
    }
}
I know it looks scary, but it's pretty straightforward. The key line is here:
*pixel_ptr = SDL_MapRGBA( fmt, r, g, b, scale * a );
You can use it like this:
text_surface = TTF_RenderText_Blended( font, "Hello World!", color );
SetSurfaceAlpha( text_surface, 128 );
With OpenGL
If using OpenGL, things are a lot easier. Assuming you convert the SDL_Surface from TTF_RenderText_Blended to a GL texture, you can just use:
glColor4f( 1.0, 1.0, 1.0, Alpha );
before you render it on a textured quad.
But don't forget to enable alpha blending first!
glEnable( GL_BLEND );
glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );
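For completeness, here is a minimal sketch of creating that GL texture from the SDL_Surface first. This assumes the surface's byte order matches GL_RGBA; real code should check surface->format and use GL_BGRA instead if needed:
GLuint tex;
glGenTextures( 1, &tex );
glBindTexture( GL_TEXTURE_2D, tex );
// No mipmaps are generated, so the min filter must not be a mipmap filter.
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA, surface->w, surface->h,
              0, GL_RGBA, GL_UNSIGNED_BYTE, surface->pixels );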

All you have to do is call SDL_SetAlpha on your SDL_Surface after creating the sprite on which the text is rendered, but before calling SDL_DisplayFormatAlpha on that sprite.
// Make the sprite with the text on it
SDL_Surface *swSprite = TTF_RenderText_Solid( font, text, textColor ) ;
// CALL SET ALPHA NOW
SDL_SetAlpha( swSprite, SDL_SRCALPHA, 128 ) ; // 50% opacity
// OK, NOW you can convert it to display format. I'm presuming
// you called `SDL_SetVideoMode` with the `SDL_HWSURFACE` flag set previously
SDL_Surface* hwSprite = SDL_DisplayFormatAlpha( swSprite ) ;
// If you invert the above 2 steps, it won't work.
// We don't need the software sprite anymore
SDL_FreeSurface( swSprite ) ;
// Now draw the hwSprite as normal
SDL_BlitSurface( hwSprite, NULL, screen, &spriteLocation );

The way to do this with a TTF in SDL2 is to use SDL_SetTextureAlphaMod(). Something like this:
SDL_Surface *surface;
SDL_Texture *texture;
SDL_Color color = {255, 255, 255, 128};
surface = TTF_RenderText_Blended_Wrapped(myFont, "HI!", color, 100);
// myFont, _renderer, and dest are pre-defined somewhere in the game
texture = SDL_CreateTextureFromSurface(_renderer, surface);
SDL_SetTextureAlphaMod(texture, color.a);
SDL_RenderCopy(_renderer, texture, NULL, &dest);
SDL_DestroyTexture(texture);
SDL_FreeSurface(surface);
/*...*/
SDL_RenderPresent(_renderer);
https://wiki.libsdl.org/SDL_SetTextureAlphaMod
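Since SDL_SetTextureAlphaMod only changes the modulation, not the texture itself, you can animate the transparency per frame once the texture exists. A rough fade-out sketch, assuming the texture and dest from above are kept alive until the fade finishes (i.e. SDL_DestroyTexture is deferred):
for (int a = 255; a >= 0; a -= 5) {
    SDL_SetRenderDrawColor(_renderer, 0, 0, 0, 255);
    SDL_RenderClear(_renderer);
    SDL_SetTextureAlphaMod(texture, (Uint8)a); // only the alpha modulation changes
    SDL_RenderCopy(_renderer, texture, NULL, &dest);
    SDL_RenderPresent(_renderer);
    SDL_Delay(16); // roughly 60 fps
}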

Why not use bitmapped fonts? You could build a PNG image with an alpha channel. I think SDL_ttf works with the same system: it builds an image and internally uses bitmapped fonts.
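If you go that route, a rough sketch with SDL_image, assuming a hypothetical fixed-cell 16x16 ASCII glyph atlas font.png and a target surface screen (both names are assumptions, not anything from the answer above):
SDL_Surface *atlas = IMG_Load("font.png"); // the PNG's alpha channel is preserved
int cw = atlas->w / 16, ch = atlas->h / 16; // cell size, assuming a 16x16 grid
char c = 'A';
SDL_Rect src = { (c % 16) * cw, (c / 16) * ch, cw, ch };
SDL_Rect dst = { 100, 100, cw, ch };
SDL_BlitSurface(atlas, &src, screen, &dst); // per-pixel alpha is used when blitting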

Related

How can I add a shadow to a texture in SDL2?

So, I'm making a 2D game and I have some textures. I would like some of them to drop a shadow; is there something like CSS's drop-shadow for SDL2?
Render the texture first, then render a slightly larger semi-transparent gray square slightly behind it. If you want rounded corners, use a shader that increases alpha as you get further from the corners.
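A minimal sketch of that idea with the SDL2 renderer API, where renderer, texture and texRect are placeholders for your own objects:
SDL_Rect shadow = { texRect.x - 5, texRect.y - 5, texRect.w + 10, texRect.h + 10 };
SDL_SetRenderDrawBlendMode(renderer, SDL_BLENDMODE_BLEND);
SDL_SetRenderDrawColor(renderer, 32, 32, 32, 128); // semi-transparent gray
SDL_RenderFillRect(renderer, &shadow);             // shadow first (behind)
SDL_RenderCopy(renderer, texture, NULL, &texRect); // then the texture on top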
Since no one mentioned it yet, here it is:
int SDL_SetTextureColorMod(SDL_Texture* texture,
                           Uint8 r,
                           Uint8 g,
                           Uint8 b)
https://wiki.libsdl.org/SDL_SetTextureColorMod
This function multiplies a texture's color channels when copying it to the SDL_Renderer*. Using it with 0 as the r, g and b arguments would make your texture pitch black but not affect the alpha, so the texture would keep its shape (like in the case of a transparent PNG). You just have to copy that shadow before (= behind) your texture, with a slight offset. You can also change the overall transparency of the shadow with SDL_SetTextureAlphaMod(SDL_Texture* texture, Uint8 a)
Just don't forget to set the values back to 255 when you're done.
Code example:
SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
SDL_RenderClear(renderer);
// [...] draw background, etc... here
SDL_Rect characterRect;
// [...] fill the rect however you want
SDL_Rect shadowRect(characterRect);
shadowRect.x += 30; // offsets the shadow
shadowRect.y += 30;
SDL_SetTextureColorMod(characterTexture, 0, 0, 0); // makes the texture black
SDL_SetTextureAlphaMod(characterTexture, 191); // makes the texture 3/4 of its original opacity
SDL_RenderCopy(renderer, characterTexture, NULL, &shadowRect); // draws the shadow
SDL_SetTextureColorMod(characterTexture, 255, 255, 255); // sets the values back to normal
SDL_SetTextureAlphaMod(characterTexture, 255);
SDL_RenderCopy(renderer, characterTexture, NULL, &characterRect); // draws the character
// [...] draw UI, additional elements, etc. here
SDL_RenderPresent(renderer);

SDL2 (C++) How to render an image smaller

I have finally started getting used to SDL 2's basic functions for rendering and I have stumbled across a problem that I believe the public might be able to answer. In my code I generate some text and, using some code from a tutorial (namely Lazy Foo's), am able to load the text as a texture. This texture now has a width and a height based on the font size and how much text was entered. Another function I use loads a 200x200 square made of fancy bordering that I wish to use as a menu. As an example, if the text texture is 100x160, I want the square to now render as perhaps a 120x180 image (essentially compressing it to a similar size as the text texture).
tl;dr:
I have 200x200 square.
I have 100x160 text texture
I want to render 200x200 square as a 120x160 square and render 100x160 text inside square.
***loadFromRenderedText takes a ttf font, a string, and a color (RGBA) to create an image texture based on the string -> generates own width/height
menuTextTexture.loadFromRenderedText(menuFont, "Info Item Skill Back",menuTextColor);
menuSize.x = 0;
menuSize.y = 0;
menuSize.w = menuTextTexture.getWidth() + boarderW;
menuSize.h = menuTextTexture.getHeight() + boarderW;
***menuSize is an SDL_Rect
menuBoxTexture.TextRender(XmenuRenderLocX, XmenuRenderLocY, &menuSize, 0, NULL, SDL_FLIP_NONE);
menuTextTexture.render(XmenuRenderLocX+boarderW, XmenuRenderLocY+boarderW);
TextRender and render do the same thing, except render uses a scaling factor to multiply the clip size to be bigger (which I leave blank -> clip is then NULL and the basic width/height are used). For TextRender, I specify the render dimensions by passing the menuSize SDL_Rect. This takes the 200x200 square and renders only 120x160 of it at (XmenuRenderLocX, XmenuRenderLocY)... thus essentially cropping the square, which is not what I want... I want to resize the square.
Any help will be greatly appreciated
Originally, I was using the provided LTexture::render function that was created for Lazy Foo's tutorial. See below code
void LTexture::render( int x, int y, SDL_Rect* clip, double angle, SDL_Point* center, SDL_RendererFlip flip )
{
    //Set rendering space and render to screen
    SDL_Rect renderQuad = { x, y, mWidth, mHeight };

    //Set clip rendering dimensions
    if( clip != NULL )
    {
        renderQuad.w = SCALE_SIZE * (clip->w);
        renderQuad.h = SCALE_SIZE * (clip->h);
    }
    else if( mTexture == NULL )
    {
        printf("Warning: Texture to Render is NULL!\n");
    }

    //Render to screen
    SDL_RenderCopyEx( gRenderer, mTexture, clip, &renderQuad, angle, center, flip );
}
But because I didn't fully understand until now how exactly the function renders, I wasn't actually telling it to render at a new dimension (except in the case where I magnify everything with SCALE_SIZE).
I made a new function for more control:
void LTexture::DefinedRender(SDL_Rect* Textureclip, SDL_Rect* renderLocSize, double angle, SDL_Point* center, SDL_RendererFlip flip)
{
    if( mTexture == NULL )
    {
        printf("Warning: Texture to Render is NULL!\n");
    }

    //Render to screen, letting the caller choose both the clip and the destination size
    SDL_RenderCopyEx( gRenderer, mTexture, Textureclip, renderLocSize, angle, center, flip );
}
And now everything works like I want it to.
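For reference, a hypothetical call using the names from the question, which scales the whole 200x200 border texture down to the menu rect instead of cropping it:
SDL_Rect borderDest = { XmenuRenderLocX, XmenuRenderLocY, menuSize.w, menuSize.h };
menuBoxTexture.DefinedRender(NULL, &borderDest, 0.0, NULL, SDL_FLIP_NONE); // NULL clip = whole texture, scaled to borderDest
menuTextTexture.render(XmenuRenderLocX + boarderW, XmenuRenderLocY + boarderW);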

How to get correct SourceOver alpha compositing in SDL with OpenGL

I am using an FBO (or "Render Texture") which has an alpha channel (32bpp ARGB) and clearing it with a color that is not fully opaque, for example (R=1, G=0, B=0, A=0) (i.e. completely transparent). Then I am rendering a translucent object, for example a rectangle with color (R=1, G=1, B=1, A=0.5), on top of that. (All values are normalized from 0 to 1.)
According to common sense, as well as imaging software such as GIMP and Photoshop, as well as several articles on Porter-Duff compositing, I would expect to get a texture that is
fully transparent outside of the rectangle
white (1.0, 1.0, 1.0) with 50 % opacity inside the rectangle.
Like so (expected-result image omitted):
Instead, the background color RGB values, which are (1.0, 0.0, 0.0), are weighted overall with (1 - SourceAlpha) instead of (DestAlpha * (1 - SourceAlpha)). The actual result looks like this (actual-result image omitted):
I have verified this behavior using OpenGL directly, using SDL's wrapper API, and using SFML's wrapper API. With SDL and SFML I have also saved the results as an image (with alpha channel) instead of merely rendering to the screen to be sure that it's not a problem with the final rendering step.
What do I need to do to produce the expected SourceOver result, either with SDL, SFML, or using OpenGL directly?
Some sources:
The W3 article on compositing specifies co = αs x Cs + αb x Cb x (1 – αs); the weight of Cb should be 0 if αb is 0, no matter what.
The English Wikipedia article shows the destination ("B") being weighted according to αb (as well as αs, indirectly).
The German Wikipedia article shows 50% transparency examples; clearly the transparent background's original RGB values do not interfere with either the green or the magenta source, and it also shows that the intersection is clearly asymmetric in favor of the element that is "on top".
There are also several questions on SO that seemingly deal with this at first glance, but I could not find anything that talks about this specific issue. People suggest different OpenGL blending functions, but the general consensus seems to be glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ONE, GL_ONE_MINUS_SRC_ALPHA), which is what both SDL and SFML use by default. I have also tried different combinations, with no success.
Another suggested thing is premultiplying the color with the destination alpha, since OpenGL can only have 1 factor, but it needs 2 factors for correct SourceOver. However, I cannot make sense of that at all. If I'm premultiplying (1, 0, 0) with the destination alpha value of, say, (0.1), I get (0.1, 0, 0) (as suggested here for example). Now I can tell OpenGL the factor GL_ONE_MINUS_SRC_ALPHA for this (and source with just GL_SRC_ALPHA), but then I'm effectively blending with black, which is incorrect. Though I am not a specialist on the topic, I put a fair amount of effort into trying to understand (and at least got to the point where I managed to program a working pure software implementation of every compositing mode). My understanding is that applying an alpha value of 0.1 "via premultiplication" to (1.0, 0.0, 0.0) is not at all the same as treating the alpha value correctly as the fourth color component.
Here is a minimal and complete example using SDL. It requires SDL2 to compile, and optionally SDL2_image if you want to save the result as a PNG.
// Define to save the result image as PNG (requires SDL2_image), undefine to instead display it in a window
#define SAVE_IMAGE_AS_PNG
#include <SDL.h>
#include <stdio.h>
#ifdef SAVE_IMAGE_AS_PNG
#include <SDL_image.h>
#endif
int main(int argc, char **argv)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0)
    {
        printf("init failed %s\n", SDL_GetError());
        return 1;
    }
#ifdef SAVE_IMAGE_AS_PNG
    if (IMG_Init(IMG_INIT_PNG) == 0)
    {
        printf("IMG init failed %s\n", IMG_GetError());
        return 1;
    }
#endif
    SDL_Window *window = SDL_CreateWindow("test", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 800, 600, SDL_WINDOW_OPENGL | SDL_WINDOW_SHOWN);
    if (window == NULL)
    {
        printf("window failed %s\n", SDL_GetError());
        return 1;
    }
    SDL_Renderer *renderer = SDL_CreateRenderer(window, 1, SDL_RENDERER_ACCELERATED | SDL_RENDERER_TARGETTEXTURE);
    if (renderer == NULL)
    {
        printf("renderer failed %s\n", SDL_GetError());
        return 1;
    }
    // This is the texture that we render on
    SDL_Texture *render_texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGBA8888, SDL_TEXTUREACCESS_TARGET, 300, 200);
    if (render_texture == NULL)
    {
        printf("rendertexture failed %s\n", SDL_GetError());
        return 1;
    }
    SDL_SetTextureBlendMode(render_texture, SDL_BLENDMODE_BLEND);
    SDL_SetRenderDrawBlendMode(renderer, SDL_BLENDMODE_BLEND);
    printf("init ok\n");
#ifdef SAVE_IMAGE_AS_PNG
    uint8_t *pixels = new uint8_t[300 * 200 * 4];
#endif
    while (1)
    {
        SDL_Event event;
        while (SDL_PollEvent(&event))
        {
            if (event.type == SDL_QUIT)
            {
                return 0;
            }
        }
        SDL_Rect rect;
        rect.x = 1;
        rect.y = 0;
        rect.w = 150;
        rect.h = 120;
        SDL_SetRenderTarget(renderer, render_texture);
        // Clear with (R=1, G=0, B=0, A=0) - fully transparent red
        SDL_SetRenderDrawColor(renderer, 255, 0, 0, 0);
        SDL_RenderClear(renderer);
        // Fill a rect with (R=1, G=1, B=1, A=0.5) - 50% white
        SDL_SetRenderDrawColor(renderer, 255, 255, 255, 127);
        SDL_RenderFillRect(renderer, &rect);
#ifdef SAVE_IMAGE_AS_PNG
        SDL_RenderReadPixels(renderer, NULL, SDL_PIXELFORMAT_ARGB8888, pixels, 4 * 300);
        // Hopefully the masks are fine for your system. Might need to randomly change those ff parts around.
        SDL_Surface *tmp_surface = SDL_CreateRGBSurfaceFrom(pixels, 300, 200, 32, 4 * 300, 0xff0000, 0xff00, 0xff, 0xff000000);
        if (tmp_surface == NULL)
        {
            printf("surface error %s\n", SDL_GetError());
            return 1;
        }
        if (IMG_SavePNG(tmp_surface, "t:\\sdltest.png") != 0)
        {
            printf("save image error %s\n", IMG_GetError());
            return 1;
        }
        printf("image saved successfully\n");
        return 0;
#endif
        SDL_SetRenderTarget(renderer, NULL);
        SDL_SetRenderDrawColor(renderer, 255, 255, 255, 255);
        SDL_RenderClear(renderer);
        SDL_RenderCopy(renderer, render_texture, NULL, NULL);
        SDL_RenderPresent(renderer);
        SDL_Delay(10);
    }
}
Thanks to @HolyBlackCat and @Rabbid76 I was able to shed some light on this entire thing. I hope this can help out other people who want to know about correct alpha blending and the details behind premultiplied alpha.
The basic problem is that correct "Source Over" alpha blending is actually not possible with OpenGL's built-in blend functionality (that is, glEnable(GL_BLEND), glBlendFunc[Separate](...), glBlendEquation[Separate](...)) (this is the same for D3D, by the way). The reason is the following:
When calculating the result color and alpha values of the blending operation (according to correct Source Over), one would have to use these functions:
Each RGB color values (normalized from 0 to 1):
RGB_f = ( alpha_s x RGB_s + alpha_d x RGB_d x (1 - alpha_s) ) / alpha_f
The alpha value (normalized from 0 to 1):
alpha_f = alpha_s + alpha_d x (1 - alpha_s)
Where
sub f is the result color/alpha,
sub s is the source (what is on top) color/alpha,
sub d is the destination (what is on the bottom) color/alpha,
alpha is the processed pixel's alpha value,
and RGB represents one of the pixel's red, green, or blue color values.
However, OpenGL can only handle a limited variety of additional factors to go with the source or destination values (RGB_s and RGB_d in the color equation) (see here), the relevant ones in this case being GL_ONE, GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA. We can specify the alpha formula correctly using those options, but the best we can do for RGB is:
RGB_f = alpha_s x RGB_s + RGB_d x (1 - alpha_s)
Which completely lacks the destination's alpha component (alpha_d). Note that this formula is equivalent to the correct one if alpha_d = 1. In other words, when rendering onto a framebuffer which has no alpha channel (such as the window's back buffer), this is fine; otherwise it will produce incorrect results.
To solve that problem and achieve correct alpha blending if alpha_d is NOT equal to 1, we need some gnarly workarounds. The original (first) formula above can be rewritten to
alpha_f x RGB_f = alpha_s x RGB_s + alpha_d x RGB_d x (1 - alpha_s)
if we accept the fact that the result color values will be too dark (they will have been multiplied by the result alpha value). This gets rid of the division. To get the correct RGB values, one would have to divide the result RGB values by the result alpha value; however, as it turns out, that conversion is usually never needed. We introduce a new symbol (pmaRGB) which denotes RGB values that are generally too dark because they have been multiplied by their corresponding pixel's alpha value.
pmaRGB_f = alpha_s x RGB_s + alpha_d x RGB_d x (1 - alpha_s)
We can also get rid of the problematic alpha_d factor by ensuring that ALL of the destination image's RGB values have been multiplied with their respective alpha values at some point. For example, if we wanted the background color (1.0, 0.5, 0, 0.3), we do not clear the framebuffer with that color, but with (0.3, 0.15, 0, 0.3) instead. In other words, we do one of the steps that the GPU would otherwise have to do in advance, because the GPU can only handle one factor. If we are rendering to an existing texture, we have to ensure that it was created with premultiplied alpha.

The result of our blending operations will then always be textures that also have premultiplied alpha, so we can keep rendering things onto them and always be sure that the destination has premultiplied alpha. If we are rendering to a semi-transparent texture, the semi-transparent pixels will always be too dark, depending on their alpha value (0 alpha meaning black, 1 alpha meaning the correct color). If we are rendering to a buffer which has no alpha channel (like the back buffer we use for actually displaying things), alpha_f is implicitly 1, so the premultiplied RGB values are equal to the correctly blended RGB values. This is the current formula:
pmaRGB_f = alpha_s x RGB_s + pmaRGB_d x (1 - alpha_s)
This formula is used when the source does not yet have premultiplied alpha (for example, if the source is a regular image that came out of an image processing program, with an alpha channel that is correctly blended without premultiplied alpha).
There is a reason we might want to get rid of alpha_s as well, and use premultiplied alpha for the source, too:
pmaRGB_f = pmaRGB_s + pmaRGB_d x (1 - alpha_s)
This formula needs to be taken if the source happens to have premultiplied alpha - because then the source pixel values are all pmaRGB instead of RGB. This is always going to be the case if we are rendering to an offscreen buffer with an alpha channel using the above method. It may also be reasonable to have all texture assets stored with premultiplied alpha by default, so that this formula can always be taken.
To recap, to calculate the alpha value, we always use this formula:
alpha_f = alpha_s + alpha_d x (1 - alpha_s)
, which corresponds to (GL_ONE, GL_ONE_MINUS_SRC_ALPHA). To calculate the RGB color values, if the source does not have premultiplied alpha applied to its RGB values, we use
pmaRGB_f = alpha_s x RGB_s + pmaRGB_d x (1 - alpha_s)
, which corresponds to (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA). If it does have premultiplied alpha applied to it, we use
pmaRGB_f = pmaRGB_s + pmaRGB_d x (1 - alpha_s)
, which corresponds to (GL_ONE, GL_ONE_MINUS_SRC_ALPHA).
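For convenience, this recap can be folded into a small helper (a sketch with my own naming, not part of any API):
// Select the blend state for correct SourceOver compositing onto a
// destination that holds premultiplied alpha.
void SetSourceOverBlend(bool source_is_premultiplied)
{
    glEnable(GL_BLEND);
    glBlendEquationSeparate(GL_FUNC_ADD, GL_FUNC_ADD);
    if (source_is_premultiplied)
        glBlendFuncSeparate(GL_ONE, GL_ONE_MINUS_SRC_ALPHA,
                            GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
    else
        glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,
                            GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
}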
What that practically means in OpenGL: When rendering to a framebuffer with alpha channel, switch to the correct blending function accordingly and make sure that the FBO's texture always has premultiplied alpha applied to its RGB values. Note that the correct blending function may potentially be different for each rendered object, according to whether or not the source has premultiplied alpha. Example: We want a background [1, 0, 0, 0.1], and render an object with color [1, 1, 1, 0.5] onto it.
// Clear with the premultiplied version of the real background color - the
// texture (which is always the destination in all blending operations) now
// complies with the "destination must always have premultiplied alpha" convention.
glClearColor(0.1f, 0.0f, 0.0f, 0.1f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

//
// Option 1 - the source either already has premultiplied alpha for whatever
// reason, or we can easily ensure that it has
//
{
    // Set the drawing color to the premultiplied version of the real drawing color.
    glColor4f(0.5f, 0.5f, 0.5f, 0.5f);
    // Set the blending equation according to "blending a source with premultiplied alpha".
    glEnable(GL_BLEND);
    glBlendFuncSeparate(GL_ONE, GL_ONE_MINUS_SRC_ALPHA, GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
    glBlendEquationSeparate(GL_FUNC_ADD, GL_FUNC_ADD);
}

//
// Option 2 - the source does not have premultiplied alpha
//
{
    // Set the drawing color to the original version of the real drawing color.
    glColor4f(1.0f, 1.0f, 1.0f, 0.5f);
    // Set the blending equation according to "blending a source without premultiplied alpha".
    glEnable(GL_BLEND);
    glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
    glBlendEquationSeparate(GL_FUNC_ADD, GL_FUNC_ADD);
}

// --- draw the thing ---

glDisable(GL_BLEND);
In either case, the resulting texture has premultiplied alpha. Here are two possibilities for what we might want to do with this texture:
If we want to export it as an image that is correctly alpha blended (as per the SourceOver definition), we need to get its RGBA data and explicitly divide each RGB value by the corresponding pixel's alpha value.
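A small sketch of that division step on tightly packed RGBA8 pixel data (a helper of my own, not a library function):
// Convert premultiplied-alpha RGBA8 pixels back to straight alpha.
void UnpremultiplyRGBA8(unsigned char *px, int pixel_count)
{
    for (int i = 0; i < pixel_count; ++i, px += 4)
    {
        unsigned a = px[3];
        if (a == 0)
            continue; // fully transparent, the RGB values are irrelevant
        for (int c = 0; c < 3; ++c)
        {
            unsigned v = px[c] * 255u / a;
            px[c] = (unsigned char)(v > 255 ? 255 : v); // clamp inconsistent data
        }
    }
}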
If we want to render it onto the backbuffer (whose background color shall be (0, 0, 0.5)), we proceed as we would normally (for this example, we additionally want to modulate the texture with (0, 0, 1, 0.8)):
// The back buffer has 100 % alpha.
glClearColor(0.0f, 0.0f, 0.5f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
// The color with which the texture is drawn - the modulating color's RGB values also need premultiplied alpha
glColor4f(0.0f, 0.0f, 0.8f, 0.8f);
// Set the blending equation according to "blending source with premultiplied alpha".
glEnable(GL_BLEND);
glBlendFuncSeparate(GL_ONE, GL_ONE_MINUS_SRC_ALPHA, GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
glBlendEquationSeparate(GL_FUNC_ADD, GL_FUNC_ADD);
// --- draw the texture ---
glDisable(GL_BLEND);
Technically, the result will have premultiplied alpha applied to it. However, because the result alpha will always be 1 for each pixel, the premultiplied RGB values are always equal to the correctly blended RGB values.
To achieve the same in SFML:
renderTexture.clear(sf::Color(25, 0, 0, 25));
sf::RectangleShape rect;
sf::RenderStates rs;

// Assuming the object has premultiplied alpha - or we can easily make sure that it has
{
    rs.blendMode = sf::BlendMode(sf::BlendMode::One, sf::BlendMode::OneMinusSrcAlpha);
    rect.setFillColor(sf::Color(127, 127, 127, 127));
}

// Assuming the object does not have premultiplied alpha
{
    rs.blendMode = sf::BlendAlpha; // This is a shortcut for the constructor with the correct blending parameters for this type
    rect.setFillColor(sf::Color(255, 255, 255, 127));
}

// --- align the rect ---

renderTexture.draw(rect, rs);
And likewise, to draw the renderTexture onto the backbuffer:
// premultiplied modulation color
renderTexture_sprite.setColor(sf::Color(0, 0, 204, 204));
window.clear(sf::Color(0, 0, 127, 255));
sf::RenderStates rs;
rs.blendMode = sf::BlendMode(sf::BlendMode::One, sf::BlendMode::OneMinusSrcAlpha);
window.draw(renderTexture_sprite, rs);
Unfortunately, this is not possible with SDL afaik (at least not on the GPU as part of the rendering process). Unlike SFML, which exposes fine-grained control over the blending mode to the user, SDL does not allow setting the individual blending function components; it only has SDL_BLENDMODE_BLEND, hardcoded as glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ONE, GL_ONE_MINUS_SRC_ALPHA).

DirectX alpha masking

I'm working on a game using DirectX 9. Here's what I'm trying to do:
After the scene is rendered, I want to render a few sprites on top of it: a black cover over the entire scene, and a few sprites which are masks showing where the cover should have holes. So far I have tried messing with the blend mode, but with no luck. My code setting it up looks like this:
D3DD->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
D3DD->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
D3DD->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
I guess the best way would be to multiply each sprite's alpha, but according to http://msdn.microsoft.com/en-us/library/windows/desktop/bb172508%28v=vs.85%29.aspx no such mode is supported. Is there another way to do this?
Edit: Following Nico Schertler's answer, here's the code I came up with:
LPDIRECT3DTEXTURE9 pRenderTexture;
LPDIRECT3DSURFACE9 pRenderSurface,
                   pBackBuffer;

// create texture
D3DD->CreateTexture(1024,
                    1024,
                    1,
                    D3DUSAGE_RENDERTARGET,
                    D3DFMT_R5G6B5,
                    D3DPOOL_DEFAULT,
                    &pRenderTexture,
                    NULL);
pRenderTexture->GetSurfaceLevel(0, &pRenderSurface);

// store old render target - back buffer
D3DD->GetRenderTarget(0, &pBackBuffer);

// set new render target - texture
D3DD->SetRenderTarget(0, pRenderSurface);

// clear texture to opaque black
D3DD->Clear(0,
            NULL,
            D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER,
            D3DCOLOR_XRGB(0, 0, 0),
            32.0f,
            0);
// set blending
D3DD->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_ZERO);
D3DD->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_ONE);
D3DD->SetRenderState(D3DRS_SRCBLENDALPHA, D3DBLEND_ZERO);
D3DD->SetRenderState(D3DRS_DESTBLENDALPHA, D3DBLEND_SRCALPHA);
D3DD->SetRenderState(D3DRS_SEPARATEALPHABLENDENABLE, TRUE);
//// now I render hole sprites the usual way
// restore back buffe as render target
D3DD->SetRenderTarget(0,pBackBuffer);
// restore blending
D3DD->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
D3DD->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
D3DD->SetRenderState(D3DRS_SRCBLENDALPHA, D3DBLEND_SRCALPHA);
D3DD->SetRenderState(D3DRS_DESTBLENDALPHA, D3DBLEND_INVSRCALPHA);
D3DD->SetRenderState(D3DRS_SEPARATEALPHABLENDENABLE, FALSE);
ulong color = ulong(-1);
Vertex2D v[4];
v[0] = Vertex2D(0, 0, 0);
v[1] = Vertex2D(1023, 0, 0);
v[3] = Vertex2D(1023, 1023, 0);
v[2] = Vertex2D(0, 1023, 0);
D3DD->SetTexture(0, pRenderTexture);
D3DD->SetFVF(D3DFVF_XYZRHW | D3DFVF_DIFFUSE | D3DFVF_TEX1);
D3DD->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, v, sizeof(Vertex2D));
D3DD->SetTexture(0, NULL);
// release used resources
pRenderTexture->Release();
pRenderSurface->Release();
pBackBuffer->Release();
Unfortunately, the app crashes when restoring the old render target. Any advice?
First, you should create the mask in a separate texture. Then you can add the holes as needed. Finally, draw the mask on the screen:
Initialize the texture
Clear it to opaque black
Using the following blend states:
D3DRS_SRCBLEND -> D3DBLEND_ZERO (hole's color does not matter)
D3DRS_DESTBLEND -> D3DBLEND_ONE (preserve the black color)
D3DRS_SRCBLENDALPHA -> D3DBLEND_ZERO
D3DRS_DESTBLENDALPHA -> D3DBLEND_SRCALPHA
D3DRS_SEPARATEALPHABLENDENABLE -> TRUE
Draw each hole sprite
Restore default blending (src_alpha / inv_src_alpha)
Render the texture as a sprite to the back buffer
The above blend state assumes that the hole sprites are opaque where there should be a hole. Then, the color is calculated by:
blended color = 0 * hole sprite color + 1 * background color
which should always be black.
And the alpha channel is calculated by:
blended alpha = 0 * hole sprite alpha + (1 - hole sprite alpha) * background alpha
So where the hole sprite is opaque, the blended alpha becomes 0. Where it is transparent, the blended alpha is the previous value. Values in between are blended.

How to load an icon with transparent background and display it correctly with C++ / OpenGL?

I am currently trying to load an icon which has a transparent background.
Then I create a bitmap from it and try to display the bits via glTexImage2D().
But the background of the icon never gets transparent :(
Here is some of my code:
DWORD dwBmpSize = 32 * 32 * 4;
byte* bmBits = new byte[dwBmpSize];
for (unsigned int i = 0; i < dwBmpSize; i += 4)
{
    bmBits[i]     = 255; // R
    bmBits[i + 1] = 0;   // G
    bmBits[i + 2] = 0;   // B
    bmBits[i + 3] = 255; // A
    // I always get a red square, no matter what value I fill into alpha
}

// create texture from bitmap
glTexImage2D(target, 0,
             GL_RGBA, 32, 32,
             0, GL_RGBA, GL_UNSIGNED_BYTE, bmBits);

delete[] bmBits;
Edit: I changed the code to be sure that my bits have an alpha channel.
Now I am filling a 32x32 pixel area with custom values to see what happens, instead of loading an icon. It still does not work!
What am I missing? Or is it just not possible?
You have to enable blending and set the correct blend mode.
glEnable (GL_BLEND);
glBlendFunc (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
Also, if you fill the entire alpha channel with 255, it will still be opaque. Try 128 or something instead.
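For example, the fill loop from the question with a semi-transparent alpha value (everything else unchanged, and blending enabled as above before drawing):
for (unsigned int i = 0; i < dwBmpSize; i += 4)
{
    bmBits[i]     = 255; // R
    bmBits[i + 1] = 0;   // G
    bmBits[i + 2] = 0;   // B
    bmBits[i + 3] = 128; // A: semi-transparent instead of fully opaque
}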