glBlendFunc() with 32-bit RGBA textures - opengl

I have a texture that is semi-transparent with varying opacity at different locations. I have the main texture bitmap, and a mask bitmap. When the program executes, the alpha values from the mask bitmap are loaded into the alpha values of the main texture bitmap. The areas that I want to be transparent have a value of 255 alpha, and the areas that I want to remain totally opaque have values of 0 alpha. There are in-between values also for mid-transparency.
I have tried all manner of glBlendFunc() settings, but it is either completely invisible or it acts on the RGB colors of the source texture.

Typically in OpenGL, an alpha of 0 means transparent and 255 means opaque, which is the opposite of what you have.
So something like:
glBlendFunc(GL_ONE_MINUS_SRC_ALPHA, GL_SRC_ALPHA);
should work.
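For completeness, a minimal sketch of how that might look in context, assuming fixed-function texturing; the texture id and draw call are placeholders, not from the question:
#include <GL/gl.h>

void drawWithInvertedAlpha(GLuint textureId)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, textureId);

    glEnable(GL_BLEND);
    /* The texture stores 255 = transparent, so the source is weighted
       by (1 - alpha) and the destination by alpha. */
    glBlendFunc(GL_ONE_MINUS_SRC_ALPHA, GL_SRC_ALPHA);

    /* ... draw the textured quad as before ... */
}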

I had the same issue. I was using SDL_image to load my images, and I realized that when converting the image I was using SDL_DisplayFormat instead of SDL_DisplayFormatAlpha. Switching to the latter solved my issue.
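For what it's worth, a rough sketch of that loading path, assuming SDL 1.2 with the SDL_image library (the wrapper function and file handling are just for illustration):
#include "SDL.h"
#include "SDL_image.h"

SDL_Surface *loadWithAlpha(const char *path)
{
    SDL_Surface *loaded = IMG_Load(path);
    if (!loaded)
        return NULL;

    /* SDL_DisplayFormatAlpha keeps the alpha channel in the converted
       surface; SDL_DisplayFormat would discard it. */
    SDL_Surface *converted = SDL_DisplayFormatAlpha(loaded);
    SDL_FreeSurface(loaded);
    return converted;
}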

Related

Why is RGBA making the PNG image blackish?

I am trying to capture a PNG image with a transparent background. I have set GL_RGBA as the format in glReadPixels, but the output PNG looks a little blackish, or has oversaturated colors. If the background is not transparent, that is, if I use the GL_RGB format in glReadPixels, the expected image is captured.
Note: in both cases I am capturing a translucent (partially transparent) shaded cube. If the cube is completely opaque, the RGBA format works fine.
Any ideas as to why this is happening with a transparent background?
[Image: blackish capture with the RGBA format]
[Image: capture with the RGB format]
The cube looks darker because it's semitransparent, and whatever you use to view the image blends the semitransparent parts of it with a black background.
You might argue that the cube in the picture shouldn't be semitransparent since it's drawn on top of a completely opaque background, but the problem is that the widely-used
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
blending mode (which you seem to use as well) is known to produce incorrect alpha values when used to draw semitransparent objects.
Usually you can't see it because alpha is discarded when drawing to the default framebuffer, but it becomes prominent when inspecting outputs of glReadPixels.
As you noticed, to solve it you can simply discard the alpha component.
But if you for some reason need to have a proper blending without those problems, you need
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
Note that for this function to work, both the source and destination images have to be in premultiplied-alpha format. That is, you need to do color.rgb *= color.a
as the last step in your shaders.
The inverse operation (color.rgb /= color.a) can be used to convert an image back to the regular format, but if your images are completely opaque, this step is unnecessary.
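As a minimal sketch of the setup described above (the premultiplication itself happens at the end of your fragment shader, exactly as written; width, height and pixels are placeholders):
/* Premultiplied-alpha blending: the shader has already done
   color.rgb *= color.a before writing gl_FragColor. */
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);

/* The readback then contains meaningful alpha that a PNG writer can use. */
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);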

Is there a way to make bmps transparent?

(Using WinApi) Is there a way to:
Make transparent pixels?
Somehow, instead of using transparency, have the image dynamically pick up the background colors and textures and fill certain colors with them? For example: if I had a video game sprite whose background color was white, could I somehow take those white pixels and fill them with the background colors/textures?
If you create a 32-bit bitmap, 24 bits of each pixel are used for RGB values and the extra 8 bits are used for an alpha channel. Just set the alpha to 0 for full transparency.
When creating a bitmap that uses 24-bit or smaller pixels, the transparent color is usually indicated by the pixel in the lower-left corner of the bitmap.
Either way, creating a transparent bitmap is only half the equation: you then have to render the bitmap in a transparent manner. The Win32 API has TransparentBlt() and AlphaBlend() functions for that purpose, and there are plenty of online tutorials and blogs that explain how to use them.
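Very roughly, the two calls look like this; the device contexts and sizes below are placeholders, and note that AlphaBlend expects premultiplied pixel data when AC_SRC_ALPHA is used:
#include <windows.h>
#pragma comment(lib, "msimg32.lib")   /* TransparentBlt and AlphaBlend live here */

void blitExamples(HDC hdcDest, HDC hdcSrc, int w, int h)
{
    /* Color-key transparency for 24-bit (or smaller) bitmaps:
       every white source pixel is skipped. */
    TransparentBlt(hdcDest, 0, 0, w, h,
                   hdcSrc,  0, 0, w, h,
                   RGB(255, 255, 255));

    /* Per-pixel alpha for 32-bit bitmaps. */
    BLENDFUNCTION bf = { AC_SRC_OVER, 0, 255, AC_SRC_ALPHA };
    AlphaBlend(hdcDest, 0, 0, w, h,
               hdcSrc,  0, 0, w, h,
               bf);
}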

Copying SDL_Surfaces with Alpha Channels

I've run into issues preserving the alpha channels of surfaces being copied/clip blitted (blitting sections of a surface onto a smaller surface, they're spritesheets). I've tried various solutions, but the end result is that any surface that's supposed to have transparency ends up becoming fully opaque (alpha mask becomes white).
So my question is, how does one copy one RGBA SDL_Surface to another new surface (also RGBA), including the alpha channel? And if it's any different, how does one copy a section of an RGBA surface, to a new RGBA surface (the same size of the clipped portion of the source surface), ala tilesheet blitting.
It seems that SDL_BlitSurface blends the alpha channels, so when, for example, I want to copy a tile from my tilesheet surface to a new surface (which is of course blank; I'm assuming SDL fills surfaces with black or white by default), it ends up losing its alpha mask, so that when that tile is finally blitted to the screen, it doesn't blend with whatever is on the screen.
SDL_DisplayFormatAlpha works great to copy surfaces with an alpha mask, but it doesn't take clip parameters, it's only intended to copy the entire surface, not a portion of it, hence my problem.
If anyone is still wondering after all these years:
Before blitting the surface, you need to make sure that the blending mode of the source (which is an SDL_Surface) is set to SDL_BLENDMODE_NONE, as described in the documentation for SDL_SetSurfaceBlendMode(). It should look something like this:
SDL_SetSurfaceBlendMode(source, SDL_BLENDMODE_NONE);
SDL_BlitSurface(source, sourceRect, destination, destinationRect);
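Putting that together for the clipped-copy case, a sketch assuming SDL2 (SDL_CreateRGBSurfaceWithFormat needs 2.0.5 or later; adapt the surface creation to your setup):
#include "SDL.h"

/* Copy one tile out of a spritesheet into a fresh RGBA surface,
   keeping the alpha channel byte-for-byte. */
SDL_Surface *copyTile(SDL_Surface *sheet, SDL_Rect clip)
{
    SDL_Surface *tile = SDL_CreateRGBSurfaceWithFormat(
        0, clip.w, clip.h, 32, SDL_PIXELFORMAT_RGBA32);
    if (!tile)
        return NULL;

    /* BLENDMODE_NONE copies the source pixels, alpha included,
       instead of blending them onto the destination. */
    SDL_SetSurfaceBlendMode(sheet, SDL_BLENDMODE_NONE);
    SDL_BlitSurface(sheet, &clip, tile, NULL);
    return tile;
}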
I had this problem before and have not come to an official answer yet.
However, I think the only way to do it will be to write your own copy function.
http://www.libsdl.org/docs/html/sdlpixelformat.html
This page will help you understand how SDL_Surface stores color information. Note that there is a big difference between pixel formats above and below 8 bits per pixel.
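If you do end up writing your own copy for 32-bit RGBA surfaces, it can be as simple as a row-by-row memcpy between surfaces that share the same pixel format; a sketch that ignores locking and format checks:
#include <string.h>
#include "SDL.h"

/* Raw copy of a clip rectangle between two 32-bit surfaces with identical
   pixel formats; no blending, the alpha channel is preserved as-is. */
void rawCopy32(SDL_Surface *src, SDL_Rect clip, SDL_Surface *dst)
{
    for (int y = 0; y < clip.h; ++y) {
        Uint8 *from = (Uint8 *)src->pixels
                      + (clip.y + y) * src->pitch + clip.x * 4;
        Uint8 *to   = (Uint8 *)dst->pixels + y * dst->pitch;
        memcpy(to, from, (size_t)clip.w * 4);
    }
}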

OpenGL: Weird transparency blending result

I'm working on creating a transparent GUI in OpenGL, and am trying to get text rendered over some semi-transparent quads, but the results are odd.
If I render the text by itself, with nothing behind it, it looks fine:
However, if I render a semi-transparent quad behind it (rendering the quad before rendering the text), I get this:
I have blending set to (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA). The font texture is an all-white texture with the character shapes in the alpha channel.
Do I need to be doing something special when performing alpha-transparency over an existing layer of transparency? Or is there something else I need to check?
The alpha value of your font texture seems to be off. It should be 0 for texels that you want to be invisible and 1 (or 255 in bytes) for visible texels. You should check the texture and make sure alpha values are correct.
Instead of alpha blending, you can use alpha testing. This completely discards fragments that have an alpha value below a certain threshold, and it is often much faster than blending.
glDisable(GL_BLEND);
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.96f); // Or some fitting threshold for your texture
This might work even if your texture's alpha is off in some places, but it doesn't look like that is the case here, as the 's' and 't' seem to have low alpha in places where it should be 1.
Thanks for the responses. There was nothing wrong with my font texture, but your suggestions led me to try a few other things. Turns out the problem wasn't the transparency at all. There was a problem with rendering the background quad, which caused it to also render the text quads, but using the background texture. Bah...

Good way to deal with alpha channels in 8-bit bitmap? - OpenGL - C++

I am loading bitmaps with OpenGL to texture a 3d mesh. Some of these bitmaps have alpha channels (transparency) for some of the pixels and I need to figure out the best way to
obtain the values of transparency for each pixel
and
render them with the transparency applied
Does anyone have a good example of this? Does OpenGL support this?
First of all, it's generally best to convert your bitmap data to 32-bit so that each channel (R, G, B, A) gets 8 bits. When you upload your texture, specify a 32-bit format.
Then when rendering, you'll need to glEnable(GL_BLEND); and set the blend function, e.g. glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);. This tells OpenGL to mix the RGB of the texture with that of the background, using the alpha of your texture.
If you're doing this to 3D objects, you might also want to turn off back-face culling (so that you see the back of the object through the front) and sort your triangles back-to-front (so that the blends happen in the correct order).
If your source bitmap is 8-bit (ie: using a palette with one colour specified as the transparency mask), then it's probably easiest to convert that to RGBA, setting the alpha value to 0 when the colour matches your transparency mask.
Some hints to make things (maybe) look better:
Your alpha channel is going to be an all-or-nothing affair (either 0x00 or 0xff), so apply some blur algorithm to get softer edges, if that's what you're after.
For texels (texture pixels) with an alpha of zero (fully transparent), replace the RGB colour with that of the closest non-transparent texel. That way, when the texture is sampled with filtering, colours won't bleed towards the original transparency colour from your BMP.
If your pixmap is 8-bit single channel, it is either grayscale or uses a palette. What you first need to do is convert the pixmap data into RGBA format. For this you allocate a buffer large enough to hold a 4-channel pixmap of the dimensions of the original file. Then for each pixel of the pixmap, use that pixel's value as an index into the palette (look-up table) and put that color value into the corresponding pixel of the RGBA buffer. Once finished, upload to OpenGL using glTexImage2D.
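A sketch of that CPU-side expansion; the palette layout, the transparent index, and the function name are assumptions to adapt to however your loader stores the data:
#include <GL/gl.h>
#include <stdlib.h>

/* Expand an 8-bit paletted image into an RGBA buffer and upload it.
   'palette' is assumed to hold 256 RGBA entries; 'transparentIndex'
   is the palette slot that should become fully transparent. */
void uploadPaletted(const unsigned char *indices, int width, int height,
                    const unsigned char palette[256][4], int transparentIndex)
{
    unsigned char *rgba = (unsigned char *)malloc((size_t)width * height * 4);
    for (int i = 0; i < width * height; ++i) {
        int p = indices[i];
        rgba[i * 4 + 0] = palette[p][0];
        rgba[i * 4 + 1] = palette[p][1];
        rgba[i * 4 + 2] = palette[p][2];
        rgba[i * 4 + 3] = (p == transparentIndex) ? 0 : palette[p][3];
    }
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba);
    free(rgba);
}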
If your GPU supports fragment shaders (very likely), you can do that LUT transformation in the shader: upload the 8-bit pixmap as a GL_RED or GL_LUMINANCE 2D texture, and upload the palette as a 1D GL_RGBA texture. Then in the fragment shader:
uniform sampler2D texture;
uniform sampler1D palette_lut;

void main()
{
    /* The red channel holds the palette index, normalized to 0..1. */
    float palette_index = texture2D(texture, gl_TexCoord[0].st).r;
    /* Look the index up in the 1D palette texture to get the RGBA colour. */
    vec4 color = texture1D(palette_lut, palette_index);
    gl_FragColor = color;
}
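The matching texture setup on the application side might look roughly like this; the texture ids, filtering choices, and the GL_LUMINANCE path are assumptions for a fixed-function-era context:
#include <GL/gl.h>

/* Upload the 8-bit indices as a single-channel 2D texture and the
   256-entry palette as a 1D RGBA texture, matching the shader above. */
void uploadIndexAndPalette(const unsigned char *indices, int width, int height,
                           const unsigned char *paletteRGBA /* 256 * 4 bytes */)
{
    GLuint tex[2];
    glGenTextures(2, tex);

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   /* rows of 8-bit data may not be 4-byte aligned */

    glBindTexture(GL_TEXTURE_2D, tex[0]);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, indices);

    glBindTexture(GL_TEXTURE_1D, tex[1]);
    glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage1D(GL_TEXTURE_1D, 0, GL_RGBA, 256, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, paletteRGBA);
}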
Blended rendering conflicts with the Z-buffer algorithm, so you must sort your geometry back-to-front for things to look right. As long as this applies to objects as a whole it is rather simple, but it becomes tedious if you need to sort the faces of a mesh each and every frame. A way to avoid this is to break meshes down into convex submeshes (a mesh that is already convex cannot, of course, be broken down further). Then use the following method (a GL sketch follows after the steps):
Enable face culling
for convex_submesh in sorted(meshes, far to near):
    set face culling to front faces (i.e. the back side gets rendered)
    render convex_submesh
    set face culling to back faces (i.e. the front side gets rendered)
    render convex_submesh again
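In plain OpenGL terms, the two passes per submesh might look like this; drawSubmesh is a placeholder for however you actually issue the draw call:
#include <GL/gl.h>

/* Render one convex, transparent submesh in two passes so its back
   faces are blended before its front faces. */
void drawConvexTransparent(void (*drawSubmesh)(void))
{
    glEnable(GL_CULL_FACE);

    glCullFace(GL_FRONT);   /* cull front faces: only the back side is drawn */
    drawSubmesh();

    glCullFace(GL_BACK);    /* cull back faces: only the front side is drawn */
    drawSubmesh();
}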