OpenGL API glClearColor alpha channel does nothing

I use kivy in my application and try to create a transparent background window. I do this with:
Window.clearcolor = (1,1,1,0)
Window.clear()
That produces a white, fully opaque window.
Kivy directly calls glClearColor from the OpenGL 4 API (https://www.khronos.org/opengl/).
The docs say that the last parameter is the alpha channel, so I expect my window to be transparent.
Do I have a mistake in my thinking or is this a bug?

Default framebuffer pixel formats are often plain RGB (no alpha), so the alpha value you clear with is only used for blending operations and never reaches the window system. You need a pixel format with an alpha channel, plus window-system support for compositing it, to make your surfaces transparent; see this answer.
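For illustration, here is a minimal sketch of what "the correct pixel format" can look like outside Kivy, assuming the pyGLFW and PyOpenGL packages and GLFW 3.3+ (the TRANSPARENT_FRAMEBUFFER hint did not exist earlier); whether the desktop actually shows through still depends on the platform compositor:
import glfw
from OpenGL.GL import glClearColor, glClear, GL_COLOR_BUFFER_BIT

glfw.init()
# Ask for a framebuffer whose alpha channel the window system will composite.
glfw.window_hint(glfw.TRANSPARENT_FRAMEBUFFER, glfw.TRUE)
window = glfw.create_window(640, 480, "transparent", None, None)
glfw.make_context_current(window)

while not glfw.window_should_close(window):
    glClearColor(1.0, 1.0, 1.0, 0.0)  # alpha = 0 now actually reaches the compositor
    glClear(GL_COLOR_BUFFER_BIT)
    glfw.swap_buffers(window)
    glfw.poll_events()
glfw.terminate()
With a plain RGB default framebuffer the alpha passed to glClearColor has nowhere to go, which is why the Kivy window above stays opaque.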

Related

Why RGBA is making png image blackish?

I am trying to capture a PNG image with a transparent background. I have set GL_RGBA as the format in glReadPixels, but the output PNG looks a little blackish or over-saturated. If the background is not transparent, i.e. if I use the GL_RGB format in glReadPixels, the expected image is captured.
Note: in both cases I am capturing a translucent (partially transparent) shaded cube. If the cube is completely opaque, the RGBA format works fine.
Any ideas as to why this is happening for transparent background?
Blackish image with RGBA format
Image with RGB format
The cube looks darker because it's semitransparent, and whatever you use to view the image blends its semitransparent parts with a black background.
You might argue that the cube in the picture shouldn't be semitransparent since it's drawn on top of a completely opaque background, but the problem is that the widely-used
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
blending mode (which you seem to use as well) is known to produce incorrect alpha values when used to draw semitransparent objects.
Usually you can't see it because alpha is discarded when drawing to the default framebuffer, but it becomes prominent when inspecting outputs of glReadPixels.
As you noticed, to solve it you can simply discard the alpha component.
But if for some reason you need proper blending without those problems, you need
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
Note that for this function to work, both source and destination images have to be in premultiplied-alpha format. That is, you need to do color.rgb *= color.a as the last step in your shaders.
The inverse operation (color.rgb /= color.a) can be used to convert an image back to the regular format, but if your images are completely opaque, this step is unnecessary.
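As a rough sketch of that recipe (PyOpenGL call names; the shader fragment is illustrative GLSL kept in a Python string, with made-up variable names):
from OpenGL.GL import glEnable, glBlendFunc, GL_BLEND, GL_ONE, GL_ONE_MINUS_SRC_ALPHA

glEnable(GL_BLEND)
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA)  # expects premultiplied source colors

# Last lines of the fragment shader: premultiply before writing the output.
PREMULTIPLY_SNIPPET = """
    vec4 color = texture(tex, uv);
    color.rgb *= color.a;  // premultiply so GL_ONE / GL_ONE_MINUS_SRC_ALPHA blends correctly
    fragColor = color;
"""
The glReadPixels output is then also premultiplied, so divide by alpha (where alpha > 0) if the saved PNG is supposed to use straight alpha.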

Qt OpenGL - Translucent objects show content from behind window

I'm using Qt5 and its OpenGL integration, and am running into a problem when I try to draw translucent objects. When an object is translucent, whatever is visible behind my OpenGL window is shown within the screen area of that object, instead of the object being blended with whatever is already in the colour buffer. I have started watching YouTube videos through my translucent objects, as whatever shows through is live.
Interestingly, an object seems most see-through at half opacity: full opacity renders it solid, while zero opacity renders nothing at all (and whatever was previously in the background of the 3D scene remains there). Rendering translucent objects last does not fix the issue.
I have noticed that this also happens when I enable mipmaps on my textures - as the distance to a point on an object increases, the pixel concerned becomes more translucent and displays whatever is behind the OpenGL window. The issue occurs both on my Windows and OSX machines.
Is this a known issue? Is there a workaround? Google hasn't proven too helpful.
Hah, that's a funny one. I can't tell you exactly what is going on, because it normally takes some extra effort to make windows actually transparent; on Windows you have to select a framebuffer format with an alpha channel and call DwmEnableBlurBehindWindow to actually achieve this effect. And as far as I know Qt doesn't do this.
But if it does, here are a few hints:
Make sure you clear your framebuffer with alpha = 1.
When rendering translucent objects, keep the destination alpha value at 1, i.e. don't use blend functions that modify the destination alpha, or force it back to 1 (see the sketch after these hints).
There's actually little use for the alpha channel on the main window framebuffer, except for implementing window translucency effects. Unless you need those, either choose a pixel format without an alpha channel for your window framebuffer, or keep all of its pixels' alpha values at full opacity.
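A minimal sketch of hints 1 and 2, assuming PyOpenGL and an already-current GL context; glBlendFuncSeparate lets the color channels blend normally while the destination alpha is never modified, so it stays at the cleared value of 1:
from OpenGL.GL import (glClearColor, glClear, glEnable, glBlendFuncSeparate,
                       GL_COLOR_BUFFER_BIT, GL_DEPTH_BUFFER_BIT, GL_BLEND,
                       GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ZERO, GL_ONE)

glClearColor(0.2, 0.2, 0.2, 1.0)                    # hint 1: clear with alpha = 1
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)

glEnable(GL_BLEND)
# hint 2: RGB blends as usual, but dst alpha = src*0 + dst*1, i.e. it is left untouched.
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ZERO, GL_ONE)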

Displaying images without the black portion

I am trying to display a bitmap using OpenGL, but I don't want the black portion of the image to be displayed. I can do this in DirectX but not in OpenGL. In other words, I have images of plants on a black background and I want the plants drawn so they look realistic (without a black border).
You can do this using alpha-testing:
Add an alpha channel to your image before uploading it to a texture, 0.0 on black pixels and 1.0 everywhere else.
Enable alpha-testing with glEnable( GL_ALPHA_TEST )
glAlphaFunc( GL_GREATER, 0.1f )
Render the textured quad as usual. OpenGL will skip the zero-alpha texels thanks to the alpha test.
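For reference, the same recipe as a small PyOpenGL sketch (fixed-function alpha test, so it assumes a compatibility context; texture upload and quad drawing are omitted):
from OpenGL.GL import (glEnable, glAlphaFunc,
                       GL_TEXTURE_2D, GL_ALPHA_TEST, GL_GREATER)

glEnable(GL_TEXTURE_2D)
glEnable(GL_ALPHA_TEST)
glAlphaFunc(GL_GREATER, 0.1)  # discard fragments whose alpha is <= 0.1
# ... bind the RGBA texture and draw the quad as usual ...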
There are a couple of ways you can do this.
One is to use an image editing program like Photoshop or GIMP to add an alpha channel to your image and then make the black portions fully transparent (zero alpha). The upside is that it lets you decide exactly which portions of the image should be transparent, since a fully programmatic approach can sometimes hide things you want to be seen.
Another method is to loop through every pixel in your bitmap and set the alpha based on some defined threshold (i.e. if you only want true black to be transparent, check whether each color channel is 0). The downside is that it will occasionally make dark details you wanted to keep disappear.
Also, you will need to make sure that you have actually enabled the alpha channel and test, as stated in the answer above. Make sure to double check the order of your calls as well, as this can cause a lot of issues when you're trying to use transparency.
That's about as much as I can suggest since you haven't posted the code itself, but hopefully it should be enough to at least get you on the way to a solution.
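As a rough sketch of the per-pixel threshold idea from the second suggestion, here is what that loop could look like with Pillow (the file name and the threshold of 16 are made-up illustration values):
from PIL import Image

img = Image.open("plant.png").convert("RGBA")
pixels = img.load()
THRESHOLD = 16  # treat near-black as background

for y in range(img.height):
    for x in range(img.width):
        r, g, b, a = pixels[x, y]
        if r <= THRESHOLD and g <= THRESHOLD and b <= THRESHOLD:
            pixels[x, y] = (r, g, b, 0)  # make background pixels fully transparent

img.save("plant_keyed.png")
The resulting PNG can then be uploaded as an RGBA texture and drawn with the alpha test (or blending) enabled as described above.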

LWJGL set transparency color?

I've been playing with LWJGL a little, as a bit of a step up from Pygame. I'm trying to render a sprite and I was wondering if LWJGL has a function similar to Pygame's colorkey that lets you define a color in an image that will be rendered as transparent. Do you have to use an alpha channel in OpenGL?
OpenGL doesn't have any built-in color keying support. You'll need to either manually swap your key color for alpha on the CPU before uploading the texture, or use a custom shader that replaces it on the fly (a sketch follows).
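As a hedged sketch of the shader route (the sampler, the keyColor uniform and the 0.05 tolerance are all illustrative, not anything LWJGL provides), a fragment shader along these lines discards every pixel close to the key color; here it is kept as a GLSL string:
COLOR_KEY_FRAGMENT_SHADER = """
#version 330 core
in vec2 uv;
out vec4 fragColor;
uniform sampler2D sprite;
uniform vec3 keyColor;  // e.g. pure magenta (1.0, 0.0, 1.0)

void main() {
    vec4 texel = texture(sprite, uv);
    if (distance(texel.rgb, keyColor) < 0.05)  // tolerance is illustrative
        discard;                               // treat key-colored pixels as transparent
    fragColor = texel;
}
"""
The CPU alternative is the same idea applied to the image before upload: replace every pixel matching the key color with alpha = 0, much like the Pillow loop shown for the previous question.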

Problem with blending in OpenGL (Color bar example)

Can anyone give some clues as to why, when I try to render the color bar quad below, it appears like this:
Here is my rendering code:
gl.glEnable(GL.GL_BLEND);
gl.glBlendFunc(GL.GL_ONE, GL.GL_ZERO);
gl.glBlendEquation(GL.GL_FUNC_ADD);
gl.glEnable(GL.GL_ALPHA_TEST);
gl.glAlphaFunc(GL.GL_GREATER, 0.01f);
// do the drawing...
gl.glDisable(GL.GL_TEXTURE_2D);
gl.glDisable(GL.GL_ALPHA_TEST);
I'm sure the solution is simple and I'm just having a brainfart but it's just one of those days!
Thanks for the help!
What kind of blending are you trying to perform? To simply draw something without any color mixing or alpha channels you don't even have to touch GL_BLEND or GL_ALPHA_TEST (leave both disabled). GL_BLEND defines how different "layers" of color are combined (usually how alpha channels are applied), while GL_ALPHA_TEST decides which alpha values are respected or ignored. Also check your vertex colors when rendering the quad (try rendering a unicolor quad without a texture, e.g. in magenta).
However, looking at your images, I'd guess you somehow disabled drawing to the red color channel (glColorMask()), although there's yellow in the output, which confuses me.
There was a problem with RGBA getting swapped around when I imported the PNG file.
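For anyone hitting the same thing: one way to sidestep channel swapping is to force a known channel order at load time and upload with a matching source format. A hedged sketch with Pillow and PyOpenGL (the question's code is JOGL/Java, so this is only illustrative, and it assumes a current GL context):
from PIL import Image
from OpenGL.GL import (glGenTextures, glBindTexture, glTexImage2D,
                       GL_TEXTURE_2D, GL_RGBA, GL_UNSIGNED_BYTE)

img = Image.open("colorbar.png").convert("RGBA")  # force a known RGBA byte order
tex = glGenTextures(1)
glBindTexture(GL_TEXTURE_2D, tex)
# Source format GL_RGBA matches the bytes Pillow produced, so no channels get swapped.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, img.width, img.height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, img.tobytes())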