If I have a Renderbuffer that uses a color format without alpha, for example GL_RG8, how can I tell the alpha blender to use the green channel for alpha? This can be done in textures using a swizzle mask, but as renderbuffers don't support those, what can I do?
My current blendFunc is GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA.
Each user-defined output of the fragment shader contains 4 channels: RGBA. This is true regardless of the image format of the destination that the output will write to. These outputs are the source colors for the blend operation.
So just write to the alpha of the output as normal. It doesn't matter that the alpha won't be written to the framebuffer image. It's still a part of the source color, so it can still be used for blending purposes.
Related
I am trying to capture a PNG image with a transparent background. I have set GL_RGBA as the format in glReadPixels, but the output PNG looks blackish, with over-saturated colors. If the background is not transparent, i.e. if I use the GL_RGB format in glReadPixels, the expected image is captured.
Note: in both cases I am capturing a translucent (partially transparent) shaded cube. If the cube is completely opaque, the RGBA format works fine.
Any ideas why this is happening with a transparent background?
Blackish image with RGBA format
Image with RGB format
The cube looks darker because it's semitransparent, and whatever you use to view the image blends the semitransparent parts of it with a black background.
You might argue that the cube in the picture shouldn't be semitransparent since it's drawn on top of a completely opaque background, but the problem is that the widely-used
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
blending mode (which you seem to use as well) is known to produce incorrect alpha values when used to draw semitransparent objects.
Usually you can't see it because alpha is discarded when drawing to the default framebuffer, but it becomes prominent when inspecting outputs of glReadPixels.
As you noticed, to solve it you can simply discard the alpha component.
But if you for some reason need to have a proper blending without those problems, you need
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
Note that for this function to work, both the source and destination images have to be in premultiplied-alpha format. That is, you need to do color.rgb *= color.a
as the last step in your shaders.
The inverse operation (color.rgb /= color.a) can be used to convert an image back to the regular format, but if your images are completely opaque, this step is unnecessary.
I'm porting some OpenGL code from a technical paper to use with Metal. In it, they use a render target with only one channel - a 16-bit float buffer. But then they set blending operations on it like this:
glBlendFunci(1, GL_ZERO, GL_ONE_MINUS_SRC_ALPHA);
With only one channel, does that mean that with OpenGL, the target defaults to being an alpha channel?
Does anyone know if it is the same with Metal? I am not seeing the results I expect and I am wondering if Metal differs, or if there is a setting that controls how single-channel targets are treated with regards to blending.
In OpenGL, image formats are labeled explicitly with their channels. There is only one one-channel color format: GL_R* (with various bitdepths and other properties). That is, red-only. And while texture swizzling can make the red channel appear in other channels of a texture fetch, that doesn't work for framebuffer writes.
Furthermore, that blend function doesn't actually use the destination alpha. It only uses the source alpha, which has the value the FS gave it. So the fact that the framebuffer doesn't store an alpha is essentially irrelevant.
I am trying to blend a white texture with varying alpha values with a colored background. I am expecting the result to retain colors from the background, and have alpha values replaced by ones from the blended texture.
So, for background I use:
Gdx.gl.glEnable(GL20.GL_BLEND);
Gdx.gl.glBlendEquationSeparate(GL20.GL_FUNC_ADD, GL20.GL_FUNC_ADD);
Gdx.gl.glBlendFuncSeparate(GL20.GL_ONE, GL20.GL_ZERO, GL20.GL_ONE, GL20.GL_ZERO);
I expect the background triangles mesh to override the destination, both color and alpha.
Question 1 - why, with those blendFunc parameters, is the alpha value being ignored?
If I set the blendFunc to GL_ONE, GL_ONE, GL_ZERO, GL_ZERO, then the filled mesh is rendered with the proper alpha level - but both the source and destination alpha are supposed to be multiplied by zero, so why does this work?
====
Now to blend the alpha map I use:
Gdx.gl.glBlendEquationSeparate(GL20.GL_FUNC_ADD, GL20.GL_FUNC_ADD);
Gdx.gl.glBlendFuncSeparate(GL20.GL_ZERO, GL20.GL_ONE, GL20.GL_ONE, GL20.GL_ZERO);
Question 2 - This is supposed to keep the destination color and replace the alpha. However, when I render the texture with those blendFunc params, I get no change to the output at all...
I've been reading the OpenGL blending chapter over and over trying to see what I'm missing. Please share your insight into how those parameters actually work.
I use this:
Gdx.gl.glEnable(GL20.GL_BLEND);
Gdx.gl.glBlendFunc(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);
It works, but only if you draw in back-to-front order. To account for depth you will need alpha testing, which libGDX does not include.
Is it possible to draw a RGB texture as grayscale without using fragment shaders, using only fixed pipeline openGL?
Otherwise I'd have to create two versions of texture, one in color and one in black and white.
I don't know how to do this with an RGB texture and the fixed function pipeline.
If you create the texture from RGB source data but specify the internal format as GL_LUMINANCE, OpenGL will convert the color data into greyscale for you. Use the standard white material and MODULATE mode.
Hope this helps.
No. Texture environment combiners are not capable of performing a dot product without doing the scale/bias operation. That is, it always pretends that [0, 1] values are encoded as [-1, 1] values. Since you can't turn that off, you can't do a proper dot product.
Do I need alpha channels for transparency to work in OpenGL? Can I use glBlendFunc or anything else to somehow make the black or white color transparent/not visible? If yes, how do I do it?
No, you don't need an alpha channel in your textures. Call discard in your fragment shader for all fragments that match your transparency rule.
Yes, you need alpha channels to use transparency. You can emulate the behaviour of color keying using shaders, or processing the image and replacing the color key with pixels with alpha = 0.0.
Notice that GPUs typically allocate RGBA storage even when you request an RGB texture, so the alpha channel is still present in hardware.