Premultiplied alpha and multisampling in OpenGL

I'm rendering some triangles into a multisampled texture, then blitting that texture into a normal (single-sample) texture and drawing a textured quad onto the screen.
I need the final texture to have premultiplied alpha. Is there a way to tell OpenGL to automatically premultiply semi-transparent pixels in the multisampled texture (or during the blit)?
Or is the only way an extra shader pass that performs the multiplication manually?

A multisample resolve blit can't perform pre-multiplication. You will have to pre-multiply the texture's pixels in the process that reads from the texture.
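If you go the shader-pass route, the premultiplication is a one-liner in the fragment shader that samples the resolved texture. A minimal sketch, assuming names like resolved_tex that are not from the question:
#version 330 core
in vec2 uv_coords;
out vec4 out_color;
uniform sampler2D resolved_tex;  // single-sample texture produced by the resolve blit

void main()
{
    vec4 c = texture(resolved_tex, uv_coords);
    out_color = vec4(c.rgb * c.a, c.a);  // premultiply RGB by alpha
}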

Related

OpenGL offscreen rendering

I was trying to render a 2D scene into an off-screen Framebuffer Object and use the glFramebufferTexture2D function to use the rendered frame as a texture for a cube.
The 2D scene is rendered in one context and the texture is used in another context in the same thread.
The problem is that when I textured the cube, the alpha channel seemed to be incorrect. I used apitrace to check the texture: it had the correct alpha values, and the shader was merely out_color = texture(in_texture, uv_coords).
The problem went away if I blitted the off-screen framebuffer's color attachment to anything, whether itself or framebuffer 0 (the output window).
I was wondering why this is happening and how to solve it without needing to blit the framebuffer.
It turned out that I was using single buffering for the static 2D scene, which requires a glFlush to flush the pipeline.
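For reference, the fix amounts to flushing after the off-screen pass and before the texture is consumed in the other context. A rough sketch; scene_fbo and drawScene2D are placeholder names, not from the original code:
// Context A: render the static 2D scene into the off-screen FBO once
glBindFramebuffer(GL_FRAMEBUFFER, scene_fbo);
drawScene2D();   // hypothetical helper that issues the 2D draw calls
glFlush();       // ensure the commands reach the GPU before the other context samples the texture
glBindFramebuffer(GL_FRAMEBUFFER, 0);
// Context B (same thread): bind the FBO's color texture and draw the cube as usual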

OpenGL - Multi Sampled FBO methodology

I am implementing multisample antialiasing (MSAA) in my deferred rendering engine and I'm curious: can I simply use a multisampled texture for the albedo output, or will the depth texture, normals texture, and even the lighting texture have to be multisampled as well?
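For context, creating a multisampled color attachment (e.g. for the albedo target) looks roughly like this; the 4-sample count, GL_RGBA8 format and variable names are illustrative assumptions:
GLuint albedo_ms;
glGenTextures(1, &albedo_ms);
glBindTexture(GL_TEXTURE_2D_MULTISAMPLE, albedo_ms);
glTexImage2DMultisample(GL_TEXTURE_2D_MULTISAMPLE, 4, GL_RGBA8, width, height, GL_TRUE);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D_MULTISAMPLE, albedo_ms, 0);
// Note: all attachments of a single FBO must use the same sample count for it to be complete.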

Rendering transparent texture onto glTexture

I've been working in OpenGL for a while relatively smoothly, but recently I've noticed that when I render a primitive with a transparent texture onto my FBO texture (custom framebuffer), it makes the FBO texture transparent at the pixels where the primitive's texture is transparent. The problem is that there are things with solid colors behind this primitive, already rendered before the transparent one, so the FBO texture should not be transparent at those pixels. Blending a solid and a transparent color should result in a solid color, shouldn't it?
So basically, OpenGL is adding transparency to my FBO texture just because the last primitive drawn has transparent pixels, even though there are solid colors behind it already drawn into the FBO texture. Shouldn't OpenGL blend the transparent texture with the FBO's existing pixels and produce a solid color, given that the FBO texture is already filled with solid colors before the transparent primitive is rendered?
When I render my FBO texture to the default framebuffer, the clear color bleeds through parts of it, wherever the last drawn texture is transparent. But when I render the same scene straight to the default OpenGL framebuffer, the scene looks fine and the clear color does not bleed into the transparent texture.
What's even more interesting is that the glClearColor color is only visible where the primitive's texture's alpha has a gradient; the clear color has no influence where the texture alpha is 1.0 or 0.0. Is this a bug? It seems to affect the primitive's texture the most at pixels with 0.5 alpha, and the influence of glClearColor decreases as alpha moves further above or below that.
I know it's a bit of a complex question/situation; I honestly tried my best to explain it.
I'm using:
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
to draw both the partly transparent primitive into the FBO texture and then the FBO texture into the default framebuffer.
This is what the FBO texture looks like when drawn into the default OpenGL framebuffer (glClearColor is set to red).
blending a solid & transparent color should result in a solid color, shouldn't it?
Only if your blend mode tells it to. And it doesn't:
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
This applies the same blending operation to all four components of the color, so the final alpha becomes the source alpha multiplied by itself, plus the destination alpha multiplied by (1 - source alpha). For example, a source alpha of 0.5 written over an already opaque destination gives 0.5 * 0.5 + 1.0 * 0.5 = 0.75, which is why the bleed-through is worst around 0.5 alpha.
Now, you could use separate blend functions for the color and the alpha:
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ZERO, GL_ONE);
The last two parameters specify the blending for just the alpha, while the first two specify the blending for the RGB components. With GL_ZERO/GL_ONE, the destination alpha is left unchanged. Separate blend functions have been core OpenGL since version 1.4 and are also available as an extension on older hardware.
But it seems to me that you probably don't want to change the alpha at all. So simply mask alpha writes when rendering to the texture:
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE);
Don't forget to undo the alpha mask when you want to write to it again.
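Put together, the pass that renders into the texture might look roughly like this; the FBO handle and the draw call are placeholders, not from the original code:
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE);   // keep the FBO texture's alpha untouched
drawTransparentPrimitive();                         // hypothetical draw call
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);    // restore alpha writes
glBindFramebuffer(GL_FRAMEBUFFER, 0);               // then draw the FBO texture to the default framebuffer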

Rendering layers of bitmaps with alpha to FBO

I need to draw several "layers" of bitmaps that are semi-transparent to a FBO (for later readback).
My current approach is to create an FBO, attach a texture to it, and use glTexSubImage2D to "draw" the bitmaps into the FBO. This, however, doesn't work, because glTexSubImage2D doesn't draw/blend the pixels; it simply overwrites whatever is currently in the texture.
What's the best way to do this?
Create an FBO with a clean texture R attached to hold the final result.
For each of your bitmaps you:
Upload the bitmap to a texture T (T and R are different textures).
Render a quad textured with T into the FBO with GL_BLEND enabled and properly set up.
The final result is that R holds your blended bitmaps. You can now read it back or use it in other texturing operations.
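In rough code form, the loop looks something like this; the drawTexturedQuad helper and the bitmap bookkeeping are assumptions, not from the answer:
glBindFramebuffer(GL_FRAMEBUFFER, fbo);          // R is attached as GL_COLOR_ATTACHMENT0
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
glClear(GL_COLOR_BUFFER_BIT);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

for (int i = 0; i < num_bitmaps; ++i) {
    glBindTexture(GL_TEXTURE_2D, T);             // T is a scratch texture, distinct from R
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, bitmaps[i]);
    drawTexturedQuad();                          // hypothetical: draws a quad sampling T, blended into R
}

// R now holds the blended layers: glReadPixels here reads them back,
// or unbind the FBO and use R as a texture in later draws.
glBindFramebuffer(GL_FRAMEBUFFER, 0);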

Normal back buffer + render to depth texture in OpenGL (FBOs)

Here's the situation: I have a texture containing some depth values. I want to render some geometry using that texture as the depth buffer, but I want the color to be written to the normal framebuffer (i.e., the window's framebuffer).
How do I do this? I tried creating an FBO and only binding a depth texture to it (GL_DEPTH_COMPONENT) but that didn't work; none of the colors showed up.
No, you can't. The framebuffer you render to is either the default (window) framebuffer or an off-screen FBO; you can't mix attachments of the two in any way.
Instead, I would suggest rendering to a color renderbuffer attached to the FBO alongside your depth texture, and then doing a simple blit into the main framebuffer.
Edit:
Alternatively, if the main framebuffer already has a depth buffer, you can first blit your depth into it and then render directly to the main framebuffer, saving the video memory of the additional color renderbuffer.
P.S. Blitting is done via glBlitFramebuffer. To make it work, bind the source to GL_READ_FRAMEBUFFER and the destination to GL_DRAW_FRAMEBUFFER, and select the buffers with glReadBuffer()/glDrawBuffer().
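The color blit itself is set up roughly like this; fbo, width and height are assumed names/values:
glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);   // source: the off-screen FBO
glReadBuffer(GL_COLOR_ATTACHMENT0);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);     // destination: the window's framebuffer
glDrawBuffer(GL_BACK);
glBlitFramebuffer(0, 0, width, height,
                  0, 0, width, height,
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);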