How to post-process an image with shaders in OpenGL?

Shaders cannot read data from the framebuffer; they can only pass data forward through the rendering pipeline. But post-processing requires reading the rendered image.
I'm going to solve this as follows: 1) create a texture the size of the viewport; 2) render the image to that texture normally; 3) render the texture to the framebuffer, passing it through a post-processing shader.
Am I doing this right? Are there more efficient ways to do post-processing?

That is indeed the usual way to do post-processing! Render to a texture by binding an FBO for your first pass, then use that texture as the input of your post-processing shader after unbinding the FBO (i.e. returning to the default framebuffer).
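The two-pass setup described above can be sketched as follows. This is a minimal sketch, assuming a current desktop GL context and that `width`/`height` match the viewport; `sceneFBO`, `sceneTex`, `depthRBO`, `postShader`, `drawScene` and `drawFullscreenQuad` are illustrative placeholder names, not real API.

```c
GLuint sceneTex, sceneFBO, depthRBO;

/* 1) Create an empty texture the size of the viewport. */
glGenTextures(1, &sceneTex);
glBindTexture(GL_TEXTURE_2D, sceneTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

/* Attach it, plus a depth renderbuffer, to an FBO. */
glGenRenderbuffers(1, &depthRBO);
glBindRenderbuffer(GL_RENDERBUFFER, depthRBO);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);

glGenFramebuffers(1, &sceneFBO);
glBindFramebuffer(GL_FRAMEBUFFER, sceneFBO);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, sceneTex, 0);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, depthRBO);

/* 2) First pass: render the scene into the texture. */
glBindFramebuffer(GL_FRAMEBUFFER, sceneFBO);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
drawScene();

/* 3) Second pass: back to the default framebuffer, sample the scene
 * texture through the post-processing shader on a fullscreen quad. */
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glUseProgram(postShader);
glBindTexture(GL_TEXTURE_2D, sceneTex);
drawFullscreenQuad();
```

Checking the FBO with glCheckFramebufferStatus after attaching is good practice before the first pass.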

Related

Use default framebuffer depth buffer in a fbo

I want to achieve a post-process outline effect on some actors in my game, and I want the outline to be occluded by other actors closer to the camera. To do so, I planned to render the elements that should be outlined into an FBO with an outline shader. To discard the pixels that are occluded, I want to use the depth buffer of the default framebuffer.
I read and searched but didn't find how to properly use the default framebuffer's depth buffer in another framebuffer, or how to copy or otherwise use the default depth buffer's information in an FBO.
How can I achieve this?
I read and searched but I didn't find how to properly use the default framebuffer's depth buffer in another framebuffer
Unfortunately, there is no way in OpenGL to do that.
or how to copy or otherwise use the default depth buffer's information in an FBO.
That is quite easy. Just create an FBO with an appropriate depth attachment (texture or renderbuffer, it doesn't matter), and blit the default depth buffer to it:
glBindFramebuffer(GL_READ_FRAMEBUFFER, 0);               // read from the default framebuffer
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, your_target_fbo); // write into your FBO
glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                  GL_DEPTH_BUFFER_BIT, GL_NEAREST);      // copy only the depth values
However, since you do some post-processing anyway, it might be a good option to render the initial scene into an FBO and re-use the same depth buffer in different FBOs.
Unfortunately, it is not possible to attach the default depth buffer to an FBO or to read it from a shader.
You can copy the default framebuffer into a renderbuffer (or a framebuffer-attached texture; see glBlitFramebuffer), but this can be relatively slow.
Another option is to render the complete scene first to an FBO (color + depth), then have the outline shader read from this FBO, combine the result with the outline calculation, and write the final result to the default framebuffer.
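Sharing one depth buffer between the scene pass and the outline pass, as suggested above, can be sketched like this. A minimal sketch assuming a current GL context; `sharedDepth`, `sceneFBO` and `outlineFBO` are illustrative names.

```c
/* One depth renderbuffer, attached to two FBOs, so the outline pass
 * is depth-tested against the depth the scene pass wrote. */
GLuint sharedDepth;
glGenRenderbuffers(1, &sharedDepth);
glBindRenderbuffer(GL_RENDERBUFFER, sharedDepth);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);

/* Same depth attachment in both FBOs. */
glBindFramebuffer(GL_FRAMEBUFFER, sceneFBO);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, sharedDepth);

glBindFramebuffer(GL_FRAMEBUFFER, outlineFBO);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, sharedDepth);

/* Render the scene into sceneFBO first; then, when rendering the
 * outlined actors into outlineFBO with depth testing enabled (and
 * depth writes masked off if the scene depth should stay intact),
 * occluded outline fragments fail the depth test and are discarded. */
```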

Accessing bytes in an FBO

I have rendered a texture to an FBO in the hope that I can now apply effects to that texture before displaying it. I can't work out how to access the FBO's data to change it. Any ideas? Thanks for your time.
Could you specify what you mean by applying effects? The basic idea of an FBO is to render everything into it, with the result landing in an attached texture. If you want to know how to bind a framebuffer to a texture: simply create and set up a texture, passing an empty (NULL) data pointer to glTexImage2D. Then call glFramebufferTexture2D with your framebuffer object bound, passing in your texture as one of the arguments. To render to your framebuffer, call glBindFramebuffer. To bind your framebuffer's texture, simply call glBindTexture, like any other texture.
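The steps above can be sketched as follows; a minimal sketch assuming a current GL context, with `fbo` and `tex` as illustrative names.

```c
GLuint fbo, tex;

/* Create a texture with an empty data set: the NULL pointer makes
 * glTexImage2D allocate storage without uploading any pixels. */
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

/* Attach the texture to the framebuffer object. */
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, tex, 0);

/* Render into the texture while the FBO is bound ... */
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
/* ... then unbind the FBO and sample the texture like any other. */
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glBindTexture(GL_TEXTURE_2D, tex);
```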

Separate Frame Buffer and Depth Buffer in OpenGL

In DirectX you are able to have separate render targets and depth buffers, so you can bind a render target and a depth buffer, do some rendering, remove the depth buffer, and then do more rendering using the old depth buffer as a texture.
How would you go about this in OpenGL? From my understanding, you have a framebuffer object that contains both the color buffer(s) and an optional depth buffer. I don't think I can bind several framebuffer objects at the same time; would I have to recreate the framebuffer object every time it changes (probably several times a frame)? How do normal OpenGL programs do this?
A Framebuffer Object is nothing more than a series of references to images. These can be images in Textures (such as a mipmap layer of a 2D texture) or Renderbuffers (which can't be used as textures).
There is nothing stopping you from assembling an FBO that uses a texture's image for its color buffer and a texture's image for its depth buffer. Nor is there anything stopping you from later (so long as you're not rendering to that FBO while doing this) sampling from the texture as a depth texture. The FBO does not suddenly own these images exclusively or something.
In all likelihood, what has happened is that you've misunderstood the difference between an FBO and OpenGL's default framebuffer. The default framebuffer (i.e. the window) is unchangeable. You can't take its depth buffer and use it as a texture. What you do with an FBO is your own business, but OpenGL won't let you play with its default framebuffer in the same way.
You can bind multiple render targets to a single FBO, which should do the trick. Also, since OpenGL is a state machine, you can change the bindings and the number of targets whenever required.
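The DirectX-style workflow described in the question maps onto FBOs roughly like this. A minimal sketch assuming a current GL context; `fbo`, `depthTex`, `width` and `height` are illustrative names.

```c
/* A depth texture used as the FBO's depth attachment in pass 1,
 * then detached and sampled as an ordinary texture in pass 2.
 * (Don't render to the FBO while sampling its former attachment.) */
GLuint depthTex;
glGenTextures(1, &depthTex);
glBindTexture(GL_TEXTURE_2D, depthTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, width, height, 0,
             GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);

/* Pass 1: render with the depth texture as the depth attachment. */
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D, depthTex, 0);
/* ... draw geometry ... */

/* Pass 2: detach it (attach texture 0), then bind it as an input. */
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D, 0, 0);
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glBindTexture(GL_TEXTURE_2D, depthTex);
/* ... draw, sampling depthTex in the shader ... */
```

No FBO is recreated here; only its attachments change, which is a cheap state change.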

Renderbuffer or texture object after fragment shader?

I am working on OpenGL ES 2.0 and GLSL, and I have a question about FBOs.
I pass two textures to my OpenGL ES 2.0 code and, in the fragment shader, I subtract the two textures to make a binary image, just like OpenCV's threshold function. My question is that I am not sure whether I should use a renderbuffer or a texture object for my FBO. I have to choose one, since I can only use one color attachment (a restriction of OpenGL ES 2.0). Since the output image after my fragment shader will be a binary image (black or white), shouldn't it be a renderbuffer object?
A texture is a series of images which can be read from (via normal texturing means) and rendered into via FBOs. A renderbuffer is an image that can only be rendered into.
You should use a renderbuffer for images that you will only use as a render target. If you need to sample from it later, you should use a texture.
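In OpenGL ES 2.0 terms, the two choices above look like this. A minimal sketch assuming a current ES 2.0 context; `fbo`, `resultTex`, `rbo`, `width` and `height` are illustrative names.

```c
GLuint fbo;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

/* Option A: texture attachment, if the binary image will be sampled
 * by a later shader pass. */
GLuint resultTex;
glGenTextures(1, &resultTex);
glBindTexture(GL_TEXTURE_2D, resultTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, resultTex, 0);

/* Option B: renderbuffer attachment, if the result is only ever a
 * render target (e.g. read back with glReadPixels), never sampled.
 * GL_RGB565 is one of the color formats ES 2.0 guarantees. */
GLuint rbo;
glGenRenderbuffers(1, &rbo);
glBindRenderbuffer(GL_RENDERBUFFER, rbo);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGB565, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, rbo);
```

Being binary doesn't affect the choice; what matters is whether the result is read by a later pass.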

Normal back buffer + render to depth texture in OpenGL (FBOs)

Here's the situation: I have a texture containing some depth values. I want to render some geometry using that texture as the depth buffer, but I want the color to be written to the normal framebuffer (i.e., the window's framebuffer).
How do I do this? I tried creating an FBO and only binding a depth texture to it (GL_DEPTH_COMPONENT) but that didn't work; none of the colors showed up.
No, you can't. The framebuffer you are rendering to can be either the main framebuffer or an off-screen FBO; you can't mix them in any way.
Instead, I would suggest rendering to a color renderbuffer and then doing a simple blit operation into the main framebuffer.
Edit:
Alternatively, if you already have depth in the main framebuffer, you can first blit your depth and then render to the main framebuffer, saving video memory on the additional color renderbuffer.
P.S. Blitting is done via glBlitFramebuffer. In order to make it work you should set up GL_READ_FRAMEBUFFER, GL_DRAW_FRAMEBUFFER, and the appropriate read/draw buffers for each.
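The blit into the main framebuffer can be sketched as follows. A minimal sketch assuming glBlitFramebuffer is available (core GL 3.0+, or via extensions on older contexts); `offscreenFBO`, `width` and `height` are illustrative names.

```c
/* Copy the color result of the off-screen pass to the window. */
glBindFramebuffer(GL_READ_FRAMEBUFFER, offscreenFBO);
glReadBuffer(GL_COLOR_ATTACHMENT0);        /* source: the FBO's color */

glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0); /* destination: default FB */
glDrawBuffer(GL_BACK);                     /* write to the back buffer */

glBlitFramebuffer(0, 0, width, height,     /* source rectangle */
                  0, 0, width, height,     /* destination rectangle */
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);
```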