OpenGL shaders: Post-processing a few objects on the screen

I'm trying to understand shaders and framebuffers by making random stuff.
I have a cube floating in a scene in two colours: black and white (a texture). I add additional colours to the cube and the scene with post-processing.
This works fine, but I want only the cube to get these colours, not the scene.
I do it like this:
Bind the texture
Bind the frame buffer object
Bind the shader
Draw the background
Draw the cube
Unbind the framebuffer
Bind the shader that post-processes the image
Pass the colour parameters to the shader
Draw everything with: glutSwapBuffers();
I can add the code if you need it.
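For illustration, here is a minimal sketch of one way to restrict the effect to the cube (this is not the asker's actual code; drawBackground(), drawCube(), drawFullscreenQuad(), fbo, fboTexture and postShader are placeholder names): render only the cube into the FBO, draw the background directly to the screen, then blend the post-processed FBO texture on top.

glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);  // clear to transparent so only the cube ends up in the FBO
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
drawCube();                            // only the cube goes through the post-process pass
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
drawBackground();                      // the background never sees the post-process shader
glUseProgram(postShader);              // the colour parameters go to this shader
glBindTexture(GL_TEXTURE_2D, fboTexture);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); // composite the cube over the background
drawFullscreenQuad();
glDisable(GL_BLEND);
glutSwapBuffers();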

Writing to Framebuffer using multiple shaders

I'm currently implementing skeletal animation in my deferred rendering pipeline. Since each vertex in a rigged mesh takes at least an extra 32 bytes (due to the per-vertex bone IDs & weights), I thought it would be a good idea to make a separate shader that is in charge of drawing animated meshes.
That being said, I have a simple geometry buffer (framebuffer) with 4 color attachments, which are written to by my static geometry shader. The C++ code looks like:
glDisable(GL_BLEND);
glEnable(GL_DEPTH_TEST);
glEnable(GL_CULL_FACE);
glCullFace(GL_BACK);
glBindFramebuffer(GL_FRAMEBUFFER, gID); // Bind FBO
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glUseProgram(staticGeometryShaderID); // Bind static geometry shader
// Pass uniforms & draw static meshes here
glBindFramebuffer(GL_FRAMEBUFFER, 0); // Unbind FBO
The code above functions correctly. The issue arises when I try to add my animation shader into the mix. The following is what I am currently using:
glDisable(GL_BLEND);
glEnable(GL_DEPTH_TEST);
glEnable(GL_CULL_FACE);
glCullFace(GL_BACK);
glBindFramebuffer(GL_FRAMEBUFFER, gID); // Bind FBO
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glUseProgram(staticGeometryShaderID); // Bind static geometry shader
// Pass uniforms & draw static meshes here
glUseProgram(animationShaderID); // Bind animated geometry shader
// Pass uniforms & draw animated meshes here
glBindFramebuffer(GL_FRAMEBUFFER, 0); // Unbind FBO
The above example is the same as the previous one, except that another shader is bound after the static geometry is drawn, which attempts to draw the animated geometry. Keep in mind that the static and animated geometry shaders are the EXACT SAME (aside from the animated vertex shader, which transforms vertices based on bone transforms).
The result of this code is my animated meshes are being drawn, not only for the current frame, but for all previous frames as well. The result looks something like this: https://gyazo.com/fef2faccbfd03377c0ffab3f9a8cb8ec
My initial thought when writing this code was that the things drawn with the animated shader would simply overwrite the previous data (assuming the depth is lower), since I'm not clearing the depth or color buffers. This obviously isn't the case.
If anyone has an idea as to how to fix this, that would be great!
Turns out that I wasn't clearing my vector of render submissions, so I was adding a new mesh to draw every frame.
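In code form, a minimal sketch of that fix (renderSubmissions, Mesh, beginFrame() and submit() are placeholder names, not the asker's actual code):

#include <vector>
struct Mesh; // whatever your mesh type is

std::vector<Mesh*> renderSubmissions;

void beginFrame() {
    renderSubmissions.clear(); // without this, submissions pile up frame after frame
}

void submit(Mesh* mesh) {
    renderSubmissions.push_back(mesh);
}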

OpenGL offscreen rendering

I was trying to render a 2D scene into an off-screen framebuffer object and use the glFramebufferTexture2D function so the frame could be used as a texture on a cube.
The 2D scene is rendered in one context and the texture is used in another one in the same thread.
The problem is that when I textured the cube, the alpha channel seemed to be incorrect. I used apitrace to check the texture; it has the correct alpha values, and the shader was merely out_color = texture(in_texture, uv_coords).
The problem was solved if I blitted the off-screen framebuffer's color attachment to anything, whether itself or framebuffer 0 (the output window).
I was wondering why this is happening and how to solve it without needing to blit the framebuffer.
Found out that I was using single buffering for the static 2D scene, which requires a glFlush to flush the pipeline.
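As a sketch, assuming the static 2D scene is drawn once into the off-screen FBO in its single-buffered context (sceneFBO and drawScene2D() are placeholder names), the flush goes right after the off-screen draw calls, before the other context samples the texture:

glBindFramebuffer(GL_FRAMEBUFFER, sceneFBO);
drawScene2D();      // render the static 2D scene into the colour attachment
glFlush();          // make sure the queued commands execute before the texture is used
glBindFramebuffer(GL_FRAMEBUFFER, 0);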

How to draw from the inside of the light geometry in deferred shading

I'm trying to implement a deferred shader with OpenGL and GLSL and I'm having trouble with the light geometry. These are the steps I'm taking:
Bind multitarget framebuffer
Render color, position, normal and depth
Unbind framebuffer
Enable blend
Disable depth testing
Render every light
Enable depth testing
Disable blend
Render to screen
But since I'm only rendering the front faces, when I'm inside a light it disappears completely; rendering the back face as well does not work, since I would get double the light power (and, when inside, half [or the normal amount]).
How can I render the same light value from inside and outside the light geometry?
Well, in my case I do it like this (a consolidated code sketch follows the steps):
Bind gbuffer framebuffer
Render color, position, normal
Unbind framebuffer
Enable blend
Enable depth testing
glDepthMask(0);
glCullFace(GL_FRONT); //to render only backfaces
glDepthFunc(GL_GEQUAL); // pass if the light fragment is at or behind the geometry; otherwise the light shouldn't affect it
Bind light framebuffer
Blit depth from gbuffer to light framebuffer //so you can depth-test light volumes against geometry
Render every light
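Put together, a minimal sketch of that state setup (gBufferFBO, lightFBO, width, height and drawLightVolumes() are placeholder names, and additive blending is assumed for accumulating the lights):

glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE);      // accumulate light contributions additively
glEnable(GL_DEPTH_TEST);
glDepthMask(GL_FALSE);            // test against scene depth but never write to it
glEnable(GL_CULL_FACE);
glCullFace(GL_FRONT);             // render only the back faces of each light volume
glDepthFunc(GL_GEQUAL);           // back face behind geometry => the light can touch it
glBindFramebuffer(GL_READ_FRAMEBUFFER, gBufferFBO);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, lightFBO);
glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                  GL_DEPTH_BUFFER_BIT, GL_NEAREST); // copy gbuffer depth for the test
glBindFramebuffer(GL_FRAMEBUFFER, lightFBO);
drawLightVolumes();               // render every light volume with the light shader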
If I remember correctly, in my deferred renderer I just render only the back faces of the light volume. The drawback is that you cannot depth-test; you only know whether a light is behind geometry after the light calculation is done, and then you discard the pixel.
As another answer explained, you can do depth testing: test for greater-or-equal to see if the back face is behind or on the geometry and therefore intersects its surface.
Alternatively, you could check whether you are inside the light volume when rendering and switch the culled face accordingly, as in the sketch below.
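A rough sketch of that inside-check (cameraPos, lightPos and lightRadius are placeholder names; GLM is assumed for the vector math, and the small margin avoids popping right at the volume's shell):

#include <glm/glm.hpp>

bool inside = glm::distance(cameraPos, lightPos) < lightRadius * 1.1f;
if (inside) {
    glCullFace(GL_FRONT);   // camera inside the volume: draw the back faces
    glDepthFunc(GL_GEQUAL);
} else {
    glCullFace(GL_BACK);    // camera outside: draw the front faces as usual
    glDepthFunc(GL_LEQUAL);
}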

Set background color to a texture

So in OpenGL, when I call glClearColor(), is there a way to set the background color to a texture instead of just a static color? Or is there another method that can do that besides glClearColor()?
You cannot clear the screen to a texture. You can:
Draw a textured quad the size of the screen.
Blit a texture from an FBO onto the screen.
Either one will work.
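For the blit option, a minimal sketch (sourceFBO must have the texture as its colour attachment; width and height are the window size, and all three are placeholder names):

glBindFramebuffer(GL_READ_FRAMEBUFFER, sourceFBO); // FBO the texture is attached to
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);         // default framebuffer, i.e. the window
glBlitFramebuffer(0, 0, width, height,             // source rectangle
                  0, 0, width, height,             // destination rectangle
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);

Done at the start of the frame, this effectively replaces the glClearColor() background with the texture's contents.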

Output of a fragment shader as a texture

I have a problem with the fragment shader; this is my situation:
I have a 3D scene with a simple 2D square representing a wall (drawn with "GL.GL_QUADS") in the middle.
I move the virtual camera using the function "glu.gluLookAt".
I implemented a simple fragment shader for the wall that basically changes the color of the wall with respect to the distance from the wall to the virtual camera (using dFdx and dFdy).
The problem is that instead of visualizing the output of the shader on the wall, I would like to store the output in a buffer or a texture.
I tried "gl.glBindFramebufferEXT", but in this case the output was the entire rendering of the virtual scene, not just the output of the shader applied to the wall.
So how can I "extract" only the output of a fragment shader applied to a GL_QUADS, without "extracting" the whole rendered scene?
You will need to set up an ortho projection and render only the quad needed to the FBO (or just a screen-aligned quad). Then render the scene with the contents of the FBO bound as a texture.
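In outline, the two passes could look like this (fbo, wallTexture, wallShader, texWidth, texHeight, drawWallQuad() and drawScene() are placeholder names; the question's code is JOGL, but the calls map one-to-one to the C style used elsewhere on this page):

// Pass 1: draw only the wall quad into the FBO, so wallTexture ends up holding
// nothing but the fragment shader's output.
glBindFramebuffer(GL_FRAMEBUFFER, fbo);        // wallTexture is its colour attachment
glViewport(0, 0, texWidth, texHeight);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glUseProgram(wallShader);
drawWallQuad();
glBindFramebuffer(GL_FRAMEBUFFER, 0);

// Pass 2: render the full scene, sampling wallTexture wherever it is needed.
glViewport(0, 0, windowWidth, windowHeight);
glBindTexture(GL_TEXTURE_2D, wallTexture);
drawScene();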