I'm trying to render a semi-transparent object inside a skybox. However, with my current implementation the textures are blended with the background color instead of the skybox (when there is no other object in the way).
Here are the parts of my implementation I thought would be useful to share:
init:
[...]
glEnable(GL_DEPTH_TEST);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
[...]
render:
[...]
[render scene]
[...]
glUseProgram(program_skybox);
glDepthFunc(GL_LEQUAL);
glDepthMask(GL_FALSE);
[bind view & projection matrices]
[draw skybox]
glDepthMask(GL_TRUE);
glDepthFunc(GL_LESS);
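For completeness, the semi-transparent object is drawn inside [render scene], i.e. before the skybox pass; roughly like this (program_transparent is just a placeholder name):
glUseProgram(program_transparent); // placeholder name for the object's shader
// [bind matrices & textures]
// [draw the semi-transparent object] <- blends against whatever is already in the color buffer (currently only the clear color)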
What should I change to make the object blend with the skybox?
I suppose the problem is with the depth buffer?
I'm currently implementing skeletal animation in my deferred rendering pipeline. Since each vertex in a rigged mesh takes at least an extra 32 bytes (for the per-vertex bone IDs & weights), I thought it would be a good idea to make a separate shader that is in charge of drawing animated meshes.
With that in mind, I have a simple geometry buffer (framebuffer) with 4 color attachments, which are written to by my static geometry shader. The C++ code looks like:
glDisable(GL_BLEND);
glEnable(GL_DEPTH_TEST);
glEnable(GL_CULL_FACE);
glCullFace(GL_BACK);
glBindFramebuffer(GL_FRAMEBUFFER, gID); // Bind FBO
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glUseProgram(staticGeometryShaderID); // Bind static geometry shader
// Pass uniforms & draw static meshes here
glBindFramebuffer(GL_FRAMEBUFFER, 0); // Unbind FBO
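For reference, the G-buffer behind gID is set up along these lines (the formats, width/height and the gTextures array are placeholders rather than my exact code):
GLuint gID, gTextures[4], gDepth;
glGenFramebuffers(1, &gID);
glBindFramebuffer(GL_FRAMEBUFFER, gID);

glGenTextures(4, gTextures);
GLenum drawBuffers[4];
for (int i = 0; i < 4; ++i)
{
    glBindTexture(GL_TEXTURE_2D, gTextures[i]);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, width, height, 0, GL_RGBA, GL_FLOAT, nullptr);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0 + i, GL_TEXTURE_2D, gTextures[i], 0); // color attachment i
    drawBuffers[i] = GL_COLOR_ATTACHMENT0 + i;
}
glDrawBuffers(4, drawBuffers); // write to all four attachments

// Depth renderbuffer so GL_DEPTH_TEST works inside the FBO
glGenRenderbuffers(1, &gDepth);
glBindRenderbuffer(GL_RENDERBUFFER, gDepth);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, gDepth);

glBindFramebuffer(GL_FRAMEBUFFER, 0);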
The draw code above functions correctly. The issue arises when I try to add my animation shader into the mix. The following is what I am currently using:
glDisable(GL_BLEND);
glEnable(GL_DEPTH_TEST);
glEnable(GL_CULL_FACE);
glCullFace(GL_BACK);
glBindFramebuffer(GL_FRAMEBUFFER, gID); // Bind FBO
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glUseProgram(staticGeometryShaderID); // Bind static geometry shader
// Pass uniforms & draw static meshes here
glUseProgram(animationShaderID); // Bind animated geometry shader
// Pass uniforms & draw animated meshes here
glBindFramebuffer(GL_FRAMEBUFFER, 0); // Unbind FBO
The above example is the same as the last, except that another shader is bound after the static geometry is drawn, which attempts to draw the animated geometry. Keep in mind that the static geometry shader and animated geometry shader are the EXACT SAME (apart from the animated vertex shader, which transforms vertices based on the bone transforms).
The result of this code is that my animated meshes are drawn not only for the current frame but for all previous frames as well. The result looks something like this: https://gyazo.com/fef2faccbfd03377c0ffab3f9a8cb8ec
My initial thought when writing this code was that anything drawn with the animated shader would simply overwrite the previous data (assuming its depth is lower), since I'm not clearing the depth or color buffers. That obviously isn't the case.
If anyone has an idea as to how to fix this, that would be great!
It turns out that I wasn't clearing my vector of render submissions, so I was adding a new mesh to draw every frame.
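In other words, something along these lines was missing at the end of my frame (the names here are just illustrative):
// Sketch of the fix: the per-frame submission list has to be emptied once it
// has been drawn, otherwise every frame appends another copy of the same mesh.
std::vector<RenderSubmission> animatedSubmissions; // filled by submit calls during the frame

void EndFrame()
{
    DrawSubmissions(animatedSubmissions); // the deferred geometry pass shown above
    animatedSubmissions.clear();          // <-- this call was missing
}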
I have an issue with a texture that has an alpha channel. I'm rendering a palm tree with leaves:
but as you can see, the sky is drawn over the leaves on the left side of the picture.
In my code, the sky is rendered first, then I render the trees.
Here is my code which renders one palm tree:
RenderFrame(0); // trunk
//glColor3f(0.0, 0.6, 0.0);
glEnable(GL_BLEND);
glDisable(GL_CULL_FACE);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
leaves.RenderFrame(0);
glEnable(GL_CULL_FACE);
glDisable(GL_BLEND);
Like others have stated, it seems the order of rendering is wrong. I've had this issue in the past and it isn't a simple fix, especially since you are using deprecated immediate mode. Take a look at the solutions in this question: OpenGL ES2 Alpha test problems
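If you stay with the fixed-function pipeline, the usual stop-gap for foliage is an alpha test, so that (nearly) transparent texels are discarded instead of being blended and writing depth; roughly like this (the 0.5 threshold is just a starting point):
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.5f); // discard texels whose alpha is below the threshold
glDisable(GL_CULL_FACE);
leaves.RenderFrame(0);
glEnable(GL_CULL_FACE);
glDisable(GL_ALPHA_TEST);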
(Screenshot of the rendering artifact: http://byte-werx.com/rendering-artifact.png)
When I create two sprite batches and attempt to draw twice in the same frame, half of my screen (or thereabouts) gets "lost"; this happens regardless of the position of the little campfire sprite.
When rendering in wireframe mode the same thing happens, so it does not appear that a giant black polygon is being drawn over the underlying tilemap.
This is the code used to initialize OpenGL:
glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
glEnableClientState(GL_VERTEX_ARRAY);
glActiveTexture(GL_TEXTURE1);
glActiveTexture(GL_TEXTURE0);
glDepthRange(0.0f, 1.0f);
glEnable(GL_DEPTH_TEST);
glDisable(GL_CULL_FACE);
glEnable(GL_TEXTURE_2D);
glDisable(GL_LIGHTING);
glDepthFunc(GL_LEQUAL);
glDisable(GL_DITHER);
glClearDepth(1.0f);
glFrontFace(GL_CW); // clockwise front faces (GL_CW is not a valid glEnable cap)
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
I have uploaded the relevant code here: download
SDL2 is used for window creation and context management; however, I do not use anything else from SDL.
Resolved the issue: I was not unbinding the vertex array and the array/element buffers after calling glDrawElements.
Had to put this after glDrawElements:
glBindVertexArray(0);
glBindBuffer(GL_ARRAY_BUFFER, 0);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
I am trying to render to an FBO and then use that texture as an input to my second render pass for post-processing, but it seems that glClear and glClearColor affect the texture that has been rendered to. How can I make them affect only the display buffer?
My code looks something like this:
UseRenderingShaderProgram();
glBindFramebuffer(GL_FRAMEBUFFER, fb);
glClearColor(0.0, 0.0, 0.0, 0.0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
renderWorld();
// render to screen
UsePostProcessingShaderProgram();
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glClearColor(0.0, 0.0, 0.0, 1.0); // <== the texture appears to get cleared by these two lines
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
renderWorld();
glfwSwapBuffers();
If I had to make an educated guess, you've defined your texture to use mipmapped minification filtering. After rendering to a texture through an FBO, only one mipmap level has defined contents; without all of the selected mipmap levels present, the texture is incomplete and will not deliver data. The easiest solution is to disable mipmapping for this texture by setting its minification filter parameter:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
Also, you must make sure that you're binding and unbinding the texture attached to the FBO in the right order:
Unbind the texture before binding the FBO (you can have it attached to it the whole time safely).
Unbind the FBO before binding the texture as image source for rendering.
Adding those two changes (binding order and mipmap levels) should make your texture appear.
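Put together, the frame would then look something like this (fbTexture here is a stand-in for whatever texture you attached to fb):
UseRenderingShaderProgram();
glBindTexture(GL_TEXTURE_2D, 0);          // the texture must not be bound while it is the render target
glBindFramebuffer(GL_FRAMEBUFFER, fb);
glClearColor(0.0, 0.0, 0.0, 0.0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
renderWorld();                            // first pass renders into fbTexture

UsePostProcessingShaderProgram();
glBindFramebuffer(GL_FRAMEBUFFER, 0);     // back to the default framebuffer...
glBindTexture(GL_TEXTURE_2D, fbTexture);  // ...and only now bind the texture as an image source
glClearColor(0.0, 0.0, 0.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
renderWorld();                            // second pass samples fbTexture
glfwSwapBuffers();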
I have a texture with some transparent portions that I want to apply over an object whose faces use an opaque material (or a colour, if that's simpler), but the final object ends up transparent. I want the final object to be totally opaque.
Here is my code:
First I set the material:
glDisable(GL_COLOR_MATERIAL);
glColorMaterial(GL_FRONT_AND_BACK, GL_AMBIENT);
glColor4f(0.00, 0.00, 0.00, 1.00);
glColorMaterial(GL_FRONT_AND_BACK, GL_DIFFUSE);
glColor4f(0.80, 0.80, 0.80, 1.00);
glColorMaterial(GL_FRONT_AND_BACK, GL_SPECULAR);
glColor4f(0.01, 0.01, 0.01, 1.00);
glEnable(GL_COLOR_MATERIAL);
Then I set up the VBOs:
glBindTexture(GL_TEXTURE_2D, object->texture);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glBindBuffer(GL_ARRAY_BUFFER, object->object);
glVertexPointer(3, GL_FLOAT, sizeof(Vertex), ver_offset);
glTexCoordPointer(2, GL_FLOAT, sizeof(Vertex), tex_offset);
glNormalPointer(GL_FLOAT, sizeof(Vertex), nor_offset);
And finally I draw the object:
glEnable(GL_BLEND);
glDisable(GL_DEPTH_TEST);
glDisable(GL_TEXTURE_2D);
glBlendFunc(GL_ONE, GL_ZERO);
glDrawArrays(GL_TRIANGLES, 0, object->num_faces);
glEnable(GL_TEXTURE_2D);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDrawArrays(GL_TRIANGLES, 0, object->num_faces);
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glDisable(GL_BLEND);
glEnable(GL_DEPTH_TEST);
I tried passing different arguments to glBlendFunc(), to no avail. I've uploaded the source here: http://dpaste.com/83559/
UPDATE
I get this, but I want this (or without texture this).
The 2nd and the 3rd pictures are produced with glm. I studied its sources, but since my knowledge of OpenGL is limited I didn't understand much.
If you're trying to apply two textures to your object, you really want to bind both textures and use multitexturing to achieve this look. Your method draws the geometry twice, which is a huge waste of performance.
Multitexturing will sample from two texture units while drawing the geometry only once. You can do this with shaders (the way things really should be done) or you can still use the fixed-function pipeline (see: http://bluevoid.com/opengl/sig00/advanced00/notes/node62.html).
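As a rough sketch of the fixed-function route (baseTexture and overlayTexture are placeholder names), each texture unit gets its own environment mode and the geometry is drawn only once:
glActiveTexture(GL_TEXTURE0); // unit 0: the opaque base/material texture
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, baseTexture);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

glActiveTexture(GL_TEXTURE1); // unit 1: the partially transparent overlay
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, overlayTexture);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL); // blended over unit 0 by the overlay's alpha

// Each unit needs its own texture coordinate array:
glClientActiveTexture(GL_TEXTURE0);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glTexCoordPointer(2, GL_FLOAT, sizeof(Vertex), tex_offset);
glClientActiveTexture(GL_TEXTURE1);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glTexCoordPointer(2, GL_FLOAT, sizeof(Vertex), tex_offset); // the same UVs can be reused

glDisable(GL_BLEND); // no framebuffer blending needed
glDrawArrays(GL_TRIANGLES, 0, object->num_faces);
The result stays opaque because nothing is blended against the framebuffer; the overlay is combined with the base texture before the fragment is ever written.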
AFAIK the blend function takes fragment colors (as opposed to texture colors). So if you draw the object a second time with blending, the triangles become transparent.
What you want to accomplish could be done using multitexturing.
This is just a wild guess, as you haven't provided any screenshots of the actual problem, but why do you disable the depth test? Surely you want depth testing enabled on the first pass with a standard GL_LESS and then do the second pass with GL_EQUAL?
Edit:
i.e.:
glEnable(GL_BLEND);
glEnable(GL_DEPTH_TEST); // ie do not disable
glDepthFunc( GL_LESS ); // only pass fragments whose z value is less than the one already in the z-buffer (i.e. that are in front of any previously drawn pixels)
glDisable(GL_TEXTURE_2D);
glBlendFunc(GL_ONE, GL_ZERO);
glDrawArrays(GL_TRIANGLES, 0, object->num_faces);
// for the second pass we only want to blend pixels where they occupy the same position
// as in the previous pass. Therefore set to equal and only pixels that match the
// previous pass will be blended together.
glDepthFunc( GL_EQUAL );
glEnable(GL_TEXTURE_2D);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDrawArrays(GL_TRIANGLES, 0, object->num_faces);
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glDisable(GL_BLEND);
Try disabling blending and drawing a single pass with your texture function set to GL_DECAL instead of GL_MODULATE. This will blend between the vertex color and the texture color based on the texture’s alpha channel, but leave the alpha channel set to the vertex color.
Note that this will ignore any lighting applied to the vertex color anywhere the texture was opaque, but this sounds like the intended effect, based on your description.
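In code that comes down to a single pass with blending off, for example:
glDisable(GL_BLEND);
glEnable(GL_DEPTH_TEST);
glEnable(GL_TEXTURE_2D);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL); // texture RGB replaces the lit colour where the texture is opaque; fragment alpha stays at the vertex alpha
glDrawArrays(GL_TRIANGLES, 0, object->num_faces);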
It will be much simpler with pixel shaders. Otherwise, I think you need multi-pass rendering or more than one texture.
You can find comprehensive details here: http://www.opengl.org/resources/faq/technical/transparency.htm