I'm trying to make simple shading for a 2D OpenGL scene by placing semi-transparent black quads over the scene, but in certain places instead of darkening the scene it completely blacks it out. I'm using glBlendFunc(GL_DST_COLOR, GL_SRC_ALPHA) for the shader quads, and glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) for all the other drawing underneath.
This normally works, dimming the area under the semi-transparent quad, but in certain situations it behaves oddly. When only one quad has been drawn behind the shading quad, it behaves normally. When another quad (or several) is drawn on top of that back quad, it sometimes behaves normally, sometimes blacks out the shaded area entirely, and sometimes blacks out everything beneath the last-drawn quad (e.g. if I had a red texture with a blue dot on top of it, the red beneath would be blacked out, but the blue dot would remain unchanged).
This looks like the quads drawing in the wrong order, but I have GL_DEPTH_TEST disabled, so I'm not sure whether this is odd behavior from the blending, a draw-order problem, or something else entirely. Any ideas as to what could be causing this?
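Roughly, the draw order looks like this (drawScene() and drawShadeQuads() are stand-ins for my actual drawing code):

glEnable(GL_BLEND);

glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); /* everything underneath */
drawScene();

glBlendFunc(GL_DST_COLOR, GL_SRC_ALPHA); /* the darkening quads */
drawShadeQuads();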
Related
I'm learning OpenGL here: https://learnopengl.com/#!Advanced-OpenGL/Cubemaps
I made a skybox. If I draw it first, everything is fine. However, to reduce the number of pixels it has to fill, I tried drawing it last. But when I look at the skybox through transparent objects, it is not displayed. If I draw the skybox before the transparent objects, then they are not displayed. How can I fix this?
Transparency is not order independent. You cannot draw something "behind" an already drawn surface. You will have to draw the skybox (at least) before you draw your transparent objects.
Note that you also have to sort your transparent objects back to front if it should be possible to correctly see through multiple of them.
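For example, a minimal back-to-front sort in C, assuming each transparent object stores its distance from the camera in a depth field (the struct and names here are placeholders):

#include <stdlib.h>

typedef struct {
    float depth; /* distance from the camera, updated each frame */
    /* ... whatever else the object needs ... */
} Transparent;

static int farthestFirst(const void *a, const void *b)
{
    float da = ((const Transparent *)a)->depth;
    float db = ((const Transparent *)b)->depth;
    return (da < db) - (da > db); /* larger depth (farther) sorts first */
}

/* qsort(objects, count, sizeof(Transparent), farthestFirst); then draw in that order */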
I need to draw a load of cubes and would like them to be white with black stroke. At the moment I am storing all of these cubes in a VBO and I can draw them in wireframe and filled with no outline.
I would like to draw them like the example on the left of the image, stroked only on the sides facing the camera, not like the one on the right.
I am using OpenGL.
What you want is to remove hidden lines.
If you want to draw a wireframe object with hidden lines removed, one approach is to draw the outlines using lines and then fill the interiors of the polygons making up the surface with polygons having the background color.
You need to call glEnable(GL_CULL_FACE); so that back-facing (not visible) triangles are culled automatically. This only works if the winding order of your triangles is consistent (all clockwise or all counter-clockwise). You can tell OpenGL which winding counts as front-facing with glFrontFace(GL_CW) or glFrontFace(GL_CCW), and whether to cull front- or back-facing triangles with glCullFace(GL_FRONT) or glCullFace(GL_BACK).
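As a sketch, the classic two-pass way to get white faces with black visible edges uses polygon offset (drawCubes() is a placeholder for however you submit your VBO):

glEnable(GL_DEPTH_TEST);
glEnable(GL_CULL_FACE);

/* Pass 1: white fill, pushed slightly back so the outlines win the
   depth test along shared edges. */
glEnable(GL_POLYGON_OFFSET_FILL);
glPolygonOffset(1.0f, 1.0f);
glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
glColor3f(1.0f, 1.0f, 1.0f);
drawCubes();
glDisable(GL_POLYGON_OFFSET_FILL);

/* Pass 2: same geometry as black wireframe; edges hidden behind the
   filled pass fail the depth test and disappear. */
glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
glColor3f(0.0f, 0.0f, 0.0f);
drawCubes();
glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);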
I'm learning about how to use JOGL and OpenGL to render texture-mapped quads. I have a test program and a test quad, and I figured out how to enable GL_BLEND so that I can specify the alpha value of a vertex to make a quad with a sort of gradient... but now I want this to show through to another textured quad at the same position.
Drawing two quads with the same vertex locations didn't work; only the first quad gets rendered. Is this possible, then, or will I need to basically construct a custom texture on the fly based on what I want and then draw one quad with that texture? I was really hoping to take advantage of blending in this case...
Have a look at which glDepthFunc you're using. Perhaps you're using GL_LESS/GL_GREATER, and it could work if you use GL_LEQUAL/GL_GEQUAL instead.
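For example (assuming the second quad is being rejected by the depth test):

/* The default depth function is GL_LESS, so fragments of a second quad
   at exactly the same depth are rejected; GL_LEQUAL lets them pass. */
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);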
It's difficult to make out from the question what exactly you're trying to achieve, but here's a try.
For transparency to work correctly in OpenGL, you need to draw the polygons from the furthest to the nearest to the camera. If your scene is static, this is definitely something you can do. But if it's rotating and moving, it's usually not feasible, since you would have to sort the polygons for each and every frame.
More on this can be found in this FAQ page:
http://www.opengl.org/resources/faq/technical/transparency.htm
For alpha blending, the renderer blends the colors behind the current transparent object (from the camera's point of view) at the time the transparent object is rendered. If the transparent object is rendered first, there is nothing behind it to blend with. If it's rendered second, it will have something to blend with.
Try rendering your opaque quad first, then your transparent quad second. Also, make sure the opaque quad is slightly behind the transparent quad (relative to the camera) so you don't get z-fighting.
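A minimal sketch of that order (the two draw calls are placeholders for your own quads):

glEnable(GL_DEPTH_TEST);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

drawOpaqueQuad();      /* opaque geometry first */
drawTransparentQuad(); /* transparent geometry second, slightly nearer the camera */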
I am trying to create a simple ray tracer. I have a perspective view which shows the rays visibly for debugging purposes.
In my example screenshot below I have a single white sphere to be raytraced and a green sphere representing the eye.
Rays are drawn as lines with
glLineWidth(10.0f)
If a ray misses the sphere, it is given the color glColor4ub(100, 100, 100, 100);
In my initialization code I have the following:
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.0f);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA,GL_SRC_ALPHA);
You can see in the screen shot that for some reason, the rays passing between the perspective view point and the sphere are being color blended with the axis line behind the sphere, rather than with the sphere itself.
Here is a screenshot:
Can anyone explain what I am doing wrong here?
Thanks!!
Is it possible that you cast those rays before you draw the sphere?
If so, and the Z-buffer is enabled, the sphere's fragments simply won't be rendered, because those parts of the rays are closer. When you draw something semi-transparent (using blending), you have to watch the order you draw things in carefully.
In fact, I don't think you can use the Z-buffer in any sensible way together with a ray-tracing visualization; you'll have to track the Z-order manually. While we're at it, OpenGL might not be the best API to visualize the ray-tracing process (it will possibly be much slower than a pure software ray tracer).
You don't need the glAlphaFunc; disable it.
Light rays should be blended by adding them into the framebuffer: glBlendFunc(GL_ONE, GL_ONE) (this is also the blend function for premultiplied alpha, which you chose).
Turn off depth buffer writing (not testing) when rendering the rays: glDepthMask(GL_FALSE)
Render the rays last.
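Putting those suggestions together (drawRays() is a stand-in for your ray-drawing code):

glDisable(GL_ALPHA_TEST);    /* alpha test only discards fragments */
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE); /* additive blending for the light rays */
glDepthMask(GL_FALSE);       /* stop writing depth, keep testing it */
drawRays();                  /* rendered after everything else */
glDepthMask(GL_TRUE);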
Alpha test is only for discarding fragments, not for blending them; check the spec.
By using it, you are telling OpenGL that you want it to throw away those pixels instead of drawing them, so you won't get any transparent blending. The most common blending function is
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
You can also check out the OpenGL Transparency FAQ.
I am rendering 4 vertices (a square) in front of a colored cube. The vertices are colored white, but are blended at 0.5f.
Related: Why does my colored cube not work with GL_BLEND?
Please could someone tell me why the colored cube appears brighter when obscured by the semi-opaque square?
Cube rendered without square in front:
Normal cube http://img408.imageshack.us/img408/2853/normalcube.png
And, rendered with the square:
Cube with square http://img142.imageshack.us/img142/6255/brightsquare.png
Please see the code used to create the colored cube, the code used to actually draw the cube, and the code where the cube and square are rendered.
This is the code in my init function:
glEnable(GL_CULL_FACE);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE);
I'd say it's because your semi-transparent square gets added to the existing pixels, increasing their intensity.
The documentation for glBlendFunc() recommends setting the second parameter to GL_ONE_MINUS_SRC_ALPHA; that is the boilerplate for implementing transparency. Try it.
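In other words, something like:

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); /* instead of GL_ONE */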