Does anyone know how I can achieve the following effect in OpenGL:
* Change the brightness of the rendered scene, or
* Implement a gamma setting in OpenGL?
I have tried changing the ambient parameter of the light and the light type (directional and omnidirectional), but the result was not uniform. TIA.
Thanks for your help. Some additional information:
* I can't use any Windows-specific APIs.
* The gamma setting should not affect the whole window, as I must have different gamma for different views.
On Win32 you can use SetDeviceGammaRamp to adjust the overall brightness/gamma. However, this affects the entire display, so it's not a good idea unless your app is fullscreen.
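For completeness, a minimal Win32 sketch of that gamma-ramp approach (this is the Windows-specific API the question rules out, and it affects the whole display):

#include <windows.h>
#include <math.h>

/* Build a 256-entry ramp per channel mapping x to x^(1/gamma);
   gamma == 1.0 restores the identity ramp. */
BOOL apply_gamma(HDC hdc, double gamma)
{
    WORD ramp[3][256];
    for (int i = 0; i < 256; ++i) {
        double v = 65535.0 * pow(i / 255.0, 1.0 / gamma);
        if (v > 65535.0) v = 65535.0;
        ramp[0][i] = ramp[1][i] = ramp[2][i] = (WORD)v;
    }
    return SetDeviceGammaRamp(hdc, ramp);
}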
The portable alternative is either to draw the entire scene brighter or dimmer (which is a hassle), or to slap a fullscreen alpha-blended quad over the whole scene to brighten or darken it as desired. Neither approach can affect the gamma curve, only the overall brightness; to adjust the gamma you need to grab the entire scene into a texture and then render it back to the screen through a pixel shader that runs each texel through a gamma function.
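As a sketch of that gamma pass, the fragment shader could look like this (GLSL embedded as a C string; u_scene and u_gamma are illustrative names, not from the original post):

const char *gamma_fs =
    "uniform sampler2D u_scene;\n"
    "uniform float u_gamma;\n"
    "void main() {\n"
    "    vec4 c = texture2D(u_scene, gl_TexCoord[0].st);\n"
    "    gl_FragColor = vec4(pow(c.rgb, vec3(1.0 / u_gamma)), c.a);\n"
    "}\n";

You would capture the scene with glCopyTexSubImage2D (or render it into an FBO) and then draw a fullscreen quad with this shader bound.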
OK, having read the updated question: what you need is a quad with blending set up to darken or brighten everything underneath it, e.g.:
if( brightness > 1 )
{
    /* result = dst*(brightness-1) + dst = dst*brightness: brightens */
    glBlendFunc( GL_DST_COLOR, GL_ONE );
    glColor3f( brightness-1, brightness-1, brightness-1 );
}
else
{
    /* result = dst*brightness: darkens */
    glBlendFunc( GL_ZERO, GL_SRC_COLOR );
    glColor3f( brightness, brightness, brightness );
}
glEnable( GL_BLEND );
draw_quad();    /* fullscreen quad; disable depth test for this pass */
glDisable( GL_BLEND );
http://www.gamedev.net/community/forums/topic.asp?topic_id=435400 might be an answer to your question; otherwise, you could probably implement gamma correction as a pixel shader.
I am trying to draw brush strokes made of quads with a rough texture into framebuffers that are then composited together. The problem is that the framebuffer texture's initial color is (0, 0, 0, 0), and blending against it creates a dark glow around the edges (here is an example image).
I'm using
gl.blendEquationSeparate( gl.FUNC_ADD, gl.FUNC_ADD );
gl.blendFuncSeparate( gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA, gl.SRC_ALPHA, gl.ONE );
I think I've tried every possible combination of blending settings and none of them work the way I want.
Here is a demo of the problem.
This looks like a pre-multiplication issue to me: with straight (non-pre-multiplied) alpha, blending a brush texel against the framebuffer's transparent black (0, 0, 0, 0) pulls the RGB toward black, which is exactly the dark fringe you're seeing.
For compositing you most likely want to use pre-multiplied alpha. Either pre-multiply the texture when you upload the data
gl.pixelStorei(gl.UNPACK_PREMULTIPLY_ALPHA_WEBGL, true);
or pre-multiply in the shader
gl_FragColor = vec4(color.rgb * color.a, color.a);
And blend with
gl.blendFunc(gl.ONE, gl.ONE_MINUS_SRC_ALPHA);
A quick Google for "pre-multiplied alpha blending" turns up good explanations of the issue.
I'm rendering PNGs on simple squares in OpenGL ES 2.0, but when I try to draw something behind a square I have already drawn, the transparent areas of the top square are rendered the same color as the background.
I am calling these at the start of every render call.
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_DEPTH_TEST);
glEnable (GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
Your title is essentially the answer to your question!
Generally transparency is done by first rendering all opaque objects in the scene (letting the z-buffer figure out what's visible), then rendering all transparent objects from back to front.
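As a sketch of that draw order (draw_opaque_objects() and draw_transparent_objects_back_to_front() stand in for your own rendering code):

glEnable(GL_DEPTH_TEST);
glDepthMask(GL_TRUE);                  /* opaque pass writes depth */
draw_opaque_objects();

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthMask(GL_FALSE);                 /* transparent pass tests depth but doesn't write it */
draw_transparent_objects_back_to_front();
glDepthMask(GL_TRUE);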
Drew Hall gave you a good answer, but another option is to enable GL_ALPHA_TEST with glAlphaFunc(GL_GREATER, 0.1f). This prevents nearly transparent pixels (here, ones with alpha < 0.1) from being rendered at all, so they never write into the Z buffer and other things can "show through". However, this only helps with fully (or nearly fully) transparent texels, not partial translucency. It also produces rough edges wherever the 0.1 alpha boundary falls, which can look bad on distant features where the pixels are large compared to the object.
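For reference, a minimal sketch of that alpha-test setup (note this fixed-function path exists in desktop GL and ES 1.x but not in ES 2.0, where you would discard in the shader instead, as the next answer shows):

glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.1f);   /* drop fragments with alpha <= 0.1 */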
Figured it out. You can discard in the fragment shader
mediump vec4 basecolor = texture2D(sTexture, TexCoord);
if (basecolor.a == 0.0) {
    discard;
}
gl_FragColor = basecolor;
I'm trying to get a fairly simple effect: I'd like my sprites to have their alpha channels used by GL (that is, translucent across parts of the image, but not all of it), as well as an overall "opacity" level that affects the whole sprite.
I've got the latter part; that was a simple enough matter of using GL_MODULATE and passing glColor4d(opacity, opacity, opacity, opacity). Works like a dream.
But the problem is in the first part: partially translucent images. I'd thought that I could just fling out a glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); and enable blending, but unfortunately it doesn't do it. What it seems to do is "whiten" the color of the image in question, rather than making it translucent. Any other sprites passing under it behave as if it were a solid block of color, and get directly cut off.
For reference, I've disabled lighting, the z-buffer, color material, and the alpha test. I set the shade model to flat, just in case. Other than that, I'm using default ortho settings. I'm using glTexImage2D for the texture in question, and I've made sure the formats and GL_RGBA are all set correctly.
How can I get GL to consider the texture's alpha channel during blending?
The simplest and fastest solution is to have a fragment shader.
uniform float alpha;
uniform sampler2D texture;

void main() {
    gl_FragColor = texture2D(texture, gl_TexCoord[0].st);
    gl_FragColor.a *= alpha;
}
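On the application side, feeding the opacity in might look like this (program and opacity are illustrative names, not from the original answer):

glUseProgram(program);
glUniform1f(glGetUniformLocation(program, "alpha"), opacity);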
GL_MODULATE is the way to tell GL to use the texture alpha for the final alpha of the fragment (it is also the default texture environment mode).
Your blend function is also the correct way to use that generated alpha in the blending stage.
So the problem lies elsewhere... Your description sounds like you did not in fact disable the Z-test, and you do not render your sprites back to front. Alpha blending in GL will only do what you want if you draw your sprites back to front; otherwise the sprites get blended in the wrong order, and this does not produce the correct output.
It would be easier to verify this with a picture of what you observe, though.
I'm trying to draw a grayscale image in color as a texture in OpenGL, using two colors: black maps to color1 and white maps to color2. The texture is loaded using GL_RGBA.
I have tried two solutions:
1)
Load image as texture
Draw image on screen
Enable blending
glBlendFunc(GL_ONE_MINUS_DST_COLOR, GL_ZERO); and draw rectangle with color1
glBlendFunc(GL_ONE_MINUS_DST_COLOR, GL_ONE); and draw rectangle with color2
But... when I apply the first color there is no black on screen, and when the second color is applied it gets combined with the first color too.
2)
Save the image as a texture, but instead of the grayscale image use a white image whose alpha channel equals the grayscale
Draw rectangle with color1
Draw image
But... when the image is drawn it doesn't use color1 where the image is transparent; instead it uses the current color set with glColor.
Any help will come in handy :)
In general, when dealing with the OpenGL texturing pipeline, I recommend writing down the end result you want. Here, you want your grayscale value to be used as an interpolant between your two colors:
out_color = mix(color1, color2, texValue)
The math for this is actually something like:
out_color = color1 + (color2 - color1) * texValue
So... is there a texture environment value that does that? Yes, and it's also called GL_BLEND (not to be confused with the blending to the framebuffer that glEnable(GL_BLEND) controls).
So... something like:
// pass color1 as the incoming color to the texture unit
glColor4fv(color1);
GLfloat color2[4] = { ... };
// ask for the texture to be blended/mixed/lerped with the incoming color
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_BLEND);
// specify color2 as per the TexEnv documentation
glTexEnvfv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, color2);
There is no need to draw multiple times or do anything more complicated, as in your attempts. The texturing pipeline can be controlled in plenty of ways; learn them to your advantage!
Your #2 idea would work, but it seems like you didn't set up blending correctly.
It should be:
glEnable( GL_BLEND );
glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );
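A sketch of the whole #2 approach, assuming the texture is white with the grayscale stored in its alpha channel (draw_rectangle() and draw_textured_quad() stand in for your own drawing code):

glDisable(GL_BLEND);
glColor4fv(color1);               /* background shows through transparent texels */
draw_rectangle();

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glColor4fv(color2);               /* GL_MODULATE tints the white texture */
draw_textured_quad();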
This is a rather old problem I've had with an OpenGL application.
I have a rather complex model; some polygons in it are untextured and colored with a plain glColor() color, and others are textured. Parts of the texture are the same color as the untextured polygons, and there should be no visible seam between the two.
The problem is that when I turn up the ambient component of the light source, a seam between the two kinds of polygons emerges.
See this image: http://www.shiny.co.il/shooshx/colorBug2.png
The left image is without any ambient light and the right one is with an ambient light of (0.2, 0.2, 0.2).
The RGB value of the color in the texture is identical to the RGB value of the colored faces, and the texture's alpha is 1.0 everywhere.
To shade the texture I use GL_MODULATE.
Can anyone think of a reason why that would happen and of a possible solution?
You mention that you set the color with glColor(), so I assume that GL_COLOR_MATERIAL is on? What setting do you use for glColorMaterial()? In this case it should be GL_AMBIENT_AND_DIFFUSE, so that the glColor() call affects the ambient color as well as the diffuse color. (This is the default.)
You could also try setting all material colours to white (with glMaterial()) before rendering the texture-mapped faces. With GL_MODULATE (which you are using), the texture is multiplied by the lit fragment color, so a non-white material or color will tint the textured faces.
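A minimal sketch of both suggestions (draw_textured_faces() is a placeholder for your own code):

/* Let glColor() drive both ambient and diffuse on the untextured faces. */
glEnable(GL_COLOR_MATERIAL);
glColorMaterial(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE);

/* Neutralize the color before the textured faces so GL_MODULATE
   doesn't tint the texture. */
GLfloat white[4] = { 1.0f, 1.0f, 1.0f, 1.0f };
glColor4fv(white);   /* with COLOR_MATERIAL on, this sets ambient + diffuse */
draw_textured_faces();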
Hope this helps, or at least points you in a useful direction.