I want the alpha value to be read for each pixel from the texture, so that some pixels disappear completely. The texture file (Targa format) does contain a proper alpha channel.
Screenshot: http://i43.tinypic.com/2i79s1x.png
Here are the options I am using:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_BGR, // changing GL_BGR to anything else doesn't do a thing; also tried GL_BGRA
GL_UNSIGNED_BYTE, targaImage);
I have also tried most combinations of parameters for glBlendFunc, but none achieves the effect, although I might have skipped one. This is the one that gets regular blending done right (based on the alpha from glColor):
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
Rectangle drawing:
glColor4f(1,1,1,0.5);
mRect->Render();
If I set the alpha to 1, the rectangle is fully opaque, but there is still white in the bottom right. That means the alpha is read from the texture, yet the white polygon beneath it shows through. So I need to make the polygon itself disappear somehow while the texture remains visible.
That's how I produced the picture above. I have also experimented with this:
glAlphaFunc(GL_GREATER, 0.49);
glEnable(GL_ALPHA_TEST);
It only proves that the alpha of each "fragment" of my rectangle is 0.5.
The texture file has a gradient that is full red around the blue circle in the middle, but the alpha goes from 0 in the top-left to full in the bottom-right (it's not the red color fading to white).
I would supply the whole code, but it's more than 2k lines split across classes, so I'm only pulling out the parts I think are important.
Do I need my own shader to do this? I only made first contact with OpenGL and C++ a couple of weeks ago and I'm not deeply familiar with them yet, so if a shader is the solution, I would appreciate a link to a tutorial that covers alpha and GLSL.
Thank you :)
It looks like you're using the old, fixed-function pipeline. With it you must properly configure the texture environment. Specifically, you want the texture to modulate or replace the base color. Either works, but I presume replace mode is better suited for you.
After binding the texture, set the mode using glTexEnvi; in your case specifically:
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
Also, I want to remind you that you actually must enable blending (glEnable(GL_BLEND);).
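Putting both together, a minimal sketch reusing the names from your snippet (the `texture` handle name is an assumption):

glBindTexture(GL_TEXTURE_2D, texture); // whatever handle your loader returned
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE); // take color and alpha from the texture
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
mRect->Render(); // with GL_REPLACE the glColor4f alpha no longer matters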
If you look at the reference page for glTexEnv, I think you'll easily grasp how ridiculously complex the state space of the fixed-function texture environment pipeline became. I strongly suggest you don't bother with it and go directly for the programmable pipeline, i.e. fragment shaders. Yes, the learning curve is significantly steeper, but with shaders you can actually write something legible like
#version 330

uniform sampler2D sampler_albedo;
uniform sampler2D sampler_diffuse;
uniform vec3 basecolor;

in vec2 texcoord;
out vec4 outcolor;

void main()
{
    float albedo = texture(sampler_albedo, texcoord).r;
    vec4 diffuse = texture(sampler_diffuse, texcoord);
    outcolor = vec4(mix(basecolor, diffuse.rgb, diffuse.a) * albedo, 1.0);
}
instead of spending over 15 lines setting up the texture environment and register combiners.
If I use the fixed pipeline, I can use
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
to make an image 'pixelated', as opposed to having the samples between pixels interpolated. How would I do the same thing in a GLSL program? I'm using the texture2D function. I ask because I am using a shader program for my skybox, and you can see the edges because the edge pixels get blurred with grey. This problem goes away if I use the fixed pipeline and the above function calls.
You can use the same texture minification and magnification filters with the programmable pipeline. It sounds like the issue is not the min/mag filter but how you're handling texture clamping/wrapping. Either that, or your textures have gray in them, which you probably don't want.
To set up texture clamping, you can do the following:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
This causes any sample that falls outside the texture to return the color of the nearest texel inside the texture.
As the other answers and comments already pointed out, the texture sampling states affect the fixed-function pipeline and the programmable pipeline in the same way. I'd just like to add that in shaders you can also completely bypass sampling and use the GLSL texelFetch() function to access the unfiltered texels directly, which will basically look like GL_NEAREST filtering. You also lose the wrapping functionality and have to use unnormalized integer texture coordinates, though, so this is probably not what you want in this scenario.
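For illustration, a minimal fragment-shader sketch of the texelFetch() approach (the in/out variable names are assumptions):

#version 330
uniform sampler2D tex;
in vec2 texcoord;
out vec4 color;

void main()
{
    // map normalized coordinates to integer texel coordinates,
    // clamping to handle the edge case texcoord == 1.0
    ivec2 size = textureSize(tex, 0);
    ivec2 tc = min(ivec2(texcoord * vec2(size)), size - 1);
    color = texelFetch(tex, tc, 0); // unfiltered, like GL_NEAREST
}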
I'm trying to blend two partially overlapping textures in GLSL and am wondering if I'm misunderstanding the concept of multi-texturing. Is it required that the textures fully overlap, or can you have two offset textures that blend only where they overlap?
I have two images similar to the following (minus grid lines and text):
Example image
Ideally, the overlapping sections of the image would blend together nicely so that the final result would look like one smooth image that combines the two together. Overlapping orange pixels, for example, would blend together or take the higher intensity.
I'm new to GLSL and have been following this GLSL Shader Article, which uses a fragment shader to blend the textures (fairly standard).
Following the article, I'm setting up each texture like so:
glUseProgramObjectARB( m_hProgramObject );
GLint nParamObj = glGetUniformLocationARB( m_hProgramObject, pParamName_i );
...
glActiveTexture(GL_TEXTURE0 + nTextureID_i );
glBindTexture(GL_TEXTURE_2D, nTextureID_i);
glUniform1iARB( nParamObj, nTextureID_i );
I then bind each texture and draw triangle strips. My textures are created as:
glBindTexture( GL_TEXTURE_2D, m_nTextureID );
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);
glPixelStorei(GL_UNPACK_SKIP_ROWS, 0);
glPixelStorei(GL_UNPACK_SKIP_PIXELS, 0);
glTexImage2D(GL_TEXTURE_2D, 0, 4, nWidth, nHeight, 0, GL_RGBA,
GL_UNSIGNED_BYTE, pbyData);
Does that process seem reasonable or am I misunderstanding the concept? Any tips or advice on how to achieve this?
That process certainly seems adequate. The advantage of using a fragment shader is that you get complete control over how the textures are combined. For the offset you may want two sets of texture coordinates, one for each image, or you could generate them implicitly. Figuring out what you want and writing the fragment shader will probably be the difficult bit (a sketch follows below). Unfortunately, if you want to blend many different textures, a fragment shader used this way can get quite expensive or simply won't work with too many textures bound.
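As a sketch of the blending itself, a fragment shader that combines two overlapping textures could look like the following (the uniform/varying names and the max() rule for "higher intensity" are assumptions):

#version 120
uniform sampler2D tex0;
uniform sampler2D tex1;
varying vec2 texcoord0; // one set of coordinates per image
varying vec2 texcoord1;

void main()
{
    vec4 a = texture2D(tex0, texcoord0);
    vec4 b = texture2D(tex1, texcoord1);
    gl_FragColor = max(a, b); // take the higher intensity where the images overlap
}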
Your example image doesn't look like any blending has occurred at all; the images are just positioned one over the other. In that case it's easier to just draw separate pieces of geometry with the textures mapped onto them.
Blending is typically done by the fixed-pipeline blending stage, for example with the following calls:
glEnable(GL_BLEND);
glBlendFunc(src_scale, dest_scale);
One of the most common configurations is alpha blending with the over operator: glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), in which the amount blended is given by the alpha value of the colour you're drawing, possibly influenced by the A component of your GL_RGBA texture. You can further manipulate the blend equations if needed. See Blending.
I am having issues alpha blending glPoints. The alpha blending works correctly when a glPoint is over the background, but when a glPoint overlaps another glPoint, the background color is visible rather than the underlying glPoint.
// create texture
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, spriteData);
// Draw
glEnable(GL_DEPTH_TEST);
glEnable(GL_BLEND);
glBlendFunc (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBindTexture(GL_TEXTURE_2D, tex);
....
glDrawArrays(GL_POINTS, 0, numberOfPoints);
// Frag Shader
uniform sampler2D Texture;
void main(void)
{
gl_FragColor = texture2D(Texture, gl_PointCoord);
}
What am I doing wrong?
It looks like the depth test is causing your issues. What happens is, as you describe: a point in front is drawn first and its depth value is written. Then the point behind is rasterized, but all of its fragments fail the depth test, so nothing more is rendered/blended.
As Andon M. Coleman pointed out, you really need to sort the fragments by depth for correct alpha blending. (Exact order-independent transparency is currently impractical for particles, although you could try some of the approximate techniques; averaging all colours using alpha values as weights can give decent results too.)
Given your particle density and the lack of variation among the particles, there probably won't be much visible difference between sorting and not sorting. In that case, draw all the opaque stuff first with depth testing enabled, then draw the particles. You want to keep depth testing enabled so your particles aren't drawn when they're behind opaque things, but you don't want them to obscure each other: use the depth buffer for testing, but don't write depth values. For this, use glDepthMask, as sketched below.
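A minimal sketch of that draw order, reusing the names from your snippet (drawOpaqueScene() is a hypothetical placeholder for your opaque pass):

// draw opaque geometry first, writing depth as usual
glEnable(GL_DEPTH_TEST);
glDepthMask(GL_TRUE);
drawOpaqueScene(); // hypothetical placeholder

// then the blended points: test against depth, but don't write it
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthMask(GL_FALSE);
glBindTexture(GL_TEXTURE_2D, tex);
glDrawArrays(GL_POINTS, 0, numberOfPoints);
glDepthMask(GL_TRUE); // restore depth writes for the next frame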
I'm working with sprite animation (OpenGL + C++).
I have some trouble working with blending.
I'm trying to load an image with a black background and draw it over another texture without a block of black appearing around the image. The image has an alpha channel, and blending is enabled.
I tried playing with different blending functions, but I either end up with a blocky image or a translucent image.
I know I could do it by replacing the black background with a transparent color in image-editing software, but I would like to get this working without that and without using an image mask.
An example to better understand my situation.
The image, and the image drawn over the texture incorrectly:
The way I want it to be:
Here is a bit of code that I'm using. I picked out what seemed to be most relevant since a lot of code is spread out over a few classes.
glEnable(GL_DEPTH_TEST);
....
....
glEnable(GL_TEXTURE_2D);
/*Drawing the image with black background first*/
glBindTexture(GL_TEXTURE_2D, blockImage);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
//drawing code
...
...
...
/*background texture is drawn last*/
glBindTexture(GL_TEXTURE_2D, bgImage);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glBlendFunc(GL_ONE, GL_DST_ALPHA);
//drawing code
...
glDisable(GL_BLEND);
glDisable(GL_TEXTURE_2D);
I don't really need the actual code to do this. A small explanation of the logic would suffice (order of drawing and blending).
The OpenGL fixed-function pipeline has no built-in support for color keying. If you want it, you can write a shader that tests the fragment's color and uses the discard operation.
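For illustration, a minimal sketch of such a color-key fragment shader (the 0.05 threshold and the variable names are assumptions):

#version 120
uniform sampler2D tex;
varying vec2 texcoord;

void main()
{
    vec4 c = texture2D(tex, texcoord);
    // treat near-black texels as fully transparent; tune the threshold to taste
    if (max(c.r, max(c.g, c.b)) < 0.05)
        discard;
    gl_FragColor = c;
}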
If you want to do this with the fixed-function pipeline instead, you'll have to use the alpha channel properly (make alpha = 0 in all the black areas as a pre-processing step).
I'm loading a raw texture (with alpha channel) and displaying it in OpenGL. Everything works and the texture displays, but the color is a little darker than the original. I have already tried turning off lighting, blending, and dithering, but that doesn't help.
I'm using Mac OS X.
Sample image
http://postimage.org/image/2wi1x5jic/
Here's the OpenGL texture loading source code:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, &bytes[0]);
EDIT:
That's very weird. I used the example from http://forums.tigsource.com/index.php?topic=9560.0
and got the same glitch, so the problem is not in my code. Maybe driver options? Hm...
SOLUTION:
Thanks @datenwolf; the images were saved with an sRGB color profile. The problem was solved once I removed it and converted the images to RGB.
Maybe you have GL_MODULATE set as the texturing environment and the vertex colours are not white. Try setting the texture environment to GL_REPLACE.
glBindTexture(GL_TEXTURE_2D, your_texture);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
EDIT
Another problem may be a colour profile embedded in the image. An image viewer may use this colour profile to implement colour management, adjusting the colours for your monitor's colour profile. OpenGL as-it-is doesn't do colour management; there is an extension that makes framebuffers and textures sRGB, which is kind of the smallest common denominator of colour management. But then you'd still have to convert your input images to the sRGB colour space.
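As a sketch of that sRGB route (assuming GL 2.1+ or EXT_texture_sRGB is available, and reusing the names from your loading code):

// upload with an sRGB internal format so OpenGL linearizes
// the texel values when the texture is sampled
glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB8_ALPHA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, &bytes[0]);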
I have a lengthy article in preparation that explains in depth how to do colour management with OpenGL, but it's far from complete.