OpenGL multitexture tessellation

I have to tessellate a surface in OpenGL with rectangular textures. For simplicity, assume a single triangle. The textures touch each other along their sides and do not overlap; that is done by setting GL_TEXTURE_WRAP_S and GL_TEXTURE_WRAP_T to GL_CLAMP_TO_BORDER and adjusting the texture coordinates accordingly. Everything works fine while GL_TEXTURE_MIN_FILTER and GL_TEXTURE_MAG_FILTER are set to GL_NEAREST, but when I apply GL_LINEAR filtering and/or anisotropic filtering the following artifact appears: the alpha of the textures' border pixels gradually falls to transparent, so a line of the background color becomes visible between neighbouring textures.
How can I avoid this artifact while keeping linear filtering, without merging the multiple textures into one?

You probably want GL_CLAMP_TO_EDGE instead of GL_CLAMP_TO_BORDER. Clamp to border mixes the edge pixel with the border color, which is initialized to (0,0,0,0). This is where your transparency is coming from.
Either clamp the texture to the actual edge, or set a border color that is nontransparent.
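A minimal sketch of both options, assuming the texture in question is currently bound to GL_TEXTURE_2D:
// Option 1: clamp to the edge texels so filtering never reaches the border color
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
// Option 2: keep GL_CLAMP_TO_BORDER but make the border opaque
const GLfloat border[4] = { 0.0f, 0.0f, 0.0f, 1.0f };  // opaque black
glTexParameterfv(GL_TEXTURE_2D, GL_TEXTURE_BORDER_COLOR, border);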

Related

Rendering transparent texture onto glTexture

I've been working in OpenGL for a while relatively smoothly, but recently I've noticed that when I render a primitive with a transparent texture onto my FBO texture (custom framebuffer), it makes the FBO texture transparent at the pixels where the primitive's texture is transparent. The problem is that there are things behind this primitive (with solid color) already rendered before the transparent one, so the FBO texture should not be transparent at those pixels; blending a solid and a transparent color should result in a solid color, shouldn't it?
So basically, OpenGL is adding transparency to my FBO texture just because the last primitive drawn has transparent pixels, even though solid colors are already drawn into the FBO texture behind it. Shouldn't OpenGL blend the transparent texture with the FBO's existing pixels and produce a solid color, given that the FBO texture is already filled with solid colors before the transparent primitive is rendered?
What happens when I render my FBO texture to the default framebuffer is that the clear color bleeds through the parts of it where the last drawn texture is transparent. But when I render the same scene straight to the default OpenGL framebuffer, the scene looks fine and the clear color does not bleed into the transparent texture.
What's even more interesting is that the glClearColor color is only visible where the primitive's texture's alpha has a gradient; the clear color has no influence where the texture alpha is 1.0 or 0.0... is this a bug? It seems to affect the primitive's texture most at pixels with 0.5 alpha, and going further above or below that decreases the glClearColor influence.
I know it's a bit of a complex question/situation; I honestly tried my best to explain it.
I'm using:
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
to draw both the partly transparent primitive into the FBO texture and then the FBO texture to the default framebuffer.
This is what the FBO texture drawn into the default OpenGL framebuffer looks like:
glClearColor is set to red.
blending a solid & transparent color should result in a solid color, shouldn't it?
Only if your blend mode tells it to. And it doesn't:
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
This applies the same blending operation to all four components of the colors. As a result, the final alpha is the source alpha multiplied by itself, plus the destination alpha multiplied by (1 - source alpha).
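For example, writing a fragment with source alpha 0.5 on top of a destination pixel whose alpha is already 1.0 gives 0.5 * 0.5 + 1.0 * (1 - 0.5) = 0.75, so the FBO texture becomes 25% transparent there. With source alpha 1.0 the result stays 1.0, and with source alpha 0.0 the destination alpha is left untouched, which is exactly why the clear color bleeds through most strongly around 0.5 alpha and not at the extremes.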
Now, you could use separate blend functions for the color and the alpha:
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ZERO, GL_ONE);
The last two parameters specify the blending for just the alpha, while the first two specify the blending for the RGB components, so this preserves the destination alpha. glBlendFuncSeparate has been core since OpenGL 1.4 and is available as an extension (EXT_blend_func_separate) on older hardware.
But it seems to me that you probably don't want to change the alpha at all. So simply mask alpha writes when rendering to the texture:
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE);
Don't forget to undo the alpha mask when you want to write to it again.
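A minimal sketch of that approach, where fbo, drawTransparentPrimitive() and drawFboTextureToScreen() are placeholders for your own FBO handle and draw calls:
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE);   // block alpha writes so the FBO texture stays opaque
drawTransparentPrimitive();                          // your draw call
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);     // restore alpha writes
glBindFramebuffer(GL_FRAMEBUFFER, 0);                // back to the default framebuffer
drawFboTextureToScreen();                            // your draw call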

OpenGL blending: texture on top overlaps its pixels which should be transparent (alpha = 0)

I am drawing a map texture and, on top of it, a colorbar texture. Both have an alpha channel, and I am using blending, set up as
// Turn on blending
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
However, the following happens:
The texture on top (the colorbar) imposes its black pixels where I don't want it to; the map texture should show through where the colorbar alpha = 0.
Is this related to the blending settings? How should I change them?
Assuming the texture has an alpha channel and it's transparent in the right places, I suspect the issue is with the rendering order and depth testing.
Let's say you render the scale texture first. It blends correctly with the black background. Then you render the orange texture behind it, but the scale texture's pixels have already written nearer depth values, so the depth test discards the orange texture's fragments there.
So, make sure you render all your transparent stuff in back to front order, or farthest to nearest.
Without getting into order independent transparency, a common approach to alpha transparency is as follows:
Enable the depth buffer
Render all your opaque geometry
Disable depth writes (glDepthMask)
Enable alpha blending (as in the OP's code)
Render your transparent geometry in farthest to nearest order
For particles you can sometimes get away without sorting and it'll still look OK. Another approach is using the alpha test or using alpha to coverage with a multisample framebuffer.
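A minimal sketch of that sequence, with drawOpaqueGeometry() and drawTransparentGeometryFarToNear() standing in for your own draw calls:
glEnable(GL_DEPTH_TEST);
glDepthMask(GL_TRUE);
drawOpaqueGeometry();                      // opaque pass writes depth normally
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthMask(GL_FALSE);                     // keep depth testing, but stop depth writes
drawTransparentGeometryFarToNear();        // e.g. the map first, then the colorbar
glDepthMask(GL_TRUE);                      // restore depth writes for the next frame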

UV Texture oddity in textureUnits, openscenegraph

I am having issues with textures. I have the model open as a .osg, so I will refer to it as such here. I have one texture in textureUnit 0 which acts as a base texture. Then I have a second texture in textureUnit 1 which acts as a label of sorts: I apply an RGBA texture there, which should then be transparent on the model in OpenSceneGraph. However I get this:
The gray areas are the base texture. The darker areas are where the UV coordinates move off the edge of the texture itself. I can't seem to be able to remove the dark areas. Any ideas?
You probably need to set the edge clamping mode; the dark is probably some of the texture border color creeping in. Try setting GL_CLAMP_TO_EDGE as your texture wrap mode.
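In OpenSceneGraph terms that is roughly the following sketch, where labelTex is a placeholder for the osg::Texture2D attached to texture unit 1:
#include <osg/Texture2D>
// Clamp UVs at the edge texels so coordinates outside [0,1] repeat the edge
// instead of pulling in the border color.
void clampLabelTexture(osg::Texture2D* labelTex)
{
    labelTex->setWrap(osg::Texture::WRAP_S, osg::Texture::CLAMP_TO_EDGE);
    labelTex->setWrap(osg::Texture::WRAP_T, osg::Texture::CLAMP_TO_EDGE);
}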

Manual GL_REPEAT in GLSL

Currently I have a 2048 x 2048 pixel texture atlas with three 512 x 512 textures stored in it, and I am only applying one of them to the object. So I used the following code to map the texture coordinates (ranging from 0 to 1) to the correct region of the atlas for that texture:
// offset to the tile starting at (0, 1024) in the atlas, repeat 40 times via mod(), scale to the 512-pixel tile
color = texture2D(tex_0, vec2(0.0, 1024.0/2048.0) + mod(texture_coordinate*vec2(40.0), vec2(1.0))*vec2(512.0/2048.0));
The problem is that when I apply this, there is a black border around the texture. I presume this is because OpenGL cannot filter correctly across the pixels on either side of that border.
So how do I get rid of the border?
Edit*
I have already tried to move the starting and ending boundaries in toward the center of the texture and that didn't work.
Edit*
I found the source of the problem: the automatic mipmap generation is blending the textures in the texture atlas together. This means I have to write my own mipmapping function (as far as I can tell).
If anyone has any better ideas, please do post.
Instead of using a normal 2D texture as the texture atlas with a grid of textures, I used GL_TEXTURE_2D_ARRAY to create an array texture, which mipmapped correctly and repeated correctly. That way the textures did not blend together at higher mipmap levels.
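A minimal sketch of that setup, assuming three 512 x 512 RGBA tiles whose pixel data is already in memory (layerPixels is a placeholder for that data):
// One layer per tile, so GL_REPEAT and mipmapping work per tile.
extern const unsigned char* layerPixels[3];  // assumed: RGBA8 data for each 512x512 tile
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D_ARRAY, tex);
glTexImage3D(GL_TEXTURE_2D_ARRAY, 0, GL_RGBA8, 512, 512, 3, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);                  // 3 layers, no data yet
for (int i = 0; i < 3; ++i)                                     // upload each tile into its own layer
    glTexSubImage3D(GL_TEXTURE_2D_ARRAY, 0, 0, 0, i, 512, 512, 1,
                    GL_RGBA, GL_UNSIGNED_BYTE, layerPixels[i]);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glGenerateMipmap(GL_TEXTURE_2D_ARRAY);                          // mipmaps never cross layers
// In the shader, sample with a sampler2DArray, e.g.
// texture(tex_0, vec3(texture_coordinate * 40.0, layer));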

How to visualize a depth texture in OpenGL?

I'm working on a shadow mapping algorithm, and I'd like to debug the depth map that it's generating on its first pass. However, depth textures don't seem to render properly to the viewport. Is there any easy way to display a depth texture as a greyscale image, preferably without using a shader?
You may need to change the depth texture's parameters to display it as greyscale levels:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_NONE);
glTexParameteri(GL_TEXTURE_2D, GL_DEPTH_TEXTURE_MODE, GL_LUMINANCE);
You can then use the texture as a 'normal' greyscale 2D texture, either via the fixed-function pipeline or through a 'sampler2D' shader uniform.
Depth textures (2D) can be used just like any regular greyscale texture. The only problem might be that the values inside it are all too high and you only see a white texture. If that's the case, play around with the z-near and z-far planes used when creating the depth texture (or scale the values with a shader, or maybe glTexEnv).
Sure, just bind your depth texture to your favourite texture unit, enable texturing, and draw a 2D quad! You could also size the quad to only fill part of the screen so that you can view the shadowmap in realtime.
OpenGL also has functions which can copy the texture into an array for you. You could save this as an image and use an image viewer to view it.
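A minimal read-back sketch using glGetTexImage, where depthTex, width and height are placeholders for your depth texture handle and its dimensions:
// Copy the depth values to the CPU; they come back in [0, 1].
std::vector<float> depths(width * height);   // needs <vector>
glBindTexture(GL_TEXTURE_2D, depthTex);
glGetTexImage(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, GL_FLOAT, depths.data());
// Scale each value to 0-255 and write it out with any image writer to inspect it.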