OpenGL shader ignore texture - c++

I have recently integrated Awesomium into an OpenGL application.
When I load the Awesomium surface into a texture, OpenGL includes it in its shading process regardless of whether I draw the texture onto a surface or not.
I am trying to trace down the line of code that is feeding the texture into the shaders. Is there a specific function OpenGL uses to access all textures, or a way to tell OpenGL to ignore the texture?
Update texture block:
glBindTexture(GL_TEXTURE_2D, SkypeHUD);
glTexImage2D(GL_TEXTURE_2D, 0, 4, AwesomiumW, AwesomiumH, 0, GL_BGRA, GL_UNSIGNED_BYTE, surface->buffer());
Create texture block:
glBindTexture(GL_TEXTURE_2D, SkypeHUD);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT );
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT );
glBindTexture(GL_TEXTURE_2D, 0);
Drawing the scene without the texture being loaded: http://puu.sh/2bVTV
Drawing the scene after I have loaded the texture: http://puu.sh/2bVUb
You can see it blending the Google texture in over the others.

Texture enable/disable should be controlled by the shader code, not some client-side binding state. Anyway, you most likely use several texture units (glActiveTexture); the texture binding is individual to each unit, so you'll have to do some legwork and unbind the texture from each unit if you want to go this route.
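For instance, a minimal sketch of that cleanup loop, assuming a shader-capable context and that only the GL_TEXTURE_2D target is in use:

// Unbind the 2D texture from every unit the application may have touched;
// the binding is per-unit, so a single glBindTexture(GL_TEXTURE_2D, 0) is not enough.
GLint maxUnits = 0;
glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, &maxUnits);
for (GLint unit = 0; unit < maxUnits; ++unit) {
    glActiveTexture(GL_TEXTURE0 + unit);
    glBindTexture(GL_TEXTURE_2D, 0);
}
glActiveTexture(GL_TEXTURE0); // restore the conventional active unit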

Related

GLSL Partially Overlapping Textures

I'm trying to blend two partially overlapping textures in GLSL and am wondering if I'm misunderstanding the concept of multi-texturing. Is it required that the textures fully overlap, or can you have two offset textures that blend only where they overlap?
I have two images similar to the following (minus grid lines and text):
Example image
Ideally, the overlapping sections of the image would blend together nicely so that the final result would look like one smooth image that combines the two together. Overlapping orange pixels, for example, would blend together or take the higher intensity.
I'm new to GLSL and have been using this article (GLSL Shader Article), which uses a fragment shader to blend the textures (fairly standard).
Following the article, I'm setting up each texture like so:
glUseProgramObjectARB( m_hProgramObject );
GLint nParamObj = glGetUniformLocationARB( m_hProgramObject, pParamName_i );
...
glActiveTexture(GL_TEXTURE0 + nTextureID_i );
glBindTexture(GL_TEXTURE_2D, nTextureID_i);
glUniform1iARB( nParamObj, nTextureID_i );
I then bind each texture and draw triangle strips. My textures are created as:
glBindTexture( GL_TEXTURE_2D, m_nTextureID );
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);
glPixelStorei(GL_UNPACK_SKIP_ROWS, 0);
glPixelStorei(GL_UNPACK_SKIP_PIXELS, 0);
glTexImage2D(GL_TEXTURE_2D, 0, 4, nWidth, nHeight, 0, GL_RGBA,
GL_UNSIGNED_BYTE, pbyData);
Does that process seem reasonable or am I misunderstanding the concept? Any tips or advice on how to achieve this?
That process certainly seems adequate. The advantage of using a fragment shader is that you get complete control over how the textures are combined. For the offset, you may want two sets of texture coordinates, one for each image, or you could generate them implicitly. Figuring out what you want and writing the fragment shader will probably be the difficult bit. Unfortunately, if you want to blend many different textures, a fragment shader used this way can get quite expensive, or simply won't work with too many textures bound.
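For instance, a minimal fragment shader sketch along those lines (the uniform names tex0/tex1, the second coordinate set, and the max-intensity rule are illustrative assumptions, not the article's code):

const char* blendFragSrc = R"(
    uniform sampler2D tex0; // image A, on texture unit 0
    uniform sampler2D tex1; // image B, on texture unit 1
    void main()
    {
        vec4 a = texture2D(tex0, gl_TexCoord[0].st); // first coord set
        vec4 b = texture2D(tex1, gl_TexCoord[1].st); // second, offset set
        // In the overlap, take the higher intensity; where one image has no
        // coverage its alpha is zero and it drops out of the result.
        gl_FragColor = max(a * a.a, b * b.a);
    }
)";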
Your example image doesn't look like any blending has occurred at all - the images are just positioned over each other. In this case, it's easier just to draw separate bits of geometry with mapped textures.
Blending is typically done by the fixed-pipeline blending stage, for example using the following calls:
glEnable(GL_BLEND);
glBlendFunc(src_scale, dest_scale);
One of the most common configurations is alpha blending with the "over" operator, glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), in which the amount blended is given by the alpha value of the colour you're drawing, possibly influenced by the A component of your GL_RGBA texture. You can further manipulate the blend equations if needed. See Blending.
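Putting that together, a minimal sketch (drawQuad is a hypothetical helper that binds one texture and draws a single textured quad; it is not part of OpenGL):

// Draw the base image first, then alpha-blend the second one over it.
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); // the "over" operator
drawQuad(baseTexture);    // fully covers its own area
drawQuad(overlayTexture); // weighted by its alpha where the two overlap
glDisable(GL_BLEND);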

texturing a glutSolidSphere

I need to add an earth texture to a glutSolidSphere. The problem is that I cannot figure out how to make the texture stretch over the entire sphere and still be able to rotate.
We're enabling the textures with:
glTexGeni(GL_S, GL_TEXTURE_GEN_MODE,GL_OBJECT_LINEAR);
glTexGeni(GL_T, GL_TEXTURE_GEN_MODE,GL_OBJECT_LINEAR);
glEnable(GL_TEXTURE_2D);
glEnable(GL_TEXTURE_GEN_S);
glEnable(GL_TEXTURE_GEN_T);
//drawcode...
Using GL_SPHERE_MAP as the parameter instead of GL_OBJECT_LINEAR makes the texture look right, but it does not rotate with the sphere.
The parameters I use for the texture are:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
I understand that GL_REPEAT tiles the texture, while using GL_CLAMP instead gives me the texture once on the object, but I cannot get it to stretch over the whole sphere.
Does anyone know how to properly texture a glutSolidSphere?
glutSolidSphere doesn't provide proper texture coordinates, and OpenGL's built-in texture coordinate generation allows only linear mappings from vertex position to texture coordinate, which essentially means you cannot use it to texture a sphere with a single flat, bounded 2D texture (for a mathematical explanation, look up the topology of manifolds and map theory).
So what can you do? There are a number of possible solutions:
Don't use glutSolidSphere, but some other geometry generator that does provide proper texture coordinates (though texturing a sphere with just a single bounded 2D texture is a difficult topic; there are several mappings, each with its own problems)
Use a texture with the same topology as a sphere, i.e. a cube map; then you can use GL_NORMAL_MAP for the texture gen mode:
glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_NORMAL_MAP);
glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, GL_NORMAL_MAP);
glTexGeni(GL_R, GL_TEXTURE_GEN_MODE, GL_NORMAL_MAP);
Look up tutorials about cube mapping. But in essence, a cube map consists of 6 texture faces arranged as a cube about the origin; texture coordinates are not points on the cube itself but a direction from the origin, and the addressed texel is the one where that direction ray intersects the cube.
Use a vertex shader, generating texture coordinates from the vertex positions. Since a vertex shader is freely programmable, the mapping isn't required to be linear. Of course you will run into the peculiarities of mapping a sphere with a bounded 2D texture again; a sketch of this option follows below.
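As a rough sketch of that last option, a legacy-profile vertex shader that derives longitude/latitude coordinates from the object-space position (the mapping is illustrative, and the seam where the longitude coordinate wraps from 1 back to 0 still needs handling):

const char* sphereVertSrc = R"(
    void main()
    {
        vec3 p = normalize(gl_Vertex.xyz);           // sphere centred at origin
        float u = atan(p.z, p.x) / 6.28318531 + 0.5; // longitude -> [0,1]
        float v = asin(p.y) / 3.14159265 + 0.5;      // latitude  -> [0,1]
        gl_TexCoord[0] = vec4(u, v, 0.0, 1.0);
        gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    }
)";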

OpenGL texture color issue

I'm loading a raw texture (with an alpha channel) and displaying it in OpenGL. Everything works and the texture is displayed, but the color is a little darker than the original. I have already tried turning off lighting, blending and dithering, but this doesn't help.
I'm using Mac OS X.
Sample image
http://postimage.org/image/2wi1x5jic/
Here's openGL texture loading source code:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, &bytes[0]);
EDIT:
That's very weird: I used the example from http://forums.tigsource.com/index.php?topic=9560.0
and got the same glitch... So the problem is not in my code; maybe driver options? Hm...
SOLUTION:
Thanks @datenwolf, the images were saved with an sRGB color profile. The problem was solved once I removed it and converted to RGB.
Maybe you have GL_MODULATE set as the texturing environment and the vertex colours are not white. Try setting the texture environment to GL_REPLACE:
glBindTexture(GL_TEXTURE_2D, your_texture);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
EDIT
Another problem may be a colour profile embedded in the image. An image viewer may use this colour profile to implement colour management, adjusting the colours for your monitor's colour profile. OpenGL as-it-is doesn't do colour management; there is an extension that makes framebuffers and textures sRGB, which is kind of the lowest common denominator of colour management. But then you'd still have to transfer your input images to sRGB colour space.
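As a rough sketch of that sRGB route (assuming EXT_texture_sRGB and ARB_framebuffer_sRGB, or an OpenGL 3.0 context, are available):

// Store the texture as sRGB so the GPU linearizes texels on fetch...
glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB8_ALPHA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, &bytes[0]);
// ...and convert back to sRGB when writing to the framebuffer.
glEnable(GL_FRAMEBUFFER_SRGB);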
I have a lengthy article in preparation that explains in depth how to do colour management with OpenGL, but it's far from complete.

How to copy depth buffer to a texture on the GPU?

I want to get the current depth buffer to a texture, to access it in a shader. For various reasons I can't do a separate depth pass, but would need to copy the already-rendered depth.
glReadPixels would involve the CPU and potentially kill performance, and as far as I know glBlitFramebuffer can't blit depth-to-color, only depth-to-depth.
How to do this on the GPU?
The modern way of doing this would be to use an FBO. Attach a color texture and a depth texture to it, render, then unbind the FBO and use both textures as inputs to a shader that renders to the default framebuffer.
All the details you need about FBOs can be found here.
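A minimal sketch of that setup, using the GL 3.0 / ARB_framebuffer_object entry points (colorTex and depthTex are placeholder names for textures already created with glTexImage2D, e.g. GL_RGBA8 and GL_DEPTH_COMPONENT24):

GLuint fbo;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D, depthTex, 0);
// Check completeness before rendering into the FBO.
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE) {
    // ... render the scene; depth lands directly in depthTex ...
}
glBindFramebuffer(GL_FRAMEBUFFER, 0); // back to the default framebuffer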
Copying the depth buffer to a texture is pretty simple. If you have created a new texture that you haven't called glTexImage* on, you can use glCopyTexImage2D. This will copy pixels from the framebuffer to the texture. To copy depth pixels, you use a GL_DEPTH_COMPONENT format. I'd suggest GL_DEPTH_COMPONENT24.
If you have previously created a texture with a depth component format (i.e. any time after the first frame), then you can copy directly into this image data with glCopyTexSubImage2D, as sketched below.
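A sketch of both copy paths (depthTex, width and height are placeholders; the currently bound read framebuffer must contain the depth you want):

glBindTexture(GL_TEXTURE_2D, depthTex);
// First frame only: allocate the texture and fill it from the framebuffer.
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, 0, 0,
                 width, height, 0);
// Every later frame: reuse the storage and just overwrite the texels.
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, width, height);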
It also seems as though you're having trouble accessing depth component textures in your shader, since you want to copy depth-to-color (which is not allowed). If you are, then that is a problem you should get fixed.
In any case, copying should be the method of last resort. You should use framebuffer objects whenever possible. Just render directly to your texture.
The best way would be to use FBOs, both for performance and for cleaner code.
If that doesn't interest you, take a look at this code. It is from the days when I was much younger (and didn't know FBOs existed)!
int shadowMapWidth = 512;
int shadowMapHeight = 512;
// Create the depth texture (GL_CLAMP is the legacy wrap mode;
// GL_CLAMP_TO_EDGE is the modern choice).
glGenTextures(1, &m_depthTexture);
glBindTexture(GL_TEXTURE_2D, m_depthTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
// Allocate depth storage; the null data pointer means "reserve only".
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, shadowMapWidth, shadowMapHeight, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, 0);
// After rendering: copy the framebuffer's depth into the bound texture.
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, shadowMapWidth, shadowMapHeight);

OpenGL: Loading a texture changes the current color

I noticed that loading a texture can change the current drawing color, depending on the texture's colors. For example, after executing
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, info.biWidth, info.biHeight, 0, GL_RGB, GL_UNSIGNED_BYTE, bitmap);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
all consecutive polygons drawn to the screen will have a color that depends on the loaded texture image.
Is that standard? I didn't find this behavior documented.
Yes, that's how it works. Remember that GL is a state machine: you left the texture bound (and probably enabled), so when drawing it used the first pixel (assuming you didn't provide any texture coordinates) to color the primitive.
To solve it, disable texturing whenever you don't want texturing; you can do that with:
glDisable(GL_TEXTURE_2D);
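For example (myTexture is a placeholder for whatever texture object you created earlier):

// Scope texturing to the draws that actually use it.
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, myTexture);
// ... draw textured geometry ...
glDisable(GL_TEXTURE_2D); // subsequent draws use the plain current color
// ... draw untextured geometry ...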