Draw Completely White Texture in OpenGL - c++

What I want to do is draw a texture as normal in OpenGL, but completely white. I tried glColor3f(2.f, 2.f, 2.f), but that doesn't work. I just want to draw the shape of a certain texture without its colors, filled with plain white.
To clarify the desired result: I want the RGB part of all colors sampled from the texture to be white, while the alpha value is the one sampled from the texture. So if the value in the texture is (R, G, B, A), I want the sampled color to be (1.0f, 1.0f, 1.0f, A).

To turn the color from the texture white, but still use the alpha value sampled from the texture, you can use:
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_ADD);
glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
With the GL_ADD texture env mode, the incoming fragment color (which the second call above sets to white) is added to the color sampled from the texture to obtain the output color. Because output colors are clamped to the range [0.0, 1.0], the color components from the texture do not matter: they are added to 1.0 and the result is clamped back to 1.0, so the RGB part of the output is always white.
The less obvious part is what happens to the alpha value. According to the spec, the alpha value from the incoming fragment and the alpha value sampled from the texture are multiplied for the GL_ADD texture env mode. With the fragment alpha set to 1.0, this means that the resulting value is the alpha value from the texture, which is what you wanted.
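For completeness, here is a minimal fixed-function sketch of the whole draw. The texture name, quad coordinates and the blending setup are illustrative assumptions, not part of the original answer:
// Sketch: draw the texture as a white silhouette, keeping its alpha.
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tex);                  // 'tex' is an existing RGBA texture
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_ADD);
glColor4f(1.0f, 1.0f, 1.0f, 1.0f);                  // RGB saturates to white, alpha stays 1.0
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);  // let the texture's alpha control visibility
glBegin(GL_QUADS);
glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
glEnd();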

Use a pixel shader and set each fragment's color to white.
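A minimal sketch of such a fragment shader (GLSL 1.20 style, embedded as a C string; the sampler name is an illustrative assumption):
// Keep the sampled alpha, force the RGB part to white.
const char* whiteFragSrc =
    "uniform sampler2D tex;\n"
    "void main()\n"
    "{\n"
    "    float a = texture2D(tex, gl_TexCoord[0].st).a;\n"
    "    gl_FragColor = vec4(1.0, 1.0, 1.0, a);\n"
    "}\n";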

Create a 1x1 white RGBA bitmap in code.
GLubyte texData[] = { 255, 255, 255, 255 };
Copy data to texture.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1, 1, 0, GL_RGBA, GL_UNSIGNED_BYTE, texData);
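Put together, a sketch of the whole setup might look like the following (names are illustrative). Note the min filter: without it, a single-level texture is not mipmap-complete under the default GL_NEAREST_MIPMAP_LINEAR filter.
// Create a 1x1 solid-white texture and bind it in place of the real texture.
GLuint whiteTex = 0;
GLubyte texData[] = { 255, 255, 255, 255 };
glGenTextures(1, &whiteTex);
glBindTexture(GL_TEXTURE_2D, whiteTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); // avoid mipmap incompleteness
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1, 1, 0, GL_RGBA, GL_UNSIGNED_BYTE, texData);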

Related

openGL Transparent pixels unexpectedly White

I noticed a big problem in my openGL texture rendering:
Supposedly transparent pixels are rendered as solid white. According to most solutions to similar issues discussed on Stack Overflow, I need to enable blending and set the proper blend function, but I have already set the necessary GL state and, as far as I can tell, the textures are loaded correctly. My texture load function is below:
GLboolean GL_texture_load(Texture* texture_id, const char* const path, const GLboolean alpha, const GLint param_edge_x, const GLint param_edge_y)
{
    // load image
    SDL_Surface* img = nullptr;
    if (!(img = IMG_Load(path))) {
        fprintf(stderr, "SDL_image could not be loaded %s, SDL_image Error: %s\n",
                path, IMG_GetError());
        return GL_FALSE;
    }
    glBindTexture(GL_TEXTURE_2D, *texture_id);
    // image assignment
    GLuint format = (alpha) ? GL_RGBA : GL_RGB;
    glTexImage2D(GL_TEXTURE_2D, 0, format, img->w, img->h, 0, format, GL_UNSIGNED_BYTE, img->pixels);
    // wrapping behavior
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, param_edge_x);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, param_edge_y);
    // texture filtering
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glBindTexture(GL_TEXTURE_2D, 0);
    // free the surface
    SDL_FreeSurface(img);
    return GL_TRUE;
}
I use Adobe Photoshop to export "for the web" 24-bit + transparency .png files (72 pixels/inch, 6400 x 720). I am not sure how to set the color mode (8, 16, 32), but this might have something to do with the issue. I also use the default sRGB color profile, but at one point I tried removing the color profile; this didn't do anything.
No matter what, a .png exported from Photoshop displays solid white where the pixels should be transparent.
If I create an image in e.g. Gimp, I have correct transparency. Importing the Adobe .psd or .png does not seem to work, and in any case I prefer to use Photoshop for editing purposes.
Has anyone experienced this issue? I imagine that Photoshop must be adding some strange metadata, or I am not using the correct color modes, or both.
(I am concerned that this goes beyond the scope of Stack Overflow, but my issue intersects image editing and programming. Regardless, please let me know if this is not the right place.)
EDIT:
In both Photoshop and Gimp I created a test case: 8 pixels (red, green, transparent, blue) clockwise.
In Photoshop, the transparent square is read as 1, 1, 1, 0 and displays as white.
In Gimp, the transparent square is 0, 0, 0, 0.
I also checked my fragment shader to see whether transparency works at all. Varying the alpha over time does increase transparency, so the alpha isn't outright ignored. For some reason 1, 1, 1, 0 counts as solid.
In addition, setting the background color to black with glClearColor seems to prevent the alpha from increasing transparency.
I don't know how to explain some of these behaviors, but something seems off. 0 alpha should be the same regardless of color, shouldn't it?
(Note that I render a few shapes on top of each other, but I've tried just rendering one for testing purposes.)
The best I can do is post more of my setup code (with bits omitted):
// vertex array and buffers setup
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glViewport(0, 0, SCREEN_WIDTH, SCREEN_HEIGHT);
glEnable(GL_DEPTH_TEST);
glEnable(GL_BLEND);
// I think that the blend function may be wrong (GL_ONE that is).
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
glDepthRange(0, 1);
glDepthFunc(GL_LEQUAL);
Texture tex0;
// same function as above, but generates one texture id for me
if (GL_texture_gen_and_load_1(&tex0, "./textures/sq2.png", GL_TRUE, GL_CLAMP_TO_EDGE, GL_CLAMP_TO_EDGE) == GL_FALSE) {
    return EXIT_FAILURE;
}
glUseProgram(shader_2d);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, tex0);
glUniform1i(glGetUniformLocation(shader_2d, "tex0"), 0);
bool active = true;
while (active) {
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // uniforms, game logic, etc.
    glDrawElements(GL_TRIANGLES, tri_data.i_count, GL_UNSIGNED_INT, (void*)0);
}
"I don't know how to explain some of these behaviors, but something seems off. 0 alpha should be the same regardless of color, shouldn't it?"
If you want to get an identical result for an alpha channel of 0.0, independent of the red, green and blue channels, then you have to change the blend function. See glBlendFunc.
Use:
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
This causes the red, green and blue channels to be multiplied by the alpha channel.
If the alpha channel is 0.0, the resulting RGB color is (0, 0, 0).
If the alpha channel is 1.0, the RGB channels remain unchanged.
See also Alpha Compositing, OpenGL Blending and Premultiplied Alpha.
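To make the difference concrete, here is a small CPU-side sketch (not from the original answer) that evaluates both blend functions for Photoshop's "white but transparent" pixel (1, 1, 1, 0) drawn over a black back buffer:
#include <cstdio>

struct RGBA { float r, g, b, a; };

// result = src * srcFactor + dst * (1 - src.a), which covers both blend functions:
// GL_ONE uses srcFactor = 1.0, GL_SRC_ALPHA uses srcFactor = src.a.
static RGBA blend(RGBA src, RGBA dst, float srcFactor)
{
    return { src.r * srcFactor + dst.r * (1.0f - src.a),
             src.g * srcFactor + dst.g * (1.0f - src.a),
             src.b * srcFactor + dst.b * (1.0f - src.a),
             src.a * srcFactor + dst.a * (1.0f - src.a) };
}

int main()
{
    RGBA texel = { 1.0f, 1.0f, 1.0f, 0.0f };      // Photoshop's "transparent" pixel
    RGBA dest  = { 0.0f, 0.0f, 0.0f, 1.0f };      // cleared black back buffer
    RGBA one      = blend(texel, dest, 1.0f);     // GL_ONE       -> (1, 1, 1): solid white
    RGBA srcAlpha = blend(texel, dest, texel.a);  // GL_SRC_ALPHA -> (0, 0, 0): invisible
    std::printf("GL_ONE:       %.1f %.1f %.1f\n", one.r, one.g, one.b);
    std::printf("GL_SRC_ALPHA: %.1f %.1f %.1f\n", srcAlpha.r, srcAlpha.g, srcAlpha.b);
    return 0;
}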

Read back texture coordinates from rendered image in OpenGL?

If I render a scene in openGL, is it possible to get back the texture coordinates that were used to paint a given pixel?
For example, if I render a triangle that has 3 vertices (x,y,z) and 3 tex coords (u,v), and then I select a pixel on the triangle, I can get the color of the triangle and the depth using opengl calls, but is it possible to also get the interpolated texture coordinate?
Basically, I want to get the image point on the texture that was used to paint the triangle at a particular pixel.
I am guessing the only real way to do this is by reconstructing the ray that goes from the camera center through the pixel on the image plane, and then do a ray-triangle intersection to figure out which triangle it was, and then I can do a lookup into my texture array to get the texture coordinates of the triangle, and then do my own barycentric interpolation, but I would like to avoid having to do all that if possible.
Edit: The code I currently have didn't appear properly formatted in the bounty request below, so I've put it here. This is what I have right now; I would like to add reading the texture coordinates (u, v) to it, ideally without a shader program if possible.
// First initialize the FBO, I am interested in depth and color
// create a framebuffer object
glGenFramebuffers(1, &fboId);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fboId);
// Create texture to store color info
glGenTextures(1, &color);
glBindTexture(GL_TEXTURE_2D, color);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_TEXTURE_2D, color, 0);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, 0);
// Create render buffer to store depth info
glGenRenderbuffers(1, &depth);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, depth);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT, width, height);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, 0);
// Attach the renderbuffer to depth attachment point
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, depth);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
// Then later in the code, I use the actual buffer:
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fboId);
...
//draw model
...
//read color and depth values (want to also read texture coordinate values u and v here too)
...
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
If you are determined to do this without using shaders, you could render your scene without lighting and using a single texture for every object. This texture would be filled with two gradients. The red channel would go from 0 to 255 horizontally and the green channel would go from 0 to 255 vertically. Now you have effectively painted the scene using the texture coordinates (assuming they are in the range 0-1). You can use glReadPixels to read back the buffer (or part of the buffer) you have just rendered to and use the red channel to retrieve u and the green channel to retrieve v.
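A sketch of how such a gradient texture could be generated (the size and the function name are illustrative assumptions):
// Fills and uploads a 256x256 RGB texture where red encodes u and green encodes v.
// Assumes the target texture is already bound to GL_TEXTURE_2D.
void UploadUVGradientTexture()
{
    static GLubyte grad[256 * 256 * 3];
    for (int y = 0; y < 256; ++y) {
        for (int x = 0; x < 256; ++x) {
            GLubyte* p = &grad[(y * 256 + x) * 3];
            p[0] = (GLubyte)x;  // red   ~ u * 255
            p[1] = (GLubyte)y;  // green ~ v * 255
            p[2] = 0;           // blue unused
        }
    }
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0, GL_RGB, GL_UNSIGNED_BYTE, grad);
}
Keep in mind this only gives 8 bits of precision per coordinate; the shader-based approach below does not have that limitation.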
Render your scene to an FBO with a 2-channel floating-point color attachment (GL_RG32F or similar) and output the u/v coordinates to that attachment in the fragment shader.
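A minimal sketch of the fragment shader for that approach (GLSL, compatibility profile, illustrative): it simply writes the interpolated texture coordinates into the color attachment, which can then be read back with glReadPixels using GL_FLOAT.
// Assumes texture coordinates are supplied via the fixed-function pipeline (glTexCoord*).
const char* uvFragSrc =
    "void main()\n"
    "{\n"
    "    gl_FragColor = vec4(gl_TexCoord[0].st, 0.0, 1.0);\n"
    "}\n";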

Creating and blending a dynamic texture in OpenGL

I need to render a sphere to a texture (done using a Framebuffer Object (FBO)), and then alpha blend that texture with the back buffer. So far I'm not doing any processing with the texture except clearing it at the beginning of every frame.
I should say that my scene consists of nothing but a planet in empty space; the sphere should appear next to or around the planet (kind of like a moon for now). When I render the sphere directly to the back buffer, it displays correctly; but when I do the intermediate step of rendering it to a texture and then blending that texture with the back buffer, the sphere only shows up when it is in front of the planet. The part that isn't in front is just "cut off".
I render the sphere using glutSolidSphere to an RGBA8 fullscreen texture that's bound to an FBO, making sure that every sphere pixel receives an alpha value of 1.0. I then pass the texture to a fragment shader program, and use this code to render a fullscreen quad - with the texture mapped onto it - to the back buffer while alpha blending:
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_BLEND);
glDisable(GL_DEPTH_TEST);
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glLoadIdentity();
glMatrixMode(GL_PROJECTION);
glPushMatrix();
glLoadIdentity();
glBegin(GL_QUADS);
glTexCoord2i(0, 1);
glVertex3i(-1, 1, -1); // TOP LEFT
glTexCoord2i(0, 0);
glVertex3i(-1, -1, -1); // BOTTOM LEFT
glTexCoord2i(1, 0);
glVertex3i( 1, -1, -1); // BOTTOM RIGHT
glTexCoord2i(1, 1);
glVertex3i( 1, 1, -1); // TOP RIGHT
glEnd();
glPopMatrix();
glMatrixMode(GL_MODELVIEW);
glPopMatrix();
glEnable(GL_DEPTH_TEST);
glDisable(GL_BLEND);
This is the shader code (taken from an FX file written in Cg):
sampler2D BlitSamp = sampler_state
{
    MinFilter = LINEAR;
    MagFilter = LINEAR;
    MipFilter = LINEAR;
    AddressU = Clamp;
    AddressV = Clamp;
};
float4 blendPS(float2 texcoords : TEXCOORD0) : COLOR
{
    float4 outColor = tex2D(BlitSamp, texcoords);
    return outColor;
}
I don't even know whether this is a problem with the depth buffer or with alpha blending; I've tried a lot of combinations of enabling and disabling depth testing (with a depth buffer attached to the FBO) and alpha blending.
EDIT: I tried just rendering a blank fullscreen quad straight to the back buffer, and even that was cropped around the planet's edges. For some reason, enabling depth testing for rendering the quad (that is, removing the lines glDisable(GL_DEPTH_TEST) and glEnable(GL_DEPTH_TEST) in the code above) got rid of the problem, but now everything but the planet and the sphere appears white.
I made sure (and could confirm) that the alpha channel of the texture is 0 at every pixel but the sphere's, so I don't understand where the whiteness could be introduced. (I would also still be interested in an explanation of why enabling depth testing has this effect.)
I see two possible sources of error here:
1. Rendering to the FBO
If the missing pixels are not even present in the FBO after rendering, there must be some mechanism which discarded the corresponding fragments. The OpenGL pipeline includes four different types of fragment tests which can lead to fragments being discarded:
Scissor Test: Unlikely to be the cause, as the scissor test only affects a rectangular portion of the screen.
Alpha Test: Equally unlikely, as your fragments should all have the same alpha value.
Stencil Test: Also unlikely, unless you use stencil operations when drawing the background planet and copy over the stencil buffer from the back buffer to the FBO.
Depth Test: Same as for stencil test.
So there's a good chance that rendering into FBO is not the issue here. But just to be absolutely sure, you should read back your color attachment texture and dump it into a file for inspection. You can use the following function for that:
#include <fstream>
#include <vector>

void TextureToFile(GLuint texture, const char* filename) {
    glBindTexture(GL_TEXTURE_2D, texture);
    GLint width, height;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &width);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &height);
    std::vector<GLubyte> pixels(3 * width * height);
    glPixelStorei(GL_PACK_ALIGNMENT, 1); // GL_RGB rows are not 4-byte aligned for arbitrary widths
    glGetTexImage(GL_TEXTURE_2D, 0, GL_RGB, GL_UNSIGNED_BYTE, &pixels[0]);
    std::ofstream out(filename, std::ios::out | std::ios::binary);
    out << "P6\n"
        << width << '\n'
        << height << '\n'
        << 255 << '\n';
    out.write(reinterpret_cast<const char*>(&pixels[0]), pixels.size());
}
The resulting file is a portable pixmap (.ppm). Be sure to unbind the FBO before reading back the texture.
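For example (hypothetical names, matching the question's setup):
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);   // unbind the FBO before reading the texture back
TextureToFile(color, "color_attachment.ppm");  // dump the color attachment for inspection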
2. Texture mapping
Assuming rendering into the FBO works as expected, the only other source of error is blending the texture over the previously rendered scene. There are two scenarios:
a) Fragments get discarded
The possible reasons for fragments to get discarded are the same as in 1.:
Scissor Test: Nope, affects rectangular areas only.
Alpha Test: Probably not, the texels covering the sphere should all have the same alpha value.
Stencil Test: Might be the cause if you use stencil operations/stencil testing when drawing the background planet and the old stencil state is still active.
Depth Test: Might be the cause, but as you already disable it, it really shouldn't have any effect.
So you should make sure that all of these tests are disabled, especially the stencil test.
b) Wrong results from blending
Assuming all fragments reach the back buffer, blending is the only thing that could still cause the wrong result. With your blending function (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) and sphere fragments whose alpha is 1.0, the values already in the back buffer are irrelevant for blending, and we assume that the alpha values in the texture are correct. So I see no reason why blending should be the root cause here.
Conclusion
In conclusion, the only sensible cause for the observed result seems to be stencil testing. If it's not, I'm out of options :)
I solved it or at least came up with a work around.
First off, the whiteness stems from the fact that the clear color had been set with glClearColor(1.0f, 1.0f, 1.0f, 1000.0f), so every pixel the planet didn't cover simply kept the white clear color. I now copy the contents of the back buffer (which is the planet, the atmosphere, and the space around it) to the texture before rendering the sphere, and I render the atmosphere and space before that copy/blit operation, so they are included in it. Previously, everything but the planet itself was rendered after my quad, which - when using depth testing - apparently placed everything behind the quad, making it invisible.
The reference implementation of the effect I'm trying to achieve has always used this kind of blit operation in its code but I didn't think it was necessary for the effect. Now I feel like there might be no other way...
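For reference, the copy step described above could be done roughly like this (a sketch with illustrative names; glCopyTexSubImage2D reads from the currently bound read framebuffer):
// Copy the back buffer (planet, atmosphere, space) into the fullscreen texture
// before the sphere is rendered into it.
glBindFramebuffer(GL_FRAMEBUFFER, 0);          // make the default framebuffer current
glReadBuffer(GL_BACK);
glBindTexture(GL_TEXTURE_2D, fullscreenTex);   // the FBO's color attachment texture
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, screenWidth, screenHeight);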

Previous calls to GL.Color3 are making my texture use the wrong colors

Making a 2D OpenGL game. When rendering a frame I need to first draw some computed quad geometry and then draw some textured sprites. When the body of my render method only draws the sprites, everything works fine. However, when I draw my geometric quads before the sprites, the sprite's texture takes on the color of the last GL.Color3 call. How do I tell OpenGL (well, OpenTK) "OK, we are done drawing geometry and it's time to move on to sprites"?
Here is what the render code looks like:
// Let's do some geometry
GL.Begin(BeginMode.Quads);
GL.Color3(_dashboardBGColor); // commenting this out makes my sprites look right
int shakeBuffer = 100;
GL.Vertex2(0 - shakeBuffer, _view.DashboardHeightPixels);
GL.Vertex2(_view.WidthPixelsCount + shakeBuffer, _view.DashboardHeightPixels);
GL.Vertex2(_view.WidthPixelsCount + shakeBuffer, 0 - shakeBuffer);
GL.Vertex2(0 - shakeBuffer, 0 - shakeBuffer);
GL.End();
// lets do some sprites
GL.Begin(BeginMode.Quads);
GL.BindTexture(TextureTarget.Texture2D, _rockTextureId);
float baseX = 200;
float baseY = 200;
GL.TexCoord2(0, 0); GL.Vertex2(baseX, baseY);
GL.TexCoord2(1, 0); GL.Vertex2(baseX + _rockTextureWidth, baseY);
GL.TexCoord2(1, 1); GL.Vertex2(baseX + _rockTextureWidth, baseY - _rockTextureHeight);
GL.TexCoord2(0, 1); GL.Vertex2(baseX, baseY - _rockTextureHeight);
GL.End();
GL.Flush();
SwapBuffers();
The default texture environment mode is GL_MODULATE, which does exactly that: it multiplies the texture color with the vertex color.
An easy solution is to set the vertex color to (1, 1, 1, 1) before drawing a textured primitive:
glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
Another solution is to change the texture environment mode to GL_REPLACE, which makes the texture color replace the vertex color, so the issue does not occur:
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

OpenGL Texture transparency doesn't work

I have an OpenGL texture that is bound to a simple quad.
My problem is: my texture is a 128x128 pixel image. I'm only filling up about 100x60 pixels of that image; the other pixels are transparent. I saved it in a .png file. When I'm drawing, the transparent part of the bound texture is white.
Let's say I have a background. When I draw this new quad on the background, I can't see through the transparent part of my texture.
Any suggestions?
Code:
// Init code...
gl.glEnable(gl.GL_TEXTURE_2D);
gl.glDisable(gl.GL_DITHER);
gl.glDisable(gl.GL_LIGHTING);
gl.glDisable(gl.GL_DEPTH_TEST);
gl.glTexEnvi(gl.GL_TEXTURE_ENV, gl.GL_TEXTURE_ENV_MODE, gl.GL_MODULATE);
// Drawing code...
gl.glBegin(gl.GL_QUADS);
gl.glTexCoord2d(0.0, 0.0);
gl.glVertex3f(0.0f, 0.0f, 0.0f);
gl.glTexCoord2d(1.0, 0.0);
gl.glVertex3f(1.0f, 0.0f, 0.0f);
gl.glTexCoord2d(1.0, 1.0);
gl.glVertex3f(1.0f, 1.0f, 0.0f);
gl.glTexCoord2d(0.0, 1.0);
gl.glVertex3f(0.0f, 1.0f, 0.0f);
gl.glEnd();
I've tried almost everything, from enabling blending to changing to GL_REPLACE, but I can't get it to work.
Edit:
// Texture. Have tested both gl.GL_RGBA and gl.GL_RGB8.
gl.glTexImage2D(gl.GL_TEXTURE_2D, 0, (int)gl.GL_RGBA, imgWidth, imgHeight,
0, gl.GL_BGR_EXT, gl.GL_UNSIGNED_BYTE, bitmapdata.Scan0);
Check that your texture is of RGBA format, and enable blending and set the blending func:
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
And draw the texture. If your texture is not RGBA, then there is no alpha and blending won't do anything.
EDIT: Since you posted your code, I can spot a serious error:
glTexImage2D(gl.GL_TEXTURE_2D, 0, (int)gl.GL_RGBA, imgWidth, imgHeight, 0, gl.GL_BGR_EXT, gl.GL_UNSIGNED_BYTE, bitmapdata.Scan0);
You're telling GL that the texture has internal format RGBA, but the bitmap data is in BGR format, so there is no alpha in your texture data; GL then assumes alpha = 1.0.
To correct it, load your PNG with RGBA format and use GL_RGBA as internalFormat and format parameters for glTexImage2D.
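For instance, assuming the bitmap data really is in RGBA order after loading (a hypothetical call mirroring the question's code):
gl.glTexImage2D(gl.GL_TEXTURE_2D, 0, (int)gl.GL_RGBA, imgWidth, imgHeight,
                0, gl.GL_RGBA, gl.GL_UNSIGNED_BYTE, bitmapdata.Scan0);
If the bitmap actually stays in BGRA memory layout, keep GL_RGBA as the internal format but pass a BGRA format token (e.g. gl.GL_BGRA, if the binding exposes it) as the format parameter instead.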
"When I'm drawing, the transparent part of the bound texture is white."
That means your PNG parser converted transparent regions to white. If you want to render transparent layers with OpenGL, you don't typically depend on the texture file alone to provide the transparency, but instead use glBlendFunc(). More information here:
http://www.opengl.org/resources/faq/technical/transparency.htm
Also, should you render to a frame buffer and copy the result into a texture, check that the frame buffer has alpha turned on. For example, when using osgViewer this can be achieved by (do this before calling setUpViewInWindow):
osg::DisplaySettings *pSet = myviewer.getDisplaySettings();
if (pSet == NULL)
    pSet = new osg::DisplaySettings();
pSet->setMinimumNumAlphaBits(8);
myviewer.setDisplaySettings(pSet);
and under Qt it should work with (from http://forum.openscenegraph.org/viewtopic.php?t=6411):
QGLFormat f;
f.setAlpha( true ); //enables alpha channel for this format
QGLFormat::setDefaultFormat( f ); //set it as default before instantiations
setupUi(this); //instantiates QGLWidget (ViewerQT)
Normally it is better to render directly into a frame buffer, but I came across this while preparing some legacy code, and in the beginning it was very hard to find.