Texture Transparency in OpenGL

I am making a texture in my environment that excludes all white pixels. I read in a PPM file, and the fourth (alpha) value is set to 0 whenever the pixel is white. Everything else seems to be in order: the view is set up correctly and so forth. With my current code the texture image is visible, but the image as a whole is not fully opaque; it is highly see-through. Is this a problem with how I am setting up GL_BLEND? Why is the entire texture not opaque, when only the white pixels should be excluded?
The first three values are read in as RGB values; the fourth value is not in the file, but is chosen based on the sum of the three RGB values. This texture is not loaded every time I render; it is in a display list, so the load is done only once.
glPushMatrix();

// Read the PPM header (magic number, dimensions, max value)
FILE *inFile3;
char dump3[3];
int max3, k3 = 0;
inFile3 = fopen("tree.ppm", "r");
int x3;
int y3;
fscanf(inFile3, "%s", dump3);
fscanf(inFile3, "%d %d", &x3, &y3);
fscanf(inFile3, "%d", &max3);

// Four bytes per pixel: RGB from the file plus a derived alpha
int arraySize3 = y3 * (4 * x3);
int pixel3, rgb = 0;
GLubyte data3[arraySize3];
for (int i = 0; i < x3; i++) {
    for (int j = 0; j < y3; j++) {
        fscanf(inFile3, "%d", &pixel3);
        data3[k3++] = pixel3;
        rgb += pixel3;
        fscanf(inFile3, "%d", &pixel3);
        data3[k3++] = pixel3;
        rgb += pixel3;
        fscanf(inFile3, "%d", &pixel3);
        data3[k3++] = pixel3;
        rgb += pixel3;
        // Near-white pixels (R+G+B > 760) become fully transparent
        data3[k3++] = (rgb > 760) ? 0 : 255;
        rgb = 0;
    }
}
fclose(inFile3);

glGenTextures(1, &texture3);
glBindTexture(GL_TEXTURE_2D, texture3);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_NEAREST);
gluBuild2DMipmaps(GL_TEXTURE_2D, 4, x3, y3, GL_RGBA, GL_UNSIGNED_BYTE, data3);

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, texture3);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

glRotatef(180, 0, 0, 0);
glTranslatef(0, -19, 0);
glBegin(GL_QUADS);
glTexCoord2d(0,0); glVertex3f(30,0,10);
glTexCoord2d(0,1); glVertex3f(30,20,10);
glTexCoord2d(1,1); glVertex3f(30,20,-10);
glTexCoord2d(1,0); glVertex3f(30,0,-10);
glEnd();

glDisable(GL_TEXTURE_2D);
glDisable(GL_BLEND);
glPopMatrix();
Screen shots: (images not included)

If you move the viewpoint really close to the tree, does it become opaque? Or does it if you disable mipmapping?
Edit:
When you move the eye closer to the tree, the level 0 mipmap (your original image) is used. To select a mipmap level, OpenGL computes which level gives the best match between the size of its texels and the size of a pixel on screen.
To disable mipmapping generation, you must use glTexImage2D to upload your texture instead of gluBuild2DMipmaps.
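For example, the gluBuild2DMipmaps call could be replaced with something like this (a sketch using the variables from the question; note that, unlike gluBuild2DMipmaps, glTexImage2D does not rescale the image, so non-power-of-two sizes may be rejected by older GL versions):
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, x3, y3, 0, GL_RGBA, GL_UNSIGNED_BYTE, data3);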
To disable usage of mipmaps, change the following line
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_NEAREST);
to
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
You could also do as Jim Buck suggests and use
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 0);
to force the highest mipmap level used down to 0. Since the lowest level defaults to 0, this effectively disables mipmapping. By setting GL_TEXTURE_MAX_LEVEL and GL_TEXTURE_BASE_LEVEL to the same value, you can also inspect the content of that specific mipmap level.
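For example, to view only mipmap level 1 (a sketch, assuming the texture is bound):
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 1);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL, 1);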
All this is to confirm if it's a problem with the mipmaps.

In addition to what @bernie pointed out about the GLubyte array and including glBegin/glEnd, the problem was in:
glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE );
GL_MODULATE needed to be replaced with GL_REPLACE. This fixed the issue. Thanks for your help.
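For reference, the fix is this one line; an alternative sketch would be to keep GL_MODULATE and set the current color to opaque white (glColor4f(1, 1, 1, 1)) before drawing, since GL_MODULATE multiplies the texel color and alpha by the current color:
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);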

Related

OpenGL texture not rendered properly on model

In my OpenGL application, the texture is not rendered correctly on the model.
Here is a screenshot of the result, and here is what the bunny should look like (expected result; images not included).
Here is the code to load the texture.
stbi_set_flip_vertically_on_load(1);
m_LocalBuffer = stbi_load(path.c_str(), &m_Width, &m_Height, &m_BPP, 0);
GLCall(glGenTextures(1, &m_RendererID));
GLCall(glBindTexture(GL_TEXTURE_2D, m_RendererID));
GLCall(glGenerateMipmap(GL_TEXTURE_2D));
GLenum format = GL_RGBA;
//..switching on m_BPP to set format, omitted here
GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR));
GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR));
GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE));
GLCall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE));
GLCall(glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, m_Width, m_Height, 0, format, GL_UNSIGNED_BYTE, m_LocalBuffer));
GLCall(glBindTexture(GL_TEXTURE_2D, 0));
if (m_LocalBuffer) {
    stbi_image_free(m_LocalBuffer);
}
Here is the texture file I'm using
Texture File
I downloaded the asset from https://blenderartists.org/t/uv-unwrapped-stanford-bunny-happy-spring-equinox/1101297 (the 3.3Mb link)
Here is the code where I read in the texCoords
for (size_t i = 0; i < mesh->mNumVertices; i++) {
    //..read in positions and normals
    if (mesh->mTextureCoords[0]) {
        vertex.TexCoords.x = mesh->mTextureCoords[0][i].x;
        vertex.TexCoords.y = mesh->mTextureCoords[0][i].y;
    }
}
I'm loading the model as an OBJ file using Assimp. I just read the texture coords from the result and pass them to the shader. (GLCall is just a debug macro I have in the renderer.)
What could potentially be the cause for this? Let me know if more info is needed. Thanks a lot!
The image seems to be flipped vertically (around the x-axis). To compensate for that, you have to flip the image manually after loading it; or, if you are already flipping it, you have to omit that flip. Whether the image has to be flipped or not depends on the image format.
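For example (a sketch; which of the two you need depends on the image and on the UV convention of the model, so try one at a time):
// Option 1: don't flip the image at load time
stbi_set_flip_vertically_on_load(0);
// Option 2: leave the loader flip in place and mirror the V coordinate when reading the mesh
vertex.TexCoords.y = 1.0f - mesh->mTextureCoords[0][i].y;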

C++ OpenGL Texture not loading

void OGLRectangle::LoadTexture(const char* filename)
{
    unsigned int texture;
    int width, height;
    BYTE * data;
    FILE * file;

    file = fopen(filename, "rb");
    width = 1920;
    height = 1080;
    data = new BYTE[height * width * 3];
    fread(data, width * height * 3, 1, file);
    fclose(file);

    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    tex = texture;

    glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);

    glTexImage2D(GL_TEXTURE_2D, 0, 2, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, data);
    delete [] data;
}
I have this code to render in an image, the method is called with:
LoadTexture("C:\\Users\\Rhys\\Documents\\Hills.bmp");
The file exists.
Then I'm trying to render it to the OpenGL window using:
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tex);
glBegin(GL_QUADS);
glTexCoord2d(0.0, 0.0); glVertex2d(0.0, 0.0);
glTexCoord2d(1.0, 0.0); glVertex2d(100.0, 0.0);
glTexCoord2d(1.0, 1.0); glVertex2d(100.0, 100.0);
glTexCoord2d(0.0, 1.0); glVertex2d(0.0, 100.0);
glEnd();
glDisable(GL_TEXTURE_2D);
However, all I'm getting on screen is a darkish blue box, with no texture rendered in it.
I have searched for tutorials on how to do this, and even asked my lecturer, but I still cannot figure out why it's not working.
Any help will be greatly appreciated.
Loading .bmp files needs to be handled a bit differently.
The following code simply loads a BMP file into memory (m_pcbData), without support for compression or indexed colors.
bool CBMPImage::LoadFromFile(const CString& FileName)
{
    BITMAPINFOHEADER BitmapInfo;
    ZeroMemory(&BitmapInfo, sizeof(BITMAPINFOHEADER));
    BITMAPFILEHEADER BitmapFile;
    ZeroMemory(&BitmapFile, sizeof(BITMAPFILEHEADER));

    std::ifstream FileStream(FileName, std::ios::binary | std::ios::in);
    if (!FileStream.good())
        return false;

    // Read bitmap file header
    FileStream.read(reinterpret_cast<char*>(&BitmapFile), sizeof(BITMAPFILEHEADER));
    // Read bitmap info header
    FileStream.read(reinterpret_cast<char*>(&BitmapInfo), sizeof(BITMAPINFOHEADER));

    // A proper bitmap file supports only 1 plane
    if (BitmapInfo.biPlanes != 1)
        return false;

    m_cbAlphaBits = 0;
    m_cbRedBits = 0;
    m_cbGreenBits = 0;
    m_cbBlueBits = 0;

    // Retrieve bits-per-pixel info
    m_cbBitsPerPel = (BMPbyte)BitmapInfo.biBitCount;

    // Width and height of the image
    m_nWidth = BitmapInfo.biWidth;
    m_nHeight = BitmapInfo.biHeight;

    // Compute bitmap data size (rows are padded to 4-byte boundaries)
    m_nSize = 4 * ((m_nWidth * m_cbBitsPerPel + 31) / 32) * m_nHeight;

    // Less important info
    m_nPixelWidthPerMeter = BitmapInfo.biXPelsPerMeter;
    m_nPixelHeightPerMeter = BitmapInfo.biYPelsPerMeter;

    // Palette info, not important in our case
    m_nClrCount = BitmapInfo.biClrUsed;
    m_nClrImportant = BitmapInfo.biClrImportant;

    // COMPRESSION MUST BE BI_RGB
    m_Compression = (BMPCompression)BitmapInfo.biCompression;

    delete [] m_pcbData;
    m_pcbData = NULL;

    // Allocate proper data size
    m_pcbData = new BMPbyte[m_nSize];

    // Read actual image data, starting at the offset stored in the file header
    FileStream.seekg(BitmapFile.bfOffBits);
    FileStream.read(reinterpret_cast<char*>(m_pcbData), m_nSize);
    FileStream.close();

    return true;
}
Then load the BMP texture data into OpenGL:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, Image.GetWidth(), Image.GetHeight(), 0, GL_BGR_EXT, GL_UNSIGNED_BYTE, (GLvoid*)Image.GetImageData());
GL_BGR_EXT is important because BMP stores pixel data in reverse (BGR) byte order.
Secondly, you must set the current (material) color to white, because you are using the texture environment mode GL_MODULATE, which multiplies the texel color by the current color.
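In code that is just one call before drawing the textured quad (a sketch); with white, GL_MODULATE leaves the texel colors unchanged:
glColor3f(1.0f, 1.0f, 1.0f);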
And as mentioned by @Reto Koradi, since a mipmap minification filter is selected, you must also generate mipmaps, using one of these calls:
glGenerateMipmap(GL_TEXTURE_2D);
(called after glTexImage2D), or the legacy
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
(set before glTexImage2D).
Also, since you are using non-power-of-two dimensions (width = 1920, height = 1080), the texture may not work on older hardware that only supports power-of-two textures.
You're setting the attribute to sample with mipmaps:
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST);
You should only set that if your texture actually has mipmaps. To generate mipmaps, you can call:
glGenerateMipmap(GL_TEXTURE_2D);
after the glTexImage2D() call. Or you can simply set the sampler attribute to not use mipmaps:
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
As has already been pointed out: If your image file is indeed a BMP, and not just a raw image file, your image loading code will also need work.
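For instance, a raw fread of a .bmp reads the file headers into the pixel buffer; at minimum the read should start at the pixel data offset stored in the file header. A rough sketch, assuming an uncompressed 24-bit BMP and ignoring row padding and the BGR byte order (the proper offset is bfOffBits, which is normally 54 for a plain BITMAPINFOHEADER file):
fseek(file, 54, SEEK_SET); // skip BITMAPFILEHEADER (14 bytes) + BITMAPINFOHEADER (40 bytes)
fread(data, width * height * 3, 1, file);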

DevIL image not rendering correctly

I am using OpenGL. I can load TGA files properly, but for some reason when I render JPG files, I do not see them correctly.
This is what the image is supposed to look like (image not included).
And this is what it actually looks like (image not included). Why is it stretched? Is it because of the coordinates?
Here is the code I am using for drawing.
void Renderer::DrawJpg(GLuint tex, int xi, int yq, int width, int height) const
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glBegin(GL_QUADS);
    glTexCoord2i(0, 0); glVertex2i(0 + xi, 0 + xi);
    glTexCoord2i(0, 1); glVertex2i(0 + xi, height + xi);
    glTexCoord2i(1, 1); glVertex2i(width + xi, height + xi);
    glTexCoord2i(1, 0); glVertex2i(width + xi, 0 + xi);
    glEnd();
}
This is how I am loading the image:
imagename = s;
ILboolean success;
ilInit();
ilGenImages(1, &id);
ilBindImage(id);
success = ilLoadImage((const ILstring)imagename.c_str());
if (success)
{
    /* Convert every colour component into an unsigned byte. If your image
       contains an alpha channel you can replace IL_RGB with IL_RGBA. */
    success = ilConvertImage(IL_RGB, IL_UNSIGNED_BYTE);
    if (!success)
    {
        printf("image conversion failed.");
    }
    glGenTextures(1, &id);
    glBindTexture(GL_TEXTURE_2D, id);
    width = ilGetInteger(IL_IMAGE_WIDTH);
    height = ilGetInteger(IL_IMAGE_HEIGHT);
    glTexImage2D(GL_TEXTURE_2D, 0, ilGetInteger(IL_IMAGE_BPP), ilGetInteger(IL_IMAGE_WIDTH),
                 ilGetInteger(IL_IMAGE_HEIGHT), 0, ilGetInteger(IL_IMAGE_FORMAT), GL_UNSIGNED_BYTE,
                 ilGetData());
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);  // linear filtering
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
I should probably mention that some images did get rendered properly. I thought the difference was width != height, but that is not the case; some images with width != height also load fine.
For other images I still get this problem.
You probably need to call
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
before uploading the texture data with glTexImage2D.
From the reference pages:
GL_UNPACK_ALIGNMENT
Specifies the alignment requirements for the start of each pixel row
in memory. The allowable values are 1 (byte-alignment), 2 (rows
aligned to even-numbered bytes), 4 (word-alignment), and 8 (rows start
on double-word boundaries).
The default value for the alignment is 4 and your image loading library probably returns pixel data with byte-aligned rows, which explains why some of your images look OK (when the width is a multiple of four).
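A minimal sketch of the call order, using the DevIL variables from the question:
glPixelStorei(GL_UNPACK_ALIGNMENT, 1); // rows in the source data are tightly packed
glTexImage2D(GL_TEXTURE_2D, 0, ilGetInteger(IL_IMAGE_BPP), width, height, 0,
             ilGetInteger(IL_IMAGE_FORMAT), GL_UNSIGNED_BYTE, ilGetData());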
Always try to use image widths and heights that are powers of two, because some GPUs only support power-of-two texture resolutions (for example 128x128 or 512x512, but not 123x533 or 128x532).
Also, instead of GL_REPEAT I think you should use GL_CLAMP_TO_EDGE here. GL_REPEAT is meant for texture coordinates greater than 1.0; GL_CLAMP_TO_EDGE handles that case too, but guarantees the image fills the polygon without unwanted lines at the edges (it stops linear filtering from wrapping across the texture border).
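In code, that change is just (a sketch):
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);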
Also remember to try the version of the code that uses float texture coordinates (the sample from the comment).
There is a good explanation here: http://open.gl/textures

Texture formats in glTexImage3D

I am using glTexImage3D with GL_TEXTURE_3D and GL_TEXTURE_2D_ARRAY as targets.
I am creating 4 layers of colors and applying them to a sphere, so I expect the 4 layers to be applied to the sphere equally.
But with GL_TEXTURE_3D it repeats all the layers 2 times, whereas with GL_TEXTURE_2D_ARRAY it applies the layers only once, as expected.
int w = 4, h = 4, d = 4;
size_t size = w * h * d;
*format = GL_RGBA;
GLubyte *dataRGBA = new GLubyte[4 * size];

// First quarter of the texels: red
for (int i = 0; i < size / 4; i++)
{
    dataRGBA[4*i]   = 200;
    dataRGBA[4*i+1] = 0;
    dataRGBA[4*i+2] = 0;
    dataRGBA[4*i+3] = 255;
}
// Second quarter: green
for (int i = size / 4; i < size / 2; i++)
{
    dataRGBA[4*i]   = 0;
    dataRGBA[4*i+1] = 255;
    dataRGBA[4*i+2] = 0;
    dataRGBA[4*i+3] = 255;
}
// Third quarter: blue
for (int i = size / 2; i < (3 * size) / 4; i++)
{
    dataRGBA[4*i]   = 0;
    dataRGBA[4*i+1] = 0;
    dataRGBA[4*i+2] = 255;
    dataRGBA[4*i+3] = 255;
}
// Last quarter: magenta
for (int i = (3 * size) / 4; i < size; i++)
{
    dataRGBA[4*i]   = 255;
    dataRGBA[4*i+1] = 0;
    dataRGBA[4*i+2] = 255;
    dataRGBA[4*i+3] = 255;
}

glGenTextures(1, &id);
glBindTexture(*target11, id);
glTexParameteri(*target11, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
// when this texture needs to be magnified to fit on a big polygon, use the nearest texel to determine the color
glTexParameteri(*target11, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
// we want the texture to repeat over the S axis, so if we specify coordinates out of range we still get textured
glTexParameteri(*target11, GL_TEXTURE_WRAP_S, GL_REPEAT);
// same as above for the T axis
glTexParameteri(*target11, GL_TEXTURE_WRAP_T, GL_REPEAT);
// same as above for the R axis
glTexParameteri(*target11, GL_TEXTURE_WRAP_R, GL_REPEAT);

glTexImage3D(*target11, 0, *format, w, h, d, 0, GL_RGBA, GL_UNSIGNED_BYTE, dataRGBA);
I believe the issue is that a 2D texture array uses the (unnormalized) layer index as its third texture coordinate, while a 3D texture normalizes the R axis to [0, 1], just like the S and T axes.
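As an illustration with the 4-layer setup from the question (a sketch): with GL_TEXTURE_3D the R coordinate is normalized, so the center of layer k is sampled at (k + 0.5) / d, whereas a 2D array texture is addressed by the whole-number layer index (e.g. via sampler2DArray in a shader).
// GL_TEXTURE_3D: R is in [0, 1]; the center of layer 2 (of 4) is at r = (2 + 0.5) / 4 = 0.625
glTexCoord3f(0.5f, 0.5f, (2 + 0.5f) / 4.0f);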

Draw a texture in OpenGL while ignoring its alpha channel

I have a texture loaded into memory that is of RGBA format with various alpha values.
The image is loaded as so:
GLuint texture = 0;
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
self.texNum = texture;
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, self.imageWidth, self.imageHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, [self.imageData bytes]);
I want to know how I can draw this texture so that the alpha channel in the image is treated as all 1's and the texture is drawn like an RGB image.
Consider the base image:
This image is a progression from 0 to 255 alpha and has the RGB value of 255,0,0 throughout
However if I draw it with blending disabled I get an image that looks like:
www.ldeo.columbia.edu/~jcoplan/alpha/no_alpha.png
When what I really want is an image that looks like this:
www.ldeo.columbia.edu/~jcoplan/alpha/correct.png
I'd really appreciate some pointers to have it ignore the alpha channel completely. Note that I can't just load the image in as an RGB initially because I do need the alpha channel at other points.
Edit: I tried to use GL_COMBINE to solve my problem as such:
glColor4f(1, 1, 1, 1);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_REPLACE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA, GL_REPLACE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_ALPHA, GL_PRIMARY_COLOR);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_ALPHA, GL_SRC_ALPHA);
[self drawTexture];
But still no luck; it still draws a black-to-red gradient.
"I have a texture loaded into memory that is of RGBA format with various alpha values"
Then simply glDisable(GL_BLEND).
"However if I draw it with blending disabled I get an image that looks like: www.ldeo.columbia.edu/~jcoplan/alpha/no_alpha.png"
This happens because in your source image all the transparent pixels are black. It's a problem with your texture/image, or maybe with the loader function, but it is not an OpenGL problem.
You could probably try to fix it using glTexEnv(GL_COMBINE ...), i.e. mixing the texture color with the underlying color based on the alpha channel, but since I haven't done something like that, I'm not completely sure and can't give you the exact operands. It was possible in Direct3D 9 (using D3DTOP_MODULATEALPHA_ADDCOLOR), so most likely there is a way to do it in OpenGL.
You should not disable blending but use the glBlendFunc with proper parameters:
glBlendFunc(GL_ONE, GL_ZERO);
Or you could tell OpenGL to drop the alpha channel at upload time by passing GL_RGB as the internal format to glTexImage2D while keeping GL_RGBA as the pixel data format; the texture then stores only the RGB channels, and sampling returns an alpha of 1.0. (GL_UNPACK_ALIGNMENT only controls row padding and cannot skip the fourth byte of each pixel.)
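A sketch of that call, using the variables from the question:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, self.imageWidth, self.imageHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, [self.imageData bytes]);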
I had a similar problem, and found out that it was because iOS image loading was doing a premultiply on the RGB values (as discussed in some of the other answers and comments here). I'd love to know whether there's a way of disabling pre-multiplication, but in the meantime I'm "un-pre-multiplying" using code derived from this thread and this thread.
// un-pre-multiply
uint8_t *imageBytes = (uint8_t *)imageData;
int byteCount = width * height * 4;
for (int i = 0; i < byteCount; i += 4) {
    uint8_t a = imageBytes[i+3];
    if (a != 255 && a != 0) {
        float alphaFactor = 255.0 / a;
        imageBytes[i]   *= alphaFactor;
        imageBytes[i+1] *= alphaFactor;
        imageBytes[i+2] *= alphaFactor;
    }
}