How do I make textures transparent in OpenGL? - c++

I've tried to research this on Google, but there don't appear to be any coherent, simple answers. Is this because it's not simple, or because I'm not using the correct keywords?
Nevertheless, this is the progress I've made so far.
Created 8 vertices to form 2 squares.
Created a texture with an alpha value of 200 (so about 20% transparent).
Assigned the same texture to each square, which shows correctly.
Noticed that when I use a texture with 255 alpha, it appears brighter.
The init is something like the following:
glClearColor(0.0, 0.0, 0.0, 0.0);
glShadeModel(GL_FLAT);
glEnable(GL_DEPTH_TEST);
glEnable(GL_CULL_FACE);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glGenTextures(1, textureIds);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
int i, j;
GLubyte pixel;
for (i = 0; i < TEXTURE_HEIGHT; i++)
{
    for (j = 0; j < TEXTURE_WIDTH; j++)
    {
        // 8x8 checkerboard: alternate between black and white texels
        pixel = ((((i & 0x8) == 0) ^ ((j & 0x8) == 0)) * 255);
        texture[i][j][0] = pixel;
        texture[i][j][1] = pixel;
        texture[i][j][2] = pixel;
        texture[i][j][3] = 200;  // constant alpha of 200
    }
}
glBindTexture(GL_TEXTURE_2D, textureIds[0]);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexImage2D(
GL_TEXTURE_2D, 0, GL_RGBA,
TEXTURE_WIDTH, TEXTURE_HEIGHT,
0, GL_RGBA, GL_UNSIGNED_BYTE, texture);
This is somewhat similar to the code snippet from page 417 in the book, OpenGL Programming Guide, and creates a check pattern.
And then, the display function contains...
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_TEXTURE_2D);
// Use model view so that rotation value is literal, not added.
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
// ... translation, etc ...
glBindTexture(GL_TEXTURE_2D, textureIds[0]);
glBegin(GL_QUADS);
glTexCoord2f(0.0, 0.0); glVertex3f(-1.0, +1.0, 0.0); // top left
glTexCoord2f(0.0, 1.0); glVertex3f(-1.0, -1.0, 0.0); // bottom left
glTexCoord2f(1.0, 1.0); glVertex3f(+1.0, -1.0, 0.0); // bottom right
glTexCoord2f(1.0, 0.0); glVertex3f(+1.0, +1.0, 0.0); // top right
glEnd();
// not necessary to repeat, just good practice
glBindTexture(GL_TEXTURE_2D, textureIds[0]);
glBegin(GL_QUADS);
glTexCoord2f(0.0, 0.0); glVertex3f(-0.5, +1.0, -1.0); // top left
glTexCoord2f(0.0, 1.0); glVertex3f(-0.5, -1.0, -1.0); // bottom left
glTexCoord2f(1.0, 1.0); glVertex3f(+1.5, -1.0, -1.0); // bottom right
glTexCoord2f(1.0, 0.0); glVertex3f(+1.5, +1.0, -1.0); // top right
glEnd();
glFlush();
glDisable(GL_TEXTURE_2D);
glPopMatrix();
SwapBuffers();
So, this renders a second square in the background; I can see it, but it looks like the squares are being blended with the background (I assume this because they are darker with an alpha of 200 than with 255) instead of with the texture behind them...
As you can see, no transparency... How can I fix this?

The other answer that used to be here (but was deleted) mentioned this: generally, for alpha blending to work correctly, you need to sort the objects from far to near in the camera's coordinate system.
This is why your polygons are blended with the background. You can confirm that this is indeed the problem by disabling the depth test. Without the depth test, all the fragments are drawn and you'll be able to see the alpha blending.
More on this on this page.
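For the two quads in the question, a minimal sketch of the usual approach looks like this: draw the quads from far to near with depth writes disabled for the transparent pass (drawFarQuad/drawNearQuad are placeholders for the glBegin/glEnd blocks above):
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthMask(GL_FALSE);                        // keep the depth test, but don't write depth for transparent quads
glBindTexture(GL_TEXTURE_2D, textureIds[0]);
drawFarQuad();                                // the quad at z = -1.0 first (farther from the camera)
drawNearQuad();                               // the quad at z = 0.0 second (nearer to the camera)
glDepthMask(GL_TRUE);                         // restore depth writes before drawing opaque geometry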

Related

OpenGL - texture not visible with blending enabled [closed]

When I try to display texture on object it works but only with GL_BLEND disabled. When I enable blending:
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
The texture just isn't visible anymore - black screen.
I really have no idea what's going on. It's the same for JPG and for PNG with an alpha channel.
EDIT (more details):
Well, it's hard to paste the code (objects everywhere, and it's huge), but it goes something like this:
//initialization - i commented everything else
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
//preparing texture
glEnable(GL_TEXTURE_2D);
glGenTextures(1, &texture_id);
glBindTexture(GL_TEXTURE_2D, texture_id);
int Mode = GL_BGR;
int nOfColors = image->format->BytesPerPixel;
if (nOfColors == 4) {
    if (image->format->Rmask == 0x000000ff)
        Mode = GL_RGBA;
    else
        Mode = GL_BGRA;
} else if (nOfColors == 3) {
    if (image->format->Rmask == 0x000000ff)
        Mode = GL_RGB;
    else
        Mode = GL_BGR;
}
// glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE );
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
//glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
glTexImage2D(GL_TEXTURE_2D, 0, nOfColors, image->w, image->h, 0, Mode, GL_UNSIGNED_BYTE, image->pixels);
glDisable(GL_TEXTURE_2D);
//drawing
glColor4f(1.0f,1.0f,1.0f,1.0f);
glEnable(GL_TEXTURE_2D);
glBindTexture( GL_TEXTURE_2D, _i );
glBegin(GL_QUADS);
glTexCoord2f(0, 0);
glVertex2f(x, y);
glTexCoord2f(1, 0);
glVertex2f(x + width, y);
glTexCoord2f(1, 1);
glVertex2f(x + width, y + height);
glTexCoord2f(0, 1);
glVertex2f(x, y + height);
glEnd();
glBindTexture( GL_TEXTURE_2D, NULL );
glDisable(GL_TEXTURE_2D);
EDIT2
"Black screen" may be a little confusing - I meant that nothing is displayed (my background is black, but it doesn't matter). If I turn blending off I get a nice texture on screen; with blending on, nothing but the background color.
Did you try to call glTexEnvf?
glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
Here is an example: http://unick-soft.ru/art/files/basicBlend.zip
For your case you need to look at this code:
//draw texture
glPushMatrix();
glColor4f(1.0, 1.0, 0.0, 1.0);
texture.switchOffTexture();
Sphere.drawObject();
glTranslatef(0.0, 0.0, 7.0);
glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE );
texture.bindTexture();
glBegin(GL_POLYGON);
glTexCoord2f(0.0, 0.0);
glVertex3f(-1.0, -1.0, 0.0);
glTexCoord2f(1.0, 0.0);
glVertex3f(1.0, -1.0, 0.0);
glTexCoord2f(1.0, 1.0);
glVertex3f(1.0, 1.0, 0.0);
glTexCoord2f(0.0, 1.0);
glVertex3f(-1.0, 1.0, 0.0);
glEnd();
glPopMatrix();
If you have problems with compilation, you can comment out the include: #include <gl\glaux.h>
Press 1, 2, or 3 to select the blending type. In case 3 you will see this result: http://unick-soft.ru/art/img/blend/text_blend.png
It seems as if your alpha channel were all 0 - are you positive that you are reading your texture correctly (with the alpha channel)? Try filling it by hand, or check whether it is really filled by whatever library you are using to read your files into pixels.
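One way to test that suggestion is to upload a small hand-filled RGBA texture with a known alpha and see whether it blends; a minimal sketch (texture_id as in the code above):
GLubyte debugTex[4][4][4];                    // tiny 4x4 RGBA test texture
for (int y = 0; y < 4; ++y)
    for (int x = 0; x < 4; ++x)
    {
        debugTex[y][x][0] = 255;              // red
        debugTex[y][x][1] = 0;
        debugTex[y][x][2] = 0;
        debugTex[y][x][3] = 128;              // roughly 50% alpha
    }
glBindTexture(GL_TEXTURE_2D, texture_id);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 4, 4, 0, GL_RGBA, GL_UNSIGNED_BYTE, debugTex);
// If this quad renders half-transparent, the blending state is fine and the
// image loader (or its alpha channel) is the problem.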
So it finally started to work after I changed my image-loading library to DevIL. The previous one had problems with the alpha channel, it seems.
Thanks all for your help.

Partly Transparent HUD-style overlay in OpenGL

I'm trying to make a program showing a red rotating cube in the background, overlayed with a textured quad.
The texture is a simple 24-bit bitmap of the words "Hello World" in black over a white background. I want the white background to be transparent so that the cube can be seen behind the overlay. The image loader checks the value of each pixel and adds the relevant alpha value to convert the image into a 32-bit bitmap.
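(The loader code isn't shown in the question; a hypothetical sketch of such a conversion, where src is the 24-bit pixel data and dst the 32-bit output, might look like this:)
for (int i = 0; i < width * height; ++i)
{
    unsigned char r = src[i * 3 + 0];
    unsigned char g = src[i * 3 + 1];
    unsigned char b = src[i * 3 + 2];
    dst[i * 4 + 0] = r;
    dst[i * 4 + 1] = g;
    dst[i * 4 + 2] = b;
    // white background becomes fully transparent, everything else opaque
    dst[i * 4 + 3] = (r == 255 && g == 255 && b == 255) ? 0 : 255;
}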
At the moment, my program displays the overlay with black text but a red background, same colour as the cube. Below is the code used for the initial texture set up:
if (bitmap->Load("test.bmp")) {
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, 3, bitmap->GetWidth(), bitmap->GetHeight(),
0, GL_RGBA, GL_UNSIGNED_BYTE, bitmap->GetPixelData());
}
And this is the whole of my display function, in case anything is interfering with anything else.
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(40, 1, 0.1, 27.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glColor3f(1.0, 0.0, 0.0);
glTranslatef(0.0, 0.0, -1.1);
glRotatef(angle, 1.0, 1.0, 0.0);
glutSolidCube(0.1);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0, 640, 480, 0.0, -1.0, 10.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glDisable(GL_CULL_FACE);
glClear(GL_DEPTH_BUFFER_BIT);
glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBindTexture(GL_TEXTURE_2D, texture);
glBegin(GL_QUADS);
glTexCoord2d(0.0, 0.0); glVertex2f(0.0, 0.0);
glTexCoord2d(1.0, 0.0); glVertex2f(320.0, 0.0);
glTexCoord2d(1.0, 1.0); glVertex2f(320.0, 240.0);
glTexCoord2d(0.0, 1.0); glVertex2f(0.0, 240.0);
glEnd();
glDisable(GL_BLEND);
glDisable(GL_TEXTURE_2D);
glFlush();
glutSwapBuffers();
The default texture environment is GL_MODULATE, which multiplies the current color (red from your cube) into the incoming texel value.
Switch to GL_DECAL or do a glColor3ub(255,255,255) before you render your text.
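A minimal sketch of either fix, applied to the drawing code from the question:
glColor3ub(255, 255, 255);                    // neutral color, so GL_MODULATE leaves the texel unchanged
// or, alternatively:
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);  // ignore the current color for the RGB part
glBindTexture(GL_TEXTURE_2D, texture);
// ... draw the textured quad as before ...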

Why GL_CLAMP matters here?

I'm running this example,
but changing:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
to:
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
I get a slightly different image at the top and right edges.
But as can be seen from the code:
glBegin(GL_QUADS);
//lower left
glTexCoord2f(0, 0);
glVertex2f(-1.0, -1.0);
//upper left
glTexCoord2f(0, 1.0);
glVertex2f(-1.0, 1.0);
//upper right
glTexCoord2f(1.0, 1.0);
glVertex2f(1.0, 1.0);
//lower right
glTexCoord2f(1.0, 0);
glVertex2f(1.0, -1.0);
glEnd();
The texture coordinates never go out of range, so why does GL_CLAMP matter?
In the fragment shader, the colors are mixed with the colors up and to the right of the current texel: a vec2(0.0625, 0.0625) is added to the texture coordinate. So with GL_CLAMP you always get the same value when looking up texture coordinates greater than or equal to (1.0 - 0.0625) = 0.9375.
void main() {
    vec4 s1 = texture2D(tex0, gl_TexCoord[0].st);
    vec4 s2 = texture2D(tex1, gl_TexCoord[0].st + vec2(0.0625, 0.0625));
    gl_FragColor = mix(vec4(1, s1.g, s1.b, 0.5), vec4(s2.r, s2.g, 1, 0.5), 0.5);
}
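To make the effect concrete (a sketch; the numbers follow from the shader's 0.0625 offset): a fragment at s = 0.97 samples tex1 at s = 0.97 + 0.0625 = 1.0325, i.e. outside [0, 1].
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);  // wraps around: samples s = 0.0325, the opposite edge
// versus
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);   // clamps: keeps returning the edge/border texel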

wglShareLists fails with error 6 : ERROR_INVALID_HANDLE The handle is invalid

I'm trying to share an HPBUFFERARB between two classes: TGLForm and TGLForm2.
(I tried FBOs, but with an old Borland Builder 6 version I can't manage to use them.)
My goal is to display the same buffer in two OpenGL windows.
So I declared this object outside of the first Form:
struct GLRenderToTexture
{
    struct
    {
        HDC hdc;
        HGLRC hGlRc;
        HPBUFFERARB hBuffer;
        PFNWGLGETEXTENSIONSSTRINGARBPROC wglGetExtensionsStringARB;
        PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB;
        PFNWGLCREATEPBUFFERARBPROC wglCreatePbufferARB;
        PFNWGLGETPBUFFERDCARBPROC wglGetPbufferDCARB;
        PFNWGLQUERYPBUFFERARBPROC wglQueryPbufferARB;
        PFNWGLDESTROYPBUFFERARBPROC wglDestroyPbufferARB;
        PFNWGLRELEASEPBUFFERDCARBPROC wglReleasePbufferDCARB;
        PFNWGLBINDTEXIMAGEARBPROC wglBindTexImageARB;
        PFNWGLRELEASETEXIMAGEARBPROC wglReleaseTexImageARB;
    } wgl;
    unsigned int texture; // the texture we're going to render to
};
GLRenderToTexture RTT;
I initialize it so that it has the same pixel format as the first GLForm:
void __fastcall TGLForm::FormCreate(TObject *Sender)
{
ghDC = GetDC(Handle);
if (!bSetupPixelFormat(ghDC)) Close();
ghRC = wglCreateContext(ghDC);
wglMakeCurrent(ghDC, ghRC);
InitializeGL();
int pixelFormats;
int intAttrs[32] ={WGL_RED_BITS_ARB,8,WGL_GREEN_BITS_ARB,8,WGL_BLUE_BITS_ARB,8,WGL_ALPHA_BITS_ARB,8,WGL_DRAW_TO_PBUFFER_ARB, GL_TRUE,WGL_BIND_TO_TEXTURE_RGBA_ARB, GL_TRUE,WGL_SUPPORT_OPENGL_ARB,GL_TRUE,WGL_ACCELERATION_ARB,WGL_FULL_ACCELERATION_ARB,WGL_DOUBLE_BUFFER_ARB,GL_FALSE,0}; // 0 terminate the list
unsigned int numFormats = 0;
// get an acceptable pixel format to create the PBuffer with
if (RTT.wgl.wglChoosePixelFormatARB(ghDC, intAttrs, NULL, 1, &pixelFormats, &numFormats)==FALSE)
AnsiString error = AnsiString().sprintf("wglChoosePixelFormatARB returned %i", GetLastError()); // GetLastError will tell us why it failed
//Set some p-buffer attributes so that we can use this p-buffer as a 2d texture target
const int attributes[]= {WGL_TEXTURE_FORMAT_ARB, WGL_TEXTURE_RGBA_ARB, // p-buffer will have RBA texture format
WGL_TEXTURE_TARGET_ARB, WGL_TEXTURE_2D_ARB, 0}; // Of texture target will be GL_TEXTURE_2D
// the size of the PBuffer must be the same size as the texture
RTT.wgl.hBuffer= RTT.wgl.wglCreatePbufferARB(ghDC, pixelFormats, ClientWidth, ClientHeight, attributes);
RTT.wgl.hdc= RTT.wgl.wglGetPbufferDCARB(RTT.wgl.hBuffer);
RTT.wgl.hGlRc= wglCreateContext(RTT.wgl.hdc);
wglMakeCurrent(NULL,NULL);
}
Here is my first DrawScene: the PaintGL() drawing is displayed perfectly on this form:
void TGLForm::DrawSceneForm1()
{
wglMakeCurrent(ghDC, ghRC);
ClientWidth = 1920;
ClientHeight = 1080;
// create a texture to use as the backbuffer
glGenTextures(1, &RTT.texture);
glBindTexture(GL_TEXTURE_2D, RTT.texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
// make sure this is the same color format as the screen
glTexImage2D(GL_TEXTURE_2D, 0, 4, ClientWidth, ClientHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
// switch to the texture context
wglMakeCurrent(RTT.wgl.hdc, RTT.wgl.hGlRc);
glEnable(GL_TEXTURE_2D); // Enable Texture Mapping
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE,GL_MODULATE);
glClear(GL_DEPTH_BUFFER_BIT);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(-1.0, 1.0, -1.0, 1.0, -1.0, 1.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glClearColor(0,0,0,1);
glClear(GL_COLOR_BUFFER_BIT);
glDisable(GL_TEXTURE_2D);
// switch back to the screen context
wglMakeCurrent(ghDC, ghRC);
wglShareLists(ghRC, RTT.wgl.hGlRc);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE,GL_MODULATE);
glClear(GL_DEPTH_BUFFER_BIT);
glViewport(0, 0, ClientWidth, ClientHeight);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(-1.0, 1.0, -1.0, 1.0, -1.0, 1.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
wglMakeCurrent(RTT.wgl.hdc, RTT.wgl.hGlRc);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, RTT.texture);
PaintGL();
glDisable(GL_TEXTURE_2D);
wglMakeCurrent(ghDC, ghRC);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, RTT.texture);
RTT.wgl.wglBindTexImageARB(RTT.wgl.hBuffer, WGL_FRONT_LEFT_ARB);
glBegin(GL_QUADS);
glColor4ub(255,255,255,255);
glTexCoord2f (0.0, 0.0); glVertex2f (-1.0, -1.0);
glTexCoord2f (1.0, 0.0); glVertex2f (1.0, -1.0);
glTexCoord2f (1.0, 1.0); glVertex2f (1.0, 1.0);
glTexCoord2f (0.0, 1.0); glVertex2f (-1.0, 1.0);
glEnd();
RTT.wgl.wglReleaseTexImageARB(RTT.wgl.hBuffer, WGL_FRONT_LEFT_ARB);
glDisable(GL_TEXTURE_2D);
glFlush();
SwapBuffers(ghDC);
wglMakeCurrent(NULL,NULL);
}
And here is my second GLForm's DrawScene: the problem is that I only see the colored quad, but the quad is not textured, or the texture is empty:
void TGLForm2::DrawSceneForm2()
{
wglMakeCurrent(ghDC2, ghRC2);
ClientWidth = 1920;
ClientHeight = 1080;
wglShareLists(RTT.wgl.hGlRc, ghRC2);
if (wglShareLists(RTT.wgl.hGlRc,ghRC2) == FALSE)
SCmsgError(AnsiString().sprintf("wglShareLists returned %i", GetLastError()));
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE,GL_MODULATE); //ARC
glClear(GL_DEPTH_BUFFER_BIT);
glViewport(0, 0, ClientWidth, ClientHeight);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(-1.0, 1.0, -1.0, 1.0, -1.0, 1.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, RTT.texture);
RTT.wgl.wglBindTexImageARB(RTT.wgl.hBuffer, WGL_FRONT_LEFT_ARB);
glBegin(GL_QUADS);
glColor4ub(200,200,200,200);
glTexCoord2f (0.0, 0.0); glVertex2f (-1.0, -1.0);
glTexCoord2f (1.0, 0.0); glVertex2f (1.0, -1.0);
glTexCoord2f (1.0, 1.0); glVertex2f (1.0, 1.0);
glTexCoord2f (0.0, 1.0); glVertex2f (-1.0, 1.0);
glEnd();
RTT.wgl.wglReleaseTexImageARB(RTT.wgl.hBuffer, WGL_FRONT_LEFT_ARB);
glDisable(GL_TEXTURE_2D);
glFlush();
SwapBuffers(ghDC);
}
=> How can I check whether this texture is empty or not? Export it to a bitmap and check it?
=> The wglShareLists call in DrawSceneForm2 returns an error with GetLastError:
Error 6: ERROR_INVALID_HANDLE - The handle is invalid.
=> Does somebody see what is wrong in this wglShareLists call or in my code?
When calling wglShareLists, the context must not be current. Preferably, share before you do anything else: sharing contexts will share anything created thereafter just fine. The best thing is to create all contexts that need to be shared at startup. If you use WGL_ARB_create_context, you can even do this atomically within the creation call.
If you can't for some reason (though, why?), then call wglMakeCurrent(0,0); first (you do the opposite in your code: you make the context current just before sharing).
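A minimal sketch of that order, using the handle names from the question (ReportError is a hypothetical error-reporting helper):
ghRC = wglCreateContext(ghDC);                    // context of the main window
RTT.wgl.hGlRc = wglCreateContext(RTT.wgl.hdc);    // context of the pbuffer
if (!wglShareLists(ghRC, RTT.wgl.hGlRc))          // share while neither context is current
    ReportError(GetLastError());
wglMakeCurrent(ghDC, ghRC);                       // only now make a context current and create GL objects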
I had a similar problem where:
wglShareLists returned 0
GetLastError() returned 3221684311 (0xc0070057)
It turns out you can't do much with hglrc2 (the 2nd parameter passed into wglShareLists) before you call wglShareLists. In my case I created and glUseProgram'd a shader, and then tried wglShareLists, resulting in the errors shown above. Moving wglShareLists to immediately after the wglCreateContext(hDC) of the 2nd RC worked. I was able to share textures across the two contexts.

rectangular texture in OpenGL with both width and height POT

I'm trying to use a rectangular texture with OpenGL. When the height is equal to the width of the texture, everything looks fine; however, when the height differs from the width, the texture looks distorted.
My display function is (h and w are globals storing the height and the width of the image):
Please note that the size of the drawn image doesn't matter. It is distorted regardless of the actual polygon size.
void display(void)
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
glBindTexture(GL_TEXTURE_2D, texName);
glTranslatef(-2.0f,-2.0f,0.0f);
glScalef(1.0f/128.0f,1.0f/128.0f,1.0f);
glBegin(GL_QUADS);
glTexCoord2f(0.0, 0.0); glVertex3f(0.0, 0.0, 0.0);
glTexCoord2f(0.0, 1.0); glVertex3f(0.0, w, 0.0);
glTexCoord2f(1.0, 1.0); glVertex3f(h, w, 0.0);
glTexCoord2f(1.0, 0.0); glVertex3f(h, 0.0, 0.0);
// Will be distorted also with the following:
/*glScalef(1.0f/128.0f,1.0f/128.0f,1.0f);
glTexCoord2f(0.0, 0.0); glVertex3f(0.0, 0.0, 0.0);
glTexCoord2f(0.0, 1.0); glVertex3f(0.0, h, 0.0);
glTexCoord2f(1.0, 1.0); glVertex3f(w, h, 0.0);
glTexCoord2f(1.0, 0.0); glVertex3f(w, 0.0, 0.0);*/
glEnd();
glFlush();
glDisable(GL_TEXTURE_2D);
}
I'm loading the texture with:
void *data = LoadBMP("c:\\dev\\64x128_face.bmp");
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glGenTextures(1, &texName);
glBindTexture(GL_TEXTURE_2D, texName);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER,
GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w,
h, 0, GL_RGBA, GL_UNSIGNED_BYTE,
data);
When I load a 64x64 square texture image, it looks fine. However, when I load a rectangular texture image, the image looks distorted.
How does OpenGL support rectangular POT textures? What's wrong with my code?
Your rect image is 64x128 but you render it with these commands:
glTexCoord2f(0.0, 0.0); glVertex3f(0.0, 0.0, 0.0);
glTexCoord2f(0.0, 1.0); glVertex3f(0.0, w, 0.0);
glTexCoord2f(1.0, 1.0); glVertex3f(h, w, 0.0);
glTexCoord2f(1.0, 0.0); glVertex3f(h, 0.0, 0.0);
where h is the height (=128) and w is the width (=64).
But you placed the height on x (which is the width direction) and the width on y (which is the height direction).
Maybe try this instead:
glTexCoord2f(0.0, 0.0); glVertex3f(0.0, 0.0, 0.0);
glTexCoord2f(0.0, 1.0); glVertex3f(0.0, h, 0.0);
glTexCoord2f(1.0, 1.0); glVertex3f(w, h, 0.0);
glTexCoord2f(1.0, 0.0); glVertex3f(w, 0.0, 0.0);
You should probably check for support of the GL_ARB_texture_non_power_of_two extension before you use non-power-of-two values for h and w in glTexImage2D(), because specifying arbitrary widths and heights is only valid with this extension.
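A minimal sketch of such a check, using the legacy extension string (strstr needs <cstring>):
const char *ext = (const char *) glGetString(GL_EXTENSIONS);
bool npotSupported = ext && strstr(ext, "GL_ARB_texture_non_power_of_two") != NULL;
if (!npotSupported)
{
    // fall back to power-of-two dimensions, e.g. pad the image up to the
    // next power of two and adjust the texture coordinates accordingly
}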