OpenGL Texture Transparency - C++

I'm using C++ and OpenGL to make a basic 2D game. I have a PNG image with transparent areas for my player. It works perfectly on my laptop and on the lab computers, but on my desktop the entire image is mostly see-through, not just the areas that are meant to be. What could cause this, and how can I fix it?
Here is the code I've used; it is the same on all machines:
glPushMatrix();
glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBindTexture(GL_TEXTURE_2D, playerTex);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
glTranslatef(XPos, YPos, 0.0);
glRotatef(heading, 0,0,1);
glBegin(GL_POLYGON);
glTexCoord2f(0.0, 1.0); glVertex2f(-40,40);
glTexCoord2f(0.0, 0.0); glVertex2f(-40,-40);
glTexCoord2f(1.0, 0.0); glVertex2f(40,-40);
glTexCoord2f(1.0, 1.0); glVertex2f(40,40);
glEnd();
glDisable(GL_BLEND);
glDisable(GL_TEXTURE_2D);
glPopMatrix();

I found the problem. I changed
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
to
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
and now it works correctly, though I'm not sure why.
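A likely explanation (my own reading, not from the original thread): GL_LINEAR_MIPMAP_LINEAR tells GL to sample from the mipmap chain, but if only level 0 was ever uploaded, the texture is "incomplete" and the result is driver-dependent, which would explain why the same executable behaved differently on different machines. A complete chain needs one level per halving down to 1x1; a minimal sketch of that count:

```cpp
#include <algorithm>

// Number of mipmap levels a complete chain needs for a w x h texture:
// each level halves both dimensions (rounding down) until 1x1.
int mipLevelCount(int w, int h) {
    int levels = 1;
    int size = std::max(w, h);
    while (size > 1) {
        size /= 2;
        ++levels;
    }
    return levels;
}
```

So a 256x256 texture needs levels 0 through 8. If you do want trilinear filtering, generate the full chain (e.g. with gluBuild2DMipmaps, or glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE) before the upload) instead of dropping back to GL_LINEAR.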

Does setting glColor4f(1,1,1,1) help? (I can't remember whether GL_REPLACE is affected by the vertex colour.)
Check glGetError() at appropriate places to make sure you're not doing anything really wrong.
Other generic tips:
Try to lock down all loose ends of the render state.
Make sure your PNG-reading library works correctly everywhere (otherwise, create the texture data in code).
It might be hardware related, in which case it would help if you listed the OSes, CPU types, and drivers.
I'm assuming you're running the same executable on all computers?
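To make the glGetError() suggestion concrete, here is a hedged helper (the hex values are the standard error codes from gl.h, copied here so the snippet stands alone without GL headers):

```cpp
#include <string>

// Standard GL error codes (values from gl.h) mapped to readable names,
// so glGetError() results can be logged after suspect calls.
std::string glErrorName(unsigned int err) {
    switch (err) {
        case 0:      return "GL_NO_ERROR";
        case 0x0500: return "GL_INVALID_ENUM";
        case 0x0501: return "GL_INVALID_VALUE";
        case 0x0502: return "GL_INVALID_OPERATION";
        case 0x0503: return "GL_STACK_OVERFLOW";
        case 0x0504: return "GL_STACK_UNDERFLOW";
        case 0x0505: return "GL_OUT_OF_MEMORY";
        default:     return "unknown GL error";
    }
}
```

Call glGetError() in a loop after a suspect block until it returns 0, printing glErrorName() for each result.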

Related

How to use OpenGL and DevIL get user drawing pixels

I need to load an image, display it, and let the user draw some strokes on it, then get the pixels of those strokes.
I know OpenGL can display a texture image read by DevIL. But I am not sure how to use OpenGL to get the user's drawing pixels on top of the loaded texture.
First off, note that a lot of this code is deprecated, but it is easier to understand from plain code snippets. I'm not going to do everything for you, but I hope to get you started by outlining the basic workflow.
There are a few things you need to do to get the result you are looking for.
Firstly you have to load your texture in video memory. This is done with:
glGenTextures(1, &texture_id); //generate a texture object
glBindTexture(GL_TEXTURE_2D, texture_id); //bind the texture
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); //set filters
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST); //set filters
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texture_width, texture_height, 0, GL_RGB, GL_UNSIGNED_BYTE, original_image_data); //create the actual texture in video ram
When this succeeds you can draw your texture with:
glMatrixMode(GL_PROJECTION);
glPushMatrix();
glLoadIdentity();
//set to orthographic projection
glOrtho(0.0, window_width, 0.0, window_height, -1.0, 1.0);
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glLoadIdentity();
glBindTexture(GL_TEXTURE_2D,texture_id);
glBegin(GL_QUADS);
glTexCoord2f(0, 1); glVertex2f(-1.0f, 1.0f);
glTexCoord2f(1, 1); glVertex2f( 1.0f, 1.0f);
glTexCoord2f(1, 0); glVertex2f( 1.0f, -1.0f);
glTexCoord2f(0, 0); glVertex2f(-1.0f, -1.0f);
glEnd();
glPopMatrix();
glMatrixMode(GL_PROJECTION);
glPopMatrix();
glMatrixMode(GL_MODELVIEW);
The next thing you need to do is capture your user's mouse input. If you are on Windows, you can use the window-procedure callback and look for the WM_MOUSEMOVE event. If you use a library for window management, it will probably provide functionality for keyboard and mouse input.
Now that you have the mouse input, you should draw a line on the screen every time a user moves the mouse while holding down the button:
glLineWidth(2.5);
glColor3f(1.0, 0.0, 0.0);
glBegin(GL_LINES);
glVertex2f(mouse_x_start, mouse_y_start);
glVertex2f(mouse_x_end, mouse_y_end);
glEnd();
glColor3f(1.0, 1.0, 1.0);
When all of the above goes well, you should see your texture on the screen and a red line if you hold the mouse button and move the mouse. You are nearly there. The last thing that needs to be done is read the pixels. You can do this with glReadPixels() like this:
glReadPixels(0, 0, window_width, window_height, GL_RGB, GL_UNSIGNED_BYTE, new_image_data);
You now have a byte array with the user's strokes on it. I would highly recommend writing your own code for this process, because the code I used is deprecated, and should only be used when targeting older platforms. The workflow should remain the same though. I hope this is enough to get you started. Good luck!
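One pitfall worth flagging with the glReadPixels() step: GL fills the buffer bottom row first, while mouse coordinates from most windowing systems are measured from the top. A small sketch of the index math for a tightly packed GL_RGB buffer (assuming GL_PACK_ALIGNMENT is 1; new_image_data and the window size come from the snippets above):

```cpp
#include <cstddef>

// Byte offset of pixel (x, yFromTop) in a buffer filled by
// glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, buf).
// GL stores row 0 at the bottom, so the window-space y must be flipped.
// Assumes tightly packed rows (GL_PACK_ALIGNMENT of 1).
std::size_t rgbPixelOffset(int x, int yFromTop, int width, int height) {
    int yFromBottom = height - 1 - yFromTop;
    return (static_cast<std::size_t>(yFromBottom) * width + x) * 3;
}
```

For example, with a 4x3 window the top-left pixel's red byte lands at offset 24, not 0.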
I assume you are working on a plain 2D app.
If performance isn't a concern, you may consider doing everything in software by directly manipulating pixel data and drawing the image with your graphics library of choice. I recommend the Simple DirectMedia Layer (SDL) library; its companion library SDL_image can load a good assortment of formats.
An approach like this works until you start juggling big or multiple textures. If you need GPU horsepower for realtime framerates, then you must fight your way through framebuffer objects, but beware! That basically means doing everything you can inside the pixel shaders and limiting calls like glReadPixels/glTexImage2D as much as possible.

What am I doing wrong that I can't render an image/texture properly in OpenGL?

I'm trying to render an image to the window. Super simple (or so I thought). No matter what I do, I always get this purple square. Some tidbits of my code:
// load image from file, called one time on load:
glClearColor (0.0, 0.0, 0.0, 0.0);
glShadeModel(GL_FLAT);
glEnable(GL_DEPTH_TEST);
RgbImage theTexMap( filename );
glGenTextures(1, &textureID);
glBindTexture(GL_TEXTURE_2D, textureID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, theTexMap.GetNumCols(), theTexMap.GetNumRows(), 0, GL_RGB, GL_UNSIGNED_BYTE, theTexMap.ImageData() );
// render quad and texture, called inside glutDisplayFunc callback
glEnable(GL_TEXTURE_2D);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glBegin(GL_QUADS);
glTexCoord2f(0.0, 0.0); glVertex3f(-50.0, -50.0, 0.0);
glTexCoord2f(0.0, 1.0); glVertex3f(-50.0, 50.0, 0.0);
glTexCoord2f(1.0, 1.0); glVertex3f(50.0, 50.0, 0.0);
glTexCoord2f(1.0, 0.0); glVertex3f(50.0, -50.0, 0.0);
glEnd();
glFlush();
glDisable(GL_TEXTURE_2D);
I'm cutting out a lot of code because I'm extending an example from a third-party library (ARToolKit); there's much more code there than needs displaying here.
Any ideas for a mistake I might be making that would display the quad, but not render the texture/image?
Rebind your texture object in your glutDisplayFunc() callback, Just In Case™.
Also, I'm slightly leery of GL_RGBA8. Try GL_RGBA. Probably superstition on my part, though.
I don't know without trying it myself, but it seems a bit strange that you're using GL_RGBA8 for the internal format and GL_RGB for the pixel format.
Personally, unless I'm using lighting on my textured objects, I'll also call glDisable(GL_LIGHTING). I don't know if that will help, but I've run into things I didn't really understand where light and texture combinations are concerned.
A few ideas:
Shouldn't you call glEnable(GL_TEXTURE_2D) before you configure and upload the texture?
Did you make sure that texture dimensions are powers of 2?
If you use a mipmap-based minification filter (e.g. GL_LINEAR_MIPMAP_LINEAR), you will need to generate mipmaps from your original texture.
Did you try GL_REPLACE rather than GL_MODULATE?
I don't see you passing any colour with your vertices, and GL_MODULATE modulates the texture with whatever the current colour is.
Your texture only specifies MIP level 0, and not the other MIP levels, so the result may be undefined if it isn't rendering at MIP level 0. The quick fix is:
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
It's an easy mistake. A good reference:
http://www.opengl.org/wiki/Common_Mistakes

texture on cube-side with opengl

Hello, I want to use a texture on a cube (created by glutSolidCube()). How can I define where the texture appears
(for example, on the "front side" of the cube)?
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, texture[0]);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, filterMode);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, filterMode);
glColor4f(0.8,0.7,0.11,1.0);
glPushMatrix();
glScalef(4, 1.2, 1.5);
glTranslatef( 0, 0.025, 0);
glutSolidCube(0.1);
glPopMatrix();
glDisable(GL_TEXTURE_2D);
Thanks
Not possible, since glutSolidCube() only generates vertices and normals, not texture coordinates.
However, there are workarounds.
Unfortunately, using glutSolidCube is out, as it doesn't support texturing. What I'd suggest instead is a tutorial that explains the process. It's a bit outdated, but NeHe's texturing tutorial has code that shows how to draw a cube, and the code is commented to explain which side is which.
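To make the "draw the cube yourself" workaround concrete, here is a minimal sketch of the data for one face (my own illustration, not from the NeHe code): the front (+Z) face of a cube centered at the origin, with the full texture stretched across it. The other five faces follow the same pattern with the fixed coordinate moved to the other axes.

```cpp
#include <array>

struct TexturedVertex {
    float x, y, z;  // position
    float u, v;     // texture coordinate
};

// Four corners of the front (+Z) face of a cube of edge length `size`
// centered at the origin, wound counter-clockwise, with the full texture
// mapped across the face. Feed these to glTexCoord2f/glVertex3f inside
// glBegin(GL_QUADS)/glEnd() instead of calling glutSolidCube().
std::array<TexturedVertex, 4> frontFace(float size) {
    float h = size / 2.0f;
    return {{
        { -h, -h,  h,  0.0f, 0.0f },  // bottom-left
        {  h, -h,  h,  1.0f, 0.0f },  // bottom-right
        {  h,  h,  h,  1.0f, 1.0f },  // top-right
        { -h,  h,  h,  0.0f, 1.0f },  // top-left
    }};
}
```

Drawing it is then a glBegin(GL_QUADS) loop over the four vertices, calling glTexCoord2f(v.u, v.v) before each glVertex3f(v.x, v.y, v.z).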

OpenGL texture mapping blocking colours on FreeType?

I'm using FreeType to render fonts in OpenGL. However, I'm having a problem where I cannot change the font colour whenever texture mapping is enabled. No matter what I select using glColor3f, the text just comes out white. The texture itself works fine.
glClear(GL_COLOR_BUFFER_BIT);
glLoadIdentity();
glColor3f(0.5,0.0,0.5);
glPushMatrix();
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_TEXTURE_2D);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
glBindTexture(GL_TEXTURE_2D, texName);
glBegin(GL_POLYGON);
glTexCoord2f(0,1); glVertex2f(-16,-16);
glTexCoord2f(0,0); glVertex2f(-16,16);
glTexCoord2f(1,0); glVertex2f(16,16);
glTexCoord2f(1,1); glVertex2f(16,-16);
glEnd();
glDisable(GL_TEXTURE_2D);
glDisable(GL_BLEND);
glPopMatrix();
glColor3f(1,0,0);
print(our_font, -300+screenWidth/2.0, screenHeight/2.0, "fifty two - %7.2f", spin);
This is the problem code. I can confirm that drawing a polygon beneath this code will indeed make it red; the text, however, is not changing to red as it should. If you remove the texture mapping above, it turns red again. I can only think it's a problem with enabling and disabling, and that I've forgotten to do something?
Fixed it. Just after disabling texturing, I had forgotten to set the texture environment back to modulate:
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
Adding this after disabling texturing/blending fixes the problem.

Simple OpenGL texture map not working?

I'm trying to figure out texture mapping in OpenGL and I can't get a simple example to work.
The polygon is being drawn, though it's not textured, just a solid colour. The bitmap is being loaded correctly into sprite1[], as I was successfully using glDrawPixels up until now.
I use glGenTextures to get my texture name, but I notice it doesn't change texName1; this GLuint stays at whatever I initialize it to, even after the call to glGenTextures.
I have enabled GL_TEXTURE_2D.
Heres the code:
GLuint texName1 = 0;
glGenTextures(1, &texName1);
glBindTexture(GL_TEXTURE_2D, texName1);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_BGRA_EXT, sprite1[18], sprite1[22], 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, &sprite1[54]);
glColor3f(1, 1, 0);
glBindTexture(GL_TEXTURE_2D, texName1);
glBegin(GL_QUADS);
glTexCoord2f (0.0, 0.0);
glVertex3f (0.0, 0.0, -5.0f);
glTexCoord2f (1.0, 0.0);
glVertex3f (.5, 0.0, -5.0f);
glTexCoord2f (1.0, 1.0);
glVertex3f (.5, .5, -5.0f);
glTexCoord2f (0.0, 1.0);
glVertex3f (0.0, .5, -5.0f);
glEnd();
UPDATE:
I'm at a loss. Here's everything I've tried:
Turns out I was initializing my texture before OpenGL was initialized. The texture is initialized (glGenTextures -> glTexImage2D) in a class constructor and drawn (glBegin -> glEnd) in a member function that is called every frame. glGenTextures appears to be working correctly now, and I'm getting a name of 1.
Tried every possible combination of GL_RGBA8 and GL_BGRA_EXT (GL_BGRA doesn't work on my system; I need the _EXT), and I even removed the alpha channel from the bitmap and tried all combinations of GL_RGB, GL_BGR_EXT, etc. No luck.
Tried procedurally creating a bitmap and using that.
Made sure GL_COLOR_MATERIAL isn't enabled.
Changed the bitmap size to 32x32.
Tried glTexEnvi instead of glTexEnvf.
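One more loose end in the loader (an observation about the code above, not a confirmed fix): sprite1[18] and sprite1[22] are single bytes, but the BMP header stores width and height as 4-byte little-endian integers at those offsets, so passing single bytes only happens to work while both dimensions are under 256. A sketch of reading the fields properly:

```cpp
#include <cstdint>

// Read a 32-bit little-endian integer from a byte buffer, e.g. the
// width (offset 18) and height (offset 22) fields of a BMP header.
std::int32_t readLE32(const unsigned char* buf, int offset) {
    return static_cast<std::int32_t>(buf[offset])
         | static_cast<std::int32_t>(buf[offset + 1]) << 8
         | static_cast<std::int32_t>(buf[offset + 2]) << 16
         | static_cast<std::int32_t>(buf[offset + 3]) << 24;
}
```

With a 29x20 bitmap this happens to agree with the single-byte reads, but it stops working the moment the image grows.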
In addition to mentat's note that you might have a problem with non-power-of-two texture dimensions, you mention that texture name generation isn't changing the name.
That sounds as if you're calling glGenTextures() too early, i.e. before initializing OpenGL. If you're not, then I suggest adding code just after the call to glGenTextures() that checks the OpenGL error state by calling glGetError().
In your comments, you say your bitmap is 29x20 pixels. AFAIK, to generate a valid texture, OpenGL requires that the image size be a power of 2 in each dimension. It doesn't need to be square; it can be a rectangle. You can overcome this by using extensions like GL_ARB_texture_rectangle.
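A quick way to check the dimensions is the classic bit trick (a power of two has exactly one bit set, so clearing the lowest set bit leaves zero):

```cpp
// True if n is a power of two (1, 2, 4, 8, ...).
// n & (n - 1) clears the lowest set bit; for a power of two that is
// the only bit, so the result is 0.
bool isPowerOfTwo(int n) {
    return n > 0 && (n & (n - 1)) == 0;
}
```

Both 29 and 20 fail this check, so without GL 2.0 or an NPOT extension the glTexImage2D call above cannot produce a usable texture.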
I'll put this here, as I had the same issue and found another post explaining it.
The iPhone does support GL_BGRA (GL_EXT_BGRA), but seemingly only as an input format and not as an internal format. So if you change the glTexImage2D call to use an internal format of GL_RGBA, it works:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_BGRA, GL_UNSIGNED_BYTE, &sprite1[54]);
I hope this helps someone else who stumbles upon this post.
Some random ideas:
GL_COLOR_MATERIAL might be enabled.
Change glTexEnvf to glTexEnvi and see if that helps.
If texName1 is still 0 after glGenTextures, you might not have an active OpenGL context.
For error checking, I recommend writing a small function that prints readable output for the most common results of glGetError() and using it to find the line that creates the error. Another possibility is a tool like GLIntercept, BuGLe, or gDEBugger.
My OpenGL is rusty, but I remember having the same problems with glTexImage2D. I eventually managed to make it work, but I always had more luck with gluBuild2DMipmaps, so I ended up with
gluBuild2DMipmaps(GL_TEXTURE_2D, type, i.width, i.height, type, GL_UNSIGNED_BYTE, i.data);
which replaced
glTexImage2D(GL_TEXTURE_2D, 0, type, i.width, i.height, 0, type, GL_UNSIGNED_BYTE, i.data);
I found the problem. My call to glEnable was glEnable(GL_BLEND | GL_TEXTURE_2D). Using glGetError() I saw I was getting GL_INVALID_ENUM for that call, so I moved GL_TEXTURE_2D into its own glEnable call and bingo. I guess OR-ing flags together isn't allowed for glEnable?
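For reference, glEnable() takes a single capability enum, not a bitmask. GL_BLEND and GL_TEXTURE_2D are plain enum values (0x0BE2 and 0x0DE1 in gl.h), so OR-ing them produces a third number that names no capability at all, hence the GL_INVALID_ENUM. A small demonstration with the raw values copied out of gl.h:

```cpp
// glEnable capabilities are plain enum values, not bit flags.
// The values below are the standard ones from gl.h.
const unsigned int BLEND      = 0x0BE2;  // GL_BLEND
const unsigned int TEXTURE_2D = 0x0DE1;  // GL_TEXTURE_2D

// OR-ing them yields a value that is neither capability, which is why
// glEnable(GL_BLEND | GL_TEXTURE_2D) raises GL_INVALID_ENUM.
unsigned int combined() { return BLEND | TEXTURE_2D; }
```

The result, 0x0FE3, matches neither constant, so the driver rightly rejects it.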
First thing I'd check is the colour-material setting, as mentioned by ShadowIce; then check your texture file to ensure it's a reasonable size (e.g. 256x256) and an RGB bitmap. If the file has even a slight problem, it WILL NOT render correctly, no matter what you try.
Then I'd stop trying to debug that code in isolation and instead see what you have that differs from the tutorial on the NeHe website.
NeHe is always a good place to check if you're trying to do stuff in OpenGL. Textures are probably the hardest thing to get right, and they only get more difficult as the rest of your GL skills increase.