OpenGL Fade in / fade out a texture2D

I am writing an application that displays .jpg files that are stored as Texture2D (RGB) in OpenGL. I want to smoothly change from one texture2D to the next by fading to black, then fading into the next texture.
After looking for some explanation I wrote something like this.
void renderTexture()
{
    glMatrixMode(GL_MODELVIEW);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, mTexture);
    gluSphere(mQuad, 1.0f, 50, 50);
    glBindTexture(GL_TEXTURE_2D, 0);
}
void fadeToBlack()
{
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
    for (GLfloat alpha = 1.0; alpha > 0.0; alpha -= 0.05)
    {
        glColor4f(0.0, 0.0, 0.0, alpha);
        renderTexture();
        glFlush();
        glutSwapBuffers();
    }
    glDisable(GL_BLEND);
}
Unfortunately, this does not fade to black but instead switches to black immediately. I must have some misunderstanding of how GL_BLEND works here. Can somebody please point out what I am doing wrong?
** EDIT: This did the trick. Thanks a lot j-p and Benjamin for the pointers **
void fadeToBlack()
{
    for (GLfloat alpha = 1.0; alpha > 0.0; alpha -= 0.001)
    {
        renderTexture();
        glColor4f(alpha, alpha, alpha, alpha);
        glFlush();
        glutSwapBuffers();
    }
    glColor4f(1.0, 1.0, 1.0, 1.0);
}

Note that the for loop will execute so quickly that the change will appear to happen instantly; the fade only becomes visible if each step is displayed for a meaningful amount of time.
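A more robust approach is to drive the alpha from elapsed time rather than a loop counter. Here is a minimal sketch, assuming a GLUT main loop; renderFrame would be registered with glutDisplayFunc, and fadeStartMs / FADE_DURATION_MS are illustrative names, not from the question:

// Sketch: drive the fade from elapsed time instead of a tight loop.
static int fadeStartMs = -1;               // -1 means "not fading"
static const int FADE_DURATION_MS = 1000;

void startFade()
{
    fadeStartMs = glutGet(GLUT_ELAPSED_TIME);
}

void renderFrame()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    GLfloat brightness = 1.0f;
    if (fadeStartMs >= 0)
    {
        int elapsed = glutGet(GLUT_ELAPSED_TIME) - fadeStartMs;
        brightness = 1.0f - (GLfloat)elapsed / FADE_DURATION_MS;
        if (brightness <= 0.0f)   // fade finished: swap textures, fade back in, etc.
        {
            brightness = 0.0f;
            fadeStartMs = -1;
        }
    }

    // GL_MODULATE (the default texture environment) multiplies the texture
    // color by the current color, so scaling all components darkens the image.
    glColor4f(brightness, brightness, brightness, 1.0f);
    renderTexture();

    glutSwapBuffers();
    glutPostRedisplay();  // keep animating
}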

Related

Texture Displayed with Green Hue

I'm learning how to apply textures in OpenGL. I have a fairly simple cube on which I am trying to apply a texture to make it look like a wooden board. When I apply my texture, it displays with a green hue. I can apply some other textures that look just fine, so I can't figure out what is wrong with this one. I created the texture from a jpg that I downloaded and converted to a bmp. The bmp file looks fine when I view it in Preview (I'm on a Mac). I'll attach a screenshot of the original bitmap and also of how it looks when rendered by OpenGL.
Here is the code that I am using:
unsigned int texture[2]; // Texture names

// define the board
float square_edge = 1;
float border = 0.5;
float board_thickness = 0.25;
float board_corner = 4 * square_edge + border;
float board_width = 2 * board_corner;

GLfloat board_vertices[8][3] = {
    {-board_corner,  board_corner,  0.0},
    {-board_corner, -board_corner,  0.0},
    { board_corner, -board_corner,  0.0},
    { board_corner,  board_corner,  0.0},
    {-board_corner,  board_corner, -board_thickness},
    {-board_corner, -board_corner, -board_thickness},
    { board_corner, -board_corner, -board_thickness},
    { board_corner,  board_corner, -board_thickness}
};
void polygon(int a, int b, int c, int d) {
    glBindTexture(GL_TEXTURE_2D, texture[0]);
    glBegin(GL_QUADS);
    glTexCoord2f(0.0, 0.0); glVertex3fv(board_vertices[a]);
    glTexCoord2f(1.0, 0.0); glVertex3fv(board_vertices[b]);
    glTexCoord2f(1.0, 1.0); glVertex3fv(board_vertices[c]);
    glTexCoord2f(0.0, 1.0); glVertex3fv(board_vertices[d]);
    glEnd();
}
void draw_board() {
    glPushMatrix();
    glRotatef(rotx, 1.0, 0.0, 0.0);
    glScalef(1/board_corner, 1/board_corner, 1/board_corner);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, board_vertices);
    glBindTexture(GL_TEXTURE_2D, texture[1]);
    glColor3f(1.0, 1.0, 1.0); // color of the border, sides, bottom of board
    // draw the board
    polygon(0,3,2,1);
    polygon(2,3,7,6);
    polygon(0,4,7,3);
    polygon(1,2,6,5);
    polygon(4,5,6,7);
    polygon(0,1,5,4);
    glPopMatrix();
}
void display()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_TEXTURE_2D);
    glLoadIdentity();
    double Ex = -2*dim*Sin(th)*Cos(ph);
    double Ey = +2*dim*Sin(ph);
    double Ez = +2*dim*Cos(th)*Cos(ph);
    gluLookAt(Ex,Ey,Ez , 0,0,0 , 0,Cos(ph),0);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
    draw_board();
    glFlush();
    glutSwapBuffers();
}
The problem is in the bmp image itself. As I mentioned, I created it from a jpg file by converting it via GIMP. Somehow, this conversion went awry. When I run 'file wood.bmp' from the command line, the response is 'wood.bmp: data'. It should show 'wood.bmp: PC bitmap, Windows 3.x format, 128 x 128 x 24' or something similar. I redid the conversion using good ol' MS Paint, and the problem went away.
I'm still trying to find out how to do this in GIMP, but for now this all works. Thanks for all the suggestions!
When exporting a .bmp image with GIMP, make sure you select the "Do not write color space information" option under the Compatibility Options dropdown. .bmp images exported this way won't appear with a green hue when rendered with OpenGL.
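If you want to detect such files programmatically, here is a sketch (an illustrative helper, not part of the code above) that reads the BMP info-header size field, which distinguishes the classic 40-byte header from the color-space-bearing V4/V5 variants that simple loaders tend to misparse:

#include <cstdio>
#include <cstdint>

// Returns the BMP info-header size, or -1 on error.
// 40 = classic BITMAPINFOHEADER (what most simple loaders expect),
// 108 = BITMAPV4HEADER, 124 = BITMAPV5HEADER (written when color
// space information is included).
int bmpInfoHeaderSize(const char *path)
{
    std::FILE *f = std::fopen(path, "rb");
    if (!f) return -1;

    unsigned char buf[18];
    size_t got = std::fread(buf, 1, sizeof buf, f);
    std::fclose(f);

    if (got != sizeof buf || buf[0] != 'B' || buf[1] != 'M')
        return -1;  // not a BMP at all

    // biSize is a little-endian 32-bit value at byte offset 14
    uint32_t size = buf[14] | (buf[15] << 8)
                  | (uint32_t(buf[16]) << 16) | (uint32_t(buf[17]) << 24);
    return int(size);
}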

OpenGL depth buffer isn't working

I am attempting to make a simple drawing using OpenGL. However, the depth buffer doesn't appear to be working.
Other people with a similar problem are typically doing one of two things wrong:
Not including glEnable(GL_DEPTH_TEST)
Bad clipping values
However, my code does not have either of these problems.
...
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
gluPerspective(25.0, 1.0, 10.0, 200.0);

// Set the camera location
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
gluLookAt(20.0, 10.0, 50.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0);

// Enable depth test
glEnable(GL_DEPTH_TEST);

// Cull backfacing polygons
glCullFace(GL_BACK);
glEnable(GL_CULL_FACE);

drawCoordinateAxis();
drawBox(5.0, 2.0, 5.0, 0.8, 0.0, 0.0);
glTranslated(1.0, -1.0, 1.0); // The box is 5x2x5; it is shifted 1 unit down and 1 in the x and z directions
drawBox(5.0, 2.0, 5.0, 0.0, 1.0, 1.0);
...
When I execute my code, this is drawn. http://imgur.com/G9y41O1
Note that the blue box and the red box collide, so the red box should be covering part of the blue box.
The functions drawCoordinateAxis() and drawBox() just draw a few primitives, nothing fancy inside.
I am running this on Debian squeeze.
void reshape(GLint width, GLint height)
{
    g_Width = width;
    g_Height = height;
    glViewport(0, 0, g_Width, g_Height);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(65.0, (float)g_Width / g_Height, g_nearPlane, g_farPlane);
    glMatrixMode(GL_MODELVIEW);
}
So set the matrix mode to GL_PROJECTION first, then call gluPerspective, and then switch back to GL_MODELVIEW. In the snippet above, gluPerspective is called before any glMatrixMode(GL_PROJECTION), so the perspective transform never reaches the projection matrix.
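Applied to the questioner's fragment, only the ordering changes; a sketch:

// gluPerspective must be issued while GL_PROJECTION is current;
// otherwise it lands on whatever matrix stack happens to be active
// (and, without glLoadIdentity, accumulates every frame).
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(25.0, 1.0, 10.0, 200.0);

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
gluLookAt(20.0, 10.0, 50.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0);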

OpenGL - how to rotate an object without rotating lights

I want to draw a rotating cube in the middle of the screen, and I want it to be lit by a light above it (I want it to look as if the cube were being lit from a fixed screen position). My problem is that I don't know how to prevent the light from rotating with the cube.
Here's the code:
(SUMMARY: initGL, paintGL, and resizeGL are the functions that you always have to implement. In paintGL I use makeCube(). In makeCube() I use glBegin(GL_QUADS) to make a cube, and I use calcNormal() to calculate the normals of the cube.)
-------------initGL--------------------------
angle=0.0;
glEnable (GL_DEPTH_TEST);
glEnable (GL_LIGHTING);
GLfloat LightDiffuse[]= { 1.0f, 1.0f, 1.0f, 1.0f };
GLfloat LightPosition[]= { 0.0f, 1.5f,1.5f, 1.0f };
glLightfv(GL_LIGHT0, GL_DIFFUSE, LightDiffuse);
glLightfv(GL_LIGHT0, GL_POSITION,LightPosition);
glEnable (GL_LIGHT0);
--------------paintGL------------------
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(0.0, 0.0, -13.0);
glRotatef(angle,0.0f,1.0f,0.0f);
makeCube();
angle+=0.3;
--------------void makeCube()-------------------
float P[8][3] = { {-1,-1, 1}, { 1,-1, 1}, { 1, 1, 1}, {-1, 1, 1},
                  {-1,-1,-1}, { 1,-1,-1}, { 1, 1,-1}, {-1, 1,-1} };
float *planes[6][4] = { {P[0],P[1],P[2],P[3]},
                        {P[1],P[5],P[6],P[2]},
                        {P[4],P[7],P[6],P[5]},
                        {P[0],P[3],P[7],P[4]},
                        {P[3],P[2],P[6],P[7]},
                        {P[0],P[4],P[5],P[1]} };
int i;
for (i = 0; i < 6; i++) {
    float *normal = calcNormal(planes[i][0], planes[i][1], planes[i][2]);
    glBegin(GL_QUADS);
    glNormal3f(normal[0], normal[1], normal[2]);
    glVertex3f(planes[i][0][0], planes[i][0][1], planes[i][0][2]);
    glVertex3f(planes[i][1][0], planes[i][1][1], planes[i][1][2]);
    glVertex3f(planes[i][2][0], planes[i][2][1], planes[i][2][2]);
    glVertex3f(planes[i][3][0], planes[i][3][1], planes[i][3][2]);
    glEnd();
}
----------------float* calcNormal()----------------------
float vec1[3] = { P2[0]-P1[0], P2[1]-P1[1], P2[2]-P1[2] };
float vec2[3] = { P3[0]-P2[0], P3[1]-P2[1], P3[2]-P2[2] };
float cross[3] = { vec1[1]*vec2[2] - vec2[1]*vec1[2],
                   vec1[2]*vec2[0] - vec2[2]*vec1[0],
                   vec1[0]*vec2[1] - vec2[0]*vec1[1] };
float modCross = sqrt(cross[0]*cross[0] + cross[1]*cross[1] + cross[2]*cross[2]);
cross[0] /= modCross;
cross[1] /= modCross;
cross[2] /= modCross;
// Note: this returns a pointer to the local array 'cross', which is
// undefined behavior once the function exits; make it static or have
// the caller supply the output buffer.
return cross;
-------------resizeGL--------------------------
glViewport(0, 0, width, height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
GLfloat x = GLfloat(width) / height;
glFrustum(-x, +x, -1.0, +1.0, 4.0, 15.0);
glMatrixMode(GL_MODELVIEW);
It seems that you're transforming the position of the light in your paintGL section.
Looking over old code, I found an app in my code directory that loads and rotates .OBJ meshes while allowing the light to be moved.
I think that the solution is to set the position of the light each frame. (Can't remember; it's been over 18 months since I touched the project.)
void idleFunc()
{
    light(); /// *** I think you need to replicate this functionality ***
    glPushMatrix();
    myGluLookAt(0.0, -.50, -6.0,  /* eye is at (0,0,5) */
                0.0, 0.0, 0.0,    /* center is at (0,0,0) */
                0.0, 1.0, 0.);    /* up is in positive Y direction */
    transformFunc();
    displayFunc();
    glPopMatrix();
}

void displayFunc()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    if (useDrawList)
        glCallList(DLid);
    else
        drawObj(loadedObj);
    drawLight0(); // *** just displays an unlit sphere at the position of the light ***
    glutSwapBuffers();
    frameCount++;
}

/* set the position of each of the lights */
void light()
{
    glLightfv(GL_LIGHT0, GL_POSITION, lightPos1);
    glLightfv(GL_LIGHT1, GL_POSITION, lightPos2);
}
I solved this problem by drawing the cube with VERTEX ARRAYS rather than DIRECT MODE; it seems that rotations and lights affect the object in a different way with each method, which is quite weird.
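For the record, the fixed-function pipeline transforms GL_POSITION by the modelview matrix in effect at the moment glLightfv is called, so re-sending the light each frame before the cube's rotation also works. A sketch of paintGL reordered that way, using the LightPosition array declared in initGL:

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(0.0, 0.0, -13.0);

// Sent before glRotatef, so the light position is fixed in the scene
// and only the cube picks up the rotation.
glLightfv(GL_LIGHT0, GL_POSITION, LightPosition);

glRotatef(angle, 0.0f, 1.0f, 0.0f);
makeCube();
angle += 0.3;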

Drawing the alpha channel correctly in OpenGL

After loading an image, I have the individual bytes for each channel loaded into an array of unsigned characters. It's passed to a function that projects it as a texture onto a quad. Everything seems to work properly other than the alpha channel, which shows up as the background color. I'm using OpenGL to draw the image. Would I benefit by adding a layering mechanism? Also, how can I achieve the transparent effect that I want?
Note: This is the code that I have a feeling needs changed:
void SetUpView()
{
    // Set color and depth clear value
    glClearDepth(1.f);
    glClearColor(1.f, 0.f, 0.f, 0.f);

    // Enable Z-buffer read and write
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_TEXTURE_2D);
    glDepthMask(GL_TRUE);
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);

    // Setup a perspective projection
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(90.f, 1.f, 1.f, 500.f);
}
Also, here's the code to render the quad.
void DrawTexturedRect(RectBounds *Verts, Image *Texture)
{
    glBindTexture(GL_TEXTURE_2D, GetTextureID(Texture));
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glPixelZoom(1, -1);
    glBegin(GL_QUADS);
    glTexCoord2f(0.0, 1.0);
    glVertex3f(Verts->corners[0].x, Verts->corners[0].y, Verts->corners[0].z);
    glTexCoord2f(1.0, 1.0);
    glVertex3f(Verts->corners[1].x, Verts->corners[1].y, Verts->corners[1].z);
    glTexCoord2f(1.0, 0.0);
    glVertex3f(Verts->corners[2].x, Verts->corners[2].y, Verts->corners[2].z);
    glTexCoord2f(0.0, 0.0);
    glVertex3f(Verts->corners[3].x, Verts->corners[3].y, Verts->corners[3].z);
    glEnd();
}
The Image class holds an array of unsigned chars, obtained using OpenIL.
Relevant code:
loaded = ilLoadImage(filename.c_str());
ilConvertImage(IL_RGBA, IL_UNSIGNED_BYTE);
unsigned char *bytes = ilGetData();
//NewImage is an instance of an Image. This is returned and passed to the above function.
NewImage->data = bytes;
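GetTextureID() isn't shown in the question, but for the alpha to survive, the upload inside it has to keep all four channels. A sketch of what that might look like; the Image fields width, height, and data are assumptions:

// Sketch of an RGBA upload such as GetTextureID might perform.
GLuint UploadRGBATexture(const Image *img)
{
    GLuint id = 0;
    glGenTextures(1, &id);
    glBindTexture(GL_TEXTURE_2D, id);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // GL_RGBA for both the internal format and the pixel format:
    // uploading as GL_RGB here would silently discard the alpha channel.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, img->width, img->height,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, img->data);
    return id;
}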
Before drawing anything transparent you should call:
glDepthMask(false);
And then afterwards:
glDepthMask(true);
Also, all transparent objects must be drawn after all opaque ones.
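Put together, the suggested frame structure looks something like this sketch; DrawOpaqueObjects and DrawTransparentObjectsBackToFront are placeholders for your own passes:

void DrawScene()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    DrawOpaqueObjects();                   // depth writes on, as usual

    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glDepthMask(GL_FALSE);                 // still depth-tested, just not written
    DrawTransparentObjectsBackToFront();   // sorted far to near
    glDepthMask(GL_TRUE);
    glDisable(GL_BLEND);
}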
glClearColor(1.f, 0.f, 0.f, 0.f);
I'd assume those arguments are in the default RGBA order, so you're setting red to 1.0; that would explain the background color you're seeing.
How about a better explanation of what you're trying to accomplish?
Try:
glBlendFunc (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
Edit: I know what that looks like. It looks like you are drawing the blue square (which has a Z-order position placing it behind the cursor) AFTER the cursor. You have to draw things in the correct, back-to-front Z-order when alpha blending, or you get errors like the ones you are seeing.

Trying to switch a texture when player dies (OpenGL + C++)

I'm creating a 2D game, and when the player dies I want the texture to switch to another one (to show an explosion). I also want the game to pause for a second or two so the user can see that the texture has changed.
My textures are loading correctly, because I can apply them to a shape, and I can see the explosion texture if I, say, swap it with the player's original texture.
I think it must be rendering for only one frame and then disappearing, or something like that. Here is the code.
void Player::die() {
    if (Player::lives > 0) {
        glPushMatrix();
        glEnable(GL_TEXTURE_2D);
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
        glBindTexture(GL_TEXTURE_2D, explosionTex);
        glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
        glTranslatef(200, 200, 0.0);
        glRotatef(heading, 0, 0, 1);
        glColor3f(1.0, 0.0, 0.0);
        glBegin(GL_POLYGON);
        glTexCoord2f(0.0, 1.0); glVertex2f(-40,  40);
        glTexCoord2f(0.0, 0.0); glVertex2f(-40, -40);
        glTexCoord2f(1.0, 0.0); glVertex2f( 40, -40);
        glTexCoord2f(1.0, 1.0); glVertex2f( 40,  40);
        glEnd();
        glDisable(GL_BLEND);
        glDisable(GL_TEXTURE_2D);
        glPopMatrix();
        Sleep(1000);
        *xscroll = 0;
        *yscroll = 0;
        Player::lives--;
        Player::XPos = 0;
        Player::YPos = 0;
        Player::heading = 0;
        Player::speed = 0;
    }
}
How can I get it to switch the texture, display it, and then sleep for a time?
You need to swap your buffers before you Sleep() if you want to see anything.
More generally, replace the Sleep() with an ExplodeStart timestamp that you set to CurrentTimeInMilliseconds(). Then each time through your render loop, check whether CurrentTimeInMilliseconds() - ExplodeStart > 1000. If it is, switch back to your regular player texture.
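A sketch of that pattern using GLUT's millisecond clock; exploding, explodeStart, and playerTex are illustrative members, not from the question's Player class:

void Player::die() {
    if (lives > 0 && !exploding) {
        exploding = true;
        explodeStart = glutGet(GLUT_ELAPSED_TIME);  // start the timer
        lives--;
    }
}

void Player::draw() {
    // After one second, clear the explosion state and reset the player.
    if (exploding && glutGet(GLUT_ELAPSED_TIME) - explodeStart > 1000) {
        exploding = false;
        XPos = YPos = 0;
        heading = speed = 0;
    }
    glBindTexture(GL_TEXTURE_2D, exploding ? explosionTex : playerTex);
    // ... draw the quad as before; the main loop keeps running and swaps
    // buffers every frame, so the explosion stays visible for the full second.
}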