Keeping original colors - opengl

I'm trying to place a texture (with alpha) on top of another texture in OpenGL. The drawing itself works, but not the way I wanted: the color of my nearest image is strongly affected by the background image (the furthest texture), so it appears orange instead of red.
Does anyone know a way of blending (or getting rid of alpha) that will resolve this issue?
Blending initialization:
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE);
Drawing scene:
glClear(GL_COLOR_BUFFER_BIT);
glLoadIdentity();
//furthest image (background)
glBindTexture(GL_TEXTURE_2D, texture[1]);
glBegin(GL_QUADS);
glTexCoord2f(0, 0); glVertex2f(0, 0);
glTexCoord2f(1, 0); glVertex2f(500, 0);
glTexCoord2f(1, 1); glVertex2f(500, 500);
glTexCoord2f(0, 1); glVertex2f(0, 500);
glEnd();
//nearest image (appears orange, should be red)
glBindTexture(GL_TEXTURE_2D, texture[0]);
glBegin(GL_QUADS);
glTexCoord2f(0, 0); glVertex2f(100, 100);
glTexCoord2f(1, 0); glVertex2f(300, 100);
glTexCoord2f(1, 1); glVertex2f(300, 300);
glTexCoord2f(0, 1); glVertex2f(100, 300);
glEnd();
glutSwapBuffers();
EDIT.
Here's an image depicting how it looks:
Here's an image of how it should look:

I believe what you want is 'alpha testing', not blending. See
glEnable(GL_ALPHA_TEST)
glAlphaFunc()
If you want to leave blending enabled, you should use
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
This uses only the source color where the source alpha is 1 and blends proportionally elsewhere. Your current function simply adds the source color to the background color.
If you don't want any mixing of colors, then alpha testing is the better way to go, as it uses fewer resources than blending.
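For example (a minimal sketch, assuming the overlay texture has a hard on/off alpha channel; the 0.5 threshold is arbitrary):
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.5f);   // discard fragments whose alpha is 0.5 or below
// ... draw the textured overlay as before ...
glDisable(GL_ALPHA_TEST);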

This blend func
glBlendFunc(GL_SRC_ALPHA, GL_ONE);
is the cause of your problems. The GL_ONE destination factor means that whatever is already in the framebuffer is added to the incoming colour, regardless of the alpha value.
In your case the red texture is added to the greenish background, and since red + greenish = orange, that is what you get.
What you want is to mask the previous content of the destination framebuffer with your alpha channel, which is done using
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
Also remember that OpenGL state is meant to be set and reset on demand, so when drawing other textures you may need different settings for blending and the blend func.
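For example, in the scene from the question that could look like this (a sketch):
glDisable(GL_BLEND);                      // opaque background, no blending needed
glBindTexture(GL_TEXTURE_2D, texture[1]);
// ... draw the background quad ...
glEnable(GL_BLEND);                       // blend only the transparent overlay
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBindTexture(GL_TEXTURE_2D, texture[0]);
// ... draw the overlay quad ...
glDisable(GL_BLEND);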

With the help of you all, I managed to resolve the issue by:
re-saving my texture as PNG (instead of BMP)
changing the blending function to glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
Thanks to all contributors :)

Related

projecting image with opengl and Antialiasing

In my work I overlay part of a captured frame with an image. I open my webcam with OpenCV, transform the captured frame into a texture, and display it in a GLUT window. I then overlay part of this texture with this image:
I do this in real time, and the result is:
As you can see, the edges of the projected image are jagged. I think it is an aliasing problem, but I don't know how to do antialiasing with OpenGL. I've searched the web but didn't find a good solution to my problem.
In my "calculate" function I transform the Mat image into a texture using the following code:
GLvoid calculate(){
...
...
cvtColor(image, image, CV_BGR2RGB);
glHint(GL_PERSPECTIVE_CORRECTION_HINT,GL_NICEST);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glBindTexture(GL_TEXTURE_2D, textures[1]);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);
//glTexImage2D(GL_TEXTURE_2D, 0, 4,image.cols, image.rows, 0, GL_RGB,GL_UNSIGNED_BYTE, image.data);
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, image.cols, image.rows, GL_RGB, GL_UNSIGNED_BYTE, image.data);
}
and I show the result using this code:
GLvoid Show(void) {
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_TEXTURE_2D);
// Matrice di proiezione
glMatrixMode (GL_PROJECTION);
glLoadIdentity();
gluOrtho2D(0, WIDTH, HEIGHT, 0);
// Matrice model view
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
...
...
glBindTexture(GL_TEXTURE_2D, textures[1]);
glBegin(GL_QUADS);
glTexCoord2f(0.0f, 0.0f); glVertex2f((GLfloat)((coord[3].x)),(GLfloat)(coord[3].y));
glTexCoord2f(1.0f, 0.0f); glVertex2f((GLfloat)((coord[0].x)),(GLfloat)(coord[0].y));
glTexCoord2f(1.0f, 1.0f); glVertex2f((GLfloat)((coord[1].x)),(GLfloat)(coord[1].y));
glTexCoord2f(0.0f, 1.0f); glVertex2f((GLfloat)((coord[2].x)),(GLfloat)(coord[2].y));
glEnd();
}
glFlush();
glutSwapBuffers();
}
In initialization function I write this:
GLvoid Init() {
glGenTextures(2, textures);
glClearColor (0.0, 0.0, 0.0, 0.0);
glEnable (GL_POLYGON_SMOOTH);
glHint (GL_POLYGON_SMOOTH_HINT, GL_DONT_CARE);
glDisable(GL_DEPTH_TEST);
}
but it doesn't work...
I work on Win7 x64, with OpenGL 4.0 and GLUT 3.7. My video card is an NVIDIA GeForce GT 630. I also enabled antialiasing from the NVIDIA control panel, but nothing changed.
Does anyone know how to help me?
I solved my problem! I used GLFW instead of GLUT, as @Michael IV suggested!
In order to do antialiasing with GLFW I used this line of code:
glfwOpenWindowHint(GLFW_FSAA_SAMPLES,4);
and the result now is very good, as you can see in the following image.
Thanks for your help!
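For context, the surrounding GLFW 2.x setup looks roughly like this (a sketch; WIDTH and HEIGHT are the window size from the code above, and in GLFW 3 the equivalent hint is glfwWindowHint(GLFW_SAMPLES, 4)):
glfwInit();
// Request a 4x multisampled default framebuffer before the window is opened.
glfwOpenWindowHint(GLFW_FSAA_SAMPLES, 4);
glfwOpenWindow(WIDTH, HEIGHT, 8, 8, 8, 8, 24, 0, GLFW_WINDOW);
glEnable(GL_MULTISAMPLE);   // normally on by default when an MSAA framebuffer exists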
First, I wonder why you are using OpenGL 4.0 to work with the fixed (deprecated) pipeline...
But let's get to the problem. What you need is MSAA. I am not sure that enabling it via the control panel will always do the trick; usually it is done inside the code.
Unfortunately for you, you selected GLUT, which has no option to set the hardware MSAA level. If you want to be able to do so, switch to GLFW. Another option is to do it manually, but that implies using custom FBOs. In such a scenario you can create an FBO with an MSAA texture attachment, setting the MSAA level for the texture (you can also apply custom multisampling algorithms in the fragment shader if you wish).
Here is a thread on this topic.
GLFW allows you to specify the MSAA level on window setup. See the related API.
MSAA does degrade performance, but how much depends on your hardware and probably the OpenGL drivers.
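If you go the manual route, the core of an MSAA FBO looks roughly like this (a sketch, assuming an OpenGL 3.0+ context; the sample count and WIDTH/HEIGHT are placeholders):
GLuint fbo, colorRb;
glGenFramebuffers(1, &fbo);
glGenRenderbuffers(1, &colorRb);
// Multisampled color attachment.
glBindRenderbuffer(GL_RENDERBUFFER, colorRb);
glRenderbufferStorageMultisample(GL_RENDERBUFFER, 4, GL_RGBA8, WIDTH, HEIGHT);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorRb);
// ... render the scene into fbo ...
// Resolve the samples into the default framebuffer.
glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
glBlitFramebuffer(0, 0, WIDTH, HEIGHT, 0, 0, WIDTH, HEIGHT, GL_COLOR_BUFFER_BIT, GL_NEAREST);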

Color dodge in OpenGL

I need to render an image on top of a background in OpenGL and I'm trying to get the same result as the "Color Dodge" blend mode in Photoshop, but I'm not able to do it.
Right now I'm doing:
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE);
// background
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, background);
glBegin(GL_TRIANGLE_STRIP);
glTexCoord2f(0.0, 0.0);
...
glEnd();
glDisable(GL_TEXTURE_2D);
// image
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, image);
glBegin(GL_TRIANGLE_STRIP);
glTexCoord2f(0.0, 0.0);
...
glEnd();
glDisable(GL_TEXTURE_2D);
The background is a TGA with no alpha channel. The image is a TGA with an alpha channel.
This renders the image with its alpha on the background, but way too bright.
I read that it should be as easy as:
glBlendFunc(GL_ONE, GL_ONE);
But the image, despite having an alpha channel, gets rendered as a white square.
Clearly I'm doing something wrong.
You're not going to be able to use blending to get the equivalent of the Photoshop "Color Dodge" effect. It's a more complicated mathematical function than can be expressed using standard blending logic. So you're going to have to come up with some programmatic blending methodology to make it work.
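For reference, "Color Dodge" is defined per channel as result = base / (1 - blend), while fixed-function blending can only compute src * srcFactor + dst * dstFactor, so the division has to happen in your own code (for example in a fragment shader). A minimal sketch of the formula itself (the helper name is just for illustration):
// Per-channel "Color Dodge": result = base / (1 - blend), clamped to 1.
float colorDodge(float base, float blend)
{
    if (blend >= 1.0f) return 1.0f;          // avoid division by zero
    float r = base / (1.0f - blend);
    return (r > 1.0f) ? 1.0f : r;
}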
There is a way to approximate color dodge with a GL blend func. It resembles the Photoshop version of that blend mode, only it comes out darker than Photoshop's "Color Dodge". You have to use this type of function:
glBlendFunc(GL_DST_COLOR, GL_ONE);
With these factors the framebuffer ends up holding dst * (src + 1), which stays at or below the true dodge value dst / (1 - src), hence the darker result.
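Dropped into the code from the question, that would look roughly like this (a sketch):
glEnable(GL_BLEND);
glBlendFunc(GL_DST_COLOR, GL_ONE);   // approximate dodge for the overlay
glBindTexture(GL_TEXTURE_2D, image);
// ... draw the overlay quad ...
glDisable(GL_BLEND);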

how to use glDrawPixels to render a picture as a background

Here is the thing: I want to load a picture as a background that fills the whole viewport. This background should always face the camera, no matter where the camera is pointing.
My first thought was naturally to use a texture as the background; my code is below:
glMatrixMode(GL_PROJECTION);
glPushMatrix();
glLoadIdentity();
glOrtho(0,1,0,1,0,1);
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glLoadIdentity();
glBindTexture(GL_TEXTURE_2D, myimage.GetID());
glEnable(GL_TEXTURE_2D);
glBegin(GL_QUADS);
glTexCoord2f(0, 0); glVertex2f(0, 0);
glTexCoord2f(1, 0); glVertex2f(1, 0);
glTexCoord2f(1, 1); glVertex2f(1, 1);
glTexCoord2f(0, 1); glVertex2f(0, 1);
glEnd();
glMatrixMode(GL_PROJECTION);
glPopMatrix();
glMatrixMode(GL_MODELVIEW);
glPopMatrix();
glDisable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, 0);
Believe me, myimage is a CIMAGE class that can load pictures into textures; it works well.
However, for some unknown reason, my application cannot load a texture onto the rectangle (I described this problem here). As a result, I can only see a rectangular frame around my viewport.
So I figured out another solution: I use glDrawPixels instead of a texture. My code is below:
glRasterPos2i(0, 0);
glDrawPixels(myimage.GetWidth(), myimage.GetHeight(), (myimage.GetBPP() == 24)?GL_RGB:GL_RGBA, GL_UNSIGNED_BYTE,
myimage.GetData());
The picture appeared! However, it didn't always face my camera; it only appears from a particular direction. You know, like an object in the scene, not a background that always faces the camera.
So, does anybody know how to use glDrawPixels to implement a background?
By the way, I don't think this background is an object placed in the 3D scene, so billboards may not be my solution. Again, this background fills the whole viewport and always faces the camera.
One of the reasons your texture loading might not work is that the texture might not have power-of-two dimensions. Try a square 256x256 texture (or the like) to see if this is the problem. Look here for more info on Rectangle Textures.
Coming back to your background issue, the right way to do this would be to:
1. Set up an orthographic projection/viewport that fills the entire screen.
glViewport(0,0,nw,nh);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0,1,0,1,0,1);
glMatrixMode(GL_MODELVIEW);
2. Disable depth testing (see the depth-state sketch after these steps).
3. Draw the fullscreen quad with the texture/texture rectangle you have loaded.
glBegin(GL_QUADS);
glTexCoord2f(0, 0); glVertex2f(0, 0);
glTexCoord2f(1, 0); glVertex2f(1, 0);
glTexCoord2f(1, 1); glVertex2f(1, 1);
glTexCoord2f(0, 1); glVertex2f(0, 1);
glEnd();
4. Set up your regular projection/modelview and continue.
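For step 2, a minimal sketch of the depth state around the background draw (restore it afterwards if the rest of the scene needs depth testing):
// The background must neither test nor write depth, or it may occlude the scene.
glDisable(GL_DEPTH_TEST);
glDepthMask(GL_FALSE);
// ... draw the fullscreen background quad here ...
// Restore depth state for the normal 3D pass.
glDepthMask(GL_TRUE);
glEnable(GL_DEPTH_TEST);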
Hope this helps!

Opengl: issue with masking

I'm working on creating a hole in a wall using masking in OpenGL. My code is quite simple, like this:
//Draw the mask
glEnable(GL_BLEND);
glBlendFunc(GL_DST_COLOR,GL_ZERO);
glBindTexture(GL_TEXTURE_2D, texture[3]);
glBegin(GL_QUADS);
glTexCoord2d(0,0); glVertex3f(-20,40,-20);
glTexCoord2d(0,1);glVertex3f(-20,40,40);
glTexCoord2d(1,1);glVertex3f(20,40,40);
glTexCoord2d(1,0);glVertex3f(20,40,-20);
glEnd();
//Draw the Texture
glBlendFunc(GL_ONE, GL_ONE);
glBindTexture(GL_TEXTURE_2D, texture[2]);
glBegin(GL_QUADS);
glTexCoord2d(0,0); glVertex3f(-20,40,-20);
glTexCoord2d(0,1);glVertex3f(-20,40,40);
glTexCoord2d(1,1);glVertex3f(20,40,40);
glTexCoord2d(1,0);glVertex3f(20,40,-20);
glEnd();
The problem is, I get the hole in the wall correctly, but it's semi-transparent: there is a kind of black shade over it, and I can also see through it.
Here's a photo for what I'm getting:
any suggestions?
SOLVED :D
It was a problem with the surfaces' normals; once I set the normals correctly, the black shade faded out.
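For anyone hitting the same thing: with lighting enabled, each quad needs a correct normal, e.g. something like this for the quads above (a sketch; flip the sign if the camera sits on the other side of the wall):
// The quad lies in the y = 40 plane, so its normal points along the y axis.
glNormal3f(0.0f, -1.0f, 0.0f);
glBegin(GL_QUADS);
glTexCoord2d(0,0); glVertex3f(-20,40,-20);
glTexCoord2d(0,1); glVertex3f(-20,40,40);
glTexCoord2d(1,1); glVertex3f(20,40,40);
glTexCoord2d(1,0); glVertex3f(20,40,-20);
glEnd();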

OpenGl texture mapping blocking colours on FreeType?

I'm using FreeType in order to allow fonts to be used in OpenGL. However, I'm having a problem where I cannot change the font colour whenever I do texture mapping. No matter what I select using glColor3f it will just come out white. The texture works fine.
glClear(GL_COLOR_BUFFER_BIT);
glLoadIdentity();
glColor3f(0.5,0.0,0.5);
glPushMatrix();
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_TEXTURE_2D);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
glBindTexture(GL_TEXTURE_2D, texName);
glBegin(GL_POLYGON);
glTexCoord2f(0,1); glVertex2f(-16,-16);
glTexCoord2f(0,0); glVertex2f(-16,16);
glTexCoord2f(1,0); glVertex2f(16,16);
glTexCoord2f(1,1); glVertex2f(16,-16);
glEnd();
glDisable(GL_TEXTURE_2D);
glDisable(GL_BLEND);
glPopMatrix();
glColor3f(1,0,0);
print(our_font, -300+screenWidth/2.0, screenHeight/2.0, "fifty two - %7.2f", spin);
This is the problem code. I can confirm that drawing a polygon beneath this code will indeed make it red. The text is not changing to red, though, which it should; if you remove the texture mapping above, it turns red again. I can only think it is a problem with enabling and disabling and that I've forgotten to do something...?
Fixed it. Just after disabling texturing, I had forgotten to set the texture environment back to modulate:
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
Adding this after disabling texturing/blending fixes the problem.
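In context, the end of the textured draw would then look something like this (a sketch based on the code above):
glDisable(GL_TEXTURE_2D);
glDisable(GL_BLEND);
// GL_REPLACE makes the texture color override glColor3f, so switch back to
// GL_MODULATE before the font is drawn; otherwise the text stays white.
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glPopMatrix();
glColor3f(1, 0, 0);
print(our_font, -300 + screenWidth / 2.0, screenHeight / 2.0, "fifty two - %7.2f", spin);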