Fully transparent object in OpenGL - C++

I need to create a completely transparent surface that passes through the origin of the axes and is always parallel to the screen.
I'm trying to use this code (in C++), but the result is something like a 50% blend rather than full transparency:
glPushMatrix();
glLoadIdentity();
glBlendFunc(GL_ONE, GL_ONE); // equivalent to glBlendFunc(1, 1): additive blending
glBegin(GL_QUADS);
glVertex3f(-100, -100, -0.003814);
glVertex3f(100, -100, -0.003814);
glVertex3f(100, 100, -0.003814);
glVertex3f(-100, 100, -0.003814);
glEnd();
glPopMatrix();
Additional information: I need this transparent surface so I can get a point on it with the function gluUnProject(winX, winY, winZ, model, proj, view, &ox, &oy, &oz);

If you just want to fill the depth buffer, you can disable color writes via glColorMask():
glColorMask( GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE );
drawQuad();
glColorMask( GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE );
doUnproject();

Additional information: I need this transparent surface so I can get a point on it with the function gluUnProject(winX, winY, winZ, model, proj, view, &ox, &oy, &oz);
No you don't. OpenGL is not a scene graph, and there's no obligation to use values obtained from OpenGL with gluUnProject. You can simply pass in whatever you want for winZ: use 0 if you're interested in the near clipping plane and 1 for the far clipping plane. You can also perfectly well calculate the on-screen position of every point yourself. OpenGL is not magic; the transformations it performs are well documented and easy to reproduce.
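As a minimal sketch of that idea (winX and winY are assumed to come from the mouse position, as in the question), you can unproject against any depth you like without ever reading the depth buffer:
GLdouble modelview[16], projection[16];
GLint viewport[4];
glGetDoublev(GL_MODELVIEW_MATRIX, modelview);
glGetDoublev(GL_PROJECTION_MATRIX, projection);
glGetIntegerv(GL_VIEWPORT, viewport);
GLdouble ox, oy, oz;
// winZ = 0.0 unprojects onto the near clipping plane, winZ = 1.0 onto the far plane;
// any value in between picks a point along the viewing ray under the cursor.
gluUnProject(winX, winY, 0.0, modelview, projection, viewport, &ox, &oy, &oz);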

The blend function you are using is known as additive blending:
Final Color = (SourceColor * 1.0) + (DestinationColor * 1.0).
This is anything but fully transparent, unless the framebuffer is already fully white at the location you are blending (DestinationColor == (1.0, 1.0, 1.0)). And even then, this only holds for a fixed-point render target, because values are clamped to [0.0, 1.0] after blending by default.
Instead, consider glBlendFunc (GL_ZERO, GL_ONE):
Final Color = (SourceColor * 0.0) + (DestinationColor * 1.0).
Final Color = DestinationColor
Final Color = Original Color
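For completeness, a minimal sketch of this blending approach (drawQuad() is the same placeholder used in the first answer):
glEnable(GL_BLEND);
glBlendFunc(GL_ZERO, GL_ONE); // the framebuffer color is left untouched
drawQuad();                   // depth values are still written if depth writes are on
glDisable(GL_BLEND);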
That said, you will probably get better performance if you simply use a color mask to disable color writes, as genpfault suggested. Using a blending function to discard the color of your surface is needlessly complicated.


OpenGL: Draw color with mask on a background image

I need to draw a color with some shape onto an image. My thought was to supply a mask with the given shape (say, hearts), then fill the rectangular area with the color and use the mask to render it over the final image.
(Images: the color rectangle, masked by the heart shape, plus the background image, equals the final result.)
The rectangle color is decided at runtime - that's why I don't draw the colored heart on my own.
The black heart image is transparent (alpha is 0) anywhere except for the heart (alpha is 255).
I tried using:
glBlendFunc(GL_DST_ALPHA, GL_ZERO)
where the source is the solid color, and the destination is the alpha channel image.
I used https://www.andersriggelsen.dk/glblendfunc.php for help.
However, the bottom image (the tree) is being used as the DST image...
Seems like I need an intermediate buffer to first render the blue heart, then do a second render onto the tree.
What is the way to do it?
If the tree is drawn first, it will appear in the destination color and change your final result.
You are right: you need an intermediate buffer that stores which part of the quad should be rendered, in the shape of your heart.
OpenGL provides the perfect tool for this: the stencil buffer.
In your case I would render the scene as usual (the tree),
then enable the stencil buffer with glEnable(GL_STENCIL_TEST);,
disable writes to the color buffer with glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);,
and draw only the heart with the appropriate stencil mask, glStencilMask(0xFF);.
Then draw your colored quad with the stencil test enabled, using glStencilFunc(GL_EQUAL, 1, 0xFF); see the sketch below.
Don't forget to clear your stencil buffer each frame: glClear(GL_STENCIL_BUFFER_BIT);
You can find some good tutorials online: https://learnopengl.com/Advanced-OpenGL/Stencil-testing
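A minimal sketch of those steps; drawTree(), drawHeartMask() and drawColoredQuad() are placeholders, and the glStencilOp/glAlphaFunc settings for the mask pass are assumptions the answer above does not spell out:
glClear(GL_STENCIL_BUFFER_BIT);                      // clear the stencil each frame
drawTree();                                          // render the scene as usual

glEnable(GL_STENCIL_TEST);
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE); // stencil-only pass, no color writes
glStencilMask(0xFF);
glStencilFunc(GL_ALWAYS, 1, 0xFF);                   // always pass, write 1...
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);           // ...wherever a fragment lands
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.5f);                       // discard the transparent texels
drawHeartMask();                                     // only the heart shape marks the stencil
glDisable(GL_ALPHA_TEST);

glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glStencilMask(0x00);                                 // don't modify the stencil any further
glStencilFunc(GL_EQUAL, 1, 0xFF);                    // draw only where the stencil is 1
drawColoredQuad();
glDisable(GL_STENCIL_TEST);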
Here's a very simple way to do this in legacy OpenGL (which I assume you're using) that does not require a stencil buffer:
public void render() {
    glClearColor(0, 0, 0, 0);
    glClear(GL_COLOR_BUFFER_BIT);
    glLoadIdentity();
    glOrtho(0, 1, 1, 0, 1, -1);
    glEnable(GL_TEXTURE_2D);
    glEnable(GL_BLEND);
    // Regular alpha blending
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glEnable(GL_ALPHA_TEST);
    // Discard transparent pixels. Not strictly necessary, but good for performance in this case.
    glAlphaFunc(GL_GREATER, 0.01f);
    glColor3f(1, 1, 1);
    glBindTexture(GL_TEXTURE_2D, treeTexture);
    drawQuad();
    glColor3f(1, 0, 1); // Your color goes here
    glBindTexture(GL_TEXTURE_2D, maskTexture);
    drawQuad();
}
private void drawQuad() {
    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(0, 0);
    glTexCoord2f(0, 1); glVertex2f(0, 1);
    glTexCoord2f(1, 1); glVertex2f(1, 1);
    glTexCoord2f(1, 0); glVertex2f(1, 0);
    glEnd();
}
Here, treeTexture is the tree texture, and maskTexture is the white-on-transparent heart shape.
Result: (image: the tree with the colored heart rendered on top)
The principle is that in the legacy OpenGL pipeline, you can use glColor* before glVertex* to specify a color that the texture color (in this case white or transparent) is multiplied by component-wise.
Note that with this method you can easily render multiple colored shapes in multiple different colors without needing any (relatively expensive) clears of the stencil buffer. I suggest cropping the mask texture to the boundaries of the actual mask shape, to save the GPU the small effort of discarding all the transparent fragments.

How to apply a texture globally on a set of objects from a particular position?

I know very little of OpenGL.
I want to apply a 2D texture globally onto the scene in OpenGL 3.1, as in the figure in this link, as if the texture were viewed from the point P.
Given that the texture projection parameters (e.g. the focal length f, the position P, etc.) are all known, how can I do this in OpenGL so that I can then view the scene from another position?
N.B. The lighting and texture pasting need to be of the form GL_MODULATE.
With the fixed pipeline, this can be achieved by applying a projection matrix to the texture coordinates.
Aside from the very commonly used values of GL_MODELVIEW and GL_PROJECTION, glMatrixMode() also supports a value of GL_TEXTURE, which exposes a mechanism for applying arbitrary transformations to texture coordinates.
In this case, you can use the original world coordinates as the input texture coordinates. So if you used:
glVertexPointer(3, GL_FLOAT, 0, coord);
you use the same for the texture coordinates:
glTexCoordPointer(3, GL_FLOAT, 0, coord);
Then you set up view and projection transformations very similarly to the way you do for the primary view/projection. Keep in mind that transformations are specified in the reverse order of being applied. So the sequence would be something like this:
glMatrixMode(GL_TEXTURE);
glLoadIdentity(); // start from a known state in case a texture matrix was set before
// Projection maps to the [-1.0, 1.0] range, while texture coordinates are in
// the [0.0, 1.0] range. Scale/translate to adjust for this.
glScalef(0.5f, 0.5f, 0.5f);
glTranslatef(1.0f, 1.0f, 1.0f);
// Apply projection.
gluPerspective(...);
// Apply view transformation, using P as the eye point.
gluLookAt(...);
glMatrixMode(GL_MODELVIEW);
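Filled in with hypothetical projector parameters (fovY, aspect, the eye point P, and a target point are all illustrative assumptions, with fovY derived from the focal length f), the sequence might look like this; the glTexEnvi call covers the GL_MODULATE requirement:
glMatrixMode(GL_TEXTURE);
glLoadIdentity();
glTranslatef(0.5f, 0.5f, 0.5f); // same [-1,1] -> [0,1] mapping, written the other way around
glScalef(0.5f, 0.5f, 0.5f);
gluPerspective(fovY, aspect, 0.1, 100.0); // fovY/aspect derived from f and the image size
gluLookAt(P.x, P.y, P.z, target.x, target.y, target.z, 0.0, 1.0, 0.0);
glMatrixMode(GL_MODELVIEW);

glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); // lighting modulates the texture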

Drawing a primitive (GL_QUADS) on top of a 2D texture - no quad rendered, texture colour changed

I am trying to draw a 2D scene with a texture as the background and then (as the program flows and does computations) draw different primitives on the "canvas". As a test case I wanted to draw a blue quad on the background image.
I have looked at several resources and SO questions to try to get the information I need to accomplish the task (e.g. this tutorial for first primitive rendering, the SOIL "example" for texture loading).
My understanding was that the texture would be drawn at Z=0, and the quad as well. The quad would thus "cover" a portion of the texture - be drawn on it, which is what I want. Instead, the result of my display function is my initial texture in black/blue colour, and not my texture (in its original colour) with a blue quad drawn on it. This is the display function code:
void display (void) {
    glClearColor(0.0, 0.0, 0.0, 1.0);
    glClear(GL_COLOR_BUFFER_BIT);
    // background render
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0f, 1024.0, 512.0, 0.0, 0.0, 1.f); // window size is 1024x512
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, texture);
    glBegin(GL_QUADS);
    glTexCoord2d(0.0,0.0); glVertex2d(0.0,0.0);
    glTexCoord2d(1.0,0.0); glVertex2d(1024.0,0.0);
    glTexCoord2d(1.0,1.0); glVertex2d(1024.0,512.0);
    glTexCoord2d(0.0,1.0); glVertex2d(0.0,512.0);
    glEnd(); // here I get the texture properly displayed in the window
    glDisable(GL_TEXTURE_2D);
    // foreground render
    glLoadIdentity();
    gluPerspective(60, (GLfloat)winWidth / (GLfloat)winHeight, 1.0, 100.0);
    glMatrixMode(GL_MODELVIEW);
    glColor3f(0.0, 0.0, 1.0);
    glBegin(GL_QUADS);
    glVertex2d(400.0,100.0);
    glVertex2d(400.0,500.0);
    glVertex2d(700.0,100.0);
    glVertex2d(700.0,500.0);
    glEnd(); // now instead of a rendered blue quad I get my texture coloured in blue
    glutSwapBuffers();
}
I have already tried with many modifications, but since I am just beginning with OpenGL and don't yet understand a lot of it, my attempts failed. For example, I tried with pushing and popping matrices before and after drawing the quad, clearing the depth buffer, changing parameters in gluPerspective etc.
How do I have to modify my code so it will render the quad properly on top of the background texture image of my 2D scene ? Being a beginner, extra explanations of the modifications ( as well as mistakes in the present code ) and principles in general will be greatly appreciated.
EDIT - after the answer by Reto Koradi:
I have tried to follow the instructions, and the modified code now looks like this:
// foreground render
glLoadIdentity();
glMatrixMode(GL_MODELVIEW);
glOrtho(0.0f, 1024.0, 512.0, 0.0, 0.0, 1.f);
glColor3f(0.0, 0.0, 1.0);
glBegin (GL_QUADS); // same from here on
Now I can see the blue "quad", but it is not displayed properly; it looks something like this.
Besides that, the whole scene is flashing really quickly.
What do I have to change in my code so that the quad is displayed properly and the screen doesn't flash?
You are setting up a perspective transformation before rendering the blue quad:
glLoadIdentity();
gluPerspective (60, (GLfloat)winWidth / (GLfloat)winHeight, 1.0, 100.0);
The way gluPerspective() is defined, it sets up a transformation that looks from the origin down the negative z-axis, with the near and far values specifying the distance range that is visible. With this transformation, z-values from -1.0 to -100.0 are visible, which does not include your quad at z = 0.0.
If you want to draw your quad in 2D coordinate space, the easiest solution is to not use gluPerspective() at all. Just use a glOrtho() type transformation like you did for your initial drawing.
If you want perspective, you will need a GL_MODELVIEW transformation as well. You can start with a translation in the negative z-direction, within a range of 1.0 to 100.0. You may have to adjust your coordinates for the different coordinate system as well, or use additional transformations that also translate in xy-direction, and possibly scale.
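For the perspective route, a minimal sketch (the -10.0 translation is an arbitrary distance inside the visible [1.0, 100.0] range):
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
gluPerspective(60, (GLfloat)winWidth / (GLfloat)winHeight, 1.0, 100.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(0.0f, 0.0f, -10.0f); // move the geometry into the visible depth range
// ... draw the quad here, with coordinates adjusted for this coordinate system ...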
The code also has the coordinates in the wrong order for drawing the blue quad. Either change the draw call to GL_TRIANGLE_STRIP (recommended, because it at least gets you one step closer to using features that are not deprecated), or swap the order of the last two vertices:
glBegin(GL_QUADS);
glVertex2d(400.0, 100.0);
glVertex2d(400.0, 500.0);
glVertex2d(700.0, 500.0);
glVertex2d(700.0, 100.0);
glEnd();

What could globally affect texture coordinate values in OpenGL?

I'm writing a plugin for an application called Autodesk MotionBuilder, which has an OpenGL renderer, and I'm trying to render textured geometry into the scene. I have a window with a 3D View embedded in it, and every time my window is rendered, this is (in a nutshell) what happens:
I tell the renderer that I'm about to draw into a region with a given size
I tell the renderer to draw the MotionBuilder scene in that region
I draw some additional stuff into and/or on top of the scene
The challenge here is that I'm inheriting some arbitrary OpenGL state from MotionBuilder's renderer, which varies depending on what it's drawing and what's present in the scene. I've been dealing with this fine so far, but there's one thing I can't figure out. The way that OpenGL interprets my UV coordinates seems to change based on whatever MotionBuilder is doing behind my back.
Here's my rendering code. If there's no textured geometry in the scene, meaning MotionBuilder hasn't yet fiddled with any texture-related attributes, it works as expected.
// Tell MotionBuilder's renderer to draw the scene
RenderScene();
// Clear whatever arbitrary state MotionBuilder left for us
InitializeAttributes(); // includes glPushAttrib(GL_ALL_ATTRIB_BITS)
InitializePerspective(); // projects into the scene / loads matrices
// Enable texturing, bind to our texture, and draw a triangle into the scene
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, mTexture);
glBegin(GL_TRIANGLES);
glColor4f(1.0, 1.0, 1.0, 0.5f);
glTexCoord2f(1.0, 0.0); glVertex3f(128.0, 0.0, 0.0);
glTexCoord2f(0.0, 1.0); glVertex3f( 0.0, 128.0, 0.0);
glTexCoord2f(0.0, 0.0); glVertex3f( 0.0, 0.0, 0.0);
glEnd();
// Clean up so we don't confound MotionBuilder's initial expectations
RestoreState(); // includes glPopAttrib()
Now, if I bring in some meshes with textures, something odd happens. My texture coordinates get scaled way up. Here's a before and after:
(Images: before and after screenshots - source: awforsythe.com)
As you can see from the close-up on the right, when MotionBuilder is asked to render a texture whose file it can't find, it instead loads this small question mark texture and tiles it across the geometry. My only hypothesis is that MotionBuilder is changing some global texture coordinate scalar so that, for example, glTexCoord2f(0.5, 1.0) will instead be interpreted as if it were (50.0, 100.0). Is there such a feature in OpenGL? Any idea what I need to modify in order to preserve my texture coordinates as I've entered them?
Since typing the above, and after doing a bit of research, I have discovered that there's a GL_TEXTURE matrix used for exactly this. Neat! And indeed, when I get the value of this matrix initially, it's the good ol' identity matrix:
1 0 0 0
0 1 0 0
0 0 1 0
0 0 0 1
When I check it again after MotionBuilder fudges up my texture coordinates:
16 0 0 0
0 16 0 0
0 0 1 0
0 0 0 1
How telling! But here's a slight problem: if I try to explicitly set the texture matrix before doing my own drawing, regardless of what MotionBuilder is doing, it seems like my texture coordinates have no effect and it simply samples the lower-left corner of the texture (0.0, 0.0) for every vertex.
Here's the attempted fix, placed after RenderScene in the code posted above:
glMatrixMode(GL_TEXTURE);
glLoadIdentity();
I can verify that the value of GL_TEXTURE_MATRIX is now the identity matrix, but no matter what coordinates I specify in glTexCoord2f, it's always drawn as if the coordinates for each vertex were (0.0, 0.0):
(Image: every vertex sampling the texture's lower-left corner - source: awforsythe.com)
Any idea what else could be affecting how OpenGL interprets my texture coordinates?
Aha! These calls:
glMatrixMode(GL_TEXTURE);
glLoadIdentity();
...have to be made after GL_TEXTURE_2D is enabled.
...should be followed by setting the matrix mode back to GL_MODELVIEW. It turns out that some functions I was calling immediately after resetting the texture matrix (most likely gluPerspective, which multiplies whatever matrix is current; glViewport does not touch the matrix stack) were still operating on the texture matrix, causing my texture coordinates to be transformed in unexpected ways.
I think I've got it now.
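A minimal sketch of the resulting order of operations (RenderScene() and InitializePerspective() are the helpers from the question):
RenderScene();              // MotionBuilder may leave a scaled GL_TEXTURE matrix behind

glEnable(GL_TEXTURE_2D);    // enable texturing first
glMatrixMode(GL_TEXTURE);
glLoadIdentity();           // wipe out whatever texture matrix was left over
glMatrixMode(GL_MODELVIEW); // restore before any viewport/projection setup

InitializePerspective();    // safe now: it no longer touches the texture matrix
// ... bind the texture and draw the triangle as in the question ...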

Reading depth value of transparent plane with glReadPixels and gluUnProject

I am trying to create a billiards simulation and have been using glReadPixels along with gluUnProject to project my mouse pointer into the scene.
This works fine if the mouse is pointing at an object in the scene (the table, for instance), but when it points at the background, the glReadPixels call returns 1.0 and messes up the gluUnProject result.
I'm trying to figure out how to draw a transparent plane at the same level of the table so that no matter where I point the mouse in my scene, it will get the depth as if it were pointing onto the same plane as the table.
If I draw a transparent quad without glAlphaFunc(GL_GREATER, 0.01f);, the quad is drawn as white and the depth testing works as I planned, but when I add the alphaFunc call to make the quad transparent, the depth goes back to what it was before. From what I've seen, glReadPixels reads the pixels from the framebuffer, so this makes sense; I'm just wondering how I can work around it.
I've also tried reversing the winding on the quad so that it wouldn't be visible from above, but this has the same problem of glReadPixels taking its measurements from the framebuffer.
In short, how do I get glReadPixels to read its depth component from an object without drawing that object to the screen?
Here are the calls to glReadPixels and gluUnProject:
winX = (float)x;
winY = (float)viewport[3] - (float)y;
glReadPixels(x, (int) winY, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &winZ);
gluUnProject(winX, winY, winZ, modelview, projection, viewport, &posX, &posY, &posZ);
That is because alpha testing fully discards fragments that don't pass the test, so they never reach the depth buffer.
However, gluUnProject should not choke on a depth buffer value of 1.0; what you get back for a depth value of 1.0 is a point on the far clipping plane.
You can completely disable writes to the color buffer using
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
If you draw something after this, it will still get depth tested (if that's turned on) and its depth values will still be written to the depth buffer (if that's turned on) but none of its fragments will be visible in the color buffer.
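Putting that together with the question's readback code, a minimal sketch (drawTablePlane() is a placeholder for a quad at the table's height):
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE); // invisible, depth-only pass
drawTablePlane(); // fills the depth buffer wherever the plane covers the screen
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

winX = (float)x;
winY = (float)viewport[3] - (float)y;
glReadPixels(x, (int)winY, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &winZ);
gluUnProject(winX, winY, winZ, modelview, projection, viewport, &posX, &posY, &posZ);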