Objects shake when rotating - opengl

I'm facing a problem in my OpenGL code.
I'm trying to build a house and rotate it 360°. For simplicity, assume the house has a front wall with a window and a door, and a back wall.
I'm using the depth buffer (GL_DEPTH_TEST) so that the back wall is hidden when viewing the front wall, and vice versa, but when I rotate the house the door and window start to shake and get distorted.
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glColor3f(0.0, 0.0, 0.0);
glLoadIdentity();
gluLookAt(0.0, 0.0, 40.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0);
glRotatef(angle, 0.0, 1.0, 0.0);
glEnable(GL_DEPTH_TEST);
// door (front of the house, z = 0)
glColor3f(0.0, 0.0, 0.0);
glBegin(GL_POLYGON);
glVertex3f(8.0, 3.0, 0.0);
glVertex3f(8.0, -10.0, 0.0);
glVertex3f(1.0, -10.0, 0.0);
glVertex3f(1.0, 3.0, 0.0);
glEnd();
// window (front of the house, z = 0)
glColor3f(0.0, 0.0, 0.0);
glBegin(GL_POLYGON);
glVertex3f(-9.0, -4.0, 0.0);
glVertex3f(-9.0, 3.0, 0.0);
glVertex3f(-2.0, 3.0, 0.0);
glVertex3f(-2.0, -4.0, 0.0);
glEnd();
// back wall (z = -20)
glColor3f(1.0, 0.0, 0.0);
glBegin(GL_POLYGON);
glVertex3f(10.0, -10.0, -20.0);
glVertex3f(-10.0, -10.0, -20.0);
glVertex3f(-10.0, 10.0, -20.0);
glVertex3f(10.0, 10.0, -20.0);
glEnd();
// front wall (z = 0, same plane as the door and window)
glColor3f(1.0, 0.0, 0.0);
glBegin(GL_POLYGON);
glVertex3f(10.0, -10.0, 0.0);
glVertex3f(10.0, 10.0, 0.0);
glVertex3f(-10.0, 10.0, 0.0);
glVertex3f(-10.0, -10.0, 0.0);
glEnd();
glDisable(GL_DEPTH_TEST);
glutSwapBuffers();

The issue is called Z-fighting. It occurs because the depths of the "door", the "window" and the "wall" are equal. The vertex coordinates are transformed by the model view and projection matrices and interpolated for each fragment covered by the polygon. This results in inaccuracies in the final z coordinate (depth).
To solve the issue, enable polygon fill offset (glPolygonOffset) before drawing the walls:
glEnable(GL_DEPTH_TEST);
glDisable( GL_POLYGON_OFFSET_FILL );
// draw door and window
// ...
glEnable( GL_POLYGON_OFFSET_FILL );
glPolygonOffset( 1, 1 );
// draw walls
// ...
Polygon fill offset shifts the depth of a fragment by a minimal amount. As a result, the depth of the "walls" differs from the depth of the "window" and the "door", even after the transformation by the model view and projection matrices.
Since an offset is added to the depth of the "wall", the "wall" always ends up behind the window and the door, independent of the point of view.
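Applied to the code from the question, the drawing could be structured roughly like this (a minimal sketch; the glBegin/glEnd blocks are the ones shown above, only the polygon offset calls are new):
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();
gluLookAt(0.0, 0.0, 40.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0);
glRotatef(angle, 0.0, 1.0, 0.0);
glEnable(GL_DEPTH_TEST);
// door and window: drawn without an offset
glDisable(GL_POLYGON_OFFSET_FILL);
glColor3f(0.0, 0.0, 0.0);
// ... glBegin(GL_POLYGON)/glEnd() blocks for the door and the window ...
// walls: pushed back by a minimal depth offset
glEnable(GL_POLYGON_OFFSET_FILL);
glPolygonOffset(1.0, 1.0);
glColor3f(1.0, 0.0, 0.0);
// ... glBegin(GL_POLYGON)/glEnd() blocks for the front and the back wall ...
glDisable(GL_POLYGON_OFFSET_FILL);
glDisable(GL_DEPTH_TEST);
glutSwapBuffers();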

Related

How to draw a rhombus with freeglut?

I am trying to draw something like the target image below using OpenGL. I am using freeglut. I can draw the three polygons, but I don't know how to position them correctly. Below I attached the target output and my current output.
The target output
My output
#include <GL/glut.h>
void init(void)
{
glClearColor(1.0, 1.0, 1.0, 0.0);
glMatrixMode(GL_PROJECTION);
gluOrtho2D(-5.0, 5.0, -5.0, 5.0);
glMatrixMode(GL_MODELVIEW);
}
void drawSquare(void)
{
glBegin(GL_POLYGON);
glVertex2f(0.0f, -1.0f);
glVertex2f(2.0f, 0.0f);
glVertex2f(0.0f, 1.0f);
glVertex2f(-2.0f, 0.0f);
glEnd();
}
void myDraw1(void)
{
glClear(GL_COLOR_BUFFER_BIT);
glLoadIdentity();
glColor3f(1.0, 0.0, 0.0);
drawSquare();
glTranslatef(2.0, 3.0, 0.0);
glRotatef(30, 0.0, 0.0, 1.0);
glColor3f(0.0, 1.0, 0.0);
drawSquare();
glLoadIdentity();
glTranslatef(-2.0, -3.0, 0.0);
glRotatef(-30, 0.0, 0.0, 1.0);
glColor3f(0.0, 0.0, 1.0);
drawSquare();
glFlush();
}
void myDraw2(void)
{
glClear(GL_COLOR_BUFFER_BIT);
glLoadIdentity();
glColor3f(1.0, 0.0, 0.0);
drawSquare();
glPushMatrix();
glTranslatef(2.0, 3.0, 0.0);
glRotatef(30, 0.0, 0.0, 1.0);
glColor3f(0.0, 1.0, 0.0);
drawSquare();
glPopMatrix();
glTranslatef(-2.0, -3.0, 0.0);
glRotatef(-30, 0.0, 0.0, 1.0);
glColor3f(0.0, 0.0, 1.0);
drawSquare();
glFlush();
}
int main(int argc, char** argv)
{
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
glutInitWindowPosition(0, 0);
glutInitWindowSize(600, 600);
glutCreateWindow("Rotate");
init();
glutDisplayFunc(myDraw1);
glutMainLoop();
return 0;
}
The angle between the polygons is 120°. The center of each polygon is (0, 0). You have to rotate around the point (-2, 0): translate the polygon so that (-2, 0) is moved to (0, 0), then rotate the polygon:
glRotatef(120.0, 0.0, 0.0, 1.0);
glTranslatef(2.0, 0.0, 0.0);
Note that operations like glTranslatef and glRotatef specify a matrix and multiply the current matrix by it. Use glPushMatrix/glPopMatrix to save and restore the current matrix (push onto and pop from the matrix stack).
e.g:
void myDraw1(void)
{
glClear(GL_COLOR_BUFFER_BIT);
glLoadIdentity();
glColor3f(1.0, 0.0, 0.0);
glPushMatrix();
glRotatef(90.0, 0.0, 0.0, 1.0);
glTranslatef(2.0, 0.0, 0.0);
drawSquare();
glPopMatrix();
glColor3f(0.0, 1.0, 0.0);
glPushMatrix();
glRotatef(210.0, 0.0, 0.0, 1.0);
glTranslatef(2.0, 0.0, 0.0);
drawSquare();
glPopMatrix();
glColor3f(0.0, 0.0, 1.0);
glPushMatrix();
glRotatef(330.0, 0.0, 0.0, 1.0);
glTranslatef(2.0, 0.0, 0.0);
drawSquare();
glPopMatrix();
glFlush();
}
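Since the three branches differ only in the angle and the color, the body can also be written as a loop (a sketch; it assumes the same drawSquare and projection setup as above):
void myDraw1(void)
{
    const float angles[3] = { 90.0f, 210.0f, 330.0f };   // 120 degrees apart
    const float colors[3][3] = { { 1.0f, 0.0f, 0.0f },
                                 { 0.0f, 1.0f, 0.0f },
                                 { 0.0f, 0.0f, 1.0f } };
    glClear(GL_COLOR_BUFFER_BIT);
    glLoadIdentity();
    for (int i = 0; i < 3; ++i) {
        glColor3fv(colors[i]);
        glPushMatrix();
        glRotatef(angles[i], 0.0f, 0.0f, 1.0f);  // rotate around the shared corner
        glTranslatef(2.0f, 0.0f, 0.0f);          // move (-2, 0) to the origin first
        drawSquare();
        glPopMatrix();
    }
    glFlush();
}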

Applying 2d texture to single cube face with change in z direction in opengl

I am trying to apply a texture to one face of a cube, but the problem is that the face I'm texturing extends in the z direction as well.
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, Tex);
glBegin(GL_POLYGON);
glTexCoord2f( 1.0, 1.0); glVertex3f( 1.0, 1.0, 1);
glTexCoord2f( 1.0, -1.0); glVertex3f( 1.0, -1.0, 1);
glTexCoord2f( 1.0, -1.0); glVertex3f( 1.0, -1.0, -1);
glTexCoord2f( 1.0, 1.0); glVertex3f( 1.0, 1.0, -1);
glEnd();
glDisable(GL_TEXTURE_2D);
Obviously, by doing it this way I end up with repeated texture coordinates (1,1) and (1,-1), because they don't take the z axis into account. How can I resolve this issue?
The texture coordinates don't have to be associated with the x and y components of the vertex coordinates, or even be equal to them. The texture coordinates are completely independent.
You have to define how the 2-dimensional texture is wrapped onto a surface in 3-dimensional space. For each side of the cube you have to define individual texture coordinates.
Change the texture coordinates, e.g.:
glBegin(GL_POLYGON);
glTexCoord2f( 1.0, 1.0); glVertex3f( 1.0, 1.0, 1.0);
glTexCoord2f( 1.0, 0.0); glVertex3f( 1.0, -1.0, 1.0);
glTexCoord2f( 0.0, 0.0); glVertex3f( 1.0, -1.0, -1.0);
glTexCoord2f( 0.0, 1.0); glVertex3f( 1.0, 1.0, -1.0);
glEnd();
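The same applies to every other face. For example, the front face (z = 1) could get its own 0..1 texture coordinates like this (a sketch, assuming the cube spans -1..1 on each axis as in the question):
glBegin(GL_POLYGON);
glTexCoord2f( 0.0, 0.0); glVertex3f(-1.0, -1.0, 1.0);
glTexCoord2f( 1.0, 0.0); glVertex3f( 1.0, -1.0, 1.0);
glTexCoord2f( 1.0, 1.0); glVertex3f( 1.0, 1.0, 1.0);
glTexCoord2f( 0.0, 1.0); glVertex3f(-1.0, 1.0, 1.0);
glEnd();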

Object Rotating in OpenGL

I'm making an articulated arm with push/pop matrices and pop-up menu entries. I want the objects to rotate individually, so I wrote the code below, but when one finger rotates the others move along with it. Why do they move together?
void display(void) {
glClear(GL_COLOR_BUFFER_BIT);
draw_lines();
glPushMatrix();
glColor3f(1.0, 1.0, 1.0);
glTranslatef(-1.0, 0.0, 0.0);
glRotatef((GLfloat)shoulder, 0.0, 0.0, 1.0);
glTranslatef(1.0, 0.0, 0.0);
glPushMatrix();
glScalef(2.0, 0.4, 1.5);
glutWireCube(1.0); //First square(Shoulder)
glPopMatrix();
glTranslatef(1.0, 0.0, 0.0);
glRotatef((GLfloat)elbow, 0.0, 0.0, 1.0);
glTranslatef(1.0, 0.0, 0.0);
glPushMatrix();
glScalef(2.0, 0.4, 1.5);
glutWireCube(1.0); //Second Square(Elbow)
glPopMatrix();
glTranslatef(1.0, 0.0, 0.0);
glRotatef((GLfloat)finger_1, 0.0, 0.0, 1.0);
glTranslatef(0.25, 0.0, 0.7);
glPushMatrix();
glScalef(0.5, 0.2, 0.2);
glutWireCube(1.0); //First Finger
glPopMatrix();
glTranslatef(-0.3, 0.0, 0.0);
glRotatef((GLfloat)finger_2, 0.0, 0.0, 1.0);
glTranslatef(0.25, 0.0, -0.4);
glPushMatrix();
glScalef(0.5, 0.2, 0.2);
glutWireCube(1.0); //Second Finger
glPopMatrix();
glTranslatef(-0.3, 0.0, 0.0);
glRotatef((GLfloat)finger_3, 0.0, 0.0, 1.0);
glTranslatef(0.25, 0.0, -0.5);
glPushMatrix();
glScalef(0.5, 0.2, 0.2);
glutWireCube(1.0); //Third Finger
glPopMatrix();
glPopMatrix();
glPopMatrix();
glutSwapBuffers();
}
The rotation of each object is controlled by separate functions.
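glTranslatef and glRotatef multiply the current matrix, so the transformations applied for the first finger are still in effect when the second and third fingers are drawn, and their rotations accumulate. One way to decouple the fingers is to wrap each one in its own glPushMatrix/glPopMatrix pair relative to the elbow. A minimal sketch (the translation values are placeholders and would need adjusting, since each finger now starts from the elbow frame instead of from the previous finger):
// ... shoulder and elbow transformations as in the question ...
glPushMatrix(); // first finger
glTranslatef(1.0, 0.0, 0.0); // pivot of the finger
glRotatef((GLfloat)finger_1, 0.0, 0.0, 1.0);
glTranslatef(0.25, 0.0, 0.7); // offset of the cube from the pivot
glScalef(0.5, 0.2, 0.2);
glutWireCube(1.0);
glPopMatrix();
glPushMatrix(); // second finger
glTranslatef(1.0, 0.0, 0.0);
glRotatef((GLfloat)finger_2, 0.0, 0.0, 1.0);
glTranslatef(0.25, 0.0, 0.3); // placeholder offset
glScalef(0.5, 0.2, 0.2);
glutWireCube(1.0);
glPopMatrix();
// ... third finger analogous, using finger_3 ...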

GL_QUADS state does not support multiple glBindTexture?

In short, why can't I put two glBindTexture calls into one glBegin(GL_QUADS)...glEnd() block?
I am following chapter 9 of OpenGL red book, http://www.glprogramming.com/red/chapter09.html
In Example 9-5, the program texbind.c produces two quads with different textures, like this:
The quad on the left has a white checker board texture while the quad on the right has a red one.
Here is the code which maps the textures to two quads:
void display(void)
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glBindTexture(GL_TEXTURE_2D, texName[0]);
glBegin(GL_QUADS);
glTexCoord2f(0.0, 0.0); glVertex3f(-2.0, -1.0, 0.0);
glTexCoord2f(0.0, 1.0); glVertex3f(-2.0, 1.0, 0.0);
glTexCoord2f(1.0, 1.0); glVertex3f(0.0, 1.0, 0.0);
glTexCoord2f(1.0, 0.0); glVertex3f(0.0, -1.0, 0.0);
glEnd();
glBindTexture(GL_TEXTURE_2D, texName[1]);
glBegin(GL_QUADS);
glTexCoord2f(0.0, 0.0); glVertex3f(1.0, -1.0, 0.0);
glTexCoord2f(0.0, 1.0); glVertex3f(1.0, 1.0, 0.0);
glTexCoord2f(1.0, 1.0); glVertex3f(2.41421, 1.0, -1.41421);
glTexCoord2f(1.0, 0.0); glVertex3f(2.41421, -1.0, -1.41421);
glEnd();
glFlush();
}
We can see that even though the code uses GL_QUADS, it draws only one quad per glBegin(GL_QUADS)/glEnd() block. However, if I comment out the glEnd() and glBegin(GL_QUADS) in the middle of this code, like this:
void display(void)
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glBindTexture(GL_TEXTURE_2D, texName[0]);
glBegin(GL_QUADS);
glTexCoord2f(0.0, 0.0); glVertex3f(-2.0, -1.0, 0.0);
glTexCoord2f(0.0, 1.0); glVertex3f(-2.0, 1.0, 0.0);
glTexCoord2f(1.0, 1.0); glVertex3f(0.0, 1.0, 0.0);
glTexCoord2f(1.0, 0.0); glVertex3f(0.0, -1.0, 0.0);
// glEnd();
glBindTexture(GL_TEXTURE_2D, texName[1]);
// glBegin(GL_QUADS);
glTexCoord2f(0.0, 0.0); glVertex3f(1.0, -1.0, 0.0);
glTexCoord2f(0.0, 1.0); glVertex3f(1.0, 1.0, 0.0);
glTexCoord2f(1.0, 1.0); glVertex3f(2.41421, 1.0, -1.41421);
glTexCoord2f(1.0, 0.0); glVertex3f(2.41421, -1.0, -1.41421);
glEnd();
glFlush();
}
It turns out the second glBindTexture(GL_TEXTURE_2D, texName[1]) no longer has any effect. The result is:
Could anyone tell me why I cannot put two glBindTexture calls into one GL_QUADS block?
Because OpenGL doesn't support doing that.
The only calls you can make to OpenGL while inside glBegin/glEnd are vertex submission calls like glTexCoord*, glVertex*, etc. Anything else is an error.
As to "why", one reason is "because the specification says so". More likely, the GPU processes the entire data set (or significant chunks of it) in parallel, during which it cannot switch textures.
Also consider the newer way of drawing with OpenGL: uploading your data to a Vertex Buffer Object and calling one of the glDraw* commands. There is no way to call glBindTexture between the individual primitives of a single draw call there either.
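As a rough illustration of that style (a sketch only; it assumes the texName array from the example, an OpenGL 1.5+ context where buffer objects are available, and it keeps the legacy client-side array pointers to stay close to the red-book code): put both quads into one buffer and issue one draw call per texture.
// Interleaved data: x, y, z, s, t for both quads (same coordinates as in the example)
static const GLfloat quads[] = {
    -2.0f, -1.0f,  0.0f,        0.0f, 0.0f,
    -2.0f,  1.0f,  0.0f,        0.0f, 1.0f,
     0.0f,  1.0f,  0.0f,        1.0f, 1.0f,
     0.0f, -1.0f,  0.0f,        1.0f, 0.0f,
     1.0f, -1.0f,  0.0f,        0.0f, 0.0f,
     1.0f,  1.0f,  0.0f,        0.0f, 1.0f,
     2.41421f,  1.0f, -1.41421f, 1.0f, 1.0f,
     2.41421f, -1.0f, -1.41421f, 1.0f, 0.0f,
};
GLuint vbo;
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(quads), quads, GL_STATIC_DRAW);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glVertexPointer(3, GL_FLOAT, 5 * sizeof(GLfloat), (void*)0);
glTexCoordPointer(2, GL_FLOAT, 5 * sizeof(GLfloat), (void*)(3 * sizeof(GLfloat)));
glBindTexture(GL_TEXTURE_2D, texName[0]);
glDrawArrays(GL_QUADS, 0, 4); // first quad, first texture
glBindTexture(GL_TEXTURE_2D, texName[1]);
glDrawArrays(GL_QUADS, 4, 4); // second quad, second texture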

Organizing the code for apartment modelling in C++ with OpenGl

I am modelling an apartment with C++ and OpenGL. I have made basic walls, a roof and a floor by declaring the points directly in the drawing function. It works, but the code is messy, and adding furniture this way would be very painful. How should I organize my objects and structure the drawing function?
Here's the current code:
// Floor and roof of room 1
glBegin(GL_QUADS);
glNormal3f(0.0, 1.0, 0.0);
glColor3f(0.0, 1.0, 1.0);
glVertex3f(0.0, 0.0, 0.0);
glVertex3f(1.0, 0.0, 0.0);
glVertex3f(1.0, 0.0, 1.0);
glVertex3f(0.0, 0.0, 1.0);
glNormal3f(0.0, -1.0, 0.0);
glColor3f(0.0, 1.0, 0.0);
glVertex3f(0.0, 1.0, 0.0);
glVertex3f(1.0, 1.0, 0.0);
glVertex3f(1.0, 1.0, 1.0);
glVertex3f(0.0, 1.0, 1.0);
glEnd();
// Walls
glBegin(GL_QUAD_STRIP);
glNormal3f(1.0, 0.0, 0.0);
glColor3f(1.0, 1.0, 1.0);
glVertex3f(0.0, 0.0, 0.0);
glVertex3f(0.0, 1.0, 0.0);
glColor3f(1.0, 0.0, 0.0);
glVertex3f(0.0,0.0,1.0);
glVertex3f(0.0,1.0,1.0);
glNormal3f(0.0, 0.0, -1.0);
glColor3f(0.0, 0.0, 1.0);
glVertex3f(1.0, 0.0, 1.0);
glVertex3f(1.0, 1.0, 1.0);
glNormal3f(-1.0, 0.0, 0.0);
glColor3f(0.5, 0.0, 0.5);
glVertex3f(1.0, 0.0, 0.0);
glVertex3f(1.0, 1.0, 0.0);
glEnd();
And so on for room 2 and the door openings...
Are there any good places to read about this subject?
You can use 3D modelling software, e.g. Blender, to define your geometry. Then I recommend using Assimp to load the exported model. I also recommend avoiding the old fixed-function pipeline: write your own small scene-graph engine and manage your matrices and 3D math with GLM.
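As an illustration of what a minimal scene graph could look like (a sketch only; the Node type, the drawMesh callback and the usage names are hypothetical, and GLM is assumed to be available), each node owns a local transform and its children, and drawing walks the tree while accumulating matrices:
#include <vector>
#include <functional>
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// Hypothetical scene-graph node: local transform, optional draw callback, children.
struct Node {
    glm::mat4 local = glm::mat4(1.0f);
    std::function<void(const glm::mat4&)> drawMesh; // e.g. issues the GL calls for a wall mesh
    std::vector<Node> children;
};

// Walk the tree, accumulating parent transforms.
void drawNode(const Node& node, const glm::mat4& parent)
{
    glm::mat4 world = parent * node.local;
    if (node.drawMesh)
        node.drawMesh(world);
    for (const Node& child : node.children)
        drawNode(child, world);
}

// Usage sketch: one node per room, furniture as child nodes.
// Node apartment;
// Node room1;
// room1.local = glm::translate(glm::mat4(1.0f), glm::vec3(5.0f, 0.0f, 0.0f));
// room1.drawMesh = [](const glm::mat4& m) { /* draw walls/floor/roof of room 1 */ };
// apartment.children.push_back(room1);
// drawNode(apartment, glm::mat4(1.0f));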