Why doesn't my transform matrix work in OpenGL? (C++)

Say I want to draw a ball in the scene; here are two different ways to do it.
float SUN_TRANSLATION_MATRIX[] = {
    1.0f, 0.0f, 0.0f,   0.0f,
    0.0f, 1.0f, 0.0f,   0.0f,
    0.0f, 0.0f, 1.0f, -15.0f,
    0.0f, 0.0f, 0.0f,   1.0f
};

void displaySolarSystem1() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(0.0f, 0.0f, -15.0f);
    glColor3f(1.0f, 0.8f, 0.5f);
    glutSolidSphere(2.0, 50, 40);
    glutSwapBuffers();
}

void displaySolarSystem() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glMultMatrixf(SUN_TRANSLATION_MATRIX);
    glColor3f(1.0f, 0.8f, 0.5f);
    glutSolidSphere(2.0, 50, 40);
    glutSwapBuffers();
}
displaySolarSystem1 applies glTranslatef, while displaySolarSystem uses a matrix operation. The problem is that displaySolarSystem1 works as expected, but the matrix version fails.
What went wrong with displaySolarSystem()?

http://www.opengl.org/sdk/docs/man/xhtml/glMultMatrix.xml
Calling glMultMatrix with an argument of m = [...] replaces the current matrix C with the product C × M, so a vertex v is transformed as C × M × v.
This means transformations are applied by multiplying the matrix by a column vector, not vice versa. So a translation matrix would look like this:
1 0 0 X
0 1 0 Y
0 0 1 Z
0 0 0 1
But this matrix is written in row-major order, while OpenGL expects matrices in column-major order, so you need to transpose it.
So, finally, you can either use glMultTransposeMatrix, which, if I remember right, is slightly slower (a sketch follows the matrix below), or transpose your matrix to look like this:
float SUN_TRANSLATION_MATRIX[] = {
    1.0f, 0.0f,   0.0f, 0.0f,
    0.0f, 1.0f,   0.0f, 0.0f,
    0.0f, 0.0f,   1.0f, 0.0f,
    0.0f, 0.0f, -15.0f, 1.0f
};
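A minimal sketch of the glMultTransposeMatrix route, which keeps the original row-major array unchanged (glMultTransposeMatrixf requires OpenGL 1.3, so an extension loader may be needed on some platforms):

    void displaySolarSystem() {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        // Transposes on the fly, so the original row-major
        // SUN_TRANSLATION_MATRIX (translation in the fourth column)
        // can be passed as-is.
        glMultTransposeMatrixf(SUN_TRANSLATION_MATRIX);
        glColor3f(1.0f, 0.8f, 0.5f);
        glutSolidSphere(2.0, 50, 40);
        glutSwapBuffers();
    }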

Thanks, Unwind. The problem was solved by changing glMultMatrixf(SUN_TRANSLATION_MATRIX); to glMultTransposeMatrixf(SUN_TRANSLATION_MATRIX);. Thanks for the hint that this is a transposed matrix.

Related

Why does OpenGL cut off polygons (even if this setting is disabled)?

I read similar suggested questions and their solutions, but could not find an answer.
I'm trying to draw a scene with an isometric view in OpenGL.
Draw func:
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glRotatef(atan(0.5f) * 180.0f / PI, 1.0f, 0.0f, 0.0f);
glRotatef(-45.0f, 0.0f, 1.0f, 0.0f);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glBegin(GL_QUADS);
glColor3f(1.0f, 1.0f, 1.0f);
glVertex3f(0.0f, 0.0f, 0.0f);
glVertex3f(1.0f, 0.0f, 0.0f);
glVertex3f(1.0f, 0.0f, 1.0f);
glVertex3f(0.0f, 0.0f, 1.0f);
glEnd();
glPopMatrix();
In the end, I get this result: the camera does have an isometric projection, but for some reason the polygons are clipped.
If I add glTranslatef(-0.8f, 0.0f, -0.8f) before drawing the quad, the result is as follows:
I don't apply any optimizations to the OpenGL renderer myself. So why do the polygons get cut off?
The polygons are clipped by the near or far plane of the viewing volume.
When you do not set a projection matrix, then view space, clip space, and normalized device space are the same. Normalized device space is a cube with left, bottom, near at (-1, -1, -1) and right, top, far at (1, 1, 1). All geometry that is not inside this cube is clipped.
You draw a quad with a side length of 1. One vertex of the quad is at the view-space origin (0, 0, 0), and the quad is rotated around that origin by glRotatef. Since the diagonal of the quad has length sqrt(2.0), one vertex of the rotated quad gets clipped by either the near plane or the far plane.
If you instead construct and rotate a quad whose center is at (0, 0, 0), it will not be clipped, because the distance from the center to each vertex is sqrt(2.0)/2.0, which is less than 1 (the distance from the center of the viewing volume to the near and far planes):
glBegin(GL_QUADS);
glColor3f(1.0f, 1.0f, 1.0f);
glVertex3f(-0.5f, 0.0f, -0.5f);
glVertex3f( 0.5f, 0.0f, -0.5f);
glVertex3f( 0.5f, 0.0f, 0.5f);
glVertex3f(-0.5f, 0.0f, 0.5f);
glEnd();
or, equivalently, keep the original vertices and translate the quad so it is centered at the origin:
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glRotatef(atan(0.5f) * 180.0f / PI, 1.0f, 0.0f, 0.0f);
glRotatef(-45.0f, 0.0f, 1.0f, 0.0f);
glTranslatef(-0.5f, 0.0f, -0.5f);
glBegin(GL_QUADS);
glColor3f(1.0f, 1.0f, 1.0f);
glVertex3f(0.0f, 0.0f, 0.0f);
glVertex3f(1.0f, 0.0f, 0.0f);
glVertex3f(1.0f, 0.0f, 1.0f);
glVertex3f(0.0f, 0.0f, 1.0f);
glEnd();
Alternatively, you can set an orthographic projection that enlarges the viewing volume with glOrtho:
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(-1.0, 1.0, -1.0, 1.0, -2.0, 2.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glRotatef(atan(0.5f) * 180.0f / PI, 1.0f, 0.0f, 0.0f);
glRotatef(-45.0f, 0.0f, 1.0f, 0.0f);
glBegin(GL_QUADS);
glColor3f(1.0f, 1.0f, 1.0f);
glVertex3f(0.0f, 0.0f, 0.0f);
glVertex3f(1.0f, 0.0f, 0.0f);
glVertex3f(1.0f, 0.0f, 1.0f);
glVertex3f(0.0f, 0.0f, 1.0f);
glEnd();

DirectX 11 2D ortho

I started with DirectX 11 and I have some problems with setting up the camera.
I want to set the origin to the top-left of the screen; currently it is at the bottom-left. This is how I set it up:
D3DXMatrixIdentity(&mProjection);
D3DXMatrixIdentity(&mView);
mPosition = D3DXVECTOR3{ 0.0f, 0.0f, -0.5f };
mTarget = D3DXVECTOR3{ 0.0f, 0.0f, 0.0f };
mUp = D3DXVECTOR3{ 0.0f, 0.0f, 0.0f };
D3DXMatrixOrthoOffCenterLH(&mProjection,
                           0.0f, static_cast<Float32>(WindowWidth),
                           0.0f, static_cast<Float32>(WindowHeight),
                           0.0f, 1.0f);
Here is how I want it, and this is my current coordinate system:
Swap the y parameters when building the matrix:
D3DXMatrixOrthoOffCenterLH(&mProjection,
                           0.0f, static_cast<Float32>(WindowWidth),
                           static_cast<Float32>(WindowHeight), 0.0f,
                           0.0f, 1.0f);
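As a quick sanity check, a sketch (not from the original answer) that pushes the top-left screen point through the swapped matrix with the D3DX helper D3DXVec3TransformCoord:

    // With top = 0 and bottom = WindowHeight, the screen-space point
    // (0, 0) should land at NDC (-1, +1), i.e. the top-left corner.
    D3DXVECTOR3 topLeft(0.0f, 0.0f, 0.5f);
    D3DXVECTOR3 ndc;
    D3DXVec3TransformCoord(&ndc, &topLeft, &mProjection);
    // ndc.x == -1.0f, ndc.y == +1.0f  ->  the origin is now at the top-left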

Why are my OpenGL objects drawing relative to the last object drawn?

I'm pretty sure this is due to my lack of understanding of how the GL_MODELVIEW matrix works. Here is a screen recording of what's happening: http://youtu.be/3F7FLkVI7kA
As you can see, the bottom-most triangle is the first triangle being drawn, and moves as I expect the other 2 triangles to move. The second triangle is moved and rotated relative to the first, and the third is moved and rotated relative to that combination.
What I want is for all three triangles to be stationary in 3D space, but spinning (like the first triangle).
Source:
// Main loop
do {
    // Clear screen
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Update camera
    glfwGetCursorPos(window, &cursorX, &cursorY);
    cam.update(0.001f, (int)cursorX, (int)cursorY);

    // Reset matrix
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    // Move camera
    glRotatef(cam.rotation.x, 1.0f, 0.0f, 0.0f);
    glRotatef(cam.rotation.y, 0.0f, 1.0f, 0.0f);
    // Translate modelview matrix to the position of the camera - everything should now draw relative to the camera position
    glTranslatef(-cam.position.x, cam.position.y, -cam.position.z);

    // Draw ground
    drawGroundGrid(-25.0f);

    drawSpinningTriangle(0.0f, 0.0f, -5.0f);
    drawSpinningTriangle(3.14f, 3.0f, -6.0f);
    drawSpinningTriangle(-6.0f, 12.0f, -5.0f);

    // Swap buffers - back buffer is now the front buffer to be rendered next frame
    glfwSwapBuffers(window);
    glfwPollEvents();

    calcFPS();
} while (!glfwGetKey(window, GLFW_KEY_ESCAPE) && !glfwWindowShouldClose(window)); // Main loop end
[...]
void drawSpinningTriangle(float x, float y, float z) {
    glMatrixMode(GL_MODELVIEW);
    glTranslatef(x, y, z);
    glRotatef(glfwGetTime() * 50.0f, 0.0f, 1.0f, 0.0f);

    glBegin(GL_TRIANGLES);
    {
        // Red vertex
        glColor3f(1.0f, 0.0f, 0.0f);
        glVertex3f(0.0f, 1.0f, 0.0f);

        // Yellow vertex
        glColor3f(1.0f, 1.0f, 0.0f);
        glVertex3f(-1.0f, -1.0f, 0.0f);

        // White vertex
        glColor3f(1.0f, 1.0f, 1.0f);
        glVertex3f(1.0f, -1.0f, 0.0f);
    }
    glEnd();
}
First, using the matrix stack is deprecated; it's much better to manage your own matrices (a sketch of that follows below).
Second, you should call glPushMatrix before the transformations and glPopMatrix after drawing:
void drawSpinningTriangle(float x, float y, float z) {
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glTranslatef(x, y, z);
    glRotatef(glfwGetTime() * 50.0f, 0.0f, 1.0f, 0.0f);

    glBegin(GL_TRIANGLES);
    {
        // Red vertex
        glColor3f(1.0f, 0.0f, 0.0f);
        glVertex3f(0.0f, 1.0f, 0.0f);

        // Yellow vertex
        glColor3f(1.0f, 1.0f, 0.0f);
        glVertex3f(-1.0f, -1.0f, 0.0f);

        // White vertex
        glColor3f(1.0f, 1.0f, 1.0f);
        glVertex3f(1.0f, -1.0f, 0.0f);
    }
    glEnd();

    glPopMatrix();
}
This saves and restores the topmost matrix, so any changes made between the two calls are discarded.
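And for the first point, a minimal sketch of managing your own matrices with GLM (the library choice and the view parameter are my own, not part of the original code):

    #include <glm/glm.hpp>
    #include <glm/gtc/matrix_transform.hpp>
    #include <glm/gtc/type_ptr.hpp>

    void drawSpinningTriangle(const glm::mat4& view, float x, float y, float z) {
        // Build the model-view matrix on the CPU instead of mutating GL state.
        glm::mat4 model = glm::translate(glm::mat4(1.0f), glm::vec3(x, y, z));
        model = glm::rotate(model, (float)glfwGetTime() * glm::radians(50.0f),
                            glm::vec3(0.0f, 1.0f, 0.0f));

        glMatrixMode(GL_MODELVIEW);
        glLoadMatrixf(glm::value_ptr(view * model)); // replaces push/translate/rotate/pop

        // ... glBegin(GL_TRIANGLES) / glEnd() exactly as before ...
    }

Each call starts from the caller's view matrix, so nothing leaks from one triangle to the next.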

DirectX: when I add a texture to a quad, the triangles mess up somehow. How do I fix it?

First of all, here are the important parts of my code.
Creating the vertices.
D3DVertexTexture Vertices[] =
{
    {-1.0f,  1.0f, 0.0f, 0.0f, 0.0f, },
    { 1.0f,  1.0f, 0.0f, 1.0f, 0.0f, },
    { 1.0f, -1.0f, 0.0f, 1.0f, 1.0f, },
    {-1.0f, -1.0f, 0.0f, 0.0f, 1.0f, },
};
Creating the vertex buffer.
D3DDevice->CreateVertexBuffer(sizeof(Vertices),
                              0,
                              D3DFVF_CUSTOMVERTEXTEXTURE,
                              D3DPOOL_MANAGED,
                              &vb,
                              NULL);
Memory crap.
void* pVoid;
vb->Lock(0, sizeof(Vertices), (void**)&pVoid, 0); // lock the whole buffer; sizeof(pVoid) would lock only pointer-sized bytes
memcpy(pVoid, Vertices, sizeof(Vertices));
vb->Unlock();
Loading the texture.
D3DXCreateTextureFromFile(D3DDevice, "images/tex.png", &t);
Rendering.
D3DDevice->SetFVF(D3DFVF_CUSTOMVERTEXTEXTURE);
D3DDevice->SetTexture(0, t);
D3DDevice->SetStreamSource(0, vb, 0, sizeof(D3DVertexTexture));
D3DDevice->DrawPrimitive(D3DPT_TRIANGLESTRIP, 0, 2);
Here is my problem: it shows a square, but the left side is missing in a triangular shape.
Vertices A,B,C,D in a triangle strip will produce two triangles: A,B,C and B,C,D
A -- B      A--B        B
|    |       \ |       /|
|    |        \|      / |
D -- C         C     D--C
Look at that diagram and picture those two triangles...
Then go and put your vertices in the right order - triangle strips should 'zig-zag', not proceed in clockwise or anti-clockwise order.
If you order them: A,B,D,C - the quad will draw correctly.
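Applied to the asker's data, that means swapping the last two vertices; a sketch assuming the same D3DVertexTexture layout (x, y, z, u, v):

    D3DVertexTexture Vertices[] =
    {
        {-1.0f,  1.0f, 0.0f, 0.0f, 0.0f, }, // A: top-left
        { 1.0f,  1.0f, 0.0f, 1.0f, 0.0f, }, // B: top-right
        {-1.0f, -1.0f, 0.0f, 0.0f, 1.0f, }, // D: bottom-left
        { 1.0f, -1.0f, 0.0f, 1.0f, 1.0f, }, // C: bottom-right
    };
    // The strip A,B,D,C yields triangles A,B,D and B,D,C, which cover the quad.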
Have you tried defining your vertices in this order?
D3DVertexTexture Vertices[] =
{
    { 1.0f, -1.0f, 0.0f, 1.0f, 1.0f, },
    {-1.0f, -1.0f, 0.0f, 0.0f, 1.0f, },
    {-1.0f,  1.0f, 0.0f, 0.0f, 0.0f, },
    { 1.0f,  1.0f, 0.0f, 1.0f, 0.0f, },
};
I believe the default winding order for front-facing triangles is clockwise. You are defining yours in an incorrect order.

OpenGL Color Matrix

How do I get the OpenGL color matrix transforms working?
I've modified a sample program that just draws a triangle, and added some color matrix code to see if I can change the colors of the triangle but it doesn't seem to work.
static float theta = 0.0f;
glClearColor( 1.0f, 1.0f, 1.0f, 1.0f );
glClearDepth(1.0);
glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glPushMatrix();
glRotatef( theta, 0.0f, 0.0f, 1.0f );
glMatrixMode(GL_COLOR);
GLfloat rgbconversion[16] =
{
    0.0f, 0.0f, 0.0f, 0.0f,
    0.0f, 0.0f, 0.0f, 0.0f,
    0.0f, 0.0f, 0.0f, 0.0f,
    0.0f, 0.0f, 0.0f, 0.0f
};
glLoadMatrixf(rgbconversion);
glMatrixMode(GL_MODELVIEW);
glBegin( GL_TRIANGLES );
glColor3f( 1.0f, 0.0f, 0.0f ); glVertex3f( 0.0f, 1.0f , 0.5f);
glColor3f( 0.0f, 1.0f, 0.0f ); glVertex3f( 0.87f, -0.5f, 0.5f );
glColor3f( 0.0f, 0.0f, 1.0f ); glVertex3f( -0.87f, -0.5f, 0.5f );
glEnd();
glPopMatrix();
As far as I can tell, the color matrix I'm loading should change the triangle to black, but it doesn't seem to work. Is there something I'm missing?
The color matrix only applies to pixel transfer operations such as glDrawPixels, which aren't hardware accelerated on current hardware. However, implementing a color matrix in a fragment shader is really easy: pass your matrix as a uniform mat4 and multiply it with the fragment color.
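A minimal sketch of such a shader (compatibility-profile GLSL; gl_Color carries the interpolated vertex color, and the uniform name colorMatrix is my own):

    // Fragment shader source embedded as a C++ string; compile it with
    // glCompileShader and set "colorMatrix" via glUniformMatrix4fv.
    const char* colorMatrixFragmentShader = R"(
        uniform mat4 colorMatrix;
        void main()
        {
            gl_FragColor = colorMatrix * gl_Color;
        }
    )";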
It looks like you're doing it correctly, but your current color matrix sets the triangle's alpha value to 0 as well, so while it is being drawn, it does not appear on the screen.
"Additionally, if the ARB_imaging extension is supported, GL_COLOR is also accepted."
From the glMatrixMode documentation. Is the extension supported on your machine?
I have found the possible problem.
The color matrix is part of the imaging subset (the ARB_imaging extension). On most hardware it is implemented in software by the driver.
Solution:
Add this line after glEnd():
glCopyPixels(0, 0, getWidth(), getHeight(), GL_COLOR);
It's very slow, though...
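For what it's worth, a sketch of where that line would slot into the draw code above (width and height are placeholders for the window size; glWindowPos2i, from OpenGL 1.4, pins the destination raster position so the copy lands on itself):

    glEnd();
    // glCopyPixels is a pixel-transfer operation, so the GL_COLOR matrix
    // loaded earlier is finally applied while the framebuffer is copied
    // onto itself.
    glWindowPos2i(0, 0);                         // destination: lower-left corner
    glCopyPixels(0, 0, width, height, GL_COLOR); // width/height: window size
    glPopMatrix();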