OpenGL GLM local rotation

I need to rotate an object in its local coordinate system, the way you can in 3ds Max/Maya etc...
My current code is:
ModelMatrix = glm::mat4(1.0f);
TransformMatrix = glm::mat4(1.0f);
ScaleMatrix = glm::mat4(1.0f);
RotateMatrix = glm::mat4(1.0f);
ScaleMatrix = glm::scale(ScaleMatrix, glm::vec3(scalex, scalez, scaley));
TransformMatrix = glm::translate(TransformMatrix, glm::vec3(x, z, y));
RotateMatrix = glm::rotate(RotateMatrix, anglex, glm::vec3(1, 0, 0));
RotateMatrix = glm::rotate(RotateMatrix, angley, glm::vec3(0, 0, 1));
RotateMatrix = glm::rotate(RotateMatrix, anglez, glm::vec3(0, 1, 0));
ModelMatrix = TransformMatrix * ScaleMatrix * RotateMatrix;
MVP = Projection * View * ModelMatrix;
anglex, angley and anglez come from the keyboard.
Right now only the last axis behaves as local (in my example it's glm::vec3(0, 1, 0), the Z axis). At this IMAGE I show what I need (2) and what I get (3)... If I change "anglez" it always acts as roll, but anglex and angley rotate in the world coordinate system.
My second attempt was to use quaternions:
quat MyQuaternion = glm::quat(cos(glm::radians(xangle / 2)), 0, sin(glm::radians(xangle / 2)), 0);
quat MyQuaternion2 = glm::quat(cos(glm::radians(yangle / 2)), sin(glm::radians(yangle / 2)), 0, 0);
quat MyQuaternion3 = glm::quat(cos(glm::radians(zangle / 2)), 0, 0, sin(glm::radians(zangle / 2)));
glm::mat4 RotationMatrix = toMat4(MyQuaternion * MyQuaternion2 * MyQuaternion3);
But I get the same result.

You should modify the entire ModelMatrix instead of the angles. Initialize ModelMatrix to the identity matrix. Then, when you process keyboard input:
if (rotate about x-axis)
    ModelMatrix = glm::rotate(ModelMatrix, angle, glm::vec3(1, 0, 0));
if (rotate about y-axis)
    ModelMatrix = glm::rotate(ModelMatrix, angle, glm::vec3(0, 1, 0));
if (rotate about z-axis)
    ModelMatrix = glm::rotate(ModelMatrix, angle, glm::vec3(0, 0, 1));
if (any rotation happened)
    MVP = Projection * View * ModelMatrix;
You can apply this modification at any level: the MVP level, the ModelMatrix level (as shown here), or the RotateMatrix level.
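A minimal sketch of that accumulation, assuming anglex/angley/anglez now hold per-frame increments from the keyboard and that Projection and View already exist (names carried over from the question):
// Keep ModelMatrix across frames instead of rebuilding it from absolute angles.
static glm::mat4 ModelMatrix = glm::mat4(1.0f);
// glm::rotate post-multiplies, so each increment is applied about the object's
// current local axes, which is the 3ds Max/Maya-style behaviour asked for.
if (anglex != 0.0f)
    ModelMatrix = glm::rotate(ModelMatrix, anglex, glm::vec3(1, 0, 0));
if (angley != 0.0f)
    ModelMatrix = glm::rotate(ModelMatrix, angley, glm::vec3(0, 1, 0));
if (anglez != 0.0f)
    ModelMatrix = glm::rotate(ModelMatrix, anglez, glm::vec3(0, 0, 1));
MVP = Projection * View * ModelMatrix;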

Related

GLM LookAt with different coordinate system

I am using GLM to make a LookAt matrix. I use the normal OpenGL coordinate system, but with the Z axis going inwards (into the screen), which is the opposite of the OpenGL standard. Thus, the LookAt function requires some changes:
glm::vec3 pos = glm::vec3(0, 0, -10); // equal to glm::vec3(0, 0, 10) in standard coords
glm::quat rot = glm::quat(0.991445f, 0.130526f, 0.f, 0.f); // 15 degrees rotation about the x axis
glm::vec3 resultPos = pos * glm::vec3(1, 1, -1); // flip Z axis
glm::vec3 resultLook = pos + (glm::conjugate(rot) * glm::vec3(0, 0, 1)) * glm::vec3(1, 1, -1); // rotate unit Z vec and then flip Z
glm::vec3 resultUp = (glm::conjugate(rot) * glm::vec3(0, 1, 0)) * glm::vec3(1, 1, -1); // same thing as resultLook but with unit Y vec
glm::mat4 lookAt = glm::lookAt(resultPos, resultLook, resultUp);
However, that is a lot of calculation for just flipping a single axis. What do I need to do to get a view matrix which has a flipped Z axis?

Modern OpenGL: How to get the vector position of the cube?

glm::mat4 yellow_bone_obj_mat = m_bone_animation->get_yellow_mat();
glUniformMatrix4fv(glGetUniformLocation(shader.program, "model"), 1, GL_FALSE, glm::value_ptr(yellow_bone_obj_mat));
bone_obj->obj_color = m_bone_animation->colors[1];
draw_object(shader, *bone_obj);
I created a cube using this code.
glm::vec3 scale = glm::vec3(1.f, 1.f, 1.f);
m_yellow_mat = glm::mat4(1.0f);
m_yellow_mat = glm::scale(m_yellow_mat, scale);
glm::vec3 pivot = glm::vec3(0.0f, 2.f, 0.0f);
glm::vec3 pos = root_position;
m_yellow_mat = glm::translate(m_yellow_mat, pos);
m_yellow_mat = glm::rotate(m_yellow_mat, glm::radians(angleZ), glm::vec3(0, 0, 1));
m_yellow_mat = glm::rotate(m_yellow_mat, glm::radians(angleY), glm::vec3(0, 1, 0));
m_yellow_mat = glm::rotate(m_yellow_mat, glm::radians(angleX), glm::vec3(1, 0, 0));
m_yellow_mat = glm::translate(m_yellow_mat, pivot);
m_yellow_mat = glm::scale(m_yellow_mat, scale_vector[1]);
// scale_vector[1] = { 0.5f, 4.f, 0.5f }
// root_position = { 2.0f,1.0f,2.0f };
These are the transformations I applied.
This enables it to rotate around the endpoint (bottom part) of the cube. I want to find the vector position of the start point of the cube (top part). How can I do that?
A 4x4 transformation matrix looks as follows:
column 0: Xx, Xy, Xz, 0
column 1: Yx, Yy, Yz, 0
column 2: Zx, Zy, Zz, 0
column 3: Tx, Ty, Tz, 1
The translation is stored in the 4th column of the column-major matrix.
That means the xyz components of the translation are m_yellow_mat[3][0], m_yellow_mat[3][1] and m_yellow_mat[3][2]:
glm::vec3 trans = glm::vec3(m_yellow_mat[3]);
If you want to know the world position of a vertex coordinate of the model, then you have to transform the model coordinate by the model matrix:
glm::vec3 vertex_coordinate;
glm::vec3 world_coordinate = glm::vec3(m_yellow_mat * glm::vec4(vertex_coordinate, 1.0f));
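The same idea gives you the cube's start point (top part): take that point in the cube's local space and transform it by the model matrix. A sketch, assuming the top end sits at (0, 2, 0) in model space (the same offset as the pivot in the question; adjust to your actual geometry):
glm::vec3 local_top = glm::vec3(0.0f, 2.0f, 0.0f); // assumed local-space position of the cube's top end
glm::vec3 world_top = glm::vec3(m_yellow_mat * glm::vec4(local_top, 1.0f)); // its world-space position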

Rotating camera lookAt point with quaternions

I am trying to rotate the camera using quaternions, but I am having problems doing it.
The first thing I notice is that when I execute this, the camera position and the camera lookAt point become almost the same (in some cases exactly the same), and then I get precision problems and everything else that comes with them when I try to move the camera.
if (Input::getInstance()->isMouseDown(SDL_BUTTON_RIGHT - 1)) {
    //log_.debug("Camera Mouse Left down");
    glm::vec2 mouseDelta = glm::ivec2(oldX, oldY) - Input::getInstance()->getMousePosition();
    glm::quat q1 = glm::quat(glm::vec3(glm::radians(mouseDelta.y), glm::radians(mouseDelta.x), 0.0f));
    cameraLook_ = q1 * (direction * mouseSensitivity_) * glm::conjugate(q1) + cameraPosition_;
    //cameraLook_ = glm::rotate(cameraLook_, mouseDelta.x * delta, glm::vec3(0, 1, 0));
    //cameraLook_ = glm::rotate(cameraLook_, mouseDelta.y * delta, glm::vec3(0, 0, 1));
}
I switched back to matrices for my solution, because for me they are easier to work with at the moment.
There are a few things I need to understand about quaternions before I can fix my old problem.
glm::mat4 rotationMatrix = glm::translate(cameraPosition_ - glm::vec3(1.0));
rotationMatrix *= glm::rotate(mouseDelta.x, glm::vec3(0, 1, 0));
rotationMatrix *= glm::rotate(mouseDelta.y, glm::vec3(0, 0, 1));
rotationMatrix *= glm::translate(-cameraPosition_ + glm::vec3(1.0));
cameraLook_ = glm::vec3(rotationMatrix * glm::vec4(cameraLook_, 1.0f));
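For reference, the quaternion version of the same pivot rotation is fairly short, since GLM overloads quat * vec3 as a rotation. A sketch using the names from the question, mirroring the matrix code above (including its pivot of cameraPosition_ - glm::vec3(1.0), which may or may not be what was intended):
glm::vec3 pivot = cameraPosition_ - glm::vec3(1.0f); // same pivot as the matrix version
glm::quat q = glm::quat(glm::vec3(glm::radians(mouseDelta.y), glm::radians(mouseDelta.x), 0.0f));
cameraLook_ = pivot + q * (cameraLook_ - pivot); // rotate the offset about the pivot, then move back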

Scene rendering wonky when camera transformations occur

I've recently switched over to GLM for managing my matrices and vectors; however, when I change variables such as camera angles or position, the whole rendered scene goes haywire.
I really don't know how to describe it other than stretching and moving all over the place.
Problem:
"Camera" transformations such as panning the camera result in strange atypical/unexpected changes. Typically, when the camera pan variables like X and Y deviate from "0"
Note:
I used to perform these very same transformations with Qt's QMatrix4x4 and QVector3D types, rather than glm::mat4x4 and glm::vec4, and it worked fine.
Here is how I implement the camera in my render function (alpha and beta are rotation variables, 0 by default; camX and camY are panning variables, also 0 by default):
glm::mat4x4 mMatrix;
glm::mat4x4 vMatrix;
glm::mat4x4 cameraTransformation;
cameraTransformation = glm::rotate(cameraTransformation, glm::radians(alpha)/*alpha*(float)M_PI/180*/, glm::vec3(0, 1 ,0));
cameraTransformation = glm::rotate(cameraTransformation, glm::radians(beta)/*beta*(float)M_PI/180*/, glm::vec3(1, 0, 0));
glm::vec4 cameraPosition = (cameraTransformation * glm::vec4(camX, camY, distance, 0));
glm::vec4 cameraUpDirection = cameraTransformation * glm::vec4(0, 1, 0, 0);
vMatrix = glm::lookAt(glm::vec3(cameraPosition[0],cameraPosition[1],cameraPosition[2]), glm::vec3(camX, camY, 0.0), glm::vec3(cameraUpDirection[0],cameraUpDirection[1],cameraUpDirection[2]));
glm::mat4x4 glmat = pMatrix * vMatrix * mMatrix;
QMatrix4x4 qmat = QMatrix4x4(glmat[0][0],glmat[0][1],glmat[0][2],glmat[0][3],
glmat[1][0],glmat[1][1],glmat[1][2],glmat[1][3],
glmat[2][0],glmat[2][1],glmat[2][2],glmat[2][3],
glmat[3][0],glmat[3][1],glmat[3][2],glmat[3][3]);
shaderProgram.bind();
shaderProgram.setUniformValue("mvpMatrix", qmat);
I set up my projection matrix like so (fov = 30 degrees):
pMatrix = glm::perspective(glm::radians(fov), (float)width / (float)height, 0.001f, 10000.0f);
My matrices look like this at the time they are used (screenshots omitted). Before any changes, all values are at 0; when camX changes to 14 (note, I didn't rotate my camera around!), the scene stretches and shifts as described above.
glm::mat4x4 cameraTransformation;
cameraTransformation = glm::rotate(cameraTransformation, glm::radians(alpha)/*alpha*(float)M_PI/180*/, glm::vec3(0, 1 ,0));
cameraTransformation = glm::rotate(cameraTransformation, glm::radians(beta)/*beta*(float)M_PI/180*/, glm::vec3(1, 0, 0));
This can be simplified by using matrix multiplication and using a different glm call:
glm::mat4x4 cameraTransformation =
glm::rotate(glm::radians(alpha), glm::vec3(0,1,0)) *
glm::rotate(glm::radians(beta), glm::vec3(1,0,0));
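One caveat (an assumption about the GLM setup, not something from the original answer): the matrix-free glm::rotate(angle, axis) overload comes from the GTX transform extension, so it may need an explicit include:
#define GLM_ENABLE_EXPERIMENTAL  // newer GLM versions require this before including GTX headers
#include <glm/gtx/transform.hpp> // provides glm::rotate(float, vec3), glm::translate(vec3), glm::scale(vec3)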
Next:
glm::vec4 cameraPosition = (cameraTransformation * glm::vec4(camX, camY, distance, 0));
glm::vec4 cameraUpDirection = cameraTransformation * glm::vec4(0, 1, 0, 0);
Having a zero in the w component of a vector indicates that the vector is a direction, not a position. Yet you are obtaining a position vector as the output. This happens to work because cameraTransformation contains only rotations, no translations, but it's better to be clear:
glm::vec3 cameraPosition = glm::vec3(cameraTransformation * glm::vec4(camX, camY, distance, 1));
Note: I use a vec3, not a vec4, because I just like to do that.
For the next part you actually do want a direction vector and not a position vector, so you should have a zero in the w component. Still cast it to a vec3, because it's just clearer in my opinion.
glm::vec3 cameraUpDirection = glm::vec3(cameraTransformation * glm::vec4(0, 1, 0, 0));
Next:
vMatrix = glm::lookAt(glm::vec3(cameraPosition[0], cameraPosition[1], cameraPosition[2]),
                      glm::vec3(camX, camY, 0.0),
                      glm::vec3(cameraUpDirection[0], cameraUpDirection[1], cameraUpDirection[2]));
GLM lets you pass a vec4 to a vec3 constructor, so you can shorten your code like this:
vMatrix = glm::lookAt(glm::vec3(cameraPosition),
                      glm::vec3(camX, camY, 0.0),
                      glm::vec3(cameraUpDirection));
But we don't even need to do that, because I changed the variables to vec3s instead of vec4s:
vMatrix= glm::lookAt(cameraPosition, glm::vec3(camX, camY, 0.0), cameraUpDirection);
And finally, you can access the components of a GLM vector using .x, .y, .z, .w instead of the [] operator, which I imagine is much safer and easier to read.
I made a very stupid error!
In an attempt to convert my glm::mat4x4 to a QMatrix4x4, I accidentally swapped the rows and columns.
I needed to change:
QMatrix4x4 qmat = QMatrix4x4(glmat[0][0],glmat[0][1],glmat[0][2],glmat[0][3],
glmat[1][0],glmat[1][1],glmat[1][2],glmat[1][3],
glmat[2][0],glmat[2][1],glmat[2][2],glmat[2][3],
glmat[3][0],glmat[3][1],glmat[3][2],glmat[3][3]);
to:
QMatrix4x4 qmat = QMatrix4x4(glmat[0][0],glmat[1][0],glmat[2][0],glmat[3][0],
glmat[0][1],glmat[1][1],glmat[2][1],glmat[3][1],
glmat[0][2],glmat[1][2],glmat[2][2],glmat[3][2],
glmat[0][3],glmat[1][3],glmat[2][3],glmat[3][3]);
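An alternative to spelling out all sixteen elements (a sketch, assuming Qt and GLM's type_ptr header are available): QMatrix4x4 has a constructor that reads sixteen floats in row-major order, while GLM stores its data column-major, so you can construct from glm::value_ptr and transpose once:
#include <QMatrix4x4>
#include <glm/gtc/type_ptr.hpp>
// glm::value_ptr yields column-major data; QMatrix4x4(const float*) expects row-major,
// so transpose after constructing.
QMatrix4x4 qmat = QMatrix4x4(glm::value_ptr(glmat)).transposed();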

How exactly do I generate the View Matrix from a camera position and orientation?

I have a camera with a position and a quaternion for its orientation, and I have been trying to generate a view matrix from that.
I have tried
auto camera_rotation = glm::mat4_cast(camera.orientation);
auto camera_translation = glm::translate(glm::mat4(), camera.position);
auto viewMatrix = glm::inverse(camera_translation * camera_rotation);
Then I tried doing
auto camera_dir = camera.position + glm::rotate(camera.orientation, glm::vec3(1, 0, 0));
auto camera_up = glm::rotate(camera.orientation, glm::vec3(0, 1, 0));
auto viewMatrix = glm::lookAt(camera.position, camera_dir, camera_up);
but the camera_up is not always correct.
If I manually set the ViewMatrix to
glm::lookAt(camera.position, glm::vec3(0, 0, 0), glm::vec3(0, 1, 0));
Then I see what I expect to see, so I think the rest of my calculations are correct.
How do I do this properly?
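For what it's worth, the first attempt's glm::inverse(camera_translation * camera_rotation) can also be written without the explicit inverse, using the fact that a unit quaternion's inverse is its conjugate (a sketch, equivalent only when camera.orientation is normalized):
// inverse(T * R) = inverse(R) * inverse(T); for a unit quaternion the inverse rotation
// is the rotation by the conjugate, and inverse(T) is a translation by -position.
glm::mat4 viewMatrix = glm::mat4_cast(glm::conjugate(camera.orientation))
                     * glm::translate(glm::mat4(1.0f), -camera.position);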