C++ OpenGL how to get rotations x,y,z from quaternion - c++

I have been handling object rotations in my engine by storing each object's x, y and z rotation and then, when I am about to render, building the transformation matrix like this.
// entity.getRotation() returns a glm::vec3; these values
// are used to rotate the object.
glm::mat4 model(1.0f); // start from the identity matrix
model = glm::translate(model, entity.getPosition());
model = glm::rotate(model, glm::radians(entity.getRotation().x), glm::vec3(1.0f, 0.0f, 0.0f));
model = glm::rotate(model, glm::radians(entity.getRotation().y), glm::vec3(0.0f, 1.0f, 0.0f));
model = glm::rotate(model, glm::radians(entity.getRotation().z), glm::vec3(0.0f, 0.0f, 1.0f));
model = glm::scale(model, glm::vec3(entity.getScale(), entity.getScale(), entity.getScale()));
Now, I have implemented an AntTweakBar GUI in my engine which has an option to rotate objects from the GUI. I am currently trying to make it possible to rotate the objects in the GUI and see the resulting rotations in the engine. The problem is that the GUI works with quaternions, while in my engine object rotation is stored as three floats for the x, y and z rotation amounts.
My question is: how can I take the quaternion and turn it into x, y, z rotations so that I can use my above-mentioned method to create the transformation matrix?
I found this method in glm
glm::eulerAngles(glm::quat(q[0], q[1], q[2], q[3]));
but upon looking at the returned vec3, it does not seem to be what I am looking for. I believe eulerAngles returns pitch, yaw and roll, which behave incorrectly when I try to use these values to create my transformation matrix.
Edit:
I found my mistake. It turned out that my old solution was fine (although, as someone pointed out in the comments, using quats might be faster). The problem was in my conversion from the float-array representation of the quat (the representation AntTweakBar uses) to angles. It seems that AntTweakBar stores the x, y, z, w components of the quat in a different order in the float array. The correct order in the float array is y, z, w, x, but I have no clue why this is.
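For reference, a sketch of the conversion based on the documented layouts (AntTweakBar's TW_TYPE_QUAT4F stores the components as x, y, z, w, while glm::quat's constructor takes w, x, y, z); the exact swizzle may still differ between versions, as noted above:
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// Sketch: build a glm::quat from an AntTweakBar quaternion float array,
// assuming the documented layouts (array is x, y, z, w; constructor is w, x, y, z).
glm::quat fromTweakBar(const float q[4])
{
    return glm::quat(q[3], q[0], q[1], q[2]);
}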

With GLM there are a couple of options, one of which you have already mentioned; however, make sure that your measure of angles is consistent. Past a specific version, most of GLM's library methods expect all angles passed into their rotation functions to be in radians. However, according to the docs, when working with quaternions glm::eulerAngles() returns the angles pitch as X, yaw as Y and roll as Z in degrees, unless GLM_FORCE_RADIANS is defined.
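For instance, here is a minimal sketch of feeding the result into a degree-based pipeline like the one above, assuming a GLM build that works in radians (recent versions, or GLM_FORCE_RADIANS defined); quatToEulerDegrees is just a hypothetical helper name:
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// Sketch: convert a GUI quaternion into the degree-based Euler angles the
// engine above stores, assuming eulerAngles() returns radians on this build.
glm::vec3 quatToEulerDegrees(const glm::quat& q)
{
    glm::vec3 eulerRad = glm::eulerAngles(q); // pitch (X), yaw (Y), roll (Z)
    return glm::degrees(eulerRad);            // ready for the glm::radians() calls above
}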
Another alternative would be to use these two methods:
glm::mat4_cast
GLM_FUNC_DECL tmat4x4<T, P> glm::mat4_cast(tquat<T, P> const& x)
Converts a quaternion to a 4 * 4 matrix. (See also GLM_GTC_quaternion; referenced by glm::toMat4().)
glm::quat_cast
GLM_FUNC_DECL tquat<T, P> glm::quat_cast(tmat4x4<T, P> const& x)
Converts a 4 * 4 matrix to a quaternion. (See also GLM_GTC_quaternion.)
If your GUI is using quaternions, you can retrieve that information and save it to a glm::quat; from there you can use one of these functions to convert it to a 4x4 matrix. There are also mat3-to-quat and quat-to-mat3 versions of these conversion functions.
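For example, here is a sketch of building the model matrix directly from a quaternion with mat4_cast; the parameter names are placeholders standing in for whatever your entity exposes:
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
#include <glm/gtc/quaternion.hpp>

// Sketch: build the model matrix from a quaternion instead of three axis rotations.
glm::mat4 buildModelMatrix(const glm::vec3& position, const glm::quat& rotation, float scale)
{
    glm::mat4 model(1.0f);
    model = glm::translate(model, position);
    model = model * glm::mat4_cast(rotation); // quaternion -> 4x4 rotation matrix
    model = glm::scale(model, glm::vec3(scale));
    return model;
}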

Related

how can I rotate an object around a point other than the origin point with glm::rotate?

I am trying to make my world rotate around my camera no matter where my camera is. I am not doing any crazy math yet; I am leaving middle school this year and I don't know what quaternions are. My problem is that every time I use the glm::rotate function it only allows me to rotate around an axis through the origin, and I can't find a way to fix this. If there is any somewhat simple answer to this problem, please let me know how I can rotate my world around any given point. Thanks.
glm::mat4 look(1.0f);
float Rrotation;
Rrotation = 20.0f;
glm::vec3 the_axis_not_orientation(0.0f, 1.0f, 0.0f);
look = glm::rotate(look, Rrotation, the_axis_not_orientation);
What you actually do is to rotate the model:
model_view = look * rotate
If you want to rotate the view, then you have to swap the order of the matrices. Note that matrix multiplication is not commutative:
model_view = rotate * look
For your code that means:
glm::mat4 rotate = glm::rotate(glm::mat4(1.0f), Rrotation, the_axis_not_orientation);
look = rotate * look;
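If the goal is specifically to rotate around a point other than the origin, the usual pattern is to translate the pivot to the origin, rotate, and translate back. A sketch, assuming the pivot point and an angle in radians as inputs:
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// Sketch: rotate around an arbitrary pivot point instead of the origin.
glm::mat4 rotateAroundPoint(float angleRadians, const glm::vec3& axis, const glm::vec3& pivot)
{
    glm::mat4 m(1.0f);
    m = glm::translate(m, pivot);           // 3) move the pivot back to its place
    m = glm::rotate(m, angleRadians, axis); // 2) rotate around the origin
    m = glm::translate(m, -pivot);          // 1) move the pivot to the origin
    return m;
}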

How to properly rotate and scale an object in OpenGL?

I'm trying to build the model matrix every frame; for that I'm creating a translation, rotation and scale matrix and multiplying them together. But I can't seem to figure out how to build the rotation matrix and scale it properly.
Here's what I'm doing:
glm::mat4 scale = glm::scale(mat4(1.0f), my_models[i].myscale);
glm::mat4 rotateM(1.0);
glm::mat4 translate = glm::translate(mat4(1.0f), my_models[i].initialPos);
rotateM = mat4_cast(my_models[i].Quat);
rotateM = glm::rotate(rotateM, (float) my_models[i].angle * t, my_models[i].animation_axis[0]);
my_models[i].modelMatrix = translate * rotateM * scale;
my_models[i].Quat = quat_cast(my_models[i].modelMatrix);
In the Constructor I'm using:
quat Quat = glm::angleAxis(glm::radians(90.f), glm::vec3(0.f, 1.f, 0.f));
If my_models[i].myscale is exactly 1.0f it rotates just fine, but if it is bigger, the object keeps growing and rotates weirdly. Quaternions are very new to me, so I'm assuming I'm messing up there.
What am I missing? Are there simpler ways to construct the model's rotation matrix? If so, what information should I be saving?
Edit:
As jparima suggested, the following fixed my problem.
glm::mat4 scale = glm::scale(mat4(1.0f), my_models[i].myscale);
glm::mat4 rotateM(1.0);
glm::mat4 translate = glm::translate(mat4(1.0f), my_models[i].initialPos);
my_models[i].Quat = rotate(my_models[i].Quat, my_models[i].angle * t, my_models[i].animation_axis[0]);
rotateM = mat4_cast(my_models[i].Quat);
my_models[i].modelMatrix = translate * rotateM * scale;
From the GLM quaternion.hpp documentation for quat_cast, it says:
Converts a pure rotation 4 * 4 matrix to a quaternion.
You are passing the whole model matrix there, which also contains scale and translation. In fact, I don't know why you are converting a matrix to a quaternion at all. You could therefore remove the last line (the quat_cast) and update the quaternion directly when you want to apply rotations.

GLM::Rotate seems to cause wrong rotation?

Simply put, I'm learning OpenGL and am starting to learn about transform matrices. Below is my current code:
glm::vec4 myPosition(1.0f, 1.0f, 1.0f, 1.0f);
glm::vec3 rotationAxis(0.0f, 1.0f, 0.0f);
glm::mat4 scalar = glm::scale(glm::vec3(1.0f, 1.0f, 1.0f));
glm::mat4 rotator = glm::rotate(360.0f, rotationAxis);
glm::mat4 translator = glm::translate(glm::vec3(1.0f, 1.0f, 1.0f));
glm::mat4 transform = translator * rotator * scalar;
myPosition = transform * myPosition;
As far as I can tell, I'm doing this in the correct order: Scale -> Rotate -> Translate. I have set the scale to do nothing because I don't actually want it to scale anything (for simplicity's sake).
Next, I set the rotation to 360.0f around (correct me if I'm wrong) the Y axis. This should bring the point back to where it started, at least that's what I would expect from a 360-degree rotation around a single axis.
Then, I set it to translate 1 unit in every direction to make sure it moves.
After finishing all this, I commented out the rotator line and it works fantastically, even if I change the scale. However, whenever I add the rotate line back in, the final position is not what a 360-degree rotation should produce.
I have configured my program to output the position vector both before transforms and after.
The before position is (1, 1, 1)
The after position is (1.67522, 2, -0.242607).
I have been struggling to find my error, literally all day so if anyone can help me find what I'm doing wrong, it would be greatly appreciated!!
According to the documentation at http://glm.g-truc.net/0.9.7/api/a00236.html (for the latest released version right now), glm::rotate(..) takes in an angle expressed in degrees.
However, changing your rotation matrix line
glm::mat4 rotator = glm::rotate(360.0f, rotationAxis);
to
glm::mat4 rotator = glm::rotate(3.141592f * 2.0f, rotationAxis);
which is just 2*PI, fixes this.
This means that the angle should be in radians rather than in degrees. Tested on my machine with GLM 0.9.7.1-1. This is either a mistake in the documentation or in the GLM code itself.
From my experience with GLM (some time ago, possibly an earlier version), these kinds of functions took an angle in degrees by default, and the #define GLM_FORCE_RADIANS macro is what made them work in radians. It is possible that the author made radians the default behaviour and forgot to update the docs.
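Either way, you can keep authoring angles in degrees and convert explicitly; a small sketch using the matrix-transform overload (exact behaviour may vary by GLM version):
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// Sketch: author the angle in degrees and convert explicitly to radians.
glm::mat4 makeRotator()
{
    glm::vec3 rotationAxis(0.0f, 1.0f, 0.0f);
    return glm::rotate(glm::mat4(1.0f), glm::radians(360.0f), rotationAxis);
}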
On a side note, you should probably not be using scalar as the name for a glm::mat4 value, since a scalar in mathematics is just a single real number rather than a matrix: https://en.wikipedia.org/wiki/Scalar_%28mathematics%29

How to correctly represent 3D rotation in games

In most 3D platform games, only rotation around the Y axis is needed since the player is always positioned upright.
However, for a 3D space game where the player needs to be rotated around all axes, what is the best way to represent the rotation?
I first tried using Euler angles:
glRotatef(anglex, 1.0f, 0.0f, 0.0f);
glRotatef(angley, 0.0f, 1.0f, 0.0f);
glRotatef(anglez, 0.0f, 0.0f, 1.0f);
The problem I had with this approach is that after each rotation the axes change. For example, when anglex and angley are 0, anglez rotates the ship around its wings; however, if anglex or angley are non-zero, this is no longer true. I want anglez to always rotate around the wings, regardless of anglex and angley.
I read that quaternions can be used to achieve this desired behavior, however I was unable to do so in practice.
I assume my issue is due to the fact that I am basically still using Euler angles, but converting the rotation to its quaternion representation before usage.
struct quaternion q = eulerToQuaternion(anglex, angley, anglez);
struct matrix m = quaternionToMatrix(q);
glMultMatrix(&m);
However, if storing each X, Y, and Z angle directly is incorrect, how do I say "Rotate the ship around the wings (or any consistent axis) by 1 degree" when my rotation is stored as a quaternion?
Additionally, I want to be able to translate the model at the angle that it is rotated by. Say I have just a quaternion with q.x, q.y, q.z, and q.w, how can I move it?
Quaternions are a very good way to represent rotations, because they are efficient, but I prefer to represent the full state (position and orientation) with 4x4 matrices.
So, imagine you have a 4x4 matrix for every object in the scene. Initially, when the object is unrotated and untranslated, this matrix is the identity matrix; this is what I will call the "original state". Suppose, for instance, the nose of your ship points towards -z in its original state, so a rotation matrix that spins the ship around the z axis is:
Matrix4 around_z(float angle) { // angle in radians
    float c = cos(angle);
    float s = sin(angle);
    return Matrix4(c, -s, 0, 0,
                   s,  c, 0, 0,
                   0,  0, 1, 0,
                   0,  0, 0, 1);
}
Now, if your ship is anywhere in space and rotated in any direction, let's call this state t. If you want to spin the ship around the z axis by some angle as if it were in its "original state", it would be:
t = t * around_z(angle);
And when drawing with OpenGL, t is the matrix you multiply every vertex of that ship by. This assumes you are using column vectors (as OpenGL does); be aware that matrices in OpenGL are stored column-major.
Basically, your problem seems to be with the order in which you are applying your rotations. See, quaternion and matrix multiplication are non-commutative. So, if instead you write:
t = around_z(angle) * t;
You will have the around_z rotation applied not to the "original state" z axis, but to the global coordinate z axis, with the ship already affected by the initial transformation (rotated and translated). The same thing happens when you call the glRotate and glTranslate functions: the order in which they are called matters.
Being a little more specific for your problem: you have the absolute translation trans, and the rotation around its center rot. You would update each object in your scene with something like:
void update(quaternion delta_rot, vector delta_trans) {
    rot = rot * delta_rot;
    trans = trans + rot.apply(delta_trans);
}
Where delta_rot and delta_trans are both expressed in coordinates relative to the original state, so, if you want to propel your ship forward 0.5 units, your delta_trans would be (0, 0, -0.5). To draw, it would be something like:
void draw() {
    // Apply the absolute translation first
    glLoadIdentity();
    glTranslatef(trans.x, trans.y, trans.z);
    // Apply the absolute rotation last
    struct matrix m = quaternionToMatrix(rot);
    glMultMatrix(&m);
    // This sequence is equivalent to:
    // final_vertex_position = translation_matrix * rotation_matrix * vertex;
    // ... draw stuff
}
I chose the order of the calls by reading the manuals for glTranslate and glMultMatrix, to guarantee the order in which the transformations are applied.
About rot.apply()
As explained in the Wikipedia article Quaternions and spatial rotation, to apply a rotation described by a quaternion q to a vector p, you compute rp = q * p * q^(-1), where rp is the rotated vector. If you have a working quaternion library in your game, you should either already have this operation implemented, or you should implement it now, because this is the core of using quaternions as rotations.
For instance, if you have a quaternion that describes a rotation of 90° around (0,0,1) and you apply it to (1,0,0), you get the vector (0,1,0), i.e. the original vector rotated by the quaternion. This is equivalent to converting your quaternion to a matrix and doing a matrix-by-column-vector multiplication (by matrix multiplication rules, it yields another column vector, the rotated vector).
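If your quaternion library happens to be GLM, this operation is already provided via the quat * vec3 operator; a sketch of the 90° example above:
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// Sketch of the example above: 90 degrees around (0,0,1) applied to (1,0,0).
// GLM's operator* computes q * v * q^(-1) for you.
glm::vec3 rotateExample()
{
    glm::quat q = glm::angleAxis(glm::radians(90.0f), glm::vec3(0.0f, 0.0f, 1.0f));
    return q * glm::vec3(1.0f, 0.0f, 0.0f); // approximately (0, 1, 0)
}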

Am I computing the attributes of my frustum properly?

I have a basic camera class, of which has the following notable functions:
// Get near and far plane dimensions in view space coordinates.
float GetNearWindowWidth()const;
float GetNearWindowHeight()const;
float GetFarWindowWidth()const;
float GetFarWindowHeight()const;
// Set frustum.
void SetLens(float fovY, float aspect, float zn, float zf);
Where the params zn and zf in the SetLens function correspond to the near and far clip plane distance, respectively.
SetLens basically creates a perspective projection matrix, along with computing the heights of both the near and far clip planes:
void Camera::SetLens(float fovY, float aspect, float zn, float zf)
{
    // cache properties
    mFovY = fovY;
    mAspect = aspect;
    mNearZ = zn;
    mFarZ = zf;

    float tanHalfFovy = tanf(0.5f * glm::radians(fovY));
    mNearWindowHeight = 2.0f * mNearZ * tanHalfFovy;
    mFarWindowHeight  = 2.0f * mFarZ * tanHalfFovy;

    mProj = glm::perspective(fovY, aspect, zn, zf);
}
So, GetFarWindowHeight() and GetNearWindowHeight() naturally return their respective height class member values. Their width counterparts, however, return the respective height value multiplied by the view aspect ratio. So, for GetNearWindowWidth():
float Camera::GetNearWindowWidth() const
{
    return mAspect * mNearWindowHeight;
}
Where GetFarWindowWidth() performs the same computation, of course replacing mNearWindowHeight with mFarWindowHeight.
Now that that's all out of the way, something tells me that I'm computing the height and width of the near and far clip planes improperly. In particular, I think what causes this confusion is the fact that I'm specifying the field of view about the y axis in degrees and then converting it to radians in the tangent function. Where I think this is causing problems is in my frustum culling function, which uses the width and height of the near and far planes to obtain points for the top, right, left and bottom planes as well.
So, am I correct in that I'm doing this completely wrong? If so, what should I do to fix it?
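(For reference, a typical way a culling routine derives the near-plane corner points from these dimensions is sketched below; the parameter names are placeholders, not the actual code in question.)
#include <glm/glm.hpp>

// Sketch: near-plane corners from the camera basis and the near window size.
void NearPlaneCorners(const glm::vec3& pos, const glm::vec3& look,
                      const glm::vec3& up, const glm::vec3& right,
                      float nearZ, float nearWidth, float nearHeight,
                      glm::vec3 corners[4])
{
    glm::vec3 center    = pos + look * nearZ;
    glm::vec3 halfUp    = up * (nearHeight * 0.5f);
    glm::vec3 halfRight = right * (nearWidth * 0.5f);
    corners[0] = center + halfUp - halfRight; // top-left
    corners[1] = center + halfUp + halfRight; // top-right
    corners[2] = center - halfUp - halfRight; // bottom-left
    corners[3] = center - halfUp + halfRight; // bottom-right
}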
Disclaimer
This code originally stems from a D3D11 book, which I decided to quit reading and move back to OpenGL. In order to make the process less painful, I figured converting some of the original code to be more OpenGL compliant would be nice. So far, it's worked fairly well, with this one minor issue...
Edit
I should have originally mentioned a few things:
This is not my first time with OpenGL; I'm well aware of the transformation processes, as well as the coordinate-system differences between GL and D3D.
This isn't my entire camera class, although the only other thing which I think may be questionable in this context is using my camera's mOrientation matrix to compute the right, up and look direction vectors, by transforming the +x, +y and -z basis vectors, respectively. So, as an example, to compute my look vector I would do: mOrientation * vec4(0.0f, 0.0f, -1.0f, 1.0f), and then convert that to a vec3. The context that I'm referring to here involves how these basis vectors would be used in conjunction with culling the frustum.
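For completeness, that basis extraction would look roughly like the sketch below, assuming mOrientation is a pure rotation glm::mat4 (so the w component does not affect the xyz result):
#include <glm/glm.hpp>

// Sketch: derive the camera basis vectors from an orientation matrix.
void CameraBasis(const glm::mat4& mOrientation,
                 glm::vec3& right, glm::vec3& up, glm::vec3& look)
{
    right = glm::vec3(mOrientation * glm::vec4(1.0f, 0.0f,  0.0f, 1.0f));
    up    = glm::vec3(mOrientation * glm::vec4(0.0f, 1.0f,  0.0f, 1.0f));
    look  = glm::vec3(mOrientation * glm::vec4(0.0f, 0.0f, -1.0f, 1.0f));
}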