C++ quaternion clarification

I'm working on a flight simulator. I've read a tutorial about quaternions (this one: http://www.opengl-tutorial.org/intermediate-tutorials/tutorial-17-quaternions/), so this is all very new to me.
From what I understand, a quaternion should rotate an object using a direction vector and make the object's orientation match that direction, and rotating the quaternion should not move the vector, but just make the object turn around itself. Am I right?
If so, this is my code:
void Plane::Update()
{
m_matrix = GLM_GTX_quaternion::toMat4( GLM_GTX_quaternion::angleAxis( radians( m_angle ), normalize( m_planeDirection ) ) );
}
When my plane model is pointing along the x vector, the plane rotates correctly around the x vector when the angle is equal to 0, but if I change the angle, it no longer rotates correctly. So how can I find the angle?

Yes, you are correct: a quaternion rotates an object around a direction vector. You should also use the glm::quat typedef when working with quaternions.
#include <glm/gtc/quaternion.hpp>
//...
glm::mat4 m = glm::mat4_cast(glm::angleAxis(glm::radians(m_angle), glm::normalize(m_planeDirection)));
The glm::rotate function also works with quaternions:
glm::mat4 m = glm::mat4_cast(glm::rotate(glm::quat(), glm::radians(45.0f), glm::normalize(glm::vec3(1.0f))));
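If the goal is to make the plane's nose follow m_planeDirection rather than just spin around it, one option is to build the quaternion directly from the two directions, so no explicit angle is needed. A minimal sketch, assuming the untransformed model points along +X (glm::rotation comes from <glm/gtx/quaternion.hpp>):
#include <glm/gtx/quaternion.hpp> // glm::rotation (shortest arc between two unit vectors)
// Sketch only: assumes the model faces +X when untransformed and
// that m_planeDirection is a non-zero world-space direction.
void Plane::Update()
{
    glm::vec3 modelForward(1.0f, 0.0f, 0.0f);
    glm::quat q = glm::rotation(modelForward, glm::normalize(m_planeDirection));
    m_matrix = glm::mat4_cast(q);
}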

Related

Arcball camera locked when parallel to up vector

I'm currently in the process of finishing the implementation for a camera that functions in the same way as the camera in Maya. The part I'm stuck on is the tumble functionality.
The problem is the following: the tumble feature works fine so long as the position of the camera is not parallel with the up vector (currently defined to be (0, 1, 0)). As soon as the camera becomes parallel with this vector (so it is looking straight up or down), the camera locks in place and will only rotate around the up vector instead of continuing to roll.
This question has already been asked here; unfortunately there is no actual solution to the problem. For reference, I also tried updating the up vector as I rotated the camera, but the resulting behaviour is not what I require (the view rolls as a result of the new orientation).
Here's the code for my camera:
using namespace glm;
// point is the position of the cursor in screen coordinates from GLFW
float deltaX = point.x - mImpl->lastPos.x;
float deltaY = point.y - mImpl->lastPos.y;
// Transform from screen coordinates into camera coordinates
Vector4 tumbleVector = Vector4(-deltaX, deltaY, 0, 0);
Matrix4 cameraMatrix = lookAt(mImpl->eye, mImpl->centre, mImpl->up);
Vector4 transformedTumble = inverse(cameraMatrix) * tumbleVector;
// Now compute the two vectors to determine the angle and axis of rotation.
Vector p1 = normalize(mImpl->eye - mImpl->centre);
Vector p2 = normalize((mImpl->eye + Vector(transformedTumble)) - mImpl->centre);
// Get the angle and axis
float theta = 0.1f * acos(dot(p1, p2));
Vector axis = cross(p1, p2);
// Rotate the eye.
mImpl->eye = Vector(rotate(Matrix4(1.0f), theta, axis) * Vector4(mImpl->eye, 0));
The vector library I'm using is GLM. Here's a quick reference on the custom types used here:
typedef glm::vec3 Vector;
typedef glm::vec4 Vector4;
typedef glm::mat4 Matrix4;
typedef glm::vec2 Point2;
mImpl is a PIMPL that contains the following members:
Vector eye, centre, up;
Point2 lastPoint;
Here is what I think: it has something to do with the gimbal lock that occurs with Euler angles (and thus spherical coordinates).
If you exceed your minimum (0, -zoom, 0) or maximum (0, zoom, 0), you have to toggle a boolean. This boolean tells you whether to treat deltaY as positive or not.
It could also just be caused by a singularity, so you could simply limit your polar angle to values between -89.99° and 89.99°.
Your problem could be solved like this.
So if your camera is exactly above (0, zoom, 0) or beneath (0, -zoom, 0) your object, then the camera only rolls.
(I am also assuming your object is at (0,0,0) and the up-vector is set to (0,1,0).)
There might be some mathematical trick to resolve this, I would do it with linear algebra though.
You need to introduce a new right-vector. You get it from a cross product: right-vector = up-vector x camera-vector. Imagine these vectors start at (0,0,0); then, to get your camera position, just do the subtraction (0,0,0) - (camera-vector).
So if you get some deltaX, you rotate towards the right-vector (around the up-vector) and update the right-vector.
Any influence of deltaX should not change your up-vector.
If you get some deltaY, you rotate towards the up-vector (around the right-vector) and update the up-vector. (This has no influence on the right-vector.)
At https://en.wikipedia.org/wiki/Rotation_matrix, under "Rotation matrix from axis and angle", you can find an important formula.
There, u is the vector you want to rotate around and theta is the amount you want to pivot. The size of theta is proportional to deltaX/deltaY.
For example: We got an input from deltaX, so we rotate around the up-vector.
up-vector:= (0,1,0)
right-vector:= (0,0,-1)
cam-vector:= (1,0,0)
theta:=-1*30° // -1 due to the positive mathematical direction of rotation
R={[cos(-30°),0,-sin(-30°)],[0,1,0],[sin(-30°),0,cos(-30°)]}
new-cam-vector=R*cam-vector // normal matrix multiplication
One thing is left to be done: Update the right-vector.
right-vector=camera-vector x up-vector .
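A rough GLM sketch of that scheme (assumptions: the centre is at the origin, the deltas are already scaled to radians, and the names tumble, eye, up, right are placeholders rather than anything from the original code):
#include <glm/glm.hpp>
#include <glm/gtx/rotate_vector.hpp> // glm::rotate(vec3, angle, axis)
// Sketch only: centre assumed at the origin, deltas in radians.
void tumble(glm::vec3& eye, glm::vec3& up, glm::vec3& right, float deltaX, float deltaY)
{
    // deltaX: rotate around the up-vector; the right-vector changes, the up-vector does not.
    eye   = glm::rotate(eye,   -deltaX, up);
    right = glm::rotate(right, -deltaX, up);
    // deltaY: rotate around the right-vector; the up-vector changes, the right-vector does not,
    // so the camera keeps rolling past the poles instead of locking.
    eye = glm::rotate(eye, -deltaY, right);
    up  = glm::rotate(up,  -deltaY, right);
}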

Determining angular velocity required to adjust orientation based on Quaternions

Problem:
I have an object in 3D space that exists at a given orientation. I need to reorient the object to a new orientation. I'm currently representing the orientations as quaternions, though this is not strictly necessary.
I essentially need to determine the angular velocity needed to orient the body into the desired orientation.
What I'm currently working with looks something like the following:
Psuedocode:
// 4x4 Matrix containing rotation and translation
Matrix4 currentTransform = GetTransform();
// Grab the 3x3 matrix containing orientation only
Matrix3 currentOrientMtx = currentTransform.Get3x3();
// Build a quat based on the rotation matrix
Quaternion currentOrientation(currentOrientMtx);
currentOrientation.Normalize();
// Build a new matrix describing our desired orientation
Vector3f zAxis = desiredForward;
Vector3f yAxis = desiredUp;
Vector3f xAxis = yAxis.Cross(zAxis);
Matrix3 desiredOrientMtx(xAxis, yAxis, zAxis);
// Build a quat from our desired roation matrix
Quaternion desiredOrientation(desiredOrientMtx);
desiredOrientation.Normalize();
// Slerp from our current orientation to the new orientation based on our turn rate and time delta
Quaternion slerpedQuat = currentOrientation.Slerp(desiredOrientation, turnRate * deltaTime);
// Determine the axis and angle of rotation
Vector3f rotationAxis = slerpedQuat.GetAxis();
float rotationAngle = slerpedQuat.GetAngle();
// Determine angular displacement and angular velocity
Vector3f angularDisplacement = rotationAxis * rotationAngle;
Vector3f angularVelocity = angularDisplacement / deltaTime;
SetAngularVelocity(angularVelocity);
This essentially just sends my object spinning to oblivion. I have verified that the desiredOrientMtx I constructed via the axes is indeed the correct final rotation transformation. I feel like I'm missing something silly here.
Thoughts?
To calculate angular velocity, your turnRate already provides the magnitude (rads/sec), so all you really need is the axis of rotation. That is just given by GetAxis( B * Inverse(A) ). GetAngle of that same quantity would give the total angle to travel between the two. See 'Difference' between two quaternions for further explanation.
SetAngularVelocity( Normalize( GetAxis( B * Inverse(A)) ) * turnRate )
You need to set the angular velocity to 0 at some point (when you reach your goal orientation). One way to do this is by using a quaternion distance. Another simpler way is by checking against the amount of time taken. Finally, you can check the angle between two quats (as discussed above) and check if that is close to 0.
float totalAngle = GetAngle( Normalize( endingPose * Inverse( startingPose ) ) );
if( fabs( totalAngle ) > 0.0001 ) // some epsilon
{
// your setting angular velocity code here
SetAngularVelocity(angularVelocity);
}
else
{
SetAngularVelocity( Vector3f(0) );
// Maybe, if you want high accuracy, call SetTransform here too
}
But, really, I don't see why you don't just use the Slerp to its fullest. Instead of relying on the physics integrator (which can be imprecise) and relying on knowing when you've reached your destination (which is somewhat awkward), you could just move the object frame-by-frame since you know the motion.
Quaternion startingPose;
Quaternion endingPose;
// As discussed earlier...
float totalAngle = GetAngle( Normalize( endingPose * Inverse( startingPose ) ) );
// t is set to 0 whenever you start a new motion
t += deltaT;
float howFarIn = (turnRate * t) / totalAngle;
SetCurrentTransform( startingPose.Slerp( endingPose, howFarIn ) );
See Smooth rotation with quaternions for some discussion on that.
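For completeness, the angular-velocity approach above could be sketched with GLM quaternions like this (a and b, AngularVelocityToward, and the epsilon are placeholder names/values, not from the original code; turnRate is in radians per second):
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>
// Sketch only: a = current orientation, b = desired orientation.
glm::vec3 AngularVelocityToward(const glm::quat& a, const glm::quat& b, float turnRate)
{
    glm::quat delta = glm::normalize(b * glm::inverse(a)); // rotation taking a onto b
    float remaining = glm::angle(delta);                   // total angle left to travel
    if (remaining < 1e-4f)                                 // close enough: stop turning
        return glm::vec3(0.0f);
    return glm::normalize(glm::axis(delta)) * turnRate;    // constant-rate spin about the difference axis
}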

Quaternion rotation issue, object axes don't rotate with object

Let's say I have an object, and that object has a quaternion representing its orientation.
Currently, I can rotate on all 3 axes without gimbal lock; however, each rotation on any axis SHOULD rotate the other 2 axes of rotation.
What I mean by this is if I pitch an object towards the camera, but then yaw the object 90 degrees away, pitching the object will still rotate it relative to where it was before, not where it is now.
Here's a visual example of my problem:
I'm using quaternions and not Euler rotations because these objects can rotate on all 3 axes, and I don't want to hit gimbal lock or a singularity.
I rotate my object's orientation quaternion like this:
orientation = Quaternion(Vector(0,1,0),angle) * orientation;
I then trigger a rebuilding of my object's vectors, and apply it to them (after transforming the object relative to 3D space origin point):
Quaternion Point = Quaternion(( orientation * (Vertex) ) * (orientation.inverse()));
vertices[x] = QVector3D(round(Point.v.x),round(Point.v.y),round(Point.v.z));
And when I multiply my quaternions by other quaternions, this is the multiplication operator's function:
Quaternion Quaternion::operator*(const Quaternion& q) const
{
Quaternion r;
//"w" is the angle, "v" is the vector
r.w = w*q.w - glm::dot(v, q.v);
r.v = v*q.w + q.v*w + glm::cross(v,q.v);
//r.normalize();
return r;
}
I swear to god, I'm only posting because this topic is confusing me to no end. Solidifying this system will make me unfathomably happy.

Quaternion-Based-Camera unwanted roll

I'm trying to implement a quaternion-based camera, but when moving around the X and Y axes, the camera produces an unwanted roll on the Z axis. I want to be able to look around freely on all axes.
I've read other topics about this problem (for example: http://www.flipcode.com/forums/thread/6525 ), but I'm not getting what is meant by "Fix this by continuously rebuilding the rotation matrix by rotating around the WORLD axis, i.e around <1,0,0>, <0,1,0>, <0,0,1>, not your local coordinates, whatever they might be."
//Camera.hpp
glm::quat rotation;
//Camera.cpp
void Camera::rotate(glm::vec3 vec)
{
glm::quat paramQuat = glm::quat(vec);
rotation = paramQuat * rotation;
}
I call the rotate function like this:
cam->rotate(glm::vec3(0, 0.5, 0));
This must have to do with local/world coordinates, right? I'm just not getting it, since I thought quaternions are always based on each other (thus a quaternion can't be in "world" or "local" space?).
Also, should I use more than one quaternion for a camera?
As far as I understand it, and from looking at the code you provided, what they mean is that you shouldn't store and apply the rotation incrementally by applying rotate on the rotation quat all the time. Instead, keep track of one quaternion per axis (X and Y in world space) and calculate the rotation as the product of those.
[edit: some added (pseudo)code]
// Camera.cpp
void Camera::SetRotation(glm::quat value)
{
rotation = value;
}
// controller.cpp --> probably a place where you'd want to translate user input and store your rotational state
xAngle += deltaX;
yAngle += deltaY;
glm::quat rotationX = QuatAxisAngle(X_AXIS, xAngle);
glm::quat rotationY = QuatAxisAngle(Y_AXIS, yAngle);
camera.SetRotation(rotationX * rotationY);
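With GLM, that pseudocode might look roughly like the following (a sketch: glm::angleAxis stands in for QuatAxisAngle, and the angles are assumed to be in radians):
#include <glm/gtc/quaternion.hpp>
// Sketch only: xAngle and yAngle accumulate user input, in radians.
xAngle += deltaX;
yAngle += deltaY;
glm::quat rotationX = glm::angleAxis(xAngle, glm::vec3(1.0f, 0.0f, 0.0f)); // around world X
glm::quat rotationY = glm::angleAxis(yAngle, glm::vec3(0.0f, 1.0f, 0.0f)); // around world Y
camera.SetRotation(rotationX * rotationY); // rebuilt from the absolute angles, so no roll accumulates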

Calculating the Angle Between a 3D Object and a Point

I have a 3D object in DirectX11 with a position vector and a rotation angle for the direction it's facing (it only rotates around the Y axis).
D3DXVECTOR3 m_position;
float m_angle;
If I then wanted to rotate the object to face something, I'd need to find the angle between the direction it's facing and the direction it needs to face, using the dot product of the two normalised vectors.
The thing I'm having a problem with is how I find the direction the object is currently facing with just its position and the angle. What I currently have is:
D3DXVECTOR3 normDirection, normTarget;
D3DXVec3Normalize( &normDirection, ????);
D3DXVec3Normalize( &normTarget, &(m_position-target));
// I store the values in degrees because I prefer it
float angleToRotate = D3DXToDegree( acos( D3DXVec3Dot( &normDirection, &normTarget)));
Does anyone know how I get the vector for the current direction it's facing from the values I have, or do I need to re-write it so I keep track of the object's direction vector?
EDIT: Changed 'cos' to 'acos'.
SOLUTION (with the assistance of user2802841):
// assuming these are member variables
D3DXVECTOR3 m_position;
D3DXVECTOR3 m_rotation;
D3DXVECTOR3 m_direction;
// and these are local variables
D3DXVECTOR3 target; // passed in as a parameter
D3DXVECTOR3 targetNorm;
D3DXVECTOR3 upVector;
float angleToRotate;
// find the amount to rotate by
D3DXVec3Normalize( &targetNorm, &(target-m_position));
angleToRotate = D3DXToDegree( acos( D3DXVec3Dot( &targetNorm, &m_direction)));
// calculate the up vector between the two vectors
D3DXVec3Cross( &upVector, &m_direction, &targetNorm);
// switch the angle to negative if the up vector is going down
if( upVector.y < 0)
angleToRotate *= -1;
// add the rotation to the object's overall rotation
m_rotation.y += angleToRotate;
If you store the orientation as an angle, then you must assume some kind of default orientation when the angle is 0. Express that default orientation as a vector (like (1.0, 0.0, 0.0) if you assume the +X direction to be the default), then rotate that vector by your angle (either directly with sin/cos or by using a rotation matrix created by one of the D3DXMatrixRotation* functions), and the resulting vector will be your current direction.
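A rough sketch of the matrix variant with D3DX (assuming, as you say later, that 0 degrees faces (0, 0, 1), and that m_angle is stored in degrees as in your code):
// Sketch only: object faces +Z when m_angle == 0, m_angle in degrees.
D3DXVECTOR3 defaultForward(0.0f, 0.0f, 1.0f);
D3DXMATRIX rotY;
D3DXMatrixRotationY(&rotY, D3DXToRadian(m_angle));               // rotation around the Y axis
D3DXVec3TransformNormal(&normDirection, &defaultForward, &rotY); // current facing direction
D3DXVec3Normalize(&normDirection, &normDirection);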
EDIT:
In answer to your comment, you can determine the rotation direction like this, which in your case translates to something like (untested):
D3DXVECTOR3 cross;
D3DXVec3Cross(&cross, &normDirection, &normTarget);
float dot = D3DXVec3Dot(&cross, &normal); // D3DXVec3Dot returns the dot product directly
if (dot < 0) {
// angle is negative
} else {
// angle is positive
}
Where normal is most likely a (0, 1, 0) vector (because you said your objects rotate around Y axis).
If, as you say, 0 degrees == (0,0,1), you can use:
normDirection = ( sin( angle ), 0, cos( angle ) )
because the rotation is always around the y axis.
You will have to change the sign of sin(angle) depending on your system (left handed or right handed).
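As a sketch, with m_angle stored in degrees as in your code (flip the sign of the sin term if it turns the wrong way for your handedness):
// Sketch only: m_angle in degrees, 0 degrees facing (0, 0, 1).
float rad = D3DXToRadian(m_angle);
D3DXVECTOR3 normDirection(sinf(rad), 0.0f, cosf(rad)); // already unit length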