Rotate rigidBody on its edge? - c++

I'm creating a simple game that involves swinging a club, using the Bullet physics library in C++. However, I'm having trouble rotating the club (a rigid body) the way I want.
For reference, here is the club structure:
struct Club {
    btRigidBody *btPhys;
    btScalar rotateAmount;
    btVector3 axis;
};
Here is how I rotate the club:
//Create a rotation matrix from the club's world transform
btMatrix3x3 orn = club.btPhys->getWorldTransform().getBasis();
//Apply a rotation to the matrix
orn *= btMatrix3x3(btQuaternion(btVector3(0, 0, 1), btScalar(degreesToRads(club.rotateAmount))));
//Set the rotation of the club
club.btPhys->getWorldTransform().setBasis(orn);
However, this just rotates the rigid body around the center of the club. I want the rigid body to be rotated near its edge (the club's handle).
So I added a transform to move the club upward above the pivot point:
(this code follows directly after the previous code)
btTransform trans;
//Get the club's transform
club.btPhys->getMotionState()->getWorldTransform(trans);
float x, y, z;
x = trans.getOrigin().getX();
y = trans.getOrigin().getY();
z = trans.getOrigin().getZ();
//Set the transform to be 2 units above the club's origin (the club is 4 units long)
trans.setOrigin(btVector3(btScalar(x), btScalar(y + 2.0f), btScalar(z)));
//Apply the transform to the club
club.btPhys->getMotionState()->setWorldTransform(trans);
However, this doesn't change the pivot point; it just moves the club above the pivot point (creating an unwanted gap).
In short, the rotation matrix just rotates the rigid body around its center. I just want to move the pivot point from the center of the rigid body to its edge.

setWorldTransform on the motion state always transforms the body around its center of mass.
There are two ways you can solve your problem. If you're using btDefaultMotionState as your motion state, you can specify the offset of the center of mass in its constructor; this way the body will rotate around another point. There is, however, a serious side effect of this approach: all physical interaction with the object will treat the new pivot point as the center of mass. If your body is kinematic, that is perfectly fine, but if it is dynamic, the effects may differ from what you want.
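A minimal sketch of this first approach, reusing the question's setup (the club is 4 units long, so the handle is assumed to be 2 units from the center along local Y; the sign of the offset may need flipping depending on how your graphics origin relates to the pivot):
btTransform startTransform;
startTransform.setIdentity();
//Center-of-mass offset: shifts the pivot away from the graphics origin
btTransform comOffset;
comOffset.setIdentity();
comOffset.setOrigin(btVector3(0, 2.0f, 0));
//Pass the offset as the second constructor argument
btDefaultMotionState *motionState = new btDefaultMotionState(startTransform, comOffset);
//...then hand motionState to btRigidBodyConstructionInfo as usual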
Another, probably safer, approach is to stack 3 transformations together to achieve the rotation you're looking for: first, translate the body so the desired pivot is at the origin; next, rotate it; finally, translate it back to where the center of mass should be.
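A minimal sketch of this translate-rotate-translate approach, reusing the names from the question (club, degreesToRads) and assuming the handle lies 2 units below the club's center along its local Y axis:
btTransform trans = club.btPhys->getWorldTransform();
//World-space position of the desired pivot (the handle end)
btVector3 pivot = trans * btVector3(0, -2.0f, 0);
//The rotation to apply, about the world Z axis
btQuaternion rot(btVector3(0, 0, 1), btScalar(degreesToRads(club.rotateAmount)));
//1) move the pivot to the origin, 2) rotate, 3) move the pivot back
btTransform toOrigin(btQuaternion::getIdentity(), -pivot);
btTransform rotation(rot, btVector3(0, 0, 0));
btTransform back(btQuaternion::getIdentity(), pivot);
club.btPhys->setWorldTransform(back * rotation * toOrigin * trans);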

Related

How to rotate a cube by its center

I am trying to rotate a "cube" full of little cubes using the keyboard, which works, but not very well.
I am struggling with setting the pivot point of the rotation to the very center of the big "cube" / world. As you can see in this video, the center of the front (initial) face of the big cube is currently the pivot point for my rotation, which is a bit confusing once I rotate the world a little.
To explain it better: it looks like I am moving the initial face of the cube when using the keys to rotate it. So the pivot point might be okay from that point of view, but what is wrong in my code? I don't understand why it rotates around the front face instead of around the very center of the entire cube.
To generate all the little cubes, I call a function inside 3 nested for loops (x, y, z); the function returns cubeMat, so I have all the cubes generated, as you can see in the video.
cubeMat = scale(cubeMat, {0.1f, 0.1f, 0.1f});
cubeMat = translate(cubeMat, {positioning...);
For the rotation itself, a short example of rotating to the left looks like this:
mat4 total_rotation; //global variable - never reset
mat4 rotation;       //local variable
if (keysPressed[GLFW_KEY_LEFT]) {
    timer -= delta;
    rotation = rotate(mat4{}, -delta, {0, 1, 0});
}
... //rest of key controls
total_rotation *= rotation;
And inside those 3 for loops there is also this:
program.setUniform("ModelMatrix", total_rotation * cubeMat);
cube.render();
I have read that I should use a transformation to set the pivot point to the middle, but in this case, how can I set the pivot point to the little cube at the center of the world? That cube is at x=2, y=2, z=2, since in the for loops I generate cubes starting at x=0.
You are accumulating the rotation matrices by right-multiplication. This way, all rotations are performed in the local coordinate systems that result from all previous transformations. And this is why your right-rotation results in a turn after an up-rotation (because it is a right-rotation in the local coordinate system).
But you want your rotations to be in the global coordinate system. Thus, simply reverse the multiplication order:
total_rotation = rotation * total_rotation;
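To illustrate the difference, here is a small sketch (GLM, with made-up angles): with right-multiplication the new rotation acts in the local frame produced by the already-accumulated rotations, while left-multiplication keeps it in the global frame.
//An up-rotation and a left-rotation
mat4 pitchUp = rotate(mat4(1.0f), 0.5f, vec3(1, 0, 0));
mat4 yawLeft = rotate(mat4(1.0f), 0.5f, vec3(0, 1, 0));
//Right-multiplication: the yaw happens around the already-pitched (local) y-axis
mat4 localOrder = pitchUp * yawLeft;
//Left-multiplication: the yaw happens around the world y-axis, as intended
mat4 globalOrder = yawLeft * pitchUp;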

Arcball camera locked when parallel to up vector

I'm currently in the process of finishing the implementation of a camera that functions the same way as the camera in Maya. The part I'm stuck on is the tumble functionality.
The problem is the following: the tumble feature works fine so long as the position of the camera is not parallel with the up vector (currently defined to be (0, 1, 0)). As soon as the camera becomes parallel with this vector (so it is looking straight up or down), the camera locks in place and will only rotate around the up vector instead of continuing to roll.
This question has already been asked here; unfortunately, there is no actual solution to the problem. For reference, I also tried updating the up vector as I rotated the camera, but the resulting behaviour is not what I require (the view rolls as a result of the new orientation).
Here's the code for my camera:
using namespace glm;
// point is the position of the cursor in screen coordinates from GLFW
float deltaX = point.x - mImpl->lastPos.x;
float deltaY = point.y - mImpl->lastPos.y;
// Transform from screen coordinates into camera coordinates
Vector4 tumbleVector = Vector4(-deltaX, deltaY, 0, 0);
Matrix4 cameraMatrix = lookAt(mImpl->eye, mImpl->centre, mImpl->up);
Vector4 transformedTumble = inverse(cameraMatrix) * tumbleVector;
// Now compute the two vectors to determine the angle and axis of rotation.
Vector p1 = normalize(mImpl->eye - mImpl->centre);
Vector p2 = normalize((mImpl->eye + Vector(transformedTumble)) - mImpl->centre);
// Get the angle and axis
float theta = 0.1f * acos(dot(p1, p2));
Vector axis = cross(p1, p2);
// Rotate the eye.
mImpl->eye = Vector(rotate(Matrix4(1.0f), theta, axis) * Vector4(mImpl->eye, 0));
The vector library I'm using is GLM. Here's a quick reference on the custom types used here:
typedef glm::vec3 Vector;
typedef glm::vec4 Vector4;
typedef glm::mat4 Matrix4;
typedef glm::vec2 Point2;
mImpl is a PIMPL that contains the following members:
Vector eye, centre, up;
Point2 lastPos;
Here is what I think: it has something to do with gimbal lock, which occurs with Euler angles (and thus spherical coordinates).
If you exceed your minimum (0, -zoom, 0) or maximum (0, zoom, 0), you have to toggle a boolean. This boolean tells you whether you must treat deltaY as positive or not.
It could also just be caused by a singularity, so simply limit your polar angle to values between -89.99° and 89.99°.
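For the clamping variant, a one-line sketch (assuming the polar angle is tracked in degrees in a variable, here hypothetically called pitch):
//Keep the camera just short of the poles (std::clamp is C++17, <algorithm>)
pitch = std::clamp(pitch, -89.99f, 89.99f);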
Your problem could be solved like this: right now, if your camera is exactly above (0, zoom, 0) or beneath (0, -zoom, 0) your object, the camera only rolls. (I am also assuming your object is at (0, 0, 0) and the up-vector is set to (0, 1, 0).)
There might be some mathematical trick to resolve this, but I would do it with linear algebra.
You need to introduce a new right-vector. A cross product gives it to you: right-vector = up-vector x camera-vector. Imagine these vectors starting at (0, 0, 0); then, to get your camera position, just negate the camera-vector: (0, 0, 0) - camera-vector.
So if you get some deltaX, you rotate towards the right-vector (around the up-vector) and update it.
Any influence of deltaX should not change your up-vector.
If you get some deltaY, you rotate towards the up-vector (around the right-vector) and update it. (This has no influence on the right-vector.)
At https://en.wikipedia.org/wiki/Rotation_matrix, under "Rotation matrix from axis and angle", you can find an important formula. There, u is the vector you want to rotate around and theta is the amount you want to pivot; the size of theta is proportional to deltaX/deltaY.
For example: we get an input from deltaX, so we rotate around the up-vector.
up-vector    := (0, 1, 0)
right-vector := (0, 0, -1)
cam-vector   := (1, 0, 0)
theta := -1 * 30° // -1 due to the positive mathematical direction of rotation
R = [  cos(-30°)  0  sin(-30°) ]
    [      0      1      0     ]
    [ -sin(-30°)  0  cos(-30°) ]
new-cam-vector = R * cam-vector // ordinary matrix multiplication
One thing is left to be done: update the right-vector.
right-vector = up-vector x camera-vector
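A minimal sketch of the whole scheme in GLM (hypothetical struct and sensitivity factor; up-vector and camera-vector as defined above). The right-vector is rebuilt from the cross product after every update, so the frame stays orthonormal and never locks at the poles:
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

struct Tumbler {
    glm::vec3 up  = glm::vec3(0, 1, 0);
    glm::vec3 cam = glm::vec3(1, 0, 0); //unit vector from the object towards the camera

    void tumble(float deltaX, float deltaY, float sensitivity) {
        //deltaX: rotate the camera-vector around the up-vector (up is unchanged)
        glm::mat4 yaw = glm::rotate(glm::mat4(1.0f), -deltaX * sensitivity, up);
        cam = glm::normalize(glm::vec3(yaw * glm::vec4(cam, 0.0f)));

        //Update the right-vector: right = up x cam
        glm::vec3 right = glm::normalize(glm::cross(up, cam));

        //deltaY: rotate the camera-vector and the up-vector around the right-vector
        glm::mat4 pitch = glm::rotate(glm::mat4(1.0f), deltaY * sensitivity, right);
        cam = glm::normalize(glm::vec3(pitch * glm::vec4(cam, 0.0f)));
        up  = glm::normalize(glm::vec3(pitch * glm::vec4(up, 0.0f)));
    }
};
//camera position = object position + zoom * cam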

In c++, how to translate a point attached to another in 3D space according to the latter's quaternion rotation

I have two positions, p1 and p2. p2 is attached to p1, not only to p1's position but also to its rotation, so q1 is a quaternion which represents p1's rotation.
If q1 rotates, then p2's position must also rotate around p1 accordingly.
I only need to calculate p2's position, not its rotation; I worked the rotation out already.
So basically it is a spaceship docked to a station: I need to move and rotate the station around, with the ship docked to it.
How do I do it?
The code I wrote for it works as long as the station is not rotated at the time of docking:
bool docked[100];
Quaternion quatTarget[100];
double distance_dock[100];

vector3 docking_position(int ship, int station)
{
    if (!docked[ship])
    {
        docked[ship] = true;
        distance_dock[ship] = distances(position[ship], position[station]);
        vector3 direcc = normalized(position[station] - position[ship]);
        quatTarget[ship] = vecToVecRotation(direcc, { 0, 0, 1 });
        QuaternionNormalize(&quatTarget[ship], &quatTarget[ship]);
    }
    Quaternion orientation = total_rotation[station] * quatTarget[ship];
    Matrix docking_place;
    MatrixRotationQuaternion(&docking_place, &orientation);
    vector3 axis_z = { docking_place(0, 2), docking_place(1, 2), docking_place(2, 2) };
    return position[station] + -axis_z * distance_dock[ship];
}
What I do here is take an orientation quaternion from the ship to the station at the time of docking, and then translate the ship "distance_dock" units along the negative z axis of that orientation, so the ship always moves along with the station. But somehow, if I dock the ship when the station is already rotated, I get the initial docking position wrong, even though it still rotates perfectly along with the station.
If I understand you correctly, you have two objects that have a rigid transformation between them. The problem is that you want to calculate the pose (position + orientation) of one, given the pose of the other.
Let's say you have three frames; the Station frame "S", the Vehicle frame "V" and the Global frame "G" (I assume your graphics environment has a global 3D Cartesian frame).
The transformation between frames S and V is fully known (translation and orientation) and constant; it is denoted S_p_SV (the position of the Vehicle w.r.t. the Station, expressed in the Station frame) and S_V_q (the orientation of the Vehicle, expressed in the Station frame).
This will be confusing if you have not had experience in rigid-body mechanics, in which case you should read some introductory notes/slideshows on "Rigid-Body Mechanics" which are plentiful on Google results.
I have written the expressions in LaTeX, but unfortunately StackOverflow does not support it, so I have attached it as an image. The original LaTeX can be found here.
In my notation below, the prefixed subscript/superscript indicates the frame of expression: for example, S_p_SV on the first line is the position of the Vehicle w.r.t. the Station, expressed in the Station frame, and the quaternion G_S_q represents the orientation of the Station frame relative to the Global frame.
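Since the image is not reproduced here, the following is a plausible reconstruction of the attached expressions from the notation above (C(q) denotes the DCM of quaternion q, ⊗ is quaternion multiplication, and * the conjugate):
G_p_V = G_p_S + C(G_S_q) * S_p_SV              (position, via the DCM)
G_p_V = G_p_S + G_S_q ⊗ [0; S_p_SV] ⊗ G_S_q*   (position, via the quaternion sandwich)
G_V_q = G_S_q ⊗ S_V_q                          (orientation composition)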
In terms of implementing this in C++, I am unsure of what library you are using for Quaternions, but you will need the following functions:
Convert Euler to quaternion - if you are going to manually specify the rotation S_V_q (rotation of the Vehicle w.r.t. the Station)
Convert quaternion to DCM - for the first method in the LaTeX
Quaternion multiply - for the second method in the LaTeX
Quaternion conjugate - for the second method in the LaTeX
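As a sketch of how those functions fit together in C++ (using hypothetical helper names in the style of the D3DX-like wrappers from the question's code):
//Pose of the Vehicle in the Global frame, given the Station's global pose
//G_p_S / G_S_q: Station position and orientation in the Global frame
//S_p_SV / S_V_q: fixed Vehicle offset and orientation in the Station frame
vector3 vehicleGlobalPosition(vector3 G_p_S, Quaternion G_S_q, vector3 S_p_SV)
{
    Matrix C;
    MatrixRotationQuaternion(&C, &G_S_q);      //quaternion -> DCM (first method)
    return G_p_S + transformVector(C, S_p_SV); //rotate the offset into G, then add
}

Quaternion vehicleGlobalOrientation(Quaternion G_S_q, Quaternion S_V_q)
{
    return G_S_q * S_V_q; //quaternion multiply (second method)
}
Here transformVector is assumed to multiply a 3-vector by the rotation matrix; substitute whatever your math library provides.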

Error control of DirectX camera rotation?

I use the mouse to control camera rotation in my program (using DirectX 9.0c). Mouse X controls the camera's rotation around the Up vector, and Mouse Y controls the rotation around the Right vector. The rotation calculation is as below:
void Camera::RotateCameraUp(float angle)
{
    D3DXMATRIX RoMatrix;
    D3DXMatrixRotationAxis(&RoMatrix, &vUp, angle);
    D3DXVec3TransformCoord(&vLook, &vLook, &RoMatrix);
    D3DXVec3TransformCoord(&vRight, &vRight, &RoMatrix);
}

void Camera::RotateCameraRight(float angle)
{
    D3DXMATRIX RoMatrix;
    D3DXMatrixRotationAxis(&RoMatrix, &vRight, angle);
    D3DXVec3TransformCoord(&vLook, &vLook, &RoMatrix);
    D3DXVec3TransformCoord(&vUp, &vUp, &RoMatrix);
}
Rotation around the Up or Right vector should not lead to rotation around the "LookAt" vector, but if I circle my mouse for a while and stop at the starting point, rotation around the "LookAt" vector has happened. I think it's caused by error accumulating in the calculations, but I don't know how to eliminate or control it. Any idea?
This is a common problem. You apply many rotations, and over time, the rounding errors sum up. After a while, the three vectors vUp, vLook and vRight are not normalized and orthogonal anymore.
I would use one of two options:
1.
Don't store vLook and vRight; instead, just store 2 angles. Assuming x is right, y is up, z is back, store a) the angle between your view axis and the xz-plane, and b) the angle between the projection of your view axis onto the xz-plane and the z-axis or x-axis. Update these angles according to mouse movement and calculate vLook and vRight from them.
2.
Set the y-component of vRight to 0, as vRight should lie in the xz-plane, then re-orthonormalize the vectors (you know the vectors should be perpendicular to each other and have length 1). So after calculating the new vLook and vRight, apply these corrections:
vRight.y = 0
vRight = Normalize(vRight)
vUp = Normalize(Cross(vLook, vRight))
vLook = Normalize(Cross(vRight, vUp))
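A minimal sketch of these corrections using the D3DX calls already present in the question's code (assuming vRight, vUp, vLook are the camera members shown above):
void Camera::Reorthonormalize()
{
    vRight.y = 0;                         //vRight is supposed to stay in the xz-plane
    D3DXVec3Normalize(&vRight, &vRight);
    D3DXVec3Cross(&vUp, &vLook, &vRight); //rebuild vUp perpendicular to the other two
    D3DXVec3Normalize(&vUp, &vUp);
    D3DXVec3Cross(&vLook, &vRight, &vUp); //rebuild vLook from the corrected axes
    D3DXVec3Normalize(&vLook, &vLook);
}
Calling this at the end of both rotation functions keeps the basis orthonormal, so the rounding errors can no longer accumulate into a roll around the look axis.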

"Looking At" an object with a Quaternion

So I am currently trying to create a function that will take two 3D points A and B, and provide me with the quaternion representing the rotation required of point A to be "looking at" point B (such that point A's local Z axis passes through point B, if you will).
I originally found this post, the top answer of which seemed to provide me with a good starting point. I went on to implement the following code; instead of assuming a default (0, 0, -1) orientation, as the original answer suggests, I try to extract a unit vector representing the actual orientation of the camera.
void Camera::LookAt(sf::Vector3<float> Target)
{
    ///Derived from pseudocode found here:
    ///https://stackoverflow.com/questions/13014973/quaternion-rotate-to

    //Get the normalized vector from the camera position to Target
    sf::Vector3<float> VectorTo(Target.x - m_Position.x,
                                Target.y - m_Position.y,
                                Target.z - m_Position.z);
    //Get the length of VectorTo
    float VectorLength = sqrt(VectorTo.x*VectorTo.x +
                              VectorTo.y*VectorTo.y +
                              VectorTo.z*VectorTo.z);
    //Normalize VectorTo
    VectorTo.x /= VectorLength;
    VectorTo.y /= VectorLength;
    VectorTo.z /= VectorLength;

    //Straight-ahead vector
    sf::Vector3<float> LocalVector = m_Orientation.MultVect(sf::Vector3<float>(0, 0, -1));

    //Get the cross product as the axis of rotation
    sf::Vector3<float> Axis(VectorTo.y*LocalVector.z - VectorTo.z*LocalVector.y,
                            VectorTo.z*LocalVector.x - VectorTo.x*LocalVector.z,
                            VectorTo.x*LocalVector.y - VectorTo.y*LocalVector.x);

    //Get the dot product to find the angle
    float Angle = acos(VectorTo.x*LocalVector.x +
                       VectorTo.y*LocalVector.y +
                       VectorTo.z*LocalVector.z);

    //Determine whether or not the angle is positive:
    //get the cross product of the axis and the local vector
    sf::Vector3<float> ThirdVect(Axis.y*LocalVector.z - Axis.z*LocalVector.y,
                                 Axis.z*LocalVector.x - Axis.x*LocalVector.z,
                                 Axis.x*LocalVector.y - Axis.y*LocalVector.x);

    //If the dot product of that and the local vector is negative, so is the angle
    if (ThirdVect.x*VectorTo.x + ThirdVect.y*VectorTo.y + ThirdVect.z*VectorTo.z < 0)
    {
        Angle = -Angle;
    }

    //Finally, create a quaternion
    Quaternion AxisAngle;
    AxisAngle.FromAxisAngle(Angle, Axis.x, Axis.y, Axis.z);

    //And multiply it into the current orientation
    m_Orientation = AxisAngle * m_Orientation;
}
This almost works. What happens is that the camera seems to rotate half the distance towards the Target point. If I attempt the rotation again, it performs half the remaining rotation, ad infinitum, such that if I hold down the "Look-At-Button", the camera's orientation gets closer and closer to looking directly at the target, but is also constantly slowing down in its rotation, such that it never quite gets there.
Note that I don't want to resort to gluLookAt(), as I will also eventually need this code to point objects other than the camera at one another, and my objects already use quaternions for their orientations. For example, I might want to create an eyeball that tracks the position of something moving around in front of it, or a projectile that updates its orientation to seek out its target.
Normalize the Axis vector before passing it to FromAxisAngle.
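The cross product of two unit vectors has length sin(Angle), not 1, so the quaternion built from the raw axis is not a unit quaternion, which would explain the repeated half-way rotations. A minimal sketch of the fix, in the style of the question's code:
//Normalize the rotation axis before building the quaternion
float AxisLength = sqrt(Axis.x*Axis.x + Axis.y*Axis.y + Axis.z*Axis.z);
if (AxisLength > 0.000001f) //guard against near-parallel vectors
{
    Axis.x /= AxisLength;
    Axis.y /= AxisLength;
    Axis.z /= AxisLength;
}
AxisAngle.FromAxisAngle(Angle, Axis.x, Axis.y, Axis.z);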
Why are you using a quaternion? You're just making things more complex and requiring more computation in this instance. To set up a matrix:
calculate vector from observer to observed (which you're doing already)
normalise it (again, doing it already) = at
cross product this with the observer's up direction = right
normalise right
cross product at and right to get up
and you're done. The right, up and at vectors form the first, second and third rows (or columns, depending on how you set things up) of your matrix. The final row/column is the object's position.
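A sketch of that construction in GLM (the cross-product order assumed here suits a right-handed setup with the at-axis in the third column; swap the operands if your conventions differ):
#include <glm/glm.hpp>

glm::mat4 MakeLookAtMatrix(const glm::vec3 &observer, const glm::vec3 &observed,
                           const glm::vec3 &observerUp)
{
    glm::vec3 at    = glm::normalize(observed - observer); //forward axis
    glm::vec3 right = glm::normalize(glm::cross(at, observerUp));
    glm::vec3 up    = glm::cross(right, at); //unit length by construction

    glm::mat4 m(1.0f);
    m[0] = glm::vec4(right, 0.0f);    //first column
    m[1] = glm::vec4(up, 0.0f);       //second column
    m[2] = glm::vec4(at, 0.0f);       //third column
    m[3] = glm::vec4(observer, 1.0f); //the object's position
    return m;
}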
But it looks like you want to transform an existing matrix to this new matrix over several frames. SLERP works on matrices as well as quaternions (which isn't surprising when you look into the maths). For the transformation, store the initial and target matrices, then SLERP between them, changing the interpolation amount each frame (e.g. 0, 0.25, 0.5, 0.75, 1.0 - although a non-linear progression would look nicer).
Don't forget that you're converting the quaternion back into a matrix in order to pass it to the rendering pipeline (unless there are some new features in the shaders that handle quaternions natively). So any efficiency gained from using quaternions has to take the conversion process into account as well.
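For reference, a one-step sketch of that conversion in GLM (the question's custom Quaternion class will have its own equivalent):
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

//Convert an orientation quaternion into the matrix the pipeline expects
glm::mat4 OrientationToMatrix(const glm::quat &orientation)
{
    return glm::mat4_cast(orientation); //rotation only; translation stays identity
}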