Quaternion rotation gimbal problems in Unreal Engine - C++

I am streaming data into Unreal Engine from an inertial sensor. It outputs unnormalized quaternion data in the format:
X = 6561.00000
Y = 6691.00000
Z = 2118.00000
W = 2078.00000
I am applying this to an actor, in c++, using:
this->SetActorRelativeRotation(rotsQ);
And it gives me strange gimbal issues.
When I rotate 90 degrees in pitch, it rotates in pitch.
Then I rotate 90 degrees in yaw.
Now when I rotate 90 degrees in pitch, it rotates in roll.
I have tried converting it to an FRotator and flipping axes, applying the axes one at a time, and switching the rotation order. I have also tried setting the actor to (0,0,0) every tick and then adding the rotation value. No matter what I do, I see the same thing. Any help here would be very much appreciated!
Could it be a handedness problem? What can I try here?

It's not clear whether your input data from the sensor represents a change in rotation or an absolute orientation. If it is an absolute orientation, try using SetActorRotation instead of SetActorRelativeRotation.
If the input data represents a delta rotation, try AddActorLocalRotation or AddActorWorldRotation.
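Either way, the raw values shown in the question are nowhere near unit length, and these rotation functions expect a unit quaternion, so it is worth normalizing before applying. A minimal sketch, assuming the stream is an absolute orientation and using RawX/RawY/RawZ/RawW as placeholder names for the incoming values (axis remapping or a handedness fix may still be needed on top of this):

FQuat rotsQ(RawX, RawY, RawZ, RawW);   // FQuat takes (X, Y, Z, W)
rotsQ.Normalize();                     // make it unit length before applying
if (rotsQ.IsNormalized())
{
    this->SetActorRotation(rotsQ);     // absolute orientation, per the suggestion above
}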

Related

Make character look at player?

I need to make a function that will calculate the degrees necessary to make an NPC look at the center of the player. However, I have not been able to find any results for three dimensions, which is what I need; everything I find covers only two-dimensional equations. I'm programming in C++.
Info:
Data Type: Float.
Vertical-Axis: 90 is looking straight up, -90 is looking straight down and 0 is looking straight ahead.
Horizontal-Axis: Positive value between 0 and 360, North is 0, East is 90, South 180, West 270.
See these transformation equations from Wikipedia. But note that since you want "elevation" or "vertical-axis" to be zero on the xy-plane, you need to make the changes noted after "if theta measures elevation from the reference plane instead of inclination from the zenith".
First, find a vector from the NPC to the player to get the values x, y, z, where x is positive to the East, y is positive to the North, and z is positive upward.
Then you have:
float r = sqrtf(x*x+y*y+z*z);
float theta = asinf(z/r);
float phi = atan2f(x,y);
Or you might get better precision from replacing the first declaration with
float r = hypotf(hypotf(x,y), z);
Note asinf and atan2f return radians, not degrees. If you need degrees, start with:
theta *= 180./M_PI;
and theta is now your "vertical axis" angle.
Also, Wikipedia's phi = arctan(y/x) assumes an azimuth of zero at the positive x-axis and pi/2 at the positive y-axis. Since you want an azimuth of zero at the North direction and 90 at the East direction, I've switched to atan2f(x,y) (instead of the more common atan2f(y,x)). Also, atan2f returns a value from -pi to pi inclusive, but you want strictly positive values. So:
if (phi < 0) {
phi += 2*M_PI;
}
phi *= 180./M_PI;
and now phi is your desired "horizontal-axis" angle.
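Putting the steps together, here is a minimal self-contained sketch (the function name, array-based positions, and pointer outputs are just for illustration; x is positive East, y positive North, z positive up, as above):

#include <math.h>

/* Vertical (elevation) and horizontal (azimuth) angles in degrees for an NPC
   at npc[] looking at a player at player[]. Assumes the two positions differ. */
void lookAtAngles(const float npc[3], const float player[3],
                  float *verticalDeg, float *horizontalDeg)
{
    float x = player[0] - npc[0];   /* positive East  */
    float y = player[1] - npc[1];   /* positive North */
    float z = player[2] - npc[2];   /* positive up    */
    float r = hypotf(hypotf(x, y), z);
    float theta = asinf(z / r);     /* elevation from the horizontal plane */
    float phi = atan2f(x, y);       /* azimuth: 0 = North, pi/2 = East */
    if (phi < 0)
        phi += 2 * (float)M_PI;
    *verticalDeg = theta * 180.0f / (float)M_PI;
    *horizontalDeg = phi * 180.0f / (float)M_PI;
}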
I'm not too familiar with math involving rotation and 3D environments, but couldn't you draw a line from your coordinates to the NPC's coordinates (or vice versa) and have a function approximate the proper rotation toward that line until it is within an accepted +/- range? It would do this by just increasing and decreasing the vertical and horizontal values until they fall into the range; it's only a matter of which value to increase or decrease first, and you could determine that based on the position of the NPC. But I feel like this is a really lame way to go about it.
Use 4x4 homogeneous transform matrices instead of Euler angles for this. You create the matrix anyway, so why not use it...
create/use NPC transform matrix M
My bet is you have it somewhere near your mesh and you are using it for rendering. If you use Euler angles, you are doing a set of rotations and a translation, and the result is M.
Convert the player's GCS Cartesian position to the NPC's LCS Cartesian position.
GCS means global coordinate system and LCS means local coordinate system. So if the position is a 3D vector xyz = (x,y,z,1), the transformed position would be one of these (depending on the conventions you use):
xyz'=M*xyz
xyz'=Inverse(M)*xyz
xyz'=Transpose(xyz*M)
xyz'=Transpose(xyz*Inverse(M))
Either rotate by an angle or construct a new NPC matrix.
You know your NPC's old coordinate system, so you can extract the X, Y, Z, O vectors from it. Now you just set the axis that is your viewing direction (usually -Z) to the direction toward the player. That is easy:
-Z = normalize( xyz' - (0,0,0) )
Z = -xyz' / |xyz'|
Now just exploit the cross product and make the other axes perpendicular to Z again, so:
X = cross(Y,Z)
Y = cross(Z,X)
Then feed the vectors back into your NPC's matrix. This way it is also much, much easier to move the objects. Also, to lock the side rotation, you can set one of the vectors to Up prior to this (see the sketch below).
If you still want to compute the rotation then it is:
ang = acos(dot(Z,-xyz')/(|Z|*|xyz'|))
axis = cross(Z,-xyz')
but to convert that into Euler angles is another story ...
With transform matrices you can easily do cool stuff like camera follow, easy conversion between objects' coordinate systems, easy physics motion simulations and much more.
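To make the basis-rebuild step above concrete, a rough sketch (vec3, normalize, cross and negate are assumed helper functions, and xyzLocal stands for the player's position already transformed into the NPC's local space):

vec3 Z  = normalize(negate(xyzLocal));   /* -Z should look toward the player      */
vec3 Up = {0.0f, 1.0f, 0.0f};            /* lock the side roll by using a fixed Up */
vec3 X  = normalize(cross(Up, Z));
vec3 Y  = cross(Z, X);                   /* already unit length if X and Z are    */
/* write X, Y, Z (and the original O translation) back into the NPC matrix */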

Extrinsic camera calibration OpenCV

I am attempting to calibrate the extrinsics of four cameras that I have mounted on a set-up. They are pointing 90 degrees apart. I have already calibrated the intrinsic parameters, and I am thinking of using an image of a calibration pattern to find the extrinsics. What I have done so far is: placed the calibration pattern so that it lies flat on the table, so that its roll and yaw angles are 0 and its pitch is 90 (as it lies parallel with the camera). The cameras have yaw angles of 0, 90, 180 and 270 degrees (as they are 90 degrees apart), and the roll angle of the cameras is 0 (as they do not tilt). So what is left to calculate is the pitch angle of the cameras.
I can't quite wrap my head around how to calculate it, as I am not used to mapping between coordinate systems, so any help is welcome. I have already written the part of the program that calculates the rotation vector (of the calibration pattern in the image) using the cv::solvePnPRansac() function, so I have the rotation vector (which I believe I can turn into a matrix using cv::Rodrigues()).
What would the next step be for me in my calculations?
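For reference, the solvePnPRansac/Rodrigues step described above looks roughly like this (all variable names are placeholders):

cv::Mat rvec, tvec;
cv::solvePnPRansac(objectPoints, imagePoints, cameraMatrix, distCoeffs, rvec, tvec);
cv::Mat R;
cv::Rodrigues(rvec, R);   // 3x1 rotation vector -> 3x3 rotation matrix
// R and tvec map pattern coordinates into this camera's frame; the camera pose
// expressed in pattern coordinates is then R_cam = R.t(), t_cam = -R.t() * tvec.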

WM_MOUSEMOVE not working with FPS camera implementation in Direct3D

Hey guys,
I am trying to implement an FPS-style camera. The mouse movement is "working", but without me even touching the mouse: the camera swings through all kinds of angles on its own. Basically, the yaw and the pitch are getting wrong values from the mouse without the mouse itself moving.
Here is the code from the Win32 message loop:
case WM_MOUSEMOVE:
gCamera->Yaw() = (float)LOWORD(lparam);
gCamera->Pitch() = (float)HIWORD(lparam);
break;
The Yaw and Pitch methods basically return references to the data members mYaw and mPitch, and through them I do the rotations for the basis vectors (right, up and look vectors).
Just to clarify, WM_MOUSEMOVE is getting input (I checked through debugging), but it is getting very high and very wrong values even though I am not moving the mouse, and the camera is rotating in every direction like it just ate some rocket fuel.
P.S.: I had to cast the values because I am using the yaw and the pitch to create matrices, so I have to use floats.
Appreciate the help, guys
Keep in mind your units. The WM_MOUSEMOVE lparam x & y values are in logical (screen pixel) coordinates, but most rotation values in games are expected in terms of degrees or radians. For instance, if the mouse is at say <400, 300> but your camera class expects radians, then you're multiplying a large number of rotations against some other (potentially varying) numbers in your transform math, potentially leading to crazy movement even though you're not moving the mouse. The solution in such a case is to convert your logical units into radians or degrees, using a scale factor of your choosing.
In response to further comments:
One way to think about it is to ask yourself the question: how many logical units of movement (by the mouse) do you want to correspond to 360 degrees of rotation?
For instance, if you decided you wanted mouse movement across the full width of the window to correspond to 360 degrees, then the mathematical relationship is
screenW * scaleFactor = 2 * PI
Solve for scaleFactor then apply it to future mouse values using:
mouseX * scaleFactor = orientationInRadians
Keep in mind, this approach would link an absolute mouse location to an absolute camera orientation (for at least one DOF), so you may instead want to track changes in mouse position, rather than absolute mouse position; and then calculate changes in orientation (radians) to apply to existing orientation. The same formula can be used to convert the delta (change amount) from logical coords to radians.
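A rough sketch of that delta-based handling (AddYaw/AddPitch, screenW, and the lastX/lastY tracking are assumptions for illustration, not from the original code; GET_X_LPARAM and GET_Y_LPARAM come from <windowsx.h>):

case WM_MOUSEMOVE:
{
    int x = GET_X_LPARAM(lparam);   // signed client-area pixel coordinates
    int y = GET_Y_LPARAM(lparam);
    float scale = (2.0f * 3.1415926f) / (float)screenW;   // full window width = 360 degrees
    gCamera->AddYaw((float)(x - lastX) * scale);    // apply the change, not the absolute position
    gCamera->AddPitch((float)(y - lastY) * scale);
    lastX = x;
    lastY = y;
    break;
}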

opengl matrix rotation quaternions

I'm trying to do a simple rotation of a cube about the x and y axes:
I want to always rotate the cube over the x axis by an amount x,
and rotate the cube over the y axis by an amount y, independently of the x-axis rotation.
First I naively did:
glRotatef(x,1,0,0);
glRotatef(y,0,1,0);
But that first rotates over x and then rotates over y; I want to rotate over the y axis independently of the x-axis rotation.
I started looking into quaternions, so I tried:
Quaternion Rotation1;
Rotation1.createFromAxisAngle(0,1, 0, globalRotateY);
Rotation1.normalize();
Quaternion Rotation2;
Rotation2.createFromAxisAngle(1,0, 0, globalRotateX);
Rotation2.normalize();
GLfloat Matrix[16];
Quaternion q=Rotation2 * Rotation1;
q.createMatrix(Matrix);
glMultMatrixf(Matrix);
That just does almost exactly what was accomplished by the two consecutive glRotatef calls, so I think I'm missing a step or two.
Are quaternions the way to go, or should I be using something different? And if quaternions are the way to go, what steps can I add to make the cube rotate independently about each axis?
I think someone else has the same issue:
Rotating OpenGL scene in 2 axes
I got this to work correctly using quaternions. I'm sure there are other ways, but after some research, this worked perfectly for me. I posted a similar version on another forum: http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=280859&#Post280859
First create the quaternion representations of the change angles for x and y.
Then, each frame, multiply those change quaternions into an accumulating quaternion (QAccum, initialized to identity outside the loop), and finally convert that quaternion to matrix form to multiply onto the current matrix. Here is the main code of the loop:
Quaternion3D Rotation1=Quaternion3DMakeWithAxisAndAngle(Vector3DMake(-1.0f,0,0), DEGREES_TO_RADIANS(globalRotateX));
Quaternion3DNormalize(&Rotation1);
Quaternion3D Rotation2=Quaternion3DMakeWithAxisAndAngle(Vector3DMake(0.0f,-1.0f,0), DEGREES_TO_RADIANS(globalRotateY));
Quaternion3DNormalize(&Rotation2);
Matrix3D Mat;
Matrix3DSetIdentity(Mat);
Quaternion3DMultiply(&QAccum, &Rotation1);
Quaternion3DMultiply(&QAccum, &Rotation2);
Matrix3DSetUsingQuaternion3D(Mat, QAccum);
globalRotateX=0;
globalRotateY=0;
glMultMatrixf(Mat);
then draw cube
It would help a lot if you could give a more detailed explanation of what you are trying to do and how the results you are getting differ from the results you want. But in general using Euler angles for rotation has some problems, as combining rotations can result in unintuitive behavior (and in the worst case losing a degree of freedom.)
Quaternion slerp might be the way to go for you if you can find a single axis and a single angle that represent the rotation you want. But doing successive rotations around the X and Y axis using quaternions won't help you avoid the problems inherent in composing Euler rotations.
The post you link to seems to involve another problem though. The poster seems to have been translating his object and then doing his rotations, when he should have been rotating first and then translating.
It is not clear what you want to achieve. Perhaps you should think about some points and where you want them to rotate to -- e.g. vertex (1,1,1) should map to (0,1,0). Then, from that information, you can calculate the required rotation.
Quaternions are generally used to interpolate between two rotational 'positions'. So step one is identifying your start and end 'positions', which you don't have yet. Once you have that, you use quaternions to interpolate. It doesn't sound like you have any time-varying aspect here.
Your problem is not gimbal lock. And indeed, there is no reason why your quaternion version would work better than your matrix (glRotate) version, because the quaternions you are using are mathematically identical to your rotation matrices.
If what you want is a mouse control, you probably want to check out arcballs.

3D Rotation in OpenGL and Local Rotation

I am trying to prototype a space flight sim in OpenGL, but after reading many articles online I still have difficulty getting the rotations to work correctly (I did have a quaternion camera that I didn't understand well, but it drifted and had other odd behaviors).
I am trying to do the following:
1) Local rotation - when the user presses arrow keys, rotation occurs relative to the viewport (rotating "up" is toward the top of the screen, for example). Two keys, such as Z and X, will control the "roll" of the ship (rotation around the current view).
2) The rotations will be stored in Axis-angle format (which is most natural for OpenGL and a single rotate call with the camera vector should rotate the scene properly). Therefore, given the initial Angle-axis vector, and one or more of the local rotations noted above (we could locally call "X" the left/right axis, "Y" the top/bottom axis, and "Z" the roll axis), I would like the end result to be a new Axis-angle vector.
3) Avoid quaternions and minimize the use of matrices (for some reason I find both unintuitive). Instead of matrix notation, please just show in pseudocode the vector components and what's happening.
4) You should be able to rotate in a direction (using the arrow keys) 360 degrees and return to the starting view without drifting. Preferably, if the user presses one combination and then reverses it, they would expect to be able to return to near their original orientation.
5) The starting state for the camera is at coordinates (0,0,0) facing the Axis-angle vector (0,0,1,0 - z-axis with no starting rotation). "up" is (0,1,0).
Using an Euler angles approach is wrong for a space sim. I have tried that approach and quickly had to give up. The player wants all degrees of freedom, and Euler angles don't provide that, or they complicate it enormously.
What you really, really want are quaternions. This is a part of my update code.
Quaternion qtmp1, qtmp2, qtmp3;
Rotation r(........);
qtmp1.CreateFromAxisAngle(1., 0., 0., r.j*m_updatediff);
qtmp2.CreateFromAxisAngle(0., 1., 0., r.i*m_updatediff);
qtmp3.CreateFromAxisAngle(0., 0., 1., r.k*m_updatediff);
m_rotq = qtmp1 * qtmp2 * qtmp3 * m_rotq;
r.i, r.j and r.k contain the current speed of rotation around each axis. Getting a space-sim-like feel is just a matter of multiplying these quaternions.
Everything else is just a complication. With Euler's angles, you can play all day long -- in fact, all year long -- but you will just make loads of messy code.
Your daily recommendation: quaternions.
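To feed the accumulated quaternion back to OpenGL each frame, you still convert it to a matrix once. A sketch of the standard unit-quaternion-to-matrix conversion for glMultMatrixf, assuming m_rotq is kept normalized and exposes x, y, z, w members:

void quatToGlMatrix(const Quaternion &q, GLfloat m[16])
{
    GLfloat x = q.x, y = q.y, z = q.z, w = q.w;
    // column-major layout, as expected by glMultMatrixf
    m[0] = 1 - 2*(y*y + z*z);  m[4] = 2*(x*y - w*z);      m[8]  = 2*(x*z + w*y);      m[12] = 0;
    m[1] = 2*(x*y + w*z);      m[5] = 1 - 2*(x*x + z*z);  m[9]  = 2*(y*z - w*x);      m[13] = 0;
    m[2] = 2*(x*z - w*y);      m[6] = 2*(y*z + w*x);      m[10] = 1 - 2*(x*x + y*y);  m[14] = 0;
    m[3] = 0;                  m[7] = 0;                  m[11] = 0;                  m[15] = 1;
}
// per frame: GLfloat mat[16]; quatToGlMatrix(m_rotq, mat); glMultMatrixf(mat);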