How to flip an angle

I'm working with a robot, and right now the angle is set up to be between -179 and 180. The problem is that the angle accumulates in the opposite direction from Cartesian coordinates, which is a headache. So if the robot is facing 90 degrees and turns 45 degrees to the left, instead of reading 135 it reads 45; and when it is facing 0 degrees, the angle should read as 180.
How would I go about converting the angle to something I can use more effectively? I.e. the angle needs to be flipped across the y-axis, if imagined on the Cartesian plane.

I figured it out myself; thanks for the useful responses for the junior coder. Anyway, if anyone ever has this problem for whatever reason:
(90 - CurrentAngle) + 90;
Then the angle has to be wrapped back into the -179 to 180 range.
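Putting the formula and the rewrap together, a minimal sketch (the function name is mine; the (-179, 180] range convention is taken from the question):

```cpp
#include <cmath>

// Flip a heading across the y-axis: (90 - angle) + 90, i.e. 180 - angle,
// then wrap the result back into the robot's -179..180 range.
double flipAngle(double currentAngle) {
    double flipped = (90.0 - currentAngle) + 90.0;  // same as 180 - currentAngle
    // Wrap into (-180, 180]; -180 maps to 180, giving (-179..180] in whole degrees.
    while (flipped > 180.0)   flipped -= 360.0;
    while (flipped <= -180.0) flipped += 360.0;
    return flipped;
}
```

For example, a robot reading of 45 (facing 90, turned 45 left) flips to 135, and a reading of 0 flips to 180, matching the behavior described above.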


Fusing accelerometer and gyroscope readings for attitude estimation

I would like to implement attitude estimation from accelerometer and gyroscope.
Sampling frequency is 50 Hz.
I would like to get roll and pitch (yaw isn't important) in the range from -180 to 180.
I tried the Kalman filter, but I got values in the range from -90 to 90, so I can't determine whether the body is oriented upside down.
So I tried the Madgwick and Mahony filters as well, but with this implementation I have a problem with the singularity, because the transformation from quaternions to Euler angles is singular at 90 degrees pitch.
So if I rotate the body in pitch, the roll values start to drift near the singularity, and because of that I falsely detect that the body is facing down.
I can lock the angles near the singularity value, but then, because pitch only goes from -90 to 90, I won't detect the body facing down, as the pitch value will start to go back down toward 0 degrees.
I tried to overcome this problem by integrating the difference of quaternions on each filter iteration once I reach the singularity angle, but it didn't prove to be really reliable. It is also quite noisy, because around the singularity threshold I switch the algorithm from integrating the difference back to the Mahony filter.
What would be the best practice to get a pitch value from -180 to 180? Maybe using two separate Mahony filters and reading the roll value from both for roll and pitch?
Is there a way to get Kalman filter values from -180 to 180?
Any other solutions?
It is really important to reliably detect upside down.
Thank you.
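For reference, the ±90 limit the question describes comes from the asin in the standard quaternion-to-Euler conversion, while roll (computed with atan2) spans the full ±180; one common way to detect upside down is therefore |roll| > 90. A minimal sketch of the ZYX (aerospace) convention, with my own names (this only illustrates the limitation, it is not a full solution):

```cpp
#include <cmath>

struct Quat { double w, x, y, z; };  // assumed to be a unit quaternion

// Pitch from asin: mathematically confined to [-90, 90] degrees.
double pitchOf(const Quat& q) {
    return std::asin(2.0 * (q.w * q.y - q.z * q.x));
}

// Roll from atan2: covers the full (-180, 180] degrees.
double rollOf(const Quat& q) {
    return std::atan2(2.0 * (q.w * q.x + q.y * q.z),
                      1.0 - 2.0 * (q.x * q.x + q.y * q.y));
}

// Upside down in this convention shows up as |roll| > 90 degrees,
// even though pitch alone can never indicate it.
bool upsideDown(const Quat& q) {
    return std::fabs(rollOf(q)) > M_PI / 2.0;
}
```

With the identity quaternion the body reads upright; a 180-degree rotation about the body x-axis (w=0, x=1) gives roll = 180 degrees while pitch stays 0, which is exactly why pitch by itself cannot detect the inverted case.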

Make character look at player?

I need to make a function that will calculate the degrees necessary to make an NPC look at the center of the player. However, I have not been able to find any results for 3 dimensions, which is what I need; only 2-dimensional equations. I'm programming in C++.
Info:
Data Type: Float.
Vertical-Axis: 90 is looking straight up, -90 is looking straight down and 0 is looking straight ahead.
Horizontal-Axis: Positive value between 0 and 360, North is 0, East is 90, South 180, West 270.
See these transformation equations from Wikipedia. But note since you want "elevation" or "vertical-axis" to be zero on the xy-plane, you need to make the changes noted after "if theta measures elevation from the reference plane instead of inclination from the zenith".
First, find a vector from the NPC to the player to get the values x, y, z, where x is positive to the East, y is positive to the North, and z is positive upward.
Then you have:
float r = sqrtf(x*x+y*y+z*z);
float theta = asinf(z/r);
float phi = atan2f(x,y);
Or you might get better precision from replacing the first declaration with
float r = hypotf(hypotf(x,y), z);
Note asinf and atan2f return radians, not degrees. If you need degrees, start with:
theta *= 180./M_PI;
and theta is now your "vertical axis" angle.
Also, Wikipedia's phi = arctan(y/x) assumes an azimuth of zero at the positive x-axis and pi/2 at the positive y-axis. Since you want an azimuth of zero at the North direction and 90 at the East direction, I've switched to atan2f(x,y) (instead of the more common atan2f(y,x)). Also, atan2f returns a value from -pi to pi inclusive, but you want strictly positive values. So:
if (phi < 0) {
phi += 2*M_PI;
}
phi *= 180./M_PI;
and now phi is your desired "horizontal-axis" angle.
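The steps above combine into one function; here is a sketch (names and the struct are mine, the math follows the answer exactly):

```cpp
#include <cmath>

struct Angles { double vertical, horizontal; };  // both in degrees

// Given the vector (x, y, z) from the NPC to the player
// (x positive East, y positive North, z positive up), return the
// vertical angle (-90..90, 0 = level) and the horizontal angle
// (0..360, North = 0, East = 90).
Angles lookAtAngles(float x, float y, float z) {
    float r = hypotf(hypotf(x, y), z);   // better precision than sqrtf of the sum
    float theta = asinf(z / r);          // elevation from the xy-plane
    float phi = atan2f(x, y);            // azimuth, zero at North
    if (phi < 0.0f) phi += 2.0f * (float)M_PI;  // force strictly positive azimuth
    return { theta * 180.0 / M_PI, phi * 180.0 / M_PI };
}
```

For instance, a player due East of the NPC at the same height gives a vertical angle of 0 and a horizontal angle of 90.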
I'm not too familiar with math involving rotation and 3D environments, but couldn't you draw a line from your coordinates to the NPC's coordinates (or vice versa) and have a function approximate the proper rotation toward that line until it is within an accepted +/- range? It would do this by just increasing and decreasing the vertical and horizontal values until they fall into the range; it's just a matter of which to increase or decrease first, and you could determine that from the position state of the NPC. But I feel like this is a really lame way to go about it.
Use 4x4 homogeneous transform matrices instead of Euler angles for this. You create the matrix anyway, so why not use it ...
create/use NPC transform matrix M
My bet is you have it somewhere near your mesh and you are using it for rendering. In case you use Euler angles, you are doing a set of rotations and a translation, and the result is M.
Convert the player's GCS Cartesian position to the NPC's LCS Cartesian position
GCS means global coordinate system and LCS means local coordinate system. So if the position is the 3D vector xyz = (x,y,z,1), the transformed position would be one of these (depending on the conventions you use):
xyz'=M*xyz
xyz'=Inverse(M)*xyz
xyz'=Transpose(xyz*M)
xyz'=Transpose(xyz*Inverse(M))
Either rotate by an angle or construct a new NPC matrix
You know your NPC's old coordinate system, so you can extract the X,Y,Z,O vectors from it. Now you just set the axis that is your viewing direction (usually -Z) to the direction to the player. That is easy:
-Z = normalize( xyz' - (0,0,0) )
Z = -xyz' / |xyz'|
Now just exploit the cross product and make the other axes perpendicular to Z again, so:
X = cross(Y,Z)
Y = cross(Z,X)
And feed the vectors back into your NPC's matrix. This way it is also much, much easier to move the objects. To lock the side rotation, you can set one of the vectors to Up prior to this.
If you still want to compute the rotation then it is:
ang = acos(dot(Z,-xyz')/(|Z|*|xyz'|))
axis = cross(Z,-xyz')
but converting that into Euler angles is another story ...
With transform matrices you can easily do cool stuff like camera following, easy computation between objects' coordinate systems, easy physics motion simulation and much more.
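The axis-reconstruction step above can be sketched in plain vector math (no particular engine assumed; I've added the normalization the answer implies):

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
Vec3 normalize(const Vec3& v) {
    double len = std::sqrt(dot(v, v));
    return { v.x/len, v.y/len, v.z/len };
}

// Rebuild an orthonormal NPC basis so that -Z points at the player.
// xyzLocal is the player's position in the NPC's local coordinate system;
// up locks the side rotation. X, Y, Z then go back into the transform matrix.
void lookAtBasis(const Vec3& xyzLocal, const Vec3& up,
                 Vec3& X, Vec3& Y, Vec3& Z) {
    Z = normalize({ -xyzLocal.x, -xyzLocal.y, -xyzLocal.z });  // view dir is -Z
    X = normalize(cross(up, Z));
    Y = cross(Z, X);  // already unit length: Z and X are unit and perpendicular
}
```

For a player straight ahead at local (0,0,-5) with up (0,1,0), this reproduces the identity basis X=(1,0,0), Y=(0,1,0), Z=(0,0,1).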

Quaternion rotation gimbal problems in Unreal engine

I am streaming data into Unreal Engine from an inertial sensor. It outputs unnormalized quaternion data in the format:
X = 6561.00000
Y = 6691.00000
Z = 2118.00000
W = 2078.00000
I am applying this to an actor, in C++, using:
this->SetActorRelativeRotation(rotsQ);
And it gives me strange gimbal issues:
When I rotate 90 degrees in pitch, it rotates in pitch.
Then I rotate 90 degrees in yaw.
Now when I rotate 90 degrees in pitch, it rotates in roll.
I have tried converting it to an FRotator and flipping axes, applying the axes one at a time, and switching the rotation order. I have tried setting the actor to 0,0,0 every tick and then adding the rotation value. No matter what I do, I see the same thing. Any help here would be very much appreciated!
Could it be a handedness problem? What can I try here?
It's not clear whether your input data from the sensor represents a change of rotation or an absolute orientation. If it is an absolute value, try using SetActorRotation instead of SetActorRelativeRotation.
If the input data represents a delta rotation, try AddActorLocalRotation or AddActorWorldRotation.
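One thing worth checking before anything else: rotations are represented by unit quaternions, and the raw values quoted in the question (magnitudes in the thousands) are clearly not normalized, which can itself produce strange axis-mixing behavior. A generic, engine-independent sketch of normalizing the raw sensor values first (the struct is mine, not Unreal's FQuat):

```cpp
#include <cmath>

struct Quat { double x, y, z, w; };

// Scale a raw sensor quaternion to unit length before handing it to the
// engine as a rotation; non-unit quaternions are not valid rotations.
Quat normalized(const Quat& q) {
    double n = std::sqrt(q.x*q.x + q.y*q.y + q.z*q.z + q.w*q.w);
    return { q.x/n, q.y/n, q.z/n, q.w/n };
}
```

In Unreal you would do the equivalent with FQuat::Normalize() on the incoming value before calling the Set/Add rotation functions.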

Extrinsic camera calibration OpenCV

I am attempting to calibrate the extrinsics of four cameras that I have mounted on a set-up, pointing 90 degrees apart. I have already calibrated the intrinsic parameters, and I am thinking of using an image of a calibration pattern to find the extrinsics. What I have done so far: I placed the calibration pattern flat on the table, so that its roll and yaw angles are 0 and its pitch is 90 (as it lies parallel with the camera). The cameras have yaw angles of 0, 90, 180 and 270 degrees (as they are 90 degrees apart), and the roll angles of the cameras are 0 (as they do not tilt). So what is left to calculate is the pitch angle of the cameras.
I can't quite wrap my head around how to calculate it, as I am not used to mapping between coordinate systems, so any help is welcome. I have already made a part of the program that calculates the rotation vector (of the calibration pattern in the image) using the cv::solvePnPRansac() function, so I have the rotation vector (which I believe I can turn into a matrix using cv::Rodrigues()).
What would the next step be for me in my calculations?
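As a sketch of that next step: cv::Rodrigues turns the rotation vector into a 3x3 rotation matrix, from which a pitch angle can be read off. Below, the Rodrigues formula is written out in plain C++ so the extraction is visible (in practice you would use cv::Rodrigues and cv::Mat; the ZYX Euler convention for "pitch" is an assumption on my part):

```cpp
#include <cmath>
#include <array>

using Mat3 = std::array<std::array<double, 3>, 3>;

// Rodrigues' formula: R = I + sin(t)*K + (1 - cos(t))*K^2,
// where t = |rvec| and K is the skew-symmetric matrix of the unit axis rvec/t.
Mat3 rodrigues(double rx, double ry, double rz) {
    double t = std::sqrt(rx*rx + ry*ry + rz*rz);
    if (t < 1e-12)
        return {{{1,0,0},{0,1,0},{0,0,1}}};  // zero rotation -> identity
    double kx = rx/t, ky = ry/t, kz = rz/t;
    double s = std::sin(t), c = std::cos(t), v = 1.0 - c;
    return {{{ c + kx*kx*v,     kx*ky*v - kz*s,  kx*kz*v + ky*s },
             { ky*kx*v + kz*s,  c + ky*ky*v,     ky*kz*v - kx*s },
             { kz*kx*v - ky*s,  kz*ky*v + kx*s,  c + kz*kz*v    }}};
}

// Pitch in the ZYX (yaw-pitch-roll) convention.
double pitchFrom(const Mat3& R) {
    return std::asin(-R[2][0]);
}
```

For example, a rotation vector of (0, 0.3, 0), i.e. 0.3 rad about the y-axis, yields a pitch of 0.3 rad under this convention.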

Finding angle between two points in an image using OpenCV

I'm trying to find the angle between two points in an image. The angle is with reference to the centre line of the camera.
In this image the center point is along the center of the image (an assumption; I still have to figure out how to actually calculate it), and I want to find the angle between the line connecting point 1 to the camera center and the line connecting the desired point to the camera center.
Now I want to know two things about finding the angle:
- Is it possible to know the angle if the distance is not known exactly (but can be estimated by a human at run time)? Assume both points lie in the same plane in the image.
- If the points are not in the same plane, how should I handle the angle calculation?
It can be achieved with the inner product.
If you are talking about 3D space, your x and y vectors should be of the form [a,b,f], where a and b are the point's coordinates in the camera frame and f is the distance of the image plane from the camera center.
If it is 2D space, you have to specify the origin of your frame, find a and b relative to that frame, and your x and y vectors are then of the form [a,b].
It can be found using the formula for the angle between two rays:
cos(theta) = (d1 . d2) / (|d1| |d2|), with d1 = K^-1 * x1 and d2 = K^-1 * x2
K is the camera matrix, x1 and x2 are the image points given in homogeneous form like [u,v,1], and d1 and d2 are the corresponding 3D ray directions.
See Richard Hartley and Andrew Zisserman, "Multiple View Geometry in Computer Vision", 2nd edition, p. 209 for more details.
Inverting the camera matrix is quite simple. See
https://www.imatest.com/support/docs/pre-5-2/geometric-calibration-deprecated/projective-camera/ for more details.
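For the common pinhole matrix K = [[fx,0,cx],[0,fy,cy],[0,0,1]], the inverse can be applied directly: the ray through pixel (u,v) is ((u-cx)/fx, (v-cy)/fy, 1). A sketch of the whole computation (the intrinsic values in the example below are made up):

```cpp
#include <cmath>

// Angle in radians between the rays through two pixels, given pinhole
// intrinsics fx, fy (focal lengths) and cx, cy (principal point).
// Ray for pixel (u, v): d = K^-1 [u, v, 1] = ((u-cx)/fx, (v-cy)/fy, 1).
double angleBetweenPixels(double u1, double v1, double u2, double v2,
                          double fx, double fy, double cx, double cy) {
    double d1[3] = { (u1 - cx) / fx, (v1 - cy) / fy, 1.0 };
    double d2[3] = { (u2 - cx) / fx, (v2 - cy) / fy, 1.0 };
    double dotp = d1[0]*d2[0] + d1[1]*d2[1] + d1[2]*d2[2];
    double n1 = std::sqrt(d1[0]*d1[0] + d1[1]*d1[1] + d1[2]*d1[2]);
    double n2 = std::sqrt(d2[0]*d2[0] + d2[1]*d2[1] + d2[2]*d2[2]);
    return std::acos(dotp / (n1 * n2));  // cos(theta) = d1.d2 / (|d1||d2|)
}
```

Note the metric distance to the points never enters this computation: the angle between the two viewing rays depends only on the pixel coordinates and the intrinsics, which is why it works even when the depth is only roughly known.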