I would like to implement attitude estimation from accelerometer and gyroscope.
Sampling frequency is 50 Hz.
I would like to get roll and pitch (yaw isn't important) in the range from -180 to 180.
I tried the Kalman filter, but I got the values in the range from -90 to 90, and I can't determine if the body is oriented upside down.
So I tried the Madgwick and Mahony filters as well, but with this implementation I have a problem with the singularity, because the transformation from quaternions to Euler angles has a singularity at 90 degrees of pitch.
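For reference, the conversion I mean looks roughly like this (a minimal sketch, assuming a unit quaternion (w, x, y, z) and the usual ZYX yaw-pitch-roll convention); the asin term is what limits pitch to -90..90 and misbehaves at the singularity:

#include <cmath>

// Sketch of the standard quaternion -> Euler (ZYX) conversion, angles in radians.
void quatToEuler(float w, float x, float y, float z,
                 float &roll, float &pitch, float &yaw)
{
    // Roll and yaw come from atan2, so they naturally cover -180..180 degrees.
    roll = std::atan2(2.0f * (w * x + y * z), 1.0f - 2.0f * (x * x + y * y));
    yaw  = std::atan2(2.0f * (w * z + x * y), 1.0f - 2.0f * (y * y + z * z));

    // Pitch comes from asin, so it is limited to -90..90 degrees and its
    // argument has to be clamped near the +/-90 degree singularity.
    float s = 2.0f * (w * y - z * x);
    if (s >  1.0f) s =  1.0f;
    if (s < -1.0f) s = -1.0f;
    pitch = std::asin(s);
}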
So if I rotate the body in pitch, the roll value starts to drift around the singularity, and because of that I detect that the body is facing down.
I can lock the angles near the singularity, but then, because pitch only goes from -90 to 90, I won't detect the body facing down, as the pitch value will just start going back towards 0 degrees.
I tried to overcome this problem by calculating the difference of quaternions on each filter iteration once I reach the singularity angle, but it didn't prove to be very reliable. It is also quite noisy, because around the singularity threshold I switch the algorithm from integrating the difference back to the Mahony filter.
What would be the best practice to get a pitch value from -180 to 180? Maybe using two separate Mahony filters and reading the roll value from each, for both roll and pitch?
Is there a way to get Kalman filter values from -180 to 180?
Any other solutions?
It is really important to reliably detect upside down.
Thank you.
Related
I am trying to estimate the initial camera pose with respect to the ECEF system using EXIF metadata. I have a DJI drone's gimbal roll, pitch and yaw readings along with the latitude, longitude and altitude. Assuming these values are fairly accurate, I am hoping to obtain the transformation matrix from image to ECEF, to find all the objects (out of a collection of objects whose location and altitude are known) that lie in a particular image along with their pixel coordinates, and then improve the pose estimation at a later stage. I have tried the Pix4D way, by converting roll, pitch and yaw to omega, phi and kappa, and then finding a rotation matrix from the PATB coordinate system to ECEF coordinates. I am not even sure if the values of omega, phi and kappa obtained are correct, but assuming they are, the rotation matrix that I obtain from them is orthonormal, so I cannot directly use it for constructing the 4x4 transformation matrix; I probably need a scale factor for that.
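For reference on that last point, this is how a 3x3 rotation R and a camera position t are usually packed into a 4x4 homogeneous transform (just a sketch with made-up names; whether a separate scale factor is needed depends on the units of your coordinates):

#include <array>

// Sketch: pack a 3x3 rotation R (row-major) and a translation t into the
// 4x4 homogeneous transform [R | t; 0 0 0 1].
std::array<double, 16> makeTransform(const std::array<double, 9> &R,
                                     const std::array<double, 3> &t)
{
    std::array<double, 16> T{};
    for (int r = 0; r < 3; ++r) {
        for (int c = 0; c < 3; ++c)
            T[r * 4 + c] = R[r * 3 + c];   // rotation block
        T[r * 4 + 3] = t[r];               // translation column
    }
    T[15] = 1.0;                           // homogeneous row 0 0 0 1
    return T;
}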
Here is the link to the Pix4D article: https://support.pix4d.com/hc/en-us/articles/205678146-How-to-convert-Yaw-Pitch-Roll-to-Omega-Phi-Kappa-
Please help; I have been scouring websites and papers for a week now to find a solution, but to no avail.
If you work in Python or Matlab, I recommend using the excellent nvector library for these problems - I use it every day exactly for these things.
I am streaming data into Unreal Engine from an inertial sensor. It outputs unnormalized quaternion data in the format:
X = 6561.00000
Y = 6691.00000
Z = 2118.00000
W = 2078.00000
I am applying this to an actor, in c++, using:
this->SetActorRelativeRotation(rotsQ);
And it gives me strange gimbal issues.
When I rotate 90 degrees in pitch, it rotates in pitch.
Then I rotate 90 degrees in yaw.
Now when I rotate 90 degrees in pitch, it rotates in roll.
I have tried converting it to an FRotator and flipping axes, applying the axes one at a time, and switching the rotation order. I have tried setting the actor to 0,0,0 every tick and then adding the rotation value. No matter what I do, I see the same thing. Any help here would be very much appreciated!
Could it be a handedness problem? What can I try here?
It's not clear if your input data from the sensor represent a change of rotation or an absolute value. If it is an absolute value, try using SetActorRotation instead of SetActorRelativeRotation.
If the input data represent a delta rotation, try AddActorLocalRotation or AddActorWorldRotation.
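For example, something along these lines (a rough sketch, assuming the four numbers really are the X, Y, Z, W components of an orientation quaternion, just unnormalized):

// Build a quaternion from the raw sensor values and normalize it before use.
FQuat RawQuat(6561.0f, 6691.0f, 2118.0f, 2078.0f);   // X, Y, Z, W from the sensor
RawQuat.Normalize();                                  // make it a unit quaternion

// If the sensor reports an absolute orientation:
this->SetActorRotation(RawQuat);

// If it actually reports a per-tick delta rotation, use instead:
// this->AddActorLocalRotation(RawQuat);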
I am attempting to calibrate the extrinsics of four cameras that I have mounted on a set-up. They are pointing 90 degrees apart. I have already calibrated the intrinsic parameters, and I am thinking of using an image of a calibration pattern to find the extrinsics. What I have done so far is: placed the calibration pattern so that it lies flat on the table, so that its roll and yaw angles are 0 and its pitch is 90 (as it lies parallel with the camera). The cameras have yaw angles of 0, 90, 180 and 270 degrees (as they are 90 degrees apart), and the roll angle of the cameras is 0 (as they do not tilt). So what is left to calculate is the pitch angle of the cameras.
I can't quite wrap my head around how to calculate it, as I am not used to mapping between coordinate systems, so any help is welcome. I have already made a part of the program that calculates the rotation vector (of the calibration pattern in the image) using the cv::solvePnPRansac() function, so I have the rotation vector (which I believe I can turn into a matrix using cv::Rodrigues()).
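What I have so far looks roughly like this (a sketch; the variable names are made up):

#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <vector>

std::vector<cv::Point3f> objectPoints;  // 3D corners of the calibration pattern
std::vector<cv::Point2f> imagePoints;   // detected 2D corners in the image
cv::Mat cameraMatrix, distCoeffs;       // intrinsics calibrated earlier

// ... fill objectPoints, imagePoints, cameraMatrix and distCoeffs ...

cv::Mat rvec, tvec;
cv::solvePnPRansac(objectPoints, imagePoints, cameraMatrix, distCoeffs, rvec, tvec);

cv::Mat R;
cv::Rodrigues(rvec, R);  // 3x3 rotation: pattern frame -> camera frame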
What would the next step be for me in my calculations?
Ok, so I have my Arduino hooked up to an MPU-6050 6-axis gyro + accelerometer on a GY-521 breakout board. I've got it sending serial data to a program on my computer, all working perfectly. I'll pose this question in C++ even though my program is written in DBPro.
I'm currently using a complementary filter to combine the gyro rates of angular rotation with the accelerometer gravity vector for stable yet responsive rotations. Here's my current code:
angleX=0.98*(angleX+(gyroX/16.4)*timeMultiplier)+0.02*atanfull(accelY,accelZ)
angleY=0.98*(angleY+(gyroY/16.4)*timeMultiplier)+0.02*atanfull(accelZ,accelX)
angleZ=0.98*(angleZ+(gyroZ/16.4)*timeMultiplier)+0.02*atanfull(accelX,accelZ)
angleX/Y/Z and timeMultiplier are floats. timeMultiplier is basically (millis()-lastMillis)/1000.0. atanfull is basically atan2 but outputs in degrees (0-360), not radians (I think that's the only difference).
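Since I said I'd pose this in C++, the above is roughly equivalent to this (a sketch, assuming the +/-2000 deg/s gyro range that gives 16.4 LSB per deg/s, and using atan2 converted to degrees, which wraps at -180/180 rather than 0-360 like atanfull):

#include <cmath>

const float GYRO_SENS = 16.4f;                // LSB per deg/s at +/-2000 deg/s
const float RAD2DEG   = 180.0f / 3.14159265f;

// One complementary-filter step, same structure as the DBPro code above.
// dt is the time since the last sample in seconds.
void update(float gyroX, float gyroY, float gyroZ,
            float accelX, float accelY, float accelZ, float dt,
            float &angleX, float &angleY, float &angleZ)
{
    angleX = 0.98f * (angleX + (gyroX / GYRO_SENS) * dt)
           + 0.02f * (std::atan2(accelY, accelZ) * RAD2DEG);
    angleY = 0.98f * (angleY + (gyroY / GYRO_SENS) * dt)
           + 0.02f * (std::atan2(accelZ, accelX) * RAD2DEG);
    angleZ = 0.98f * (angleZ + (gyroZ / GYRO_SENS) * dt)
           + 0.02f * (std::atan2(accelX, accelZ) * RAD2DEG);
}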
The issue is that when the board is sitting flat on the table, the accelerometer obviously can't measure rotations about the Y axis (up), but the gyro can. When I twist the board around Y, the 3D cube representation I'm rendering twists for a second then is pulled back to 0. That's because when the board is upright (with the Y axis pointing upwards) the atanfull results in 0.
What I've been scouring the internet for (but really don't know what to search for) is a way to reduce the effect of the accelerometer data the less useful it becomes. So as the X axis, say, is rotated until it's perpendicular to the ground (and becomes useless for measurements), the gyro is allowed more and more control, until, when the X axis is fully perpendicular, the filter takes no notice of the accelerometer for that axis and relies fully on the gyro.
TL;DR: I'm trying to find a way to combine gyro and accelerometer readings in a rotation-independent way. That is, all the implementations of gyro+accel complementary filters I can find handle pitch/roll differently to yaw, and I don't want to do that. I want to be able to turn my board in any direction and still get good readings.
I really have no idea what I should be searching for to find out how to do this, and would appreciate any help :)
I'm accessing the Kinect accelerometer in C++ via openFrameworks and ofxKinect, and am having some issues with certain angles. If I pitch the Kinect 90 degrees downwards, I get nan values. I had a look at the getAccelPitch() method and this kind of makes sense, since asin will return nan when the measured value is greater than 9.80665 (the value divided by 9.80665, e.g. 10.1/9.80665, is greater than 1).
The main problem though is that after I pitch the device 90 degrees, the roll doesn't seem reliable (it doesn't seem to change much). In my setup I will need to have the device pitched 90 degrees but also know its new roll.
Any hints or tips on how I might do that? Is there an easy way to use the data to draw the Kinect's orientation with 3 lines (axes)?
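For reference, this is roughly the kind of computation I mean (not the actual ofxKinect code, just pitch and roll from a raw accelerometer vector using atan2 so it doesn't produce nan; the axis convention here is assumed and may not match the Kinect's):

#include <cmath>

// Pitch and roll (in degrees) from a raw accelerometer reading (ax, ay, az).
// atan2 avoids the nan that asin produces when the magnitude exceeds 1 g,
// but roll still becomes ill-defined at +/-90 degrees of pitch, because
// gravity then lies along the axis that roll is measured around.
void accelToPitchRoll(float ax, float ay, float az,
                      float &pitchDeg, float &rollDeg)
{
    const float RAD2DEG = 180.0f / 3.14159265f;
    pitchDeg = std::atan2(-ax, std::sqrt(ay * ay + az * az)) * RAD2DEG;
    rollDeg  = std::atan2(ay, az) * RAD2DEG;
}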
I'm trying to detect orientations like these:
The problem is that you are using Euler angles (roll, pitch and yaw).
Euler angles are evil and they screw up the stability of your app; see for example Strange behavior with android orientation sensor and Reducing wiimote pitch/roll variations. They are not useful for interpolation either.
A solution is to use rotation matrices instead. A tutorial on rotation matrices is given in the Direction Cosine Matrix IMU: Theory manuscript.
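As a very rough illustration of the DCM approach (a sketch of one gyro integration step only; the accelerometer-based drift correction and the renormalization described in the manuscript are omitted):

#include <cmath>

// One integration step of a direction cosine matrix R using gyro rates
// (wx, wy, wz in rad/s) over a small time step dt.
void dcmUpdate(float R[3][3], float wx, float wy, float wz, float dt)
{
    // Small-angle rotation: R <- R * (I + [w]x * dt)
    float dR[3][3] = {
        { 1.0f,    -wz * dt,  wy * dt },
        { wz * dt,  1.0f,    -wx * dt },
        {-wy * dt,  wx * dt,  1.0f    }
    };

    float out[3][3];
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            out[i][j] = R[i][0] * dR[0][j] + R[i][1] * dR[1][j] + R[i][2] * dR[2][j];

    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            R[i][j] = out[i][j];

    // In practice R must also be re-orthonormalized periodically and corrected
    // with the accelerometer, as described in the manuscript.
}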