I need to get an up vector for a camera (to get the right look) from roll, pitch, and yaw angles (in degrees). I've been trying different things for a couple of hours and have had no luck :(. Any help here would be appreciated!
Roll, pitch and yaw define a rotation about 3 axes. From these angles you can construct a 3x3 transformation matrix which expresses this rotation (see here how).
After you have this matrix, take your regular up vector, say (0,1,0) if 'up' is the Y axis, and multiply it with the matrix. What you get is the transformed up vector.
Edit-
Applying the transformation to (0,1,0) is the same thing as taking the middle row. The 3 rows of the matrix make up an orthonormal basis of the rotated system. Mind you that 3D graphics APIs use 4x4 matrices, so to make a 4x4 matrix out of the 3x3 rotation matrix you need to add a '1' at M[3][3] (the corner) and zeros in the rest, like so:
r r r 0
r r r 0
r r r 0
0 0 0 1
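If you happen to be using glm, a minimal sketch of the whole idea might look like this (glm and its yawPitchRoll helper are my assumption, not part of the original answer; the GTX extension may need GLM_ENABLE_EXPERIMENTAL):
#include <glm/glm.hpp>
#include <glm/gtx/euler_angles.hpp> // glm::yawPitchRoll (GTX extension, assumed available)

// Build the rotation from yaw/pitch/roll given in degrees and apply it to
// the default up vector (0, 1, 0).
glm::vec3 rotatedUp(float yawDeg, float pitchDeg, float rollDeg) {
    glm::mat4 rotation = glm::yawPitchRoll(glm::radians(yawDeg),
                                           glm::radians(pitchDeg),
                                           glm::radians(rollDeg));
    return glm::vec3(rotation * glm::vec4(0.0f, 1.0f, 0.0f, 0.0f));
}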
This may not directly answer your question, although it may still help. I have a free open-source project for XNA that creates a debug terminal that overlays your game while it is running. You can use this for looking up values, invoking methods, or whatever. So if you have a transformation matrix and you wanted to extract various parts of it while the game is running, you can do that. The project can be found here:
http://www.protohacks.net/xna_debug_terminal
I don't have much expertise in the kind of math you are using, but hopefully Shoosh's post helps on that. Maybe the debug terminal can help you when trying out his idea or in any other future problems you encounter.
12 years later...
In case anyone is still interested in the answer to this question, here is the solution (even though it's in Java, it should be pretty easy to translate into other languages):
private Vector3f getRayFromCamera() {
    // Convert the camera angles from degrees to radians.
    double yaw = Math.toRadians(getYaw());
    double pitch = Math.toRadians(getPitch());
    // Vertical component: cos(pitch - 90 degrees) is the same as sin(pitch).
    float vertical = (float) Math.cos(pitch - Math.toRadians(90)) * -1;
    float horizontal = 1 - Math.abs(vertical); // scale for the horizontal components
    float rx = (float) Math.sin(yaw) * -1 * horizontal;
    float ry = vertical;
    float rz = (float) Math.cos(yaw) * -1 * horizontal;
    return new Vector3f(rx, ry, rz);
}
Note: this calculates the front vector, but by applying the same rotation to the vector (0,1,0) instead you can get the up vector!
Related
I am working on a game project using OpenGl. I am building a game from skeleton code I found online. I have a character that can move around in a 2D plane. (x and z, ie you are viewing the character from above.) I am currently stuck on making him rotate as he moves, and I can't seem to find a solution online that solves my problem.
At the moment, when the character is being drawn he faces a certain way (along the arrow in my diagram below). I can rotate him an arbitrary number of degrees from his default direction using glm::rotate.
I have updated the code to log the character's position from a frame ago when he moves, so I have this information:
character old position (known)-> O
character starting angle (unknown)-> |\
| \
| \
|(X)\
| \
V O <- character new position (known)
How do I compute the angle (X)? Is it possible with the information I have?
I have been doodling on a page trying to figure this out for the last hour but can't seem to figure it out. Thanks very much.
Yes. This answer gives you an example of how to do it: How to calculate the angle between a line and the horizontal axis?
Note, however, that this will give you the angle between the horizontal axis and the point. You can just add 90 degrees to that.
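In code that boils down to a single atan2 call; here is a rough sketch (the function and parameter names are mine, standing in for the two logged positions):
#include <cmath>

// Angle (radians) between the horizontal axis and the direction from the old
// position to the new one; x and z because the character moves in the xz plane.
float headingAngle(float oldX, float oldZ, float newX, float newZ) {
    return std::atan2(newZ - oldZ, newX - oldX);
}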
What you're doing sounds somewhat convoluted. From the description, it seems like you want a rotation matrix that matches the direction. There's really no need to calculate an angle. You can build the rotation matrix directly, which is easier and more efficient.
I'll illustrate the calculations with points/vectors in the xy-plane, since that's much more standard. It sounds like you're operating in the xz-plane. While that doesn't change things much, you might need slight changes because you have a left-handed coordinate system.
If you have the direction vector (difference between new position and old position), all you need to do is normalize it, and you already have what's needed for the rotation matrix. I'll write the calculation explicitly, but your matrix/vector library most likely has a method to normalize a vector.
float vx = newPosX - oldPosX;
float vy = newPosY - oldPosY;
float s = 1.0f / sqrt(vx * vx + vy * vy);
vx *= s;
vy *= s;
vx is now the cosine of the rotation angle, and vy the sine of the rotation angle. Substituting this into the standard form of a rotation matrix, you get:
R = ( cos(phi) -sin(phi) ) = ( vx -vy )
( sin(phi) cos(phi) ) ( vy vx )
This is the absolute rotation for the new direction. If you need the relative rotation between old direction and new direction, it just takes a few more operations. Let's say you already calculated the normalized vectors for the old and new directions as (v1x, v1y) and (v2x, v2y). The cosine of the rotation angle is the scalar product of the two vectors:
cosPhi = v1x * v2x + v1y * v2y;
and the sine is the length of the cross product. Since both vectors are in the xy-plane, that's simply the z-component of the cross product:
sinPhi = v1x * v2y - v1y * v2x;
With these two values, you can directly build the rotation matrix again:
R = ( cosPhi -sinPhi )
( sinPhi cosPhi )
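For completeness, the same thing written out as plain C++ (the tiny Mat2 struct and the function names are just for illustration, not from any particular library):
#include <cmath>

struct Mat2 { float m00, m01, m10, m11; }; // row-major 2x2 rotation matrix

// Absolute rotation that maps (1, 0) onto the (normalized) direction vector.
Mat2 rotationFromDirection(float vx, float vy) {
    float s = 1.0f / std::sqrt(vx * vx + vy * vy);
    vx *= s; // cos(phi)
    vy *= s; // sin(phi)
    return { vx, -vy,
             vy,  vx };
}

// Relative rotation from normalized direction (v1x, v1y) to (v2x, v2y).
Mat2 relativeRotation(float v1x, float v1y, float v2x, float v2y) {
    float cosPhi = v1x * v2x + v1y * v2y; // dot product
    float sinPhi = v1x * v2y - v1y * v2x; // z-component of the cross product
    return { cosPhi, -sinPhi,
             sinPhi,  cosPhi };
}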
I've been trying to figure out what 2 portions of code are doing in this tutorial: Keyboard and Mouse.
Specifically:
// Direction : Spherical coordinates to Cartesian coordinates conversion
glm::vec3 direction(
cos(verticalAngle) * sin(horizontalAngle),
sin(verticalAngle),
cos(verticalAngle) * cos(horizontalAngle)
);
and
// Right vector
glm::vec3 right = glm::vec3(
sin(horizontalAngle - 3.14f/2.0f),
0,
cos(horizontalAngle - 3.14f/2.0f)
);
I don't see how the first one is spherical -> cartesian. When I look it up, I get:
x = r * sin(theta) * cos(phi)
y = r * sin(theta) * sin(phi)
z = r * cos(theta)
I've read up on Euler angles, axis-angle and quaternions, but none of those have shed light on what this is doing, or I'm just not able to grasp what I'm reading. ;)
On the 2nd one, shouldn't the right vector just be 90 degrees to the right of the direction vector?
This has a lot to do with the tutorial maker, and how he has decided to use spherical coordinates to generate his viewing angles. His approach is interesting, but remember that you can come up with your own!
// Direction : Spherical coordinates to Cartesian coordinates conversion
glm::vec3 direction(
cos(verticalAngle) * sin(horizontalAngle),
sin(verticalAngle),
cos(verticalAngle) * cos(horizontalAngle)
);
The reason this looks different from the formula you looked up is that it's the same idea, but converted into a different space. The author simply wants the camera to look "straight ahead" when verticalAngle == 0 && horizontalAngle == 0.
Work it out yourself!
x = cos(0) * sin(0) = 0
y = sin(0) = 0
z = cos(0) * cos(0) = 1
So, in this instance, the "look" vector of the camera is pointed directly along the z-axis, which in a typical OpenGL application would generally be considered straight ahead (i.e. the Y-axis is usually up and down).
Try calculating different angles and see how that would make the camera look vector spin around.
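For example, a quick throwaway check (assuming glm is available) that keeps the camera level and sweeps the horizontal angle:
#include <cmath>
#include <cstdio>
#include <glm/glm.hpp>

int main() {
    float verticalAngle = 0.0f; // keep the camera level
    for (float horizontalAngle = 0.0f; horizontalAngle < 6.3f; horizontalAngle += 1.57f) {
        glm::vec3 direction(
            std::cos(verticalAngle) * std::sin(horizontalAngle),
            std::sin(verticalAngle),
            std::cos(verticalAngle) * std::cos(horizontalAngle));
        std::printf("h = %.2f rad -> (%.2f, %.2f, %.2f)\n",
                    horizontalAngle, direction.x, direction.y, direction.z);
    }
}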
In the second instance, the author has taken some liberties with the formula, and defined it in a way which would only be useful in first-person games / applications. There are some 3D situations in which the camera can be oriented in a different way (a flight simulator, for example). Regardless, it's still the same idea. The author is just adjusting spherical coordinates to his own needs.
Personally, I prefer to use Euler angles for camera angles. It's a little bit more work to set up (you'll need to do some matrix math), but it's a different way of solving the same problem, and it might be more useful in a situation that goes beyond the typical FPS game.
I'm trying to implement a simple AI system in my DirectX application. I can get my AI to rotate and face the direction I give it, but I can't figure out how it should determine which way to rotate toward that direction (i.e. should it rotate left or rotate right?).
Here is the code I've got which works out the angle it needs to rotate by to face the direction it's given:
D3DXVECTOR3 incident = destination - position;
float top = D3DXVec3Dot(&incident, &forwardVec);
float bottom = sqrt((incident.x * incident.x) + (incident.y * incident.y) + (incident.z * incident.z)) *
sqrt((forwardVec.x * forwardVec.x) + (forwardVec.y * forwardVec.y) + (forwardVec.z * forwardVec.z));
float remainingAngle = acos(top/bottom) * 180.0f / PI;
The forwardVec is a D3DXVECTOR3 of which way the AI is currently facing.
The dot product rule just tells you the shortest angle (which is always less than 180!), not which way to go. Do you have a way to get a direction angle out of a D3DXVECTOR (ie polar form kind of thing?) If so, then you can subtract (desired angle)-(current angle) and if that is within -180 to 180 go counterclockwise; otherwise, go clockwise.
I have a feeling that the cross product might also give a method, but I'd have to sit down with a piece of paper to work it out.
Let's suppose that straight ahead is 0 and you're counting degrees in a clockwise fashion.
If you need to turn 180 or less then you're moving right.
If you need to turn more than 180 you have to turn left. This turn is a left turn of 360 - value degrees.
I hope this answers your question.
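Here is a small sketch of that angle-difference idea (angles in degrees; the function and parameter names are mine):
#include <cmath>

// Signed shortest turn from currentAngle to desiredAngle, in degrees.
// The result is in (-180, 180]: with angles counted clockwise as above,
// positive means turn right (clockwise), negative means turn left.
float shortestTurn(float currentAngle, float desiredAngle) {
    float diff = std::fmod(desiredAngle - currentAngle, 360.0f);
    if (diff > 180.0f)   diff -= 360.0f;
    if (diff <= -180.0f) diff += 360.0f;
    return diff;
}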
The angle between 2 normalized vectors:
double GetAng (const D3DXVECTOR3& Xi_V1, const D3DXVECTOR3& Xi_V2)
{
D3DXVECTOR3 l_Axis;
D3DXVec3Cross(&l_Axis, &Xi_V1, &Xi_V2);
return atan2(D3DXVec3Length(&l_Axis), D3DXVec3Dot(&Xi_V1, &Xi_V2));
}
The returned angle is between 0 and PI (the cross-product length is never negative) and represents the magnitude of the shortest angular rotation from v1 to v2.
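To also get which way to rotate (the original question), a common trick is to check the sign of the 'up' component of the cross product between the current facing and the direction to the target. A rough sketch using the D3DX types above; it assumes a Y-up world, and in a left-handed coordinate system the comparison may need to be flipped:
#include <d3dx9math.h>

// True if the target lies on one side of the current facing, false for the
// other; which side counts as "left" depends on the handedness of your world.
bool shouldTurnLeft(const D3DXVECTOR3& position,
                    const D3DXVECTOR3& destination,
                    const D3DXVECTOR3& forwardVec)
{
    D3DXVECTOR3 incident = destination - position;
    D3DXVECTOR3 cross;
    D3DXVec3Cross(&cross, &forwardVec, &incident);
    return cross.y > 0.0f; // sign of the up component picks the turn direction
}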
This is how I calculate my line of sight vector and the up vector.
ly = sin(inclination);
lx = cos(inclination)*sin(azimuth);
lz = cos(inclination)*cos(azimuth);
uy = sin(inclination + M_PI / 2.0);
ux = cos(inclination + M_PI / 2.0)*sin(azimuth + M_PI);
uz = cos(inclination + M_PI / 2.0)*cos(azimuth + M_PI);
inclination is the angle of the line of sight vector from the xz plane and azimuth is the angle in the xz plane.
This works fine till my inclination reaches 225 degrees. At that point there is a discontinuity in the rotation for some reason. (Note: by 225 degrees, I mean it's past the upside-down point.)
Any ideas as to why this is so?
EDIT: Never mind, figured it out. The azimuth does not need a 180 deg. tilt for the up vector.
I think you are talking about a limit angle of 90 degrees (pi/2). What you get is normal behavior. When using gluLookAt, you specify an 'up' vector, used to determine the roll of the camera. In the special case where you are looking straight up or straight down, the 'up' vector is parallel to the eye direction vector, so it is not possible to determine the roll of the camera (this problem has an infinite number of solutions, so an arbitrary one is chosen by gluLookAt). Maybe you should compute this 'up' vector using your inclination and azimuth.
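Following the asker's edit (drop the 180-degree azimuth offset), the up vector comes out of the same spherical formula with the inclination shifted by 90 degrees. A sketch using the variables from the question:
#include <cmath>

// Line of sight and up vector from inclination and azimuth (radians),
// mirroring the code in the question.
void lookAndUp(double inclination, double azimuth,
               double& lx, double& ly, double& lz,
               double& ux, double& uy, double& uz)
{
    ly = std::sin(inclination);
    lx = std::cos(inclination) * std::sin(azimuth);
    lz = std::cos(inclination) * std::cos(azimuth);
    // Up: inclination shifted by 90 degrees, azimuth left unchanged.
    uy = std::sin(inclination + M_PI / 2.0);
    ux = std::cos(inclination + M_PI / 2.0) * std::sin(azimuth);
    uz = std::cos(inclination + M_PI / 2.0) * std::cos(azimuth);
}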
I'm primarily a Flash AS3 dev, but I'm jumping into openFrameworks and having trouble using 3D (these examples are in AS3).
In 2D you can simulate an object orbiting a point by using Math.sin() and Math.cos(), like so:
function update(event:Event):void
{
dot.x = xCenter + Math.cos(angle*Math.PI/180) * range;
dot.y = yCenter + Math.sin(angle*Math.PI/180) * range;
angle+=speed;
}
I am wondering how I would translate this into a 3D orbit, if I wanted to also orbit in the third dimension.
function update(event:Event):void
{
...
dot.z = zCenter + Math.sin(angle*Math.PI/180) * range;
// is this valid?
}
Any help is greatly appreciated.
If you are orbiting around the z-axis, you are leaving your z-coordinate fixed and changing your x- and y-coordinates. So your first code sample is what you are looking for.
To rotate around the x-axis (or y-axis), just replace x (or y) with z. Use cos on whichever axis you want to be at 0 degrees; the choice is arbitrary.
If what you actually want is to orbit an object around a point in 3D space, you'll need two angles to describe the orbit: its inclination (elevation) angle and its azimuthal angle. See here and here.
For reference, those equations are (where θ and φ are your angles)
x = x0 + r sin(θ) cos(φ)
y = y0 + r sin(θ) sin(φ)
z = z0 + r cos(θ)
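A direct translation of those equations into code (plain C++; function and parameter names are just illustrative):
#include <cmath>

// Point on a sphere of radius r around (x0, y0, z0) for angles theta and phi (radians).
void orbitPosition(double theta, double phi, double r,
                   double x0, double y0, double z0,
                   double& x, double& y, double& z)
{
    x = x0 + r * std::sin(theta) * std::cos(phi);
    y = y0 + r * std::sin(theta) * std::sin(phi);
    z = z0 + r * std::cos(theta);
}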
If you are orbiting around the Z axis, then you just use your first code and leave the Z coordinate as is.
I would pick two perpendicular unit vectors v, w that define the plane in which to orbit, then loop over the angle and pick the proper combination of v and w to build your vector p = a*v + b*w.
More details are coming.
EDIT:
This might be of help
http://en.wikipedia.org/wiki/Orbit_equation
EDIT: I think it is actually
center + sin(angle) * v * radius1 + cos(angle) * w * radius2
Here v and w are your unit vectors for the circle.
In 2D they were (1,0) and (0,1).
In 3D you will need to compute them - depends on orientation of the plane.
If you set radius1 = radius2, you will get a circle. Otherwise, you should get an ellipse.
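In code that could look something like this (glm and the function name are my choice, not part of the answer):
#include <cmath>
#include <glm/glm.hpp>

// Point on the orbit defined by the unit vectors v and w spanning the plane.
glm::vec3 orbitPoint(const glm::vec3& center,
                     const glm::vec3& v, const glm::vec3& w,
                     float radius1, float radius2, float angleRadians)
{
    return center
         + std::sin(angleRadians) * v * radius1
         + std::cos(angleRadians) * w * radius2;
}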
If you just want the orbit to happen in an angled plane and don't mind it being elliptical, you can just do something like z = 0.2*x + 0.2*y, or any combination you fancy, after you have determined the x and y coordinates.