How to Pitch Camera Around Origin - c++

I am trying to implement a camera which orbits around the origin. I have successfully implemented yaw using the gluLookAt function, and I am now trying to implement pitch, but I have a few issues with the outcome (pitch only works if I first yaw to a certain point and then pitch).
Here is my attempt so far:
float distance, // radius (from origin), updated by -, + keys
      pitch,    // angle in degrees, updated by W, S keys (increments of +/- 10)
      yaw;      // angle in degrees, updated by A, D keys (increments of +/- 10)

view = lookAt(
    Eigen::Vector3f(distance * sin(toRadians(pitch)) * cos(toRadians(yaw)),
                    distance * sin(toRadians(pitch)) * sin(toRadians(yaw)),
                    distance * cos(toRadians(pitch))),
    Eigen::Vector3f(0.0f, 0.0f, 0.0f),
    Eigen::Vector3f(0.0f, 0.0f, 1.0f));

proj = perspective(toRadians(90.0f), static_cast<float>(width) / height, 1.0f, 10.0f);
I feel like my issue is the up vector, but I'm not sure how to update it properly. (At the same time, I think it's fine, since I always want the orientation of the camera to stay the same; I really just want to move the position of the camera.)
Edit: I wanted to add that I'm calculating the position based on the info found here: http://tutorial.math.lamar.edu/Classes/CalcIII/SphericalCoords.aspx I'm not sure if the math discussed there translates over directly, so please correct me if I'm wrong.

It might be a matter of interpretation. Your code looks correct, but pitch might not have the meaning that you think.
When pitch is 0, the camera is located at the north pole of the sphere, (0, 0, 1). This is a bit problematic, since your up vector and view direction become parallel and you will not get a valid transform. Then, as pitch increases, the camera moves south until it reaches the south pole at pitch = PI. Your code should work for any point that is not at one of the poles. You might want to swap sin(pitch) and cos(pitch) so that the camera starts at the equator when pitch = 0 (and supports both positive and negative pitch).
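A minimal sketch of that fix, with the z-axis as up and pitch measured from the equator; the names (Vec3, orbitEye) are mine, not from the question:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Eye position on a sphere around the origin, z-axis up. Compared with the
// original code, sin(pitch) and cos(pitch) are swapped, so pitch = 0 lands
// on the equator and pitch = +pi/2 on the north pole (the degenerate case).
Vec3 orbitEye(float distance, float pitch, float yaw) // angles in radians
{
    return {
        distance * std::cos(pitch) * std::cos(yaw),
        distance * std::cos(pitch) * std::sin(yaw),
        distance * std::sin(pitch)
    };
}
```

With this convention, a pitch of 0 and yaw of 0 places the camera on the positive x-axis at the given distance, looking at the origin with the up vector (0, 0, 1) well defined.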
Actually, I prefer to model this kind of camera more directly as a combination of matrices:
view = Tr(0, 0, -distance) * RotX(-pitch) * RotY(-yaw)
Tr is a translation matrix, RotX is a rotation about the x-axis, and RotY is a rotation about the y-axis. This assumes that the y-axis is up. If you want another axis to be up, you can just add an according rotation matrix. E.g., if you want the z-axis to be up, then
view = Tr(0, 0, -distance) * RotX(-pitch) * RotY(-yaw) * RotX(-Pi/2)
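For illustration, the composition above can be sketched with a hand-rolled row-major 4x4 type; Mat4, mul, and the other helpers are my own and assume column vectors, as OpenGL does:

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat4 = std::array<std::array<float, 4>, 4>;

Mat4 identity() { Mat4 m{}; for (int i = 0; i < 4; ++i) m[i][i] = 1.0f; return m; }

Mat4 mul(const Mat4& a, const Mat4& b) {
    Mat4 r{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k) r[i][j] += a[i][k] * b[k][j];
    return r;
}

Mat4 translate(float x, float y, float z) {
    Mat4 m = identity(); m[0][3] = x; m[1][3] = y; m[2][3] = z; return m;
}

Mat4 rotX(float a) {
    Mat4 m = identity();
    m[1][1] = std::cos(a); m[1][2] = -std::sin(a);
    m[2][1] = std::sin(a); m[2][2] =  std::cos(a);
    return m;
}

Mat4 rotY(float a) {
    Mat4 m = identity();
    m[0][0] =  std::cos(a); m[0][2] = std::sin(a);
    m[2][0] = -std::sin(a); m[2][2] = std::cos(a);
    return m;
}

// view = Tr(0, 0, -distance) * RotX(-pitch) * RotY(-yaw), y-axis up.
Mat4 orbitView(float distance, float pitch, float yaw) {
    return mul(translate(0, 0, -distance), mul(rotX(-pitch), rotY(-yaw)));
}
```

With pitch = yaw = 0 this reduces to a pure translation that pushes the scene back by the orbit distance, which is exactly the "camera at (0, 0, distance) looking at the origin" starting state.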

Related

How to set the pitch, yaw, roll of a quaternion relative to the world

So I have a quaternion that stores the orientation of an object and I have functions to change the pitch, yaw and roll of the quaternion like so:
void pitch(float amount)
{
    orientation *= glm::angleAxis(glm::radians(amount), glm::vec3(1, 0, 0));
}

void yaw(float amount)
{
    orientation *= glm::angleAxis(glm::radians(-amount), glm::vec3(0, 1, 0));
}

void roll(float amount)
{
    orientation *= glm::angleAxis(glm::radians(amount), glm::vec3(0, 0, -1));
}
What I want to do is set the pitch, yaw and roll relative to the world instead of adding to the current orientation. What I mean by that is: right now, say I have a roll of 90°. Each time I use my pitch function and pass 45° into it, the roll will turn into 135°, then 180°, and so on.
What I want the setRoll function to do is: when I pass 45° into it, it will set the roll of the quaternion to 45° and still keep the pitch and yaw.
I made a better explanation in my comment below.
To picture what I'm trying to do, imagine four walls around you, each with a hanging picture of an arrow pointing to the world up. No matter what the pitch, yaw and roll of the quaternion, if I'm looking at any of the hanging pictures (any yaw) but with a little roll (so the arrow points anywhere but up), then calling setRoll with 0° should make the object see the arrow pointing up. If I pass 90° to setRoll, then no matter how many times I call the function, it should set the object's roll to 90° so the arrow points to the left.
Thanks to Thomas' answer, I now have this code, but sadly it still doesn't work correctly:
void Camera::setPitch(float amount)
{
    glm::vec3 globalUp = glm::vec3(0.0f, 1.0f, 0.0f);
    glm::vec3 globalRight = glm::vec3(1.0f, 0.0f, 0.0f);

    // "The pitch is the angle between the front vector and the horizontal plane. This is pi/2 minus the angle between the front vector and the global up vector."
    float currentPitch = 3.14f / 2.0f - glm::acos(glm::dot(direction, globalRight));

    // "To find the angle over which you need to rotate, compute the current pitch, and subtract this from the target pitch."
    amount = currentPitch - amount;

    // "To set the pitch without affecting roll and yaw, you'll want to rotate around an axis that lies in the horizontal plane and is orthogonal to the front vector. For a vector to lie in a plane, it means that it's orthogonal to the plane's normal, so we're looking for a vector that is orthogonal to both the global up vector and the front vector. That's just the cross product of the two."
    glm::quat rotation = glm::angleAxis(glm::radians(amount), glm::cross(globalUp, direction));
    orientation *= rotation;

    updateCameraVectors();
    updateViewMatrix();
}
void Camera::setYaw(float amount)
{
    glm::vec3 globalUp = glm::vec3(0.0f, 1.0f, 0.0f);
    glm::vec3 globalRight = glm::vec3(1.0f, 0.0f, 0.0f);

    // "The yaw is the angle between the projection of the front vector onto the global horizontal plane, and some global "zero yaw" vector that also lies in the global horizontal plane. To project the front vector onto the global horizontal plane, simply set its vertical coordinate to zero."
    float currentYaw = 3.14f / 2.0f - glm::acos(glm::dot(direction, globalRight));

    // "To find the angle over which to rotate, compute the current yaw and subtract this from the target yaw."
    amount = currentYaw - amount;

    // "To change the yaw, simply rotate around the global up vector."
    glm::quat rotation = glm::angleAxis(glm::radians(amount), globalUp);
    orientation *= rotation;

    // "Note that yaw is ill-defined when looking straight up; in that case, you can maybe set the roll instead, because the two are essentially the same."
    // TODO...

    updateCameraVectors();
    updateViewMatrix();
}
void Camera::setRoll(float amount)
{
    glm::vec3 globalUp = glm::vec3(0.0f, 1.0f, 0.0f);

    // "The roll is the angle between the local right vector and the global up vector, minus pi/2"
    float currentRoll = glm::acos(glm::dot(right, globalUp)) - 3.14f / 2.0f;

    // "To find the angle over which to rotate, compute the current roll and subtract this from the target roll."
    amount = currentRoll - amount;

    // "To change the roll, you'll want to rotate around the local front vector."
    glm::quat rotation = glm::angleAxis(glm::radians(amount), direction);
    orientation *= rotation;

    updateCameraVectors();
    updateViewMatrix();
}
So here's my understanding of what you're trying to do. You have a unit quaternion orientation. You can picture this as a "front" vector pointing out from your eyes, and an "up" vector pointing out of the top of your head. From these follows a "right" vector pointing out your right ear.
You didn't explain your coordinate system, so I'll keep this in pretty generic terms. And this post is more of a "how to think about the problem" rather than working code, but I hope that's at least as useful in the long term.
Pitch
The pitch is the angle between the front vector and the horizontal plane. This is pi/2 minus the angle between the front vector and the global up vector.
To set the pitch without affecting roll and yaw, you'll want to rotate around an axis that lies in the horizontal plane and is orthogonal to the front vector. For a vector to lie in a plane, it means that it's orthogonal to the plane's normal, so we're looking for a vector that is orthogonal to both the global up vector and the front vector. That's just the cross product of the two.
To find the angle over which you need to rotate, compute the current pitch, and subtract this from the target pitch.
Yaw
The yaw is the angle between the projection of the front vector onto the global horizontal plane, and some global "zero yaw" vector that also lies in the global horizontal plane. To project the front vector onto the global horizontal plane, simply set its vertical coordinate to zero.
To change the yaw, simply rotate around the global up vector.
To find the angle over which to rotate, compute the current yaw and subtract this from the target yaw.
Note that yaw is ill-defined when looking straight up; in that case, you can maybe set the roll instead, because the two are essentially the same.
Roll
The roll is the angle between the local right vector and the global up vector, minus pi/2.
To change the roll, you'll want to rotate around the local front vector.
To find the angle over which to rotate, compute the current roll and subtract this from the target roll.
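The three extraction formulas above can be sketched like this, assuming y is the global up and -z is the "zero yaw" direction (adapt to your own axes; the Vec3 type and helper names are mine):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Pitch: pi/2 minus the angle between the (unit) front vector and global up.
float currentPitch(Vec3 front) {
    return 3.14159265f / 2.0f - std::acos(dot(front, {0, 1, 0}));
}

// Yaw: angle of front projected onto the horizontal plane (y dropped),
// measured from -z; atan2 handles all four quadrants.
float currentYaw(Vec3 front) {
    return std::atan2(front.x, -front.z);
}

// Roll: angle between the (unit) local right vector and global up, minus pi/2.
float currentRoll(Vec3 right) {
    return std::acos(dot(right, {0, 1, 0})) - 3.14159265f / 2.0f;
}
```

A camera in its rest pose (front (0, 0, -1), right (1, 0, 0)) gives pitch, yaw and roll of 0 under these definitions, which is a useful sanity check before wiring the angles into the set functions.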

Does the camera face the x axis when the yaw is 0?

So I’ve been reading about the camera in learnopengl and noticed that in the yaw image, it seems as though the camera is facing the x axis when the yaw is 0. Shouldn’t the camera be facing the negative z axis? I attached the image to this message. In the image, the yaw is already a certain amount of degrees, but if the yaw were 0, would that mean that the camera is facing the x axis?
First, let's bring a little bit more context into your question so that we know what you are actually talking about.
We can assume that when you say
so I’ve been reading about the camera in learnopengl
that by this you are specifically referring to the chapter called "Camera" in the https://learnopengl.com/Getting-started/Camera tutorial.
Under the sub-section "Euler angles", there is the image which you are also including in your question.
If you read a little bit further, you'd see the following definition of the direction vector, which that tutorial later uses to build a lookAt matrix:
direction.x = cos(glm::radians(yaw)) * cos(glm::radians(pitch));
direction.y = sin(glm::radians(pitch));
direction.z = sin(glm::radians(yaw)) * cos(glm::radians(pitch));
So, just by looking at that and doing the math, we see that direction will be (1, 0, 0) when yaw and pitch are 0.
But, if we read further, we see this paragraph:
We've set up the scene world so everything's positioned in the direction of the negative z-axis. However, if we look at the x and z yaw triangle we see that a θ of 0 results in the camera's direction vector to point towards the positive x-axis. To make sure the camera points towards the negative z-axis by default we can give the yaw a default value of a 90 degree clockwise rotation. Positive degrees rotate counter-clockwise so we set the default yaw value to:
yaw = -90.0f;
So that is the answer to your question: In the context of that tutorial, by simply its own definition of the direction vector, it will point to (1, 0, 0) when both angles are 0. And to counteract this, that tutorial will assume that the initial value of yaw is -90.
Sure, they could've just used a different formula for the direction vector components to yield (0, 0, -1) as the result when the angles are 0, but they didn't.
I think you might be confusing the definitions of camera space and projection space. -z-forward is how the projection matrix is defined, whereas the above is how the view matrix is defined. When you multiply them, you get the effects of both: a perspective-corrected final image that creates the illusion of depth, as well as orthogonal world-space coordinates that you can properly translate and rotate in.
If you want to define it to be -z forward:
We replace the trig for yaw, and pitch isn't affected:
direction.x = sin(glm::radians(yaw)) * cos(glm::radians(pitch));
direction.y = sin(glm::radians(pitch));
direction.z = -cos(glm::radians(yaw)) * cos(glm::radians(pitch));
You can now use Yaw = 0.0f and the default camera will be looking down -z.
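As a quick sanity check of the two conventions discussed here, in plain C++ rather than glm (the Vec3 type and function names are mine):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

float rad(float deg) { return deg * 3.14159265f / 180.0f; }

// learnopengl's original convention: yaw = 0 looks down the positive x-axis.
Vec3 dirOriginal(float yaw, float pitch) {
    return { std::cos(rad(yaw)) * std::cos(rad(pitch)),
             std::sin(rad(pitch)),
             std::sin(rad(yaw)) * std::cos(rad(pitch)) };
}

// Modified convention: yaw = 0 looks down the negative z-axis.
Vec3 dirNegZ(float yaw, float pitch) {
    return { std::sin(rad(yaw)) * std::cos(rad(pitch)),
             std::sin(rad(pitch)),
             -std::cos(rad(yaw)) * std::cos(rad(pitch)) };
}
```

The original formula with yaw = -90 and the modified formula with yaw = 0 both produce a direction of (0, 0, -1), which is why the tutorial's default yaw of -90 and the rewritten trig are interchangeable.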

Rotating 2D camera to space ship's heading in OpenGL (OpenTK)

The game is a top-down 2D space ship game -- think of "Asteroids."
Box2Dx is the physics engine and I extended the included DebugDraw, based on OpenTK, to draw additional game objects. Moving the camera so it's always centered on the player's ship and zooming in and out work perfectly. However, I really need the camera to rotate along with the ship so it's always facing in the same direction. That is, the ship will appear to be frozen in the center of the screen and the rest of the game world rotates around it as it turns.
I've tried adapting code samples, but nothing works. The best I've been able to achieve is a skewed and cut-off rendering.
Render loop:
// Clear.
Gl.glClear(Gl.GL_COLOR_BUFFER_BIT | Gl.GL_DEPTH_BUFFER_BIT);
// other rendering omitted (planets, ships, etc.)
this.OpenGlControl.Draw();
Update view -- centers on ship and should rotate to match its angle. For now, I'm just trying to rotate it by an arbitrary angle for a proof of concept, but no dice:
public void RefreshView()
{
    int width = this.OpenGlControl.Width;
    int height = this.OpenGlControl.Height;
    Gl.glViewport(0, 0, width, height);

    Gl.glMatrixMode(Gl.GL_PROJECTION);
    Gl.glLoadIdentity();

    float ratio = (float)width / (float)height;
    Vec2 extents = new Vec2(ratio * 25.0f, 25.0f);
    extents *= viewZoom;

    // rotate the view
    var shipAngle = 180.0f; // just a test angle for proof of concept
    Gl.glRotatef(shipAngle, 0, 0, 0);

    Vec2 lower = this.viewCenter - extents;
    Vec2 upper = this.viewCenter + extents;

    // L/R/B/T
    Glu.gluOrtho2D(lower.X, upper.X, lower.Y, upper.Y);

    Gl.glMatrixMode(Gl.GL_MODELVIEW);
}
Now, I'm obviously doing this wrong. Angles of 0 and 180 will keep it right-side-up or flip it, but any other angle actually zooms it in/out or results in only blackness, nothing rendered. Below are examples:
If ship angle is 0.0f, then game world is as expected:
Degree of 180.0f flips it vertically... seems promising:
Degree of 45 zooms out and doesn't rotate at all... that's odd:
Degree of 90 returns all black. In case you've never seen black:
Please help!
Firstly, arguments 2-4 of glRotatef are the rotation axis, so please state them correctly, as noted by @pingul.
More importantly the rotation is applied to the projection matrix.
// L/R/B/T
Glu.gluOrtho2D(lower.X, upper.X, lower.Y, upper.Y);
In this line, your orthographic 2D projection matrix is multiplied with the previous rotation and applied to the projection matrix stack, which I believe is not what you want.
The solution would be to move your rotation call to a place after the model-view matrix mode is selected, as below:
// L/R/B/T
Glu.gluOrtho2D(lower.X, upper.X, lower.Y, upper.Y);
Gl.glMatrixMode(Gl.GL_MODELVIEW);
// rotate the view
var shipAngle = 180.0f; // just a test angle for proof of concept
Gl.glRotatef(shipAngle, 0.0f, 0.0f, 1.0f);
And now your rotations will be applied to the model-view matrix stack. (I believe this is the effect you want.) Keep in mind that glRotatef() creates a rotation matrix and multiplies it with the matrix at the top of the currently selected stack.
I would also strongly suggest you move away from the fixed-function pipeline if possible, as suggested by @BDL.
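To see what glRotatef(angle, 0, 0, 1) effectively does to your geometry, here is the equivalent 2D rotation applied to a point directly; this is an illustration of the math, not actual GL code:

```cpp
#include <cassert>
#include <cmath>

// Rotate the point (x, y) about the z-axis by angleDeg degrees, exactly the
// transform that glRotatef(angleDeg, 0, 0, 1) would multiply onto the top of
// the current matrix stack and apply to subsequently drawn vertices.
void rotateZ(float angleDeg, float& x, float& y) {
    float a = angleDeg * 3.14159265f / 180.0f;
    float nx = std::cos(a) * x - std::sin(a) * y;
    float ny = std::sin(a) * x + std::cos(a) * y;
    x = nx; y = ny;
}
```

With the rotation on the model-view stack, the whole world rotates around the view center this way while the ship, drawn after an inverse rotation or at the center, appears frozen.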

How to correctly represent 3D rotation in games

In most 3D platform games, only rotation around the Y axis is needed since the player is always positioned upright.
However, for a 3D space game where the player needs to be rotated on all axes, what is the best way to represent the rotation?
I first tried using Euler angles:
glRotatef(anglex, 1.0f, 0.0f, 0.0f);
glRotatef(angley, 0.0f, 1.0f, 0.0f);
glRotatef(anglez, 0.0f, 0.0f, 1.0f);
The problem I had with this approach is that after each rotation, the axes change. For example, when anglex and angley are 0, anglez rotates the ship around its wings; however, if anglex or angley are non-zero, this is no longer true. I want anglez to always rotate around the wings, irrespective of anglex and angley.
I read that quaternions can be used to achieve this desired behavior, but I was unable to make it work in practice.
I assume my issue is due to the fact that I am basically still using Euler angles, but am converting the rotation to its quaternion representation before usage.
struct quaternion q = eulerToQuaternion(anglex, angley, anglez);
struct matrix m = quaternionToMatrix(q);
glMultMatrix(&m);
However, if storing each X, Y, and Z angle directly is incorrect, how do I say "Rotate the ship around the wings (or any consistent axis) by 1 degree" when my rotation is stored as a quaternion?
Additionally, I want to be able to translate the model at the angle that it is rotated by. Say I have just a quaternion with q.x, q.y, q.z, and q.w, how can I move it?
Quaternions are a very good way to represent rotations because they are efficient, but I prefer to represent the full state "position and orientation" with 4x4 matrices.
So, imagine you have a 4x4 matrix for every object in the scene. Initially, when the object is unrotated and untranslated, this matrix is the identity matrix; this is what I will call the "original state". Suppose, for instance, the nose of your ship points towards -z in its original state, so a rotation matrix that spins the ship about the z axis is:
Matrix4 around_z(float angle) // angle in radians
{
    float c = cos(angle);
    float s = sin(angle);
    return Matrix4(c, -s, 0, 0,
                   s,  c, 0, 0,
                   0,  0, 1, 0,
                   0,  0, 0, 1);
}
Now, if your ship is anywhere in space and rotated in any direction (let's call this state t), and you want to spin the ship around the z axis by some angle as if it were in its "original state", it would be:
t = t * around_z(angle);
And when drawing with OpenGL, t is what you multiply for every vertex of that ship. This assumes you are using column vectors (as OpenGL does), and be aware that matrices in OpenGL are stored columns first.
Basically, your problem seems to be with the order in which you are applying your rotations. See, quaternion and matrix multiplication is non-commutative. So, if instead you write:
t = around_z(angle) * t;
You will have the around_z rotation applied not to the "original state" z, but to the global coordinate z, with the ship already affected by the initial transformation (rotated and translated). The same thing happens when you call the glRotate and glTranslate functions: the order in which they are called matters.
Being a little more specific for your problem: you have the absolute translation trans, and the rotation around its center rot. You would update each object in your scene with something like:
void update(quaternion delta_rot, vector delta_trans) {
    rot = rot * delta_rot;
    trans = trans + rot.apply(delta_trans);
}
Where delta_rot and delta_trans are both expressed in coordinates relative to the original state, so, if you want to propel your ship forward 0.5 units, your delta_trans would be (0, 0, -0.5). To draw, it would be something like:
void draw() {
    // Apply the absolute translation first
    glLoadIdentity();
    glTranslatef(trans.x, trans.y, trans.z);

    // Apply the absolute rotation last
    struct matrix m = quaternionToMatrix(rot);
    glMultMatrix(&m);

    // This sequence is equivalent to:
    // final_vertex_position = translation_matrix * rotation_matrix * vertex;

    // ... draw stuff
}
I chose the order of these calls by reading the manuals for glTranslate and glMultMatrix, to guarantee the order in which the transformations are applied.
About rot.apply()
As explained in the Wikipedia article Quaternions and spatial rotation, to apply a rotation described by a quaternion q to a vector p, you compute rp = q * p * q^(-1), where rp is the rotated vector. If you have a working quaternion library in your game, you should either already have this operation implemented, or you should implement it now, because this is the core of using quaternions as rotations.
For instance, if you have a quaternion that describes a rotation of 90° around (0,0,1) and you apply it to (1,0,0), you will get the vector (0,1,0), i.e. the original vector rotated by the quaternion. This is equivalent to converting your quaternion to a matrix and doing a matrix to column-vector multiplication (by matrix multiplication rules, it yields another column vector, the rotated vector).
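That worked example can be verified with a minimal, hand-rolled quaternion; this is a sketch, not a full library, and the names are mine:

```cpp
#include <cassert>
#include <cmath>

struct Quat { float w, x, y, z; };

// Hamilton product of two quaternions.
Quat mul(Quat a, Quat b) {
    return { a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
             a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
             a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
             a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w };
}

// For a unit quaternion, the inverse is just the conjugate.
Quat conj(Quat q) { return { q.w, -q.x, -q.y, -q.z }; }

// Rotate the vector (vx, vy, vz) by the unit quaternion q: rp = q * p * q^(-1),
// with the vector embedded as the pure quaternion p = (0, v).
void apply(Quat q, float& vx, float& vy, float& vz) {
    Quat p{ 0, vx, vy, vz };
    Quat r = mul(mul(q, p), conj(q));
    vx = r.x; vy = r.y; vz = r.z;
}

// 90 degree rotation about (0, 0, 1): q = (cos(45°), 0, 0, sin(45°)).
Quat rotZ90() {
    float h = 3.14159265f / 4.0f;
    return { std::cos(h), 0, 0, std::sin(h) };
}
```

Applying rotZ90() to (1, 0, 0) yields (0, 1, 0), matching the worked example above.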

Am I computing the attributes of my frustum properly?

I have a basic camera class, of which has the following notable functions:
// Get near and far plane dimensions in view space coordinates.
float GetNearWindowWidth()const;
float GetNearWindowHeight()const;
float GetFarWindowWidth()const;
float GetFarWindowHeight()const;
// Set frustum.
void SetLens(float fovY, float aspect, float zn, float zf);
Where the params zn and zf in the SetLens function correspond to the near and far clip plane distance, respectively.
SetLens basically creates a perspective projection matrix, along with computing both the far and near clip plane's height:
void Camera::SetLens(float fovY, float aspect, float zn, float zf)
{
    // cache properties
    mFovY = fovY;
    mAspect = aspect;
    mNearZ = zn;
    mFarZ = zf;

    float tanHalfFovy = tanf(0.5f * glm::radians(fovY));
    mNearWindowHeight = 2.0f * mNearZ * tanHalfFovy;
    mFarWindowHeight  = 2.0f * mFarZ * tanHalfFovy;

    mProj = glm::perspective(fovY, aspect, zn, zf);
}
So, GetFarWindowHeight() and GetNearWindowHeight() naturally return their respective height class member values. Their width counterparts, however, return the respective height value multiplied by the view aspect ratio. So, for GetNearWindowWidth():
float Camera::GetNearWindowWidth() const
{
    return mAspect * mNearWindowHeight;
}
Where GetFarWindowWidth() performs the same computation, of course replacing mNearWindowHeight with mFarWindowHeight.
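Those formulas are easy to sanity-check in isolation: with a vertical FOV of 90 degrees, the half-height at distance z is z * tan(45°) = z, so the full height is 2 * z and the width is the height times the aspect ratio (function names here are mine, standing in for the class members):

```cpp
#include <cassert>
#include <cmath>

// Full height of the view frustum's cross-section at distance z,
// for a vertical field of view given in degrees.
float planeHeight(float fovYDegrees, float z) {
    return 2.0f * z * std::tan(0.5f * fovYDegrees * 3.14159265f / 180.0f);
}

// Width is height scaled by the aspect ratio (width / height).
float planeWidth(float fovYDegrees, float z, float aspect) {
    return aspect * planeHeight(fovYDegrees, z);
}
```

For fovY = 90° and a near distance of 1, this gives a near-plane height of 2, which matches mNearWindowHeight in SetLens above.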
Now that that's all out of the way, something tells me I'm computing the height and width of the near and far clip planes improperly. In particular, I think the confusion comes from the fact that I specify the field of view on the y axis in degrees and then convert it to radians inside the tangent function. Where I think this causes problems is in my frustum culling function, which uses the width/height of the near and far planes to obtain points for the top, right, left and bottom planes as well.
So, am I correct in that I'm doing this completely wrong? If so, what should I do to fix it?
Disclaimer
This code originally stems from a D3D11 book, which I decided to quit reading and move back to OpenGL. In order to make the process less painful, I figured converting some of the original code to be more OpenGL compliant would be nice. So far, it's worked fairly well, with this one minor issue...
Edit
I should have originally mentioned a few things:
This is not my first time with OpenGL; I'm well aware of the transformation processes, as well as the coordinate system differences between GL and D3D.
This isn't my entire camera class, although the only other thing which I think may be questionable in this context is using my camera's mOrientation matrix to compute the look, up, and right direction vectors, via transforming each on a +x, +y, and -z basis, respectively. So, as an example, to compute my look vector I would do: mOrientation * vec4(0.0f, 0.0f, -1.0f, 1.0f), and then convert that to a vec3. The context that I'm referring to here involves how these basis vectors would be used in conjunction with culling the frustum.