How to get the angle between two vectors in 3D - C++

I have two objects, a sphere and a cone. I want the cone to always face the sphere, as shown in the images.
We have constructed the cone in its local coordinate system in such a way that the tip of the cone points up along the y-axis and the center is at the origin (0,0,0).
The angle between two 3D vectors would be:
float fAngle = std::acos(dot(sphereVector, coneVector) / (magnitude(sphereVector) * magnitude(coneVector)));
For the cone to always face the sphere it needs to be rotated about all three axes based on the position of the sphere, but I am only getting one angle from the maths formula.
How do I calculate all three angles for the cone so that it always faces the sphere?

First, you need the direction in which the cone should point:
direction = center_sphere - center_cone;
Then, we assume that you've constructed your cone in its local coordinate system in such a way that the tip of the cone points up along the y-axis and the center is at the origin (0,0,0).
The axes to rotate about are:
x_axis(1, 0, 0);
y_axis(0, 1, 0);
z_axis(0, 0, 1);
Now, you simply compute the angle between the direction vector and each axis to get the 3 angles.
Example:
float angle(vec a, vec b)
{
    return acos(dot(a, b) / (magnitude(a) * magnitude(b)));
}
vec direction = normalize(center_sphere - center_cone);
float x_rot = angle(x_axis, direction);
float y_rot = angle(y_axis, direction);
float z_rot = angle(z_axis, direction);
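A minimal, self-contained sketch of the same calculation using glm (the positions are made-up example values, and the vec/magnitude helpers above are replaced by glm calls):
#include <glm/glm.hpp>
#include <cmath>

// Angle in radians between two vectors.
float angleBetween(const glm::vec3& a, const glm::vec3& b)
{
    return std::acos(glm::dot(glm::normalize(a), glm::normalize(b)));
}

int main()
{
    glm::vec3 center_sphere(3.0f, 2.0f, 1.0f); // example position
    glm::vec3 center_cone(0.0f, 0.0f, 0.0f);   // cone modelled at the origin

    glm::vec3 direction = glm::normalize(center_sphere - center_cone);

    float x_rot = angleBetween(glm::vec3(1, 0, 0), direction);
    float y_rot = angleBetween(glm::vec3(0, 1, 0), direction);
    float z_rot = angleBetween(glm::vec3(0, 0, 1), direction);
}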

Related

How can I define vertices of a plane that is always parallel to the camera?

I am trying to draw a rectangle (basically a plane) that is always parallel to the camera. I want to restrict the plane to a certain size (let's say height = 2 and width = 2 units). However, I do not understand how to set the positions of the vertices so that the rectangle will always be parallel to the camera.
First I am calculating the camera normal (direction) using:
glm::normalize(mPosition - mTargetPos); // normal
and then I am using point-normal equation to define the plane:
normal = (A, B, C)
point = (a, b, c) // this point will serve as the center of the plane
A(x−a)+B(y−b)+C(z−c) = 0
Question: How can I define vertices of the plane?
Take some normalized vector UpDir for the up direction (it can be UpDir=(0,1,0) or UpDir=(0,0,1) depending on your coordinate system, or it can be computed somehow).
Compute the cross product SideDir of the normal and UpDir.
Now you can use SideDir and UpDir as a basis for your plane's coordinate system, and compute the four vertices of the rectangle as point+width*SideDir+height*UpDir, point+width*SideDir-height*UpDir, point-width*SideDir-height*UpDir, and point-width*SideDir+height*UpDir.
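A minimal sketch of those steps with glm (the camera position/target and the half-extents are example values; UpDir is re-orthogonalized with a second cross product so the rectangle is exactly parallel to the camera):
#include <glm/glm.hpp>

glm::vec3 mPosition(0.0f, 0.0f, 5.0f);  // camera position (example)
glm::vec3 mTargetPos(0.0f, 0.0f, 0.0f); // camera target (example)

glm::vec3 normal = glm::normalize(mPosition - mTargetPos); // camera normal, as in the question
glm::vec3 UpDir(0.0f, 1.0f, 0.0f);                         // arbitrary up, must not be parallel to normal

glm::vec3 SideDir = glm::normalize(glm::cross(UpDir, normal));   // side direction lying in the plane
glm::vec3 PlaneUp = glm::normalize(glm::cross(normal, SideDir)); // up direction lying in the plane

glm::vec3 point = mTargetPos;     // center of the rectangle
float halfW = 1.0f, halfH = 1.0f; // half-extents of a 2 x 2 rectangle

glm::vec3 v0 = point + halfW * SideDir + halfH * PlaneUp;
glm::vec3 v1 = point + halfW * SideDir - halfH * PlaneUp;
glm::vec3 v2 = point - halfW * SideDir - halfH * PlaneUp;
glm::vec3 v3 = point - halfW * SideDir + halfH * PlaneUp;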
I recommend defining the points in view space and finally transforming them by the inverse view matrix.
In view space the points are parallel to the view if they have the same z coordinate. The z-coordinate has to be negative, and its magnitude has to be greater than the distance to the near plane and less than the distance to the far plane:
near < -z < far
Compute the view matrix (view_mat) and define the points in view space:
glm::mat4 view_mat = glm::lookAt(mPosition, mTargetPos, mUp);
float z = -5.0f; // example value; any depth with near < -z < far works
glm::vec3 pt1View(x1, y1, z);
glm::vec3 pt2View(x2, y2, z);
// [...]
Transform the points from view space to world space:
glm::mat4 inverse_view_mat = glm::inverse(view_mat);
glm::vec3 pt1World = glm::vec3(inverse_view_mat * glm::vec4(pt1View, 1.0f));
glm::vec3 pt2World = glm::vec3(inverse_view_mat * glm::vec4(pt2View, 1.0f));
// [...]
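For the 2 x 2 rectangle from the question, the four view-space corners could look like this (a sketch; the depth is just an example value satisfying near < -z < far):
float z = -5.0f;                    // example depth
glm::vec3 pt1View(-1.0f, -1.0f, z); // bottom-left
glm::vec3 pt2View( 1.0f, -1.0f, z); // bottom-right
glm::vec3 pt3View( 1.0f,  1.0f, z); // top-right
glm::vec3 pt4View(-1.0f,  1.0f, z); // top-left
// Each point is then transformed by inverse_view_mat exactly as shown above.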

Arcball Camera: how to obtain right, direction and up

I'm trying to implement a camera that follows a moving object. I've implemented these functions:
void Camera::espheric_yaw(float degrees, glm::vec3 center_point)
{
    float lim_yaw = glm::radians(89.0f);
    float radians = glm::radians(degrees);
    absoluteYaw += radians;
    // ... clamp absoluteYaw
    float radius = 10.0f;
    float camX = cos(absoluteYaw) * cos(absoluteRoll) * radius;
    float camY = sin(absoluteRoll) * radius;
    float camZ = sin(absoluteYaw) * cos(absoluteRoll) * radius;
    eyes.x = camX;
    eyes.y = camY;
    eyes.z = camZ;
    lookAt = center_point;
    view = glm::normalize(lookAt - eyes);
    up = glm::vec3(0, 1, 0);
    right = glm::normalize(glm::cross(view, up));
}
I want to use this function (and the pitch version) for a camera that follows a moving 3D model. Right now, it works when the center_point is (0,1,0). I think I'm getting the position right, but the up vector is clearly not always (0,1,0).
How can I get the up, view and right vectors for the camera? And then, if I update the camera's eye position this way, how will my camera move when the other object (centered at the center_point parameter) moves?
The idea is to update this each time I have mouse input, with centered_value = center of the moving object, then use gluLookAt with the view, eyes and up values of my camera (and lookAt, which will be eyes + view).
Following a moving object is a matter of pointing the camera at that object. This is what a typical lookAt function does. See the maths here and then use glm::lookAt().
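For example, a sketch that rebuilds the view matrix every frame from the object's current position (the positions are placeholders):
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

glm::vec3 eyes(0.0f, 2.0f, 10.0f);        // camera position (example)
glm::vec3 center_point(0.0f, 1.0f, 0.0f); // current position of the followed object (example)
glm::vec3 up(0.0f, 1.0f, 0.0f);

// Recompute every frame so the camera keeps pointing at the moving object.
glm::mat4 view_mat = glm::lookAt(eyes, center_point, up);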
The 'Arcball' technique is for rotating with the mouse. See some maths here.
The idea is to get two vectors (first, second) from positions on screen. For each vector, X and Y are taken from the pixels "travelled" by the mouse and the size of the window, and Z is calculated by the 'trackball' maths. With these two vectors (after normalizing them), their cross product gives the axis of rotation in camera coordinates, and their dot product gives the cosine of the angle. Now you can rotate the camera with glm::rotate().
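A minimal sketch of that mapping (function and variable names are illustrative; it assumes window coordinates with the origin at the top-left):
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
#include <cmath>

// Map a window position to a point on the virtual trackball sphere.
glm::vec3 trackballVector(float px, float py, float width, float height)
{
    float x = (2.0f * px - width) / width;      // [-1, 1], left to right
    float y = (height - 2.0f * py) / height;    // [-1, 1], bottom to top
    float z2 = 1.0f - x * x - y * y;
    float z = z2 > 0.0f ? std::sqrt(z2) : 0.0f; // outside the sphere: clamp to its rim
    return glm::normalize(glm::vec3(x, y, z));
}

// Rotation from the previous mouse position to the current one.
glm::mat4 arcballRotation(glm::vec2 prev, glm::vec2 curr, float width, float height)
{
    glm::vec3 a = trackballVector(prev.x, prev.y, width, height);
    glm::vec3 b = trackballVector(curr.x, curr.y, width, height);
    glm::vec3 axis = glm::cross(a, b);          // rotation axis in camera coordinates
    if (glm::length(axis) < 1e-6f)
        return glm::mat4(1.0f);                 // no movement: identity
    float angle = std::acos(glm::clamp(glm::dot(a, b), -1.0f, 1.0f));
    return glm::rotate(glm::mat4(1.0f), angle, glm::normalize(axis));
}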
If you go another route (e.g. calculating the camera matrix on your own), then the "up" direction of the camera must be updated by yourself. Remember it's perpendicular to the other two axes of the camera.

Compute a RPY (roll pitch yaw) from a 3d point on a sphere

I need a method to find a set of homogeneous transformation matrices that describe the position and orientation of points on a sphere.
The idea is that I have an object in the center of this sphere, which has a radius of dz. Since I know the 3D coordinate of the object, I know all the 3D coordinates of the sphere. Is it possible to determine the RPY of any point on the sphere such that the point always points toward the object in the center?
illustration:
At the origin of this sphere we have an object. The radius of the sphere is dz.
The red dot is a point on the sphere, together with the vector from this point toward the object/origin.
The position should be relatively easy to extract, as a sphere can be described by a function, but how do I determine the vector, or the rotation matrix, such that the point is oriented toward the origin?
You could, using the center of the sphere as the origin, compute the unit vector of the line from the origin to the point on the edge of the sphere, and then multiply that unit vector by -1 to obtain the vector pointing toward the center of the sphere from the point on the edge of the sphere.
Example:
vec pointToCenter(Point edge, Point origin) {
    vec norm = edge - origin;              // vector from the center out to the surface point
    vec unitVec = norm / vecLength(norm);  // normalize it
    return unitVec * -1;                   // flip it so it points back toward the center
}
Once you have the vector, you can convert it to Euler angles for the RPY; an example is here.
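One possible conversion, as a sketch (Euler-angle conventions vary; this one assumes Z is up, yaw is measured about Z, pitch is the elevation above the XY plane, and roll is left at zero because a direction vector alone does not constrain it):
#include <glm/glm.hpp>
#include <cmath>

// Decompose a unit direction into yaw (heading) and pitch (elevation).
void directionToYawPitch(const glm::vec3& dir, float& yaw, float& pitch, float& roll)
{
    yaw   = std::atan2(dir.y, dir.x);
    pitch = std::atan2(dir.z, std::sqrt(dir.x * dir.x + dir.y * dir.y));
    roll  = 0.0f; // undetermined by a direction alone
}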
Off the top of my head, I would suggest using quaternions to define the rotation of any point at the origin, relative to the point you want on the surface of the sphere:
Pick the desired point on the sphere's surface, say the north pole for example.
Translate that point to the origin (assuming the radius of the sphere is known), using 3D Pythagoras: x_comp^2 + y_comp^2 + z_comp^2 = hypotenuse^2
Create a rotation that points an axis at the original surface point. This will just be a scaled multiple of the x, y and z components making up the hypotenuse; I would just make it into unit components. Capture the resulting axis and rotation in a quaternion (q, x, y, z), where x, y, z are the components of your axis and q is the rotation about that axis. Hard-code q to one. You want to use quaternions because they will make your resulting rotation matrices easier to work with.
Translate the point back to the sphere's surface and negate the values of the components of your axis, to get (q, -x, -y, -z).
This will give you your point on the surface of the sphere, with an axis pointing back to the origin. With the north pole as an example, you would have a quaternion of (1, 0, -1, 0) at point (0, radius_length, 0) on the sphere's surface. See quatrotation.c in my github repository below for the resulting rotation matrix.
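If glm is available, one way to build such an orientation directly (a sketch, not the exact construction above: it uses an axis-angle quaternion that rotates a chosen reference axis onto the inward-pointing direction):
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>
#include <cmath>

// Quaternion rotating 'reference' (e.g. the local +Z axis) onto the direction
// from a surface point back toward the sphere's center.
glm::quat orientTowardCenter(const glm::vec3& surfacePoint, const glm::vec3& center,
                             const glm::vec3& reference = glm::vec3(0, 0, 1))
{
    glm::vec3 inward = glm::normalize(center - surfacePoint);
    glm::vec3 axis = glm::cross(reference, inward);
    if (glm::length(axis) < 1e-6f) {
        // reference and inward are (anti-)parallel: pick any axis perpendicular to reference
        axis = std::fabs(reference.x) < 0.9f ? glm::cross(reference, glm::vec3(1, 0, 0))
                                             : glm::cross(reference, glm::vec3(0, 1, 0));
    }
    float angle = std::acos(glm::clamp(glm::dot(reference, inward), -1.0f, 1.0f));
    return glm::angleAxis(angle, glm::normalize(axis));
}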
I don't have time to write code for this but I wrote a little tutorial with compilable code examples in a github repository a while back, which should get you started:
https://github.com/brownwa/opengl
Do the mat_rotation tutorial first, then do the quaternions one. It's doable in a weekend, or a day if you're focused.

Orbiting object around orbiting object

How do I orbit the green circle around the orange one, and the blue one around the green?
I found many solutions that work fine for rotating around a static point (in this case the orange circle), but I didn't find any good maths equation that would work for both static and moving points.
angle += sunRot;
if(angle > 360.0f)
{
    angle = 0.0f;
}
float radian = glm::radians(angle);
float radius = glm::distance(position, rotCenter);
float x = rotCenter.x + (radius * cosf(radian));
float z = rotCenter.z + (radius * sinf(radian));
glm::vec3 newPos = glm::vec3(x, 0, z);
setPosition(newPos);
Here is what I'm trying to achieve (thanks to @George Profenza for sharing the link).
Base all your calculations on the radius and angle of the current object where possible and store the radius and angle with the object.
In particular, do not calculate the radius based on the x/y coordinates in every iteration: If the base object has moved between steps, your calculated radius will be slightly off and the error will accumulate.
You should be able to nest coordinate spaces in OpenGL using glPushMatrix()/glPopMatrix() calls. Here's a basic example (press the mouse to see the coordinate spaces).
The syntax isn't C++, but it's easy to see what I mean.
You can do this multiple ways:
polar coordinate formula
manually multiplying transformation matrices
simply using push/pop matrix calls (along with translate/rotate where needed), which does the matrix multiplication for you behind the scenes.
Just in case you want to try the polar coordinate formula:
x = cos(angle) * radius
y = sin(angle) * radius
Where angle is the current rotation of a circle and radius is its distance from the centre of rotation.
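A minimal sketch of that formula applied to nested orbits (names and numbers are illustrative; each body stores its own angle and radius, and its centre of rotation is simply its parent's current position):
#include <glm/glm.hpp>
#include <cmath>

struct Orbiter {
    float angle  = 0.0f;   // current rotation, radians
    float speed  = 0.01f;  // radians per update
    float radius = 5.0f;   // distance from the parent body
    glm::vec3 position{0.0f};

    void update(const glm::vec3& parentPos) {
        angle += speed;
        position.x = parentPos.x + radius * std::cos(angle);
        position.z = parentPos.z + radius * std::sin(angle);
        position.y = parentPos.y;
    }
};

// Per frame: green orbits the (possibly moving) orange, blue orbits green.
//   green.update(orangePos);
//   blue.update(green.position);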

Transform cube onto surface of sphere in OpenGL

I'm currently working on a game which renders a textured sphere (representing Earth) and cubes representing player models (which will be implemented later).
When a user clicks a point on the sphere, the cube is translated from the origin (0,0,0) (which is also the center of the sphere) to the point on the surface of the sphere.
The problem is that I want the cube to rotate so as to sit with its base flat on the sphere's surface (as opposed to just translating the cube).
What is the best way to calculate the rotation matrices about each axis in order to achieve this effect?
This is the same calculation as you'd perform to make a "lookat" matrix.
In this form, you would use the normalised point on the sphere as one axis (often used as the 'Z' axis), and then construct the other two as vectors perpendicular to it. Typically to do that you choose some arbitrary 'up' axis, which must not be parallel to your first axis, and then use two cross-products. First you cross 'Z' and 'Up' to make an 'X' axis, and then you cross the 'X' and 'Z' axes to make a 'Y' axis.
The X, Y, and Z axes (normalised) form a rotation matrix which will orient the cube to the surface normal of the sphere. Then just translate it to the surface point.
The basic idea in GL is this:
float up[3] = { 0.0f, 1.0f, 0.0f }; // arbitrary 'up', must not be parallel to z_axis
float x_axis[3];
float y_axis[3];
float z_axis[3];                    // this is the point on the sphere, normalised

cross(x_axis, z_axis, up);          // assumed helper: writes cross(z_axis, up) into x_axis
normalise(x_axis);
cross(y_axis, z_axis, x_axis);      // y completes the orthonormal basis

DrawSphere();

// Column-major 4x4: the three axes form the rotation, the last column
// translates the cube out along the surface normal.
float mat[16] = {
    x_axis[0], x_axis[1], x_axis[2], 0,
    y_axis[0], y_axis[1], y_axis[2], 0,
    z_axis[0], z_axis[1], z_axis[2], 0,
    (sphereRad + cubeSize) * z_axis[0], (sphereRad + cubeSize) * z_axis[1], (sphereRad + cubeSize) * z_axis[2], 1 };

glPushMatrix();
glMultMatrixf(mat);
DrawCube();
glPopMatrix();
Where z_axis[] is the normalised point on the sphere, x_axis[] is the normalised cross-product of that vector with the arbitrary 'up' vector, and y_axis[] is the normalised cross-product of the other two axes. sphereRad and cubeSize are the sizes of the sphere and cube - I'm assuming both shapes are centred on their local coordinate origin.
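If you're on a shader-based pipeline rather than fixed-function GL, the same matrix can be built with glm and used as the cube's model matrix (a sketch under the same assumptions about sphereRad and cubeSize; p is the clicked point on the sphere, in the sphere's coordinate frame):
#include <glm/glm.hpp>

glm::mat4 cubeModelMatrix(const glm::vec3& p, float sphereRad, float cubeSize)
{
    glm::vec3 z_axis = glm::normalize(p);                    // surface normal at the clicked point
    glm::vec3 up(0.0f, 1.0f, 0.0f);                          // arbitrary, must not be parallel to z_axis
    glm::vec3 x_axis = glm::normalize(glm::cross(z_axis, up));
    glm::vec3 y_axis = glm::cross(z_axis, x_axis);

    glm::mat4 m(1.0f);
    m[0] = glm::vec4(x_axis, 0.0f);                          // rotation basis as columns
    m[1] = glm::vec4(y_axis, 0.0f);
    m[2] = glm::vec4(z_axis, 0.0f);
    m[3] = glm::vec4((sphereRad + cubeSize) * z_axis, 1.0f); // translate onto the surface
    return m;
}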