Rotate geometry to align to a direction vector - C++

I've been trying to get my generated geometry to align with a direction vector. To illustrate my current problem:
A = Correctly aligned geometry (just a triangle for testing)
B = Incorrectly aligned geometry
My current solution in code for this triangle example (this code is run for all the nodes you see on screen, starting at the split; I am using the GLM math library):
glm::vec3 v1(-0.25f, 0.0f, -0.25f);
glm::vec3 v2( 0.25f, 0.0f, -0.25f);
glm::vec3 v3( 0.00f, 0.0f,  0.25f);
glm::mat4x4 translate = glm::translate(glm::mat4x4(1.0f), sp.position);
glm::mat4x4 rotate = glm::lookAt(glm::vec3(0.0f, 0.0f, 0.0f), sp.direction, glm::vec3(0.0f, 1.0f, 0.0f));
v1 = glm::vec4(translate * rotate * glm::vec4(v1, 1.0f)).swizzle(glm::comp::X, glm::comp::Y, glm::comp::Z);
v2 = glm::vec4(translate * rotate * glm::vec4(v2, 1.0f)).swizzle(glm::comp::X, glm::comp::Y, glm::comp::Z);
v3 = glm::vec4(translate * rotate * glm::vec4(v3, 1.0f)).swizzle(glm::comp::X, glm::comp::Y, glm::comp::Z);
The direction vector values for point A: x = 0.000000000, y = 0.788205445, z = 0.615412235
The direction vector values for point B: x = 0.0543831661, y = 0.788205445, z = -0.613004684
Edit 1 (24/11/2013 @ 20:36):
A and B are not related; both are generated separately. When generating A or B, only a position and a direction are known.
I've been looking at solutions posted here:
Quaternions, rotate a model and align with a direction
Direct3D Rotation Matrix from Vector and vice-versa
Direction Vector To Rotation Matrix
But I haven't been able to successfully rotate my geometry to align with my direction vector. I feel like I'm doing something rather basic wrong.
Any help would be greatly appreciated!

If A and B are unit vectors and you want a rotation matrix R that transforms B so that it aligns with A, then start by computing C = B x A (the cross-product of B and A). C is the axis of rotation, and arcsin(|C|) is the necessary rotation angle.
From these you can build the required rotation matrix. It looks like glm has support for this, so I won't explain further.
NB: if you are doing many, many of these in performance-critical code, you can gain a bit of speed by noting that |C| = sin(theta) and sqrt(1 - |C|^2) = cos(theta), and by computing the matrix yourself with these known values of sin(theta) and cos(theta); for this see, for example, this discussion. The glm routine will take your angle arcsin(|C|) and proceed immediately to compute its sin and cos, a small waste since you already knew these values and the operations are relatively expensive.
If the rotation is about some point p other than the origin, then let T be a translation that takes p to the origin, and find X = T^-1 R T. This X will be the transformation you want.
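For concreteness, here is a minimal sketch of that construction using GLM (glm::rotate from gtc/matrix_transform, angle in radians). It uses atan2(|C|, dot(B, A)) instead of asin(|C|) so that angles beyond 90 degrees are handled, and it falls back to an arbitrary perpendicular axis when B and A are parallel or anti-parallel:
#include <cmath>
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
#include <glm/gtc/constants.hpp>

// Returns a rotation matrix R such that R * B == A (B and A assumed unit length).
glm::mat4 rotationBetween(const glm::vec3& B, const glm::vec3& A)
{
    glm::vec3 C = glm::cross(B, A);   // rotation axis, |C| = sin(theta)
    float s = glm::length(C);
    float c = glm::dot(B, A);         // cos(theta)
    if (s < 1e-6f)                    // B and A are parallel or anti-parallel
    {
        if (c > 0.0f)
            return glm::mat4(1.0f);   // already aligned
        // 180 degrees: any axis perpendicular to B will do
        glm::vec3 axis = std::fabs(B.x) < 0.9f ? glm::vec3(1.0f, 0.0f, 0.0f)
                                               : glm::vec3(0.0f, 1.0f, 0.0f);
        axis = glm::normalize(glm::cross(B, axis));
        return glm::rotate(glm::mat4(1.0f), glm::pi<float>(), axis);
    }
    float angle = std::atan2(s, c);   // robust even when theta > 90 degrees
    return glm::rotate(glm::mat4(1.0f), angle, C / s);
}
If the rotation is about a point p other than the origin, wrap the result as described above: translate p to the origin, apply R, then translate back.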

Related

Get an angle between 2 points and rotate a point about another point with this angle - C++

I'm basically trying to create 2D lines based on points from Bezier curves.
All the points of the Bezier curves are well placed and everything seems in order.
Starting with these points, I'm creating 2 other points on the z axis which will be the borders of the line:
glm::vec3 p1 = pos[i];
p1.z = p1.z + (size / 2);
glm::vec3 p2 = pos[i];
p2.z = p2.z - (size / 2);
Then I change these points' positions by rotating them around the main point (pm is the mobile point rotating around the fixed point pf):
glm::vec3 rotP = glm::vec3(0.0f, 0.5f, 0.0f);
float co = cos(angle);
float si = sin(angle);
// CLOCKWISE
rotP.x = (pf.x - pm.x) * co + (pf.z - pm.z) * si + pm.x;
rotP.z = -(pf.x - pm.x) * si + (pf.z - pm.z) * co + pm.z;
angle is the angle between the backward and forward points on the Bezier curve (depForward is the x axis, glm::vec3(1.0f, 0.0f, 0.0f)):
glm::vec3 normForwardUnit = normalize(p2 - p1);
float angle = (acos(dot(depForward, normForwardUnit)));
The problem I get is that the rotations are wrong. Some of my lines are correct, but it seems to depend on the orientation of the lines (see the attached screenshots: an incorrect example and a correct example). I think the problem comes from the way I compute the rotation, but I'm still unable to understand it.
I tried to normalize the angle to different ranges:
//0 to 2PI
if (angle < 0) { angle += 2 * PI; }
//-PI to PI
if (angle > PI) { angle -= 2 * PI; }
else if (angle <= -PI) { angle += 2 * PI; }
Other ways to calculate the angle:
float angle = atan2(p2.z - p1.z, p2.x - p1.x);
To rotate the points counter-clockwise:
//COUNTER CLOCKWISE
rotP.x = (pf.x - pm.x) * co - (pf.z - pm.z) * si + pm.x;
rotP.z = (pf.x - pm.x) * si + (pf.z - pm.z) * co + pm.z;
In case anyone needs it, here's an implementation of paddy's approach. You could use the point between backP and nextP instead of midPoint to place your new points (backP and nextP being the points just before and just after the current point on the Bezier curve):
// VEC FORWARD VECTOR
glm::vec3 forwardVec = normalize(backP - nextP);
//PERPENDICULAR VEC
glm::vec3 perpVec = cross(forwardVec, glm::vec3(0.0f, 1.0f, 0.0f));
perpVec = normalize(perpVec);
//MID POINT
glm::vec3 midP = midPoint(backP, nextP);
// GEN POINTS
glm::vec3 p1 = midP + (width * perpVec);
glm::vec3 p2 = midP - (width * perpVec);
I think you should definitely look at this:
Is it possible to express "t" variable from Cubic Bezier Curve equation?
In case you insist on your way, you do not need to use any angles or rotations...
You have a line p0,p1 which is sampled from your polynomial curve, so:
tangent = p1-p0
However, it's better to have a more accurate approximation of the tangent, so either take it from the 1st derivative of your curve or use 2 consecutive lines (p0,p1), (p1,p2); the tangent at point p1 is then:
tangent = p2-p1
For more info see:
How do i verify the gradient at midpoint coordinate which i calculate by using cubic bezire curve equation
Now take the binormal (the z axis of your camera, which can be extracted from the camera matrix) and use the cross product to get the normal:
normal = normalize(cross(tangent,binormal))
Now you just displace p1 by the normal:
p1' = p1 + 0.5*curve_thickness*normal
p1'' = p1 - 0.5*curve_thickness*normal
Do the same for all points of your curve; after that you just render quads using the p' and p'' points.
However, with this approach you might run into problems that need further tweaking; see:
draw outline for some connected lines
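For what it's worth, here is a minimal sketch of this displacement approach in C++ with GLM; the point container, binormal, and thickness parameters are assumptions for illustration, not part of the original code:
#include <vector>
#include <glm/glm.hpp>

// Displace each curve sample along the curve normal to get the two borders of a
// thick line. 'points' are samples of the curve, 'binormal' is e.g. the camera
// z axis, 'thickness' is the desired curve thickness.
void buildBorders(const std::vector<glm::vec3>& points,
                  const glm::vec3& binormal, float thickness,
                  std::vector<glm::vec3>& left, std::vector<glm::vec3>& right)
{
    for (std::size_t i = 0; i < points.size(); ++i)
    {
        // Tangent from the neighbouring samples (one-sided at the endpoints).
        std::size_t i0 = (i == 0) ? 0 : i - 1;
        std::size_t i1 = (i + 1 == points.size()) ? i : i + 1;
        glm::vec3 tangent = glm::normalize(points[i1] - points[i0]);
        glm::vec3 normal  = glm::normalize(glm::cross(tangent, binormal));

        left.push_back (points[i] + 0.5f * thickness * normal);
        right.push_back(points[i] - 0.5f * thickness * normal);
    }
}
The left and right arrays can then be stitched into quads (or a triangle strip) to render the thick line.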

Why is there a difference of 90 degrees between rotation and direction?

Firstly, this problem is not unique to the project I am currently working on and has happened several times before. Here is the problem.
I have a Triangle struct. olc::vf2d is a vector class with x and y components.
struct Triangle
{
olc::vf2d p1 = { 0.0f, -10.0f };
olc::vf2d p2 = { -5.0f, 5.0f };
olc::vf2d p3 = { 5.0f, 5.0f };
};
I create a triangle along with position and angle for it.
Triangle triangle;
olc::vf2d position = { 0.0f, 0.0f };
float angle = 0.0f;
Now, I rotate (and offset) the triangle as so:
float x1 = triangle.p1.x * cosf(angle) - triangle.p1.y * sinf(-angle) + position.x;
float y1 = triangle.p1.x * sinf(-angle) + triangle.p1.y * cosf(angle) + position.y;
float x2 = triangle.p2.x * cosf(angle) - triangle.p2.y * sinf(-angle) + position.x;
float y2 = triangle.p2.x * sinf(-angle) + triangle.p2.y * cosf(angle) + position.y;
float x3 = triangle.p3.x * cosf(angle) - triangle.p3.y * sinf(-angle) + position.x;
float y3 = triangle.p3.x * sinf(-angle) + triangle.p3.y * cosf(angle) + position.y;
When I increase the angle every frame and draw it, it rotates and works as expected. But now here is the problem. When I try to calculate the direction for it to move towards, like this:
position.x -= cosf(angle) * elapsedTime;
position.y -= sinf(-angle) * elapsedTime;
It moves, but it looks 90 degrees off from the rotation. For example, it is facing directly up but moving to the right.
Up until this point, I have always solved this problem by using different angle values, i.e. subtracting 3.14159f / 2.0f radians from the angle used in the direction calculation:
position.x -= cosf(angle - (3.14159f / 2.0f));
position.y -= sinf(-angle - (3.14159f / 2.0f));
or vice-versa and this fixes the problem (now it moves in the direction it is facing).
But now I want to know exactly why this happens and what a proper way to solve it is. Many thanks.
There are some missing items needed to diagnose this. You have to have some kind of coordinate system: is it right-handed or left-handed? You determine this by taking your X/Y origin and visualizing your hand over it with your thumb pointing towards you. When the X-axis rotates counter-clockwise (the way the fingers of your right hand curl when held like that), does the positive X-axis move towards the positive Y-axis (right-handed system) or towards the negative Y-axis (left-handed system)?
As an example, most algebra graphing is done with a right-handed system, but on the raw pixels of a monitor positive Y is down instead of up as typically seen in algebra.
Direction of motion should be independent of rotation angle -- unless you really want them coupled, which is not typically the case.
Using two different variables for direction-of-motion and angle-of-rotation will allow you to visually and mentally decouple the two and see better what is happening.
Now, typically (think algebra) angles for rotation are measured starting from "east": pointing to the right is zero degrees, "north" (pointing up) is 90 degrees, and so on.
If you want to move "straight up" you are not moving in the zero-degree direction, but rather in the 90-degree direction. So if your rotation is zero degrees but movement is desired to be "straight up" like the triangle points, then you should be using a 90-degree offset for your movement versus your rotation.
If you decouple rotation and motion, this behavior is much easier to understand and observe.
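To make that decoupling concrete, here is a minimal sketch (not the asker's engine code; Vec2, modelForward, and speed are illustrative). The model-space forward vector of the triangle is rotated with the same rotation used for the vertices, so facing and movement cannot drift apart by a constant offset:
#include <cmath>

struct Vec2 { float x, y; };

// A plain counter-clockwise 2D rotation; use whatever rotation you already
// apply to the triangle's vertices here, so facing and movement stay in sync.
Vec2 rotate(Vec2 v, float angle)
{
    float c = std::cos(angle), s = std::sin(angle);
    return { v.x * c - v.y * s, v.x * s + v.y * c };
}

// The untransformed triangle in the question points "up" (towards -y in screen
// coordinates), so its model-space forward is (0, -1), not (1, 0).
const Vec2 modelForward = { 0.0f, -1.0f };

void update(Vec2& position, float angle, float speed, float elapsedTime)
{
    Vec2 heading = rotate(modelForward, angle); // world-space facing direction
    position.x += heading.x * speed * elapsedTime;
    position.y += heading.y * speed * elapsedTime;
}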

Determine angle between camera position and world point in 3D engine

I want to determine the horizontal and vertical angle, from a camera's position to a world point, in respect to the camera's forward axis.
My linear algebra is a bit rusty, but given the camera's forward, up, and right vector, for example:
camForward = [0 0 1];
camUp = [0 1 0];
camRight = [1 0 0];
And the camera position and world point, for example:
camPosition = [1 2 3];
worldPoint = [5 6 4];
The sought-after angles should be determinable by first taking the difference of the positions:
delta = worldPoint-camPosition;
Then projecting it on the camera axes using the dot products:
deltaHorizontal = dot(delta,camRight);
deltaVertical = dot(delta,camUp);
deltaDepth = dot(delta,camForward);
And finally computing angles as:
angleHorizontal = atan(deltaHorizontal/deltaDepth);
angleVertical = atan(deltaVertical/deltaDepth);
In the example case, this yields that both angles become ~76°, which seems reasonable; varying the positions and axes also seem to give reasonable results.
Thus, if I am not getting the angles I expect, it should be because I am using incorrect positions and/or incorrect camera axes. It is worth noting that the 3D engine is using OpenGL and GLM.
I am fairly certain that the positions are correct, as moving around in the scene and inspecting the positions in relation to known reference points gives consistent and correct results, leading me to believe that I am using the wrong camera axes. To get the angles I am using (the equivalent of):
glm::vec3 worldPoint = glm::unProject( glm::vec3(windowX, windowY, windowZ), viewMatrix, projectionMatrix, glm::vec4(0,0,windowWidth,windowHeight));
glm::vec3 delta = glm::vec3(worldPoint.x, worldPoint.y, worldPoint.z);
float horizontalDistance = glm::dot(delta, cameraData->right);
float verticalDistance = glm::dot(delta, cameraData->up);
float depthDistance = glm::dot(delta, cameraData->forward);
float horizontalAngle = glm::atan(horizontalDistance/depthDistance);
float verticalAngle = glm::atan(verticalDistance/depthDistance);
Each frame, forward, up, and right are read from a view matrix, viewMatrix, which in turn is produced by converting a quaternion, Q, which holds the camera rotation controlled by the mouse:
void updateView(CameraData * cameraData, MouseData * mouseData, MouseParameters * mouseParameters){
    float deltaX = mouseData->currentX - mouseData->lastX;
    float deltaY = mouseData->currentY - mouseData->lastY;
    mouseData->lastX = mouseData->currentX;
    mouseData->lastY = mouseData->currentY;

    float pitch = mouseParameters->sensitivityY * deltaY;
    float yaw   = mouseParameters->sensitivityX * deltaX;

    glm::quat pitch_Q = glm::quat(glm::vec3(pitch, 0.0f, 0.0f));
    glm::quat yaw_Q   = glm::quat(glm::vec3(0.0f, yaw, 0.0f));

    cameraData->Q = pitch_Q * cameraData->Q * yaw_Q;
    cameraData->Q = glm::normalize(cameraData->Q);

    glm::mat4 rotation    = glm::toMat4(cameraData->Q);
    glm::mat4 translation = glm::mat4(1.0f);
    translation = glm::translate(translation, -(cameraData->position));

    cameraData->viewMatrix = rotation * translation;

    cameraData->forward = (cameraData->viewMatrix)[2];
    cameraData->up      = (cameraData->viewMatrix)[1];
    cameraData->right   = (cameraData->viewMatrix)[0];
}
However, something goes wrong, and the correct angles are seemingly only produced while looking along, or perpendicular to, the world z-axis ([0 0 1]). Where am I mistaken?

Extracting three euler angles from top and bottom two 3D points/one vector of unsymmetrical object

I know two 3D points on a line (the top and bottom of an unsymmetrical object), and would like to find the Euler angles (rotation about the x, y and z axes).
Example: I need to reverse-engineer the following OpenGL code; below is just an example to show the scenario.
//Translation
glTranslated(p1.x, p1.y, p1.z);
//Rotation
glRotatef(rot.x, 1.0f, 0.0f, 0.0f);
glRotatef(rot.y, 0.0f, 1.0f, 0.0f);
glRotatef(rot.z, 0.0f, 0.0f, 1.0f);
// Draw the object ALONG Y-AXIS
p2 = DrawMyObject(); //p2 is top of my object
Now in some situations I have only p1 and p2, and I need to know the Euler angles (rotation about the x, y and z axes). How?
This is what I tried; the answer should be (Rx, Ry, Rz): (4, -3, -11):
cv::Point3d p1, p2;
p1.x = 0.0525498;
p1.y = 0.0798909;
p1.z = -1.20806;
p2.x = 0.0586557;
p2.y = 0.111226;
p2.z = -1.20587;
double dx, dy, dz;
double angle;
dx = p2.x - p1.x;
dy = p2.y - p1.y;
dz = p2.z - p1.z;
angle = std::atan2(dy, dz); angle = RAD2DEG(angle);
std::cout<<"\n atan2(dy, dz): "<<int(90 - angle);
angle = std::atan2(dx, dz); angle = RAD2DEG(angle);
std::cout<<"\n atan2(dz, dx): "<<angle;
angle = std::atan2(dy, dx); angle = RAD2DEG(angle);
std::cout<<"\n atan2(dy, dx): "<<int(angle -90);
std::cout<<std::endl;
I am not getting exactly the correct answer; in particular, the rotation about Y is not correct at all. I think the problem is that p1 and p2 both lie along the y-axis, so they do not change when rotating about the y-axis. What is the best possible solution?
As stated in the comments, you will need either a 3rd point on your object or a constant world-space vector. Be aware that using a constant vector could introduce gimbal problems depending on your specific application and the orientation of the line relative to that vector, so a third point might be preferable if you have it.
Construct an orthonormalized 3x3 rotation matrix:
Use the Gram-Schmidt method to orthonormalize the first two rows, where:
u1 = p2 - p1
u2 = p3 - p1 (or a constant vector)
After applying Gram-Schmidt, these vectors will become the first 2 rows of your 3x3 matrix.
The third row of your matrix is just the cross-product of those first two rows.
Decompose the resulting matrix into Euler angles.
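A minimal sketch of this recipe, assuming GLM (glm::extractEulerAngleXYZ needs a reasonably recent version) and a third point p3; the axis assignment and the XYZ order are illustrative and must match the glRotatef order used when drawing:
#define GLM_ENABLE_EXPERIMENTAL   // required for gtx headers in recent GLM
#include <glm/glm.hpp>
#include <glm/gtx/euler_angles.hpp>

// p1->p2 is taken as the object's model-space y axis in world space; p3 is a
// third, non-collinear point assumed to lie roughly along its model-space x axis.
glm::vec3 eulerFromPoints(const glm::vec3& p1, const glm::vec3& p2, const glm::vec3& p3)
{
    // Gram-Schmidt: orthonormalize the two measured directions.
    glm::vec3 yAxis = glm::normalize(p2 - p1);
    glm::vec3 xAxis = p3 - p1;
    xAxis = glm::normalize(xAxis - glm::dot(xAxis, yAxis) * yAxis);

    // The cross product completes the right-handed basis.
    glm::vec3 zAxis = glm::cross(xAxis, yAxis);

    // Columns are the world-space images of the model axes.
    glm::mat3 R(xAxis, yAxis, zAxis);

    // Decompose into Euler angles (the order must match the glRotatef calls).
    float rx, ry, rz;
    glm::extractEulerAngleXYZ(glm::mat4(R), rx, ry, rz);
    return glm::degrees(glm::vec3(rx, ry, rz));
}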

Convert Quaternion rotation to rotation matrix?

Basically, given a quaternion (qx, qy, qz, qw)... how can I convert that to an OpenGL rotation matrix? I'm also interested in which matrix row is "Up", "Right", "Forward", etc. I have a camera rotation as a quaternion that I need as vectors...
The following code is based on a quaternion (qw, qx, qy, qz), where the order is based on the Boost quaternions:
boost::math::quaternion<float> quaternion;
float qw = quaternion.R_component_1();
float qx = quaternion.R_component_2();
float qy = quaternion.R_component_3();
float qz = quaternion.R_component_4();
First you have to normalize the quaternion:
const float n = 1.0f/sqrt(qx*qx+qy*qy+qz*qz+qw*qw);
qx *= n;
qy *= n;
qz *= n;
qw *= n;
Then you can create your matrix:
Matrix<float, 4>(
1.0f - 2.0f*qy*qy - 2.0f*qz*qz, 2.0f*qx*qy - 2.0f*qz*qw, 2.0f*qx*qz + 2.0f*qy*qw, 0.0f,
2.0f*qx*qy + 2.0f*qz*qw, 1.0f - 2.0f*qx*qx - 2.0f*qz*qz, 2.0f*qy*qz - 2.0f*qx*qw, 0.0f,
2.0f*qx*qz - 2.0f*qy*qw, 2.0f*qy*qz + 2.0f*qx*qw, 1.0f - 2.0f*qx*qx - 2.0f*qy*qy, 0.0f,
0.0f, 0.0f, 0.0f, 1.0f);
Depending on your matrix class, you might have to transpose it before passing it to OpenGL.
One way to do it, which is pretty easy to visualize, is to apply the rotation specified by your quaternion to the basis vectors (1,0,0), (0,1,0), and (0,0,1). The rotated values give the basis vectors of the rotated system relative to the original system. Use these vectors to form the rows of the rotation matrix. The resulting matrix, and its transpose, represent the forward and inverse transformations between the original system and the rotated system.
I'm not familiar with the conventions used by OpenGL, so maybe someone else can answer that part of your question...
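A minimal sketch of that idea using GLM (the answer above is library-agnostic; whether the results become rows or columns depends on your matrix convention):
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// Rotate the standard basis by q; the results are the rotated system's axes
// expressed in the original system.
void basisFromQuaternion(const glm::quat& q,
                         glm::vec3& right, glm::vec3& up, glm::vec3& forward)
{
    right   = q * glm::vec3(1.0f, 0.0f, 0.0f);  // rotated x axis
    up      = q * glm::vec3(0.0f, 1.0f, 0.0f);  // rotated y axis
    forward = q * glm::vec3(0.0f, 0.0f, 1.0f);  // rotated z axis (a typical OpenGL camera looks down -z)
}
Placed as the columns of a 3x3 matrix, these vectors give the same rotation as the component formula in the first answer; the transpose is the inverse rotation.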
You might not have to deal with a rotation matrix at all. Here is a way that appears to be faster than converting to a matrix and multiplying a vector with it:
// make v relative to the camera position co (before or after the rotation, depending on the goal)
v -= co;
// rotate vector v by quaternion q (q.xyz is the quaternion's vector part); see info [1]
vec3 t = 2 * cross(q.xyz, v);
v = v + q.w * t + cross(q.xyz, t);
[1] http://mollyrocket.com/forums/viewtopic.php?t=833&sid=3a84e00a70ccb046cfc87ac39881a3d0
Using GLM, you can simply use a cast.
So to convert from a quaternion to a matrix4, simply write
glm::mat4_cast(quaternion_name)
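For completeness, a small usage sketch of both directions of the conversion, assuming GLM's gtc/quaternion header:
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// Note that glm::quat's constructor takes w first.
glm::quat q(1.0f /*w*/, 0.0f /*x*/, 0.0f /*y*/, 0.0f /*z*/);   // identity rotation
glm::mat4 rotation = glm::mat4_cast(q);        // quaternion -> 4x4 rotation matrix
glm::quat back     = glm::quat_cast(rotation); // 4x4 (or 3x3) matrix -> quaternion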