Why is there a difference of 90 degrees between rotation and direction? - C++

Firstly, this problem is not unique to the project I am currently working on; it has happened several times before. Here is the problem.
I have a Triangle struct. olc::vf2d is a vector class with x and y components.
struct Triangle
{
    olc::vf2d p1 = { 0.0f, -10.0f };
    olc::vf2d p2 = { -5.0f, 5.0f };
    olc::vf2d p3 = { 5.0f, 5.0f };
};
I create a triangle along with a position and an angle for it.
Triangle triangle;
olc::vf2d position = { 0.0f, 0.0f };
float angle = 0.0f;
Now, I rotate (and offset) the triangle like so:
float x1 = triangle.p1.x * cosf(angle) - triangle.p1.y * sinf(-angle) + position.x;
float y1 = triangle.p1.x * sinf(-angle) + triangle.p1.y * cosf(angle) + position.y;
float x2 = triangle.p2.x * cosf(angle) - triangle.p2.y * sinf(-angle) + position.x;
float y2 = triangle.p2.x * sinf(-angle) + triangle.p2.y * cosf(angle) + position.y;
float x3 = triangle.p3.x * cosf(angle) - triangle.p3.y * sinf(-angle) + position.x;
float y3 = triangle.p3.x * sinf(-angle) + triangle.p3.y * cosf(angle) + position.y;
When I increase the angle every frame and draw it, it rotates and works as expected. But here is the problem: when I try to calculate the direction for it to move in, like this:
position.x -= cosf(angle) * elapsedTime;
position.y -= sinf(-angle) * elapsedTime;
It moves, but looks 90 degrees off from the rotation. For example, it is facing directly up while moving to the right.
Up until this point, I have always solved this problem by using different angle values, i.e. subtracting 3.14159f / 2.0f radians from the angle used in the direction calculation:
position.x -= cosf(angle - (3.14159f / 2.0f));
position.y -= sinf(-angle - (3.14159f / 2.0f));
or vice versa, and this fixes the problem (now it moves in the direction it is facing).
But now I want to know exactly why this happens, and what the proper way to solve it is. Many thanks.

There are some missing items needed to diagnose this. You have to have some kind of coordinate system: is it a right-handed or a left-handed one? You can determine this by taking your X/Y origin and visualizing your hand over it with your thumb pointing towards you. When the positive X-axis rotates counter-clockwise, i.e. the way the fingers of your right hand curl when held as visualized, does it move towards the positive Y-axis (right-handed system) or towards the negative Y-axis (left-handed system)?
As an example, most algebra graphing is done with a right-handed system, but on the raw pixels of a monitor, positive Y points down instead of up as typically seen in algebra.
Direction of motion should be independent of rotation angle -- unless you really want them coupled, which is not typically the case.
Using two different variables for direction-of-motion and angle-of-rotation will allow you to visually and mentally decouple the two and see better what is happening.
Now, typically -- think algebra -- angles for rotation are measured starting from "east" (pointing to the right) as zero degrees, with "north" (pointing up) at 90 degrees, and so on.
If you want to move "straight up", you are not moving in the zero-degree direction but in the 90-degree direction. So if your rotation is zero degrees but movement is desired to be "straight up", the way the triangle points, then you should be using a 90-degree offset for your movement versus your rotation.
If you decouple rotation and motion, this behavior is much easier to understand and observe.
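As a rough sketch of that decoupling (reusing the olc::vf2d type from the question; speed is an illustrative constant, not from the original code), you can define the model pointing "east" so that angle zero matches the cosf/sinf convention, and derive one heading vector that both the drawn rotation and the movement share:
// Sketch only: the nose now points along +x at angle 0, so the standard
// (cos, sin) heading needs no 90-degree offset.
struct Triangle
{
    olc::vf2d p1 = { 10.0f,  0.0f };  // nose, "east" at angle 0
    olc::vf2d p2 = { -5.0f,  5.0f };
    olc::vf2d p3 = { -5.0f, -5.0f };
};

// One heading vector drives both the drawn rotation and the movement.
float hx = cosf(angle);
float hy = sinf(angle);
position.x += hx * speed * elapsedTime;
position.y += hy * speed * elapsedTime;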

Related

Get an angle between 2 points and rotate a point about another point with this angle - C++

I'm basically trying to create 2D lines based on points from Bézier curves.
All the points of the Bézier curves are well placed and everything seems in order.
Starting from these points, I'm creating 2 other points on the z axis which will be the borders of the line:
glm::vec3 p1 = pos[i];
p1.z = p1.z + (size / 2);
glm::vec3 p2 = pos[i];
p2.z = p2.z - (size / 2);
Then I change these points' positions by rotating them around the main point. pm is the mobile point rotating around the fixed point pf:
glm::vec3 rotP = glm::vec3(0.0f, 0.5f, 0.0f);
float co = cos(angle);
float si = sin(angle);
// CLOCKWISE
rotP.x = (pf.x - pm.x) * co + (pf.z - pm.z) * si + pm.x;
rotP.z = -(pf.x - pm.x) * si + (pf.z - pm.z) * co + pm.z;
angle is the angle between the backward and forward points on the Bézier curve, where depForward is the x axis, glm::vec3(1.0f, 0.0f, 0.0f):
glm::vec3 normForwardUnit = normalize(p2 - p1);
float angle = (acos(dot(depForward, normForwardUnit)));
The problem I get is that the rotations are wrong. Some of my lines are correct, but it seems to depend on the orientation of the lines.
(Screenshots omitted: one incorrect example and one correct example.)
I think the problem comes from the way I compute the rotation, but I'm still unable to understand it.
I tried normalizing the angle to different ranges:
//0 to 2PI
if (angle < 0) { angle += 2 * PI; }
//-PI to PI
if (angle > PI) { angle -= 2 * PI; }
else if (angle <= -PI) { angle += 2 * PI; }
Other ways to calculate the angle:
float angle = atan2(p2.z - p1.z, p2.x - p1.x);
To rotate the points counter-clockwise:
//COUNTER CLOCKWISE
rotP.x = (pf.x - pm.x) * co - (pf.z - pm.z) * si + pm.x;
rotP.z = (pf.x - pm.x) * si + (pf.z - pm.z) * co + pm.z;
In case anyone needs it, here's the implementation of paddy's approach.
You could use the point between backP and nextP instead of midPoint to place your new points, backP and nextP being the points before and after on the Bézier curve:
// VEC FORWARD VECTOR
glm::vec3 forwardVec = normalize(backP - nextP);
//PERPENDICULAR VEC
glm::vec3 perpVec = cross(forwardVec, glm::vec3(0.0f, 1.0f, 0.0f));
perpVec = normalize(perpVec);
//MID POINT
glm::vec3 midP = midPoint(backP, nextP);
// GEN POINTS
glm::vec3 p1 = midP + (width * perpVec);
glm::vec3 p2 = midP - (width * perpVec);
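Note that midPoint is not a glm function; a minimal definition, as assumed above, might look like this:
glm::vec3 midPoint(const glm::vec3& a, const glm::vec3& b)
{
    return (a + b) * 0.5f;  // halfway between the two curve points
}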
I think you should definitely look at this:
Is it possible to express "t" variable from Cubic Bezier Curve equation?
In case you insist on your way, you do not need any angles or rotations...
You have a line p0,p1 which is sampled from your polynomial curve, so:
tangent = p1-p0
However, it's better to have a more accurate approximation of the tangent, so either take it from the 1st derivative of your curve or use 2 consecutive lines (p0,p1), (p1,p2); then the tangent at point p1 is:
tangent = p2-p1
For more info see:
How do i verify the gradient at midpoint coordinate which i calculate by using cubic bezire curve equation
Now take the binormal (the z axis of your camera, which can be extracted from the camera matrix) and use the cross product to get the normal:
normal = normalize(cross(tangent,binormal))
Now you just displace p1 along the normal:
p1' = p1 + 0.5*curve_thickness*normal
p1'' = p1 - 0.5*curve_thickness*normal
Do the same for all points of your curve; after that you just render quads using the p' and p'' points, as in the sketch below.
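Putting those steps together, a minimal glm-based sketch (variable names are illustrative; curve_thickness is assumed to be your line width, and the binormal here is a fixed up axis standing in for the camera z axis described above):
glm::vec3 tangent  = glm::normalize(p2 - p1);      // from consecutive curve points
glm::vec3 binormal = glm::vec3(0.0f, 1.0f, 0.0f);  // camera z axis / up, per the text
glm::vec3 normal   = glm::normalize(glm::cross(tangent, binormal));

glm::vec3 leftEdge  = p1 + 0.5f * curve_thickness * normal;  // p1'
glm::vec3 rightEdge = p1 - 0.5f * curve_thickness * normal;  // p1''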
However, with this approach you might run into problems that need further tweaking; see:
draw outline for some connected lines

How do I use a left handed coordinate system for rendering?

I'm trying to set up view and projection matrices to work with my intended world coordinates and handedness. I'm going for a left-handed coordinate system: +X to your right, +Y above you, and +Z before you.
Y coordinates are working fine, but objects placed in front of the camera (+Z) are showing up behind it, so I have to turn the camera 180 degrees to see them. This was an easy fix, as flipping the view matrix's Z did it, but now objects are mirrored in X (text is seen as in a mirror). I tried negating each object's Z in its model matrix and that works fine, but I feel there should be a cleaner solution.
My issue is similar to this: Inverted X axis in OpenGL, but I couldn't find a proper solution.
This is the projection matrix code.
Matrix4 BuildPerspectiveMatrix(const float32 fov, const float32 aspectRatio, const float32 nearPlane, const float32 farPlane)
{
    Matrix4 matrix;

    // Tangent of half the vertical view angle.
    const auto yScale = 1.0f / Tangent(fov * 0.5f);
    const auto far_m_near = farPlane - nearPlane;

    matrix[0][0] = yScale / aspectRatio; // xScale
    matrix[1][1] = -yScale;
    matrix[2][2] = farPlane / (nearPlane - farPlane);
    matrix[2][3] = (farPlane * nearPlane) / (nearPlane - farPlane);
    matrix[3][2] = -1.0f;
    matrix[3][3] = 0.0f;

    return matrix;
}
Scene is setup like this:
Camera is at (0, 0, 0) (center of the world), object 1 is at (0, 0, 2) (2 units forward in front of the camera) and object 2 is at (1, 0, 2) (1 unit to the right and 2 units in front of the camera).
Any help is appreciated!
Vulkan, like non-legacy OpenGL and DX 11+, is independent of any chosen "handedness"; that's an artifact of the math library you're using (if any).
As to your actual question: the matrix you're building is right-handed because you assign -1 to matrix[3][2]. The left-handed version is the same except it has 1 in that location.
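As a sketch of that change (reusing the Matrix4, Tangent, and float32 helpers from the question's code; note, as an addition to the answer above, that the depth-scale term also needs its sign flipped so depth still lands in [0, 1]):
Matrix4 BuildPerspectiveMatrixLH(const float32 fov, const float32 aspectRatio, const float32 nearPlane, const float32 farPlane)
{
    Matrix4 matrix;
    const auto yScale = 1.0f / Tangent(fov * 0.5f);

    matrix[0][0] = yScale / aspectRatio;
    matrix[1][1] = -yScale;                                         // Vulkan-style Y flip kept as-is
    matrix[2][2] = farPlane / (farPlane - nearPlane);               // sign flipped for left-handed depth
    matrix[2][3] = (farPlane * nearPlane) / (nearPlane - farPlane); // unchanged
    matrix[3][2] = 1.0f;                                            // +1 instead of -1: left-handed
    matrix[3][3] = 0.0f;
    return matrix;
}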

Plotting data accurately (without lines) texture or ortho projection or something else?

I have some scatter graph data that I want to plot accurately. The actual data is a grid of values.
If I simply put all the graph data into a VBO and draw it as points, I see lines between my data points.
I believe this is to do with the conversion into screen space, so I need to apply a projection matrix.
I'm sure I want an ortho projection, but currently I seem to have a bug in my ortho matrix generation:
void OrthoMatrix(float[] matrix, float left, float right, float top, float bottom, float near, float far)
{
    float r_l = right - left;
    float t_b = top - bottom;
    float f_n = far - near;
    float tx = -(right + left) / (right - left);
    float ty = -(top + bottom) / (top - bottom);
    float tz = -(far + near) / (far - near);

    matrix[0] = 2.0f / r_l;
    matrix[1] = 0.0f;
    matrix[2] = 0.0f;
    matrix[3] = tx;

    matrix[4] = 0.0f;
    matrix[5] = 2.0f / t_b;
    matrix[6] = 0.0f;
    matrix[7] = ty;

    matrix[8] = 0.0f;
    matrix[9] = 0.0f;
    matrix[10] = 2.0f / f_n;
    matrix[11] = tz;

    matrix[12] = 0.0f;
    matrix[13] = 0.0f;
    matrix[14] = 0.0f;
    matrix[15] = 1.0f;
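    // NOTE (added for reference): the standard glOrtho matrix uses -2.0f / f_n
    // for the z scale above, and OpenGL expects column-major storage, so unless
    // the matrix is uploaded with transpose set to true, the translation terms
    // tx/ty/tz belong at indices 12..14 rather than 3/7/11.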
}
But I can't work it out. My shader is:
gl_Position = projectionMatrix * vec4(position, 0, 1.0);
and my data is plotted like this (subset):
-15.9 -9.6
-15.8 -9.6
-15.7 -9.6
-15.6 -9.6
-16.4 -9.5
-16.3 -9.5
-16.2 -9.5
-16.1 -9.5
-16 -9.5
-15.9 -9.5
The image on the left is correct (but with lines) and the image on the right is with ortho. (Screenshots omitted.)
A colleague has suggested first putting the data into a bitmap that conforms to the plot region and then loading that. To some degree that makes sense, but it seems like a step backwards, particularly as the image is still "stretched" and in reality we are just filling in those gaps.
EDIT
I tried a frustum projection using glm.net and that works exactly how I want it to.
The frustum function seems to take similar parameters to ortho; I think it's time I went and read up a bit more on projection matrices!
If anyone can explain the strange image I got (the ortho one), that would be fantastic.
EDIT 2
Now that I am adding zooming, I am seeing the same lines. I will have to either draw the points as quads or map the point size to the zoom level.
The "lines" you see are actually an artifact of your data samples locations becoming coerced into the pixel grid, which of course involves a rounding step. OpenGL assumes pixel centers to be at x+0.5 in viewport coordinates (i.e. after the NDC to viewport mapping).
The only way to get rid of these lines is to increase the resolution at which points are mapped into viewport space. But then your viewport contains only that much of pixels, so what can you do? Well the answer is "supersampling" or "multisampling", i.e. have several samples per pixels and the coverage of these sample points modulates the rasterization weighting. Now if you fear implementing this yourself, fear not. While implementing it "by hand" is certainly possible it's not efficient: Most (all) modern GPU come with some kind of support for multisampling, usually advertised as "antialiasing".
So that's what you should do: Create an OpenGL context with full screen antialiasing support, enable multisampling and see the grid frequency beat artifacts (aliasing) vanish.
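For example, a minimal sketch assuming GLFW as the windowing library (the question doesn't say which one is in use):
glfwWindowHint(GLFW_SAMPLES, 8);   // request an 8x multisampled framebuffer
GLFWwindow* window = glfwCreateWindow(800, 600, "scatter", nullptr, nullptr);

// ... after making the context current ...
glEnable(GL_MULTISAMPLE);          // enable MSAA during rasterization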

Orbiting object around orbiting object

How do I orbit the green circle around the orange one, and the blue around the green?
I found many solutions which work fine for rotating around a static point (in this case the orange circle), but I didn't find any good math equation that would work for both static and moving points.
angle += sunRot;
if (angle > 360.0f)
{
    angle = 0.0f;
}
float radian = glm::radians(angle);
float radius = glm::distance(position, rotCenter);
float x = rotCenter.x + (radius * cosf(radian));
float z = rotCenter.z + (radius * sinf(radian));
glm::vec3 newPos = glm::vec3(x, 0, z);
setPosition(newPos);
Here is what I'm trying to achieve. (Thanks to @George Profenza for sharing the link.)
Base all your calculations on the radius and angle of the current object where possible and store the radius and angle with the object.
In particular, do not calculate the radius based on the x/y coordinates in every iteration: If the base object has moved between steps, your calculated radius will be slightly off and the error will accumulate.
You should be able to nest coordinate spaces in OpenGL using glPushMatrix()/glPopMatrix() calls. Here's a basic example (press the mouse to see the coordinate spaces).
The syntax isn't C++, but it's easy to see what I mean.
You can do this multiple ways:
polar coordinate formula
manually multiplying transformation matrices
simply using push/pop matrix calls (along with translate/rotate where needed), which do the matrix multiplication for you behind the scenes.
Just in case you want to try the polar coordinate formula:
x = cos(angle) * radius
y = sin(angle) * radius
Where angle is the current rotation of a circle and radius is its distance from the centre of rotation.
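A minimal sketch of that idea (names are illustrative, using glm as in the question): each body stores its own angle and radius and orbits its parent's current position, so moving parents are handled automatically:
struct Body
{
    float angle  = 0.0f;       // current orbit angle in radians
    float speed  = 0.0f;       // angular velocity per update
    float radius = 0.0f;       // fixed distance from the parent
    glm::vec3 position = glm::vec3(0.0f);
};

void updateOrbit(Body& body, const glm::vec3& parentPos)
{
    body.angle += body.speed;
    body.position.x = parentPos.x + body.radius * cosf(body.angle);
    body.position.z = parentPos.z + body.radius * sinf(body.angle);
}

// green orbits the (possibly moving) orange sun, blue orbits green:
updateOrbit(green, sun.position);
updateOrbit(blue,  green.position);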

OpenGL - first person camera

I'm afraid I'm experiencing gimbal lock while trying to implement a first-person camera in OpenGL.
void rotateFPS_OY(float angle)
{
    forward = forward * cos(angle) + right * sin(angle);
    right = forward.CrossProduct(up);
}

void rotateFPS_OX(float angle)
{
    up = up * cos(angle) + forward * sin(angle);
    forward = up.CrossProduct(right);
}
Later on, I call gluLookAt:
Vector3D center = position + forward;
gluLookAt(position.x, position.y, position.z,
          center.x, center.y, center.z,
          up.x, up.y, up.z);
When testing this out, it seems that after a few movements the camera rolls (the right vector changes). The calculations seem correct; I just can't tell what is wrong.
Normalize the vectors after calculating them.
I've managed to implement the camera successfully by keeping forward.y and right.y at 0 and calculating a lookAt vector independently of the forward/right/up vectors.
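A minimal sketch of that fix (assuming the question's Vector3D type has an (x, y, z) constructor): track yaw and pitch as plain angles and rebuild the look direction each frame instead of accumulating rotations, using a constant world up axis so no roll can creep in:
float yaw = 0.0f, pitch = 0.0f;   // updated from mouse input elsewhere

void updateCamera(const Vector3D& position)
{
    // Rebuild the forward vector from the two angles every frame.
    Vector3D forward(cos(pitch) * sin(yaw),
                     sin(pitch),
                     cos(pitch) * cos(yaw));
    Vector3D center = position + forward;
    gluLookAt(position.x, position.y, position.z,
              center.x, center.y, center.z,
              0.0f, 1.0f, 0.0f);   // constant world up prevents accumulated roll
}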