Rotate 2D Vector with an Aspect Ratio - C++

Many of us are familiar with the approach to rotating a 2D vector around the origin given an angle theta:
newX = x * cos(theta) - y * sin(theta);
newY = x * sin(theta) + y * cos(theta);
I'm now trying to rotate coordinates in image UV space, which looks like this:
(Image borrowed from this SO question.)
Here the units of the u axis are wider than those of the v axis, so the approach above leads to the coordinates rotating around an ellipse as opposed to a circle. I need the rotation of the vector to act as though the coordinates were square, meaning the aspect ratio needs to be accounted for. I thought it'd be as simple as stretching the coordinates to a square space, rotating, then stretching back, although it still appears that the vectors are rotating elliptically:
newX = (x * cos(theta) * Aspect - y * sin(theta)) / Aspect;
newY = x * sin(theta) * Aspect + y * cos(theta);
Any help is appreciated, thanks in advance!

The general version for rotation and aspect ratio is:
(center_x, center_y) being the center of rotation
(aspect_x, aspect_y) being the aspect ratio
tmp_x = (x-center_x)/aspect_x
tmp_y = (y-center_y)/aspect_y
rot_x = tmp_x * cos(theta) - tmp_y * sin(theta)
rot_y = tmp_x * sin(theta) + tmp_y * cos(theta)
new_x = aspect_x * rot_x + center_x
new_y = aspect_y * rot_y + center_y
Hope that helps.
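As a concrete illustration, here is a minimal C++ sketch of the same idea for UV coordinates (the function name, the single width/height aspect value, and the in/out parameters are assumptions, not from the posts above):
#include <cmath>

// Hypothetical helper: rotate (u, v) around (cu, cv) by theta radians.
// aspect is assumed to be texture width / height, so u offsets are
// stretched into square space before rotating and squashed back after.
void rotateUV(float u, float v, float cu, float cv,
              float theta, float aspect,
              float& outU, float& outV)
{
    float x = (u - cu) * aspect;   // stretch u into square space, relative to the center
    float y =  v - cv;
    float rx = x * std::cos(theta) - y * std::sin(theta);   // ordinary 2D rotation
    float ry = x * std::sin(theta) + y * std::cos(theta);
    outU = rx / aspect + cu;       // squash back into UV space and restore the center
    outV = ry + cv;
}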

Related

Finding the new target of camera relative to a direction in 3D space (bad title) C++

I have a camera set up with the coordinates of 0, 0, 1000 and a cube at 0, 0, 0. There is a camera position vector, rotation vector and target vector.
When finding the target, in 2d space I would use:
newx = cos(angle); // this will be very small, so I would multiply it by 100 or something
newy = sin(angle); // same, and so on
So in 3d space I'm assuming that I would use:
newx = cos(angle);
newy = sin(angle);
newz = tan(angle);
But because I'm using the mouse to find the x and y direction the z rotation is always 0:
float x_diff = (WIDTH/2) - mousePos.x;
x_diff /= WIDTH;
float y_diff = (HEIGHT/2)- mousePos.y;
y_diff /= HEIGHT;
cameraRotation.x += /* too small of a number so multiply by 5 */ 5 * (FOV * x_diff);
cameraRotation.y += 5 * (FOV * y_diff);
cameraRotation.z += ???;
and so the target z will always be 0.
I could be doing this whole thing completely wrong, I don't know.
But to sum it up, I need help calculating the camera's target (FOV: 90) from its rotation in 3D space.
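For reference, a common way to turn two rotation angles into a 3D view direction uses only cos and sin (no tan). A sketch, under the assumption that cameraRotation.x is a yaw around the vertical Y axis and cameraRotation.y is a pitch, both in radians:
#include <cmath>

// Hypothetical sketch: build a unit direction vector from yaw/pitch.
// The camera target is then position + direction * someDistance.
void directionFromYawPitch(float yaw, float pitch,
                           float& dirX, float& dirY, float& dirZ)
{
    dirX = std::cos(pitch) * std::sin(yaw);
    dirY = std::sin(pitch);
    dirZ = std::cos(pitch) * std::cos(yaw);   // the third component also comes from cos/sin, not tan
}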

Determining texture co-ordinates across a geodesic sphere

I've generated a geodesic sphere for OpenGL rendering, following a question on here, and I'm trying to put a texture on it. I came up with the following code by reversing an algorithm for a point on a sphere:
//complete circle equation is as follows
///<Summary>
///x = r * sin(s) * sin(t)
///y = r * cos(t)
///z = r * cos(s) * sin(t)
///</Summary>
float radius = 1.0f;
//T (height/latitude) angle
float angleT = acos(point.y / radius) ;
//S (longitude )angle
float angleS = ( asin(point.x / (radius * sin(angleT)))) + (1.0f* M_PI);
float angleS2 =( acos(point.z / (radius * sin(angleT)))) + (1.0f * M_PI);
//Angle can be 0-PI (0-180 degs), divide by this to get 0-1
angleT = angleT / (M_PI);
//Angle can be 0-2PI (0-360 degs)
angleS = angleS / ( M_PI *2 );
angleS2 = angleS2 / ( M_PI *2 );
//Flip the y co-ord
float yTex = 1 - angleT;
float xTex = 0.0f;
//I have found that angleS2 is valid 0.5-1.0, and angleS is valid (0.3-0.5)
if (angleS < 0.5f)
{
xTex = angleS;
}
else
{
xTex = angleS2;
}
return glm::vec2( xTex , yTex);
As you can see, I've found that both versions of calculating the S angle have limited valid ranges.
float angleS = ( asin(point.x / (radius * sin(angleT)))) + (1.0f* M_PI);
float angleS2 =( acos(point.z / (radius * sin(angleT)))) + (1.0f * M_PI);
S1 gives valid answers between x texture co-ords 0.3 and 0.5, and S2 gives valid answers between x texture co-ords 0.5 and 1.0 (conversion to co-ords omitted above but present in the first code example). Why is it that neither formula is giving me valid answers for under 0.3?
Thanks
Will
(Images in the original post, with their captions: "Correct on this side"; "The weird border between working and not, probably caused by opengl's interpolation"; "Reversed section"; "The image being used"; and an edit showing the seam.)
The equations you use to calculate the longitude angle are not correct for what you are trying to accomplish. For the longitude angle, the range you require is 0-360 degrees, which cannot be obtained through the asin or acos functions, because those functions only return results between -90 and 90 degrees or 0 and 180 degrees respectively. You can, however, use the atan2 function, which returns values from the correct interval. The code I've been working with for the past 2 years is the following:
float longitude = atan2f(point.x, point.z) + (float)M_PI;
This equation will map the horizontal center of the texture in the direction of positive Z axis. If you want the horizontal center of the texture to be in the direction of positive X axis, add M_PI / 2.0.
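Putting the longitude and latitude together, a minimal sketch of the whole UV computation (assuming a unit-radius sphere and glm, as in the question):
#include <cmath>
#include <glm/glm.hpp>

// Hypothetical sketch: spherical UV for a point on a unit sphere.
glm::vec2 sphereUV(const glm::vec3& point)
{
    // Longitude: full 0..2*PI range via atan2, normalized to 0..1
    float longitude = atan2f(point.x, point.z) + (float)M_PI;
    float u = longitude / (2.0f * (float)M_PI);
    // Latitude: 0..PI via acos (radius assumed to be 1), normalized and flipped
    float latitude = acosf(point.y);
    float v = 1.0f - latitude / (float)M_PI;
    return glm::vec2(u, v);
}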

Determine position in front of quaternion with GLM?

Is there a way to calculate the XYZ position in front of a quaternion (XYZW) rotation, preferably using GLM?
I know the Quat rotation and the Position of the object I want to calculate the position in front of.
I know how to calculate the position in front of a rotation matrix where you have a Front vector, Up vector and Right vector, but in this case I only have XYZW values (where W is always 0, I never see it becoming 1..?)
In very short:
The data I have: Quat (X Y Z W) and Position(X Y Z) and I want to calculate PositionInFront(Position, Quat, Distance, &X, &Y, &Z)
How to accomplish this goal?
I tried a cast to a 3x3 matrix and performed the Up, Right, Front calculations (because a 3x3 matrix cast is these values, right?), but they do not return the correct positions.
Or would it be possible to determine the object's Z angle? (rotation around the world Z / height axis only)
It seemed that there were 2 more quaternion structures for the vehicle which I forgot to use, and those 3 are the complete set needed for the Front, Right, Up calculation formula:
float offX = 10.0f;
float offY = 0.0f;
float offZ = 0.0f;
float x = offX * info.Rotation.Front.x + offY * info.Rotation.Right.x + offZ * info.Rotation.Up.x + info.Pos.x;
float y = offX * info.Rotation.Front.y + offY * info.Rotation.Right.y + offZ * info.Rotation.Up.y + info.Pos.y;
float z = offX * info.Rotation.Front.z + offY * info.Rotation.Right.z + offZ * info.Rotation.Up.z + info.Pos.z;
float Angle = (atan2(x-info.Pos.x, y-info.Pos.y) * 180.0f / PI);
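If only the single orientation quaternion is available, a common alternative (a sketch, not taken from the posts above) is to rotate a local forward unit vector by the quaternion with GLM and step along it:
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// Hypothetical helper: position `distance` units in front of `pos`, where `rot`
// is the orientation quaternion and -Z is taken as the local forward axis
// (adjust the forward vector for your own convention).
glm::vec3 positionInFront(const glm::vec3& pos, const glm::quat& rot, float distance)
{
    glm::vec3 forward = rot * glm::vec3(0.0f, 0.0f, -1.0f); // rotate local forward into world space
    return pos + forward * distance;
}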

Point in OBB (Oriented Bounding Box) algorithm?

Given a center point, width, height and angle forming an OBB, how can I find if a given point P is inside the OBB?
Thanks
I take it that the wrinkle in your problem is that the bounding box can be rotated? If so, the easiest solution to me seems to be to do all calculations in the rotated coordinate plane, centered on the center of the bounding box.
To calculate the coordinates of the point relative to these axes:
newy = sin(angle) * (oldx - centerx) + cos(angle) * (oldy - centery);
newx = cos(angle) * (oldx - centerx) - sin(angle) * (oldy - centery);
(you may need to adjust this depending on how angle is supposed to be measured, I'll leave that to you, since you didn't specify)
Then hit test, the normal way:
return (newy > -height / 2) && (newy < height / 2)
&& (newx > -width / 2) && (newx < width / 2);
(Note that newx and newy are already relative to the box center, so the test compares against the half extents.)
You could transform the coordinates of your test point (via a transformation matrix) into a rotated coordinate system based on the angle of the bounding box.
At this stage it should just be an axis-aligned point-in-rectangle test, i.e. compare against xmin, xmax, ymin, ymax. In the rotated coordinate system, xmin = xmid - width/2, xmax = xmid + width/2, ymin = ymid - height/2 and ymax = ymid + height/2.
Hope this helps.
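A minimal self-contained sketch of that test (the struct layout and the convention that angle is the box's rotation in radians are assumptions):
#include <cmath>

struct OBB {
    float centerx, centery;   // center of the box
    float width, height;      // full extents
    float angle;              // rotation of the box, in radians
};

// Hypothetical helper: transform the point into the box's local frame
// (rotate by -angle around the center), then do an axis-aligned test.
bool pointInOBB(float px, float py, const OBB& box)
{
    float dx = px - box.centerx;
    float dy = py - box.centery;
    float localX =  std::cos(box.angle) * dx + std::sin(box.angle) * dy;
    float localY = -std::sin(box.angle) * dx + std::cos(box.angle) * dy;
    return std::fabs(localX) <= box.width  / 2.0f &&
           std::fabs(localY) <= box.height / 2.0f;
}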

Rotating back points from a rotated image in OpenCV

I’m having troubles with rotation.
What I want to do is this:
Rotate an image
Detect features on the rotated image (points)
Rotate back the points so I can have the points coordinates corresponding to the initial image
I’m a bit stuck on the third step.
I managed to rotate the image with the following code:
cv::Mat M(2, 3, CV_32FC1);
cv::Point2f center((float)dst_img.rows / 2.0f, (float)dst_img.cols / 2.0f);
M = cv::getRotationMatrix2D(center, rotateAngle, 1.0);
cv::warpAffine(dst_img, rotated, M, cv::Size(rotated.cols, rotated.rows));
I try to rotate back the points with this code:
float xp = r.x * std::cos( PI * (-rotateAngle) / 180 ) - r.y * sin(PI * (rotateAngle) / 180);
float yp = r.x * sin(PI * (-rotateAngle) / 180) + r.y * cos(PI * (rotateAngle) / 180);
It is not too far from working, but the points don't go back to the right place on the image; there is an offset.
Thank you for your help
If M is the rotation matrix you get from cv::getRotationMatrix2D, to rotate a cv::Point p with this matrix you can do this:
cv::Point result;
result.x = M.at<double>(0,0)*p.x + M.at<double>(0,1)*p.y + M.at<double>(0,2);
result.y = M.at<double>(1,0)*p.x + M.at<double>(1,1)*p.y + M.at<double>(1,2);
If you want to rotate a point back, generate the inverse matrix of M or use cv::getRotationMatrix2D(center, -rotateAngle, scale) to generate a matrix for reverse rotation.
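For a batch of points, one way to apply the reverse mapping (a sketch; the helper name and variable names are placeholders) is to invert the affine matrix and let cv::transform apply it:
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

// Hypothetical helper: map points detected on the rotated image back to the
// original image, where M is the 2x3 matrix that was used with cv::warpAffine.
std::vector<cv::Point2f> rotateBack(const std::vector<cv::Point2f>& rotatedPts,
                                    const cv::Mat& M)
{
    cv::Mat invM;
    cv::invertAffineTransform(M, invM);             // undoes both rotation and translation
    std::vector<cv::Point2f> originalPts;
    cv::transform(rotatedPts, originalPts, invM);   // applies the full affine to every point
    return originalPts;
}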
For a rotation matrix, its transpose is its inverse. So you can just do M.t() * r to move it back to your original frame, where r is a cv::Mat (you might have to convert it to a cv::Mat from a cv::Point2f or whatever, or just write out the matrix multiplication explicitly).
Here's the code to do it explicitly (should be correct, but warning, it's entirely untested):
cv::Point2f p;
// getRotationMatrix2D returns a CV_64F matrix, so access it as double.
p.x = M.at<double>(0, 0) * r.x + M.at<double>(1, 0) * r.y;
p.y = M.at<double>(0, 1) * r.x + M.at<double>(1, 1) * r.y;
// p contains r rotated back to the original frame. Note this uses only the
// 2x2 rotation part of M; the translation column (0,2)/(1,2) is not applied.
I had the same problem.
For a transform M and a point pp in the rotated image, we wish to find the point pp_org in the coordinates of the original image. Use the following lines:
cv::Mat_<double> iM;
cv::invertAffineTransform(M, iM);
cv::Point2f pp_org = iM*pp;
Where the operator * in the above line is defined as:
cv::Point2f operator*(cv::Mat_<double> M, const cv::Point2f& p)
{
    cv::Mat_<double> src(3 /*rows*/, 1 /*cols*/);
    src(0, 0) = p.x;
    src(1, 0) = p.y;
    src(2, 0) = 1.0;
    cv::Mat_<double> dst = M * src; // use matrix algebra
    return cv::Point2f(dst(0, 0), dst(1, 0));
}
Note: M is the rotation matrix you used to go from the original to the rotated image
You need to rotate your points according to the center point of your image.
Here x and y are the point you want to rotate, and imageCenter_x and imageCenter_y are the center point of your image.
Below is my code.
angle = angle * (M_PI / 180);              // degrees to radians
float axis_x = x - imageCenter_x;          // translate so the image center is the origin
float axis_y = y - imageCenter_y;
x = axis_x * cos(angle) + axis_y * sin(angle);    // rotate by -angle to undo the forward rotation
y = (-axis_x) * sin(angle) + axis_y * cos(angle);
x = x + imageCenter_x;                     // translate the center back
y = y + imageCenter_y;