OpenGL 2D Circle - Rotated AABB Collision - c++

I have trouble figuring out a way to detect collision between a circle and a rotated rectangle. My approach was to first rotate both the circle and the rectangle by -angle, where angle is the rectangle's rotation in radians. That aligns the rectangle and the circle with the axes, so I can perform the basic circle - AABB collision detection.
bool CheckCollision(float circleX, float circleY, float radius, float left, float bottom, float width, float height, float angle) {
    // Rotating the circle and the rectangle by -angle
    circleX = circleX * cos(-angle) - circleY * sin(-angle);
    circleY = circleX * sin(-angle) + circleY * cos(-angle);
    left = left * cos(-angle) - bottom * sin(-angle);
    bottom = left * sin(-angle) + bottom * cos(-angle);
    glm::vec2 center(circleX, circleY);
    // calculate AABB info (center, half-extents)
    glm::vec2 aabb_half_extents(width / 2.0f, height / 2.0f);
    glm::vec2 aabb_center(
        left + aabb_half_extents.x,
        bottom + aabb_half_extents.y
    );
    // get difference vector between both centers
    glm::vec2 difference = center - aabb_center;
    glm::vec2 clamped = glm::clamp(difference, -aabb_half_extents, aabb_half_extents);
    // add clamped value to AABB_center and we get the point of the box closest to the circle
    glm::vec2 closest = aabb_center + clamped;
    // retrieve vector between circle center and closest point of the AABB and check if length <= radius
    difference = closest - center;
    return glm::length(difference) < radius;
}

Let the rectangle center be (rcx, rcy). Set the coordinate origin at this point and rotate the circle center about it (cx, cy are coordinates relative to the rectangle center):
cx = (circleX - rcx) * cos(-angle) - (circleY - rcy) * sin(-angle);
cy = (circleX - rcx) * sin(-angle) + (circleY - rcy) * cos(-angle);
Now get the squared distance from the circle center to the rectangle's closest point (zero means the circle center is inside the rectangle):
dx = max(abs(cx) - rect_width / 2, 0);
dy = max(abs(cy) - rect_height / 2, 0);
SquaredDistance = dx * dx + dy * dy;
Then compare it with the squared radius.
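Put together, a minimal C++ sketch of this approach (the function name and signature are mine; unlike the question's version, it takes the rectangle center rather than the bottom-left corner):
#include <cmath>
#include <algorithm>

bool CheckCollisionRotated(float circleX, float circleY, float radius,
                           float rcx, float rcy,
                           float width, float height, float angle) {
    // Rotate the circle center by -angle about the rectangle center,
    // bringing it into the rectangle's axis-aligned frame.
    float cx = (circleX - rcx) * std::cos(-angle) - (circleY - rcy) * std::sin(-angle);
    float cy = (circleX - rcx) * std::sin(-angle) + (circleY - rcy) * std::cos(-angle);
    // Distance from the circle center to the box along each axis;
    // zero means the center lies within the box's extent on that axis.
    float dx = std::max(std::abs(cx) - width / 2.0f, 0.0f);
    float dy = std::max(std::abs(cy) - height / 2.0f, 0.0f);
    // Compare squared distance with squared radius to avoid a sqrt.
    return dx * dx + dy * dy <= radius * radius;
}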

Finding the new target of camera relative to a direction in 3D space (bad title) C++

I have a camera set up at the coordinates (0, 0, 1000) and a cube at (0, 0, 0). There is a camera position vector, a rotation vector and a target vector.
When finding the target in 2D space, I would use:
newx = cos(angle); // this will be very small, so I would multiply it by 100 or something
newy = sin(angle); // same here, and so on
So in 3D space I'm assuming that I would use:
newx = cos(angle);
newy = sin(angle);
newz = tan(angle);
But because I'm using the mouse to find the x and y direction, the z rotation is always 0:
float x_diff = (WIDTH/2) - mousePos.x;
x_diff /= WIDTH;
float y_diff = (HEIGHT/2)- mousePos.y;
y_diff /= HEIGHT;
cameraRotation.x += /* too small of a number so multiply by 5 */ 5 * (FOV * x_diff);
cameraRotation.y += 5 * (FOV * y_diff);
cameraRotation.z += ???;
and so the target z will always be 0.
I could be doing this whole thing completely wrong, I don't know.
But to sum it up, I need help calculating the camera's target (FOV: 90) for its rotation in 3D space.
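For reference, a look direction built from yaw and pitch lies on the unit sphere, so all three components come from sin/cos rather than tan (which would blow up as the angle approaches 90 degrees). A minimal sketch of one common convention (all names here are hypothetical, not from the question):
#include <cmath>

// yaw rotates about the vertical (y) axis, pitch tilts up/down.
void computeTarget(float yaw, float pitch,
                   float eyeX, float eyeY, float eyeZ,
                   float& targetX, float& targetY, float& targetZ) {
    const float dist = 100.0f;  // arbitrary scale, like the "multiply by 100" above
    float dirX = std::cos(pitch) * std::sin(yaw);
    float dirY = std::sin(pitch);
    float dirZ = std::cos(pitch) * std::cos(yaw);
    // The target is a point in front of the camera, not a bare direction.
    targetX = eyeX + dist * dirX;
    targetY = eyeY + dist * dirY;
    targetZ = eyeZ + dist * dirZ;
}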

I have a device reporting left handed coordinate angle and magnitude, how do I represent that as a line on the screen from the center?

The device I am using generates vectors like this;
How do I translate polar coordinates (angle and magnitude) from a left-handed coordinate system to a Cartesian line drawn on a screen where the origin point is the middle of the screen?
I am displaying the line on a WT32-SC01 screen using C++. There is a tft.drawLine function, but its arguments are normal pixel locations, in which case 0,0 is the upper left corner of the screen.
This is what I have so far (abbreviated):
....
int screen_height = tft.height();
int screen_width = tft.width();
// Device can read to 12m and reports in mm
float zoom_factor = (screen_width / 2.0) / 12000.0;
int originY = (int)(screen_height / 2);
int originX = (int)(screen_width / 2);
// Offset is for screen scrolling. No screen offset to start
int offsetX = 0;
int offsetY = 0;
...
// ld06 holds the reported angles and distances.
Coord coord = polarToCartesian(ld06.angles[i], ld06.distances[i]);
drawVector(coord, WHITE);

Coord polarToCartesian(float theta, float r) {
    // cos() and sin() take radians
    float rad = theta * 0.017453292519;
    Coord converted = {
        (int)(r * cos(rad)),
        (int)(r * sin(rad))
    };
    return converted;
}

void drawVector(Coord coord, int color) {
    // Cartesian relative to the center of the screen, factoring zoom and pan
    int destX = (int)(zoom_factor * coord.x) + originX + offsetX;
    int destY = originY - (int)(zoom_factor * coord.y) + offsetY;
    // From the middle of the screen (originX, originY) to destination x,y
    tft.drawLine(originX, originY, destX, destY, color);
}
I have something drawing on the screen, but now I have to translate between the device's left-handed coordinate system and the screen's, and the whole plane is rotated 90 degrees. How do I do that?
If I understood correctly, your coordinate system has x pointing to the right and y pointing to the bottom, and you used the formula for the standard math coordinate system where y points up, so multiplying your sin by -1 should do the trick (if it doesn't, try multiplying random things by -1; it often works for this kind of problem).
I'm assuming (from your image) that your coordinate system has x going right, y going up, the angle measured clockwise from the y axis, and that (0,0) is also the center of your polar coordinates. If your goniometrics accept radians, then:
#include <math.h>
float x,y,ang,r;
const float deg = M_PI/180.0;
// ang = <0,360> // your angle
// r >= 0 // your radius (magnitude)
x = r*sin(ang*deg);
y = r*cos(ang*deg);
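Folded into the question's polarToCartesian, that looks like this (a sketch, assuming the device's angle really is measured clockwise from the y axis as above; drawVector already flips y for the screen, so no extra sign change is needed there):
#include <cmath>

struct Coord { int x; int y; };

Coord polarToCartesian(float theta, float r) {
    // cos() and sin() take radians
    float rad = theta * 0.017453292519;
    Coord converted = {
        (int)(r * sin(rad)),  // clockwise-from-north angle: x uses sin
        (int)(r * cos(rad))   // y uses cos
    };
    return converted;
}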

OpenGL ArcBall for rotating mesh

I am using legacy OpenGL to draw a mesh. I am now trying to implement an arcball class to rotate the object with the mouse. However, when I move the mouse, the object either doesn't rotate or rotates by far too big an angle.
This is the method that is called when the mouse is clicked:
void ArcBall::startRotation(int xPos, int yPos) {
    int x = xPos - context->getWidth() / 2;
    int y = context->getHeight() / 2 - yPos;
    startVector = ArcBall::mapCoordinates(x, y).normalized();
    endVector = startVector;
    rotating = true;
}
This method is simply meant to re-center the mouse coordinates at the middle of the screen and map them onto the bounding sphere, resulting in a start vector.
This is the method that is called when the mouse moves:
void ArcBall::updateRotation(int xPos, int yPos) {
    int x = xPos - context->getWidth() / 2;
    int y = context->getHeight() / 2 - yPos;
    endVector = mapCoordinates(x, y).normalized();
    rotationAxis = QVector3D::crossProduct(endVector, startVector).normalized();
    angle = (float)qRadiansToDegrees(acos(QVector3D::dotProduct(startVector, endVector)));
    rotation.rotate(angle, rotationAxis.x(), rotationAxis.y(), rotationAxis.z());
    startVector = endVector;
}
This method is again meant to map the mouse coordinates so they are centered at the middle of the screen, then compute the new vector and derive a rotation axis and angle from these two vectors.
I then use
glMultMatrixf(ArcBall::rotation.data());
to apply the rotation
I recommend storing the mouse position at the point where you initially click in the view. Calculate the amount of mouse movement in window coordinates. The distance of the movement has to be mapped to an angle. The rotation axis is perpendicular (normal) to the direction of the mouse movement. The result is a rotation of the object similar to this WebGL demo.
Store the current mouse position in startRotation. Note: store the coordinates of the mouse position, not a normalized vector:
// xy normalized device coordinates:
float ndcX = 2.0f * xPos / context->getWidth() - 1.0f;
float ndcY = 1.0 - 2.0f * yPos / context->getHeight();
startVector = QVector3D(ndcX, ndcY, 0.0);
Get the current position in updateRotation:
// xy normalized device coordinates:
float ndcX = 2.0f * xPos / context->getWidth() - 1.0f;
float ndcY = 1.0 - 2.0f * yPos / context->getHeight();
endVector = QVector3D(ndcX, ndcY, 0.0);
Calculate the vector from the start position to the end position:
QVector3D direction = endVector - startVector;
The rotation axis is normal to the direction of movement:
rotationAxis = QVector3D(-direction.y(), direction.x(), 0.0).normalized();
Note that even though the type of direction is QVector3D, it is still a 2-dimensional vector: a vector in the XY plane of the viewport representing the mouse movement, with a z coordinate of 0. A 2-dimensional vector (x, y) can be rotated 90 degrees counter-clockwise by taking (-y, x).
The length of the direction vector represents the angle of rotation. A mouse motion across the entire screen results in a vector of length 2.0. So if dragging over the full screen should result in a full rotation, the length of the vector has to be multiplied by PI; if a half rotation should be performed, then by PI/2:
angle = (float)qRadiansToDegrees(direction.length() * 3.141593);
Finally the new rotation has to be applied to the existing rotation and not to the model:
QMatrix4x4 addRotation;
addRotation.rotate(angle, rotationAxis.x(), rotationAxis.y(), rotationAxis.z());
rotation = addRotation * rotation;
Final code listing of the methods startRotation and updateRotation:
void ArcBall::startRotation(int xPos, int yPos) {
    // xy normalized device coordinates:
    float ndcX = 2.0f * xPos / context->getWidth() - 1.0f;
    float ndcY = 1.0f - 2.0f * yPos / context->getHeight();
    startVector = QVector3D(ndcX, ndcY, 0.0);
    endVector = startVector;
    rotating = true;
}

void ArcBall::updateRotation(int xPos, int yPos) {
    // xy normalized device coordinates:
    float ndcX = 2.0f * xPos / context->getWidth() - 1.0f;
    float ndcY = 1.0f - 2.0f * yPos / context->getHeight();
    endVector = QVector3D(ndcX, ndcY, 0.0);
    QVector3D direction = endVector - startVector;
    rotationAxis = QVector3D(-direction.y(), direction.x(), 0.0).normalized();
    angle = (float)qRadiansToDegrees(direction.length() * 3.141593);
    QMatrix4x4 addRotation;
    addRotation.rotate(angle, rotationAxis.x(), rotationAxis.y(), rotationAxis.z());
    rotation = addRotation * rotation;
    startVector = endVector;
}
If you instead want a rotation around the up axis of the object plus a tilt along the view-space x axis, the calculation is different. First apply the rotation matrix around the y axis (up vector), then the current view matrix, and finally the rotation around the x axis:
view-matrix = rotate-X * view-matrix * rotate-Y
The updateRotation method then has to look like this:
void ArcBall::updateRotation(int xPos, int yPos) {
    // xy normalized device coordinates:
    float ndcX = 2.0f * xPos / context->getWidth() - 1.0f;
    float ndcY = 1.0f - 2.0f * yPos / context->getHeight();
    endVector = QVector3D(ndcX, ndcY, 0.0);
    QVector3D direction = endVector - startVector;
    float angleY = (float)qRadiansToDegrees(-direction.x() * 3.141593);
    float angleX = (float)qRadiansToDegrees(-direction.y() * 3.141593);
    QMatrix4x4 rotationX;
    rotationX.rotate(angleX, 1.0f, 0.0f, 0.0f);
    QMatrix4x4 rotationUp;
    rotationUp.rotate(angleY, 0.0f, 1.0f, 0.0f);
    rotation = rotationX * rotation * rotationUp;
    startVector = endVector;
}

FPS Camera using JOGL and gluLookAt

In display I have:
Camera c = new Camera(canvas);
glu.gluLookAt(c.eyeX, c.eyeY, c.eyeZ, c.point_X, c.point_Y, c.point_Z, 0, 1, 0);
Those variables come from an object of my Camera class, which has:
float eyeX = 5.0f, eyeY = 5.0f, eyeZ = 5.0f;
float camYaw = 0.0f; //camera rotation in X axis
float camPitch = 0.0f; //camera rotation in Y axis
float point_X = 10.0f;
float point_Y = 5.0f;
float point_Z = 5.0f;
I calculate the delta of the mouse movement; camYaw goes from 0 to 360 degrees
and camPitch from -90 to 90 degrees (or 0 to 2*PI and -PI/2 to +PI/2).
This works fine, but when I put the calculated point_X, point_Y, point_Z into gluLookAt(), it moves the cam in a strange way (it seems to rotate the camera around an invisible sphere, depending on the radius used in the equations):
public void updateCamera() {
    float radius = 5.0f;
    point_X = (float) (radius * Math.cos(camYaw) * Math.sin(camPitch));
    point_Z = (float) (radius * Math.sin(camPitch) * Math.sin(camYaw));
    point_Y = (float) (radius * Math.cos(camPitch));
}
I'm trying to convert polar to Cartesian coordinates.
The larger the radius, the better it "works".
Changing from degrees to radians still doesn't work.
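One plausible fix (not from the thread): gluLookAt expects a world-space target point, so the offset computed from the angles has to be added to the eye position; otherwise the target orbits the world origin instead of the camera, which would also explain why larger radii seemed to behave better. A sketch in C++ with camPitch measured from the horizon (0 = level); the expressions map one-to-one onto the Java class:
#include <cmath>

void updateCamera(float eyeX, float eyeY, float eyeZ,
                  float camYaw, float camPitch,
                  float& point_X, float& point_Y, float& point_Z) {
    const float radius = 5.0f;  // any positive value works once the eye offset is added
    // Direction on the unit sphere, scaled by radius, centered on the eye.
    point_X = eyeX + radius * std::cos(camPitch) * std::cos(camYaw);
    point_Y = eyeY + radius * std::sin(camPitch);
    point_Z = eyeZ + radius * std::cos(camPitch) * std::sin(camYaw);
}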

Point in OBB (Oriented Bounding Box) algorithm?

Given a center point, width, height and angle forming an OBB, how can I find if a given point P is inside the OBB?
Thanks
I take it that the wrinkle in your problem is that the bounding box can be rotated? If so, the easiest solution to me seems to be to do all calculations in the rotated coordinate plane, centered on the center of the bounding box.
To calculate the coordinates of the point relative to these axes:
newy = sin(angle) * (oldx - centerx) + cos(angle) * (oldy - centery);
newx = cos(angle) * (oldx - centerx) - sin(angle) * (oldy - centery);
(you may need to adjust this depending on how angle is supposed to be measured, I'll leave that to you, since you didn't specify)
Then hit test, the normal way:
return (newy > -height / 2) && (newy < height / 2)
    && (newx > -width / 2) && (newx < width / 2);
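Putting the transform and the hit test together as one function, a minimal sketch (assuming angle is the box's counter-clockwise rotation in radians, so the point is rotated by -angle to undo it; per the caveat above, flip the sin signs if your convention differs):
#include <cmath>

bool pointInOBB(float px, float py,
                float centerx, float centery,
                float width, float height, float angle) {
    // Offset of the point from the box center.
    float dx = px - centerx;
    float dy = py - centery;
    // Rotate the offset by -angle into the box's local axes.
    float newx =  std::cos(angle) * dx + std::sin(angle) * dy;
    float newy = -std::sin(angle) * dx + std::cos(angle) * dy;
    // Axis-aligned containment test against the half-extents.
    return std::abs(newx) < width / 2 && std::abs(newy) < height / 2;
}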
You could transform the coordinates of your test point (via a transformation matrix) into a rotated coordinate system based on the angle of the bounding box.
At this stage it should just be an axis-aligned point-in-rectangle test, i.e. compare against xmin, xmax, ymin, ymax. In the rotated coordinate system, xmin = xmid - width/2, xmax = xmid + width/2, ymin = ymid - height/2 and ymax = ymid + height/2.
Hope this helps.