Check if a point is within a "gluPartialDisk" (OpenGL and C++) - c++

I'm making a pie chart program and I'm creating the pie segments with "gluPartialDisks". However, I also want to check if a point is within the area of one of the disks (The point in question being my mouse cursor). I know how to find the position of a mouse cursor, but how can I check if it is within the area of a disk?
Quick snippet of code:
glTranslatef(-0.3, 0, 0);
gluPartialDisk(gluNewQuadric(), 0, 0.65, 10, 1,
               (2 * 3.141592654 * 0.65) * (/*Specific angle*/) - (/*Specific angle*/ * 5),
               /*Different angle*/ * 360);

As long as your partial disks are parallel to the screen, and rendered with a parallel projection, it's easiest to do the math without getting OpenGL involved at all.
Say you were drawing a partial disk with:
glTranslatef(xPos, yPos, 0.0f);
gluPartialDisk(quadric, innerRad, outerRad, slices, loops, startAng, sweepAng);
Now if you want to test point (x0, y0), you subtract the translation vector, and then calculate the polar coordinates:
x0 -= xPos;
y0 -= yPos;
float dist = sqrt(x0 * x0 + y0 * y0);
float ang = atan2(y0, x0);
To be inside the partial disk, the distance to the center would have to be within the range of radii:
if (dist < innerRad || dist > outerRad) {
    // it's outside!
}
The angle is slightly trickier because it wraps around. Also, the result of atan2() is in radians, measured counter-clockwise from the x-axis in a range [-PI, PI] while the arguments to gluPartialDisk() are in degrees, and measured clockwise from the y-axis. With startAng and sweepAng in the range [0.0, 360.0] degrees, the interval test logic could look like this (untested):
ang *= 180.0f / PI;  // convert to degrees
ang = 90.0f - ang;   // make clockwise, relative to y-axis
if (ang < 0.0f) {
    ang += 360.0f;   // wrap into range [0.0, 360.0]
}
ang -= startAng;     // make relative to startAng
if (ang < 0.0f) {
    ang += 360.0f;   // ... and back into range [0.0, 360.0]
}
if (ang > sweepAng) {
    // it's outside!
} else {
    // it's inside!
}
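Putting the pieces together, a minimal hit-test helper could look like the following untested sketch (the function name is mine; innerRad, outerRad, startAng and sweepAng are the same values you pass to gluPartialDisk(), with both angles in degrees):
#include <cmath>

// Sketch: point-in-partial-disk test for a disk drawn at (xPos, yPos).
// startAng/sweepAng are in degrees, clockwise from the +y axis, matching
// the gluPartialDisk() convention described above.
bool pointInPartialDisk(float x0, float y0,
                        float xPos, float yPos,
                        float innerRad, float outerRad,
                        float startAng, float sweepAng)
{
    const float PI = 3.14159265f;

    // Work relative to the disk center.
    x0 -= xPos;
    y0 -= yPos;

    // Radial test.
    float dist = std::sqrt(x0 * x0 + y0 * y0);
    if (dist < innerRad || dist > outerRad)
        return false;

    // Angular test: convert to degrees, clockwise from the y-axis.
    float ang = std::atan2(y0, x0) * 180.0f / PI;
    ang = 90.0f - ang;
    if (ang < 0.0f)
        ang += 360.0f;
    ang -= startAng;
    if (ang < 0.0f)
        ang += 360.0f;

    return ang <= sweepAng;
}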

OpenGL is not going to do this for you, unfortunately.
You can either compute a bounding area for your disk and then do some point vs. bounding area intersection testing (which would be complicated for a shape like this) or you can implement color picking.
Since this is for a charting program, it may be very useful to go with the latter approach. The idea is to assign each object in your scene a unique color code, draw the scene into the back buffer, and then read back the color at the cursor's position. This approach is pixel-perfect; the read-back makes it too slow for performance-critical applications, but for a simple charting program it is a perfect fit.
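A minimal sketch of that idea, assuming a legacy OpenGL context: drawPieSegment() stands in for whatever routine you already use to render one segment, and the picking pass is read back before the buffers are swapped so the flat-colored frame is never shown:
#include <GL/gl.h>

void drawPieSegment(int index);   // hypothetical: your existing routine for segment 'index'

// Render every segment in a unique flat color into the back buffer,
// read the pixel under the cursor and map it back to a segment index.
int pickSegment(int mouseX, int mouseY, int windowHeight, int segmentCount)
{
    glDisable(GL_LIGHTING);        // unlit, flat colors only
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    for (int i = 0; i < segmentCount; ++i) {
        // Encode the segment index in the red channel (up to 255 segments).
        glColor3ub((GLubyte)(i + 1), 0, 0);
        drawPieSegment(i);
    }

    // glReadPixels uses a bottom-left origin, so flip the y coordinate.
    GLubyte pixel[3] = { 0, 0, 0 };
    glReadPixels(mouseX, windowHeight - mouseY, 1, 1,
                 GL_RGB, GL_UNSIGNED_BYTE, pixel);

    return (int)pixel[0] - 1;      // -1 means background, otherwise the segment index
}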

Related

Why is there a difference of 90 degrees between rotation and direction?

Firstly, this problem is not unique to the project I am currently working on; it has happened several times before. Here is the problem.
I have a Triangle struct. olc::vf2d is a vector class with x and y components.
struct Triangle
{
    olc::vf2d p1 = {  0.0f, -10.0f };
    olc::vf2d p2 = { -5.0f,   5.0f };
    olc::vf2d p3 = {  5.0f,   5.0f };
};
I create a triangle along with a position and angle for it.
Triangle triangle;
olc::vf2d position = { 0.0f, 0.0f };
float angle = 0.0f;
Now, I rotate (and offset) the triangle like so:
float x1 = triangle.p1.x * cosf(angle) - triangle.p1.y * sinf(-angle) + position.x;
float y1 = triangle.p1.x * sinf(-angle) + triangle.p1.y * cosf(angle) + position.y;
float x2 = triangle.p2.x * cosf(angle) - triangle.p2.y * sinf(-angle) + position.x;
float y2 = triangle.p2.x * sinf(-angle) + triangle.p2.y * cosf(angle) + position.y;
float x3 = triangle.p3.x * cosf(angle) - triangle.p3.y * sinf(-angle) + position.x;
float y3 = triangle.p3.x * sinf(-angle) + triangle.p3.y * cosf(angle) + position.y;
When I increase the angle every frame and draw it, it rotates and works as expected. But now here is the problem. When I try to calculate the direction for it to move towards, like this:
position.x -= cosf(angle) * elapsedTime;
position.y -= sinf(-angle) * elapsedTime;
It moves, but it is off by 90 degrees from the rotation: for example, it is facing directly up while moving to the right.
Up until this point, I have always solved this problem by using different angle values, i.e. subtracting 3.14159f / 2.0f radians from the angle used in the direction calculation:
position.x -= cosf(angle - (3.14159f / 2.0f));
position.y -= sinf(-angle - (3.14159f / 2.0f));
or vice versa, and this fixes the problem (now it moves in the direction it is facing).
But now I want to know exactly why this happens and what the proper way to solve it is. Many thanks.
There are some missing items needed to diagnose this. You have to have some kind of coordinate system: is it right-handed or left-handed? You can determine this by taking your X/Y origin and visualizing your hand over it with your thumb pointing towards you. When the positive X-axis rotates counter-clockwise, i.e. the way the fingers of your right hand curl when held as visualized, does it move towards the positive Y-axis (right-handed system) or towards the negative Y-axis (left-handed system)?
As an example, most algebra graphing is done with a right-handed system, but on the raw pixels of a monitor positive Y points down instead of up, unlike what is typically seen in algebra.
Direction of motion should be independent of rotation angle -- unless you really want them coupled, which is not typically the case.
Using two different variables for direction-of-motion and angle-of-rotation will allow you to visually and mentally decouple the two and see better what is happening.
Now, typically (think algebra) angles for rotation are measured so that "east" (pointing to the right) is zero degrees, "north" (pointing up) is 90 degrees, and so on.
If you want to move "straight up" you are not moving in the zero-degree direction, but rather in the 90-degree direction. So if your rotation is zero degrees but movement is desired to be "straight up" like the triangle points, then you should be using a 90-degree offset for your movement versus your rotation.
If you decouple rotation and motion, this behavior is much easier to understand and observe.
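As a concrete illustration, here is a minimal, framework-neutral sketch of that decoupling (all names are hypothetical; the sign of the 90-degree offset depends on whether your Y axis points up or down, as discussed above):
#include <cmath>

const float PI = 3.14159265f;

// The model triangle points "up", i.e. 90 degrees away from the zero-degree
// ("east") reference direction, so that offset lives in exactly one place.
const float MODEL_FACING_OFFSET = PI / 2.0f;

float heading = 0.0f;              // single angle: where the triangle faces
float posX = 0.0f, posY = 0.0f;    // position, driven by the same heading

void update(float elapsedTime, float turnInput, float speed)
{
    heading += turnInput * elapsedTime;

    // Movement direction derived from the facing angle plus the fixed model
    // offset, not from a second, independently maintained angle.
    float moveAngle = heading + MODEL_FACING_OFFSET;
    posX += std::cos(moveAngle) * speed * elapsedTime;
    posY += std::sin(moveAngle) * speed * elapsedTime;
}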

How to draw only 1/4 of a circle in OpenGL C++

I'm trying to draw only a sector/part of a circle, but currently I always get a full circle.
I use this to draw a circle:
glColor3f (0.25, 1.0, 0.25);
GLfloat angle, raioX=0.3f, raioY=0.3f;
GLfloat circle_points = 100.0f;
glBegin(GL_LINE_LOOP);
for (int i = 0; i < circle_points; i++) {
    angle = 2*PI*i/circle_points;
    glVertex2f(0.5+cos(angle)*raioX, 0.5+sin(angle)*raioY);
}
glEnd();
Assuming you want a sector as illustrated in the following diagram:
[Diagram: a circular arc, a sector, and a segment]
You will need to re-write your code this way:
glBegin (GL_LINE_LOOP);
glVertex2f (0.5f, 0.5f);
for (int i = 0; i < circle_points; i++) {
    angle = 2*PI*i/circle_points;
    glVertex2f (0.5+cos(angle)*raioX, 0.5+sin(angle)*raioY);
}
glEnd ();
The only thing I changed was the addition of the point 0.5,0.5 at the center of your circle. Without that point, you wind up drawing a segment instead of a sector.
As BDL points out, your original code drew a full circle. Your angle for 1/4 of a circle should be Pi/2 rather than 2*Pi. So at minimum, you would also need to re-write this line:
angle = PI * 0.5f * i / circle_points;
BDL's answer shows a more efficient approach to this, though it draws an arc, which may or may not be what you want. Either way, you have enough code now to draw all three things in the diagram above.
The code you will frequently see, with a cos() and sin() call for each point, is correct but very inefficient. Those are fairly expensive functions, and it's easy to write the code so that they are only needed once.
The idea is that you obtain each point from the previous point by rotating it by the angle increment. The rotation itself can be performed by a 2x2 transformation matrix. This reduces the calculation of each point to a few additions and multiplications.
The code will then look something like this:
// Calculate angle increment from point to point, and its cos/sin.
float angInc = 0.5f * PI / (circle_points - 1.0f);
float cosInc = cos(angInc);
float sinInc = sin(angInc);

// Start with vector (1.0f, 0.0f), ...
float xc = 1.0f;
float yc = 0.0f;

// ... and then rotate it by angInc for each point.
glBegin(GL_LINE_LOOP);
for (int i = 0; i < circle_points; i++) {
    glVertex2f(0.5f + xc, 0.5f + yc);
    float xcNew = cosInc * xc - sinInc * yc;
    yc = sinInc * xc + cosInc * yc;
    xc = xcNew;
}
glEnd();
As a subtle detail, note that if you want to draw a quarter circle with circle_points points, including the start and end point, you need to divide the angle range by circle_points - 1 to obtain the angle increment. It's the thing with the number of fence posts and number of gaps between them...
This will draw a circle segment. Andon already elaborated on the difference between a segment and a sector.
The above shares some code with my own answer here: https://stackoverflow.com/a/25321141/3530129, which shows how to draw a circle with modern OpenGL.
When drawing a fraction of a circle, one needs to limit the angle range in which the points are placed. circle_points then defines how many subparts the circle arc is divided into. In addition (as pointed out by @Andon M. Coleman), GL_LINE_LOOP might not be the correct choice, since it always closes the line from the last point back to the first.
Your code could be modified like this:
glColor3f (0.25, 1.0, 0.25);
GLfloat angle, raioX=0.3f, raioY=0.3f;
GLfloat circle_points = 100;
GLfloat circle_angle = PI / 2.0f;
glBegin(GL_LINE_STRIP);
for (int i = 0; i <= circle_points; i++) {
    GLfloat current_angle = circle_angle*i/circle_points;
    glVertex2f(0.5+cos(current_angle)*raioX, 0.5+sin(current_angle)*raioY);
}
glEnd();

Orbiting object around orbiting object

How do I get the green circle to orbit the orange one, and the blue circle to orbit the green one?
I found many solutions that work fine for rotating around a static point (in this case the orange circle), but didn't find any good maths equation that would work for both static and moving points.
angle += sunRot;
if (angle > 360.0f)
{
    angle = 0.0f;
}
float radian = glm::radians(angle);
float radius = glm::distance(position, rotCenter);
float x = rotCenter.x + (radius * cosf(radian));
float z = rotCenter.z + (radius * sinf(radian));
glm::vec3 newPos = glm::vec3(x, 0, z);
setPosition(newPos);
Here is what I'm trying to achieve (thanks to @George Profenza for sharing the link)
Base all your calculations on the radius and angle of the current object where possible and store the radius and angle with the object.
In particular, do not calculate the radius based on the x/y coordinates in every iteration: If the base object has moved between steps, your calculated radius will be slightly off and the error will accumulate.
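A minimal sketch of that bookkeeping (the Body struct and its members are hypothetical names): each body stores its own angle and radius and derives its position from its parent's current position every frame, so no error accumulates:
#include <cmath>

struct Body {
    float angle  = 0.0f;    // current orbital angle in radians
    float speed  = 0.0f;    // angular speed in radians per second
    float radius = 0.0f;    // distance from the parent body
    float x = 0.0f, z = 0.0f;

    // parent == nullptr means this body is the static centre.
    void update(float elapsedTime, const Body* parent)
    {
        if (!parent)
            return;                       // static centre, nothing to do
        angle += speed * elapsedTime;
        x = parent->x + radius * std::cos(angle);
        z = parent->z + radius * std::sin(angle);
    }
};

// Usage each frame: orange is static, green orbits orange, blue orbits green.
// green.update(dt, &orange);
// blue.update(dt, &green);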
You should be able to nest coordinate spaces in OpenGL using glPushMatrix()/glPopMatrix() calls. Here's a basic example (press the mouse to see the coordinate spaces).
The syntax isn't C++, but it's easy to see what I mean.
You can do this multiple ways:
polar coordinate formula
manually multiplying transformation matrices
simply using push/pop matrix calls (along with translate/rotate where needed), which does the matrix multiplication for you behind the scenes.
Just in case you want to try the polar coordinate formula:
x = cos(angle) * radius
y = sin(angle) * radius
Where angle is the current rotation of a circle and radius is its distance from the centre of rotation.
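For comparison, a rough fixed-function sketch of the push/pop approach (the draw helpers and the angle/radius parameters are hypothetical placeholders):
#include <GL/gl.h>

void drawOrangeCircle();   // hypothetical draw helpers
void drawGreenCircle();
void drawBlueCircle();

// Each glPushMatrix()/glPopMatrix() pair scopes one orbit, so the blue
// orbit is defined relative to green's already-transformed space.
void drawScene(float greenOrbitAngle, float blueOrbitAngle,
               float greenOrbitRadius, float blueOrbitRadius)
{
    drawOrangeCircle();                              // static centre

    glPushMatrix();
    glRotatef(greenOrbitAngle, 0.0f, 1.0f, 0.0f);    // green orbits orange
    glTranslatef(greenOrbitRadius, 0.0f, 0.0f);
    drawGreenCircle();

    glPushMatrix();
    glRotatef(blueOrbitAngle, 0.0f, 1.0f, 0.0f);     // blue orbits green
    glTranslatef(blueOrbitRadius, 0.0f, 0.0f);
    drawBlueCircle();
    glPopMatrix();

    glPopMatrix();
}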

Picking Ray is inaccurate

I'm trying to implement a picking ray via instructions from this website.
Right now I basically only want to be able to click on the ground to order my little figure to walk towards this point.
Since my ground plane is flat, non-rotated, and non-translated, I'd have to find the x and z coordinates of my picking ray when y hits 0.
So far so good, this is what I've come up with:
//some constants
float HEIGHT = 768.f;
float LENGTH = 1024.f;
float fovy = 45.f;
float nearClip = 0.1f;
//mouse position on screen
float x = MouseX;
float y = HEIGHT - MouseY;
//GetView() returns the viewing direction, not the lookAt point.
glm::vec3 view = cam->GetView();
view = glm::normalize(view); // glm::normalize returns a normalized copy; it does not modify its argument
glm::vec3 h = glm::cross(view, glm::vec3(0,1,0) ); //cameraUp
h = glm::normalize(h);
glm::vec3 v = glm::cross(h, view);
v = glm::normalize(v);
// convert fovy to radians
float rad = fovy * 3.14 / 180.f;
float vLength = tan(rad/2) * nearClip; //nearClippingPlaneDistance
float hLength = vLength * (LENGTH/HEIGHT);
v *= vLength;
h *= hLength;
// translate mouse coordinates so that the origin lies in the center
// of the view port
x -= LENGTH / 2.f;
y -= HEIGHT / 2.f;
// scale mouse coordinates so that half the view port width and height
// becomes 1
x /= (LENGTH/2.f);
y /= (HEIGHT/2.f);
glm::vec3 cameraPos = cam->GetPosition();
// linear combination to compute intersection of picking ray with
// view port plane
glm::vec3 pos = cameraPos + (view*nearClip) + (h*x) + (v*y);
// compute direction of picking ray by subtracting intersection point
// with camera position
glm::vec3 dir = pos - cameraPos;
//Get intersection between ray and the ground plane
pos -= (dir * (pos.y/dir.y));
At this point I'd expect pos to be the point where my picking ray hits my ground plane.
When I try it, however, I get something like this:
(The mouse cursor wasn't recorded)
It's hard to see since the ground has no texture, but the camera is tilted, like in most RTS games.
My pitiful attempt to model a remotely human looking being in Blender marks the point where the intersection happened according to my calculation.
So it seems that the transformation between view and dir somewhere messed up and my ray ended up pointing in the wrong direction.
The gap between the calculated position and the actual position increases the farther I move my mouse away from the center of the screen.
I've found out that:
HEIGHT and LENGTH aren't accurate. Since Windows cuts away a few pixels for borders, it'd be more accurate to use 1006x728 as the window resolution. I guess that could account for small discrepancies.
If I increase fovy from 45 to about 78 I get a fairly accurate ray. So maybe there's something wrong with what I use as fovy. I'm explicitly calling glm::perspective(45.f, 1.38f, 0.1f, 500.f) (fovy, aspect ratio, fNear, fFar respectively).
So here's where I am lost. What do I have to do in order to get an accurate ray?
PS: I know that there are functions and libraries that have this implemented, but I try to stay away from these things for learning purposes.
Here's working code (in Pascal, but the OpenGL calls map directly to C++) that does cursor-to-3D conversion using depth buffer info:
glGetIntegerv(GL_VIEWPORT, @fViewport);
glGetDoublev(GL_PROJECTION_MATRIX, @fProjection);
glGetDoublev(GL_MODELVIEW_MATRIX, @fModelview);
//fViewport already contains viewport offsets
PosX := X;
PosY := ScreenY - Y; //In OpenGL Y axis is inverted and starts from bottom
glReadPixels(PosX, PosY, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, @vz);
gluUnProject(PosX, PosY, vz, fModelview, fProjection, fViewport, @wx, @wy, @wz);
XYZ.X := wx;
XYZ.Y := wy;
XYZ.Z := wz;
If you only test ray/plane intersection, this is the second part, without the depth buffer:
gluUnProject(PosX, PosY, 0, fModelview, fProjection, fViewport, @x1, @y1, @z1); //Near
gluUnProject(PosX, PosY, 1, fModelview, fProjection, fViewport, @x2, @y2, @z2); //Far
//No intersection
Result := False;
XYZ.X := 0;
XYZ.Y := 0;
XYZ.Z := aZ;
if z2 < z1 then
  SwapFloat(z1, z2);
if (z1 <> z2) and InRange(aZ, z1, z2) then
begin
  D := 1 - (aZ - z1) / (z2 - z1);
  XYZ.X := Lerp(x1, x2, D);
  XYZ.Y := Lerp(y1, y2, D);
  Result := True;
end;
I find it rather different from what you are doing, but maybe that will make more sense.
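For reference, a rough C++/glm sketch of the same two-point idea, using glm::unProject and intersecting the resulting ray with the ground plane y = 0 (the view, projection and viewport values are assumed to be exactly what you render the scene with):
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>   // glm::unProject

// Returns true and writes the ground intersection into 'hit', or false
// if the ray is parallel to the plane or points away from it.
bool pickGround(float mouseX, float mouseY,
                const glm::mat4& view, const glm::mat4& projection,
                const glm::vec4& viewport, glm::vec3& hit)
{
    // Window y is measured from the top, OpenGL's origin is bottom-left.
    glm::vec3 win(mouseX, viewport.w - mouseY, 0.0f);   // near plane (depth 0)

    glm::vec3 nearPt = glm::unProject(win, view, projection, viewport);
    win.z = 1.0f;                                       // far plane (depth 1)
    glm::vec3 farPt  = glm::unProject(win, view, projection, viewport);

    glm::vec3 dir = farPt - nearPt;
    if (dir.y == 0.0f)
        return false;                  // ray parallel to the ground plane

    float t = -nearPt.y / dir.y;       // solve nearPt.y + t * dir.y == 0
    if (t < 0.0f)
        return false;                  // intersection is behind the camera

    hit = nearPt + t * dir;
    return true;
}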

Zooming into the mouse, factoring in a camera translation? (OpenGL)

Here is my issue: I have a scale point, which is the unprojected mouse position. I also have a "camera" which basically translates all objects by X and Y. What I want to do is zoom into the mouse position.
I've tried this:
1. Find the mouse's x and y coordinates
2. Translate by (x,y,0) to put the origin at those coordinates
3. Scale by your desired vector (i,j,k)
4. Translate by (-x,-y,0) to put the origin back at the top left
But this doesn't factor in a translation for the camera.
How can I properly do this? Thanks.
glTranslatef(controls.MainGlFrame.GetCameraX(),
             controls.MainGlFrame.GetCameraY(), 0);
glTranslatef(current.ScalePoint.x, current.ScalePoint.y, 0);
glScalef(current.ScaleFactor, current.ScaleFactor, 0);
glTranslatef(-current.ScalePoint.x, -current.ScalePoint.y, 0);
Instead of using glTranslate to move all the objects, you should try glOrtho. It takes as parameters the desired left, right, bottom, and top coordinates, plus the min/max depth.
For example, if you call glOrtho(-5, 5, -2, 2, ...); your screen will show all the points whose coordinates are inside a rectangle going from (-5,2) to (5,-2). The advantage is that you can easily adjust the zoom level.
If you don't multiply by any view/projection matrix (which I assume is the case), the default screen coords range from (-1,1) to (1,-1).
But in your project it can be very useful to control the camera. Call this before you draw any object instead of your glTranslate:
float left = cameraX - zoomLevel * 2;
float right = cameraX + zoomLevel * 2;
float top = cameraY + zoomLevel * 2;
float bottom = cameraY - zoomLevel * 2;
glOrtho(left, right, bottom, top, -1.f, 1.f);
Note that cameraX and cameraY now represent the center of the screen.
Now when you zoom on a point, you simply have to do something like this:
cameraX += (cameraX - screenX) * 0.5f;
cameraY += (cameraY - screenY) * 0.5f;
zoomLevel += 0.5f;
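Putting those pieces together, a minimal per-frame sketch might look like this (all names are hypothetical; worldX/worldY corresponds to screenX/screenY above, i.e. the cursor position already converted to world coordinates):
#include <GL/gl.h>

// Build the projection each frame from the camera centre and zoom level,
// instead of translating every object.
void applyCamera(float cameraX, float cameraY, float zoomLevel)
{
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(cameraX - zoomLevel * 2, cameraX + zoomLevel * 2,
            cameraY - zoomLevel * 2, cameraY + zoomLevel * 2,
            -1.0f, 1.0f);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}

// Same camera update as above, wrapped in a helper.
void zoomOnPoint(float& cameraX, float& cameraY, float& zoomLevel,
                 float worldX, float worldY)
{
    cameraX += (cameraX - worldX) * 0.5f;
    cameraY += (cameraY - worldY) * 0.5f;
    zoomLevel += 0.5f;
}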