Draping a 2D point on a 3D terrain - OpenGL

I am using OpenTK(OpenGL) and a general hint will be helpful.
I have a 3d terrain. I have one point on this terrain O(x,y,z) and two perpendicular lines passing through this point that will serve as my X and Y axes.
Now I have a set of 2D points which are in polar coordinates (range, theta). I need to find which points on the terrain correspond to these points. I am not sure what is the best way to do it. I can think of two ideas:
Lets say I am drawing A(x1,y1).
Find the intersection of the terrain with the plane that passes through O and A and is perpendicular to the XY plane. This will give me a polyline (semantics may be off). Then, on this polyline, I find a point that is visible from O and is at a distance equal to the range.
Create a circle perpendicular to the XY plane with radius "range", find its intersection points with the terrain, determine which ones are visible from O and drop the rest.
I understand I can find several points which satisfy the conditions, so I will do further checks based on topography, but for now I need to get a smaller set which satisfies these conditions.
I am new to OpenGL, but I understand geometry pretty well. I am wondering if something like this already exists in OpenGL, since it is a standard problem for ground measuring systems.

As you say, both of the options you present will give you more than the one point you need. As I understand your problem, you only need to perform a change of basis from polar coordinates (r, angle) to Cartesian coordinates (x, y).
This is fairly straightforward to do. Assuming that the two coordinate spaces share the origin O and that the angle is measured from the x-axis, then the point (r_i, angle_i) maps to x_i = r_i*cos(angle_i) and y_i = r_i*sin(angle_i). If those assumptions aren't correct (i.e. if the origins aren't coincident or the angle is not measured from a ray parallel to the x-axis), then the transformation is a bit more complicated but can still be done.
If your terrain is represented as a height map, or 2D array of heights (e.g. Terrain[x][y] = z), once you have the point in Cartesian coordinates (x_i, y_i) you can find the height at that point. Of course (x_i, y_i) might not fall exactly on one of the [x][y] indices of the height map.
In that case, I think you have a few options:
Choose the closest (x,y) point and take that height; or
Interpolate the height at (x_i,y_i) based on the surrounding points in the height map.
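For illustration, here is a minimal C++ sketch of the conversion and the second (interpolated) lookup, assuming a hypothetical height map sampled on a unit grid (bounds checking omitted):

#include <cmath>
#include <vector>

// Hypothetical height map: heights[x][y] = z, sampled on a unit grid.
double sampleHeight(const std::vector<std::vector<double>>& heights,
                    double xi, double yi)
{
    int x0 = static_cast<int>(std::floor(xi));   // cell containing (xi, yi)
    int y0 = static_cast<int>(std::floor(yi));
    double fx = xi - x0;                         // fractional position inside the cell
    double fy = yi - y0;

    // Bilinear interpolation of the four surrounding samples.
    double h00 = heights[x0][y0],     h10 = heights[x0 + 1][y0];
    double h01 = heights[x0][y0 + 1], h11 = heights[x0 + 1][y0 + 1];
    return (1 - fx) * (1 - fy) * h00 + fx * (1 - fy) * h10
         + (1 - fx) * fy * h01 + fx * fy * h11;
}

// Convert a polar sample (r, theta) measured from origin O into a terrain point,
// assuming theta is measured from the x-axis as described above.
void polarToTerrain(double ox, double oy, double r, double theta,
                    const std::vector<std::vector<double>>& heights,
                    double& x, double& y, double& z)
{
    x = ox + r * std::cos(theta);
    y = oy + r * std::sin(theta);
    z = sampleHeight(heights, x, y);
}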
Unfortunately I am also learning OpenGL and can not provide any specific insights there, but I hope this helps solve your problem.

Reading your description I see a bit of confusion... maybe.
You have defined the point O(x,y,z). Fine, this is your pole for the 3D coordinate system. Then you want to find a point defined by polar coordinates. That's fine also - it gives you a 2D location. Basically all you need to do is pinpoint the location A'(x,y,0) in 3D, because we are assuming you know the elevation of A at (r,t), which you of course do from the terrain there.
The angle (t) can be measured only from one axis. Choose which axis will be your polar north and stick to it. Then you measure off the range r you have and - voila! - you have your location. What's the point of having a 2D coordinate set if you don't use it? Instead, you're adding visibility to the mix - I assume it is important, but the highest terrain point on azimuth (t) will NOT NECESSARILY be at range (r).
You have specific coordinates. Just like RonL suggests, convert to (x,y), find (z) from the actual terrain and be done with it.
Unless that's not what you need. But in that case a different question is in order: what are you looking for?

Related

Duplicate points along NURBS curve

In my current project I have implemented NURBS curves, and at the beginning of the curve I have some 3D points, which are all located in the normal plane of the point at u = 0.0. Now I want to copy these points to other locations on the curve (e.g. u = 0.5) to create some kind of extrude/sweep mechanism. My theoretical approach is to create a local coordinate system at the point u = 0.0 and to calculate the coordinates of every point in relation to this system. Then I can create local coordinate systems at the desired points and place the points there. My problem is that from the first derivative of the NURBS curve I can get the tangent and therefore the normal plane of the point / system (local X direction), but I don't know how to orient the system. My first idea was to take the second derivative of the NURBS curve and use it to calculate the local Y and Z axes of the system, but the results of the second derivative do not seem to be suitable for this approach.
Is there a common approach to solve this problem?
As an additional question, I am wondering how to dictate the tangent vector of a given control point, for example the tangent of the first control point. Currently I solve this by dictating the position of the second control point, which does not seem very elegant.
We solved the same problem using this approach:
https://www.microsoft.com/en-us/research/wp-content/uploads/2016/12/Computation-of-rotation-minimizing-frames.pdf
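For reference, here is a minimal C++ sketch of one step of the double-reflection method described in that paper; the Vec3 type and helper functions below are illustrative, not from any particular library:

struct Vec3 { double x, y, z; };

Vec3 sub(Vec3 a, Vec3 b)     { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 scale(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
double dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }

// One step of the double-reflection rotation-minimizing-frame method.
// (x0, t0, r0): current point, its tangent and the current reference ("up") vector.
// (x1, t1): next point and its tangent.
// Returns r1, the propagated reference vector; the remaining axis is cross(t1, r1).
Vec3 nextReferenceVector(Vec3 x0, Vec3 t0, Vec3 r0, Vec3 x1, Vec3 t1)
{
    Vec3 v1 = sub(x1, x0);
    double c1 = dot(v1, v1);
    // First reflection: reflect r0 and t0 in the plane bisecting x0 and x1.
    Vec3 rL = sub(r0, scale(v1, 2.0 / c1 * dot(v1, r0)));
    Vec3 tL = sub(t0, scale(v1, 2.0 / c1 * dot(v1, t0)));
    // Second reflection: align the reflected tangent with t1.
    Vec3 v2 = sub(t1, tL);
    double c2 = dot(v2, v2);
    return sub(rL, scale(v2, 2.0 / c2 * dot(v2, rL)));
}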
It looks like you would like to find a local coordinate system at any given point on the NURBS curve. If this is the case, the Frenet frame is the typical choice. See this link for more details.
As for the issue of the "tangent vector of a given control point": since control points in general do not lie on the NURBS curve, a control point does not have a tangent vector. If you really need one for some special reason, you can use the tangent vector at the point on the curve that is closest to the control point.

My plane is not vertical - how to update point cloud coordinates to lie on a vertical plane

I have a bunch of points lying on a vertical plane. In reality this plane should be exactly vertical, but when I visualize the point cloud there is a slight inclination (nearly 2 degrees) from the vertical. At the moment I can only calculate this inclination. Concerning other errors, I assume there are no shifts or anything like that.
So, I want to update the coordinates of my point data so that they lie on the vertical plane. I think I should do some kind of transformation. It may be only a rotation about the X-axis; I am not sure what it would be.
I guess you understand my question. Honestly, I am poor at mathematics, so please let me know how to update my point coordinates so that they lie on the exact vertical plane.
Note: As I am implementing this in C++ and there are many programmers who have sound knowledge of these things, I am posting this question under C++.
UPDATES
To say exactly what I have done so far:
I have point cloud data representing a vertical object plus its surroundings. (The data is collected by a moving scanner and may have axis deviations from the correct world axes.) The problem is that I cannot say for sure whether there is an error in my data or not. Therefore, I checked this with a vertical planar object (which is also the dominant object in my data). In reality that plane is truly vertical, but when I fit a plane after removing outliers, the fitted plane is not truly vertical and has nearly 2 degrees of inclination. Therefore, I suspect that my data has some error. So I want to update my whole point cloud (including the points on the plane and the points which represent other objects) in a way that lays those planar points exactly on the vertical plane. Then, I guess, all the points will be updated to their correct positions as in reality. That is, all (x,y,z) coordinates should be updated.
As an example, please refer to the figure below.
The left side represents the original point cloud (as you can see, the points themselves are not vertical); the black line shows the vertical plane which I fitted and the red line is the zenith line. As you can see, the fitted plane is inclined.
So, I want to update my whole data set as in the right figure, so that after updating, if I fit a plane again (removing outliers), it is exactly parallel to the zenith line. Please help me.
I may be able to help you out, considering I worked with planes recently. First of all, how come the points aren't coplanar from the get go? I'd make the points coplanar in the first place instead of them being at an inclination (from what origin?), and then having to fix them. Also, having the points be coplanar on your first go would increase efficiency.
Sorry if this is the answer you're not looking for, but I need more information before I can help you out. Also, 3D math is hard. If you work with it enough, it starts to get pounded into your head, where you will NEVER forget it, especially if you went through the headaches I had to go through.
I did a bit of thinking on it, and since you want to rotate about a single axis, your rotation happens in a single plane (the xz-plane here), which means we can make this a 2D problem. After doing a bit of research on Wikipedia, this may be your solution.
new x = ((x - intended x) * cos(angle)) - (z * sin(angle)) + intended x
new z = ((x - intended x) * sin(angle)) + (z * cos(angle))
What I'm doing here is subtracting our intended x value from our current x value, so that we make (intended x, 0) our point of origin to rotate around. After the point is rotated, I add the intended x value back to the x coordinate so that we get the correct result.
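Written out as a small function (the name and pivot parameter are just illustrative), the rotation about (intended x, 0) in the xz-plane looks like this:

#include <cmath>

// Rotate (x, z) by 'angle' radians around the pivot (pivotX, 0) in the xz-plane.
void rotateXZ(double& x, double& z, double angle, double pivotX)
{
    double dx = x - pivotX;                                             // move pivot to the origin
    double newX = dx * std::cos(angle) - z * std::sin(angle) + pivotX;  // add pivot back to x
    double newZ = dx * std::sin(angle) + z * std::cos(angle);
    x = newX;
    z = newZ;
}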
Depending on where you got your points from (some kind of measurement, I guess) and what you want to do with them, there are several different things you could do with your data.
The search keyword "regression plane" might help - there are several ways of finding planes approximating point clouds, and several ways to "snap" points to planes.
Edit: You want to apply a rotation around the axis defined by the cross product of the normal vector of your regression plane and the normal of your desired plane, about a reference point of your choice. From your illustration I take it that you probably want the bottom of your vertical planar object to be the reference point for the rotation.
So you've got your reference point, you now know the axis around which you want to rotate, and the angle. All you need to do is:
Translation (to get to your point of reference)
Rotation
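For illustration, here is a minimal C++ sketch of those steps using Rodrigues' rotation formula; the names are illustrative, and the fitted normal, desired normal and reference point are assumed to come from your regression step:

#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 sub(Vec3 a, Vec3 b)     { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 add(Vec3 a, Vec3 b)     { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 scale(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
double dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 cross(Vec3 a, Vec3 b)   { return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x}; }
Vec3 normalize(Vec3 a)       { double l = std::sqrt(dot(a, a)); return scale(a, 1.0 / l); }

// Rotate point p by 'angle' radians around the unit axis k passing through 'ref'
// (Rodrigues' rotation formula): translate, rotate, translate back.
Vec3 rotateAroundAxis(Vec3 p, Vec3 ref, Vec3 k, double angle)
{
    Vec3 v = sub(p, ref);
    Vec3 rotated = add(add(scale(v, std::cos(angle)),
                           scale(cross(k, v), std::sin(angle))),
                       scale(k, dot(k, v) * (1.0 - std::cos(angle))));
    return add(rotated, ref);
}

// Axis and angle between the fitted plane normal and the desired vertical normal:
//   Vec3 axis  = normalize(cross(fittedNormal, desiredNormal));
//   double ang = std::acos(dot(normalize(fittedNormal), normalize(desiredNormal)));
// then apply rotateAroundAxis(p, referencePoint, axis, ang) to every point of the cloud.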
I read your question again, and hopefully this answer will help you out. If there's anything else I need to know, please tell me.
Now, in order to rotate anything, there must be a center point to rotate around. You've already been able to detect the angle of inclination, so now we need a formula for rotating a point by a certain angle around an origin. In addition, since this problem only occurs in a 2D plane, we can use this basic formula to readjust the points. For any two axes x and y:
x' = (x - x.origin) * cos(theta) - (y - y.origin) * sin(theta) + x.origin
y' = (x - x.origin) * sin(theta) + (y - y.origin) * cos(theta) + y.origin
Theta is the angle that you will rotate around in a counter-clockwise direction, x' and y' are your new points, and x.origin and y.origin are the coordinates of the point you will be rotating around. Now I don't know if my math is 100% correct on this, but if it's not, hopefully you can change a thing or two and it will work.

Replicating Blender bezier curves in a C++ program

I'm trying to export (3D) bezier curves from Blender to my C++ program. I asked a related question a while back, where I was successfully directed to use De Casteljau's Algorithm to evaluate points (and tangents to these points) along a bezier curve. This works well. In fact, perfectly. I can export the curves and evaluate points along the curve, as well as the tangent to these points, all within my program using De Casteljau's Algorithm.
However, in 3D space a point along a bezier curve and the tangent to this point are not enough to define a "frame" that a camera can lock into, if that makes sense. To put it another way, there is no "up vector", which is required for a camera's orientation to be properly specified at any point along the curve. Mathematically speaking, there are an infinite number of normal vectors at any point along a 3D bezier curve.
I've noticed when constructing curves in Blender that they aren't merely infinitely thin lines, they actually appear to have a proper 3D orientation defined at any point along them (as shown by the offshooting "arrow lines" in the screenshot below). I'd like to replicate what blender does here as closely as possible in my program. That is, I'd like to be able to form a matrix that represents an orientation at any point along a 3D bezier curve (almost exactly as it would in Blender itself).
Can anyone lend further guidance here, perhaps someone with an intimate knowledge of Blender's source code? (But any advice is welcome, Blender background or not.) I know it's open source, but I'm having a lot of trouble isolating the code responsible for these curve calculations due to the vastness of the program.
Some weeks ago, I found a solution to this problem. I post it here in case someone else needs it:
1) For a given point P0, calculate the tangent vector T0.
One simple, easy way is to take the next point on the curve, subtract the current point, then normalize the result:
T0 = normalize(P1 - P0)
Another, more precise, way to get the tangent is to calculate the derivative of your bezier curve function.
Then, pick an arbitrary vector V (for example, you can use (0, 0, 1)).
Make N0 = crossproduct(T0, V) and B0 = crossproduct(T0, N0) (don't forget to normalize the result vectors after each operation).
You now have a starting set of coordinates (P0, B0, T0, N0).
This is the initial camera orientation.
2) Then, to calculate next points and their orientation :
Calculate T1 using the same method as for T0.
Here is the trick: the new reference frame is calculated from the previous frame:
N1 = crossproduct(B0, T1)
B1 = crossproduct(T1, N1)
Proceed using the same method for the other points. The result is that the camera rotates slightly around the tangent vector depending on how the curve changes direction. Loops will be handled correctly (the camera won't twist like in my previous answer).
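For illustration, here is a minimal C++ sketch of this propagation loop; it assumes your existing De Casteljau evaluator is available as bezier(t), and the small Vec3 helpers below are only sketched:

#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };

Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 crossproduct(Vec3 a, Vec3 b) { return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x}; }
Vec3 normalize(Vec3 a) { double l = std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); return {a.x / l, a.y / l, a.z / l}; }

Vec3 bezier(double t);   // assumed: your existing De Casteljau evaluator

// Build a (T, N, B) frame at each sample by propagating the previous frame,
// exactly as in the steps above.
void buildFrames(int steps, std::vector<Vec3>& T, std::vector<Vec3>& N, std::vector<Vec3>& B)
{
    double dt = 1.0 / steps;

    // Initial frame at t = 0, using an arbitrary vector V.
    Vec3 T0 = normalize(bezier(dt) - bezier(0.0));
    Vec3 V  = {0.0, 0.0, 1.0};
    Vec3 N0 = normalize(crossproduct(T0, V));
    Vec3 B0 = normalize(crossproduct(T0, N0));
    T.push_back(T0); N.push_back(N0); B.push_back(B0);

    // Each new frame is derived from the previous one.
    for (int i = 1; i <= steps; ++i) {
        double t = i * dt;
        Vec3 Ti = normalize(bezier(t) - bezier(t - dt));   // tangent via finite difference
        Vec3 Ni = normalize(crossproduct(B.back(), Ti));
        Vec3 Bi = normalize(crossproduct(Ti, Ni));
        T.push_back(Ti); N.push_back(Ni); B.push_back(Bi);
    }
}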
You can watch a live example here (not from me) : http://jabtunes.com/labs/3d/webgl_geometry_extrude_splines.html
First, we know that the normal vector you're searching for lies in the plane that is "locally perpendicular" to the curve at the specific point. So the real problem is to choose a single vector in this plane.
I made an empty object to track the curve and noticed that it behaves similarly to a rollercoaster cart: its "up" vector was correlated with the centrifugal force while it was moving along the curve. That force can be uniquely evaluated from the local shape of the curve.
I'm not very good at physics, but I would try to estimate that vector by evaluating two planes: the first is the previously mentioned perpendicular plane, and the second is a plane made of three neighboring points of a curve segment (if the curve is not straight, these form a triangle, which defines exactly one plane). The intersection of these two planes gives you an axis, and you only have to choose a direction for the calculated normal vector.
If I understand your question correctly, what you want is to get 3 orientation vectors (left, front, up) for any point of the curve.
Here is a simple method (there is a limitation, (*) see below):
1) Front vector :
Calculate a 3D point on the bezier curve for a given position (t). This is the point for which we will calculate the front, left and up vectors. We will call it current_point.
Calculate another 3D point on the curve, next to the first one (t + 0.01); let's call it next_point.
Note: I don't write the formula here, because I believe you already know how to do that.
Then, to calculate the front vector, just subtract the two points calculated previously:
vector front = next_point - current_point
Don't forget to normalize the result.
2) Left vector
Define a temporary "up" vector
vector up = vector(0.0f, 1.0f, 0.0f);
Now you can calculate left easily, using front and up :
vector left = CrossProduct(front, up);
3) Up vector
vector up = CrossProduct(left, front);
Using this method you can always calculate a front, left, up for any point along the curve.
(*) NOTE: this won't work in all cases. Imagine you have a loop in your curve, just like a rollercoaster loop. At the top of the loop your calculated up vector will be (0, 1, 0), while you may want it to be (0, -1, 0). The only way to solve that is to have two curves: one for points and one for up vectors (from which left and front can be calculated easily).

Estimating equation for plane if distances to various points on it are known

I know the distance to various points on a plane, as it is being viewed from an angle. I want to find the equation for this plane from just that information (5 to 15 different points, as many as necessary).
I will later use the equation for the plane to estimate what the distance to the plane should be at different points; in order to prove that it is roughly flat.
Unfortunately, a google search doesn't bring much up. :(
If you, indeed, know distances and not coordinates, then it is an ill-posed problem - there are an infinite number of planes that have points at the given distances from the origin.
This is easy to verify. Let's take the shortest distance D0 from the set of given distances {D0..DN-1}, and construct a plane with normal vector (D0,0,0) (a vector of length D0 along the x-axis). For each of the remaining distances we now have an infinite number of points that lie in this plane (forming in-plane circles around the point (D0,0,0)). Moreover, we can rotate all vectors by an arbitrary angle and get a new plane.
Here is a simple picture in 2D (distances to a line; it's simpler to draw ;) ).
As we can see, there are TWO points on the line for each distance D1..DN-1 > D0 - one is shown for each of D1 and D2, and the other two for these distances would be placed in the 4th quadrant (+x, -y). Moreover, we can rotate our line around the origin by an arbitrary angle and still satisfy the given distances.
I'm going to skip over the process of finding the best-fit plane, since it's been handled in some other answers, and talk about something else.
"Prove" takes us into statistical inference. The way this is done is you make a formal hypothesis "the surface is flat" and then see if the data supports rejecting this hypothesis at some confidence level.
So you can wind up saying "I'm not even 1% sure that the surface isn't flat" -- but you can't ever prove that it's flat.
Geometry? Sounds like a job for math.SE! What form will the equation take? Will it be a plane?
I will assume you want an accurate solution.
Find the absolute positions with geometry
Make a best fit regression line in C++ in 2 of the 3 dimensions.
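Assuming the absolute (x, y, z) positions have been recovered in the first step, a least-squares plane of the form z = a*x + b*y + c can then be fitted from the normal equations. A minimal C++ sketch (no external library, 3x3 solve via Cramer's rule):

#include <cmath>
#include <vector>

struct Point { double x, y, z; };

// Fit z = a*x + b*y + c to the points by least squares (normal equations,
// solved here with Cramer's rule). Returns false if the system is degenerate.
bool fitPlane(const std::vector<Point>& pts, double& a, double& b, double& c)
{
    double sxx = 0, sxy = 0, syy = 0, sx = 0, sy = 0, sz = 0, sxz = 0, syz = 0;
    double n = static_cast<double>(pts.size());
    for (const Point& p : pts) {
        sxx += p.x * p.x; sxy += p.x * p.y; syy += p.y * p.y;
        sx  += p.x;       sy  += p.y;       sz  += p.z;
        sxz += p.x * p.z; syz += p.y * p.z;
    }
    // Normal equations:
    // | sxx sxy sx |   | a |   | sxz |
    // | sxy syy sy | * | b | = | syz |
    // | sx  sy  n  |   | c |   | sz  |
    auto det3 = [](double m00, double m01, double m02,
                   double m10, double m11, double m12,
                   double m20, double m21, double m22) {
        return m00 * (m11 * m22 - m12 * m21)
             - m01 * (m10 * m22 - m12 * m20)
             + m02 * (m10 * m21 - m11 * m20);
    };
    double D = det3(sxx, sxy, sx, sxy, syy, sy, sx, sy, n);
    if (std::fabs(D) < 1e-12) return false;
    a = det3(sxz, sxy, sx, syz, syy, sy, sz, sy, n) / D;
    b = det3(sxx, sxz, sx, sxy, syz, sy, sx, sz, n) / D;
    c = det3(sxx, sxy, sxz, sxy, syy, syz, sx, sy, sz) / D;
    return true;
}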

Coordinate Transformation C++

I have a webcam pointed at a table at a slant and with it I track markers.
I have a transformationMatrix in OpenSceneGraph and its translation part contains the relative coordinates from the tracked Object to the Camera.
Because the camera is pointed at a slant, when I move the marker across the table both the Y and Z axes are updated, although all I want to be updated is the Z axis, because the height of the marker doesn't change, only its distance to the camera.
This has the effect that when I project a model onto the marker in OpenSceneGraph, the model is slightly off, and when I move the marker around, the Y and Z values are updated incorrectly.
So my guess is that I need a transformation matrix with which I multiply each point so that I have a new coordinate system which lies orthogonal to the table surface.
Something like this: A * v1 = v2, with v1 being the camera coordinates and v2 being my "table coordinates".
So what I did was choose 4 points to "calibrate" my system. I placed the marker at the top left corner of the screen, defined v1 as the current camera coordinates and v2 as (0,0,0), and I did that for 4 different points.
Then, taking the linear equations I get from having an unknown matrix and two known vectors, I solved for the matrix.
I thought the values I would get for the matrix would be the values I needed to multiply the camera coordinates by so that the model would be updated correctly on the marker.
But when I multiply the known camera coordinates I gathered before by the matrix, I don't get anything close to what my "table coordinates" are supposed to be.
Is my approach completely wrong, or did I just mess something up in the equations (solved with the help of wolframalpha.com)? Is there an easier or better way of doing this?
Any help would be greatly appreciated, as I am kind of lost and under some time pressure :-/
Thanks,
David
when I move the marker across the table the Y and Z axes are updated, although all I want to be updated is the Z axis, because the height of the marker doesn't change, only its distance to the camera.
This is only true when your camera's view direction is aligned with your Y axis (or Z axis). If the camera is not aligned with Y, the transform applies a rotation around the X axis, hence modifying both the Y and Z coordinates of the marker.
So my guess is that I need a transformation matrix with which I multiply each point so that I have a new coordinate system which lies orthogonal to the table surface.
Yes it is. After that, you will have 2 transforms:
T_table to express marker's coordinates in the table referential,
T_camera to express table coordinates in the camera referential.
Finding T_camera from a single 2D image is hard because there's no depth information.
This is known as the pose problem -- it has been studied by, among others, Daniel DeMenthon. He developed a fast and robust algorithm to find the pose of an object:
articles are available on his research homepage, section 4 "Model Based Object Pose" (in particular "Model-Based Object Pose in 25 Lines of Code", 1995);
code at the same place, section "POSIT (C and Matlab)".
Note that the OpenCV library offers an implementation of DeMenthon's algorithm. This library also offers a convenient and easy-to-use interface for grabbing images from a webcam. It's worth a try: OpenCV homepage
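As a rough sketch of the OpenCV route (cv::solvePnP solves the same pose problem; the marker positions, image points and camera intrinsics below are placeholders):

#include <opencv2/core.hpp>
#include <opencv2/calib3d.hpp>
#include <vector>

int main()
{
    // Known 3D marker positions in table coordinates (placeholder values).
    std::vector<cv::Point3f> objectPoints = {
        {0.0f, 0.0f, 0.0f}, {1.0f, 0.0f, 0.0f},
        {1.0f, 1.0f, 0.0f}, {0.0f, 1.0f, 0.0f}
    };

    // Corresponding 2D positions detected in the camera image (placeholder values).
    std::vector<cv::Point2f> imagePoints = {
        {100.0f, 120.0f}, {400.0f, 130.0f},
        {390.0f, 350.0f}, {110.0f, 340.0f}
    };

    // Intrinsic camera matrix (fx, fy, cx, cy) from a prior calibration (placeholders).
    cv::Mat cameraMatrix = (cv::Mat_<double>(3, 3) <<
        800.0,   0.0, 320.0,
          0.0, 800.0, 240.0,
          0.0,   0.0,   1.0);
    cv::Mat distCoeffs = cv::Mat::zeros(4, 1, CV_64F);   // assume no lens distortion

    // Recover the pose of the table with respect to the camera.
    cv::Mat rvec, tvec;
    cv::solvePnP(objectPoints, imagePoints, cameraMatrix, distCoeffs, rvec, tvec);

    // rvec/tvec express table coordinates in the camera frame; convert the
    // rotation vector to a 3x3 matrix and invert the transform to go the other way.
    cv::Mat R;
    cv::Rodrigues(rvec, R);
    return 0;
}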
If you know the location in the physical world of your four markers and you've recorded the positions as they appear on the camera, you ought to be able to derive some sort of transform.
When you do the calibration, surely you'd want to put the marker at the four corners of the table, not the screen? If you're just doing the corners of the screen, I imagine you're probably not taking into account the slant of the table.
Is the table literally just slanted relative to the camera or is it also rotated at all?