Ray-bounded plane intersection - C++

I'm trying to write a ray tracer in my free time. Currently I'm working on ray vs. bounded-plane intersections.
My program already works with infinite planes, and I'm trying to work out the math for non-infinite planes. I tried to google, but all of the resources talk only about infinite planes.
My plane has a corner point (called position), from which two vectors (u and v) extend (their lengths correspond to the lengths of the sides). The ray has an origin and a direction.
First I calculate the ray parameter t of the intersection with the infinite plane, using the formula
t = normal * (position - origin) / (normal * direction)
The normal is calculated as a cross product of u and v.
Then with the formula
origin + direction * t
I get the intersection point itself.
The next step is checking if this point is in the bounds of the rectangle, and this is where I'm having trouble.
My idea was to take the relative vector intersection - position, which extends from the corner of the plane to the intersection point, transform it into the basis formed by u, normal and v, and then check whether the transformed coordinates lie within the lengths of u and v.
bool BoundedPlane::intersect(const Vec3f &origin, const Vec3f &direction, float &t) const {
    t = normal * (position - origin) / (normal * direction);
    Vec3f relative = (origin + direction * t) - position;
    Mat3f transform{
        Vec3f(u.x, normal.x, v.x),
        Vec3f(u.y, normal.y, v.y),
        Vec3f(u.z, normal.z, v.z)
    };
    Vec3f local = transform.mul(relative);
    return t > 0 && local.x >= 0 && local.x <= u.x && local.z >= 0 && local.z <= v.z;
}
At the end I check that t is larger than 0, meaning the intersection is in front of the camera, and that the transformed coordinates are inside the bounds. This gives me just a weird line (screenshot omitted). The plane should instead appear below the spheres (reference screenshot omitted; that image was produced by checking the bounds manually, to confirm the rendering is correct when the numbers are right).
I'm not sure what I'm doing wrong, or whether there's an easier way to check the bounds. Thanks in advance.
Edit 1:
I moved the transformation matrix calculation into the constructor, so now the intersection test is:
bool BoundedPlane::intersect(const Vec3f &origin, const Vec3f &direction, float &t) const {
    if (!InfinitePlane::intersect(origin, direction, t)) {
        return false;
    }
    Vec3f local = transform.mul((origin + direction * t) - position);
    return local.x >= 0 && local.x <= 1 && local.z >= 0 && local.z <= 1;
}
The transform member is the inverse of the transformation matrix.
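For reference, a minimal sketch of that precomputation (assuming Mat3f offers an inverse(); if not, the 3x3 cofactor formula from Edit 1 of the answer below works):
// In the BoundedPlane constructor: build the basis matrix whose
// columns are u, normal and v, then store its inverse, which maps
// world-space offsets into (u, normal, v) coordinates.
Mat3f basis{
    Vec3f(u.x, normal.x, v.x),
    Vec3f(u.y, normal.y, v.y),
    Vec3f(u.z, normal.z, v.z)
};
transform = basis.inverse();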

Could I suggest another approach? Consider the frame with origin
position and basis vectors
u = { u.x, u.y, u.z }
v = { v.x, v.y, v.z }
direction = { direction.x, direction.y, direction.z}
Step 1: Form the matrix
M = {
    {u.x, v.x, direction.x},
    {u.y, v.y, direction.y},
    {u.z, v.z, direction.z}
}
Step 2: Calculate the vector w, which is the solution to the 3 x 3 system of linear equations
M * w = origin - position, i.e.
w = inverse(M) * (origin - position);
Make sure that direction is not coplanar with u, v, otherwise there is no intersection and inverse(M) does not exist.
Step 3: if 0.0 <= w.x && w.x <= 1.0 && 0.0 <= w.y && w.y <= 1.0 then the line intersects the parallelogram spanned by the vectors u, v and the point of intersection is
w0 = { w.x, w.y , 0 };
intersection = position + M * w0;
else, the line does not intersect the parallelogram spanned by the vectors u, v
The idea of this algorithm is to consider the (non-orthonormal) frame with origin position and basis vectors u, v, direction. The inverse of the matrix M converts everything into the coordinates of this new frame. In this frame, the line is vertical, parallel to the "z" axis, the point origin has coordinates w, and the vertical line through w intersects the plane at w0.
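A minimal C++ sketch of Steps 1-3, reusing the asker's Vec3f/Mat3f names and assuming Mat3f provides determinant(), inverse() and mul() (adapt to your own matrix type):
#include <cmath>

// Returns true and fills t if the ray origin + t*direction hits the
// parallelogram spanned by u and v at the corner point position.
bool intersectParallelogram(const Vec3f &origin, const Vec3f &direction,
                            const Vec3f &position, const Vec3f &u,
                            const Vec3f &v, float &t) {
    Mat3f M{
        Vec3f(u.x, v.x, direction.x),
        Vec3f(u.y, v.y, direction.y),
        Vec3f(u.z, v.z, direction.z)
    };
    // direction (nearly) coplanar with u and v: M is singular, no intersection.
    if (std::fabs(M.determinant()) < 1e-6f)
        return false;
    Vec3f w = M.inverse().mul(origin - position);
    // origin - position = w.x*u + w.y*v + w.z*direction, so the hit point
    // position + w.x*u + w.y*v equals origin - w.z*direction.
    t = -w.z;
    return t > 0 && 0 <= w.x && w.x <= 1 && 0 <= w.y && w.y <= 1;
}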
Edit 1: Here is a template formula for the inverse of a 3x3 matrix:
If the original matrix M is
a b c
d e f
g h i
its inverse is
(1 / det(M)) * {
    {e*i - f*h, c*h - b*i, b*f - c*e},
    {f*g - d*i, a*i - c*g, c*d - a*f},
    {d*h - e*g, b*g - a*h, a*e - b*d},
}
where
det(M) = a*(e*i - f*h) + b*(f*g - d*i) + c*(d*h - e*g)
is the determinant of M.
So the inversion algorithm can be as follows:
Given
M = {
    {a, b, c},
    {d, e, f},
    {g, h, i},
}
calculate the adjugate (transposed cofactor) matrix
inv_M = {
    {e*i - f*h, c*h - b*i, b*f - c*e},
    {f*g - d*i, a*i - c*g, c*d - a*f},
    {d*h - e*g, b*g - a*h, a*e - b*d},
};
then calculate the determinant (note the zero-based indices)
det_M = a*inv_M[0][0] + b*inv_M[1][0] + c*inv_M[2][0];
and return the inverse matrix of M
inv_M = (1/det_M) * inv_M;
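Spelled out as a small self-contained C++ function over a plain row-major array (a sketch; adapt it to your own matrix type):
#include <cmath>

// Inverts a 3x3 matrix via the adjugate; returns false if (nearly) singular.
bool invert3x3(const float M[3][3], float out[3][3]) {
    out[0][0] = M[1][1]*M[2][2] - M[1][2]*M[2][1];
    out[0][1] = M[0][2]*M[2][1] - M[0][1]*M[2][2];
    out[0][2] = M[0][1]*M[1][2] - M[0][2]*M[1][1];
    out[1][0] = M[1][2]*M[2][0] - M[1][0]*M[2][2];
    out[1][1] = M[0][0]*M[2][2] - M[0][2]*M[2][0];
    out[1][2] = M[0][2]*M[1][0] - M[0][0]*M[1][2];
    out[2][0] = M[1][0]*M[2][1] - M[1][1]*M[2][0];
    out[2][1] = M[0][1]*M[2][0] - M[0][0]*M[2][1];
    out[2][2] = M[0][0]*M[1][1] - M[0][1]*M[1][0];
    // Expand the determinant along the first row using the cofactors above.
    float det = M[0][0]*out[0][0] + M[0][1]*out[1][0] + M[0][2]*out[2][0];
    if (std::fabs(det) < 1e-12f)
        return false;
    float invDet = 1.0f / det;
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 3; ++c)
            out[r][c] *= invDet;
    return true;
}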
Edit 2: Let's try another approach in order to speed things up.
Step 1: For each plane, determined by the point position and the two vectors u and v, precompute the following quantities:
normal = cross(u, v);
u_dot_u = dot(u, u);
u_dot_v = dot(u, v);
v_dot_v = dot(v, v); // all these need to be computed only once for the u and v vectors
det = u_dot_u * v_dot_v - u_dot_v * u_dot_v; // again only once per u and v
Step 2: Now, for a given line with point origin and direction direction, as before, calculate the intersection point int_point with the plane spanned by u and v:
t = dot(normal, position - origin) / dot(normal, direction);
int_point = origin + t * direction;
rhs = int_point - position;
Step 3: Calculate
u_dot_rhs = dot(u, rhs);
v_dot_rhs = dot(v, rhs);
w1 = (v_dot_v * u_dot_rhs - u_dot_v * v_dot_rhs) / det;
w2 = (- u_dot_v * u_dot_rhs + u_dot_u * v_dot_rhs) / det;
Step 4:
if (0 <= w1 && w1 <= 1 && 0 <= w2 && w2 <= 1) {
    // int_point is in the parallelogram
} else {
    // int_point is not in the parallelogram
}
So what I am doing here is basically finding the intersection point of the line origin, direction with the plane given by position, u, v and restricting myself to the plane, which allows me to work in 2D rather than 3D. I am representing
int_point = position + w1 * u + w2 * v;
rhs = int_point - position = w1 * u + w2 * v
and finding w1 and w2 by dot-multiplying this vector expression with the basis vectors u and v, which results in a 2x2 linear system that I solve directly.
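Putting Edit 2 together as a sketch in the asker's style (dot() assumed as a free function on Vec3f, std::fabs from <cmath>; the precomputed quantities stored as members):
bool BoundedPlane::intersect(const Vec3f &origin, const Vec3f &direction, float &t) const {
    float denom = dot(normal, direction);
    if (std::fabs(denom) < 1e-6f)   // ray parallel to the plane
        return false;
    t = dot(normal, position - origin) / denom;
    if (t <= 0)                     // intersection behind the ray origin
        return false;
    Vec3f rhs = (origin + direction * t) - position;
    float u_dot_rhs = dot(u, rhs);
    float v_dot_rhs = dot(v, rhs);
    // Solve the 2x2 system rhs = w1*u + w2*v with the precomputed det.
    float w1 = (v_dot_v * u_dot_rhs - u_dot_v * v_dot_rhs) / det;
    float w2 = (u_dot_u * v_dot_rhs - u_dot_v * u_dot_rhs) / det;
    return 0 <= w1 && w1 <= 1 && 0 <= w2 && w2 <= 1;
}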

Related

Computing the side planes of a 3D AABB

I have a 3D AABB defined by two points, a min and a max.
I'd like to define the 6 planes that make up the sides of the AABB, such that any point within the AABB has a positive signed distance.
My plane definition comprises a normal (x, y, z) and a constant D, corresponding to the plane equation Ax + By + Cz + D = 0.
struct myplane {
    double nx, ny, nz;
    double D;
};
Note: nx,ny, and nz are normalized.
The AABB struct is as follows:
struct myAABB {
    point3d min;
    point3d max;
};
I'm currently defining instances of the AABB sides like so:
myplane p0 = myplane{-1.0f, 0.0f, 0.0f, aabb.max.x};
myplane p1 = myplane{ 0.0f,-1.0f, 0.0f, aabb.max.y};
myplane p2 = myplane{ 0.0f, 0.0f,-1.0f, aabb.max.z};
myplane p3 = myplane{+1.0f, 0.0f, 0.0f, aabb.min.x};
myplane p4 = myplane{ 0.0f,+1.0f, 0.0f, aabb.min.y};
myplane p5 = myplane{ 0.0f, 0.0f,+1.0f, aabb.min.z};
where aabb in this case is: min(-1,-1,-1), max(1,1,1)
The problem is that points in the AABB return a positive distance for planes p0, p1 and p2, but not for planes p3, p4 and p5, which return negative distances that seem to indicate the points are on the other side.
For example the origin point (0,0,0) should return a positive distance of 1 for each of the planes however does not for planes p3,p4 and p5.
The signed-distance calculation being used is:
double distance(myplane& p, const point3d& v)
{
    // p.normal dot v + D
    return (p.nx * v.x) + (p.ny * v.y) + (p.nz * v.z) + p.D;
}
I think my equations are wrong in some way, but I can't seem to figure it out.
According to Chapter 2.3 of the Mathematical Handbook (Korn & Korn), the signed distance from a point to a plane is
Delta = (dot(Normal, v) + D) / (-sign(D) * |Normal|)
but you don't account for the sign of D. Just modify the function:
dt = (p.nx * v.x) + (p.ny * v.y) + (p.nz * v.z) + p.D;
return (p.D < 0) ? dt : -dt;
Xm < X < XM
is equivalent to
1*X + 0*Y + 0*Z - Xm > 0 and -1*X + 0*Y + 0*Z + XM > 0
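Applied to the plane definitions from the question, the min-side planes need their D negated: D = -dot(n, point_on_plane), which for an axis-aligned normal is minus the matching coordinate. A sketch of the corrected setup:
myplane p0 = myplane{-1.0f, 0.0f, 0.0f,  aabb.max.x};  // -X + max.x > 0 inside
myplane p1 = myplane{ 0.0f,-1.0f, 0.0f,  aabb.max.y};
myplane p2 = myplane{ 0.0f, 0.0f,-1.0f,  aabb.max.z};
myplane p3 = myplane{+1.0f, 0.0f, 0.0f, -aabb.min.x};  //  X - min.x > 0 inside
myplane p4 = myplane{ 0.0f,+1.0f, 0.0f, -aabb.min.y};
myplane p5 = myplane{ 0.0f, 0.0f,+1.0f, -aabb.min.z};
With min(-1,-1,-1) and max(1,1,1), the origin (0,0,0) now yields a distance of +1 for all six planes.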

Problems with a simple raytracer in C++

What it does is basically checking for collisions against an array of triangles and drawing an image based on the color of the triangle it hits. I think my problem lies in the collision detection. Code here:
for (int i = 0; i < triangles.size(); i++) {
    vec3 v0 = triangles[i].v0;
    vec3 v1 = triangles[i].v1;
    vec3 v2 = triangles[i].v2;
    vec3 e1 = v1 - v0;
    vec3 e2 = v2 - v0;
    vec3 b = start - v0;
    mat3 A(-dir, e1, e2);
    vec3 x = glm::inverse(A) * b;
    if (x.x > 0 && x.y > 0 && (x.x + x.y < 1) && (x.z - start.z >= 0)) {
        return true;
    }
}
...where "dir" is the direction of the ray coming from the camera, calculated as "x - SCREEN_WIDTH / 2, y - SCREEN_HEIGHT / 2, focalLength". SCREEN_WIDTH, SCREEN_HEIGHT and focalLength are constants. Start is the position of the camera, set to 0,0,0.
What I'm not sure about is what x really is and what I should check for before returning true. "x.x > 0 && x.y > 0 && (x.x + x.y < 1)" is supposed to check if the ray hits not only on the same plane but actually inside the triangle, and the last part ("x.z - start.z>= 0", the one I'm least sure about), if the collision happened in front of the camera.
I get images, but no matter how much I try, it's never right. It's supposed to be a classic TestModel of a room with different colored walls and two shapes in it. The closest I've come is getting four of the five walls right, with the far one missing and part of one of the shapes appearing on the other side of it.
I'm not familiar with the matrix formulation for triangle intersection - it sounds quite expensive.
My own code is below, where my e1 and e2 are equivalent to yours - i.e. they represent the edge vectors from v0 to v1 and v2 respectively:
// NB: triangles are assumed to be in world space
vector3 pvec = vector3::cross(ray.direction(), e2);
double det = e1.dot(pvec);
if (::fabs(det) < math::epsilon) return 0;
double invDet = 1.0 / det;
vector3 tvec(p0, ray.origin());
double u = tvec.dot(pvec) * invDet;
if (u < 0 || u > 1) return 0;
vector3 qvec = vector3::cross(tvec, e1);
double v = ray.direction().dot(qvec) * invDet;
if (v < 0 || u + v > 1) return 0;
double t = e2.dot(qvec) * invDet;
if (t > math::epsilon) { // avoid self intersection
    // hit found at distance "t"
}
I suspect the problem is in your calculation of the ray vector, which should be normalised.
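With glm that could look like this (a sketch using the names from the question):
vec3 dir = glm::normalize(vec3(x - SCREEN_WIDTH / 2.0f,
                               y - SCREEN_HEIGHT / 2.0f,
                               focalLength));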

Create a 3D sphere and 3D box C++

I need to implement a tool to detect intersection between a 3D box and a 3D sphere in C++. Right now I have found a way to detect the intersection using this code:
inline float squared(float v) { return v * v; }

bool doesCubeIntersectSphere(vec3 C1, vec3 C2, vec3 S, float R)
{
    float dist_squared = R * R;
    /* assume C1 and C2 are element-wise sorted, if not, do that now */
    if (S.X < C1.X) dist_squared -= squared(S.X - C1.X);
    else if (S.X > C2.X) dist_squared -= squared(S.X - C2.X);
    if (S.Y < C1.Y) dist_squared -= squared(S.Y - C1.Y);
    else if (S.Y > C2.Y) dist_squared -= squared(S.Y - C2.Y);
    if (S.Z < C1.Z) dist_squared -= squared(S.Z - C1.Z);
    else if (S.Z > C2.Z) dist_squared -= squared(S.Z - C2.Z);
    return dist_squared > 0;
}
What I need now is an example of C++ code that creates a 3D sphere from an origin vector and a radius, and a 3D box from its maximum and minimum corner vectors.
I may be mistaken, but (assuming Axis-Aligned boxes):
The length of a vector from origin to corner C1 or C2 should be the radius r, right?
Explanation for my derivation below: an axis-aligned box with equal distance from center to all corners is a perfect cube. Translating such a cube to the origin puts two of its corners exactly on the line from the origin through the point {x=1, y=1, z=1}. Thus those two corners have coordinates {d, d, d} and {-d, -d, -d}, where d is the "distance" of the corner along the axes X, Y, Z. The distance to, say, the first corner is found by squaring and adding all components of the vector and taking the square root, e.g.:
|C1| = |{d,d,d}| = sqrt(d*d + d*d + d*d) = sqrt(3 * d * d)
Therefore solve:
r = sqrt(3 * d * d)
<=> r * r = 3 * d * d
<=> d = sqrt(r * r / 3)
<=> d = r / sqrt(3)
This needs to be translated back to the center of the Sphere, thus:
C1 = { S.x+d, S.y+d, S.z+d}
C2 = { S.x-d, S.y-d, S.z-d}
Your explanation is a little vague, so I made some assumptions. Perhaps I'm dead wrong. Anyway here is some non-tested code showing what I mean:
void makeCube(vec3 S, float R, vec3* C1, vec3* C2)
{
    static const float sqrt_one_third = sqrtf(1.0f / 3.0f);
    float d = R * sqrt_one_third;
    C1->X = S.X + d;
    C1->Y = S.Y + d;
    C1->Z = S.Z + d;
    C2->X = S.X - d;
    C2->Y = S.Y - d;
    C2->Z = S.Z - d;
}
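A quick usage sketch tying the two functions together (assuming vec3 supports brace initialization; note that doesCubeIntersectSphere expects its first corner to be the element-wise minimum, and makeCube as written puts the max corner in C1, so pass C2 first):
vec3 S{0.0f, 0.0f, 0.0f};  // sphere center
float R = 2.0f;            // sphere radius
vec3 C1, C2;
makeCube(S, R, &C1, &C2);  // C1 = max corner, C2 = min corner
bool hit = doesCubeIntersectSphere(C2, C1, S, R);  // true here: cube inscribed in sphere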

Ray Tracing - Geometric Sphere Intersection - Intersection function returns true for all rays despite no intersection

I am writing a ray tracing project with C++ and OpenGL and am running into some obstacles with my sphere intersection function: I've checked multiple sources and the math looks right, but for some reason the intersection method returns true for every single ray. Here is the code of the sphere intersection function, along with some other code for clarification:
bool intersect(Vertex & origin, Vertex & rayDirection, float intersection)
{
    bool insideSphere = false;
    Vertex oc = position - origin;
    float tca = 0.0;
    float thcSquared = 0.0;
    if (oc.length() < radius)
        insideSphere = true;
    tca = oc.dot(rayDirection);
    if (tca < 0 && !insideSphere)
        return false;
    thcSquared = pow(radius, 2) - pow(oc.length(), 2) + pow(tca, 2);
    if (thcSquared < 0)
        return false;
    intersection = insideSphere ? tca + sqrt(thcSquared) : tca - sqrt(thcSquared);
    return true;
}
Here is some context from the ray tracing function that calls the intersection function. FYI my camera is at (0, 0, 0) and that is what is in my "origin" variable in the ray tracing function:
#define WINDOW_WIDTH 640
#define WINDOW_HEIGHT 480
#define WINDOW_METERS_WIDTH 30
#define WINDOW_METERS_HEIGHT 20
#define FOCAL_LENGTH 25

rayDirection.z = FOCAL_LENGTH * -1;
for (int r = 0; r < WINDOW_HEIGHT; r++)
{
    rayDirection.y = (WINDOW_METERS_HEIGHT / 2 * -1) + (r * ((float)WINDOW_METERS_HEIGHT / (float)WINDOW_HEIGHT));
    for (int c = 0; c < WINDOW_WIDTH; c++)
    {
        intersection = false;
        t = 0.0;
        rayDirection.x = (WINDOW_METERS_WIDTH / 2 * -1) + (c * ((float)WINDOW_METERS_WIDTH / (float)WINDOW_WIDTH));
        rayDirection = rayDirection - origin;
        for (int i = 0; i < NUM_SPHERES; i++)
        {
            if (spheres[i].intersect(CAM_POS, rayDirection, t))
            {
                intersection = true;
            }
        }
Thanks for taking a look and let me know if there is any other code that may help!
It seems you got your math a bit mixed up. The first part of the function, i.e. everything up to the first return false, is OK and will return false if the ray starts outside of the sphere and doesn't go toward it. However, I think you put the camera outside all your spheres in such a manner that all spheres are visible, and that's why this part never returns false.
thcSquared is really wrong and I don't know what it is supposed to represent.
Let's do the intersection mathematically. We have:
origin : the start of the ray, let's call this A
rayDirection : the direction of the infinite ray, let's call this d.
position : the center of the sphere, called P
radius : self-explanatory, called r
What you want is a point on both the sphere and the line, let's call it M:
M = A + t * d because it is on the line
|M - P| = r because it is on the sphere
The second equation can be changed to |A + t * d - P|² = r², which expands to (A - P)² + 2 * t * (A - P).dot(d) + t²d² = r². This is a simple quadratic equation in t. Once solved, you have 0, 1 or 2 solutions; select the smallest positive one (the closest to the ray origin).
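A minimal sketch of that quadratic approach (hypothetical Vec3 type with a dot() method):
#include <cmath>

// Solves |A + t*d - P|^2 = r^2 for the smallest positive t.
// Returns false if the ray misses the sphere.
bool intersectSphere(const Vec3 &A, const Vec3 &d, const Vec3 &P, float r, float &t) {
    Vec3 ap = A - P;
    float a = d.dot(d);                // t^2 coefficient (no unit direction needed)
    float b = 2.0f * ap.dot(d);        // t coefficient
    float c = ap.dot(ap) - r * r;      // constant term
    float disc = b * b - 4.0f * a * c;
    if (disc < 0.0f) return false;     // no real roots: no intersection
    float sq = std::sqrt(disc);
    float t0 = (-b - sq) / (2.0f * a); // nearer root
    float t1 = (-b + sq) / (2.0f * a); // farther root
    t = (t0 > 0.0f) ? t0 : t1;         // closest positive solution
    return t > 0.0f;
}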
edit: You are forced to use another approach that I will detail here:
Compute the distance between the center of the sphere and the line (calling it l). This is done by 'projecting' the center on the line. So:
tca = ( (P - A) dot d ) / |d|, or with your variable names, tca = (OC dot rd) / |rd|. The projection is H = A + tca * d, and l = |H - P|.
If l > R then return false, there is no intersection.
Let's call M one intersection point. The triangle MHP has a right angle at H, so MH² + HP² = MP²; in other terms, thc² + l² = r², so we now have thc, the distance from H to the sphere's surface.
With all that, t = tca ± thc; simply take the lowest non-negative of the two.
The paper you linked explain this, but without saying that it assumes the norm of the ray direction to be 1. I don't see a normalization in your code, that may be why your code fails (not verified).
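The same steps as a sketch (again a hypothetical Vec3; the direction is normalized first, since the derivation requires |d| = 1):
#include <cmath>

bool intersectSphereGeometric(const Vec3 &A, Vec3 d, const Vec3 &P, float r, float &t) {
    d = d.normalized();                 // assumed helper returning d / |d|
    Vec3 oc = P - A;
    float tca = oc.dot(d);              // signed distance from A to H along the ray
    float l2 = oc.dot(oc) - tca * tca;  // l^2: squared distance from P to the line
    if (l2 > r * r) return false;       // the line misses the sphere
    float thc = std::sqrt(r * r - l2);  // distance from H to the surface
    float t0 = tca - thc, t1 = tca + thc;
    t = (t0 > 0.0f) ? t0 : t1;          // lowest non-negative of the two
    return t > 0.0f;
}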
Side note: the name Vertex for a 3d vector is really badly chosen, something like Vector3 or vec3 would be way better.

How to do ray plane intersection?

How do I calculate the intersection between a ray and a plane?
Code
This produces the wrong results.
float denom = normal.dot(ray.direction);
if (denom > 0)
{
    float t = -((center - ray.origin).dot(normal)) / denom;
    if (t >= 0)
    {
        rec.tHit = t;
        rec.anyHit = true;
        computeSurfaceHitFields(ray, rec);
        return true;
    }
}
Parameters
ray represents the ray object.
ray.direction is the direction vector.
ray.origin is the origin vector.
rec represents the result object.
rec.tHit is the value of the hit.
rec.anyHit is a boolean.
My function has access to the plane:
center and normal defines the plane
As wonce commented, you also want to allow the denominator to be negative; otherwise you will miss intersections with the front face of your plane. However, you still want a test to avoid a division by zero, which would indicate the ray being parallel to the plane. You also have a superfluous negation in your computation of t. Overall, it should look like this:
float denom = normal.dot(ray.direction);
if (abs(denom) > 0.0001f) // your favorite epsilon
{
    float t = (center - ray.origin).dot(normal) / denom;
    if (t >= 0) return true; // you might want to allow an epsilon here too
}
return false;
First consider the math of the ray-plane intersection:
In general one intersects the parametric form of the ray, with the implicit form of the geometry.
So given a ray of the form x = a * t + a0, y = b * t + b0, z = c * t + c0;
and a plane of the form: A*x + B*y + C*z + D = 0;
now substitute the x, y and z ray equations into the plane equation and you will get a polynomial in t. You then solve that polynomial for the real values of t. With those values of t you can substitute back into the ray equations to get the real values of x, y and z.
Here it is in Maxima (the screenshot of the session is omitted; the solution it returns is):
t = -(A*a0 + B*b0 + C*c0 + D) / (A*a + B*b + C*c)
Note that the answer looks like the quotient of two dot products!
The normal to a plane is the first three coefficients of the plane equation A, B, and C.
You still need D to uniquely determine the plane.
Then you code that up in the language of your choice like so:
Point3D intersectRayPlane(Ray ray, Plane plane)
{
    Point3D point3D;
    // Do the dot products and find t > epsilon that provides intersection.
    return (point3D);
}
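A sketch of what the body might look like, assuming Plane carries the coefficients A, B, C, D and Ray carries the components a, b, c, a0, b0, c0 used above (hypothetical field names):
#include <cmath>

Point3D intersectRayPlane(Ray ray, Plane plane)
{
    Point3D point3D{};
    // Denominator: dot(plane normal, ray direction).
    double denom = plane.A * ray.a + plane.B * ray.b + plane.C * ray.c;
    // Numerator: -(dot(plane normal, ray origin) + D).
    double num = -(plane.A * ray.a0 + plane.B * ray.b0 + plane.C * ray.c0 + plane.D);
    if (std::fabs(denom) > 1e-6) {
        double t = num / denom;
        if (t > 1e-6) { // hit in front of the ray origin
            point3D.x = ray.a * t + ray.a0;
            point3D.y = ray.b * t + ray.b0;
            point3D.z = ray.c * t + ray.c0;
        }
    }
    return (point3D);
}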
Math
Define:
Let the ray be given parametrically by q = p + t*v for initial point p and direction vector v for t >= 0.
Let the plane be the set of points r satisfying the equation dot(n, r) + d = 0 for normal vector n = (a, b, c) and constant d. Fully expanded, the plane equation may also be written in the familiar form ax + by + cz + d = 0.
The ray-plane intersection occurs when q satisfies the plane equation. Substituting, we have:
d = -dot(n, q)
  = -dot(n, p + t * v)
  = -dot(n, p) - t * dot(n, v)
Rearranging:
t = -(dot(n, p) + d) / dot(n, v)
This value of t can be used to determine the intersection by plugging it back into p + t*v.
Example implementation
std::optional<vec3> intersectRayWithPlane(
    vec3 p, vec3 v, // ray
    vec3 n, float d // plane
) {
    float denom = dot(n, v);

    // Prevent divide by zero:
    if (abs(denom) <= 1e-4f)
        return std::nullopt;

    // If you want to ensure the ray reflects off only
    // the "top" half of the plane, use this instead:
    //
    //   if (-denom <= 1e-4f)
    //       return std::nullopt;

    float t = -(dot(n, p) + d) / denom;

    // Use pointy end of the ray.
    // It is technically correct to compare t < 0,
    // but that may be undesirable in a raytracer.
    if (t <= 1e-4f)
        return std::nullopt;

    return p + t * v;
}
An implementation of vwvan's answer above, in C#:
Vector3 Intersect(Vector3 planeP, Vector3 planeN, Vector3 rayP, Vector3 rayD)
{
    var d = Vector3.Dot(planeP, -planeN);
    var t = -(d + Vector3.Dot(rayP, planeN)) / Vector3.Dot(rayD, planeN);
    return rayP + t * rayD;
}