Calculate spherical Terrain Normals from HeightMap - c++

I am trying to calculate vertex normals from heightmap values.
I create a heightmap at the beginning, and that works so far. LOD works too.
I know how to calculate normals for flat terrain, but how can I do this for spherical terrain?
As you can see, the lighting is not working, and normal maps don't work either.
I was thinking of first calculating the normals and tangents as if it were flat terrain, and then applying some transformation so the normals and tangents
fit the spherical terrain.
Edit 1:
That's what I've got so far. Is this correct, Michael Nastenko? Here I use Perlin noise rather than the heightmap:
// Neighbouring sample points on the sphere: one step north/south in latitude...
float x1 = pRadius * FMath::Cos(lat + yDelta) * FMath::Cos(lon);
float y1 = pRadius * FMath::Cos(lat + yDelta) * FMath::Sin(lon);
float z1 = pRadius * FMath::Sin(lat + yDelta);
float x2 = pRadius * FMath::Cos(lat - yDelta) * FMath::Cos(lon);
float y2 = pRadius * FMath::Cos(lat - yDelta) * FMath::Sin(lon);
float z2 = pRadius * FMath::Sin(lat - yDelta);
// ...and one step east/west in longitude
float x3 = pRadius * FMath::Cos(lat) * FMath::Cos(lon + xDelta);
float y3 = pRadius * FMath::Cos(lat) * FMath::Sin(lon + xDelta);
float z3 = pRadius * FMath::Sin(lat);
float x4 = pRadius * FMath::Cos(lat) * FMath::Cos(lon - xDelta);
float y4 = pRadius * FMath::Cos(lat) * FMath::Sin(lon - xDelta);
float z4 = pRadius * FMath::Sin(lat);

// Finite differences of the noise value between those neighbours give the slope
float xDifference = GetNoiseValue(FVector(x1, y1, z1).GetSafeNormal())
                  - GetNoiseValue(FVector(x2, y2, z2).GetSafeNormal());
float yDifference = GetNoiseValue(FVector(x3, y3, z3).GetSafeNormal())
                  - GetNoiseValue(FVector(x4, y4, z4).GetSafeNormal());

// Tangent and bitangent as if the terrain were flat
FVector planeTangent = FVector(1.f, 0.f, xDifference).GetSafeNormal();
FVector planeBiTangent = FVector(0.f, 1.f, yDifference).GetSafeNormal();
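This is roughly the transformation I have in mind for mapping the flat-terrain basis onto the sphere. It is an untested sketch: it assumes plane X follows the latitude direction and plane Y the longitude direction (matching how xDifference and yDifference are sampled), and the sign of the final normal may still need flipping depending on winding conventions.

// Local frame of the sphere at this vertex
FVector up = FVector(FMath::Cos(lat) * FMath::Cos(lon),
                     FMath::Cos(lat) * FMath::Sin(lon),
                     FMath::Sin(lat));                              // radial direction ("up")
FVector east  = FVector(-FMath::Sin(lon), FMath::Cos(lon), 0.f);   // direction of increasing longitude
FVector north = FVector::CrossProduct(up, east);                   // direction of increasing latitude

// Normal in plane space (flat-terrain convention: Z is "up" above the plane)
FVector planeNormal = FVector::CrossProduct(planeTangent, planeBiTangent).GetSafeNormal();

// Rotate the plane-space vectors into the sphere's frame (plane X -> north, plane Y -> east, plane Z -> up)
FVector normal  = (north * planeNormal.X  + east * planeNormal.Y  + up * planeNormal.Z).GetSafeNormal();
FVector tangent = (north * planeTangent.X + east * planeTangent.Y + up * planeTangent.Z).GetSafeNormal();

// Make sure the normal points outward; flip if the basis handedness disagrees
if (FVector::DotProduct(normal, up) < 0.f)
{
    normal = -normal;
}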
Edit 2:
x1, y1, z1, ... are the points adjacent to the vertex for which I want to calculate the tangent/normal. For each of those points I get the noise value, so I can use finite differences to get the slope of the surface.
Shouldn't this be the way to get the tangent of flat terrain?
Edit 3:
Calculating the normals like this is somehow wrong:
TArray<FVector> normals;
normals.Init(FVector(0, 0, 0), geoData.GeoData.Num());
int32 triangleCount = geoData.Triangles.Num() / 3;
for (int32 i = 0; i < triangleCount; i++)
{
    int32 normalTriangleIndex = i * 3;
    int32 triangleIndexA = geoData.Triangles[normalTriangleIndex];
    int32 triangleIndexB = geoData.Triangles[normalTriangleIndex + 1];
    int32 triangleIndexC = geoData.Triangles[normalTriangleIndex + 2];

    FVector pointA = geoData.GeoData[triangleIndexA];
    FVector pointB = geoData.GeoData[triangleIndexB];
    FVector pointC = geoData.GeoData[triangleIndexC];

    FVector sideAB = pointB - pointA;
    FVector sideAC = pointC - pointA;
    FVector norm = FVector::CrossProduct(sideAB, sideAC);

    normals[triangleIndexA] = -norm;
    normals[triangleIndexB] = -norm;
    normals[triangleIndexC] = -norm;
}
geoData.NormalsData = normals;
geoData.NormalsData = normals;
On the left is my calculation, on the right I used the built-in function of UE4.
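One likely issue with the loop above is that each vertex only keeps the (unnormalized) face normal of the last triangle that touches it. A common fix is to accumulate the face normals of all adjacent triangles per vertex and normalize at the end. A minimal sketch of that variant, keeping the data layout and the sign flip from the original:

TArray<FVector> normals;
normals.Init(FVector::ZeroVector, geoData.GeoData.Num());

for (int32 i = 0; i < geoData.Triangles.Num(); i += 3)
{
    int32 ia = geoData.Triangles[i];
    int32 ib = geoData.Triangles[i + 1];
    int32 ic = geoData.Triangles[i + 2];

    FVector faceNormal = FVector::CrossProduct(geoData.GeoData[ib] - geoData.GeoData[ia],
                                               geoData.GeoData[ic] - geoData.GeoData[ia]);

    // Accumulate instead of overwrite, so shared vertices average over all adjacent faces
    normals[ia] -= faceNormal;
    normals[ib] -= faceNormal;
    normals[ic] -= faceNormal;
}

for (FVector& n : normals)
{
    n = n.GetSafeNormal();
}
geoData.NormalsData = normals;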

Related

Problems rotating opengl camera

I cannot understand the math behind this problem. I am trying to create an FPS camera where I can look around freely with mouse input.
I am trying to rotate and position my look-at point with 180 degrees of freedom. I understand the easier solution is to glRotate the world to fit my perspective, but I do not want this approach. I am fairly unfamiliar with the trigonometry involved here and cannot figure out how to solve this problem the way I want to.
Here is my attempt so far.
Code to get the mouse coordinates relative to the center of the window, then process them in my camera object:
#define DEG2RAD(a) (a * (M_PI / 180.0f)) // convert to radians

static void glutPassiveMotionHandler(int x, int y) {
    glf centerX = WinWidth / 2; glf centerY = WinHeight / 2; // window's origin point
    f speed = 0.2f;
    f oldX = mouseX; f oldY = mouseY;
    mouseX = DEG2RAD(-((x - centerX))); // get distance from 0 and convert to radians
    mouseY = DEG2RAD(-((y - centerY))); // get distance from 0 and convert to radians
    f diffX = mouseX - oldX; f diffY = mouseY - oldY; // difference from last frame to this frame
    if (mouseX != 0 || mouseY != 0) {
        mainCamera->Rotate(diffX, diffY);
    }
}
Code to rotate the camera
void Camera::Rotate(f angleX, f angleY) {
    Camera::refrence = Vector3D::NormalizeVector(Camera::refrence * cos(angleX)) + (Camera::upVector * sin(angleY));     // rotate up
    Camera::refrence = Vector3D::NormalizeVector((Camera::refrence * cos(angleY)) - (Camera::rightVector * sin(angleX))); // rotate side to side
};
Camera::refrence is our look-at point; processing the look-at point is handled as follows:
void Camera::LookAt(void) {
    gluLookAt(
        Camera::position.x, Camera::position.y, Camera::position.z,
        Camera::refrence.x, Camera::refrence.y, Camera::refrence.z,
        Camera::upVector.x, Camera::upVector.y, Camera::upVector.z
    );
};
The camera is defined by a position point (position), a target point (refrence) and an up vector (upVector). If you want to change the orientation of the camera, then you have to rotate the direction vector from the position (position) to the target (refrence), rather than the target point itself, by a rotation matrix.
Note that since the two angles are meant to change an already rotated view, you have to use a rotation matrix to rotate vectors which point in an arbitrary direction.
Write a function which sets up a 3x3 rotation matrix around an arbitrary axis:
void RotateMat(float m[], float angle_radians, float x, float y, float z)
{
    float c = cos(angle_radians);
    float s = sin(angle_radians);
    m[0] = x*x*(1.0f-c)+c;   m[1] = x*y*(1.0f-c)-z*s; m[2] = x*z*(1.0f-c)+y*s;
    m[3] = y*x*(1.0f-c)+z*s; m[4] = y*y*(1.0f-c)+c;   m[5] = y*z*(1.0f-c)-x*s;
    m[6] = z*x*(1.0f-c)-y*s; m[7] = z*y*(1.0f-c)+x*s; m[8] = z*z*(1.0f-c)+c;
}
Write a function which rotates a 3 dimensional vector by the matrix:
Vector3D Rotate(float m[], const Vector3D &v)
{
    Vector3D rv;
    rv.x = m[0] * v.x + m[3] * v.y + m[6] * v.z;
    rv.y = m[1] * v.x + m[4] * v.y + m[7] * v.z;
    rv.z = m[2] * v.x + m[5] * v.y + m[8] * v.z;
    return rv;
}
Calculate the vector from the position to the target:
Vector3D los = Vector3D(refrence.x - position.x, refrence.y - position.y, refrence.z - position.z);
Rotate all the vectors around the z axis of the world by angleX:
float rotX[9];
RotateMat(rotX, angleX, 0.0f, 0.0f, 1.0f);
los = Rotate(rotX, los);
upVector = Rotate(rotX, upVector);
Rotate all the vectors around the current y axis of the view by angleY:
float rotY[9];
RotateMat(rotY, angleY, los.x, los.y, 0.0f);
los = Rotate(rotY, los);
upVector = Rotate(rotY, upVector);
Calculate the new target point:
refrence = Vector3D(position.x + los.x, position.y + los.y, position.z + los.z);
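Putting the steps together, a Camera::Rotate along these lines could look like the following. This is only a sketch: it assumes the Vector3D operations and the free RotateMat/Rotate functions from above, and keeps the member names from the question (note the :: prefix so the free Rotate is not hidden by the member function).

void Camera::Rotate(float angleX, float angleY)
{
    // Vector from the camera position to the current target
    Vector3D los(refrence.x - position.x, refrence.y - position.y, refrence.z - position.z);

    // Rotate around the world z axis by angleX
    float rotX[9];
    RotateMat(rotX, angleX, 0.0f, 0.0f, 1.0f);
    los = ::Rotate(rotX, los);
    upVector = ::Rotate(rotX, upVector);

    // Rotate around the axis (los.x, los.y, 0) by angleY, as in the steps above
    float rotY[9];
    RotateMat(rotY, angleY, los.x, los.y, 0.0f);
    los = ::Rotate(rotY, los);
    upVector = ::Rotate(rotY, upVector);

    // New target point
    refrence = Vector3D(position.x + los.x, position.y + los.y, position.z + los.z);
}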
U_Cam_X_angle is the left/right rotation; U_Cam_Y_angle is the up/down rotation.
view_radius is the view distance (zoom) to U_look_point_x, U_look_point_y and U_look_point_z.
It is ALWAYS a negative number, because you are always looking in the positive direction: deeper into the screen is more positive.
All angles are in radians.
The last three, eyeX, eyeY and eyeZ, are where the camera is in 3D space.
This code is in VB.NET. Find a converter online for VB to C++ or do it manually.
Public Sub set_eyes()
    Dim sin_x, sin_y, cos_x, cos_y As Single
    sin_x = Sin(U_Cam_X_angle + angle_offset)
    cos_x = Cos(U_Cam_X_angle + angle_offset)
    cos_y = Cos(U_Cam_Y_angle)
    sin_y = Sin(U_Cam_Y_angle)

    cam_y = Sin(U_Cam_Y_angle) * view_radius
    cam_x = (sin_x - (1 - cos_y) * sin_x) * view_radius
    cam_z = (cos_x - (1 - cos_y) * cos_x) * view_radius

    Glu.gluLookAt(cam_x + U_look_point_x, cam_y + U_look_point_y, cam_z + U_look_point_z, _
                  U_look_point_x, U_look_point_y, U_look_point_z, 0.0F, 1.0F, 0.0F)

    eyeX = cam_x + U_look_point_x
    eyeY = cam_y + U_look_point_y
    eyeZ = cam_z + U_look_point_z
End Sub
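Since the original is VB.NET, a rough C++ translation might look like this. It is a sketch only: it assumes the same globals as the VB version and the GLU library.

#include <cmath>
#include <GL/glu.h>

// Globals matching the VB.NET version (assumed to exist elsewhere)
extern float U_Cam_X_angle, U_Cam_Y_angle, angle_offset, view_radius;
extern float U_look_point_x, U_look_point_y, U_look_point_z;
extern float eyeX, eyeY, eyeZ;

void set_eyes()
{
    float sin_x = std::sin(U_Cam_X_angle + angle_offset);
    float cos_x = std::cos(U_Cam_X_angle + angle_offset);
    float cos_y = std::cos(U_Cam_Y_angle);

    float cam_y = std::sin(U_Cam_Y_angle) * view_radius;
    float cam_x = (sin_x - (1.0f - cos_y) * sin_x) * view_radius;
    float cam_z = (cos_x - (1.0f - cos_y) * cos_x) * view_radius;

    gluLookAt(cam_x + U_look_point_x, cam_y + U_look_point_y, cam_z + U_look_point_z,
              U_look_point_x, U_look_point_y, U_look_point_z,
              0.0f, 1.0f, 0.0f);

    eyeX = cam_x + U_look_point_x;
    eyeY = cam_y + U_look_point_y;
    eyeZ = cam_z + U_look_point_z;
}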

OpenGL Resize Window -> objects are “moved / translated”

When the resize event of the window is called, the objects are moved out of the viewport / screen.
The link below is a video showing what happens:
https://drive.google.com/file/d/1dBnOqBDUBNCQrwr7ChFlpS8vbBQ6wfKh/view?usp=sharing
I just found out that it only happens when using Qt windowing. It did not happen with GLFW.
I use the following code:
void Renderer::resize(int width, int height) {
    RendererSettings* settings = RendererSettings::getInstance();
    settings->setSize(width, height);
    glViewport(0, 0, width, height);

    if (camera != nullptr)
    {
        float aspectRatio = float(width) / float(height);
        camera->updateProjectionPerspectiveAspect(aspectRatio);
    }
}
I do not change the camera anywhere else.
updateProjectionPerspectiveAspect does the same as glFrustum(FoV, aspect, near, far), except that only the aspect ratio changes; the other parameters are kept the same.
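For reference, a minimal sketch of what that helper could look like, reusing the stored parameters (this is an assumption about the code, built on the setProjectionPerspective shown below):

// Hypothetical helper: only the aspect ratio changes, the stored FoV and clip planes are reused
void Camera::updateProjectionPerspectiveAspect(float aspectRatio)
{
    setProjectionPerspective(this->fieldOfView, aspectRatio, this->nearFrustum, this->farFrustum);
}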
void Camera::setProjectionPerspective(float fieldOfView, float aspectRatio, float near, float far) {
    this->fieldOfView = fieldOfView;
    this->aspectRatio = aspectRatio;
    this->nearFrustum = near;
    this->farFrustum = far;

    float xmin, xmax, ymin, ymax;     // Dimensions of near clipping plane
    float xFmin, xFmax, yFmin, yFmax; // Dimensions of far clipping plane

    // Do the math for the near clipping plane
    ymax = near * tanf(float(fieldOfView * PI_DIV_360));
    ymin = -ymax;
    xmin = ymin * aspectRatio;
    xmax = -xmin;

    // Construct the projection matrix
    projectionMatrix = Mat4f::identity();
    projectionMatrix[0]  = (2.0f * near) / (xmax - xmin);
    projectionMatrix[5]  = (2.0f * near) / (ymax - ymin);
    projectionMatrix[8]  = (xmax + xmin) / (xmax - xmin);
    projectionMatrix[9]  = (ymax + ymin) / (ymax - ymin);
    projectionMatrix[10] = -((far + near) / (far - near));
    projectionMatrix[11] = -1.0f;
    projectionMatrix[14] = -((2.0f * far * near) / (far - near));
    projectionMatrix[15] = 0.0f;
}
The camera parameter is not null, and this resize event is called several times during resizing. The width and height parameters are correct.
I think your projection matrix is wrong, mainly because you don't use the variable aspectRatio at all, but the way you do it otherwise looks correct..? (So I'm just guessing :P)
Here is how I did my projection matrix in C using an aspect ratio argument; maybe this helps:
mat4 set_perspective_matrix(GLfloat fov, GLfloat aspect, GLfloat nearPlane, GLfloat farPlane)
{
    mat4 p;
    GLfloat f  = 1.0 / tan(fov * 3.1415926 / 360.0);
    GLfloat c1 = -(farPlane + nearPlane) / (farPlane - nearPlane);
    GLfloat c2 = -(2.0 * farPlane * nearPlane) / (farPlane - nearPlane);

    // Note: the matrix is filled in row-major order here; transpose it
    // (or swap the [11]/[14] entries) if you upload it to OpenGL as column-major.
    p._[0]  = f / aspect;
    p._[1]  = 0.0;
    p._[2]  = 0.0;
    p._[3]  = 0.0;
    p._[4]  = 0.0;
    p._[5]  = f;
    p._[6]  = 0.0;
    p._[7]  = 0.0;
    p._[8]  = 0.0;
    p._[9]  = 0.0;
    p._[10] = c1;
    p._[11] = c2;
    p._[12] = 0.0;
    p._[13] = 0.0;
    p._[14] = -1.0;
    p._[15] = 0.0;
    return p;
}
Here is a good article describing the setup of a projection matrix: The Perspective Matrix
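A possible way to wire this into the resize path is sketched below. It is only an illustration: onWindowResize and applyProjection are hypothetical names, and the field of view and clip planes are arbitrary example values.

// Hypothetical resize handler: rebuild the projection whenever the aspect ratio changes
void onWindowResize(int width, int height)
{
    glViewport(0, 0, width, height);

    GLfloat aspect = (height > 0) ? GLfloat(width) / GLfloat(height) : 1.0f;
    mat4 projection = set_perspective_matrix(60.0f, aspect, 0.1f, 1000.0f);

    // Hand the matrix to whatever consumes it (shader uniform, glLoadMatrixf after
    // transposing, etc.); this call is a placeholder for your own code.
    applyProjection(projection);
}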
The problem was in the Qt windowing. It was solved using the following code to resize:
void QtOpenGLRenderer::resizeEvent(QResizeEvent* event) {
    QSize size = event->size();

    if (event->oldSize().isEmpty())
    {
        initialScreenSize = size;
        return;
    }

    size = parentWidget->size();
    float deltaX = size.width() - initialScreenSize.width();
    float deltaY = size.height() - initialScreenSize.height();

    renderer->resize(size.width() - deltaX, size.height() - deltaY);
}

Calculate position and rotation of camera with mouse events

My plan:
1. Calculate mouse direction [x, y] [success]
In my Mouse Move event:
int directionX = lastPosition.x - position.x;
int directionY = lastPosition.y - position.y;
2. Calculate angles [theta, phi] [success]
float theta = fmod(lastTheta + sensibility * directionY, M_PI);
float phi = fmod(lastPhi + sensibility * directionX * -1, M_PI * 2);
Edit {
bug fix:
float theta = lastTheta + sensibility * directionY * -1;
if (theta < M_PI / -2)theta = M_PI / -2;
else if (theta > M_PI / 2)theta = M_PI / 2;
float phi = fmod(lastPhi + sensibility * directionX * -1, M_PI * 2);
}
Now, given theta, phi, the center point and the radius, I want to calculate the position and the rotation (so that the camera looks at the center point).
3. Calculate position coordinates [X,Y,Z] [failed]
float newX = radius * sin(phi) * cos(theta);
float newY = radius * sin(phi) * sin(theta);
float newZ = radius * cos(phi);
Solution [by meowgoesthedog]:
float newX = radius * cos(theta) * cos(phi);
float newY = radius * sin(theta);
float newZ = radius * cos(theta) * sin(phi);
4. Calculate rotation [failed]
float pitch = ?;
float yaw = ?;
Solution [by meowgoesthedog]:
float pitch = -theta;
float yaw = -phi;
Thanks for your solutions!
Your attempt was almost (kinda) correct:
As the diagram shows, in OpenGL the "vertical" direction is conventionally taken to be Y, whereas your formulas assume it is Z.
phi and theta are in the wrong order.
The conversion to rotation is very simple: yaw = -phi, pitch = -theta (from the perspective of the camera).
Fixed formulas:
float position_X = radius * cos(theta) * cos(phi);
float position_Y = radius * sin(theta);
float position_Z = radius * cos(theta) * sin(phi);
(There may also be some sign issues with the mouse deltas but they should be easy to fix.)
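Putting it together, an orbit update using the fixed formulas might look like this. It is only a sketch: center, radius and the gluLookAt-based setup are assumptions about the surrounding code.

// Orbit the camera around 'center' at distance 'radius' using theta (pitch) and phi (yaw)
float posX = center.x + radius * cos(theta) * cos(phi);
float posY = center.y + radius * sin(theta);
float posZ = center.z + radius * cos(theta) * sin(phi);

// Look back at the center point; with the fixed-function pipeline this could be:
gluLookAt(posX, posY, posZ,             // eye
          center.x, center.y, center.z, // target
          0.0, 1.0, 0.0);               // up (world Y)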

Rotation: Quaternion to matrix

I am trying to display a 360° panorama using an IMU for head tracking.
Yaw works correctly, but roll and pitch are reversed. I also notice that the pitch contains some roll (and maybe vice versa).
I am receiving (W, X, Y, Z) coordinates from the IMU, which I store in an array as X, Y, Z, W.
The next step is converting the quaternion to a rotation matrix. I have looked at many examples and can't seem to find anything wrong with the following code:
static GLfloat rotation[16];

// Quaternion (x, y, z, w)
static void quaternionToRotation(float* quaternion)
{
    // Normalize quaternion
    float magnitude = sqrt(quaternion[0] * quaternion[0] +
                           quaternion[1] * quaternion[1] +
                           quaternion[2] * quaternion[2] +
                           quaternion[3] * quaternion[3]);
    for (int i = 0; i < 4; ++i)
    {
        quaternion[i] /= magnitude;
    }

    double xx = quaternion[0] * quaternion[0], xy = quaternion[0] * quaternion[1],
           xz = quaternion[0] * quaternion[2], xw = quaternion[0] * quaternion[3];
    double yy = quaternion[1] * quaternion[1], yz = quaternion[1] * quaternion[2],
           yw = quaternion[1] * quaternion[3];
    double zz = quaternion[2] * quaternion[2], zw = quaternion[2] * quaternion[3];

    // Column major order
    rotation[0]  = 1.0f - 2.0f * (yy + zz);
    rotation[1]  = 2.0f * (xy - zw);
    rotation[2]  = 2.0f * (xz + yw);
    rotation[3]  = 0;
    rotation[4]  = 2.0f * (xy + zw);
    rotation[5]  = 1.0f - 2.0f * (xx + zz);
    rotation[6]  = 2.0f * (yz - xw);
    rotation[7]  = 0;
    rotation[8]  = 2.0f * (xz - yw);
    rotation[9]  = 2.0f * (yz + xw);
    rotation[10] = 1.0f - 2.0f * (xx + yy);
    rotation[11] = 0;
    rotation[12] = 0;
    rotation[13] = 0;
    rotation[14] = 0;
    rotation[15] = 1;
}
The rotation matrix is then used in the draw call as such:
static void draw()
{
    // Get IMU quaternion
    float* quaternion = tracker.getTrackingData();
    if (quaternion != NULL)
    {
        quaternionToRotation(quaternion);
    }

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    glPushMatrix();

    // TODO: Multiply initialRotation quaternion with IMU quaternion
    glMultMatrixf(initialRotation); // Initial rotation to point forward
    glMultMatrixf(rotation);        // Rotation based on IMU

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, texture);
    gluSphere(quad, 0.1, 50, 50);
    glBindTexture(GL_TEXTURE_2D, 0);

    glPopMatrix();
    glFlush();
    glutSwapBuffers();
}
I tried setting all but one of the quaternion fields to 0, and I notice that they all work individually, except that roll and pitch are swapped around. I tried swapping X and Y, but this does not seem to help.
Any help would be really appreciated. Please also let me know if you have any steps that could help me debug this issue. Thanks!
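One way I could sanity-check the conversion is to feed it a known rotation and compare against the expected matrix, for example a 90° rotation about Z (hypothetical test values, printed column by column):

// 90 degrees about Z: q = (x, y, z, w) = (0, 0, sin(45°), cos(45°))
float testQuat[4] = { 0.0f, 0.0f, 0.70710678f, 0.70710678f };
quaternionToRotation(testQuat);

// For a column-major matrix, a +90° rotation about Z maps +X to +Y,
// so the first printed row (the first column of the matrix) should be roughly 0 1 0 0.
for (int i = 0; i < 16; ++i)
{
    printf("%.3f%s", rotation[i], (i % 4 == 3) ? "\n" : " ");
}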

Camera in opengl not working

The problem: when I face my camera down the z axis, for example, and pitch, this works fine. However, after I have finished the pitch and want to yaw on this new axis, it begins to roll for some unknown reason.
void FrustumCamera::xAxisRotation(float angle)
{
    // angle = angle * (double)degToRad;
    Vector3<float> x = m_orientation.getXAxis();
    Vector3<float> y = m_orientation.getYAxis();
    Vector3<float> z = m_orientation.getZAxis();

    y.rotateAroundAxis(x, angle);
    x = m_orientation.getXAxis();
    z.rotateAroundAxis(x, angle);

    m_orientation.setYAxis(y);
    m_orientation.setZAxis(z);
}

void FrustumCamera::yAxisRotation(float angle)
{
    // angle = angle * (double)degToRad;
    Vector3<float> x = m_orientation.getXAxis();
    Vector3<float> y = m_orientation.getYAxis();
    Vector3<float> z = m_orientation.getZAxis();

    x.rotateAroundAxis(y, angle);
    y = m_orientation.getYAxis();
    z.rotateAroundAxis(y, angle);

    m_orientation.setXAxis(x);
    m_orientation.setZAxis(z);
}

void FrustumCamera::zAxisRotation(float angle)
{
    Vector3<float> x = m_orientation.getXAxis();
    Vector3<float> y = m_orientation.getYAxis();
    Vector3<float> z = m_orientation.getZAxis();

    x.rotateAroundAxis(z, angle);
    z = m_orientation.getYAxis();
    y.rotateAroundAxis(z, angle);

    m_orientation.setXAxis(x);
    m_orientation.setYAxis(y);
}
template <class Type>
void Vector3<Type>::rotateAroundAxis(Vector3<Type> axis, const float angle)
{
    float radians = static_cast<Type>(angle * degToRad);
    Type sinAngle = static_cast<Type>(sin(radians));
    Type cosAngle = 0.0;

    if (angle == 90 || angle == -90)
        cosAngle = 0.0;
    else
        cosAngle = cos(radians);

    normalise(axis);                 // normalise the axis
    Type oneMinusCos = 1 - cosAngle; // (1 - cos(theta))

    // construct the rotation matrix
    Type tempMatrix[3][3];
    tempMatrix[0][0] = (axis.x * axis.x) * oneMinusCos + cosAngle;
    tempMatrix[0][1] = (axis.x * axis.y) * oneMinusCos + axis.z * sinAngle;
    tempMatrix[0][2] = (axis.x * axis.z) * oneMinusCos - axis.y * sinAngle;
    tempMatrix[1][0] = (axis.x * axis.y) * oneMinusCos - axis.z * sinAngle;
    tempMatrix[1][1] = (axis.y * axis.y) * oneMinusCos + cosAngle;
    tempMatrix[1][2] = (axis.y * axis.z) * oneMinusCos + axis.x * sinAngle;
    tempMatrix[2][0] = (axis.x * axis.z) * oneMinusCos + axis.y * sinAngle;
    tempMatrix[2][1] = (axis.y * axis.z) * oneMinusCos - axis.x * sinAngle;
    tempMatrix[2][2] = (axis.z * axis.z) * oneMinusCos + cosAngle;

    Vector3<Type> temp(*this);
    Vector3<Type> result;
    result.x = (temp.x * tempMatrix[0][0]) + (temp.y * tempMatrix[1][0]) + (temp.z * tempMatrix[2][0]);
    result.y = (temp.x * tempMatrix[0][1]) + (temp.y * tempMatrix[1][1]) + (temp.z * tempMatrix[2][1]);
    result.z = (temp.x * tempMatrix[0][2]) + (temp.y * tempMatrix[1][2]) + (temp.z * tempMatrix[2][2]);
    *this = result;
}
void OpenGLRenderer::startDraw(unsigned long mask)
{
    //sortBuffer();       // sort draw queue
    clearBuffers(mask);   // clear buffers
    loadIdentity();

    glTranslatef(-1 * m_frustumCamera->getViewMatrix().getTranslationAxis().x,
                 -1 * m_frustumCamera->getViewMatrix().getTranslationAxis().y,
                 -1 * m_frustumCamera->getViewMatrix().getTranslationAxis().z);
    glMultMatrixf(m_frustumCamera->getViewMatrix().getMatrix());
    glTranslatef(m_frustumCamera->getViewMatrix().getTranslationAxis().x,
                 m_frustumCamera->getViewMatrix().getTranslationAxis().y,
                 m_frustumCamera->getViewMatrix().getTranslationAxis().z);

    // push matrix stack
    matrixStackPush();
}
You might be experiencing gimbal lock; this can happen if you pitch all the way up or down so that your look vector becomes parallel with your up vector, in which case a yaw will be the same as a roll.
This can be a downside of constructing rotations piecemeal via Euler angles. You may want to look into quaternions. (Note that you can't rotate with Euler angles directly; they are just a representation of a rotation, and you need to convert them to a matrix or quaternion. But the way you are tackling it is very much an 'Euler angle' way of thinking about it.)
The strength of matrix multiplication is that any sequence of rotations can be represented (and concatenated) as a single rotation matrix. What you need to be doing is something like this:
void Transformable::yaw(float angle)
{
    float4x4 rot;                       // temp rotation matrix
    float3 translate(&_transform._41);  // save our translation
    float3 up(&_transform._21);         // y axis

    // build the rotation matrix for rotation around y
    MatrixRotationAxis(&rot, &up, angle);

    // multiply our transform by the rotation matrix
    // note that order of multiplication matters and depends on
    // whether your matrices are column-major or row-major
    MatrixMultiply(&_transform, &_transform, &rot);

    // write back our original translation
    memcpy(&_transform._41, &translate, sizeof(float3));

    // might want to reorthogonalise every now and then
    // to make sure basis vectors are orthonormal
    // or you will probably get matrix creep after a few operations
}
instead of trying to rotate one basis vector at a time. In this case _transform would be a 4x4 homogeneous matrix representing the transformation (rotation and translation). The top-left 3x3 submatrix is simply the basis vectors of the orientation space.
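As a follow-up to the re-orthogonalisation comment above, a minimal Gram-Schmidt pass over the basis vectors could look like this. It is a sketch only: it assumes float3 helpers normalize and cross exist, and the same _transform layout with _11, _21, _31 as the start of the X, Y, Z basis vectors.

void Transformable::reorthogonalise()
{
    float3 x(&_transform._11);   // current X basis vector
    float3 y(&_transform._21);   // current Y basis vector

    // Gram-Schmidt: keep X, rebuild Z and Y so the basis stays orthonormal
    x = normalize(x);
    float3 z = normalize(cross(x, y));  // Z orthogonal to X and the old Y
    y = cross(z, x);                    // Y orthogonal to both (already unit length)

    // write the cleaned-up basis vectors back into the transform
    memcpy(&_transform._11, &x, sizeof(float3));
    memcpy(&_transform._21, &y, sizeof(float3));
    memcpy(&_transform._31, &z, sizeof(float3));
}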