gluLookAt alternative doesn't work - C++

I'm trying to calculate a look-at matrix myself instead of using gluLookAt().
My problem is that my matrix doesn't work; using the same parameters with gluLookAt does work, however.
This is my way of creating a look-at matrix:
Vector3 Eye, At, Up; // these should be parameters =)
Vector3 zaxis = At - Eye; zaxis.Normalize();
Vector3 xaxis = Vector3::Cross(Up, zaxis); xaxis.Normalize();
Vector3 yaxis = Vector3::Cross(zaxis, xaxis); yaxis.Normalize();

float r[16] =
{
    xaxis.x, yaxis.x, zaxis.x, 0,
    xaxis.y, yaxis.y, zaxis.y, 0,
    xaxis.z, yaxis.z, zaxis.z, 0,
    0,       0,       0,       1,
};
Matrix Rotation;
memcpy(Rotation.values, r, sizeof(r));

float t[16] =
{
    1, 0, 0, 0,
    0, 1, 0, 0,
    0, 0, 1, 0,
    -Eye.x, -Eye.y, -Eye.z, 1,
};
Matrix Translation;
memcpy(Translation.values, t, sizeof(t));

View = Rotation * Translation; // I tried reversing this as well (Translation * Rotation)
Now, when I try to use this matrix by calling glMultMatrixf, nothing shows up in my engine, while using the same eye, look-at and up values with gluLookAt works perfectly, as I said before.
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glMultMatrixf(View);
The problem must be somewhere in the code I posted here; I know the problem is not in my Vector3/Matrix classes, because they work fine when creating a projection matrix.

I assume you have a right-handed coordinate system (the default in OpenGL).
Try the following code. I think you forgot to normalize Up, and you have to put -zaxis in the matrix.
Vector3 Eye, At, Up; //these should be parameters =)
Vector3 zaxis = At - Eye; zaxis.Normalize();
Up.Normalize();
Vector3 xaxis = Vector3::Cross(Up, zaxis); xaxis.Normalize();
Vector3 yaxis = Vector3::Cross(zaxis, xaxis); yaxis.Normalize();
float r[16] =
{
    xaxis.x, yaxis.x, -zaxis.x, 0,
    xaxis.y, yaxis.y, -zaxis.y, 0,
    xaxis.z, yaxis.z, -zaxis.z, 0,
    0,       0,       0,        1,
};

Related

GLM LookAt with different coordinate system

I am using GLM to make a LookAt matrix. I use the normal OpenGL coordinate system, but with the Z axis going inwards, which is the opposite of the OpenGL standard. Thus, the LookAt function requires some changes:
glm::vec3 pos = glm::vec3(0, 0, -10); // equal to glm::vec3(0, 0, 10) in standard coords
glm::quat rot = glm::quat(0.991445f, 0.130526f, 0, 0); // 15 degrees rotation about the x axis
glm::vec3 resultPos = pos * glm::vec3(1, 1, -1); // flip Z axis
glm::vec3 resultLook = pos + (glm::conjugate(rot) * glm::vec3(0, 0, 1)) * glm::vec3(1, 1, -1); // rotate unit Z vec and then flip Z
glm::vec3 resultUp = (glm::conjugate(rot) * glm::vec3(0, 1, 0)) * glm::vec3(1, 1, -1); // same thing as resultLook but with unit Y vec
glm::mat4 lookAt = glm::lookAt(resultPos, resultLook, resultUp);
However, that is a lot of calculation just to flip a single axis. What do I need to do to get a view matrix which has a flipped Z axis?

Incorrect render of a cube mesh in DirectX 11

I am practicing DirectX 11 following Frank Luna's book.
I have implemented a demo that renders a cube, but the result is not correct.
https://i.imgur.com/2uSkEiq.gif
As I hope you can see from the image (I apologize for the low quality), it seems like the camera is "trapped" inside the cube even when I move it away. There is also a camera frustum clipping problem.
I think the problem is therefore in the definition of the projection matrix.
Here is the cube vertices definition.
std::vector<Vertex> vertices =
{
    {XMFLOAT3(-1, -1, -1), XMFLOAT4(1, 1, 1, 1)},
    {XMFLOAT3(-1, +1, -1), XMFLOAT4(0, 0, 0, 1)},
    {XMFLOAT3(+1, +1, -1), XMFLOAT4(1, 0, 0, 1)},
    {XMFLOAT3(+1, -1, -1), XMFLOAT4(0, 1, 0, 1)},
    {XMFLOAT3(-1, -1, +1), XMFLOAT4(0, 0, 1, 1)},
    {XMFLOAT3(-1, +1, +1), XMFLOAT4(1, 1, 0, 1)},
    {XMFLOAT3(+1, +1, +1), XMFLOAT4(0, 1, 1, 1)},
    {XMFLOAT3(+1, -1, +1), XMFLOAT4(1, 0, 1, 1)},
};
Here is how I calculate the view and projection matrices.
void TestApp::OnResize()
{
    D3DApp::OnResize();
    mProj = XMMatrixPerspectiveFovLH(XM_PIDIV4, AspectRatio(), 1, 1000);
}

void TestApp::UpdateScene(float dt)
{
    float x = mRadius * std::sin(mPhi) * std::cos(mTheta);
    float y = mRadius * std::cos(mPhi);
    float z = mRadius * std::sin(mPhi) * std::sin(mTheta);

    XMVECTOR EyePosition = XMVectorSet(x, y, z, 1);
    XMVECTOR FocusPosition = XMVectorZero();
    XMVECTOR UpDirection = XMVectorSet(0, 1, 0, 0);

    mView = XMMatrixLookAtLH(EyePosition, FocusPosition, UpDirection);
}
And here is how I update the camera position on mouse move.
glfwSetCursorPosCallback(mMainWindow, [](GLFWwindow* window, double xpos, double ypos)
{
    TestApp* app = reinterpret_cast<TestApp*>(glfwGetWindowUserPointer(window));
    if (glfwGetMouseButton(window, GLFW_MOUSE_BUTTON_LEFT) == GLFW_PRESS)
    {
        float dx = 0.25f * XMConvertToRadians(static_cast<float>(xpos) - app->mLastMousePos.x);
        float dy = 0.25f * XMConvertToRadians(static_cast<float>(ypos) - app->mLastMousePos.y);
        app->mTheta += dx;
        app->mPhi += dy;
        app->mPhi = std::clamp(app->mPhi, 0.1f, XM_PI - 0.1f);
    }
    else if (glfwGetMouseButton(window, GLFW_MOUSE_BUTTON_RIGHT) == GLFW_PRESS)
    {
        float dx = 0.05f * XMConvertToRadians(static_cast<float>(xpos) - app->mLastMousePos.x);
        float dy = 0.05f * XMConvertToRadians(static_cast<float>(ypos) - app->mLastMousePos.y);
        app->mRadius += (dx - dy);
        app->mRadius = std::clamp(app->mRadius, 3.f, 15.f);
    }
    app->mLastMousePos = XMFLOAT2(static_cast<float>(xpos), static_cast<float>(ypos));
});
Thanks.
The root problem here was in how the constant buffer is updated from the CPU.
HLSL defaults to column-major matrix layout, per the Microsoft Docs, while DirectXMath uses row-major matrices, so you have to transpose the matrix when updating the constant buffer.
Alternatively, you can declare the HLSL matrix with the row_major keyword, use #pragma pack_matrix(row_major), or compile with the /Zpr switch.

How to rotate an object using glm::lookAt()?

I'm working on a scenario that involves some cone meshes that are to be used as spotlights in a deferred renderer. I need to scale, rotate and translate these cone meshes so that they point in the correct direction. According to one of my lecturers, I can rotate the cones to align with a direction vector, and move them to the correct position, by multiplying the model matrix with the matrix returned by this:
glm::inverse(glm::lookAt(spot_light_direction, spot_light_position, up));
However, this doesn't seem to work: doing it places all of the cones at the world origin. If I then translate the cones manually using another matrix, it seems that the cones aren't even facing the right direction.
Is there a better way to rotate objects so that they face a specific direction?
Here is my current code, which gets executed for each cone:
// Move the cone to the correct place
glm::mat4 model = glm::mat4(1, 0, 0, 0,
                            0, 1, 0, 0,
                            0, 0, 1, 0,
                            spot_light_position.x, spot_light_position.y, spot_light_position.z, 1);

// Calculate rotation matrix
model *= glm::inverse(glm::lookAt(spot_light_direction, spot_light_position, up));

// Angles are in degrees here, so convert before calling sin (which takes radians)
float missing_angle = 180 - (spot_light_angle / 2 + 90);
float scale = (spot_light_range * sin(glm::radians(missing_angle))) / sin(glm::radians(spot_light_angle / 2));

// Scale the cone to the correct dimensions
model *= glm::mat4(scale, 0, 0, 0,
                   0, scale, 0, 0,
                   0, 0, spot_light_range, 0,
                   0, 0, 0, 1);

// The origin of the cone is at the flat end, so offset its position so that it rotates around that point
model *= glm::mat4(1, 0, 0, 0,
                   0, 1, 0, 0,
                   0, 0, 1, 0,
                   0, 0, -1, 1);
I've noted this in the comments, but I'll mention again that the cones' origin is at the center of the flat end. I don't know whether this makes a difference; I just thought I'd bring it up.
Your ordering of the matrices seems correct, but the lookAt function expects:
glm::mat4 lookAt(glm::vec3 eye, glm::vec3 center, glm::vec3 up)
Here eye is the location of the camera and center is the location of the object you are looking at (in your case, if you don't have that location, you can use
spot_light_direction + spot_light_position).
So just change
glm::lookAt(spot_light_direction, spot_light_position, up)
to
glm::lookAt(spot_light_position, spot_light_direction + spot_light_position, up)

Reflect camera in a plane

I have a camera, which is defined by an up vector, a position and a reference point (the camera looks at this point). From these I can, of course, calculate the view direction.
Now I am trying to reflect this camera in a plane (e.g. z = 0). My first attempt was to reflect every single vector with the corresponding reflection matrix, and looked like this:
mat4 mReflection = mat4(1, 0,  0, 0,
                        0, 1,  0, 0,
                        0, 0, -1, 0,
                        0, 0,  0, 1);
up = mReflection * up;
position = mReflection * position;
lookAt = mReflection * lookAt;
But this didn't work very well, and I don't know why. What is wrong with this method?

OpenGL custom rotation issue

I'm trying to implement my own OpenGL rotation around the y axis. Here is my code:
void mglRotateY(float angle)
{
    float radians = angle * (PI / 180);
    GLfloat t[4][4] =
    {
        {cosf(angle), 0, -sinf(angle), 0},
        {0, 1, 0, 0},
        {sinf(angle), 0, cosf(angle), 0},
        {0, 0, 0, 1}
    }; // Rotation matrix y
    glMultMatrixf(*t);
}
The effect is a rotation around the y axis, but the degrees don't seem to correspond.
Does anyone know why?
Use the radians, not the angle, when calculating the sine and cosine: in your code you reference angle instead of radians. You may also want to precalculate the values, as you make four trigonometric calls to populate the matrix t.
Perhaps something like:
void mglRotateY(float angle)
{
    float radians = angle * (PI / 180);
    float cosVal = cosf(radians);
    float sinVal = sinf(radians);
    GLfloat t[4][4] =
    {
        {cosVal, 0, -sinVal, 0},
        {0, 1, 0, 0},
        {sinVal, 0, cosVal, 0},
        {0, 0, 0, 1}
    }; // Rotation matrix y
    glMultMatrixf(*t);
}