ModelMatrix + ViewMatrix vs ModelViewMatrix - opengl

I have code that works when passing the modelViewMatrix to the shader. I tried to pass the modelMatrix and the viewMatrix separately, but I don't get the same result, and I really don't understand what I am missing...
Here are the relevant parts of my code:
Passing the ModelViewMatrix (working fine)
Java code:
shaderProgram.loadProjectionMatrix(renderer.getProjectionMatrix());
Matrix4f viewMatrix = TransformationMatrix.createViewMatrix(renderer.camera);
Matrix4f modelMatrix = new Matrix4f();
modelMatrix.setIdentity();
Matrix4f layerTransformationMatrix = getTransformationMatrix();
// set the translation
Vector3f entityTranslation = shaderProgram.getTranslation();
modelMatrix.m30 = entityTranslation.x;
modelMatrix.m31 = entityTranslation.y;
modelMatrix.m32 = entityTranslation.z;
// mix the modelMatrix with the layer transformation
modelMatrix.mul(layerTransformationMatrix);
// reset the rotation
modelMatrix.m00 = viewMatrix.m00;
modelMatrix.m01 = viewMatrix.m10;
modelMatrix.m02 = viewMatrix.m20;
modelMatrix.m10 = viewMatrix.m01;
modelMatrix.m11 = viewMatrix.m11;
modelMatrix.m12 = viewMatrix.m21;
modelMatrix.m20 = viewMatrix.m02;
modelMatrix.m21 = viewMatrix.m12;
modelMatrix.m22 = viewMatrix.m22;
// compute modelViewMatrix
Matrix4f modelViewMatrix = modelMatrix;
modelViewMatrix.mul(viewMatrix);
// inverse y axis
modelViewMatrix.m11 = -1;
shaderProgram.loadModelViewMatrix(modelViewMatrix);
Shader code:
void main(void)
{
gl_Position = projectionMatrix * modelViewMatrix * vec4(attribute_Position.xy, 0.0, 1.0);
pass_textureCoords = attribute_TextureCoords;
varying_Color = attribute_Color;
}
Screenshot:
Passing the ModelMatrix and the ViewMatrix (not working)
Java code:
shaderProgram.loadProjectionMatrix(renderer.getProjectionMatrix());
Matrix4f viewMatrix = TransformationMatrix.createViewMatrix(renderer.camera);
Matrix4f modelMatrix = new Matrix4f();
modelMatrix.setIdentity();
Matrix4f layerTransformationMatrix = getTransformationMatrix();
// set the translation
Vector3f entityTranslation = shaderProgram.getTranslation();
modelMatrix.m30 = entityTranslation.x;
modelMatrix.m31 = entityTranslation.y;
modelMatrix.m32 = entityTranslation.z;
// mix the modelMatrix with the layer transformation
modelMatrix.mul(layerTransformationMatrix);
// reset the rotation
modelMatrix.m00 = viewMatrix.m00;
modelMatrix.m01 = viewMatrix.m10;
modelMatrix.m02 = viewMatrix.m20;
modelMatrix.m10 = viewMatrix.m01;
modelMatrix.m11 = viewMatrix.m11;
modelMatrix.m12 = viewMatrix.m21;
modelMatrix.m20 = viewMatrix.m02;
modelMatrix.m21 = viewMatrix.m12;
modelMatrix.m22 = viewMatrix.m22;
shaderProgram.loadModelMatrix(modelMatrix);
shaderProgram.loadViewMatrix(viewMatrix);
Shader code:
void main(void)
{
mat4 MVMatrix = modelMatrix * viewMatrix;
MVMatrix[1][1] = -1;
gl_Position = projectionMatrix * MVMatrix * vec4(attribute_Position.xy, 0.0, 1.0);
pass_textureCoords = attribute_TextureCoords;
varying_Color = attribute_Color;
}
Screenshot:

The order in which you multiply the matrices in the shader is wrong. It has to be
mat4 MVMatrix = viewMatrix * modelMatrix;
See this post for an explanation.
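With column vectors (the GLM/GLSL convention) the transforms apply right to left: the model matrix first, then the view matrix, then the projection. A minimal C++/GLM sketch of the same composition, purely for illustration (the variable names are not from the code above):
#include <glm/glm.hpp>

// Column-major, right-to-left: model space -> world space (model),
// world space -> eye space (view), eye space -> clip space (projection).
glm::vec4 toClipSpace(const glm::mat4& projection, const glm::mat4& view,
                      const glm::mat4& model, const glm::vec4& position)
{
    glm::mat4 modelView = view * model;   // not model * view
    return projection * modelView * position;
}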

Related

OpenGL object not scaling properly

I want to scale a triangle with a model matrix. I have this code:
void Triangle::UpdateTransform()
{
mView = glm::translate(glm::mat4(1.0f), glm::vec3(0.0f));
mModel = glm::scale(glm::mat4(1.0f), glm::vec3(2.f));
mModel = glm::translate(glm::mat4(1.0f), mLocation);
mMVP = mProj*mView*mModel;
}
With this code I get no results. But if I change the order of the scale and translation:
mView = glm::translate(glm::mat4(1.0f), glm::vec3(0.0f));
mModel = glm::translate(glm::mat4(1.0f), mLocation);
mModel = glm::scale(glm::mat4(1.0f), glm::vec3(2.f));
mMVP = mProj*mView*mModel;
I get a very weird result (the triangle should be at the center).
I have no idea what is causing this; maybe it has something to do with the order of the transformations.
I'd really appreciate some help.
My vertex shader:
#version 410 core
layout(location = 0) in vec4 position;
uniform mat4 u_MVP;
void main()
{
gl_Position = u_MVP * position;
};
The first argument of glm::scale and glm::translate is the input matrix. These functions define a new transformation and multiply the input matrix by it.
In both cases, you specify the identity matrix (glm::mat4(1.0f)) as the input matrix. You have to pass mModel as the input matrix instead, e.g.:
mModel = glm::translate(glm::mat4(1.0f), mLocation);
mModel = glm::scale(mModel, glm::vec3(2.f)); // <-- mModel instead of glm::mat4(1.0f)
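Putting it together, a short sketch of how the whole function could look, using the same names as in the question (mView, mModel, mProj, mMVP, mLocation):
void Triangle::UpdateTransform()
{
    // Each glm call multiplies its input matrix by the new transform,
    // so chaining on mModel yields mModel = T * S.
    mView = glm::mat4(1.0f);                              // no camera offset
    mModel = glm::translate(glm::mat4(1.0f), mLocation);  // T
    mModel = glm::scale(mModel, glm::vec3(2.f));          // T * S
    mMVP = mProj * mView * mModel;                        // applied right to left
}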

GLSL convert valid glm::mat4 matrix to nan matrix

I'm following the tutorial http://www.mbsoftworks.sk/tutorials/opengl3/ and trying to compile the 10th example.
Everything works fine except the place where I send (projection matrix multiplied by modelview matrix) to the shader.
This is the place where I send the matrix:
//...
// render.cpp
glm::mat4 projectionMatrix = *(oglControl->getProjectionMatrix());
glm::mat4 cam = glm::translate(mModelView, cCamera.vEye);
auto newM = projectionMatrix * cam;
spDirectionalLight.setUniform("projectionMatrixMulModelViewMatrix",&newM);
//...
//...
// setUniform implementation
void CShaderProgram::setUniform(string sName, glm::mat4* mMatrices, int iCount)
{
int iLoc = glGetUniformLocation(uiProgram, sName.c_str());
glUniformMatrix4fv(iLoc, iCount, FALSE, (GLfloat*)mMatrices);
}
//...
and at the end mMatrices contains a valid matrix.
Shader code
#version 330 core
uniform mat4 projectionMatrixMulModelViewMatrix;
uniform mat4 normalMatrix;
layout (location = 0) in vec3 inPosition;
layout (location = 1) in vec2 inCoord;
layout (location = 2) in vec3 inNormal;
out vec2 texCoord;
smooth out vec3 vNormal;
void main()
{
gl_Position = projectionMatrixMulModelViewMatrix*vec4(inPosition, 1.0);
texCoord = inCoord;
vec4 vRes = normalMatrix*vec4(inNormal, 0.0);
vNormal = vRes.xyz;
}
The result is a blank screen. The RenderDoc debugger tells me that gl_Position is completely NaN.
RenderDoc screenshot
When I send glm::mat4(1) I get a valid result.
Why does the shader get a NaN vector after the multiplication?
It turns out VS2017 tricked me. The matrix was built from data I hadn't initialized deep inside the code:
glm::mat4 mModelView = cCamera.look();
glm::mat4 CFlyingCamera::look()
{
glm::mat4 result = glm::mat4(1.0f);
result = glm::lookAt(vEye, vView, vUp);
return result;
}
and here is my mistake:
void CFlyingCamera::update()
{
// Change camera view direction
rotateWithMouse();
// Get view direction
glm::vec3 vMove = vView-vEye;
vMove = glm::normalize(vMove);
vMove *= fSpeed;
// Get normal to view direction vector
glm::vec3 vStrafe = glm::cross(vView-vEye, vUp);
vStrafe = glm::normalize(vStrafe);
vStrafe *= fSpeed;
int iMove = 0;
// ERROR HERE: vMoveBy isn't initialized
glm::vec3 vMoveBy;
// Get vector of move
if(Keys::key(iForw))vMoveBy += vMove*appMain.sof(1.0f);
if(Keys::key(iBack))vMoveBy -= vMove*appMain.sof(1.0f);
if(Keys::key(iLeft))vMoveBy -= vStrafe*appMain.sof(1.0f);
if(Keys::key(iRight))vMoveBy += vStrafe*appMain.sof(1.0f);
vEye += vMoveBy; vView += vMoveBy;
}
But the debugger gave me a valid matrix anyway. Do you know why that happened?
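For reference, a minimal fix is to value-initialize the accumulator; by default (without GLM_FORCE_CTOR_INIT) glm::vec3's default constructor leaves the components uninitialized, and that garbage then propagates into vEye, vView and finally into the view matrix. A sketch of the relevant part of update():
// Zero-initialize the movement accumulator so release builds
// don't accumulate onto garbage values.
glm::vec3 vMoveBy(0.0f);

if (Keys::key(iForw)) vMoveBy += vMove * appMain.sof(1.0f);
if (Keys::key(iBack)) vMoveBy -= vMove * appMain.sof(1.0f);
if (Keys::key(iLeft)) vMoveBy -= vStrafe * appMain.sof(1.0f);
if (Keys::key(iRight)) vMoveBy += vStrafe * appMain.sof(1.0f);
vEye += vMoveBy; vView += vMoveBy;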

GLM rotates objects around the origin and around the object itself

I'm trying to rotate a bunch of objects on their x axis.
This is how I calculate an object's transform:
glm::mat4 GameObject::getTransform(float angle) {
glm::mat4 model = glm::mat4(1.0f);
model = glm::translate(model, position);
model = glm::rotate(model, angle, glm::vec3(1.0f, 0.0f, 0.0f));
model = glm::scale(model, scaleValue);
return model;
}
I've tried putting the translate, rotate, and scale calls in different orders, to no avail; only strange behaviour.
This is how I iterate over objects and draw them:
for (auto row : objectRows) {
for (auto object : row) {
glm::mat4 model = object->getTransform(glfwGetTime());
glm::mat4 mvp = projection * view * model;
mainShader.setMat4("model", model);
mainShader.setMat4("mvp", mvp);
mainShader.setVec3("objectColour", object->colour);
object->mesh.draw(mainShader);
}
}
The vertex shader:
#version 330 core
layout (location = 0) in vec3 aPos;
layout (location = 1) in vec3 aNormal;
out vec3 fragPos;
out vec3 normal;
uniform mat4 model;
uniform mat4 mvp;
void main()
{
fragPos = vec3(model * vec4(aPos, 1.0));
normal = mat3(transpose(inverse(model))) * aNormal;
gl_Position = mvp * vec4(fragPos, 1.0f);
}
And the result:
As you can see, the objects at the top rotate only around themselves, and the lower down the other objects are, the more they rotate around what I think is the world origin point.
I've read many similar-looking posts explaining the order of multiplying the matrices, but nothing seems to help, and I can't help but think it is something stupidly simple that I'm overlooking.
Turns out the problem was in the vertex shader.
void main()
{
fragPos = vec3(model * vec4(aPos, 1.0));
normal = mat3(transpose(inverse(model))) * aNormal;
gl_Position = mvp * vec4(fragPos, 1.0f);
}
I was accidentally multiplying by the model matrix twice. fragPos is the result of multiplying the model matrix with the vertex. Two lines below I multiply mvp with fragPos, so the calculation is effectively projection * view * model * model.
To fix this I split up mvp, set each matrix as its own uniform in the shader, and changed the line gl_Position = mvp * vec4(fragPos, 1.0f); to gl_Position = projection * view * vec4(fragPos, 1.0);
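For completeness, the matching draw loop just passes the three matrices separately, reusing the setMat4 helper from the question (the uniform names view and projection are assumed to match whatever the shader now declares):
for (auto row : objectRows) {
    for (auto object : row) {
        glm::mat4 model = object->getTransform(glfwGetTime());
        // The shader multiplies projection * view with the already
        // model-transformed fragPos, so model is applied exactly once.
        mainShader.setMat4("model", model);
        mainShader.setMat4("view", view);
        mainShader.setMat4("projection", projection);
        mainShader.setVec3("objectColour", object->colour);
        object->mesh.draw(mainShader);
    }
}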

lwjgl model is not rotating at its center

I have an entity (a triangle). When I try to rotate it, it moves in a circular motion. When I apply the projection and transformation matrices it isn't centered either; it is offset to the right of the center by some distance, and I don't even know why it's not centered.
Transformation matrix along with the vertices:
float[] vertices = {
1409.598f, -58.85f, 1471.946f,
1460.572f, -58.9047f, 1462.047f,
1408.506f, -20.5531f, 1471.137f
};
public static Matrix4f createTransformationMatrix(Vector3f entity, float rx, float ry,
float rz, float scale) {
Matrix4f matrix = new Matrix4f();
matrix.setIdentity();
Matrix4f.translate(entity, matrix, matrix);
Matrix4f.rotate((float) Math.toRadians(rx), new Vector3f(1,0,0), matrix, matrix);
Matrix4f.rotate((float) Math.toRadians(ry), new Vector3f(0,1,0), matrix, matrix);
Matrix4f.rotate((float) Math.toRadians(rz), new Vector3f(0,0,1), matrix, matrix);
Matrix4f.scale(new Vector3f(scale,scale,scale), matrix, matrix);
return matrix;
}
Any help?
Projection matrix:
private static final float FOV = 70;
private static final float NEAR_PLANE = 0.1f;
private static final float FAR_PLANE = 10000;
private void createProjectionMatrix() {
float aspectRatio = (float) Display.getWidth() / (float) Display.getHeight();
float y_scale = (float) ((1f / Math.tan(Math.toRadians(FOV / 2f))) * aspectRatio);
float x_scale = y_scale / aspectRatio;
float frustum_length = FAR_PLANE - NEAR_PLANE;
projectionMatrix = new Matrix4f();
projectionMatrix.m00 = x_scale;
projectionMatrix.m11 = y_scale;
projectionMatrix.m20 = 0f;
projectionMatrix.m21 = 0f;
projectionMatrix.m22 = -((FAR_PLANE + NEAR_PLANE) / frustum_length);
projectionMatrix.m23 = -1;
projectionMatrix.m32 = -((2 * NEAR_PLANE * FAR_PLANE) / frustum_length);
projectionMatrix.m33 = 0;
}
View Matrix:
public static Matrix4f createViewMatrix(Camera camera) {
Matrix4f viewMatrix = new Matrix4f();
viewMatrix.setIdentity();
Matrix4f.rotate((float) Math.toRadians(camera.getPitch()), new Vector3f(1, 0, 0), viewMatrix,
viewMatrix);
Matrix4f.rotate((float) Math.toRadians(camera.getYaw()), new Vector3f(0, 1, 0), viewMatrix, viewMatrix);
Vector3f cameraPos = camera.getPosition();
Vector3f negativeCameraPos = new Vector3f(-cameraPos.x, -cameraPos.y, -cameraPos.z);
Matrix4f.translate(negativeCameraPos, viewMatrix, viewMatrix);
return viewMatrix;
}
Vertex Shader:
#version 400 core
in vec3 position;
uniform mat4 transformationMatrix;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
void main(void){
gl_Position = projectionMatrix * viewMatrix * transformationMatrix * vec4(position,1.0);
}
I think you have an error in the projection matrix. Use this line:
float y_scale = (float) (1f / Math.tan(Math.toRadians(FOV / 2f)));
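As a cross-check, the equivalent projection in C++/GLM is glm::perspective, which sets m[1][1] = 1 / tan(fov/2) and m[0][0] = m[1][1] / aspect, i.e. without the extra aspectRatio factor that was multiplied into y_scale in the question. A sketch, assuming the same FOV and clip planes:
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

glm::mat4 makeProjection(float fovDegrees, float width, float height,
                         float nearPlane, float farPlane)
{
    float aspect = width / height;
    // glm::perspective expects the vertical field of view in radians.
    return glm::perspective(glm::radians(fovDegrees), aspect, nearPlane, farPlane);
}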

Rotating object around multiple axes using glm::mat4

I'm trying to apply multiple rotations around the x, y, and z axes to an object using glm::rotate, but for some reason it only rotates around one axis and seems to completely ignore the other rotations.
Here is how I apply rotation:
glm::mat4 rotateTransform = glm::mat4(1.0f);
rotateTransform = glm::rotate(rotateTransform, this->rotation.x, glm::vec3(1, 0, 0));
rotateTransform = glm::rotate(rotateTransform, this->rotation.y, glm::vec3(0, 1, 0));
rotateTransform = glm::rotate(rotateTransform, this->rotation.z, glm::vec3(0, 0, 1));
return glm::translate(glm::mat4(1.0f), this->position) * rotateTransform * glm::scale(glm::mat4(1.0f), this->scale);
The method returns a modelToWorldMatrix, which I then pass to my vertex shader, where I perform the standard calculation on a vertex:
vec4 vertexPositionInModelSpace = vec4(Position, 1);
vec4 vertexInWorldSpace = gModelToWorldTransform * vertexPositionInModelSpace;
vec4 vertexInViewSpace = gWorldToViewTransform * vertexInWorldSpace;
vec4 vertexInHomogeneousClipSpace = gProjectionTransform * vertexInViewSpace;
gl_Position = vertexInHomogeneousClipSpace;
So how can you apply multiple rotations by using glm::mat4 matrices?