Custom point lights in world space (LWJGL / GLSL) - opengl

I'm trying to implement custom lights in world space (I know it is usually suggested to do this in view space, but I have my reasons) using LWJGL and GLSL in a little game I'm programming. To do so I need the vertex positions in world space in the shader. I know that I can calculate them as modelMatrix * gl_Vertex in the vertex shader. Since the modelMatrix is not accessible in the shader, I calculate it (as is done here) by multiplying the ModelViewMatrix with the inverse of the viewMatrix.
Vertex shader code:
uniform mat4 inverseViewMatrix;
varying vec3 normal;
varying vec3 worldCoord;
normal = normalize(vec3(gl_ModelViewMatrix * inverseViewMatrix * vec4(gl_Normal, 0.0)));
worldCoord = vec3(gl_ModelViewMatrix * inverseViewMatrix * gl_Vertex); //coordinates in world space?
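For reference: since gl_ModelViewMatrix = viewMatrix * modelMatrix, the model matrix is recovered by multiplying the inverse view matrix from the left. A minimal sketch of that identity, reusing the names above:
uniform mat4 inverseViewMatrix;
varying vec3 worldCoord;
void main()
{
    // modelView = view * model  =>  model = inverse(view) * modelView
    mat4 modelMatrix = inverseViewMatrix * gl_ModelViewMatrix;
    worldCoord = vec3(modelMatrix * gl_Vertex); // vertex position in world space
    gl_Position = ftransform();
}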
The light is then calculated per pixel in the fragment shader:
uniform vec4 lights[30];   //light positions...
uniform vec4 lightCol[30]; //...and light colors, passed from the Java code
uniform int noOfLights;
varying vec3 normal;
varying vec3 worldCoord;
for(int i = 0; i < noOfLights; i++){
    vec3 lightPos = lights[i].xyz; //lights[i].w encodes whether this is a parallel or a point light
    if(lights[i].w == 1.0){ //parallel lights (working fine)
        vec3 lightDir = normalize(lightPos); //light direction
        float NdotL = max(dot(normal, lightDir), 0.0);
        brightness += NdotL * lightCol[i].xyz; //lightCol[i].xyz contains the color of the light
    }
    else{ //point lights (not working)
        vec3 lightPosRelative = lightPos - worldCoord; //vector from the vertex to the point light
        float dist = length(lightPosRelative);
        if(dist < lightCol[i].w){ //lightCol[i].w is the radius of the point light
            vec3 lightDir = normalize(lightPosRelative); //light direction to the point light
            float NdotL = max(dot(normal, lightDir), 0.0);
            float distVal = pow(1.0 - (dist / lightCol[i].w), 2.0); //light attenuation with distance
            distVal = clamp(distVal, 0.0, 1.0);
            brightness += (NdotL * lightCol[i].xyz) * distVal; //lightCol[i].xyz contains the color of the light
        }
    }
}
[... brightness is then multiplied by the vertex color ...]
The inverse of the view matrix is passed to the shader from the Java side of the code. It is calculated after the camera has been positioned using glRotatef(...) and glTranslatef(...).
Matrix4f viewMatrix = new Matrix4f();
Matrix4f inverseViewMatrix = new Matrix4f();
viewBuffer = BufferUtils.createFloatBuffer(16);
viewBuffer.rewind();
glGetFloat(GL_MODELVIEW_MATRIX, viewBuffer);
viewMatrix = (Matrix4f) viewMatrix.load(viewBuffer);
Matrix4f.invert(viewMatrix, inverseViewMatrix);
The result looks fine for the parallel lights, so the calculation of the normals seems to be correct.
The point lights, however, only affect the terrain (which is never rotated nor translated) and not the other objects (which are translated and rotated).
Something seems to be wrong with the calculation of worldCoord for the objects (I can see that by displaying worldCoord instead of the brightness).
Any suggestions as to what is wrong in my code?

Related

How to properly transform normals for diffuse lighting in OpenGL?

In an attempt to get diffuse lighting correct, I read several articles and tried to apply them as closely as possible.
However, even though the transform of the normal vectors seems close to right, the lighting still slides slightly over the object (which should not be the case for a fixed light).
Note 1: I added bands based on the dot product to make the problem more apparent.
Note 2: This is not Sauron's eye.
In the image two problems are apparent:
The normal is affected by the projection matrix: when the viewport is horizontal, the normals display an elliptic shading (as in the image). When the viewport is vertical (height>width), the ellipse is vertical.
The shading moves over the surface when the camera is rotated around the object. This is not very visible with normal lighting, but becomes apparent when projecting patterns from the light source.
Code and attempts:
Unfortunately, a minimal working example quickly gets very large, so I will only post the relevant code. If this is not enough, ask me and I will try to publish the code somewhere.
In the drawing function, I have the following matrix creation:
glm::mat4 projection = glm::perspective(45.0f, (float)m_width/(float)m_height, 0.1f, 200.0f);
glm::mat4 view = glm::translate(glm::mat4(1), glm::vec3(0.0f, 0.0f, -2.5f)) * rotationMatrix; // place the camera 2.5 units away; rotationMatrix is driven by the mouse.
glm::mat4 model = glm::mat4(1); //The sphere at the center.
glm::mat4 mvp = projection * view * model;
glm::mat4 normalVp = projection * glm::transpose(glm::inverse(view * model));
In the vertex shader, the mvp is used to transform position and normals:
#version 420 core
uniform mat4 mvp;
uniform mat4 normalMvp;
in vec3 in_Position;
in vec3 in_Normal;
in vec2 in_Texture;
out Vertex
{
    vec4 pos;
    vec4 normal;
    vec2 texture;
} v;
void main(void)
{
    v.pos = mvp * vec4(in_Position, 1.0);
    gl_Position = v.pos;
    v.normal = normalMvp * vec4(in_Normal, 0.0);
    v.texture = in_Texture;
}
And in the fragment shader, the diffuse shading is applied:
#version 420 core
in Vertex
{
    vec4 pos;
    vec4 normal;
    vec2 texture;
} v;
uniform sampler2D uSampler1;
out vec4 out_Color;
uniform mat4 mvp;
uniform mat4 normalMvp;
uniform vec3 lightsPos;
uniform float lightsIntensity;
void main()
{
    vec3 color = texture(uSampler1, v.texture).rgb;
    vec3 lightPos = (mvp * vec4(lightsPos, 1.0)).xyz;
    vec3 lightDirection = normalize(lightPos - v.pos.xyz);
    float nDotL = clamp(dot(lightDirection, normalize(v.normal.xyz)), 0.0, 1.0);
    vec3 ambient = 0.3 * color;
    vec3 diffuse = nDotL * lightsIntensity * color;
    // Here I have my debug code to add the projected bands on the image,
    // along the lines of: if(nDotL >= 0.5 && nDotL < 0.75) diffuse += 0.2; ...
    vec3 totalLight = ambient + diffuse;
    out_Color = vec4(totalLight, 1.0);
}
Question:
How to properly transform the normals to get diffuse shading?
Related articles:
How to calculate the normal matrix?
GLSL normals with non-standard projection matrix
OpenGL Diffuse Lighting Shader Bug?
http://www.opengl-tutorial.org/beginners-tutorials/tutorial-3-matrices/
http://www.lighthouse3d.com/tutorials/glsl-12-tutorial/the-normal-matrix/
Mostly, all sources agree that it should be enough to multiply the projection matrix by the transpose of the inverse of the model-view matrix. That is what I think I am doing, but apparently the result is not right.
Lighting calculations should not be performed in clip space (i.e. after the projection matrix has been applied). Leave the projection out of all lighting variables, including light positions etc., and you should be good.
Why is that? Well, lighting is a physical phenomenon that essentially depends on angles and distances. Therefore, to calculate it, you should choose a space that preserves these things. World space and camera space are two examples of angle- and distance-preserving spaces (compared to the physical space). You may of course define them differently, but in most cases they are. Clip space preserves neither of the two, hence the angles and distances you calculate in this space are not the physical ones you need to determine physical lighting.
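A minimal sketch of the question's vertex shader with the projection taken out of the lighting path; modelView and normalMatrix are assumed uniforms, computed on the CPU as view * model and transpose(inverse(view * model)) respectively (the light position would likewise be transformed by view alone, not by mvp):
#version 420 core
uniform mat4 mvp;          // projection * view * model: used for gl_Position only
uniform mat4 modelView;    // assumed uniform: view * model, no projection
uniform mat4 normalMatrix; // assumed uniform: transpose(inverse(view * model)), no projection
in vec3 in_Position;
in vec3 in_Normal;
in vec2 in_Texture;
out Vertex
{
    vec4 pos;
    vec4 normal;
    vec2 texture;
} v;
void main(void)
{
    gl_Position = mvp * vec4(in_Position, 1.0); // clip space stops here
    v.pos = modelView * vec4(in_Position, 1.0); // lighting inputs stay in camera space
    v.normal = normalMatrix * vec4(in_Normal, 0.0);
    v.texture = in_Texture;
}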

How to handle lighting (ambient, diffuse, specular) for point spheres in OpenGL

Initial situation
I want to visualize simulation data in OpenGL.
My data consists of particle positions (x, y, z), where each particle has some properties (like density, temperature, ...) which will be used for coloring. Those (SPH) particles (100k to several million), grouped together, actually represent planets, in case you wonder. I want to render those particles as small 3D spheres and add ambient, diffuse and specular lighting.
Status quo and questions
In MY case: in which coordinate frame do I do the lighting calculations? Which way is the "best" to pass the various components through the pipeline?
I saw that it is common to do it in view space, which is also very intuitive. However: the normals at the different fragment positions are calculated in the fragment shader in clip-space coordinates (see the appended fragment shader). Can I actually convert them "back" into view space to do the lighting calculations in view space for all the fragments? Would there be any advantage compared to doing it in clip space?
It would be easier to get the normals in view space if I used meshes for each sphere, but I think with several million particles this would decrease performance drastically, so it is better to do it with sphere intersection, would you agree?
PS: I don't need a model matrix since all the particles are already in place.
//VERTEX SHADER
#version 330 core
layout (location = 0) in vec3 position;
layout (location = 2) in float density;
uniform float radius;
uniform vec3 lightPos;
uniform vec3 viewPos;
out vec4 lightDir;
out vec4 viewDir;
out vec4 viewPosition;
out vec4 posClip;
out float vertexColor;
// transformation matrices
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;
void main()
{
    lightDir = projection * view * vec4(lightPos - position, 1.0f);
    viewDir = projection * view * vec4(viewPos - position, 1.0f);
    viewPosition = projection * view * vec4(lightPos, 1.0f);
    posClip = projection * view * vec4(position, 1.0f);
    gl_Position = posClip;
    gl_PointSize = radius;
    vertexColor = density;
}
I know that perspective division happens for the gl_Position variable, but does that actually happen to ALL vec4's which are passed from the vertex to the fragment shader? If not, maybe the calculations in the fragment shader are wrong?
And here is the fragment shader, where the normals and the diffuse/specular lighting calculations are done in clip space:
//FRAGMENT SHADER
#version 330 core
in float vertexColor;
in vec4 lightDir;
in vec4 viewDir;
in vec4 posClip;
in vec4 viewPosition;
uniform vec3 lightColor;
vec4 colormap(float x); // returns vec4(r, g, b, a)
out vec4 vFragColor;
void main(void)
{
    // AMBIENT LIGHT
    float ambientStrength = 0.0;
    vec3 ambient = ambientStrength * lightColor;
    // Normal calculation done in clip space (first from texture (gl_PointCoord 0 to 1) coords to NDC (-1 to 1))
    vec3 normal;
    normal.xy = gl_PointCoord * 2.0 - vec2(1.0); // transform from 0->1 point primitive coords to NDC -1->1
    float mag = dot(normal.xy, normal.xy); // squared length; comparing it against 1.0 avoids the sqrt
    if (mag > 1.0) // discard fragments outside the sphere
        discard;
    normal.z = sqrt(1.0 - mag); // because x^2 + y^2 + z^2 = 1
    // DIFFUSE LIGHT
    float diff = max(0.0, dot(vec3(lightDir), normal));
    vec3 diffuse = diff * lightColor;
    // SPECULAR LIGHT
    float specularStrength = 0.1;
    vec3 viewDir = normalize(vec3(viewPosition) - vec3(posClip));
    vec3 reflectDir = reflect(-vec3(lightDir), normal);
    float shininess = 64.0;
    float spec = pow(max(dot(vec3(viewDir), vec3(reflectDir)), 0.0), shininess);
    vec3 specular = specularStrength * spec * lightColor;
    vFragColor = colormap(vertexColor / 8.0) * vec4(ambient + diffuse + specular, 1.0);
}
Now this actually "kind of" works, but I have the feeling that the sides of the sphere which do NOT face the light source are also being illuminated, which shouldn't happen. How can I fix this?
Some weird effect: at this moment the light source is actually BEHIND the left planet (it just peeks out a little bit at the top left), but there are still diffuse and specular effects going on. This side should actually be pretty dark! =(
Also, at this moment I get a glError 1282 (GL_INVALID_OPERATION) around the fragment shader, and I don't know where it comes from since the shader program actually compiles and runs. Any suggestions? :)
The things that you are drawing aren't actually spheres. They just look like them from afar. This is absolutely ok if you are fine with that. If you need geometrically correct spheres (with correct sizes and with a correct projection), you need to do proper raycasting. This seems to be a comprehensive guide on this topic.
1. What coordinate system?
In the end, it is up to you. The coordinate system just needs to fulfill some requirements. It must be angle-preserving (because lighting is all about angles). And if you need distance-based attenuation, it should also be distance-preserving. The world and the view coordinate systems usually fulfill these requirements. Clip space is not suited for lighting calculations as neither angles nor distances are preserved. Furthermore, gl_PointCoord is in none of the usual coordinate systems. It is its own coordinate system and you should only use it together with other coordinate systems if you know their relation.
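As an illustration of "knowing their relation": point sprites are screen-aligned, so a sphere normal reconstructed from gl_PointCoord can be treated as an approximate view-space normal, provided the light direction is also supplied in view space. A sketch (lightDirView is an assumed varying, not from the question's code):
#version 330 core
in vec3 lightDirView; // assumed: light direction interpolated in view space
out vec4 color;
void main(void)
{
    // Note: gl_PointCoord's y origin is the upper-left by default
    // (GL_POINT_SPRITE_COORD_ORIGIN), so y may need flipping.
    vec3 normal;
    normal.xy = gl_PointCoord * 2.0 - 1.0;
    float r2 = dot(normal.xy, normal.xy);
    if (r2 > 1.0)
        discard;                // outside the sphere silhouette
    normal.z = sqrt(1.0 - r2);  // view space: +z points towards the camera
    float diff = max(dot(normal, normalize(lightDirView)), 0.0);
    color = vec4(vec3(diff), 1.0);
}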
2. Meshes or what?
Meshes are absolutely not suited to rendering spheres. As mentioned above, raycasting or some screen-space approximation are better choices. Here is an example shader that I used in my projects:
#version 330
out vec4 result;
in fData
{
    vec4 toPixel; //fragment coordinate in particle coordinates
    vec4 cam;     //camera position in particle coordinates
    vec4 color;   //sphere color
    float radius; //sphere radius
} frag;
uniform mat4 p; //projection matrix
void main(void)
{
    vec3 v = frag.toPixel.xyz - frag.cam.xyz;
    vec3 e = frag.cam.xyz;
    float ev = dot(e, v);
    float vv = dot(v, v);
    float ee = dot(e, e);
    float rr = frag.radius * frag.radius;
    float radicand = ev * ev - vv * (ee - rr);
    if(radicand < 0.0)
        discard;
    float rt = sqrt(radicand);
    float lambda = max(0.0, (-ev - rt) / vv); //first intersection on the ray
    float lambda2 = (-ev + rt) / vv;          //second intersection on the ray
    if(lambda2 < lambda) //if the first intersection is behind the camera
        discard;
    vec3 hit = lambda * v; //intersection point
    vec3 normal = (frag.cam.xyz + hit) / frag.radius;
    vec4 proj = p * vec4(hit, 1.0); //intersection point in clip space
    gl_FragDepth = ((gl_DepthRange.diff * proj.z / proj.w) + gl_DepthRange.near + gl_DepthRange.far) / 2.0;
    vec3 vNormalized = -normalize(v);
    float nDotL = dot(vNormalized, normal);
    vec3 c = frag.color.rgb * nDotL + vec3(0.5, 0.5, 0.5) * pow(nDotL, 120.0);
    result = vec4(c, frag.color.a);
}
3. Perspective division
Perspective division is not applied to your attributes. The GPU performs perspective division on the data you pass via gl_Position on the way to transforming it to screen space. But you will never actually see this perspective-divided position unless you do the division yourself.
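To illustrate with the question's posClip varying: the interpolated value arrives still in clip space, and the division by w is explicit if normalized device coordinates are wanted (a sketch):
#version 330 core
in vec4 posClip; // user-defined varying: interpolated, but NOT divided by w automatically
out vec4 vFragColor;
void main(void)
{
    vec3 ndc = posClip.xyz / posClip.w;      // manual perspective division to reach NDC (-1 to 1)
    vFragColor = vec4(ndc * 0.5 + 0.5, 1.0); // visualize it, just to complete the sketch
}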
4. Light in the dark
This might be the result of mixing different coordinate systems or of doing the lighting calculations in clip space. By the way, the specular part is usually not multiplied by the material color. It is light that gets reflected directly at the surface; it does not penetrate the surface (which would absorb some colors depending on the material). That's why those highlights are usually white (or whatever light color you have), even on black objects.
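Applied to the question's final line, that would mean tinting only the ambient and diffuse terms by the material color; a sketch reusing the question's names, with the lighting terms assumed to be computed exactly as before:
#version 330 core
in float vertexColor;
vec4 colormap(float x); // as in the question
out vec4 vFragColor;
void main(void)
{
    // ambient, diffuse and specular computed as in the question (placeholders here):
    vec3 ambient = vec3(0.0);
    vec3 diffuse = vec3(0.5);
    vec3 specular = vec3(0.1);
    vec4 materialColor = colormap(vertexColor / 8.0);
    // Only ambient and diffuse are tinted by the material; the specular
    // highlight keeps the light's own color, even on dark materials.
    vFragColor = vec4(materialColor.rgb * (ambient + diffuse) + specular, materialColor.a);
}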

opengl - spot light moves when camera rotates when using a shader

I implemented a simple shader for the lighting; it kind of works, but the light seems to move when the camera rotates (and only when it rotates).
I'm experimenting with a spotlight; this is how it looks (it's the spot in the center):
If I now rotate the camera, the spot moves around; for example, here I looked down (I didn't move at all, just looked down) and it appeared at my feet:
I've looked it up and seen that this is a common mistake when mixing reference systems in the shader and/or when setting the light's position before moving the camera.
The thing is, I'm pretty sure I'm doing neither of these two things, but apparently I'm wrong; it's just that I can't find the bug.
Here's the shader:
Vertex Shader
varying vec3 vertexNormal;
varying vec3 lightDirection;
void main()
{
    vertexNormal = gl_NormalMatrix * gl_Normal;
    lightDirection = vec3(gl_LightSource[0].position.xyz - (gl_ModelViewMatrix * gl_Vertex).xyz);
    gl_Position = ftransform();
}
Fragment Shader
uniform vec3 ambient;
uniform vec3 diffuse;
uniform vec3 specular;
uniform float shininess;
varying vec3 vertexNormal;
varying vec3 lightDirection;
void main()
{
    vec3 color = vec3(0.0, 0.0, 0.0);
    vec3 lightDirNorm;
    vec3 eyeVector;
    vec3 half_vector;
    float diffuseFactor;
    float specularFactor;
    float attenuation;
    float lightDistance;
    vec3 normalDirection = normalize(vertexNormal);
    lightDirNorm = normalize(lightDirection);
    eyeVector = vec3(0.0, 0.0, 1.0);
    half_vector = normalize(lightDirNorm + eyeVector);
    diffuseFactor = max(0.0, dot(normalDirection, lightDirNorm));
    specularFactor = max(0.0, dot(normalDirection, half_vector));
    specularFactor = pow(specularFactor, shininess);
    color += ambient * gl_LightSource[0].ambient.rgb;
    color += diffuseFactor * diffuse * gl_LightSource[0].diffuse.rgb;
    color += specularFactor * specular * gl_LightSource[0].specular.rgb;
    lightDistance = length(lightDirection);
    float constantAttenuation = 1.0;
    float linearAttenuation = (0.02 / SCALE_FACTOR) * lightDistance;
    float quadraticAttenuation = (0.0 / SCALE_FACTOR) * lightDistance * lightDistance;
    attenuation = 1.0 / (constantAttenuation + linearAttenuation + quadraticAttenuation);
    // If it's a spotlight
    if(gl_LightSource[0].spotCutoff <= 90.0)
    {
        float spotEffect = dot(normalize(gl_LightSource[0].spotDirection), normalize(-lightDirection));
        if (spotEffect > gl_LightSource[0].spotCosCutoff)
        {
            spotEffect = pow(spotEffect, gl_LightSource[0].spotExponent);
            attenuation = spotEffect / (constantAttenuation + linearAttenuation + quadraticAttenuation);
        }
        else
            attenuation = 0.0;
    }
    // Multiply the color by the attenuation factor
    color = color * attenuation;
    gl_FragColor = vec4(color, 1.0);
}
Now, I can't show you the code where I render things, because it's a custom language that integrates OpenGL and is designed for creating 3D applications (it wouldn't help to show it); but what I do is something like this:
SetupLights();
UpdateCamera();
RenderStuff();
Where:
SetupLights contains the actual OpenGL calls that set up the lights and their positions;
UpdateCamera updates the camera's position using the built-in classes of the language; I don't have much power here;
RenderStuff calls the built-in functions of the language to draw the scene; I don't have much power here either.
So, either I'm doing something wrong in the shader or there's something in the language that "behind the scenes" breaks things.
Can you point me in the right direction?
You wrote:
the light's position is already in world coordinates, and that is where I'm doing the computations
However, since you're applying gl_ModelViewMatrix to your vertex and gl_NormalMatrix to your normal, these values are probably in view space, which might cause the moving light.
As an aside, your eye vector looks like it should be in view coordinates; however, view space is a right-handed coordinate system, so "forward" points along the negative z-axis. Also, your specular computation will likely be off, since you're using the same eye vector for all fragments; it should really be computed per fragment, pointing from the fragment's position towards the camera.
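A minimal sketch of keeping everything consistently in view space; lightPosWorld and viewMatrix are assumed uniforms (they are not part of the question's code), with the light given in world space:
uniform mat4 viewMatrix;    // assumed: the camera transform only
uniform vec3 lightPosWorld; // assumed: light position in world space
varying vec3 vertexNormal;
varying vec3 lightDirection;
void main()
{
    vertexNormal = gl_NormalMatrix * gl_Normal;
    // Bring the light into the same (view) space as the vertex, so rotating
    // the camera no longer makes the light appear to move.
    vec3 lightPosView = (viewMatrix * vec4(lightPosWorld, 1.0)).xyz;
    lightDirection = lightPosView - (gl_ModelViewMatrix * gl_Vertex).xyz;
    gl_Position = ftransform();
}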

Strange specular results from GLSL phong shader

I'm trying to implement Phong shading in GLSL but am having some issues with the specular component.
The green light is the specular component. The light (a point light) travels in a circle above the plane. The specular highlight always points inward toward the Y axis about which the light rotates, and fans out toward the diffuse reflection, as seen in the image. It doesn't appear to be affected at all by the positioning of the camera, and I'm not sure where I'm going wrong.
Vertex shader code:
#version 330 core
/*
 * Phong Shading with Point Light (Quadratic Attenuation)
 */
//Input vertex data
layout(location = 0) in vec3 vertexPosition_modelSpace;
layout(location = 1) in vec2 vertexUVs;
layout(location = 2) in vec3 vertexNormal_modelSpace;
//Output data; will be interpolated for each fragment
out vec2 uvCoords;
out vec3 vertexPosition_cameraSpace;
out vec3 vertexNormal_cameraSpace;
//Uniforms
uniform mat4 mvMatrix;
uniform mat4 mvpMatrix;
uniform mat3 normalTransformMatrix;
void main()
{
    vec3 normal = normalize(vertexNormal_modelSpace);
    //Set vertices in clip space
    gl_Position = mvpMatrix * vec4(vertexPosition_modelSpace, 1);
    //Set output for UVs
    uvCoords = vertexUVs;
    //Convert vertex and normal into eye space
    vertexPosition_cameraSpace = mat3(mvMatrix) * vertexPosition_modelSpace;
    vertexNormal_cameraSpace = normalize(normalTransformMatrix * normal);
}
Fragment Shader Code:
#version 330 core
in vec2 uvCoords;
in vec3 vertexPosition_cameraSpace;
in vec3 vertexNormal_cameraSpace;
//out
out vec4 fragColor;
//uniforms
uniform sampler2D diffuseTex;
uniform vec3 lightPosition_cameraSpace;
void main()
{
    const float materialAmbient = 0.025; //a touch of ambient
    const float materialDiffuse = 0.65;
    const float materialSpec = 0.35;
    const float lightPower = 2.0;
    const float specExponent = 2.0;
    //--------------Set colors and determine vectors needed for shading-----------------
    //Reflection colors - NOTE: diffuse and ambient reflections will use the texture color
    const vec3 colorSpec = vec3(0, 1, 0); //Green spec color
    vec3 diffuseColor = texture(diffuseTex, uvCoords).rgb; //Get color from the texture at fragment
    const vec3 lightColor = vec3(1, 1, 1); //White light
    //Re-normalize normal vectors: after interpolation they may not be unit length any longer
    vec3 normVertexNormal_cameraSpace = normalize(vertexNormal_cameraSpace);
    //Set camera vec
    vec3 viewVec_cameraSpace = normalize(-vertexPosition_cameraSpace); //Since it's view space, the camera is at the origin
    //Set light vec
    vec3 lightVec_cameraSpace = normalize(lightPosition_cameraSpace - vertexPosition_cameraSpace);
    //Set reflect vec
    vec3 reflectVec_cameraSpace = normalize(reflect(-lightVec_cameraSpace, normVertexNormal_cameraSpace)); //reflect requires the incident vec, from light to vertex
    //----------------Find intensity of each component---------------------
    //Determine light intensity
    float distance = length(lightPosition_cameraSpace - vertexPosition_cameraSpace);
    float lightAttenuation = 1.0 / ((distance > 0.0) ? (distance * distance) : 1.0); //Quadratic
    vec3 lightIntensity = lightPower * lightAttenuation * lightColor;
    //Determine ambient component
    vec3 ambientComp = materialAmbient * diffuseColor * lightIntensity;
    //Determine diffuse component
    float lightDotNormal = max(dot(lightVec_cameraSpace, normVertexNormal_cameraSpace), 0.0);
    vec3 diffuseComp = materialDiffuse * diffuseColor * lightDotNormal * lightIntensity;
    vec3 specComp = vec3(0, 0, 0);
    //Determine spec component
    if(lightDotNormal > 0.0)
    {
        float reflectDotView = max(dot(reflectVec_cameraSpace, viewVec_cameraSpace), 0.0);
        specComp = materialSpec * colorSpec * pow(reflectDotView, specExponent) * lightIntensity;
    }
    //Add ambient + diffuse + spec
    vec3 phongFragRGB = ambientComp + diffuseComp + specComp;
    //----------------------Putting it together-----------------------
    //Out frag color
    fragColor = vec4(phongFragRGB, 1);
}
Just noting that the normalTransformMatrix seen in the Vertex shader is the inverse-transpose of the model-view matrix.
I am setting a vector from the vertex position to the light, to the camera, and the reflect vector, all in camera space. For the diffuse calculation I am taking the dot product of the light vector and the normal vector, and for the specular component I am taking the dot product of the reflection vector and the view vector. Perhaps there is some fundamental misunderstanding that I have with the algorithm?
I thought at first that the problem could be that I wasn't normalizing the normals entering the fragment shader after interpolation, but adding a line to normalize didn't affect the image. I'm not sure where to look.
I know that there a lot of phong shading questions on the site, but everyone seems to have a problem that is a bit different. If anyone can see where I am going wrong, please let me know. Any help is appreciated.
EDIT: Okay, it's working now! Just as jozxyqk suggested below, I needed to do a mat4 * vec4 operation for my vertex position or lose the translation information. When I first made the change I was getting strange results, until I realized that I was making the same mistake in my OpenGL code for lightPosition_cameraSpace before I passed it to the shader (the mistake being that I was casting the view matrix down to a mat3 for the calculation instead of setting the light position vector as a vec4). Once I edited those lines, the shader worked properly. Thanks for the help, jozxyqk!
I can see two parts which don't look right.
"vertexPosition_cameraSpace = mat3(mvMatrix) * vertexPosition_modelSpace" should be a mat4/vec4(x,y,z,1) multiply, otherwise it ignores the translation part of the modelview matrix.
2. distance uses the light position relative to the camera and not the vertex. Use lightVec_cameraSpace instead. (edit: missed the duplicated calculation)
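A sketch of point 1 applied to the question's vertex shader (only this line changes; the w = 1 component carries the translation through):
// Full mat4 * vec4 multiply, so the translation part of the modelview matrix is kept:
vertexPosition_cameraSpace = (mvMatrix * vec4(vertexPosition_modelSpace, 1.0)).xyz;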

Normal mapping and lighting gone wrong, not displaying correctly

I'm working on an implementation of normal mapping, calculating the tangent vectors via the ASSIMP library.
The normal mapping seems to work perfectly on objects that have a model matrix close to the identity matrix. As soon as I start translating and scaling, however, my lighting seems off. As you can see in the picture, the normal mapping works perfectly on the container cube, but the lighting fails on the large floor (the direction of the specular light should be towards the player, not towards the container).
I get the feeling it has something to do with the position of the light (currently traversing from x = -10 to x = 10 over time) not being properly included in the calculations once I start changing the model matrix (via translations/scaling). I'm posting all the relevant code and hope you can spot something I'm missing, since I've been staring at my code for days.
Vertex shader
#version 330
layout(location = 0) in vec3 position;
layout(location = 1) in vec3 normal;
layout(location = 2) in vec3 tangent;
layout(location = 3) in vec3 color;
layout(location = 4) in vec2 texCoord;
// fragment pass through
out vec3 Position;
out vec3 Normal;
out vec3 Tangent;
out vec3 Color;
out vec2 TexCoord;
out vec3 TangentSurface2Light;
out vec3 TangentSurface2View;
uniform vec3 lightPos;
// vertex transformation
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;
void main()
{
    mat3 normalMatrix = transpose(mat3(inverse(view * model)));
    Position = vec3((view * model) * vec4(position, 1.0));
    Normal = normalMatrix * normal;
    Tangent = tangent;
    Color = color;
    TexCoord = texCoord;
    gl_Position = projection * view * model * vec4(position, 1.0);
    vec3 light = vec3(view * vec4(lightPos, 1.0));
    vec3 n = normalize(normalMatrix * normal);
    vec3 t = normalize(normalMatrix * tangent);
    vec3 b = cross(n, t);
    mat3 mat = mat3(t.x, b.x, n.x, t.y, b.y, n.y, t.z, b.z, n.z);
    vec3 vector = normalize(light - Position);
    TangentSurface2Light = mat * vector;
    vector = normalize(-Position);
    TangentSurface2View = mat * vector;
}
Fragment shader
#version 330
in vec3 Position;
in vec3 Normal;
in vec3 Tangent;
in vec3 Color;
in vec2 TexCoord;
in vec3 TangentSurface2Light;
in vec3 TangentSurface2View;
out vec4 outColor;
uniform vec3 lightPos;
uniform mat4 view;
uniform sampler2D texture0;
uniform sampler2D texture_normal; // normal
uniform float repeatFactor = 1;
void main()
{
    vec4 texColor = texture(texture0, TexCoord * repeatFactor);
    vec3 light = vec3(view * vec4(lightPos, 1.0));
    float dist = length(light - Position);
    float att = 1.0 / (1.0 + 0.01 * dist + 0.001 * dist * dist);
    // Ambient
    vec4 ambient = vec4(0.2);
    // Diffuse
    vec3 surface2light = normalize(TangentSurface2Light);
    vec3 norm = normalize(texture(texture_normal, TexCoord * repeatFactor).xyz * 2.0 - 1.0);
    float contribution = max(dot(norm, surface2light), 0.0);
    vec4 diffuse = contribution * vec4(0.8);
    // Specular
    vec3 surf2view = normalize(TangentSurface2View);
    vec3 reflection = reflect(-surface2light, norm); // reflection vector
    float specContribution = pow(max(dot(surf2view, reflection), 0.0), 32.0);
    vec4 specular = vec4(0.6) * specContribution;
    outColor = (ambient + (diffuse * att) + (specular * pow(att, 3.0))) * texColor;
}
OpenGL Drawing Code
void Render()
{
    ...
    glm::mat4 view, projection; // Model will be done via MatrixStack
    view = glm::lookAt(position, position + direction, up); // eye position, look-at target, up vector
    projection = glm::perspective(45.0f, (float)width/(float)height, 0.1f, 1000.0f);
    glUniformMatrix4fv(glGetUniformLocation(basicShader.shaderProgram, "view"), 1, GL_FALSE, glm::value_ptr(view));
    glUniformMatrix4fv(glGetUniformLocation(basicShader.shaderProgram, "projection"), 1, GL_FALSE, glm::value_ptr(projection));
    // Lighting
    lightPos.x = 0.0 + sin(time / 125) * 10;
    glUniform3f(glGetUniformLocation(basicShader.shaderProgram, "lightPos"), lightPos.x, lightPos.y, lightPos.z);
    // Objects (use bump mapping on this cube)
    bumpShader.Use();
    glUniformMatrix4fv(glGetUniformLocation(bumpShader.shaderProgram, "view"), 1, GL_FALSE, glm::value_ptr(view));
    glUniformMatrix4fv(glGetUniformLocation(bumpShader.shaderProgram, "projection"), 1, GL_FALSE, glm::value_ptr(projection));
    glUniform3f(glGetUniformLocation(bumpShader.shaderProgram, "lightPos"), lightPos.x, lightPos.y, lightPos.z);
    MatrixStack::LoadIdentity();
    MatrixStack::Scale(2);
    MatrixStack::ToShader(glGetUniformLocation(bumpShader.shaderProgram, "model"));
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, resources.GetTexture("container"));
    glUniform1i(glGetUniformLocation(bumpShader.shaderProgram, "texture0"), 0); // bind the sampler declared as "texture0" in the fragment shader to unit 0
    glActiveTexture(GL_TEXTURE1); // Normal map
    glBindTexture(GL_TEXTURE_2D, resources.GetTexture("container_normal"));
    glUniform1i(glGetUniformLocation(bumpShader.shaderProgram, "texture_normal"), 1); // bind the sampler declared as "texture_normal" to unit 1
    glUniform1f(glGetUniformLocation(bumpShader.shaderProgram, "repeatFactor"), 1);
    cubeNormal.Draw();
    MatrixStack::LoadIdentity();
    MatrixStack::Translate(glm::vec3(0.0f, -22.0f, 0.0f));
    MatrixStack::Scale(glm::vec3(200.0f, 20.0f, 200.0f));
    MatrixStack::ToShader(glGetUniformLocation(bumpShader.shaderProgram, "model"));
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, resources.GetTexture("floor"));
    glActiveTexture(GL_TEXTURE1); // Normal map
    glBindTexture(GL_TEXTURE_2D, resources.GetTexture("floor_normal"));
    glUniform1f(glGetUniformLocation(bumpShader.shaderProgram, "repeatFactor"), 100);
    cubeNormal.Draw();
    MatrixStack::LoadIdentity();
    glActiveTexture(GL_TEXTURE0);
    ...
}
EDIT
I now loaded my objects using the ASSIMP library with the 'aiProcess_CalcTangentSpace' flag enabled and changed my shaders accordingly to adapt to the new changes. Since ASSIMP now automatically calculates the correct tangent vectors, I should have valid tangents and my problem should be solved (as noted by Nicol Bolas), but I still have the same issue with the specular lighting acting strangely and the diffuse lighting not really showing up. I guess there is still something else that is not working correctly. I unmarked your answer as the correct answer, Nicol Bolas (for now), and updated my code accordingly, since there is still something I'm missing.
It probably has something to do with translation. As soon as I add a translation (-22.0f in the y direction) to the model matrix, it reacts with strange lighting. As long as the floor (which is actually a cube) has no translation, the lighting looks fine.
calculating the tangent vectors in the vertex shader
Well, there's your problem. That's not possible for an arbitrary surface.
The tangent and bitangent are not arbitrary vectors that are perpendicular to one another. They are model-space direction vectors that point in the direction of the texture coordinates. The tangent points in the direction of the S texture coordinate, and the bitangent points in the direction of the T texture coordinate (or U and V for the tex coords, if you prefer).
This effectively computes the orientation of the texture relative to each vertex on the surface. You need this orientation, because the way the texture is mapped to the surface matters when you want to make sense of a tangent-space vector.
Remember: tangent-space is the space perpendicular to a surface. But you need to know how that surface is mapped to the object in order to know where "up" is, for example. Take a square surface. You could map a texture so that the +Y part of the square is oriented along the +T direction of the texture. Or it could be along the +X of the square. You could even map it so that the texture is distorted, or rotated at an arbitrary angle.
The tangent and bitangent vectors are intended to correct for this mapping. They point in the S and T directions in model space. So, combined with the normal, they form a transformation matrix to transform from tangent-space into whatever space the 3 vectors are in (you generally transform the NBT to camera space or whatever space you use for lighting before using them).
You cannot compute them by just taking the normal and crossing it with some arbitrary vector. That produces a vector perpendicular to the normal, but not the right one.
In order to correctly compute the tangent/bitangent, you need access to more than one vertex. You need to be able to see how the texture coordinates change over the surface of the mesh, which is how you compute the S and T directions relative to the mesh.
Vertex shaders cannot access more than one vertex. Geometry shaders can't (generally) access enough vertices to do this either. Compute the tangent/bitangent off-line on the CPU.
mat3 mat = mat3(t.x, b.x, n.x, t.y, b.y, n.y, t.z, b.z, n.z);
is wrong. In order to use the TBN matrix correctly, you must transpose it, like so:
mat3 mat = transpose(mat3(t.x, b.x, n.x, t.y, b.y, n.y, t.z, b.z, n.z));
Then use it to transform your light and view vectors into tangent space. Alternatively (and less efficiently), pass the untransposed TBN matrix to the fragment shader and use it to transform the sampled normal into view space. It's an easy thing to miss, but very important. See http://www.opengl-tutorial.org/intermediate-tutorials/tutorial-13-normal-mapping/ for more info.
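For reference, a sketch of how the two forms relate (t, b, n being the normalized view-space tangent, bitangent and normal from the question's vertex shader; the math here is general, not specific to either construction):
mat3 tbn = mat3(t, b, n);            // columns are t, b, n: maps tangent space -> view space
mat3 viewToTangent = transpose(tbn); // t, b and n are orthonormal, so the transpose is the
                                     // inverse: maps view space -> tangent space
TangentSurface2Light = viewToTangent * normalize(light - Position);
TangentSurface2View  = viewToTangent * normalize(-Position);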
On a side note, a minor optimisation you can do for your vertex shader is to calculate the normal matrix on the CPU, once per mesh, as it will be the same for all vertices of the mesh; this avoids unnecessary per-vertex calculations.
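A sketch of that optimisation, assuming a normalMatrix uniform that is computed on the CPU as transpose(inverse(mat3(view * model))) and uploaded once per mesh:
#version 330
layout(location = 0) in vec3 position;
layout(location = 1) in vec3 normal;
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;
uniform mat3 normalMatrix; // assumed uniform: transpose(inverse(mat3(view * model))), set once per mesh
out vec3 Normal;
void main()
{
    Normal = normalize(normalMatrix * normal); // no per-vertex inverse() call
    gl_Position = projection * view * model * vec4(position, 1.0);
}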