Normal mapping and translation disrupt my lighting - C++

I have a normal mapping issue. I have a texture and a normal texture on each model, loaded via the ASSIMP library. I am calculating the tangent vectors for each object with the help of the ASSIMP library, so these should be fine. The objects work perfectly with normal mapping, but as soon as I start translating one of the objects (thus influencing the model matrix with a translation) the lighting fails. As you can see in the image, the floor (which is translated down the y axis) seems to lose most of its diffuse lighting, and its specular lighting points in the wrong direction (it should be between the light bulb and the player position).
It might have something to do with the normal matrix (although translations should be discarded there), or maybe a wrong matrix is being used in the shaders. I am out of ideas and was hoping you could shed some light on the issue.
Vertex shader:
#version 330
layout(location = 0) in vec3 position;
layout(location = 1) in vec3 normal;
layout(location = 2) in vec3 tangent;
layout(location = 3) in vec3 color;
layout(location = 4) in vec2 texCoord;
// fragment pass through
out vec3 Position;
out vec3 Normal;
out vec3 Tangent;
out vec3 Color;
out vec2 TexCoord;
out vec3 TangentSurface2Light;
out vec3 TangentSurface2View;
uniform vec3 lightPos;
uniform vec3 playerPos;
// vertex transformation
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;
void main()
{
mat3 normalMatrix = mat3(transpose(inverse(model)));
Position = vec3(model * vec4(position, 1.0));
Normal = normalMatrix * normal;
Tangent = tangent;
Color = color;
TexCoord = texCoord;
gl_Position = projection * view * model * vec4(position, 1.0);
// Build the tangent-space (TBN) basis and move the light/view vectors into tangent space for the fragment shader.
vec3 light = lightPos;
vec3 n = normalize(normalMatrix * normal);
vec3 t = normalize(normalMatrix * tangent);
vec3 b = cross(n, t);
// create TBN matrix (rows are t, b, n: transforms from world space to tangent space)
mat3 mat = mat3(t.x, b.x, n.x, t.y, b.y, n.y, t.z, b.z, n.z);
vec3 vector = normalize(light - Position);
TangentSurface2Light = mat * vector;
vector = normalize(playerPos - Position);
TangentSurface2View = mat * vector;
}
Fragment Shader
#version 330
in vec3 Position;
in vec3 Normal;
in vec3 Tangent;
in vec3 Color;
in vec2 TexCoord;
in vec3 TangentSurface2Light;
in vec3 TangentSurface2View;
out vec4 outColor;
uniform vec3 lightPos;
uniform vec3 playerPos;
uniform mat4 view;
uniform sampler2D texture0;
uniform sampler2D texture_normal; // normal
uniform float repeatFactor = 1;
void main()
{
vec4 texColor = texture(texture0, TexCoord * repeatFactor);
vec4 matColor = vec4(Color, 1.0);
vec3 light = vec3(vec4(lightPos, 1.0));
float dist = length(light - Position);
// float att = 1.0 / (1.0 + 0.01 * dist + 0.001 * dist * dist);
float att = 1.0;
// Ambient
vec4 ambient = vec4(0.2);
// Diffuse
// vec3 surface2light = normalize(light - Position);
vec3 surface2light = normalize(TangentSurface2Light);
// vec3 norm = normalize(Normal);
vec3 norm = normalize(texture(texture_normal, TexCoord * repeatFactor).xyz * 2.0 - 1.0);
float contribution = max(dot(norm, surface2light), 0.0);
vec4 diffuse = contribution * vec4(0.6);
// Specular
// vec3 surf2view = normalize(-Position); // Player is always at position 0
vec3 surf2view = normalize(TangentSurface2View);
vec3 reflection = reflect(-surface2light, norm); // reflection vector
float specContribution = pow(max(dot(surf2view, reflection), 0.0), 32);
vec4 specular = vec4(1.0) * specContribution;
outColor = (ambient + (diffuse * att)+ (specular * pow(att, 3))) * texColor;
// outColor = vec4(Color, 1.0) * texture(texture0, TexCoord);
}
EDIT
Edited the shader code to calculate everything in world space instead of ping-ponging between world and camera space (easier to understand and less error-prone).

You are making strange manipulations with matrices. In the vertex shader you transform the normal (which is in model space) by the inverse view-world matrix. That doesn't make sense. It may be easier to do the calculations in world space. I've got some working sample code, but it uses slightly different naming.
Vertex shader:
void main_vs(in A2V input, out V2P output)
{
output.position = mul(input.position, _worldViewProjection);
output.normal = input.normal;
output.binormal = input.binormal;
output.tangent = input.tangent;
output.positionWorld = mul(input.position, _world);
output.tex = input.tex;
}
Here we transform the position to projection (screen) space; the TBN vectors are left in model space and will be used later. We also output the world-space position for the lighting evaluation.
Pixel shader:
void main_ps(in V2P input, out float4 output : SV_Target)
{
float3x3 tbn = float3x3(input.tangent, -input.binormal, input.normal);
//extract & decode normal:
float3 texNormal = _normalTexture.Sample(_normalSampler, input.tex).xyz * 2 - 1;
//now transform TBN-space texNormal to world space:
float3 normal = mul(texNormal, tbn);
normal = normalize(mul(normal, _world));
float3 lightDirection = -_lightPosition.xyz;//directional
float3 viewDirection = normalize(input.positionWorld - _camera);
float3 reflectedLight = reflect(lightDirection, normal);
float diffuseIntensity = dot(normal, lightDirection);
float specularIntensity = max(0, dot(reflectedLight, viewDirection)*1.3);
output = ((_ambient + diffuseIntensity * _diffuse) * _texture.Sample(_sampler, input.tex)
+ pow(specularIntensity, 7) * float4(1,1,1,1)) * _lightColor;
}
Here I use a directional light; for an omni (point) light you should do something like
float3 lightDirection = normalize(input.positionWorld - _lightPosition.xyz);//omni
Here we start with the normal from the texture, which is in TBN (tangent) space. Then we apply the TBN matrix to transform it to model space, and then the world matrix to transform it to world space, where we already have the light position, eye position, etc.
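In the question's GLSL terms that chain would look roughly like this (a sketch, not tested; it assumes the per-vertex tangent, bitangent and normal are passed to the fragment shader untransformed, i.e. still in model space, and that model is the question's world matrix):
vec3 texNormal = texture(texture_normal, TexCoord).xyz * 2.0 - 1.0; // decode normal from [0,1] to [-1,1]
mat3 tbn = mat3(t, b, n); // columns: tangent, bitangent, normal (tangent space -> model space)
vec3 normalWorld = normalize(mat3(model) * (tbn * texNormal)); // model space -> world space (use the inverse-transpose if model has non-uniform scale)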
Some other shader code, omitted above (DX11, but it's easy to translate):
cbuffer ViewTranforms
{
row_major matrix _worldViewProjection;
row_major matrix _world;
float3 _camera;
};
cbuffer BumpData
{
float4 _ambient;
float4 _diffuse;
};
cbuffer Textures
{
texture2D _texture;
SamplerState _sampler;
texture2D _normalTexture;
SamplerState _normalSampler;
};
cbuffer Light
{
float4 _lightPosition;
float4 _lightColor;
};
//------------------------------------
struct A2V
{
float4 position : POSITION;
float3 normal : NORMAL;
float3 binormal : BINORMAL;
float3 tangent : TANGENT;
float2 tex : TEXCOORD;
};
struct V2P
{
float4 position : SV_POSITION;
float3 normal : NORMAL;
float3 binormal : BINORMAL;
float3 tangent : TANGENT;
float3 positionWorld : NORMAL1;
float2 tex : TEXCOORD;
};
Also, here I use a precomputed binormal; you can keep your code that computes it (via cross(normal, tangent)).
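Translated back to the question's GLSL vertex shader, the world-space variant would roughly mean outputting the world-space position and a world-space TBN basis, and dropping the per-vertex tangent-space light/view vectors. A sketch, untested, reusing the question's uniform and varying names:
Position = vec3(model * vec4(position, 1.0)); // world-space position for lighting
Normal = normalize(normalMatrix * normal); // world-space normal
Tangent = normalize(normalMatrix * tangent); // world-space tangent
// the bitangent can be rebuilt in the fragment shader with cross(Normal, Tangent)
gl_Position = projection * view * model * vec4(position, 1.0);
The fragment shader then builds the TBN matrix from these vectors and transforms the sampled normal into world space, as in the snippet above, instead of transforming the light and view vectors into tangent space.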
Hope this helps.

Related

Modern GLSL (OpenGL 3+): Implementing the Phong effect correctly

I am implementing a basic Phong lighting GLSL shader. I have looked some things up on the internet and found that the Phong effect is created by adding an ambient, a diffuse, and a specular term to the object (see the image below, from Tom Dalling's site). The problem is that I have seen a lot of examples and none of them really suits my GLSL set-up. Can any of you give me a code example of the correct way to implement the Phong effect that fits my GLSL set-up?
PS: This question could be put on hold for being opinion-based; in my mind it is not, because I would like to know the most effective and correct way of implementing it.
Here is my vertex shader :
#version 120
uniform mat4 modelView;
uniform mat4 MVP;
uniform float time;
attribute vec3 position;
attribute vec2 texCoord;
attribute vec3 normal;
varying vec3 position0;
varying vec2 texCoord0;
varying vec3 normal0;
varying mat4 modelView0;
void main()
{
//Updating varyings...
position0 = position;
texCoord0 = texCoord;
normal0 = (MVP * vec4(normal, 0.0)).xyz;
modelView0 = modelView;
//set position
gl_Position = MVP * vec4(position, 1.0);
}
and my fragment shader :
#version 120
varying vec3 position0;
varying vec2 texCoord0;
varying vec3 normal0;
varying mat4 modelView0;
uniform sampler2D diffuse;
void main()
{
vec4 surfaceColor = texture2D(diffuse, texCoord0);
gl_FragColor = (texture2D(diffuse, texCoord0))
* clamp(dot(-vec3(0.0, 0.5, 0.5), normal0), 0, 1.0);
}
try this:
void main()
{
vec4 texread = texture2D(diffuse, texCoord0);
vec3 normal = normalize(normal0);
vec3 material_kd = vec3(1.0,1.0,1.0);
vec3 material_ks = vec3(1.0,1.0,1.0);
vec3 material_ka = vec3(0.2,0.2,0.2);
vec3 material_ke = vec3(0.0,0.0,0.0);
float material_shininess = 60;
vec3 lightpos = vec3(0.0,10.0,5.0);
vec3 lightcolor = vec3(1.0,1.0,1.0);
vec3 lightdir = normalize(lightpos - worldPosition);
float shade = clamp(dot(lightdir, normal), 0.0, 1.0);
vec3 toWorldpos = normalize((worldPosition) - u_eyePos);
vec3 reflectDir = reflect( toWorldpos, normal );
vec4 specular = vec4(pow(clamp(dot(lightdir, reflectDir),0.0,1.0), material_shininess) * lightcolor * material_ks, 1.0);
vec4 shaded = texread * vec4(material_kd, 1.0) * vec4(lightcolor , 1.0) * shade;
vec4 ambient = texread * vec4(material_ka, 1.0);
vec4 emission = vec4(material_ke, 1.0);
gl_FragColor = shaded + specular + emission + ambient;
}
It may have some compilation errors though, as I didn't run it.
You may need to upload your eye position as a uniform (u_eyePos) and calculate the world position (worldPosition) for it to work.
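A minimal vertex shader providing those values could look something like this (a sketch, untested; it assumes a separate model/world matrix uniform, here called model, in addition to the MVP already in the set-up, and u_eyePos would simply be declared as a uniform vec3 in the fragment shader):
#version 120
uniform mat4 model; // assumed world matrix (not part of the original set-up)
uniform mat4 MVP;
attribute vec3 position;
attribute vec2 texCoord;
attribute vec3 normal;
varying vec3 worldPosition;
varying vec2 texCoord0;
varying vec3 normal0;
void main()
{
    worldPosition = (model * vec4(position, 1.0)).xyz; // world-space position for lighting
    normal0 = mat3(model) * normal; // world-space normal (use the inverse-transpose if model has non-uniform scale)
    texCoord0 = texCoord;
    gl_Position = MVP * vec4(position, 1.0);
}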
I made my own Phong shader; here is the code.
Fragment shader:
#version 150
uniform mat4 modelView;
uniform mat3 normalMatrix;
uniform vec3 cameraPosition;
uniform sampler2D materialTex;
uniform float materialShininess;
uniform vec3 materialSpecularColor;
uniform vec3 lightPosition;//light settings
uniform vec3 lightIntensities;
uniform float lightAttenuation;
uniform float lightAmbientCoeff;
in vec3 position0;
in vec2 texCoord0;
in vec3 normal0;
out vec4 fragmentColor;
void main()
{
//calculate normal in world coordinates
vec3 normal = normalize(normalMatrix * normal0);
//calculate the location of this fragment (pixel) in world coordinates
vec3 surfacePos = vec3(modelView * vec4(position0, 1));
//color of the current fragment
vec4 surfaceColor = texture(materialTex, texCoord0);
//calculate the vector from this pixels surface to the light source
vec3 surfaceToLight = normalize(lightPosition - surfacePos);
//cam distance
vec3 surfaceToCamera = normalize(cameraPosition - surfacePos);
///////////////////////////DIFUSE///////////////////////////////////////
//calculate the cosine of the angle of incidence
//float diffuseCoeff = dot(normal, surfaceToLight) / (length(surfaceToLight) * length(normal));
float diffuseCoeff = max(0.0, dot(normal, surfaceToLight));
vec3 diffuse = diffuseCoeff * surfaceColor.rgb * lightIntensities;
/////////////////////////AMBIENT////////////////////////////////////////
vec3 ambient = lightAmbientCoeff * surfaceColor.rgb * lightIntensities;
/////////////////////////SPECULAR//////////////////////////////////////
float specularCoeff = 0.0;
if(diffuseCoeff > 0.0)
specularCoeff = pow(max(0.0, dot(surfaceToCamera, reflect(-surfaceToLight, normal))), materialShininess);
vec3 specular = specularCoeff * materialSpecularColor * lightIntensities;
////////////////////////ATTENUATION///////////////////////////////////
float distanceToLight = length(lightPosition - surfacePos);
float attenuation = 1.0 / (1.0 + lightAttenuation * pow(distanceToLight, 2));
/////////////////////////////////FINAL/////////////////////////////////
vec3 linearColor = ambient + attenuation * (diffuse + specular);
//finalColor with gamma correction
vec3 gamma = vec3(1.0/2.2);
fragmentColor = vec4(pow(linearColor, gamma), surfaceColor.a);
//fragmentColor = vec4(diffuseCoeff * lightIntensities * surfaceColor.rgb, surfaceColor.a);
}
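For completeness, the vertex shader that feeds this fragment shader only needs to pass the raw model-space attributes through, since the fragment shader applies modelView and normalMatrix itself. A sketch of what that could look like; the attribute names and the projection uniform are assumptions, not taken from the post:
#version 150
uniform mat4 projection; // assumed projection matrix
uniform mat4 modelView;
in vec3 vert;
in vec2 vertTexCoord;
in vec3 vertNormal;
out vec3 position0;
out vec2 texCoord0;
out vec3 normal0;
void main()
{
    // pass the raw (model-space) attributes through untouched
    position0 = vert;
    texCoord0 = vertTexCoord;
    normal0 = vertNormal;
    gl_Position = projection * modelView * vec4(vert, 1.0);
}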

Weird view/light behaviour with normal mapping

I'm trying to implement normal/steep parallax mapping, but I'm getting some weird artefacts. It looks like the view and light vectors are being distorted when they are transformed into tangent space, which is causing the lighting and the UV offsetting of the parallax to be incorrect.
Vertex shader source:
// parallax.vert
#version 330
// Some drivers require the following
precision highp float;
//per vertex inputs stored on GPU memory
in vec3 in_Position;
in vec3 in_Normal;
in vec4 tangent;
//per mesh data sent at rendering time
uniform mat4 model; //object to world space transformations
uniform mat4 view; //world to eye space transformations
uniform mat4 projection; //eye space to screen space
uniform mat3 normalMat; //for transforming normals in
uniform vec4 lightPosition; //light position in world space
uniform vec3 eyePosition; //eye position in world space
//outputs to fragment shader
in vec2 in_TexCoord;
// multiply each vertex position by the MVP matrix
// and find V, L, and TBN Matrix vectors for the fragment shader
out VertexData
{
vec3 ts_L; //view vector tangent space
vec3 ts_V; //light vector tangent space
vec2 ex_TexCoord;
float dist;
} vertex;
void main(void)
{
//view space position for lighting calculations
vec3 position = ( ( view * model ) * vec4( in_Position,1.0 ) ).xyz;
//calculate view vector and light vector in view space
vec3 ex_V = -position;
vec3 ex_L = ( view * vec4( lightPosition.xyz,1.0 ) ).xyz - position;
vertex.ex_TexCoord = in_TexCoord;
//now calculate tbn matrix
//calculate biTangent, in view space
vec3 normal = normalMat*in_Normal;
vec3 tan = normalMat * tangent.xyz;
vec3 bitangent = normalize( cross( normal, tan.xyz ) * tangent.w );
//fragment stage will need this value for calculating light attenuation
vertex.dist = length( ex_L );
//calculate transpose tangent space matrix, as we are doing everything in T space in fragment stage
mat3 transTBN = transpose( mat3( tan, bitangent, normal ) );
//transform view space vertex view & light vectors into tangent space
vertex.ts_V = normalize( transTBN * ex_V );
vertex.ts_L = normalize( transTBN * ex_L );
gl_Position = projection * vec4(position,1.0);
}
fragment shader source:
// parallax.frag
#version 330
// Some drivers require the following
precision highp float;
struct lightStruct
{
vec4 ambient;
vec4 diffuse;
vec4 specular;
vec4 position;
float radius;
};
struct materialStruct
{
vec4 ambient;
vec4 diffuse;
vec4 specular;
vec4 emissive;
float shininess;
};
//uniforms sent at render time
uniform lightStruct light;
uniform materialStruct material;
uniform vec2 scaleBias;
//texture information
uniform sampler2D colourMap;
uniform sampler2D normalMap;
uniform sampler2D specularMap;
uniform sampler2D heightMap;
uniform sampler2D glossMap;
//inputs from vertex shader stage
in VertexData
{
vec3 ts_L; //view vector tangent space
vec3 ts_V; //light vector tangent space
vec2 ex_TexCoord;
float dist;
} vertex;
//final fragment colour
layout(location = 0) out vec4 out_Color;
void main(void) {
//calculate halfway vector for blinn phong
vec3 Half = normalize( vertex.ts_V + vertex.ts_L );
//and create some temporary types algorithm will need
vec2 texCoord;
vec3 diffuseTexel ;
vec3 specularTexel;
float gloss;
vec3 normal;
//first sample texel value from height map
float height = texture( heightMap, vertex.ex_TexCoord.st ).r;
//-----------------------------------------------
// steep parallax
float start = 1.0;
int numsteps = 50;
float step = 1.0/numsteps;
vec2 offset = vertex.ex_TexCoord;
vec2 delta = -vertex.ts_V.xy * scaleBias.r / (vertex.ts_V.z * numsteps);
for (int i = 0; i < numsteps; i++)
{
if(height < start)
{
start-=step;
offset+=delta;
height = texture(heightMap,offset).r;
}
else
{
texCoord = offset;
break;
}
}
//---------------------------------------------------
//normal maps are encoded with values between 0.0 and 0.5 being negative, so need to remove the encoding.
//sample textures
diffuseTexel = texture(colourMap,texCoord.st).rgb;
specularTexel = texture(specularMap,texCoord.st).rgb;
normal = normalize(texture(normalMap,texCoord.st).rgb * 2.0 - 1.0);
gloss = texture(glossMap,texCoord.st).r;
//--------------------------------------------------
//Lighting -- Blinn/Phong Lighting with specular, normal and gloss mapping
float lambert = max( dot (normal, vertex.ts_L ), 0.0 );
vec3 litColour = vec3(0.0,0.0,0.0);
//if light is worth calculating
if (lambert > 0.0)
{
vec3 R = normalize(-reflect(vertex.ts_L,normal));
float specularPower = pow(max(dot(R,Half),0.0),material.shininess);
litColour += light.diffuse.xyz * material.diffuse.xyz * lambert;
litColour += light.specular.xyz * material.specular.xyz * specularPower * gloss;
float attenuation = 1.0 + (0.01 * vertex.dist*vertex.dist) + (0.01 * vertex.dist);
attenuation = 1.0/attenuation;
litColour *= attenuation;
}
out_Color = vec4( litColour *diffuseTexel*(specularTexel) ,1.0);
}
The normal matrix is being calculated by taking the
transpose( inverse( modelview ) );
I've probably missed something really obvious, but I can't see it, and would be grateful for some input.
Update:
I've modified the shaders to perform the lighting and offsetting with world-space vectors, which has produced the following results:
Lighting is now correct, and the parallax is good when viewed from close to the surface normal, but...
As the angle between the view vector and the normal increases, the parallax becomes very distorted.
Again, I'd be grateful for some help with this.
vec3 ex_V = -position;
Shouldn't this be:
vec3 ex_V = eyePosition - position;

Specular Light appears on both sides

I have very strange behaviour with specular (Phong light model) lighting. It seems to be appearing on both sides of all objects. Does anyone know what the issue could be?
The actual calculation seems to be alright, as I can see that the light changes its position as the object rotates.
#version 330
in vec4 CameraPos0;
in vec3 Pos0;
in vec4 Colour0;
in vec3 Normal0;
out vec4 FragColor;
// Ambient light parameters
uniform vec3 gAmbientLightIntensity;
// Directional light parameters
uniform vec3 gDirectionalLightIntensity;
uniform vec3 gDirectionalLightDirection;
// Specular light parameter
uniform vec3 gSpecularLightIntensity;
uniform vec3 gLightSourcePosition;
uniform vec3 gCameraPosition;
// Material constants
uniform float gKa;
uniform float gKd;
uniform float gKs;
void main()
{
// Calculate the ambient light intensity at the vertex
// Ia = Ka * ambientLightIntensity
vec4 ambientLightIntensity = gKa * vec4(gAmbientLightIntensity, 1.0);
// Setup the light direction and normalise it
vec3 lightDirection = normalize(-gDirectionalLightDirection);
//lightDirection = normalize(gDirectionalLightDirection);
// Id = kd * lightItensity * N.L
// Calculate N.L
float diffuseFactor = dot(Normal0, lightDirection);
diffuseFactor = clamp(diffuseFactor, 0.0, 1.0);
// N.L * light source colour * intensity
vec4 diffuseLightIntensity = gKd * vec4(gDirectionalLightIntensity, 1.0f) * diffuseFactor;
// Phong light
vec3 L = normalize(gLightSourcePosition - Pos0);
vec3 V = normalize(-Pos0);
vec3 R = normalize(2 * Normal0 * dot(Normal0, L) - L);
float specularFactor = pow(dot(R, V), 0.1f);
vec4 specularLightIntensity = gKs * vec4(gSpecularLightIntensity, 1.0f) * specularFactor;
specularLightIntensity = clamp(specularLightIntensity, 0.0, 1.0);
// Final vertex colour is the product of the vertex colour
// and the total light intensity at the vertex
vec4 lightedFragColor = Colour0 * (ambientLightIntensity + diffuseLightIntensity + specularLightIntensity);
FragColor = lightedFragColor;
}
Vertex Shader
#version 330
layout (location = 0) in vec3 Position;
layout (location = 1) in vec3 Normal;
layout (location = 2) in vec4 Colour;
out vec3 Pos0;
out vec4 Colour0;
out vec3 Normal0;
out vec4 CameraPos0;
uniform mat4 gModelToWorldTransform;
uniform mat4 gWorldToViewTransform;
uniform mat4 gProjectionTransform;
void main()
{
vec4 vertexPositionInModelSpace = vec4(Position, 1);
vec4 vertexInWorldSpace = gModelToWorldTransform * vertexPositionInModelSpace;
vec4 vertexInViewSpace = gWorldToViewTransform * vertexInWorldSpace;
vec4 vertexInHomogeneousClipSpace = gProjectionTransform * vertexInViewSpace;
gl_Position = vertexInHomogeneousClipSpace;
vec3 normalInWorldSpace = (gModelToWorldTransform * vec4(Normal, 0.0)).xyz;
normalInWorldSpace = normalize(normalInWorldSpace);
Normal0 = normalInWorldSpace;
CameraPos0 = vertexInViewSpace;
Pos0 = vertexInWorldSpace.xyz;
Colour0 = Colour;
}
You need to clamp the dot product result, because on the back side the result is negative and pow can then return a positive number instead of zero.
float specularFactor = pow(clamp(dot(R, V),0.0,1.0), 0.1f);
Edit:
Also, V should be a vector pointing towards the camera position, not the vertex position in world space:
vec3 V = normalize(CameraPos0 - Pos0);
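Putting both fixes together, the specular block would look roughly like this (a sketch; it uses the already-declared world-space gCameraPosition uniform for the eye position, on the assumption that this is what the view vector should be built from, since Pos0 is in world space while CameraPos0 output by the vertex shader actually holds the view-space vertex position):
vec3 L = normalize(gLightSourcePosition - Pos0);
vec3 V = normalize(gCameraPosition - Pos0); // towards the camera, in world space
vec3 R = normalize(2.0 * Normal0 * dot(Normal0, L) - L);
float specularFactor = pow(clamp(dot(R, V), 0.0, 1.0), 0.1f); // clamp so back faces contribute zero
vec4 specularLightIntensity = gKs * vec4(gSpecularLightIntensity, 1.0) * specularFactor;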

OpenGL shader: directional light specular reflection increasing with distance

The title says it all: using OpenGL's built-in lighting system, the specular light does not increase or decrease with distance from the object, but with my shader implementation it does.
Vertex Shader:
#version 330
layout (location = 0) in vec3 position;
layout (location = 1) in vec2 texCoord;
layout (location = 2) in vec3 normal;
out vec2 texCoord0;
out vec3 normal0;
out vec3 worldPos0;
uniform mat4 transform;
uniform mat4 normalRotation;
uniform mat4 transformProjected;
void main()
{
gl_Position = transformProjected * vec4(position, 1.0);
texCoord0 = texCoord;
normal0 = normalize((normalRotation * vec4(normal, 0.0))).xyz;
worldPos0 = (transform * vec4(position, 1.0)).xyz;
}
Fragment Shader:
#version 330
in vec2 texCoord0;
in vec3 normal0;
in vec3 worldPos0;
out vec4 fragColor;
struct BaseLight
{
vec3 colorDiffuse;
vec3 colorSpecular;
float intensityDiffuse;
};
struct DirectionalLight
{
BaseLight base;
vec3 direction;
};
uniform vec3 tint;
uniform sampler2D sampler;
uniform vec3 eyePos; // camera pos
uniform vec3 ambientLight;
uniform vec3 emissiveLight;
//material
uniform float specularIntensity;
uniform float specularPower;
uniform DirectionalLight directionalLight;
vec4 calcLight(BaseLight base,vec3 direction, vec3 normal)
{
float diffuseFactor = dot(normal, -direction);
vec4 diffuseColorFinal = vec4(0,0,0,0);
vec4 specularColorFinal = vec4(0,0,0,0);
if(diffuseFactor > 0)
{
diffuseColorFinal = vec4(base.colorDiffuse,1) * diffuseFactor * base.intensityDiffuse;
vec3 directionToEye = normalize(eyePos - worldPos0);
vec3 reflectDirection = normalize(reflect(direction, normal));
float specularFactor = dot(directionToEye, reflectDirection);
specularFactor = pow(specularFactor, specularPower);
if(specularFactor > 0)
specularColorFinal = vec4(base.colorSpecular,1) * specularFactor * specularIntensity;
}
//
return diffuseColorFinal + specularColorFinal;
}
void main()
{
vec4 colorD = texture(sampler, texCoord0.xy) * vec4(tint,1);
vec3 normal = normal0;
vec4 totalLight = vec4(ambientLight,1) + vec4(emissiveLight,1);
totalLight += calcLight(directionalLight.base,-directionalLight.direction,normal);
fragColor = colorD * totalLight;
}
As you can see from the two images, the specular highlight takes up a larger surface area the farther the camera gets from the plane. In my test with OpenGL's built-in lighting this doesn't happen. Is there a way to fix this? I'm new to lighting; maybe this is normal for directional light sources? Thanks for the help!
I'm also setting my eyePos uniform to my camera position; I don't know if that helps.
Basically you need the distance between the fragment and the light, dist. This can be a problem for a directional light, though, because you only have the direction and the distance is assumed to be infinite. Maybe switch to a point light?
When you have dist, you use the formula
att = 1.0 / (Kc + Kl*dist + Kq*dist^2)
Kc - constant attenuation
Kl - linear attenuation
Kq - quadratic attenuation
simpler version (Kc = 1.0, Kl = 0.0, and Kq = light.attenuation):
float attenuation = 1.0 / (1.0 + light.attenuation * pow(distanceToLight, 2));
Then in the lighting equation you multiply the calculated color by this att factor:
vec4 finalColor = ambient + (diffuseColorFinal + specularColorFinal)*att
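Applied to the calcLight function from the question, a point-light version with attenuation could look roughly like this (a sketch, untested; the world-space light position and the attenuation constant are assumed uniforms that are not in the question's code):
uniform vec3 pointLightPosition; // assumed: world-space light position
uniform float lightAttenuation; // assumed: quadratic attenuation constant (Kq)
vec4 calcPointLight(BaseLight base, vec3 normal)
{
    vec3 toLight = pointLightPosition - worldPos0;
    float dist = length(toLight);
    vec3 direction = -normalize(toLight); // from the light towards the fragment, as calcLight expects
    float att = 1.0 / (1.0 + lightAttenuation * dist * dist); // Kc = 1.0, Kl = 0.0, Kq = lightAttenuation
    return calcLight(base, direction, normal) * att;
}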
http://www.ozone3d.net/tutorials/glsl_lighting_phong_p4.php#part_4
http://tomdalling.com/blog/modern-opengl/07-more-lighting-ambient-specular-attenuation-gamma/

Calculate tangent space in C++

I am trying to render a scene using normal mapping.
Therefore I am calculating the tangent space in C++ and storing the binormal and tangent separately in arrays, which will be uploaded to my shader using glVertexAttribPointer.
Here is how I calculate the space:
void ObjLoader::computeTangentSpace(MeshData &meshData) {
GLfloat* tangents = new GLfloat[meshData.vertex_position.size()]();
GLfloat* binormals = new GLfloat[meshData.vertex_position.size()]();
std::vector<glm::vec3 > tangent;
std::vector<glm::vec3 > binormal;
for(unsigned int i = 0; i < meshData.indices.size(); i = i+3){
glm::vec3 vertex0 = glm::vec3(meshData.vertex_position.at(meshData.indices.at(i)), meshData.vertex_position.at(meshData.indices.at(i)+1),meshData.vertex_position.at(meshData.indices.at(i)+2));
glm::vec3 vertex1 = glm::vec3(meshData.vertex_position.at(meshData.indices.at(i+1)), meshData.vertex_position.at(meshData.indices.at(i+1)+1),meshData.vertex_position.at(meshData.indices.at(i+1)+2));
glm::vec3 vertex2 = glm::vec3(meshData.vertex_position.at(meshData.indices.at(i+2)), meshData.vertex_position.at(meshData.indices.at(i+2)+1),meshData.vertex_position.at(meshData.indices.at(i+2)+2));
glm::vec3 normal = glm::cross((vertex1 - vertex0),(vertex2 - vertex0));
glm::vec3 deltaPos;
if(vertex0 == vertex1)
deltaPos = vertex2 - vertex0;
else
deltaPos = vertex1 - vertex0;
glm::vec2 uv0 = glm::vec2(meshData.vertex_texcoord.at(meshData.indices.at(i)), meshData.vertex_texcoord.at(meshData.indices.at(i)+1));
glm::vec2 uv1 = glm::vec2(meshData.vertex_texcoord.at(meshData.indices.at(i+1)), meshData.vertex_texcoord.at(meshData.indices.at(i+1)+1));
glm::vec2 uv2 = glm::vec2(meshData.vertex_texcoord.at(meshData.indices.at(i+2)), meshData.vertex_texcoord.at(meshData.indices.at(i+2)+1));
glm::vec2 deltaUV1 = uv1 - uv0;
glm::vec2 deltaUV2 = uv2 - uv0;
glm::vec3 tan; // tangents
glm::vec3 bin; // binormal
// avoid division by 0
if(deltaUV1.s != 0)
tan = deltaPos / deltaUV1.s;
else
tan = deltaPos / 1.0f;
tan = glm::normalize(tan - glm::dot(normal,tan)*normal);
bin = glm::normalize(glm::cross(tan, normal));
// write into array - for each vertex of the face the same value
tangents[meshData.indices.at(i)] = tan.x;
tangents[meshData.indices.at(i)+1] = tan.y;
tangents[meshData.indices.at(i)+2] = tan.z;
tangents[meshData.indices.at(i+1)] = tan.x;
tangents[meshData.indices.at(i+1)+1] = tan.y;
tangents[meshData.indices.at(i+1)+2] = tan.z;
tangents[meshData.indices.at(i+2)] = tan.x;
tangents[meshData.indices.at(i+2)+1] = tan.y;
tangents[meshData.indices.at(i+2)+1] = tan.z;
binormals[meshData.indices.at(i)] = bin.x;
binormals[meshData.indices.at(i)+1] = bin.y;
binormals[meshData.indices.at(i)+2] = bin.z;
binormals[meshData.indices.at(i+1)] = bin.x;
binormals[meshData.indices.at(i+1)+1] = bin.y;
binormals[meshData.indices.at(i+1)+2] = bin.z;
binormals[meshData.indices.at(i+2)] = bin.x;
binormals[meshData.indices.at(i+2)+1] = bin.y;
binormals[meshData.indices.at(i+2)+1] = bin.z;
}
// Copy the tangent and binormal to meshData
for(unsigned int i = 0; i < meshData.vertex_position.size(); i++){
meshData.vertex_tangent.push_back(tangents[i]);
meshData.vertex_binormal.push_back(binormals[i]);
}
}
And here are my vertex and fragment shader
Vertex Shader
#version 330
layout(location = 0) in vec3 vertex;
layout(location = 1) in vec3 vertex_normal;
layout(location = 2) in vec2 vertex_texcoord;
layout(location = 3) in vec3 vertex_tangent;
layout(location = 4) in vec3 vertex_binormal;
struct LightSource {
vec3 ambient_color;
vec3 diffuse_color;
vec3 specular_color;
vec3 position;
};
uniform vec3 lightPos;
out vec3 vertexNormal;
out vec3 eyeDir;
out vec3 lightDir;
out vec2 textureCoord;
uniform mat4 view;
uniform mat4 modelview;
uniform mat4 projection;
out vec4 myColor;
void main() {
mat4 normalMatrix = transpose(inverse(modelview));
gl_Position = projection * modelview * vec4(vertex, 1.0);
vec4 binormal = modelview * vec4(vertex_binormal,1);
vec4 tangent = modelview * vec4(vertex_tangent,1);
vec4 normal = vec4(vertex_normal,1);
mat3 tangentMatrix = mat3(tangent.xyz,binormal.xyz,normal.xyz);
vec3 vertexInCamSpace = (modelview * vec4(vertex, 1.0)).xyz;
eyeDir = tangentMatrix * normalize( -vertexInCamSpace);
vec3 lightInCamSpace = (view * vec4(lightPos, 1.0)).xyz;
lightDir = tangentMatrix * normalize((lightInCamSpace - vertexInCamSpace));
textureCoord = vertex_texcoord;
}
Fragment Shader
#version 330
struct LightSource {
vec3 ambient_color;
vec3 diffuse_color;
vec3 specular_color;
vec3 position;
};
struct Material {
vec3 ambient_color;
vec3 diffuse_color;
vec3 specular_color;
float specular_shininess;
};
uniform LightSource light;
uniform Material material;
in vec3 vertexNormal;
in vec3 eyeDir;
in vec3 lightDir;
in vec2 textureCoord;
uniform sampler2D texture;
uniform sampler2D normals;
out vec4 color;
in vec4 myColor;
in vec3 bin;
in vec3 tan;
void main() {
vec3 diffuse = texture2D(texture,textureCoord).rgb;
vec3 E = normalize(eyeDir);
vec3 N = texture2D(normals,textureCoord).xyz;
N = (N - 0.5) * 2.0;
vec3 ambientTerm = vec3(0);
vec3 diffuseTerm = vec3(0);
vec3 specularTerm = vec3(0);
vec3 L, H;
L = normalize(lightDir);
H = normalize(E + L);
ambientTerm += light.ambient_color;
diffuseTerm += light.diffuse_color * max(dot(L, N), 0);
specularTerm += light.specular_color * pow(max(dot(H, N), 0), material.specular_shininess);
ambientTerm *= material.ambient_color;
diffuseTerm *= material.diffuse_color;
specularTerm *= material.specular_color;
color = vec4(diffuse, 1) * vec4(ambientTerm + diffuseTerm + specularTerm, 1);
}
The problem is that sometimes I don't have values for the tangent and binormal in the shader. Here are three screenshots which I hope will clarify my problem:
This is how the scene currently looks when I render it with the code above:
This is how the scene looks when I use lightDir as the color:
And the third shows the scene with eyeDir as the color:
All the pictures are taken from the same angle, without moving the camera or rotating anything.
I've already compared my code to several different sources on the web, but I haven't found my mistake...
Additional information:
I am iterating over all current faces. Three indices give me one triangle. The UV values for each vertex are stored at the same index. After a lot of debugging I am quite sure these are the correct values, as I can find the same values in the .obj file when searching with gedit.
After calculating the tangent and binormal I store them at the same index as the vertex position in the array. To my understanding this should give me the correct position, and I am calculating this for each vertex. For each vertex in a face I use the same tangent basis, which may later be overwritten when another face uses this vertex; this could affect my final result, but only in very small details...
EDIT:
For any other questions, here is the whole project:
http://www.incentivelabs.de/Sourcecode/normal_mapping.zip
In your vertex shader you have:
vec4 binormal = modelview * vec4(vertex_binormal,1);
vec4 tangent = modelview * vec4(vertex_tangent,1);
vec4 normal = vec4(vertex_normal,1);
This should be:
vec4 binormal = modelview * vec4(vertex_binormal,0);
vec4 tangent = modelview * vec4(vertex_tangent,0);
vec4 normal = modelview * vec4(vertex_normal,0);
Note the '0' instead of '1' (also I'm assuming you meant to transform your normal too). You use '0' here because you want to ignore the translation part of the modelview transformation (you're transforming a vector not a point).
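Equivalently, since only the rotation/scale part of the matrix matters for direction vectors, the same fix can be written by casting to mat3 (a sketch; if modelview contains non-uniform scale, the normal should be transformed with the already-computed normalMatrix, i.e. the inverse-transpose, rather than modelview itself):
vec3 binormal = mat3(modelview) * vertex_binormal;
vec3 tangent = mat3(modelview) * vertex_tangent;
vec3 normal = mat3(normalMatrix) * vertex_normal; // normalMatrix = transpose(inverse(modelview)) from the question's shader
// tangentMatrix can then be built directly: mat3(tangent, binormal, normal)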