I'm learning GLSL and trying to implement some lighting and mapping tricks. I'm working with the ShaderDesigner tool. After coding normal mapping I noticed that my model's illumination doesn't look right. Here is my code and some pictures. If possible, please tell me what my problem is.
Vertex Shader
#define MAX_LIGHTS 1
struct LightProps
{
vec3 direction[MAX_LIGHTS];
};
attribute vec3 tangent;
attribute vec3 bitangent;
varying LightProps lights;
void main()
{
vec3 N = normalize(gl_NormalMatrix*gl_Normal);
vec3 T = normalize(gl_NormalMatrix*tangent);
vec3 B = normalize(gl_NormalMatrix*bitangent);
mat3 TBNMatrix = mat3(T,B,N);
vec4 vertex = gl_ModelViewMatrix*gl_Vertex;
for(int i = 0; i < MAX_LIGHTS; i++)
{
vec4 lightPos = gl_LightSource[i].position;
lights.direction[i] = vec3(lightPos.w > 0 ? lightPos-vertex : lightPos);
lights.direction[i] *= TBNMatrix;
}
gl_TexCoord[0] = gl_MultiTexCoord0;
gl_Position = gl_ModelViewProjectionMatrix*gl_Vertex;
}
Fragment Shader
#define MAX_LIGHTS 1
struct LightProps
{
vec3 direction[MAX_LIGHTS];
};
uniform sampler2D textureUnit;
uniform sampler2D normalTextureUnit;
uniform vec4 TexColor;
varying LightProps lights;
void main()
{
vec3 N = normalize(texture2D(normalTextureUnit,gl_TexCoord[0].st).rgb*2.0-1.0);
vec4 color = vec4(0,0,0,0);
for(int i = 0; i < MAX_LIGHTS; i++)
{
vec3 L = lights.direction[i];
float dist = length(L);
L = normalize(L);
float NdotL = max(dot(N,L),0.0);
if(NdotL > 0)
{
float att = 1.0;
if(gl_LightSource[i].position.w > 0)
{
att = 1.0/ (gl_LightSource[i].constantAttenuation +
gl_LightSource[i].linearAttenuation * dist +
gl_LightSource[i].quadraticAttenuation * dist * dist);
}
vec4 ambient = gl_FrontLightProduct[i].ambient;
vec4 diffuse = clamp(att*NdotL*gl_FrontLightProduct[i].diffuse,0,1);
color += att*(ambient+diffuse);
}
}
vec4 textureColor = texture2D(textureUnit, vec2(gl_TexCoord[0]));
gl_FragColor = TexColor*textureColor + gl_FrontLightModelProduct.sceneColor + color;
}
I set TexColor to (0.3,0.3,0.3,1.0) and took screenshots:
Screenshot
There is a little bit of lighting when I rotate the camera and light to the left, but when I rotate to the right the plane becomes fully illuminated. I think something is wrong, because without normal mapping the plane looks the same from both sides. Here is the normal texture. Thanks in advance.
Normal Map:
That normal map is in tangent-space, but you are treating it as object-space.
You need a bitangent and/or tangent vector per-vertex in addition to your normal in order to form the basis to perform transformation into and out of tangent-space. This matrix is often referred to as simply TBN.
You have two options here:
1. Transform all of your lighting direction vectors into tangent-space (useful for forward-shading; can be done in the vertex shader).
2. Transform your normal map from tangent-space back to view-space (required by deferred-shading; must be done in the fragment shader).
Both options require the construction of a TBN matrix, and if your tangent-space basis is orthogonal (asset-import libraries like Assimp can be configured to do this for you) you can transpose the TBN matrix to do either one.
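As a minimal sketch of the two directions (assuming an orthonormal view-space basis, and hypothetical sampler/texcoord names normalMap and uv):
mat3 TBN = mat3(T, B, N); // columns: view-space tangent, bitangent, normal
// Option 1: take a view-space light direction into tangent-space.
// For an orthonormal basis transpose(TBN) is its inverse; L_view * TBN is equivalent
// if your GLSL version lacks transpose().
vec3 L_tangent = transpose(TBN) * L_view;
// Option 2: take a tangent-space normal-map sample back into view-space.
vec3 N_view = TBN * normalize(texture2D(normalMap, uv).rgb * 2.0 - 1.0);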
You are implementing forward-shading, so solution #1 is the approach you should take.
Below is a rough overview of the necessary steps for solution #1. Ordinarily you would do the calculation of the lighting direction vector in the vertex shader for better performance.
Vertex Shader to Transform Lighting Vectors into Tangent-space:
attribute vec3 tangent;
attribute vec3 bitangent;
varying vec3 N;
varying vec3 V;
varying vec3 E;
varying vec3 T;
varying vec3 B;
void main()
{
N = normalize(gl_NormalMatrix*gl_Normal);
V = vec3(gl_ModelViewMatrix*gl_Vertex);
E = normalize(-V);
T = normalize(gl_NormalMatrix*tangent);
B = normalize(gl_NormalMatrix*bitangent);
gl_TexCoord[0] = gl_MultiTexCoord0;
gl_Position = gl_ModelViewProjectionMatrix*gl_Vertex;
}
Fragment Shader to Transform Lighting Vectors into Tangent-space:
varying vec3 N;
varying vec3 V;
varying vec3 E;
varying vec3 B;
varying vec3 T;
uniform sampler2D textureUnit;
uniform sampler2D normalTextureUnit;
uniform vec4 TexColor;
#define MAX_LIGHTS 1
void main()
{
// Construct Tangent Space Basis
mat3 TBN = mat3 (T, B, N);
vec3 normal = normalize (texture2D(normalTextureUnit,gl_TexCoord[0].st).xyz*2.0 - 1.0);
vec4 color = vec4(0,0,0,0);
for(int i = 0; i < MAX_LIGHTS; i++)
{
vec4 lightPos = gl_LightSource[i].position;
vec3 L = lightPos.w > 0 ? lightPos.xyz - V : lightPos.xyz;
L *= TBN; // Transform into tangent-space
float dist = length(L);
L = normalize(L);
float NdotL = max(dot(L,N),0.0);
if(NdotL > 0)
{
float att = 1.0;
if(lightPos.w > 0)
{
att = 1.0/ (gl_LightSource[i].constantAttenuation +
gl_LightSource[i].linearAttenuation * dist +
gl_LightSource[i].quadraticAttenuation * dist * dist);
}
vec4 diffuse = clamp(att*NdotL*gl_FrontLightProduct[i].diffuse,0,1);
color += att*gl_FrontLightProduct[i].ambient + diffuse;
}
}
vec4 textureColor = texture2D(textureUnit, vec2(gl_TexCoord[0]));
gl_FragColor = TexColor*textureColor + gl_FrontLightModelProduct.sceneColor + color;
}
There is a tutorial here that should fill in the gaps and explain how to compute tangent and bitangent.
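If per-vertex tangents are unavailable altogether, a rough fallback (a sketch only, assuming a view-space position viewPos, interpolated uv and geometric normal N are available in the fragment shader, and that derivative functions are supported) is to build an approximate tangent frame from screen-space derivatives:
// Approximate per-fragment tangent frame from screen-space derivatives.
mat3 cotangentFrame(vec3 N, vec3 viewPos, vec2 uv)
{
    // position and texcoord deltas across the pixel quad
    vec3 dp1 = dFdx(viewPos);
    vec3 dp2 = dFdy(viewPos);
    vec2 duv1 = dFdx(uv);
    vec2 duv2 = dFdy(uv);
    // solve for the tangent and bitangent of the uv parameterization
    vec3 dp2perp = cross(dp2, N);
    vec3 dp1perp = cross(N, dp1);
    vec3 T = dp2perp * duv1.x + dp1perp * duv2.x;
    vec3 B = dp2perp * duv1.y + dp1perp * duv2.y;
    // scale-invariant normalization
    float invmax = inversesqrt(max(dot(T, T), dot(B, B)));
    return mat3(T * invmax, B * invmax, N);
}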
Related
I am following along with the LearnOpenGL guide and am trying to implement Steep Parallax Mapping.
Everything seems to be working fine, except that my brick wall has distinct visible layers, whereas the photos in the guide don't show any. I was trying to use this code to parallax-map the topography of the world, but these weird layers show up there too, so I was hoping to find a fix.
Layered wall photo
Photo of how it should look
Here is my modified vertex shader
#version 300 es
in vec4 vPosition; // aPos
in vec2 texCoord; // aTexCoords
in vec4 vNormal; // aNormal
in vec4 vTangent; // aTangent
uniform mat4 model_view;
uniform mat4 projection;
uniform vec4 light_position;
out vec2 ftexCoord;
out vec3 vT;
out vec3 vN;
out vec4 position;
out vec3 FragPos;
out vec3 TangentLightPos;
out vec3 TangentViewPos;
out vec3 TangentFragPos;
void
main()
{
// Normal variables
vN = normalize(model_view * vNormal).xyz;
vT = normalize(model_view * vTangent).xyz;
vec4 veyepos = model_view*vPosition;
position = veyepos;
ftexCoord = texCoord;
// Displacement variables
vec3 bi = cross(vT, vN);
FragPos = vec3(model_view * vPosition).xyz;
vec3 T = normalize(mat3(model_view) * vTangent.xyz);
vec3 B = normalize(mat3(model_view) * bi);
vec3 N = normalize(mat3(model_view) * vNormal.xyz);
mat3 TBN = transpose(mat3(T, B, N));
TangentLightPos = TBN * light_position.xyz;
TangentViewPos = TBN * vPosition.xyz;
TangentFragPos = TBN * FragPos;
gl_Position = projection * model_view * vPosition;
}
and my modified fragment shader is here
#version 300 es
precision highp float;
in vec2 ftexCoord;
in vec3 vT; //parallel to surface in eye space
in vec3 vN; //perpendicular to surface in eye space
in vec4 position;
in vec3 FragPos;
in vec3 TangentLightPos;
in vec3 TangentViewPos;
in vec3 TangentFragPos;
uniform int mode;
uniform vec4 light_position;
uniform vec4 light_color;
uniform vec4 ambient_light;
uniform sampler2D colorMap;
uniform sampler2D normalMap;
uniform sampler2D depthMap;
out vec4 fColor;
// STEEP PARALLAX MAPPING
vec2 ParallaxMapping(vec2 texCoords, vec3 viewDir)
{
// number of depth layers
const float minLayers = 8.0;
const float maxLayers = 32.0;
float numLayers = mix(maxLayers, minLayers, abs(dot(vec3(0.0, 0.0, 1.0), viewDir)));
// calculate the size of each layer
float layerDepth = 1.0 / numLayers;
// depth of current layer
float currentLayerDepth = 0.0;
// the amount to shift the texture coordinates per layer (from vector P)
vec2 P = viewDir.xy / viewDir.z * 0.1;
vec2 deltaTexCoords = P / numLayers;
// get initial values
vec2 currentTexCoords = texCoords;
float currentDepthMapValue = texture(depthMap, currentTexCoords).r;
while(currentLayerDepth < currentDepthMapValue)
{
// shift texture coordinates along direction of P
currentTexCoords -= deltaTexCoords;
// get depthmap value at current texture coordinates
currentDepthMapValue = texture(depthMap, currentTexCoords).r;
// get depth of next layer
currentLayerDepth += layerDepth;
}
return currentTexCoords;
}
void main()
{
// DO NORMAL MAPPING
if (mode == 0) {
vec3 T = normalize(vT);
vec3 N = normalize(vN);
vec3 bi = cross(T, N);
mat4 changeOfCoord = mat4(vec4(T, 0), vec4(bi, 0), vec4(N, 0), vec4(0, 0, 0, 1));
vec3 L = normalize(light_position - position).xyz;
vec3 E = normalize(-position).xyz;
vec4 text = vec4(texture(normalMap, ftexCoord) * 2.0 - 1.0);
vec4 eye = changeOfCoord * text;
vec4 amb = texture(colorMap, ftexCoord) * ambient_light;
vec4 diff = max(0.0, dot(L, eye.xyz)) * light_color * texture(colorMap, ftexCoord);
fColor = amb + diff;
} else if (mode == 1) { // DO PARALLAX MAPPING
// offset texture coordinates with Parallax Mapping
vec3 viewDir = normalize(TangentViewPos - TangentFragPos);
vec2 texCoords = ftexCoord;
texCoords = ParallaxMapping(ftexCoord, viewDir);
// discard samples outside of the default texture coordinate space
if(texCoords.x > 1.0 || texCoords.y > 1.0 || texCoords.x < 0.0 || texCoords.y < 0.0)
discard;
// obtain normal from normal map
vec3 normal = texture(normalMap, texCoords).rgb;
//values stored in normal texture is [0,1] range, we need [-1, 1] range
normal = normalize(normal * 2.0 - 1.0);
// get diffuse color
vec3 color = texture(colorMap, texCoords).rgb;
// ambient
vec3 ambient = 0.1 * color;
// diffuse
vec3 lightDir = normalize(TangentLightPos - TangentFragPos);
float diff = max(dot(lightDir, normal), 0.0);
vec3 diffuse = diff * color;
// specular
vec3 reflectDir = reflect(lightDir, normal);
vec3 halfwayDir = normalize(lightDir + viewDir);
float spec = pow(max(dot(normal, halfwayDir), 0.0), 32.0);
vec3 specular = vec3(0.2) * spec;
fColor = vec4(ambient + diffuse + 0.0, 1.0);
}
}
Layers at acute viewing angles are a common artifact of steep parallax mapping. To improve the result you have to increase the number of samples or implement Parallax Occlusion Mapping (as described in the bottom part of the tutorial):
// PARALLAX OCCLUSION MAPPING
vec2 ParallaxMapping(vec2 texCoords, vec3 viewDir)
{
// number of depth layers
const float minLayers = 8.0;
const float maxLayers = 32.0;
float numLayers = mix(maxLayers, minLayers, abs(dot(vec3(0.0, 0.0, 1.0), viewDir)));
// calculate the size of each layer
float layerDepth = 1.0 / numLayers;
// depth of current layer
float currentLayerDepth = 0.0;
// the amount to shift the texture coordinates per layer (from vector P)
vec2 P = viewDir.xy / viewDir.z * 0.1;
vec2 deltaTexCoords = P / numLayers;
// get initial values
vec2 currentTexCoords = texCoords;
float currentDepthMapValue = texture(depthMap, currentTexCoords).r;
while(currentLayerDepth < currentDepthMapValue)
{
// shift texture coordinates along direction of P
currentTexCoords -= deltaTexCoords;
// get depthmap value at current texture coordinates
currentDepthMapValue = texture(depthMap, currentTexCoords).r;
// get depth of next layer
currentLayerDepth += layerDepth;
}
// get texture coordinates before collision (reverse operations)
vec2 prevTexCoords = currentTexCoords + deltaTexCoords;
// get depth after and before collision for linear interpolation
float afterDepth = currentDepthMapValue - currentLayerDepth;
float beforeDepth = texture(depthMap, prevTexCoords).r - currentLayerDepth + layerDepth;
// interpolation of texture coordinates
float weight = afterDepth / (afterDepth - beforeDepth);
vec2 finalTexCoords = prevTexCoords * weight + currentTexCoords * (1.0 - weight);
return finalTexCoords;
}
By the way, the bitangent vector seems to be inverted. Commonly the bitangent is the cross product of the normal vector and the tangent in a right-handed system, but that depends on the displacement texture.
vec3 bi = cross(vT, vN); // current: tangent x normal
vec3 bi = cross(vN, vT); // suggested: normal x tangent
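If your exporter provides a four-component tangent whose w component stores the handedness sign (the vTangent attribute above is already a vec4, though whether its w actually holds such a sign depends on the exporter), a hedged sketch that handles mirrored UVs would be:
vec3 N = normalize(mat3(model_view) * vNormal.xyz);
vec3 T = normalize(mat3(model_view) * vTangent.xyz);
vec3 B = cross(N, T) * vTangent.w; // flip the bitangent where the UVs are mirrored
mat3 TBN = transpose(mat3(T, B, N)); // into tangent space, as in the vertex shader above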
See further:
Bump Mapping with javascript and glsl
Normal, Parallax and Relief mapping
Demo
The normal mapping looks great when the objects aren't rotated from the origin, and spot lights and directional lights work, but when I spin an object on the spot it darkens and then lightens again, just on the top face.
I'm testing using a cube. I've used a geometry shader to visualise my calculated normals (after multiplying by a TBN matrix), and they appear to be in the correct places. If I take the normal map out of the equation then the lighting is fine.
Here's where the TBN is calculated:
void calculateTBN()
{
//get the normal matrix
mat3 model = mat3(transpose(inverse(mat3(transform))));
vec3 T = normalize(vec3(model * tangent.xyz ));
vec3 N = normalize(vec3(model * normal ));
vec3 B = cross(N, T);
mat3 TBN = mat3( T , B , N);
outputVertex.TBN =TBN;
}
And the normal is sampled and transformed:
vec3 calculateNormal()
{
//Sort the input so that the normal is between 1 and minus 1 instead of 0 and 1
vec3 input = texture2D(normalMap, inputFragment.textureCoord).xyz;
input = 2.0 * input - vec3(1.0, 1.0, 1.0);
vec3 newNormal = normalize(inputFragment.TBN* input);
return newNormal;
}
My lighting is in world space (as far as I understand the term, it takes into account the transform matrix but not the camera or projection matrix).
I did try the technique where I pass down the TBN as its inverse (or transpose) and then multiplied every vector apart from the normal by it. That had the same effect. I'd rather work in world space anyway, as apparently this is better for deferred lighting, or so I've heard.
If you'd like to see any of the lighting code and so on I'll add it in but I didn't think it was necessary as it works apart from this.
EDIT::
As requested, here is vertex and part of frag shader
#version 330
uniform mat4 T; // Translation matrix
uniform mat4 S; // Scale matrix
uniform mat4 R; // Rotation matrix
uniform mat4 camera; // camera matrix
uniform vec4 posRelParent; // the position relative to the parent
// Input vertex packet
layout (location = 0) in vec4 position;
layout (location = 2) in vec3 normal;
layout (location = 3) in vec4 tangent;
layout (location = 4) in vec4 bitangent;
layout (location = 8) in vec2 textureCoord;
// Output vertex packet
out packet {
vec2 textureCoord;
vec3 normal;
vec3 vert;
mat3 TBN;
vec3 tangent;
vec3 bitangent;
vec3 normalTBN;
} outputVertex;
mat4 transform;
mat3 TBN;
void calculateTBN()
{
//get the normal matrix: the object's transform with scaling and translation removed
mat3 model = mat3(transpose(inverse(transform)));
vec3 T = normalize(model*tangent.xyz);
vec3 N = normalize(model*normal);
//I used to retrieve the bitangents by crossing the normal and tangent but now they are calculated independently
vec3 B = normalize(model*bitangent.xyz);
TBN = mat3( T , B , N);
outputVertex.TBN = TBN;
//Pass though TBN vectors for colour debugging in the fragment shader
outputVertex.tangent = T;
outputVertex.bitangent = B;
outputVertex.normalTBN = N;
}
void main(void) {
outputVertex.textureCoord = textureCoord;
// Setup local variable pos in case we want to modify it (since position is constant)
vec4 pos = vec4(position.x, position.y, position.z, 1.0) + posRelParent;
//Work out the transform matrix
transform = T * R * S;
//Work out the normal for lighting
mat3 normalMat = transpose(inverse(mat3(transform)));
outputVertex.normal = normalize(normalMat* normal);
calculateTBN();
outputVertex.vert =(transform* pos).xyz;
//Work out the final pos of the vertex
gl_Position = camera * transform * pos;
}
And Lighting vector of fragment:
vec3 applyLight(Light thisLight, vec3 baseColor, vec3 surfacePos, vec3 surfaceToCamera)
{
float attenuation = 1.0f;
vec3 lightPos = (thisLight.finalLightMatrix*thisLight.position).xyz;
vec3 surfaceToLight;
vec3 coneDir = normalize(thisLight.coneDirection);
if (thisLight.position.w == 0.0f)
{
//Directional Light (all rays same angle, use position as direction)
surfaceToLight = normalize( (thisLight.position).xyz);
attenuation = 1.0f;
}
else
{
//Point light
surfaceToLight = normalize(lightPos - surfacePos);
float distanceToLight = length(lightPos - surfacePos);
attenuation = 1.0 / (1.0f + thisLight.attenuation * pow(distanceToLight, 2));
//Work out the Cone restrictions
float lightToSurfaceAngle = degrees(acos(dot(-surfaceToLight, normalize(coneDir))));
if (lightToSurfaceAngle > thisLight.coneAngle)
{
attenuation = 0.0;
}
}
}
Here's the main of the frag shader too:
void main(void) {
//get the base colour from the texture
vec4 tempFragColor = texture2D(textureImage, inputFragment.textureCoord).rgba;
//Support for objects with and without a normal map
if (useNormalMap == 1)
{
calcedNormal = calculateNormal();
}
else
{
calcedNormal = inputFragment.normal;
}
vec3 surfaceToCamera = normalize((cameraPos_World) - (inputFragment.vert));
vec3 tempColour = vec3(0.0, 0.0, 0.0);
for (int count = 0; count < numLights; count++)
{
tempColour += applyLight(allLights[count], tempFragColor.xyz, inputFragment.vert, surfaceToCamera);
}
vec3 gamma = vec3(1.0 / 2.2);
fragmentColour = vec4(pow(tempColour,gamma), tempFragColor.a);
//fragmentColour = vec4(calcedNormal, 1);
}
Edit 2:
The geometry shader used to visualize the "sampled" normals transformed by the TBN matrix is shown here:
void GenerateLineAtVertex(int index)
{
vec3 testSampledNormal = vec3(0, 0, 1);
vec3 bitangent = cross(gs_in[index].normal, gs_in[index].tangent);
mat3 TBN = mat3(gs_in[index].tangent, bitangent, gs_in[index].normal);
testSampledNormal = TBN * testSampledNormal;
gl_Position = gl_in[index].gl_Position;
EmitVertex();
gl_Position =
gl_in[index].gl_Position
+ vec4(testSampledNormal, 0.0) * MAGNITUDE;
EmitVertex();
EndPrimitive();
}
And its vertex shader:
void main(void) {
// Setup local variable pos in case we want to modify it (since position is constant)
vec4 pos = vec4(position.x, position.y, position.z, 1.0);
mat4 transform = T* R * S;
// Apply transformation to pos and store result in gl_Position
gl_Position = projection* camera* transform * pos;
mat3 normalMatrix = mat3(transpose(inverse(camera * transform)));
vs_out.tangent = normalize(vec3(projection * vec4(normalMatrix * tangent.xyz, 0.0)));
vs_out.normal = normalize(vec3(projection * vec4(normalMatrix * normal , 0.0)));
}
Here are the TBN vectors visualized. The slight angles on the points are due to an issue with how I'm applying the projection matrix, rather than mistakes in the actual vectors. The red lines just show where the arrows I've drawn on the texture are; they're not very clear from that angle, that's all.
Problem Solved!
It actually had nothing to do with the code above, although thanks to everyone that helped.
I was importing the texture using my own texture loader, which by default uses non-gamma-corrected sRGB colour in 32 bit. I switched it to 24-bit plain RGB colour and it worked straight away. Typical developer problems...
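If you suspect the same problem, a debug-only sketch is to approximately undo an accidental sRGB decode in the fragment shader before unpacking the normal; it only confirms the diagnosis, the real fix is loading the normal map as plain linear RGB:
// Debug sketch: an sRGB-decoded sample is roughly pow(raw, 2.2),
// so re-applying 1.0/2.2 approximates the texel the normal map actually stores.
vec3 raw = pow(texture2D(normalMap, inputFragment.textureCoord).rgb, vec3(1.0 / 2.2));
vec3 n = normalize(inputFragment.TBN * (raw * 2.0 - 1.0));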
I'm trying to implement normal mapping using LibGDX. I got some positive results when I calculate the diffuse and specular color in the vertex shader (at least I think so).
Vertex shader:
attribute vec4 a_position;
attribute vec2 a_texCoord0;
attribute vec3 a_normal;
varying vec2 v_texCoord;
varying float v_diffuse;
varying vec3 v_specular;
varying vec3 v_lightVec;
uniform mat4 u_worldTrans;
uniform mat4 u_projTrans;
uniform mat4 u_matViewInverseTranspose;
uniform mat4 u_matModelView;
const vec3 lightVector = vec3(0.0,0.0,-1.0);
void main()
{
// Output the unmodified vertex position.
gl_Position = u_projTrans * u_worldTrans * a_position;
mat3 normalMatrix = mat3(u_matViewInverseTranspose);
// compute the transformed normal
vec3 n = normalize(normalMatrix * a_normal);
// compute the light vector pointing toward the sun, in model coordinates
// x,y compose the longitude and z the (seasonal) lattitude of the nadir point.
//vec3 lightVec = normalize(vec3(u_matViewInverseTranspose * vec4(u_lightVec, 1.0)));
vec3 lightVec = normalize(normalMatrix * lightVector);
// Calculate a diffuse light intensity
//v_diffuse = dot(lightVec, n);
v_diffuse = clamp(dot(n, lightVec), 0.0, 1.0);
vec4 ecPosition = u_matModelView * a_position;
// compute the reflection vector
vec3 reflectVec = reflect(-lightVec, n);
// compute a unit vector in direction of viewing position
vec3 viewVec = normalize(vec3(-ecPosition));
// Calculate specular light intensity, scale down and apply a tint.
float specIntensity = pow(max(dot(reflectVec, viewVec), 0.0), 8.0);
v_specular = specIntensity *
//gloss color
vec3(1.,.7,.3) *
//gloss intensity
.7;
v_texCoord.y = 1.-a_texCoord0.y;
v_texCoord.x = a_texCoord0.x;
vec3 lightDir = normalize(lightVector - u_matModelView * a_position);
vec3 tangent=a_tangent;
vec3 t = normalize(normalMatrix * tangent);
vec3 b = cross (n, t);
vec3 v;
v.x = dot (lightDir, t);
v.y = dot (lightDir, b);
v.z = dot (lightDir, n);
v_lightVec = normalize (v);
}
Fragment shader:
precision mediump float;
varying vec2 v_texCoord;
varying float v_diffuse;
varying vec3 v_specular;
varying vec3 v_lightVec;
uniform sampler2D u_texture;
uniform sampler2D u_normalMap;
void main()
{
vec3 ground = texture2D(u_texture, v_texCoord).rgb;
vec3 normal = normalize(2.0 * texture2D (u_normalMap, v_texCoord).rgb - 1.0);
float lamberFactor = max (dot (normal, v_lightVec), 0.0);
vec3 color = ( ground.rgb * v_diffuse * lamberFactor + v_specular);
gl_FragColor = vec4 (color, 1.0);
}
Result:
As you can see, the result is rendered correctly. The specular spot behaves like in many examples. But I need to compute the specular color in the fragment shader to get a more impressive picture. So I found an example here and now I'm trying to make it work.
Vertex shader:
attribute vec4 a_position;
attribute vec2 a_texCoord0;
attribute vec3 a_normal;
attribute vec3 a_tangent;
varying vec2 v_texCoord;
varying vec3 v_lightVec;
varying vec3 v_eyeVec; //Added
uniform mat4 u_worldTrans;
uniform mat4 u_projTrans;
uniform mat4 u_matViewInverseTranspose;
uniform mat4 u_matModelView;
const vec3 lightVector = vec3(0.0,0.0,-1.0);
void main()
{
// Output the unmodified vertex position.
gl_Position = u_projTrans * u_worldTrans * a_position;
mat3 normalMatrix = mat3(u_matViewInverseTranspose);
// compute the transformed normal
vec3 n = normalize(normalMatrix * a_normal);
v_texCoord.y = 1.-a_texCoord0.y;
v_texCoord.x = a_texCoord0.x;
vec3 lightDir = normalize(lightVector - u_matModelView * a_position);
vec3 tangent=a_tangent;
vec3 t = normalize(normalMatrix * tangent);
vec3 b = cross (n, t);
vec3 v;
v.x = dot (lightDir, t);
v.y = dot (lightDir, b);
v.z = dot (lightDir, n);
v_lightVec = normalize (v);
//Added
vec3 ecPosition = u_matModelView * a_position;
vec3 tmp = vec3(-ecPosition);
v_eyeVec.x = dot(tmp, t);
v_eyeVec.y = dot(tmp, b);
v_eyeVec.z = dot(tmp, n);
v_eyeVec = normalize (v_eyeVec);
}
Fragment shader:
precision mediump float;
varying vec2 v_texCoord;
varying vec3 v_lightVec;
varying vec3 v_eyeVec;
uniform sampler2D u_texture;
uniform sampler2D u_normalMap;
void main()
{
vec3 ground = texture2D(u_texture, v_texCoord).rgb;
vec3 normal = normalize(2.0 * texture2D (u_normalMap, v_texCoord).rgb - 1.0);
//Added
float distSqr = dot(v_lightVec, v_lightVec);
float att = clamp(1.0 - .25 * sqrt(distSqr), 0.0, 1.0);
vec3 lVec = v_lightVec * inversesqrt(distSqr);
vec3 vVec = normalize(v_eyeVec);
vec3 bump = normalize( texture2D(u_normalMap, v_texCoord).xyz * 2.0 - 1.0);
float diffuse = max( dot(lVec, bump), 0.0 );
vec3 specular = pow(clamp(dot(reflect(-lVec, bump), v_eyeVec), 0.0, 1.0), 8.0 ) *
//gloss color
vec3(1.,.7,.3) *
//gloss intensity
.7;
vec3 color = ( ground.rgb * diffuse + specular) * att;
gl_FragColor = vec4 (color, 1.0);
}
Result:
The specular spot is wrong. I thought this happened because of a wrong matrix calculation. If that's true, why does the first pair of shaders work correctly?
This is how I get the model-view matrix, normal matrix and the others in LibGDX:
viewInvTraMatrix.set(camera.view);
viewInvTraMatrix.mul(renderable.worldTransform);
//model-view matrix
program.setUniformMatrix("u_matModelView", viewInvTraMatrix);
viewInvTraMatrix.inv(); //inverse
viewInvTraMatrix.tra(); //transpose
//normal matrix
program.setUniformMatrix("u_matViewInverseTranspose", viewInvTraMatrix);
//other matrix
program.setUniformMatrix("u_worldTrans", renderable.worldTransform);
program.setUniformMatrix("u_projTrans", camera.combined);
So, my question is what is wrong in the last couple of shaders?
The problem was in my model (mesh). After some time of digging I found out that my mesh doesn't have tangents and binormals. It happened because I used Blender 2.58, which cannot export a model to FBX with tangent space. Fortunately, this was fixed in Blender 2.71 and later. So, when you export your model, you should tick the Tangent Space parameter and choose version FBX 7.4 binary. To ensure that your mesh contains tangents and binormals, you can use the fbx-converter tool to convert your fbx file to the g3dj format instead of the default g3db, using an additional option like below:
fbx-converter.exe -o g3dj yourModel.fbx yourConvertedModel.g3dj
Then open it in a program like Notepad and check that the attributes contain TANGENT and BINORMAL:
"version": [ 0, 1],
"id": "",
"meshes": [
{
"attributes": ["POSITION", "NORMAL", "TANGENT", "BINORMAL", "TEXCOORD0"],
"vertices": [
0.257400, 58.707802, -0.257400, 0.000000, 1.000000, -0.000000, 0.941742,
The only thing I don't understand is why lamberFactor works with a wrong tangent attribute (actually, without it). Most likely the diffuse color is calculated incorrectly, but in my example (on a sphere) it looks pretty satisfactory. I hope this helps somebody else.
EDIT
By the way, to get a normal & a model-view matrices, I use the following code:
myMatrix.set(camera.view);
myMatrix.mul(renderable.worldTransform);
program.setUniformMatrix(u_matModelView, myMatrix); //pass model-view matrix
myMatrix.inv();
myMatrix.tra();
program.setUniformMatrix(u_matNormal, myMatrix); //pass normal matrix
I'm trying to implement normal/steep parallax mapping, but I'm getting some weird artefacts. It looks like the view & light vectors are being distorted when they are transformed into tangent space, which is causing lighting and the uv offsetting of the parallax to be incorrect.
Vertex shader source:
// parallax.vert
#version 330
// Some drivers require the following
precision highp float;
//per vertex inputs stored on GPU memory
in vec3 in_Position;
in vec3 in_Normal;
in vec4 tangent;
//per mesh data sent at rendering time
uniform mat4 model; //object to world space transformations
uniform mat4 view; //world to eye space transformations
uniform mat4 projection; //eye space to screen space
uniform mat3 normalMat; //for transforming normals in
uniform vec4 lightPosition; //light position in world space
uniform vec3 eyePosition; //eye position in world space
//outputs to fragment shader
in vec2 in_TexCoord;
// multiply each vertex position by the MVP matrix
// and find V, L, and TBN Matrix vectors for the fragment shader
out VertexData
{
vec3 ts_L; //view vector tangent space
vec3 ts_V; //light vector tangent space
vec2 ex_TexCoord;
float dist;
} vertex;
void main(void)
{
//view space position for lighting calculations
vec3 position = ( ( view * model ) * vec4( in_Position,1.0 ) ).xyz;
//calculate view vector and light vector in view space
vec3 ex_V = -position;
vec3 ex_L = ( view * vec4( lightPosition.xyz,1.0 ) ).xyz - position;
vertex.ex_TexCoord = in_TexCoord;
//now calculate tbn matrix
//calculate biTangent, in view space
vec3 normal = normalMat*in_Normal;
vec3 tan = normalMat * tangent.xyz;
vec3 bitangent = normalize( cross( normal, tan.xyz ) * tangent.w );
//fragment stage will need this value for calculating light attenuation
vertex.dist = length( ex_L );
//calculate transpose tangent space matrix, as we are doing everything in T space in fragment stage
mat3 transTBN = transpose( mat3( tan, bitangent, normal ) );
//transform view space vertex view & light vectors into tangent space
vertex.ts_V = normalize( transTBN * ex_V );
vertex.ts_L = normalize( transTBN * ex_L );
gl_Position = projection * vec4(position,1.0);
}
fragment shader source:
// parallax.frag
#version 330
// Some drivers require the following
precision highp float;
struct lightStruct
{
vec4 ambient;
vec4 diffuse;
vec4 specular;
vec4 position;
float radius;
};
struct materialStruct
{
vec4 ambient;
vec4 diffuse;
vec4 specular;
vec4 emissive;
float shininess;
};
//uniforms sent at render time
uniform lightStruct light;
uniform materialStruct material;
uniform vec2 scaleBias;
//texture information
uniform sampler2D colourMap;
uniform sampler2D normalMap;
uniform sampler2D specularMap;
uniform sampler2D heightMap;
uniform sampler2D glossMap;
//inputs from vertex shader stage
in VertexData
{
vec3 ts_L; //view vector tangent space
vec3 ts_V; //light vector tangent space
vec2 ex_TexCoord;
float dist;
} vertex;
//final fragment colour
layout(location = 0) out vec4 out_Color;
void main(void) {
//calculate halfway vector for blinn phong
vec3 Half = normalize( vertex.ts_V + vertex.ts_L );
//and create some temporary types algorithm will need
vec2 texCoord;
vec3 diffuseTexel ;
vec3 specularTexel;
float gloss;
vec3 normal;
//first sample texel value from height map
float height = texture( heightMap, vertex.ex_TexCoord.st ).r;
//-----------------------------------------------
// steep parallax
float start = 1.0;
int numsteps = 50;
float step = 1.0/numsteps;
vec2 offset = vertex.ex_TexCoord;
vec2 delta = -vertex.ts_V.xy * scaleBias.r / (vertex.ts_V.z * numsteps);
for (int i = 0; i < numsteps; i++)
{
if(height < start)
{
start-=step;
offset+=delta;
height = texture(heightMap,offset).r;
}
else
{
texCoord = offset;
break;
}
}
//---------------------------------------------------
//normal maps are encoded with values between 0.0 and 0.5 being negative, so need to remove the encoding.
//sample textures
diffuseTexel = texture(colourMap,texCoord.st).rgb;
specularTexel = texture(specularMap,texCoord.st).rgb;
normal = normalize(texture(normalMap,texCoord.st).rgb * 2.0 - 1.0);
gloss = texture(glossMap,texCoord.st).r;
//--------------------------------------------------
//Lighting -- Blinn/Phong Lighting with specular, normal and gloss mapping
float lambert = max( dot (normal, vertex.ts_L ), 0.0 );
vec3 litColour = vec3(0.0,0.0,0.0);
//if light is worth calculating
if (lambert > 0.0)
{
vec3 R = normalize(-reflect(vertex.ts_L,normal));
float specularPower = pow(max(dot(R,Half),0.0),material.shininess);
litColour += light.diffuse.xyz * material.diffuse.xyz * lambert;
litColour += light.specular.xyz * material.specular.xyz * specularPower * gloss;
float attenuation = 1.0 + (0.01 * vertex.dist*vertex.dist) + (0.01 * vertex.dist);
attenuation = 1.0/attenuation;
litColour *= attenuation;
}
out_Color = vec4( litColour *diffuseTexel*(specularTexel) ,1.0);
}
The normal matrix is being calculated by taking the
transpose( inverse( modelview ) );
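As a sanity check that the CPU-side value matches, the same matrix can also be derived directly in the #version 330 vertex shader (a sketch using the view and model uniforms already declared above):
// In-shader normal matrix; GLSL 3.30 provides inverse() and transpose() for mat4.
mat3 normalMatCheck = mat3(transpose(inverse(view * model)));
vec3 normal = normalMatCheck * in_Normal;
vec3 tan = normalMatCheck * tangent.xyz;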
I've probably missed something really obvious, but I can't see it, and would be grateful for some input.
Update:
I've modified to perform the lighting & offsetting based on world space vectors, which has now produced the following results:
Lighting is now correct, and the parallax is good when viewed from close to the surface normal, but...
As the angle between view & normal increases, the parallax goes very weird.
Again, I'd be grateful for some help in this.
vec3 ex_V = -position;
Shouldn't this be:
vec3 ex_V = eyePosition - position;
I am trying to render a scene using normal mapping.
Therefore I am calculating the tangent space in C++ and storing the tangent and binormal separately in arrays, which are uploaded to my shader using glVertexAttribPointer.
Here is how I calculate the tangent space:
void ObjLoader::computeTangentSpace(MeshData &meshData) {
GLfloat* tangents = new GLfloat[meshData.vertex_position.size()]();
GLfloat* binormals = new GLfloat[meshData.vertex_position.size()]();
std::vector<glm::vec3 > tangent;
std::vector<glm::vec3 > binormal;
for(unsigned int i = 0; i < meshData.indices.size(); i = i+3){
glm::vec3 vertex0 = glm::vec3(meshData.vertex_position.at(meshData.indices.at(i)), meshData.vertex_position.at(meshData.indices.at(i)+1),meshData.vertex_position.at(meshData.indices.at(i)+2));
glm::vec3 vertex1 = glm::vec3(meshData.vertex_position.at(meshData.indices.at(i+1)), meshData.vertex_position.at(meshData.indices.at(i+1)+1),meshData.vertex_position.at(meshData.indices.at(i+1)+2));
glm::vec3 vertex2 = glm::vec3(meshData.vertex_position.at(meshData.indices.at(i+2)), meshData.vertex_position.at(meshData.indices.at(i+2)+1),meshData.vertex_position.at(meshData.indices.at(i+2)+2));
glm::vec3 normal = glm::cross((vertex1 - vertex0),(vertex2 - vertex0));
glm::vec3 deltaPos;
if(vertex0 == vertex1)
deltaPos = vertex2 - vertex0;
else
deltaPos = vertex1 - vertex0;
glm::vec2 uv0 = glm::vec2(meshData.vertex_texcoord.at(meshData.indices.at(i)), meshData.vertex_texcoord.at(meshData.indices.at(i)+1));
glm::vec2 uv1 = glm::vec2(meshData.vertex_texcoord.at(meshData.indices.at(i+1)), meshData.vertex_texcoord.at(meshData.indices.at(i+1)+1));
glm::vec2 uv2 = glm::vec2(meshData.vertex_texcoord.at(meshData.indices.at(i+2)), meshData.vertex_texcoord.at(meshData.indices.at(i+2)+1));
glm::vec2 deltaUV1 = uv1 - uv0;
glm::vec2 deltaUV2 = uv2 - uv0;
glm::vec3 tan; // tangents
glm::vec3 bin; // binormal
// avoid division by 0
if(deltaUV1.s != 0)
tan = deltaPos / deltaUV1.s;
else
tan = deltaPos / 1.0f;
tan = glm::normalize(tan - glm::dot(normal,tan)*normal);
bin = glm::normalize(glm::cross(tan, normal));
// write into array - for each vertex of the face the same value
tangents[meshData.indices.at(i)] = tan.x;
tangents[meshData.indices.at(i)+1] = tan.y;
tangents[meshData.indices.at(i)+2] = tan.z;
tangents[meshData.indices.at(i+1)] = tan.x;
tangents[meshData.indices.at(i+1)+1] = tan.y;
tangents[meshData.indices.at(i+1)+2] = tan.z;
tangents[meshData.indices.at(i+2)] = tan.x;
tangents[meshData.indices.at(i+2)+1] = tan.y;
tangents[meshData.indices.at(i+2)+1] = tan.z;
binormals[meshData.indices.at(i)] = bin.x;
binormals[meshData.indices.at(i)+1] = bin.y;
binormals[meshData.indices.at(i)+2] = bin.z;
binormals[meshData.indices.at(i+1)] = bin.x;
binormals[meshData.indices.at(i+1)+1] = bin.y;
binormals[meshData.indices.at(i+1)+2] = bin.z;
binormals[meshData.indices.at(i+2)] = bin.x;
binormals[meshData.indices.at(i+2)+1] = bin.y;
binormals[meshData.indices.at(i+2)+1] = bin.z;
}
// Copy the tangent and binormal to meshData
for(unsigned int i = 0; i < meshData.vertex_position.size(); i++){
meshData.vertex_tangent.push_back(tangents[i]);
meshData.vertex_binormal.push_back(binormals[i]);
}
}
And here are my vertex and fragment shader
Vertex Shader
#version 330
layout(location = 0) in vec3 vertex;
layout(location = 1) in vec3 vertex_normal;
layout(location = 2) in vec2 vertex_texcoord;
layout(location = 3) in vec3 vertex_tangent;
layout(location = 4) in vec3 vertex_binormal;
struct LightSource {
vec3 ambient_color;
vec3 diffuse_color;
vec3 specular_color;
vec3 position;
};
uniform vec3 lightPos;
out vec3 vertexNormal;
out vec3 eyeDir;
out vec3 lightDir;
out vec2 textureCoord;
uniform mat4 view;
uniform mat4 modelview;
uniform mat4 projection;
out vec4 myColor;
void main() {
mat4 normalMatrix = transpose(inverse(modelview));
gl_Position = projection * modelview * vec4(vertex, 1.0);
vec4 binormal = modelview * vec4(vertex_binormal,1);
vec4 tangent = modelview * vec4(vertex_tangent,1);
vec4 normal = vec4(vertex_normal,1);
mat3 tangentMatrix = mat3(tangent.xyz,binormal.xyz,normal.xyz);
vec3 vertexInCamSpace = (modelview * vec4(vertex, 1.0)).xyz;
eyeDir = tangentMatrix * normalize( -vertexInCamSpace);
vec3 lightInCamSpace = (view * vec4(lightPos, 1.0)).xyz;
lightDir = tangentMatrix * normalize((lightInCamSpace - vertexInCamSpace));
textureCoord = vertex_texcoord;
}
Fragment Shader
#version 330
struct LightSource {
vec3 ambient_color;
vec3 diffuse_color;
vec3 specular_color;
vec3 position;
};
struct Material {
vec3 ambient_color;
vec3 diffuse_color;
vec3 specular_color;
float specular_shininess;
};
uniform LightSource light;
uniform Material material;
in vec3 vertexNormal;
in vec3 eyeDir;
in vec3 lightDir;
in vec2 textureCoord;
uniform sampler2D texture;
uniform sampler2D normals;
out vec4 color;
in vec4 myColor;
in vec3 bin;
in vec3 tan;
void main() {
vec3 diffuse = texture2D(texture,textureCoord).rgb;
vec3 E = normalize(eyeDir);
vec3 N = texture2D(normals,textureCoord).xyz;
N = (N - 0.5) * 2.0;
vec3 ambientTerm = vec3(0);
vec3 diffuseTerm = vec3(0);
vec3 specularTerm = vec3(0);
vec3 L, H;
L = normalize(lightDir);
H = normalize(E + L);
ambientTerm += light.ambient_color;
diffuseTerm += light.diffuse_color * max(dot(L, N), 0);
specularTerm += light.specular_color * pow(max(dot(H, N), 0), material.specular_shininess);
ambientTerm *= material.ambient_color;
diffuseTerm *= material.diffuse_color;
specularTerm *= material.specular_color;
color = vec4(diffuse, 1) * vec4(ambientTerm + diffuseTerm + specularTerm, 1);
}
The problem is that sometimes I don't have values for tangent and binormal in the shader. Here are three screenshots which I hope will clarify my problem:
This is how the scene currently looks when I render it with the code above:
This is how the scene looks when I use lightDir as the color:
And the third shows the scene with eyeDir as the color:
All the pictures are taken from the same angle, without moving the camera or rotating anything.
I've already compared my code to several different sources on the web, but I couldn't find my mistake...
Additional information:
I am iterating over all faces. Three indices give me one triangle. The UV values for each vertex are stored at the same index. After a lot of debugging there, I am fairly sure these are the correct values, as I can find the corresponding values in the .obj file when searching with gedit.
After calculating the tangent and binormal I store them at the same index in the array as the vertex position. To my understanding this should give me the correct position, and I am calculating this for each vertex. For each vertex in a face I use the same tangent basis, which may later be overwritten when another face uses this vertex; this could affect my final result, but only in very small details...
EDIT:
For any other questions, here is the whole project:
http://www.incentivelabs.de/Sourcecode/normal_mapping.zip
In your vertex shader you have:
vec4 binormal = modelview * vec4(vertex_binormal,1);
vec4 tangent = modelview * vec4(vertex_tangent,1);
vec4 normal = vec4(vertex_normal,1);
This should be:
vec4 binormal = modelview * vec4(vertex_binormal,0);
vec4 tangent = modelview * vec4(vertex_tangent,0);
vec4 normal = modelview * vec4(vertex_normal,0);
Note the '0' instead of '1' (also I'm assuming you meant to transform your normal too). You use '0' here because you want to ignore the translation part of the modelview transformation (you're transforming a vector not a point).
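An equivalent sketch that avoids the issue entirely is to work with a 3x3 matrix, reusing the normalMatrix already computed in that vertex shader, so the translation can never leak in and non-uniform scaling is also handled:
// mat3 of transpose(inverse(modelview)): rotates the frame, corrects for scaling, ignores translation.
mat3 nm = mat3(normalMatrix);
vec3 T = normalize(nm * vertex_tangent);
vec3 B = normalize(nm * vertex_binormal);
vec3 N = normalize(nm * vertex_normal);
mat3 tangentMatrix = mat3(T, B, N);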