GLSL shadow mapping: shadow2DProj causes rendering artifacts - opengl

I managed to get shadow mapping to work in my OpenGL rendering engine, but it is producing some weird artifacts that I think are "shadow acne". However, I am using shadow2DProj to read the shadow value from the shadow depth map, which for me has proven to be the only way to get shadows to show up at all, so looking around at various tutorials at learnopengl, opengl-tutorials and others has yielded no help. I would like some advice on how I could mitigate this problem.
Here is the fragment shader in which I use the shadow map:
#version 330 core
out vec4 FragColor;

struct Light {
    vec3 position;
    vec3 ambient;
    vec3 diffuse;
    vec3 specular;
    vec3 attenuation;
};

in vec3 FragPos;
in vec3 Normal;
in vec2 TexCoords;
in vec4 ShadowCoords;

uniform vec3 viewPos;
uniform sampler2D diffuseMap;
uniform sampler2D specularMap;
uniform sampler2DShadow shadowMap;
uniform Light lights[4];
uniform float shininess;

float calculateShadow(vec3 lightDir)
{
    float shadowValue = shadow2DProj(shadowMap, ShadowCoords).r;
    float shadow = shadowValue;
    return shadow;
}

vec3 calculateAmbience(Light light, vec3 textureMap)
{
    return light.ambient * textureMap;
}

void main()
{
    vec4 tex = texture(diffuseMap, TexCoords);
    if (tex.a < 0.5)
    {
        discard;
    }

    vec3 ambient = vec3(0.0);
    vec3 diffuse = vec3(0.0);
    vec3 specular = vec3(0.0);

    vec3 norm = normalize(Normal);
    vec3 viewDir = normalize(viewPos - FragPos);

    for (int i = 0; i < 4; i++)
    {
        ambient = ambient + lights[i].ambient * tex.rgb;

        vec3 lightDir = normalize(lights[i].position - FragPos);
        float diff = max(dot(norm, lightDir), 0.0);
        diffuse = diffuse + (lights[i].diffuse * diff * tex.rgb);

        vec3 reflectDir = reflect(-lightDir, norm);
        float spec = pow(max(dot(viewDir, reflectDir), 0.0), shininess);
        specular = specular + (lights[i].specular * spec * tex.rgb);

        float dist = length(lights[i].position - FragPos);
        float attenuation = lights[i].attenuation.x + (lights[i].attenuation.y * dist) + (lights[i].attenuation.z * (dist * dist));
        if (attenuation > 0.0)
        {
            ambient *= 1.0 / attenuation;
            diffuse *= 1.0 / attenuation;
            specular *= 1.0 / attenuation;
        }
    }

    float shadow = calculateShadow(normalize(lights[0].position - FragPos));
    vec3 result = (ambient + (shadow) * (diffuse + specular));
    FragColor = vec4(result, 1.0);
}
This is the result I get. Notice the weird stripes on top of the cube:
Reading the description about shadow acne, this seems to be the same phenomenon (source: https://learnopengl.com/Advanced-Lighting/Shadows/Shadow-Mapping).
According to that article, I need to check whether the ShadowCoord depth value, minus a bias constant, is greater than the depth value read from the shadow map; if so, the fragment is in shadow. Now... here comes the problem. Since I am using shadow2DProj and not texture() to get my shadow value from the shadow map (through some intricate sorcery no doubt), I am unable to "port" that article's code into my shader and get it to work. Here is what I have tried:
float calculateShadow(vec3 lightDir)
{
    float closestDepth = shadow2DProj(shadowMap, ShadowCoords).r;
    float bias = 0.005;
    float currentDepth = ShadowCoords.z;

    float shadow = currentDepth - bias > closestDepth ? 1.0 : 0.0;
    return shadow;
}
But that produces no shadows at all, since the "shadow" float is always assigned 1.0 from the depth & bias check. I must admit that I do not fully understand what I am getting from using shadow2DProj(...).r as compared to texture(...).r, but it sure is something completely different.

This question is based on a misunderstanding of what shadow2DProj does. With a sampler2DShadow, the function does not return a depth value; it returns the result of a depth comparison, performed in hardware against the reference depth carried in the coordinate's third component (after the projective divide by w). Therefore, apply the bias before calling it.
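For reference, the shadow lookup is roughly equivalent to doing the projective divide and the comparison by hand on an ordinary depth texture. The sketch below is illustrative only: it assumes the same depth texture were bound as a plain sampler2D (here called shadowMapPlain, a made-up name) with comparison mode disabled, and it assumes the default GL_LEQUAL compare function; it also ignores the hardware PCF filtering you can get from a sampler2DShadow with linear filtering.
// Hypothetical manual equivalent of the shadow2DProj lookup above
uniform sampler2D shadowMapPlain;   // same depth texture, comparison mode off (assumed)

float manualShadow()
{
    vec3 proj = ShadowCoords.xyz / ShadowCoords.w;        // projective divide
    float closestDepth = texture(shadowMapPlain, proj.xy).r;
    float currentDepth = proj.z;                           // reference depth
    return currentDepth <= closestDepth ? 1.0 : 0.0;       // 1.0 = lit, like shadow2DProj
}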
Solution 1
Apply the bias prior to running the comparison. ShadowCoords.z is your currentDepth value; note that shadow2DProj divides the whole coordinate by ShadowCoords.w, so the bias has to be scaled by w for it to end up as a plain offset on the reference depth:
float calculateShadow(vec3 lightDir)
{
    const float bias = 0.005;
    // shadow2DProj expects a vec4; bias the reference depth before the projective divide
    float shadow = shadow2DProj(shadowMap, vec4(ShadowCoords.xy, ShadowCoords.z - bias * ShadowCoords.w, ShadowCoords.w)).r;
    return shadow;
}
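For this to work, ShadowCoords must carry the full projective coordinate, including w. The question does not show the vertex shader, so the following is only a sketch of a typical setup; lightSpaceMatrix and worldPos are assumed names, and the bias matrix maps clip space [-1, 1] to texture space [0, 1] so the fragment shader can sample the shadow map directly.
// Hypothetical vertex-shader side (not from the question)
const mat4 biasMatrix = mat4(0.5, 0.0, 0.0, 0.0,
                             0.0, 0.5, 0.0, 0.0,
                             0.0, 0.0, 0.5, 0.0,
                             0.5, 0.5, 0.5, 1.0);
uniform mat4 lightSpaceMatrix;   // light projection * light view (assumed)

// inside main(), after computing the world-space position:
ShadowCoords = biasMatrix * lightSpaceMatrix * vec4(worldPos, 1.0);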
Solution 2
Apply the bias while performing the light-space depth pass.
void glPolygonOffset(GLfloat factor, GLfloat units)
This function offsets depth values by factor * DZ + r * units, where DZ is the depth slope of the polygon and r is the smallest resolvable depth difference. Setting these to positive values pushes polygons slightly deeper into the scene, which acts like our bias.
During initialization:
glEnable(GL_POLYGON_OFFSET_FILL);
During Light Depth Pass:
// These parameters will need to be tweaked for your scene
// to prevent acne and mitigate peter panning
glPolygonOffset(1.0, 1.0);
// draw potential shadow casters
// return to default settings (no offset)
glPolygonOffset(0, 0);
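Alternatively, instead of resetting the offset to zero you can toggle the enable around the depth pass, so the main color pass is guaranteed to be unaffected; a minimal sketch:
glEnable(GL_POLYGON_OFFSET_FILL);
glPolygonOffset(1.0, 1.0);
// ... draw potential shadow casters ...
glDisable(GL_POLYGON_OFFSET_FILL);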
Shader Code:
// we don't even need the light direction for slope bias
float calculateShadow()
{
    float shadow = shadow2DProj(shadowMap, ShadowCoords).r;
    return shadow;
}

Related

How to fix improperly distributed lighting projected onto a model in OpenGL?

I've been trying to implement the Blinn-Phong lighting model to light an imported Wavefront OBJ model loaded through Assimp (github link).
The model seems to be loaded correctly, however, there seems to be a point where the lighting appears to be "cut off" near the middle of the model.
Image of the imported model with and without lighting enabled.
As you can see on the left of the image above, there is a region in the middle of the model where the light effectively gets "split up" which is not what is intended. It can be seen that there is a sort of discrepancy where the side facing towards the light source appears brighter than normal and the side away from the light source appears darker than normal without any sort of easing in between the two sides.
I believe there might be something wrong with how I've implemented the lighting model in the fragment shader but I cannot say for sure as to why this is happening.
Vertex shader:
#version 330 core

layout (location = 0) in vec3 vertPos;
layout (location = 1) in vec3 vertNormal;
layout (location = 2) in vec2 vertTexCoords;

out vec3 fragPos;
out vec3 fragNormal;
out vec2 fragTexCoords;

uniform mat4 proj, view, model;
uniform mat3 normalMat;

void main() {
    fragPos = vec3(model * vec4(vertPos, 1));
    gl_Position = proj * view * vec4(fragPos, 1);
    fragTexCoords = vertTexCoords;
    fragNormal = normalMat * vertNormal;
}
Fragment shader:
#version 330 core

in vec3 fragPos;
in vec3 fragNormal;
in vec2 fragTexCoords;

out vec4 FragColor;

const int noOfDiffuseMaps = 1;
const int noOfSpecularMaps = 1;

struct Material {
    sampler2D diffuseMaps[noOfDiffuseMaps], specularMaps[noOfSpecularMaps];
    float shininess;
};

struct Light {
    vec3 direction;
    vec3 ambient, diffuse, specular;
};

uniform Material material;
uniform Light light;
uniform vec3 viewPos;

const float pi = 3.14159265;
uniform float gamma = 2.2;

float near = 0.1;
float far = 100;

float LinearizeDepth(float depth)
{
    float z = depth * 2 - 1;
    return (2 * near * far) / (far + near - z * (far - near));
}

void main() {
    vec3 normal = normalize(fragNormal);
    vec3 calculatedColor = vec3(0);

    for (int i = 0; i < noOfDiffuseMaps; i++) {
        vec3 diffuseTexel = texture(material.diffuseMaps[i], fragTexCoords).rgb;

        // Ambient lighting
        vec3 ambient = diffuseTexel * light.ambient;

        // Diffuse lighting
        float diff = max(dot(light.direction, normal), 0);
        vec3 diffuse = diffuseTexel * light.diffuse * diff;

        calculatedColor += ambient + diffuse;
    }

    for (int i = 0; i < noOfSpecularMaps; i++) {
        vec3 specularTexel = texture(material.specularMaps[0], fragTexCoords).rgb;

        vec3 viewDir = normalize(viewPos - fragPos);
        vec3 halfWayDir = normalize(viewDir + light.direction);
        float energyConservation = (8 + material.shininess) / (8 * pi);

        // Specular lighting
        float spec = pow(max(dot(halfWayDir, normal), 0), material.shininess);
        vec3 specular = specularTexel * light.specular * spec * energyConservation;

        calculatedColor += specular;
    }

    float depthColor = 1 - LinearizeDepth(gl_FragCoord.z) / far;
    FragColor = vec4(pow(calculatedColor, vec3(1 / gamma)) * depthColor, 1);
}
Make sure your textures and colors are also linear (for an sRGB texture the decode is a simple pow 2.2), because you are doing gamma encoding at the end.
Also note that a harsh terminator like this is expected with a plain Lambert diffuse term.
http://filmicworlds.com/blog/linear-space-lighting-i-e-gamma/
Beyond that, if you expect a soft falloff, it has to come from an area light. For that you can implement wrap lighting or actual area lights.
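As an illustration of the first point: if the diffuse textures are authored in sRGB, they can be decoded to linear before the lighting math (or, equivalently, uploaded with an sRGB internal format such as GL_SRGB8_ALPHA8 so the driver decodes on sampling). A minimal sketch against the shader above, reusing its gamma uniform:
// Decode the sRGB texel to linear before lighting; the pow(1.0 / gamma)
// at the end of main() then re-encodes the final result.
vec3 diffuseTexel = pow(texture(material.diffuseMaps[i], fragTexCoords).rgb, vec3(gamma));
For softening the terminator without full area lights, wrap lighting is the usual cheap approximation: replace max(dot(light.direction, normal), 0) with something like max((dot(light.direction, normal) + wrap) / (1.0 + wrap), 0.0) for a small wrap constant (e.g. 0.2), which pushes the falloff past 90 degrees.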

How to make a normal map shader with limited range?

I have a simple normal map shader for 7 lights and it works across the entire screen. How can I make it work only within a limited distance? I tried calculating the distance between the light and the pixel and using a simple 'if' when the distance is too big, but that didn't work for me.
varying vec4 v_color;
varying vec2 v_texCoords;

uniform vec3 lightColor[7];
uniform vec3 light[7];
uniform sampler2D u_texture;
uniform sampler2D u_normals;
uniform vec2 resolution;
uniform bool useNormals;
uniform bool useShadow;
uniform float strength;
uniform bool yInvert;
uniform bool xInvert;
uniform vec4 ambientColor;

void main() {
    // sample color & normals from our textures
    vec4 color = texture2D(u_texture, v_texCoords.st);
    vec3 nColor = texture2D(u_normals, v_texCoords.st).rgb;

    // some bump map programs will need the Y value flipped..
    nColor.g = yInvert ? 1.0 - nColor.g : nColor.g;
    nColor.r = xInvert ? 1.0 - nColor.r : nColor.r;

    // this is for debugging purposes, allowing us to lower the intensity of our bump map
    vec3 nBase = vec3(0.5, 0.5, 1.0);
    nColor = mix(nBase, nColor, strength);

    // normals need to be converted to [-1.0, 1.0] range and normalized
    vec3 normal = normalize(nColor * 2.0 - 1.0);

    vec3 sum = vec3(0.0);
    for ( int i = 0; i < 7; ++i ){
        vec3 currentLight = light[i];
        vec3 currentLightColor = lightColor[i];

        // here we do a simple distance calculation
        vec3 deltaPos = vec3( (currentLight.xy - gl_FragCoord.xy) / resolution.xy, currentLight.z );

        vec3 lightDir = normalize(deltaPos * 1);
        float lambert = clamp(dot(normal, lightDir), 0.0, 1.0);

        vec3 result = color.rgb;
        result = (currentLightColor.rgb * lambert);
        result *= color.rgb;
        sum += result;
    }

    vec3 ambient = ambientColor.rgb * ambientColor.a;
    vec3 intensity = min(vec3(1.0), ambient + sum); // don't remember if min is critical, but I think it might be to avoid shifting the hue when multiple lights add up to something very bright.
    vec3 finalColor = color.rgb * intensity;
    //finalColor *= (ambientColor.rgb * ambientColor.a);
    gl_FragColor = v_color * vec4(finalColor, color.a);
}
edit:
my map editor screen
close-up of details
You need to measure the length of the light delta vector and use that to attenuate.
Right after the lightDir line, you can put something like this, but you'll have to adjust the FALLOFF constant to get the distance you want. FALLOFF must be greater than 0. As a starting point, a value of 0.1 will give you a light radius of about 4 units. Smaller values enlarge the radius. You might even want to define it as a parameter of each light (make them vec4s).
float distance = length(deltaPos);
float attenuation = 1.0 / (1.0 + FALLOFF * distance * distance);
float lambert = attenuation * clamp(dot(normal, lightDir), 0.0, 1.0);
This attenuation formula has a bell curve. If you want the curve to have a pointy tip, which is maybe more realistic (though probably pointless for 2D lighting), you can add a second parameter (which you can initially give a value of 0.1 and increase from there):
float attenuation = 1.0 / (1.0 + SHARPNESS * distance + FALLOFF * distance * distance);
Someone on this question posted this helpful chart you can play with to visually see how the parameters change the curve.
Also, don't multiply by an integer. This will cause the shader to fail to compile on some devices:
vec3 lightDir = normalize(deltaPos * 1); // The integer 1 is unsupported.
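Following the suggestion to make the lights vec4s, a per-light falloff could look roughly like the sketch below. This is only an illustration: the .w component and its meaning are assumptions, not part of the original uniforms.
uniform vec4 light[7];  // xy = screen position, z = height, w = per-light FALLOFF (assumed)

// inside the loop, replacing the fixed FALLOFF constant:
vec3 deltaPos = vec3((light[i].xy - gl_FragCoord.xy) / resolution.xy, light[i].z);
vec3 lightDir = normalize(deltaPos);
float distance = length(deltaPos);
float attenuation = 1.0 / (1.0 + light[i].w * distance * distance);
float lambert = attenuation * clamp(dot(normal, lightDir), 0.0, 1.0);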

opengl - spot light moves when camera rotates when using a shader

I implemented a simple shader for the lighting; it kind of works, but the light seems to move when the camera rotates (and only when it rotates).
I'm experimenting with a spotlight; this is what it looks like (it's the spot in the center):
If I now rotate the camera, the spot moves around; for example, here I looked down (I didn't move at all, just looked down) and it appeared at my feet:
I've looked it up and I've seen that it's a common mistake when mixing reference systems in the shader and/or when setting the light's position before moving the camera.
The thing is, I'm pretty sure I'm not doing these two things, but apparently I'm wrong; it's just that I can't find the bug.
Here's the shader:
Vertex Shader
varying vec3 vertexNormal;
varying vec3 lightDirection;

void main()
{
    vertexNormal = gl_NormalMatrix * gl_Normal;
    lightDirection = vec3(gl_LightSource[0].position.xyz - (gl_ModelViewMatrix * gl_Vertex).xyz);
    gl_Position = ftransform();
}
Fragment Shader
uniform vec3 ambient;
uniform vec3 diffuse;
uniform vec3 specular;
uniform float shininess;

varying vec3 vertexNormal;
varying vec3 lightDirection;

void main()
{
    vec3 color = vec3(0.0, 0.0, 0.0);
    vec3 lightDirNorm;
    vec3 eyeVector;
    vec3 half_vector;
    float diffuseFactor;
    float specularFactor;
    float attenuation;
    float lightDistance;

    vec3 normalDirection = normalize(vertexNormal);
    lightDirNorm = normalize(lightDirection);

    eyeVector = vec3(0.0, 0.0, 1.0);
    half_vector = normalize(lightDirNorm + eyeVector);

    diffuseFactor = max(0.0, dot(normalDirection, lightDirNorm));
    specularFactor = max(0.0, dot(normalDirection, half_vector));
    specularFactor = pow(specularFactor, shininess);

    color += ambient * gl_LightSource[0].ambient;
    color += diffuseFactor * diffuse * gl_LightSource[0].diffuse;
    color += specularFactor * specular * gl_LightSource[0].specular;

    lightDistance = length(lightDirection);

    float constantAttenuation = 1.0;
    float linearAttenuation = (0.02 / SCALE_FACTOR) * lightDistance;
    float quadraticAttenuation = (0.0 / SCALE_FACTOR) * lightDistance * lightDistance;

    attenuation = 1.0 / (constantAttenuation + linearAttenuation + quadraticAttenuation);

    // If it's a spotlight
    if (gl_LightSource[0].spotCutoff <= 90.0)
    {
        float spotEffect = dot(normalize(gl_LightSource[0].spotDirection), normalize(-lightDirection));
        if (spotEffect > gl_LightSource[0].spotCosCutoff)
        {
            spotEffect = pow(spotEffect, gl_LightSource[0].spotExponent);
            attenuation = spotEffect / (constantAttenuation + linearAttenuation + quadraticAttenuation);
        }
        else
            attenuation = 0.0;
    }

    // Multiply the color by the attenuation factor
    color = color * attenuation;

    gl_FragColor = vec4(color, 1.0);
}
Now, I can't show you the code where I render the things, because it's a custom language which integrates opengl and it's designed to create 3D applications (it wouldn't help to show you); but what I do is something like this:
SetupLights();
UpdateCamera();
RenderStuff();
Where:
SetupLights contains actual opengl calls that setup the lights and their positions;
UpdateCamera updates the camera's position using the built-in classes of the language; I don't have much power here;
RenderStuff calls the built-in functions of the language to draw the scene; I don't have much power here either.
So, either I'm doing something wrong in the shader or there's something in the language that "behind the scenes" breaks things.
Can you point me in the right direction?
You wrote:
"the light's position is already in world coordinates, and that is where I'm doing the computations"
However, since you're applying gl_ModelViewMatrix to your vertex and gl_NormalMatrix to your normal, those values are in view space, which would explain the moving light.
As an aside, your eye vector looks like it is meant to be in view coordinates; however, view space is a right-handed coordinate system, so "forward" points along the negative z-axis. Also, your specular computation will likely be off, because you are using the same eye vector for all fragments; for a perspective projection it should vary per fragment, pointing from the fragment's view-space position towards the eye.
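If the light position is meant to stay fixed in the world, remember that glLightfv(GL_LIGHT0, GL_POSITION, ...) transforms the position you pass by the modelview matrix that is current at the moment of the call, so the usual fix is to set the light position each frame after the camera transform has been applied. A rough sketch of that ordering (the exact camera call depends on your engine and is only a placeholder here):
// Each frame, before drawing:
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
// apply the camera (view) transform here -- the engine's UpdateCamera() equivalent

GLfloat lightPosWorld[4] = { 0.0f, 5.0f, 0.0f, 1.0f };   // example world-space position
glLightfv(GL_LIGHT0, GL_POSITION, lightPosWorld);        // stored in eye space, consistent
                                                         // with gl_ModelViewMatrix in the shader
// ... then render the scene
In terms of the pseudocode above, that amounts to calling SetupLights() after UpdateCamera() rather than before it.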

GLSL shadow multiplication doesn't work

I'm facing a very strange problem which seems to originate from a simple multiplication in the fragment shader.
I'm trying to calculate shadows using a framebuffer that renders only the depths from the light's perspective, which is a common technique that is easier for beginners to implement.
Fragment Shader:
#version 330 core

uniform sampler2D parquet;
uniform samplerCube depthMaps[15];

in vec2 TexCoords;
out vec4 color;
in vec3 Normal;
in vec3 FragPos;

uniform vec3 lightPos[15];
uniform vec3 lightColor[15];
uniform float intensity[15];
uniform float far_plane;
uniform vec3 viewPos;

float ShadowCalculation(vec3 fragPos, vec3 lightPost, samplerCube depthMaps)
{
    vec3 fragToLight = fragPos - lightPost;
    float closestDepth = texture(depthMaps, fragToLight).r;

    // original depth value
    closestDepth *= far_plane;

    float currentDepth = length(fragToLight);

    float bias = 0.05;
    float shadow = currentDepth - bias > closestDepth ? 1.0 : 0.0;
    return shadow;
}

void main()
{
    vec3 norm = normalize(Normal);
    vec3 lightDir = normalize(lightPos[0] - FragPos);
    float diff = max(dot(norm, lightDir), 0.0);
    vec3 diffuse = diff * lightColor[0];

    float _distance = length(vec3(FragPos - lightPos[0]));
    float attenuation = 1.0 / pow(_distance + 1, 2);
    if (attenuation > 1.0) attenuation = 1.0;

    float intens = intensity[0];
    if (intensity[0] > 150) intens = 150.0f;

    vec3 resulta = (diffuse * attenuation) * intens;

    // texture color
    vec3 tCol = vec3(texture(parquet, TexCoords));

    // gamma correction
    tCol.rgb = pow(tCol.rgb, vec3(0.45));

    vec3 colors = resulta * tCol * (1.0f - ShadowCalculation(FragPos, lightPos[0], depthMaps[0]));
    color = vec4(colors, 1.0f);
}
The last multiplication inside main() behaves strangely. Multiplying the result of the diffuse light by the texture color renders nicely (so we have no shadows, just diffuse lighting):
//works
vec3 colors = resulta * tCol;
Multiplying the diffuse light by the shadow result also renders nicely (now we have no textures):
//works
vec3 colors = resulta * (1.0f - ShadowCalculation(FragPos, lightPos[0], depthMaps[0]));
Doing it all together renders just a black screen. I've tried all sorts of things in the fragment shader, but none have worked.
Lastly, here is the fragment shader used to render the cubemap:
#version 330 core

in vec4 FragPos;

uniform vec3 lightPos;
uniform float far_plane;

void main()
{
    float lightDistance = length(FragPos.xyz - lightPos);

    // map to [0;1] range by dividing by far_plane
    lightDistance = lightDistance / far_plane;

    gl_FragDepth = lightDistance;
}
Can you spot any logical error? I'm using arrays of uniforms since I'll later need multiple lights at once.
After a while spent trying to visually debug the shader's output, I finally found the error: I was binding the depth cubemap texture incorrectly, and this caused the strange behaviour I was seeing in the last multiplication.
Lesson learned: it's not always the fragment shader's fault.
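For completeness, a typical way to bind the two samplers used above is to give each one its own texture unit and point the sampler uniform at that unit. The identifiers textureID, depthCubemapID and shaderProgram below are placeholders, not from the question:
glUseProgram(shaderProgram);

// unit 0: the diffuse texture sampled by "parquet"
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textureID);
glUniform1i(glGetUniformLocation(shaderProgram, "parquet"), 0);

// unit 1: the first depth cubemap sampled by "depthMaps[0]"
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_CUBE_MAP, depthCubemapID);
glUniform1i(glGetUniformLocation(shaderProgram, "depthMaps[0]"), 1);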

Shadowmapping always produces shadows beyond far plane

I am working on the beginnings of omnidirectional shadow mapping in my engine. For now I am only producing one shadowmap as a test. I am getting an odd result when using my current shaders. Here is a screenshot which shows the problem:
I am using a near value of 0.5 and a far value of 5.0 in the projection matrix for the shadowmap render. As near as I can tell, any fragment with a light-space z larger than my far plane distance is being treated by my fragment shader as in shadow.
This is my fragment shader:
in vec2 st;

uniform sampler2D colorTexture;
uniform sampler2D normalTexture;
uniform sampler2D depthTexture;
uniform sampler2D shadowmapTexture;

uniform mat4 invProj;
uniform mat4 lightProj;
uniform vec3 lightPosition;

out vec3 color;

void main () {
    vec3 clipSpaceCoords;
    clipSpaceCoords.xy = st.xy * 2.0 - 1.0;
    clipSpaceCoords.z = texture(depthTexture, st).x * 2.0 - 1.0;

    vec4 position = invProj * vec4(clipSpaceCoords, 1.0);
    position.xyz /= position.w;

    vec4 lightSpace = lightProj * vec4(position.xyz, 1.0);
    lightSpace.xyz /= lightSpace.w;
    lightSpace.xyz = lightSpace.xyz * 0.5 + 0.5;

    float lightDepth = texture(shadowmapTexture, lightSpace.xy).x;

    vec3 normal = texture(normalTexture, st).xyz;
    vec3 diffuse;
    float shadowFactor = 1.0;

    if (lightSpace.w > 0.0 && lightSpace.z > lightDepth + 0.0042) {
        shadowFactor = 0.2;
    }
    else {
        float k = 0.00001;
        vec3 distanceToLight = lightPosition - position.xyz;
        float distanceLength = length(distanceToLight);
        float attenuation = (1.0 / (1.0 + (0.1 * distanceLength) + k * (distanceLength * distanceLength)));
        float diffuseTemp = max(dot(normalize(normal), normalize(distanceToLight)), 0.0);
        diffuse = vec3(1.0, 1.0, 1.0) * attenuation * diffuseTemp;
    }

    vec3 gamma = vec3(1.0 / 2.2);
    color = pow(texture(colorTexture, st).xyz * shadowFactor + diffuse, gamma);
}
How can I fix this issue (other than increasing my far plane distance)?
One other question, as this is the first time I have attempted shadowmapping: am I doing the lighting in relation to the shadows correctly?