How to make normal map shader with limited range? - opengl

I have a simple normal map shader for 7 lights, and it works across the entire screen. How on earth do I make it work only within a limited distance? I tried calculating the distance between the light and the pixel and cutting the light off with a simple `if` when the distance gets too big, but that didn't work for me.
varying vec4 v_color;
varying vec2 v_texCoords;

uniform vec3 lightColor[7];
uniform vec3 light[7];
uniform sampler2D u_texture;
uniform sampler2D u_normals;
uniform vec2 resolution;
uniform bool useNormals;
uniform bool useShadow;
uniform float strength;
uniform bool yInvert;
uniform bool xInvert;
uniform vec4 ambientColor;

void main() {
    // sample color & normals from our textures
    vec4 color = texture2D(u_texture, v_texCoords.st);
    vec3 nColor = texture2D(u_normals, v_texCoords.st).rgb;

    // some bump map programs will need the Y value flipped..
    nColor.g = yInvert ? 1.0 - nColor.g : nColor.g;
    nColor.r = xInvert ? 1.0 - nColor.r : nColor.r;

    // this is for debugging purposes, allowing us to lower the intensity of our bump map
    vec3 nBase = vec3(0.5, 0.5, 1.0);
    nColor = mix(nBase, nColor, strength);

    // normals need to be converted to [-1.0, 1.0] range and normalized
    vec3 normal = normalize(nColor * 2.0 - 1.0);

    vec3 sum = vec3(0.0);
    for ( int i = 0; i < 7; ++i ){
        vec3 currentLight = light[i];
        vec3 currentLightColor = lightColor[i];

        // here we do a simple distance calculation
        vec3 deltaPos = vec3( (currentLight.xy - gl_FragCoord.xy) / resolution.xy, currentLight.z );
        vec3 lightDir = normalize(deltaPos * 1);
        float lambert = clamp(dot(normal, lightDir), 0.0, 1.0);

        vec3 result = currentLightColor.rgb * lambert;
        result *= color.rgb;
        sum += result;
    }

    vec3 ambient = ambientColor.rgb * ambientColor.a;
    // min() is (I think) there to avoid shifting the hue when multiple lights add up to something very bright
    vec3 intensity = min(vec3(1.0), ambient + sum);
    vec3 finalColor = color.rgb * intensity;
    //finalColor *= (ambientColor.rgb * ambientColor.a);
    gl_FragColor = v_color * vec4(finalColor, color.a);
}
Edit: (screenshots: my map editor screen, and a close-up of the details)

You need to measure the length of the light delta vector and use that to attenuate.
Right after the lightDir line, you can put something like this, but you'll have to adjust the FALLOFF constant to get the distance you want. FALLOFF must be greater than 0. As a starting point, a value of 0.1 will give you a light radius of about 4 units. Smaller values enlarge the radius. You might even want to define it as a parameter of each light (make them vec4s).
float distance = length(deltaPos);
float attenuation = 1.0 / (1.0 + FALLOFF * distance * distance);
float lambert = attenuation * clamp(dot(normal, lightDir), 0.0, 1.0);
This attenuation formula has a bell curve. If you want the curve to have a pointy tip, which is maybe more realistic (though probably pointless for 2D lighting), you can add a second parameter (which you can initially give a value of 0.1 and increase from there):
float attenuation = 1.0 / (1.0 + SHARPNESS * distance + FALLOFF * distance * distance);
Someone on this question posted a helpful interactive chart you can play with to see visually how the parameters change the curve.
Also, don't multiply by an integer. This will cause the shader to fail to compile on some devices:
vec3 lightDir = normalize(deltaPos * 1); // The integer 1 is unsupported.
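Putting it all together, the corrected loop body might look like this (a sketch; FALLOFF is a placeholder constant you would tune, or pass in as a per-light uniform):
for ( int i = 0; i < 7; ++i ){
    vec3 currentLight = light[i];
    vec3 currentLightColor = lightColor[i];
    vec3 deltaPos = vec3( (currentLight.xy - gl_FragCoord.xy) / resolution.xy, currentLight.z );
    vec3 lightDir = normalize(deltaPos); // note: no integer multiply
    float distance = length(deltaPos);
    float attenuation = 1.0 / (1.0 + FALLOFF * distance * distance);
    float lambert = attenuation * clamp(dot(normal, lightDir), 0.0, 1.0);
    sum += currentLightColor.rgb * lambert * color.rgb;
}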

Related

GLSL shadow mapping: shadow2DProj causes rendering artifacts

I managed to get shadow mapping to work in my OpenGL rendering engine, but it is producing some weird artifacts that I think are "shadow acne". However, I am using shadow2DProj to get the shadow value from the shadow depth map, which for me has proven to be the only way to get shadows to show up at all. Therefore, looking around at various tutorials at learnopengl, opengl-tutorials, and others has yielded no help. I would like some advice on how to mitigate this problem.
Here is the shader I use to draw the scene with the shadow map:
#version 330 core
out vec4 FragColor;

struct Light {
    vec3 position;
    vec3 ambient;
    vec3 diffuse;
    vec3 specular;
    vec3 attenuation;
};

in vec3 FragPos;
in vec3 Normal;
in vec2 TexCoords;
in vec4 ShadowCoords;

uniform vec3 viewPos;
uniform sampler2D diffuseMap;
uniform sampler2D specularMap;
uniform sampler2DShadow shadowMap;
uniform Light lights[4];
uniform float shininess;

float calculateShadow(vec3 lightDir)
{
    float shadowValue = shadow2DProj(shadowMap, ShadowCoords).r;
    float shadow = shadowValue;
    return shadow;
}

vec3 calculateAmbience(Light light, vec3 textureMap)
{
    return light.ambient * textureMap;
}

void main()
{
    vec4 tex = texture(diffuseMap, TexCoords);
    if (tex.a < 0.5)
    {
        discard;
    }

    vec3 ambient = vec3(0.0);
    vec3 diffuse = vec3(0.0);
    vec3 specular = vec3(0.0);

    vec3 norm = normalize(Normal);
    vec3 viewDir = normalize(viewPos - FragPos);

    for (int i = 0; i < 4; i++)
    {
        ambient = ambient + lights[i].ambient * tex.rgb;

        vec3 lightDir = normalize(lights[i].position - FragPos);
        float diff = max(dot(norm, lightDir), 0.0);
        diffuse = diffuse + (lights[i].diffuse * diff * tex.rgb);

        vec3 reflectDir = reflect(-lightDir, norm);
        float spec = pow(max(dot(viewDir, reflectDir), 0.0), shininess);
        specular = specular + (lights[i].specular * spec * tex.rgb);

        float dist = length(lights[i].position - FragPos);
        float attenuation = lights[i].attenuation.x + (lights[i].attenuation.y * dist) + (lights[i].attenuation.z * (dist * dist));
        if (attenuation > 0.0)
        {
            ambient *= 1.0 / attenuation;
            diffuse *= 1.0 / attenuation;
            specular *= 1.0 / attenuation;
        }
    }

    float shadow = calculateShadow(normalize(lights[0].position - FragPos));
    vec3 result = (ambient + (shadow) * (diffuse + specular));
    FragColor = vec4(result, 1.0);
}
This is the result I get. Notice the weird stripes on top of the cube:
Reading the description about shadow acne, this seems to be the same phenomenon (source: https://learnopengl.com/Advanced-Lighting/Shadows/Shadow-Mapping).
According to that article, I need to check whether the ShadowCoord depth value, minus a bias constant, is lower than the shadow value read from the shadow map; if so, we have shadow. Now... here comes the problem. Since I am using shadow2DProj and not texture() to get my shadow value from the shadow map (through some intricate sorcery, no doubt), I am unable to "port" that article's code into my shader and get it to work. Here is what I have tried:
float calculateShadow(vec3 lightDir)
{
    float closestDepth = shadow2DProj(shadowMap, ShadowCoords).r;
    float bias = 0.005;
    float currentDepth = ShadowCoords.z;
    float shadow = currentDepth - bias > closestDepth ? 1.0 : 0.0;
    return shadow;
}
But that produces no shadows at all, since the "shadow" float is always assigned 1.0 from the depth & bias check. I must admit that I do not fully understand what I am getting from using shadow2DProj(...).r as compared to texture(...).r, but it sure is something completely different.
There is a misunderstanding here about what shadow2DProj does: the function does not return a depth value, but a depth comparison result. Therefore, apply the bias before calling it.
Solution 1
Apply the bias prior to running the comparison. ShadowCoords.z is your currentDepth value.
float calculateShadow(vec3 lightDir)
{
    const float bias = 0.005;
    // shadow2DProj expects a vec4 and divides by .w before comparing,
    // so fold the bias into the z component (note the divide also scales it by 1/w)
    float shadow = shadow2DProj(shadowMap, vec4(ShadowCoords.xy, ShadowCoords.z - bias, ShadowCoords.w)).r;
    return shadow;
}
Solution 2
Apply the bias while performing the light-space depth pass.
glPolygonOffset(float factor, float units)
This function offsets depth values by factor * DZ + r * units, where DZ is the depth slope of the polygon and r is the smallest resolvable depth difference. Setting these to positive values pushes polygons slightly deeper into the scene, which acts like our bias.
During initialization:
glEnable(GL_POLYGON_OFFSET_FILL);
During Light Depth Pass:
// These parameters will need to be tweaked for your scene
// to prevent acne and mitigate peter panning
glPolygonOffset(1.0, 1.0);
// draw potential shadow casters
// return to default settings (no offset)
glPolygonOffset(0, 0);
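Putting the pass together (a sketch; shadowFBO and renderShadowCasters() are placeholders for your own framebuffer and draw calls):
// initialization
glEnable(GL_POLYGON_OFFSET_FILL);

// light depth pass, each frame
glBindFramebuffer(GL_FRAMEBUFFER, shadowFBO);
glClear(GL_DEPTH_BUFFER_BIT);
glPolygonOffset(1.0f, 1.0f);   // tweak per scene
renderShadowCasters();         // placeholder for your shadow-caster draw calls
glPolygonOffset(0.0f, 0.0f);   // back to no offset for the main pass
glBindFramebuffer(GL_FRAMEBUFFER, 0);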
Shader Code:
// we don't even need the light direction for slope bias
float calculateShadow()
{
    float shadow = shadow2DProj(shadowMap, ShadowCoords).r;
    return shadow;
}

opengl - spot light moves when camera rotates when using a shader

I implemented a simple shader for the lighting; it kind of works, but the light seems to move when the camera rotates (and only when it rotates).
I'm experimenting with a spotlight; this is what it looks like (it's the spot in the center):
If I now rotate the camera, the spot moves around; for example, here I looked down (I didn't move at all, just looked down) and the spot seemed to be at my feet:
I've looked it up and I've seen that it's a common mistake when mixing reference systems in the shader and/or when setting the light's position before moving the camera.
The thing is, I'm pretty sure I'm not doing these two things, but apparently I'm wrong; it's just that I can't find the bug.
Here's the shader:
Vertex Shader
varying vec3 vertexNormal;
varying vec3 lightDirection;

void main()
{
    vertexNormal = gl_NormalMatrix * gl_Normal;
    lightDirection = vec3(gl_LightSource[0].position.xyz - (gl_ModelViewMatrix * gl_Vertex).xyz);
    gl_Position = ftransform();
}
Fragment Shader
uniform vec3 ambient;
uniform vec3 diffuse;
uniform vec3 specular;
uniform float shininess;

varying vec3 vertexNormal;
varying vec3 lightDirection;

void main()
{
    vec3 color = vec3(0.0, 0.0, 0.0);
    vec3 lightDirNorm;
    vec3 eyeVector;
    vec3 half_vector;
    float diffuseFactor;
    float specularFactor;
    float attenuation;
    float lightDistance;

    vec3 normalDirection = normalize(vertexNormal);
    lightDirNorm = normalize(lightDirection);
    eyeVector = vec3(0.0, 0.0, 1.0);
    half_vector = normalize(lightDirNorm + eyeVector);

    diffuseFactor = max(0.0, dot(normalDirection, lightDirNorm));
    specularFactor = max(0.0, dot(normalDirection, half_vector));
    specularFactor = pow(specularFactor, shininess);

    color += ambient * gl_LightSource[0].ambient.rgb;
    color += diffuseFactor * diffuse * gl_LightSource[0].diffuse.rgb;
    color += specularFactor * specular * gl_LightSource[0].specular.rgb;

    lightDistance = length(lightDirection);
    float constantAttenuation = 1.0;
    float linearAttenuation = (0.02 / SCALE_FACTOR) * lightDistance;
    float quadraticAttenuation = (0.0 / SCALE_FACTOR) * lightDistance * lightDistance;
    attenuation = 1.0 / (constantAttenuation + linearAttenuation + quadraticAttenuation);

    // If it's a spotlight
    if (gl_LightSource[0].spotCutoff <= 90.0)
    {
        float spotEffect = dot(normalize(gl_LightSource[0].spotDirection), normalize(-lightDirection));
        if (spotEffect > gl_LightSource[0].spotCosCutoff)
        {
            spotEffect = pow(spotEffect, gl_LightSource[0].spotExponent);
            attenuation = spotEffect / (constantAttenuation + linearAttenuation + quadraticAttenuation);
        }
        else
            attenuation = 0.0;
    }

    // multiply the color by the attenuation factor
    color = color * attenuation;
    gl_FragColor = vec4(color, 1.0);
}
Now, I can't show you the code where I render things, because it's written in a custom language that integrates OpenGL and is designed for creating 3D applications (so it wouldn't help to show it); but what I do is something like this:
SetupLights();
UpdateCamera();
RenderStuff();
Where:
SetupLights contains the actual OpenGL calls that set up the lights and their positions;
UpdateCamera updates the camera's position using the language's built-in classes; I don't have much control here;
RenderStuff calls the language's built-in functions to draw the scene; I don't have much control here either.
So, either I'm doing something wrong in the shader or there's something in the language that "behind the scenes" breaks things.
Can you point me in the right direction?
You wrote:

    the light's position is already in world coordinates, and that is where I'm doing the computations

However, since you're applying gl_ModelViewMatrix to your vertex and gl_NormalMatrix to your normal, those values are in view space, which is likely what causes the moving light.
As an aside, your eye vector looks like it should be in view coordinates; however, view space is a right-handed coordinate system, so "forward" points along the negative z-axis. Also, your specular computation will likely be off, since you're using the same eye vector for every fragment; it should instead point from the fragment's view-space position toward the eye.
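To illustrate that last point, the eye vector can be derived per fragment from the interpolated view-space position (a sketch; vertexPosition is a hypothetical extra varying written by the vertex shader):
// vertex shader
varying vec3 vertexPosition;
void main()
{
    vertexNormal = gl_NormalMatrix * gl_Normal;
    vertexPosition = (gl_ModelViewMatrix * gl_Vertex).xyz;
    lightDirection = gl_LightSource[0].position.xyz - vertexPosition;
    gl_Position = ftransform();
}

// fragment shader: in view space the eye sits at the origin,
// so the direction from the fragment toward the eye is just -position
vec3 eyeVector = normalize(-vertexPosition);
vec3 half_vector = normalize(lightDirNorm + eyeVector);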

Shadowmapping always produces shadows beyond far plane

I am working on the beginnings of omnidirectional shadow mapping in my engine. For now I am only producing one shadowmap as a test. I am getting an odd result when using my current shaders. Here is a screenshot which shows the problem:
I am using a near value of 0.5 and a far value of 5.0 in the projection matrix for the shadowmap render. As near as I can tell, any value with a light-space z larger than my far plane distance is being computed by my fragment shader as in shadow.
This is my fragment shader:
in vec2 st;

uniform sampler2D colorTexture;
uniform sampler2D normalTexture;
uniform sampler2D depthTexture;
uniform sampler2D shadowmapTexture;
uniform mat4 invProj;
uniform mat4 lightProj;
uniform vec3 lightPosition;

out vec3 color;

void main () {
    // reconstruct the view-space position from the depth buffer
    vec3 clipSpaceCoords;
    clipSpaceCoords.xy = st.xy * 2.0 - 1.0;
    clipSpaceCoords.z = texture(depthTexture, st).x * 2.0 - 1.0;
    vec4 position = invProj * vec4(clipSpaceCoords, 1.0);
    position.xyz /= position.w;

    // project into light space and sample the shadow map
    vec4 lightSpace = lightProj * vec4(position.xyz, 1.0);
    lightSpace.xyz /= lightSpace.w;
    lightSpace.xyz = lightSpace.xyz * 0.5 + 0.5;
    float lightDepth = texture(shadowmapTexture, lightSpace.xy).x;

    vec3 normal = texture(normalTexture, st).xyz;
    vec3 diffuse;
    float shadowFactor = 1.0;
    if (lightSpace.w > 0.0 && lightSpace.z > lightDepth + 0.0042) {
        shadowFactor = 0.2;
    }
    else {
        float k = 0.00001;
        vec3 distanceToLight = lightPosition - position.xyz;
        float distanceLength = length(distanceToLight);
        float attenuation = 1.0 / (1.0 + (0.1 * distanceLength) + k * (distanceLength * distanceLength));
        float diffuseTemp = max(dot(normalize(normal), normalize(distanceToLight)), 0.0);
        diffuse = vec3(1.0, 1.0, 1.0) * attenuation * diffuseTemp;
    }
    vec3 gamma = vec3(1.0 / 2.2);
    color = pow(texture(colorTexture, st).xyz * shadowFactor + diffuse, gamma);
}
How can I fix this issue (other than increasing my far plane distance)?
One other question, as this is the first time I have attempted shadow mapping: am I handling the lighting in relation to the shadows correctly?
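One common guard for the far-plane problem, similar to the over-sampling fix in the learnopengl shadow-mapping article, is to treat anything that projects beyond the light's far plane as lit, since its depth falls outside the shadow map's [0, 1] range. For example:
if (lightSpace.z > 1.0) {
    shadowFactor = 1.0; // beyond the light's far plane: no shadow data, assume lit
}
else if (lightSpace.w > 0.0 && lightSpace.z > lightDepth + 0.0042) {
    shadowFactor = 0.2;
}
// the existing else branch (attenuated diffuse lighting) stays unchanged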

OpenGL shader issue

I tried to incorporate attenuation, but it does nothing.
I have diffuse, ambient, and specular lighting working. I just need to dim the light as the fragments get further away from the light.
Also, I have set the attenuation parameter for my light:
glLightf(GL_LIGHT0, GL_QUADRATIC_ATTENUATION, 0.0004f);
This is the floor lighting, the light is positioned just behind the cube:
http://oi43.tinypic.com/i39fuo.jpg
.vert
varying vec3 N;
varying vec3 v;
varying vec3 c;
varying float dist;

void main(void)
{
    vec4 ecPos;
    vec3 aux;

    ecPos = gl_ModelViewMatrix * gl_Vertex;
    aux = vec3(gl_LightSource[0].position - ecPos);
    dist = length(aux);

    c = vec3(gl_Color);
    v = vec3(gl_ModelViewMatrix * gl_Vertex);
    N = normalize(gl_NormalMatrix * gl_Normal);
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
.frag
varying vec3 N;
varying vec3 v;
varying vec3 c;
varying float dist;

void main (void)
{
    float att;
    att = 1.0 / (gl_LightSource[0].constantAttenuation +
                 gl_LightSource[0].linearAttenuation * dist +
                 gl_LightSource[0].quadraticAttenuation * dist * dist);

    vec3 L = normalize(gl_LightSource[0].position.xyz - v);
    vec3 E = normalize(-v); // we are in Eye Coordinates, so EyePos is (0,0,0)
    vec3 R = normalize(-reflect(L, N));

    float nDotL = max(dot(N, L), 0.0);
    float rDotE = max(dot(R, E), 0.0);
    float power = pow(rDotE, gl_FrontMaterial.shininess);

    // calculate Ambient Term:
    vec4 Iamb = gl_FrontLightProduct[0].ambient * att;

    // calculate Diffuse Term:
    vec4 Idiff = gl_FrontLightProduct[0].diffuse * nDotL * att;
    Idiff = clamp(Idiff, 0.0, 1.0);

    // calculate Specular Term:
    vec4 Ispec = gl_FrontLightProduct[0].specular * power * att;
    Ispec = clamp(Ispec, 0.0, 1.0);

    // write Total Color:
    gl_FragColor = Iamb + Idiff + Ispec + vec4(c, 0.0);
}
From this image I can't really see anything. How about rendering a small object at the light source's position?
Which objects use this shader?
Some things that come to my mind:
You normalize your normal in the vertex shader, which is an unnecessary step.
Vectors passed from the vertex to the fragment shader must be normalized inside the fragment shader, since they get interpolated in between.
As long as you don't do any length-based calculations in the vertex shader (and you don't), no normalization is necessary there.
You should normalize the normal in the fragment shader; then you don't need to normalize your reflect vector.
Attenuation is not based on anything "complex" calculated in the shader, so output it on its own and then check the rest. What does your diffuse term look like?
Further hints:
You could place the light vector and attenuation calculation inside the vertex shader and pass them to the fragment shader (packed into one vec4) to save interpolators; a sketch follows below.
The final specular clamp is unnecessary; the values should be within the [0, 1] range automatically. If not, you have a problem.
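A minimal sketch of both suggestions combined, assuming a hypothetical varying L_att that packs the light vector into .xyz and the attenuation factor into .w:
// .vert: compute the attenuation per vertex and pack it with the light vector
varying vec3 N;
varying vec4 L_att;
void main(void)
{
    vec4 ecPos = gl_ModelViewMatrix * gl_Vertex;
    vec3 aux = vec3(gl_LightSource[0].position - ecPos);
    float dist = length(aux);
    float att = 1.0 / (gl_LightSource[0].constantAttenuation +
                       gl_LightSource[0].linearAttenuation * dist +
                       gl_LightSource[0].quadraticAttenuation * dist * dist);
    L_att = vec4(aux, att); // one varying instead of two
    N = gl_NormalMatrix * gl_Normal;
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// .frag: renormalize the interpolated vectors before use
varying vec3 N;
varying vec4 L_att;
void main(void)
{
    vec3 n = normalize(N); // interpolation denormalizes, so renormalize here
    vec3 L = normalize(L_att.xyz);
    float diffuseFactor = max(dot(n, L), 0.0) * L_att.w;
    gl_FragColor = gl_FrontLightProduct[0].diffuse * diffuseFactor; // diffuse only, for debugging
}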

OpenGL refraction - texture is repeating on ellipsoid object

I have a query regarding refraction.
I am using a texture image for the refraction (refer to test_car.png).
But somehow the texture is getting multiplied, giving a distorted image (refer to Screenshot.png).
I am using the following shader:
attribute highp vec4 vertex;
attribute mediump vec3 normal;

uniform highp mat4 matrix;
uniform highp vec3 diffuse_color;
uniform highp mat3 matrixIT;
uniform mediump mat4 matrixMV;
uniform mediump vec3 EyePosModel;
uniform mediump vec3 LightDirModel;

varying mediump vec4 color;
varying mediump vec2 RefractCoord;

const mediump float cShininess = 3.0;
const mediump float cRIR = 1.015;

vec3 SpecularColor = vec3(1.0, 1.0, 1.0);

void main(void)
{
    vec3 toLight = normalize(vec3(1.0, 1.0, 1.0));

    mediump vec3 eyeDirModel = normalize(vertex.xyz - EyePosModel);
    mediump vec3 refractDir = refract(eyeDirModel, normal, cRIR);
    refractDir = (matrix * vec4(refractDir, 0.0)).xyw;
    RefractCoord = 0.5 * (refractDir.xy / refractDir.z) + 0.5;

    vec3 normal_cal = normalize(matrixIT * normal);
    float NDotL = max(dot(normal_cal, toLight), 0.0);
    vec4 ecPosition = normalize(matrixMV * vertex);
    vec3 eyeDir = vec3(1.0, 1.0, 1.0);
    float NDotH = 0.0;
    vec3 SpecularLight = vec3(0.0, 0.0, 0.0);

    if (NDotL > 0.0)
    {
        vec3 halfVector = normalize(eyeDirModel + LightDirModel);
        float NDotH = max(dot(normal_cal, halfVector), 0.0);
        float specular = pow(NDotH, 3.0);
        SpecularLight = specular * SpecularColor;
    }

    color = vec4((NDotL * diffuse_color.xyz) + (SpecularLight.xyz), 1.0);
    gl_Position = matrix * vertex;
}
And
varying mediump vec2 RefractCoord;
varying mediump vec4 color;

uniform sampler2D sTexture;

void main(void)
{
    lowp vec3 refractColor = texture2D(sTexture, RefractCoord).rgb;
    gl_FragColor = vec4(color.xyz + refractColor, 1.0);
}
Can anyone let me know the solution to this problem?
Thanks for any help.
Sorry guys, I am not able to attach the images.
It seems that you are calculating the refraction vector incorrectly. However, the answer to your question is already in its title. If you are looking at an ellipsoid, the rays from the view span a cone wrapping the ellipsoid; but after the refraction, the cone may be much wider, reaching beyond the edges of your image, therefore giving texture coordinates outside the 0 - 1 range and leading to the texture being wrapped. So we need to take care of that as well.
First, the refraction coordinate should be calculated in the vertex shader as follows:
vec3 eyeDirModel = normalize((-vertex * matrix).xyz);
vec3 refractDir = refract(eyeDirModel, normal, cRIR);
RefractCoord = normalize((matrix * vec4(refractDir, 0.0)).xyz); // no dehomog!
RefractCoord now contains refracted eye-space vectors, so its varying declaration needs to become a vec3. This counts on "matrix" being the modelview matrix (that is not clear from your code, but I suspect it is). You could possibly skip the normalization if you want the shader to run faster; it shouldn't cause noticeable errors. Now a little bit of modification to your fragment shader:
vec3 refractColor = texture2D(sTexture, normalize(RefractCoord).xy * .5 + .5).rgb;
Here, using normalize() makes sure that the texture coordinates do not cause the texture to repeat.
Note that using a 2D texture for refractions is really only justified if you generate it on the fly (as e.g. Half-Life 2 does); otherwise you should probably use a cube-map texture, which does the normalization for you and gives you a color based on a 3D direction - which is what you need.
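For the cube-map route, the whole fragment shader reduces to a single direction lookup (a sketch; sCubeTexture is a hypothetical samplerCube uniform bound to an environment cube map):
varying mediump vec3 RefractCoord; // now a 3D direction
varying mediump vec4 color;
uniform samplerCube sCubeTexture;

void main(void)
{
    lowp vec3 refractColor = textureCube(sCubeTexture, RefractCoord).rgb;
    gl_FragColor = vec4(color.xyz + refractColor, 1.0);
}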
Hope this helps... (and, oh yeah, I wrote this from memory; in case there are any errors, please comment).