Physically Based Rendering: role of ambient light in BRDF (OpenGL)

I'm trying to figure out if my BRDF approach is correct or not.
// Calculate sun BRDF color
vec3 sunBrdf = brdf(tsLightSourceDir, tsEyeDir,
                    tsNormal, vtsTangent, vtsBinormal,
                    albedo.rgb,
                    NdotL, NdotV,
                    tRoughness, tMetallic, specular);

// BRDF multiplied by incoming sun radiance -> color
vec3 irradiance = sunAmbientRadiance + sunDiffuseRadiance * NdotL;
vec3 brdfColor = sunBrdf * irradiance;

// Add base ambient color
vec3 ambientColor = albedo.rgb * sunAmbientRadiance;
resultColor = ambientColor;
resultColor += brdfColor;

// Interpolate by shadow
resultColor = mix(ambientColor,
                  resultColor,
                  shadowAmount);
My question: I'm adding the ambient light twice, once when computing the base ambient color and once by including it in the irradiance fed into the BRDF.
I can't figure out whether this is correct. On one hand, the BRDF should receive the ambient light, since it's part of the light arriving at the surface. On the other hand, the ambient contribution is already accounted for by multiplying it with the albedo and adding it as a base color...
Any help is much appreciated!

The correct formula goes like this (in my opinion):
vec3 irradiance = sunDiffuseRadiance * NdotL;
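For completeness, a minimal sketch of the corrected flow (same variables as above): only the direct sun radiance passes through the BRDF, the ambient term is applied exactly once outside it, and the shadow attenuates only the direct part. This is algebraically what the mix() call above reduces to once the ambient is removed from the BRDF input:

// Only the direct sun radiance drives the BRDF.
vec3 irradiance = sunDiffuseRadiance * NdotL;
vec3 brdfColor  = sunBrdf * irradiance;

// Ambient is applied exactly once, as a base term on the albedo.
vec3 ambientColor = albedo.rgb * sunAmbientRadiance;

// Shadow attenuates only the direct contribution:
// mix(ambientColor, ambientColor + brdfColor, shadowAmount)
// simplifies to the line below.
vec3 resultColor = ambientColor + brdfColor * shadowAmount;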

Related

GLSL Specular Highlight Darker when closer

How can I fix my specular component? It gets brighter when the camera is far away and darker when it's nearer.
I'm using a point light, by the way, not the default light variables.
If necessary I'll put the lines that calculate the specular component here: (Vertex Shader)
// pl_pos is the raw vec3 (x, y, z) light position before this
vec3 pointlight = (gl_ModelViewMatrix * vec4(pl_pos, 1.0)).xyz; // light position in eye space
vec3 Norm = normalize(gl_NormalMatrix * gl_Normal);

// Vertex position in eye space
vec4 VertexPos = gl_ModelViewMatrix * gl_Vertex;
vec3 LightVec = normalize(pointlight - VertexPos.xyz);

float SpecularExp = 128.0;
float NormLightAng = max(0.0, dot(Norm, LightVec));
vec4 Specular = vec4(pow(NormLightAng, SpecularExp));
gl_FrontColor = Diffuse + Specular; // Diffuse is computed earlier (not shown)
@BDL answered:
"Which illumination model are you using? I wouldn't know of any where the specular component is calculated depending on dot(n, l)^p. Usually it's dot(reflected, view)^p (Phong illumination) or dot(n, halfway)^p (Blinn-Phong). At least, the view vector should somehow influence the specular illumination."
And @Rabbid76 added:
"Note, dot(Norm, LightVec) is a Lambertian diffuse reflection, see How does this faking the light work on aerotwist?"
Thanks a lot for your help guys!
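Following that advice, a minimal Blinn-Phong sketch of the specular term, using the eye-space variables from the question (VertexPos is the eye-space vertex position, so the camera sits at the origin):

// View vector: from the vertex toward the camera at the eye-space origin.
vec3 ViewVec = normalize(-VertexPos.xyz);
// Halfway vector between the light and view directions (Blinn-Phong).
vec3 HalfVec = normalize(LightVec + ViewVec);
// The specular term now depends on the view direction, as the answers suggest.
float SpecAng = max(0.0, dot(Norm, HalfVec));
vec4 Specular = vec4(pow(SpecAng, SpecularExp));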

Struggling with casting cloud shadow on earth sphere in OpenGL

I am trying to do an earth simulation in OpenGL with GLSL shaders, and so far it's been going decently. However, I am stuck on a small problem. Right now I have 3 spheres: one for ground level (earth), one for clouds, and a third for the atmosphere (scattering effects). The earth sphere handles most of the textures.
The cloud sphere is slightly bigger than the earth sphere, and is mapped with a cloud texture and normal mapped using one created with the Photoshop plugin. One more thing to point out: the rotation speed of the cloud sphere is slightly greater than the rotation speed of the earth sphere.
This is where things get confusing for me. I am trying to cast the shadow of the clouds onto the ground (earth) sphere by passing the cloud texture into the earth sphere's shader and subtracting the cloud's color from the earth's color. Since the rotation speeds of the two spheres are different, I figured that multiplying the cloud sphere's rotation matrix with the uv coordinates for the cloud texture should solve the problem. But sadly, the shadows and the clouds do not rotate in sync. I was hoping someone could help me figure out the math to make the shadows and the clouds rotate in sync, no matter how different the rotation speeds of the two spheres are.
Here is my fragment shader for the earth where I'm calculating the cloud's shadow:
#version 400 core

uniform sampler2D day;
uniform sampler2D bumpMap;
uniform sampler2D night;
uniform sampler2D specMap;
uniform sampler2D clouds;
uniform mat4 cloudRotation;

in vec3 vPos;
in vec3 lightVec;
in vec3 eyeVec;
in vec3 halfVec;
in vec2 texCoord;

out vec4 frag_color;

void main()
{
    vec3 normal = 2.0 * texture(bumpMap, texCoord).rgb - 1.0;
    //normal.z = 1.0 - normal.x * normal.x - normal.y * normal.y;
    normal = normalize(normal);

    vec4 spec = vec4(1.0, 0.941, 0.898, 1.0);
    vec4 specMapColor = texture(specMap, texCoord); // texture2D is unavailable in 400 core

    vec3 L = lightVec;
    vec3 N = normal;
    vec3 Emissive = normalize(-vPos);
    vec3 R = reflect(-L, N);
    float dotProd = max(dot(R, Emissive), 0.0);
    vec4 specColor = spec * pow(dotProd, 6.0) * 0.5;
    float diffuse = max(dot(N, L), 0.0);

    vec2 cloudTexCoord = vec2(cloudRotation * vec4(texCoord, 0.0, 1.0));
    vec3 cloud_color = texture(clouds, cloudTexCoord).rgb;

    vec3 day_color = texture(day, texCoord).rgb * diffuse
                   + specColor.rgb * specMapColor.g
                   - cloud_color * 0.25; // * (1 - cloud_color.r) + cloud_color.r * diffuse;
    vec3 night_color = texture(night, texCoord).rgb * 0.5; // * (1 - cloud_color.r) * 0.5;

    vec3 color = day_color;
    if (dot(N, L) < 0.1)
        color = mix(night_color, day_color, (diffuse + 0.1) * 5.0);

    frag_color = vec4(color, 1.0);
}
Here's a sample output from the above shader. Note that the shadows start out at the correct position, but due to the wrong rotation speed they drift ahead of the cloud sphere's rotation.
Again, it would be really helpful if anyone could help me figure out the math behind keeping the shadow and the clouds in sync.
Thanks in advance
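One hedged way to think about the math: the earth's texCoord is attached to the earth mesh, so sampling the cloud texture from the earth shader needs the clouds' rotation relative to the earth, not the cloud sphere's absolute rotation. A minimal sketch, assuming both spheres spin about the same polar axis and that the application passes the two accumulated rotation angles as hypothetical uniforms:

// Hypothetical uniforms: accumulated rotation angles (radians) of each sphere.
uniform float earthAngle;
uniform float cloudAngle;

// For spheres spinning about the same polar axis, the relative rotation is a
// pure longitude shift, i.e. a shift in the U texture coordinate.
vec2 cloudShadowUV(vec2 texCoord)
{
    float relative = cloudAngle - earthAngle;   // clouds relative to the ground
    float uShift = relative / 6.28318530718;    // one full turn == one full U wrap
    return vec2(fract(texCoord.x + uShift), texCoord.y);
}

The sign of uShift may need flipping depending on texture orientation. Equivalently, if you prefer to keep the matrix approach, replace cloudRotation with inverse(earthRotation) * cloudRotation so the shadow UVs are rotated only by the relative motion.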

Deferred renderer with light volumes produces strange banding

I have a deferred renderer that only calculates lighting equations when the current fragment is in range of a light source. I do this by calculating the size of a light volume in my application and sending it, along with the other light information, to the shaders. I then check the distance between the fragment and lightPos (per light) and use the light's volume as a threshold.
For simplicity's sake I use a linear equation for light attenuation (quadratic equations generate far too large light volumes). All the lighting equations work fine, but when I use multiple lights I sometimes see strange circle borders, as if the distance check causes the light calculations to stop prematurely, producing a sudden change in lighting. You can see this effect in the image below:
The fragment shader code is as follows:
vec3 position = texture(worldPos, fs_in.TexCoords).rgb;
vec3 normal = texture(normals, fs_in.TexCoords).rgb;
normal = normalize(normal * 2.0 - 1.0);
vec3 color = texture(albedo, fs_in.TexCoords).rgb;
float depth = texture(worldPos, fs_in.TexCoords).a;

// Add global ambient value
fragColor = vec4(vec3(0.1) * color, 0.0);
for(int i = 0; i < NR_LIGHTS; ++i)
{
    float distance = abs(length(position - lights[i].Position.xyz));
    if(distance <= lights[i].Size)
    {
        vec3 lighting;
        // Ambient
        lighting += lights[i].Ambient * color;
        // Diffuse
        vec3 lightDir = normalize(lights[i].Position.xyz - position);
        float diffuse = max(dot(normal, lightDir), 0.0);
        lighting += diffuse * lights[i].Diffuse * color;
        // Specular
        vec3 viewDir = normalize(viewPos - position);
        vec3 reflectDir = reflect(-lightDir, normal);
        float spec = pow(max(dot(viewDir, reflectDir), 0.0), 8);
        lighting += spec * lights[i].Specular;
        // Calculate attenuation
        float attenuation = max(1.0f - lights[i].Linear * distance, 0.0);
        lighting *= attenuation;
        fragColor += vec4(lighting, 0.0);
    }
}
fragColor.a = 1.0;
The attenuation function is a linear function of the distance between the fragment position and each light source.
In this particular scene I use a linear attenuation value of 0.075, from which I generate the light's size/radius as:
Size = 1.0 / Linear;
With Linear = 0.075 this gives Size ≈ 13.3, which is exactly the distance at which max(1.0 - Linear * distance, 0.0) reaches zero.
Some observations:
When I remove the distance check if(distance <= lights[i].Size), I don't get the weird border issue.
If I visualize the lighting value of a single light source and visualize the distance as distance / lights.Size, I get the following two images:
which look as if the light radius/distance calculation and the light attenuation have a similar radius.
When I change the distance check to if(distance <= lights[i].Size * 2.0f) (drastically increasing the light's radius) I get significantly less border banding, but if I look closely enough I still see it from time to time, so even that doesn't completely remove the issue.
I have no idea what is causing this and I am out of options at the moment.
This section:
vec3 lighting;
// Ambient
lighting += lights[i].Ambient * color;
You are never initializing lighting before you add to it, so its initial value is undefined; I think this can cause undefined behaviour. Try changing it to:
// Ambient
vec3 lighting = lights[i].Ambient * color;

GLSL point light shader moving with camera

I've been trying to make a basic static point light using shaders for an LWJGL game, but the light appears to move as the camera is translated and rotated. These shaders are slightly modified from the OpenGL 4.3 guide, so I'm not sure why they aren't working as intended. Can anyone explain why, and what I can do to get them to work?
Vertex Shader:
varying vec3 color, normal;
varying vec4 vertexPos;

void main() {
    color = vec3(0.4);
    normal = normalize(gl_NormalMatrix * gl_Normal);
    vertexPos = gl_ModelViewMatrix * gl_Vertex;
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
Fragment Shader:
varying vec3 color, normal;
varying vec4 vertexPos;

void main() {
    vec3 lightPos = vec3(4.0);
    vec3 lightColor = vec3(0.75);
    vec3 lightDir = lightPos - vertexPos.xyz;
    float lightDist = length(lightDir);
    lightDir /= lightDist; // normalize before using it in the diffuse dot product
    float attenuation = 1.0 / (3.0 + 0.007 * lightDist + 0.000008 * lightDist * lightDist);
    float diffuse = max(0.0, dot(normalize(normal), lightDir));
    vec3 ambient = vec3(0.4, 0.4, 0.4);
    vec3 finalColor = color * (ambient + lightColor * diffuse * attenuation);
    gl_FragColor = vec4(finalColor, 1.0);
}
If anyone's interested, I ended up finding the solution. Removing the calls to gl_NormalMatrix and gl_ModelViewMatrix solved the problem.
The critical value here, lightDir, was being computed from lightPos and vertexPos, and vertexPos is expressed in view (eye) space (that happened because its original world-space form was multiplied by the modelview matrix). View space moves with the camera, not with anything in the 3D world. So to have a light source fixed at some absolute point in world space (like [4.0, 4.0, 4.0]), you could just leave your object's points in that space, as you found out.
But getting rid of the modelview matrix is not a good idea, since the whole point of the model matrix is to place your objects where they belong (so you can reuse your vertex arrays with changes only to the model matrix, instead of burdening them with specifying every single shape's vertex positions from scratch).
A better way is to apply the view transform to both vertexPos AND lightPos. This way you treat lightPos as a quantity originally in world space, but do the comparison in view space. The math that derives light intensities from normals works out the same in either space, and you get a correct-looking light source.
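A minimal sketch of that idea, assuming the application uploads the light's world-space position as a hypothetical uniform, transformed by the same view matrix it bakes into gl_ModelViewMatrix:

// Fragment shader sketch. Hypothetical uniform: the application computes
// eyeLightPos = viewMatrix * vec4(4.0, 4.0, 4.0, 1.0) before uploading it.
uniform vec3 eyeLightPos;

varying vec3 color, normal;
varying vec4 vertexPos; // eye space, from gl_ModelViewMatrix * gl_Vertex

void main() {
    // Both positions are now in eye space, so the light stays fixed in the world.
    vec3 lightDir = eyeLightPos - vertexPos.xyz;
    float lightDist = length(lightDir);
    lightDir /= lightDist;
    float attenuation = 1.0 / (3.0 + 0.007 * lightDist + 0.000008 * lightDist * lightDist);
    float diffuse = max(0.0, dot(normalize(normal), lightDir));
    vec3 finalColor = color * (vec3(0.4) + vec3(0.75) * diffuse * attenuation);
    gl_FragColor = vec4(finalColor, 1.0);
}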

Generic GLSL Lighting Shader

Per-pixel lighting is a common requirement in many OpenGL applications, as the standard OpenGL lighting has very poor quality.
I want to use a GLSL program to get per-pixel lighting in my OpenGL program instead of per-vertex. Just diffuse lighting, but with fog, texture and texture alpha at least.
I started with this shader:
texture.vert:
varying vec3 position;
varying vec3 normal;

void main(void)
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    gl_FrontColor = gl_Color;
    gl_TexCoord[0] = gl_MultiTexCoord0;
    normal = normalize(gl_NormalMatrix * gl_Normal);
    position = vec3(gl_ModelViewMatrix * gl_Vertex);
}
texture.frag:
uniform sampler2D Texture0;
uniform int ActiveLights;
varying vec3 position;
varying vec3 normal;

void main(void)
{
    vec3 lightDir;
    float attenFactor;
    vec3 eyeDir = normalize(-position); // camera is at (0,0,0) in ModelView space
    vec4 lightAmbientDiffuse = vec4(0.0, 0.0, 0.0, 0.0);
    vec4 lightSpecular = vec4(0.0, 0.0, 0.0, 0.0);

    // iterate over all lights
    for (int i = 0; i < ActiveLights; ++i)
    {
        // attenuation and light direction
        if (gl_LightSource[i].position.w != 0.0)
        {
            // positional light source
            float dist = distance(gl_LightSource[i].position.xyz, position);
            attenFactor = 1.0 / (gl_LightSource[i].constantAttenuation +
                                 gl_LightSource[i].linearAttenuation * dist +
                                 gl_LightSource[i].quadraticAttenuation * dist * dist);
            lightDir = normalize(gl_LightSource[i].position.xyz - position);
        }
        else
        {
            // directional light source
            attenFactor = 1.0;
            lightDir = gl_LightSource[i].position.xyz;
        }
        // ambient + diffuse
        lightAmbientDiffuse += gl_FrontLightProduct[i].ambient * attenFactor;
        lightAmbientDiffuse += gl_FrontLightProduct[i].diffuse * max(dot(normal, lightDir), 0.0) * attenFactor;
        // specular
        vec3 r = normalize(reflect(-lightDir, normal));
        lightSpecular += gl_FrontLightProduct[i].specular *
                         pow(max(dot(r, eyeDir), 0.0), gl_FrontMaterial.shininess) *
                         attenFactor;
    }
    // compute final color
    vec4 texColor = gl_Color * texture2D(Texture0, gl_TexCoord[0].xy);
    gl_FragColor = texColor * (gl_FrontLightModelProduct.sceneColor + lightAmbientDiffuse) + lightSpecular;

    float fog = (gl_Fog.end - gl_FogFragCoord) * gl_Fog.scale; // compute fog intensity
    fog = clamp(fog, 0.0, 1.0);                                // clamp to [0, 1]
    gl_FragColor = mix(gl_Fog.color, gl_FragColor, fog);       // blend in the fog color
}
The code was originally posted on a German site, so the comments were in German; they are translated above.
But all this shader does is make everything very dark. There are no lighting effects at all, yet the shader code compiles. If I use only GL_LIGHT0 in the fragment shader, it seems to work, but only reasonably for camera-facing polygons, and my floor polygon is extremely dark. Also, quads with RGBA textures show no sign of transparency.
I use standard glRotate/glTranslate for the modelview matrix, and glVertex/glNormal for my polygons. OpenGL lighting works fine, apart from the fact that it looks ugly on very large surfaces. I triple-checked my normals; they are fine.
Is there something wrong in the above code?
OR
Tell me why there is no generic lighting shader for this task (a point light with distance falloff: a candle, if you will). Shouldn't there be just one correct way to do this? I don't want bump/normal/parallax/toon/blur/whatever effects. I just want my lighting to perform better on larger polygons.
All the tutorials I found are only useful for lighting a single object when the camera is at (0,0,0), facing the object head-on. The above is the only one I found that at least looks like what I want to do.
I would strongly suggest you read this article to see how standard ADS lighting is done in GLSL. It targets GL 4.0, but it's not a problem to adjust it to your version:
Also, you operate in view (camera) space, so DON'T negate the eye vector:
vec3 eyeDir = normalize(-position);
I had pretty similar issues to yours because I also negated the eye vector, forgetting that it is transformed into view space. Your diffuse and specular calculations seem wrong too in the current scenario. In your place I wouldn't use data from the fixed pipeline at all; otherwise, what is the point of doing it in a shader?
Here is a method to calculate diffuse and specular in per-fragment ADS point lighting:
void ads(int lightIndex, out vec3 ambAndDiff, out vec3 spec)
{
    // Ka - material ambient factor
    // Kd - material diffuse factor
    // Ks - material specular factor
    // lights[lightIndex].Ld - light diffuse factor; you may also add La and Ls
    // if you want even more control over the light shading.
    vec3 s = normalize(vec3(lights[lightIndex].Position - posOut)); // direction to the light
    vec3 v = normalize(posOut.xyz);
    vec3 n = normalize(normOut);
    vec3 h = normalize(v + s); // half vector (read on the web about what it is)
    vec3 diffuse = (Ka + lights[lightIndex].Ld) * Kd * max(0.0, dot(n, s)); // diffuse uses the light direction s
    spec = Ks * pow(max(0.0, dot(n, h)), Shininess);
    ambAndDiff = diffuse;
}
Also, I wouldn't suggest using the attenuation equation you have here; it is hard to control. If you want to add light-radius-based attenuation, there is this nice blog post:
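For reference, here is one common shape of radius-based falloff (a sketch of the kind of curve such posts describe, not the linked article's exact formula): scale an inverse-square term by a window that reaches exactly zero at the light's radius, so the light volume cutoff is seamless.

// Hedged sketch: inverse-square falloff windowed to hit zero at `radius`.
// `dist` is the fragment-to-light distance; both parameter names are assumptions.
float radiusAttenuation(float dist, float radius)
{
    float ratio = clamp(dist / radius, 0.0, 1.0);
    float window = 1.0 - ratio * ratio;
    window *= window;                    // smooth window, zero at dist == radius
    return window / (1.0 + dist * dist); // windowed inverse-square falloff
}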