Here you see a scene composed of planes lit by a point light. In nature, the brightness of each "fragment" is determined, for the most part, by its distance from the light source, and I would like to see that smooth transition here, from wall to wall.
The only variable factor is the vertex normals, and no doubt they are causing this sharp change in colour. What can be done to light the room more realistically?
Vertex Shader
#version 330
precision highp float;
uniform mat4 projection_matrix;
uniform mat4 model_matrix;
uniform mat4 view_matrix;
in vec3 vert_position;
in vec2 vert_texcoord;
in vec3 vert_normal;
out vec3 frag_normal;
out vec2 frag_texcoord;
uniform vec3 LightPosition;
out vec3 toLightVector;
void main(void)
{
vec4 world_position = model_matrix * vec4(vert_position, 1);
frag_texcoord = vert_texcoord;
frag_normal = (model_matrix * vec4(vert_normal, 0)).xyz;
toLightVector = LightPosition - world_position.xyz;
gl_Position = projection_matrix * view_matrix * world_position;
}
Fragment Shader
#version 330
precision highp float;
in vec3 frag_normal;
in vec2 frag_texcoord;
in vec3 toLightVector;
uniform sampler2D MyTexture0;
uniform vec3 LightColour;
uniform vec3 LightAttenuation;
out vec4 finalColor;
void main(void)
{
float distance = length(toLightVector);
float attFactor = LightAttenuation.x + (LightAttenuation.y * distance) + (LightAttenuation.z * distance * distance);
vec3 unitNormal = normalize(frag_normal);
vec3 unitLightVector = normalize(toLightVector);
float nDot1 = dot(unitNormal, unitLightVector);
float brightness = max(nDot1, 0);
vec3 diffuse = (brightness * LightColour) / attFactor;
finalColor = vec4(diffuse, 1.0) * texture(MyTexture0, frag_texcoord);
}
You're not going to like this, Nicol Bolas, but simplifying the calculations to depend solely on distance provided precisely the effect I was after.
Skimping on design and aesthetics will ruin any art piece, but that said, I'm not writing Doom 4 here; I'm a one-man rent strike writing a roguelike.
Fragment Shader
#version 330
precision highp float;
in vec2 frag_texcoord;
in vec3 toLightVector;
uniform sampler2D MyTexture0;
uniform vec3 LightColour;
uniform vec3 LightAttenuation;
out vec4 finalColor;
void main(void)
{
float distance = length(toLightVector);
float attFactor = LightAttenuation.x + (LightAttenuation.y * distance) + (LightAttenuation.z * distance * distance);
vec3 diffuse = (LightColour) / attFactor;
finalColor = vec4(diffuse, 1.0) * texture(MyTexture0, frag_texcoord);
}
Your problem is your expectations of your lighting model.
Your lighting model is behaving exactly as it would be expected to. But it's just a model of reality, an approximation.
Real lighting is far more complex than a simple dot product and an attenuation term. It's a highly complicated interplay between lots and lots of small light sources, which explains why you don't (usually) see such discontinuities in lighting in real life.
The correct way to resolve this problem is not to change your attenuation or your normals. It's to investigate more accurate lighting models.
Just put more weight on distance and less on normals. Make LightAttenuation look more like (0,0,1).
Related
I am currently developing a program that displays buildings and terrain in 3D using Java's LWJGL framework, which I understand is essentially a thin binding over OpenGL. I am trying to differentiate between the sides of buildings using flat shading. For example, a cube generated by my program should look like this:
To achieve this, I attempted to implement the method described here: https://gamedev.stackexchange.com/questions/152991/how-can-i-calculate-normals-using-a-vertex-and-index-buffer.
It seems this gives me smooth per-vertex shading instead, which colours every single pixel in a sort of gradient. My implementation currently looks like this:
Obviously this doesn't look like what I desired. Here are my vertex and fragment shaders:
#version 150
in vec3 position;
in vec2 textureCoordinates;
in vec3 normal;
out vec2 pass_textureCoordinates;
out vec3 surfaceNormal;
out vec3 toLightVector;
out vec3 toCameraVector;
uniform mat4 transformationMatrix;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform vec3 lightPosition;
void main(void){
vec4 worldPosition = transformationMatrix * vec4(position, 1.0);
gl_Position = projectionMatrix * viewMatrix * worldPosition;
pass_textureCoordinates = textureCoordinates;
surfaceNormal = (transformationMatrix * vec4(normal, 0.0)).xyz;
toLightVector = lightPosition - worldPosition.xyz;
toCameraVector = (inverse(viewMatrix) * vec4(0.0,0.0,0.0,1.0)).xyz - worldPosition.xyz;
}
#version 150
in vec2 pass_textureCoordinates;
in vec3 surfaceNormal;
in vec3 toLightVector;
in vec3 toCameraVector;
out vec4 out_Color;
uniform sampler2D modelTexture;
uniform vec3 lightColour;
uniform float shineDamper;
uniform float reflectivity;
void main(void){
vec3 unitNormal = normalize(surfaceNormal);
vec3 unitLightVector = normalize(toLightVector);
float nDotl = dot(unitNormal, unitLightVector);
float brightness = max(nDotl, 0.7);
vec3 diffuse = brightness * lightColour;
vec3 unitVectorToCamera = normalize(toCameraVector);
vec3 lightDirection = -unitLightVector;
vec3 reflectedLightDirection = reflect(lightDirection, unitNormal);
float specularFactor = dot(reflectedLightDirection, unitVectorToCamera);
specularFactor = max(specularFactor, 0.0);
float dampedFactor = pow(specularFactor, shineDamper);
vec3 finalSpecular = dampedFactor * reflectivity * lightColour;
out_Color = vec4(diffuse, 1.0) * texture(modelTexture,pass_textureCoordinates) + vec4(finalSpecular, 1.0);
}
If a cube-shaped building I'm trying to render is indexed like so:
How can I create the array of normals programmatically for flat shading?
The "sort of gradient" is caused by the Specular highlights. If the normal vectors are normal to the faces of the cube, then a Lambertian diffuse light will give you a flat look.
All you have to do is to initialize the uniform variable reflectivity by 0.0. (actually that's the default value).
It is possible to compute the face normal vectors by partial derivatives (dFdx, dFdy) in the fragment shader or by the cross product in the geometry shader. But this options will cause a loss of quality and speed.
See
From within a fragment shader how do I get the surface - not vertex - normal
3D object is colored in a way that looks like a 2D object
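For reference, a minimal sketch of the derivative approach mentioned above (flat shading via dFdx/dFdy). It assumes an extra worldPosition varying passed down from the vertex shader, which the original code does not have, and it omits the specular term; treat it as an illustration of the technique rather than a drop-in replacement.
#version 150
in vec3 worldPosition;              // assumed extra varying: worldPosition.xyz from the vertex shader
in vec2 pass_textureCoordinates;
in vec3 toLightVector;
out vec4 out_Color;
uniform sampler2D modelTexture;
uniform vec3 lightColour;
void main(void){
// Screen-space derivatives of the interpolated position are constant per
// triangle, so their cross product gives the (unnormalised) face normal.
// Depending on your winding you may need to flip its sign.
vec3 unitNormal = normalize(cross(dFdx(worldPosition), dFdy(worldPosition)));
float brightness = max(dot(unitNormal, normalize(toLightVector)), 0.0);
out_Color = vec4(brightness * lightColour, 1.0) * texture(modelTexture, pass_textureCoordinates);
}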
In my per-vertex point lighting implementation, every fragment output is white, and I am having trouble locating the source of the problem.
I closely followed this tutorial for the shader code, so I think the problem may stem from my vertex normals.
The scene is composed of planes (two triangles each). The normal for each plane is computed from its four corners, and it is this normal vector that I pass to the shader for each vertex. Normal directions are shown below.
However, after a lot of research, it seems this is the correct technique for calculating normals. Any suggestions would be greatly appreciated!
Vertex Shader
#version 330
uniform mat4 projectionMatrix;
uniform mat4 modelViewMatrix;
in vec3 in_position;
in vec2 in_texcoords;
in vec3 in_normals;
out vec3 vPosition;
out vec2 vTextureCoord;
out vec3 vNormal;
void main(void)
{
vPosition = vec3(modelViewMatrix * vec4(in_position, 1.0));
vTextureCoord = in_texcoords;
vNormal = vec3(modelViewMatrix * vec4(in_normals, 0.0));
gl_Position = projectionMatrix * modelViewMatrix * vec4(in_position, 1.0);
}
Fragment Shader
#version 330
precision mediump float;
in vec3 vPosition;
in vec2 vTextureCoord;
in vec3 vNormal;
out vec4 finalColor;
uniform sampler2D MyTexture0;
void main(void)
{
vec3 lightPosition = vec3(8.2, 0, 3);
float distance = length(lightPosition - vPosition);
vec3 lightVector = normalize(lightPosition - vPosition);
float diffuse = max(dot(vNormal, lightVector), 0.1);
diffuse = diffuse * (1.0 / (1.0 + (0.25 * distance * distance)));
vec4 fragmentColor = texture(MyTexture0, vec2(vTextureCoord.s, vTextureCoord.t));
finalColor = fragmentColor * diffuse;
}
The scene as displayed with a basic shader, no lighting calculations.
Recently I added deferred shading support to my engine; however, I ran into some attenuation issues:
As you can see, when I'm rendering the light volume (sphere), it doesn't blend nicely with the ambient part of the image!
Here is how I declare my point light:
PointLight pointlight;
pointlight.SetPosition(glm::vec3(0.0, 6.0, 0.0));
pointlight.SetIntensity(glm::vec3(1.0f, 1.0f, 1.0f));
Here is how I compute the light sphere radius:
Attenuation attenuation = pointLights[i].GetAttenuation();
float lightMax = std::fmaxf(std::fmax(pointLights[i].GetIntensity().r, pointLights[i].GetIntensity().g),
pointLights[i].GetIntensity().b);
float pointLightRadius = (-attenuation.linear +
std::sqrtf(std::pow(attenuation.linear, 2.0f) - 4.0f * attenuation.exponential *
(attenuation.constant - (256.0f / 5.0f) * lightMax))) / (2.0f * attenuation.exponential);
And finally, here is my PointLightPass fragment shader:
#version 450 core
struct BaseLight
{
vec3 intensities;//a.k.a color of light
float ambientCoeff;
};
struct Attenuation
{
float constant;
float linear;
float exponential;
};
struct PointLight
{
BaseLight base;
Attenuation attenuation;
vec3 position;
};
struct Material
{
float shininess;
vec3 specularColor;
float ambientCoeff;
};
layout (std140) uniform Viewport
{
uniform mat4 Projection;
uniform mat4 View;
uniform mat4 ViewProjection;
uniform vec2 scrResolution;
};
layout(binding = 0) uniform sampler2D gPositionMap;
layout(binding = 1) uniform sampler2D gAlbedoMap;
layout(binding = 2) uniform sampler2D gNormalMap;
layout(binding = 3) uniform sampler2D gSpecularMap;
uniform vec3 cameraPosition;
uniform PointLight pointLight;
out vec4 fragmentColor;
vec2 FetchTexCoord()
{
return gl_FragCoord.xy / scrResolution;
}
void main()
{
vec2 texCoord = FetchTexCoord();
vec3 gPosition = texture(gPositionMap, texCoord).xyz;
vec3 gSurfaceColor = texture(gAlbedoMap, texCoord).xyz;
vec3 gNormal = texture(gNormalMap, texCoord).xyz;
vec3 gSpecColor = texture(gSpecularMap, texCoord).xyz;
float gSpecPower = texture(gSpecularMap, texCoord).a;
vec3 totalLight = gSurfaceColor * 0.1; //TODO remove hardcoded ambient light
vec3 viewDir = normalize(cameraPosition - gPosition);
vec3 lightDir = normalize(pointLight.position - gPosition);
vec3 diffuse = max(dot(gNormal, lightDir), 0.0f) * gSurfaceColor *
pointLight.base.intensities;
vec3 halfWayDir = normalize(lightDir + viewDir);
float spec = pow(max(dot(gNormal, halfWayDir), 0.0f), 1.0f);
vec3 specular = pointLight.base.intensities * spec /** gSpecColor*/;
float distance = length(pointLight.position - gPosition);
float attenuation = 1.0f / (1.0f + pointLight.attenuation.linear * distance
+ pointLight.attenuation.exponential * distance * distance +
pointLight.attenuation.constant);
diffuse *= attenuation;
specular *= attenuation;
totalLight += diffuse + specular;
fragmentColor = vec4(totalLight, 1.0f);
}
So what can you suggest to deal with this issue?
EDIT: Here are more details.
For deferred shading, I first populate my G-buffer, then I make an ambient light pass where I render a fullscreen quad with the ambient colors:
#version 420 core
layout (std140) uniform Viewport
{
uniform mat4 Projection;
uniform mat4 View;
uniform mat4 ViewProjection;
uniform vec2 scrResolution;
};
layout(binding = 1) uniform sampler2D gAlbedoMap;
out vec4 fragmentColor;
vec2 FetchTexCoord()
{
return gl_FragCoord.xy / scrResolution;
}
void main()
{
vec2 texCoord = FetchTexCoord();
vec3 gSurfaceColor = texture(gAlbedoMap, texCoord).xyz;
vec3 totalLight = gSurfaceColor * 1.2; //TODO remove hardcoded ambient light
fragmentColor = vec4(totalLight, 1.0f);
}
Then I do my point light pass (see the code above).
The reason you're having this problem is that you're using a "light volume" (a fact that you didn't make entirely clear in this question, but was brought up in your other question).
You are using the normal light attenuation equation. Well, you'll notice that this equation does not magically stop at some arbitrary radius. It is defined for all distances from 0 to infinity.
The purpose of your light volume is to prevent lighting contributions beyond a certain distance. Well, if your light attenuation doesn't go to zero at that distance, then you're going to see a discontinuity at the edge of the light volume.
If you're going to use a light volume, you need to use a light attenuation equation that actually is guaranteed to reach zero at the edge of the volume. Or failing that, you should pick a radius for your volume such that the attenuated strength of the light is nearly zero. And your radius is too small for that.
Keep making your radius bigger until you can't tell it's there.
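One hedged sketch of such an attenuation (not taken from the answer itself): keep the usual constant/linear/quadratic falloff, but multiply it by a window that reaches exactly zero at the volume radius. The radius parameter here is hypothetical; it would have to match the sphere radius computed on the CPU and be supplied to the shader.
// Fades standard attenuation smoothly to 0.0 at distance == radius,
// removing the hard edge at the light-volume boundary.
float AttenuateWindowed(float distance, float constant, float linear, float exponential, float radius)
{
float att = 1.0 / (constant + linear * distance + exponential * distance * distance);
float window = clamp(1.0 - pow(distance / radius, 4.0), 0.0, 1.0);
return att * window * window;
}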
I have a simple application that draws a sphere with a single directional light. I'm creating the sphere by starting with an octahedron and subdividing each triangle into 4 smaller triangles.
With just diffuse lighting, the sphere looks very smooth. However, when I add specular highlights, the edges of the triangles show up fairly strongly. Here are some examples:
Diffuse only:
Diffuse and Specular:
I believe that the normals are being interpolated correctly. Looking at just the normals, I get this:
In fact, if I switch to flat shading, where the normals are per-polygon instead of per-vertex, I get this:
In my vertex shader, I'm multiplying the model's normals by the transpose inverse modelview matrix:
#version 330 core
layout (location = 0) in vec4 vPosition;
layout (location = 1) in vec3 vNormal;
layout (location = 2) in vec2 vTexCoord;
out vec3 fNormal;
out vec2 fTexCoord;
uniform mat4 transInvModelView;
uniform mat4 ModelViewMatrix;
uniform mat4 ProjectionMatrix;
void main()
{
fNormal = vec3(transInvModelView * vec4(vNormal, 0.0));
fTexCoord = vTexCoord;
gl_Position = ProjectionMatrix * ModelViewMatrix * vPosition;
}
and in the fragment shader, I'm calculating the specular highlights as follows:
#version 330 core
in vec3 fNormal;
in vec2 fTexCoord;
out vec4 color;
uniform sampler2D tex;
uniform vec4 lightColor; // RGB, assumes multiplied by light intensity
uniform vec3 lightDirection; // normalized, assumes directional light, lambertian lighting
uniform float specularIntensity;
uniform float specularShininess;
uniform vec3 halfVector; // Halfway between eye and light
uniform vec4 objectColor;
void main()
{
vec4 texColor = objectColor;
float specular = max(dot(halfVector, fNormal), 0.0);
float diffuse = max(dot(lightDirection, fNormal), 0.0);
if (diffuse == 0.0)
{
specular = 0.0;
}
else
{
specular = pow(specular, specularShininess) * specularIntensity;
}
color = texColor * diffuse * lightColor + min(specular * lightColor, vec4(1.0));
}
I was a little confused about how to calculate the halfVector. I'm doing it on the CPU and passing it in as a uniform. It's calculated like this:
vec3 lightDirection(1.0, 1.0, 1.0);
lightDirection = normalize(lightDirection);
vec3 eyeDirection(0.0, 0.0, 1.0);
eyeDirection = normalize(eyeDirection);
vec3 halfVector = lightDirection + eyeDirection;
halfVector = normalize(halfVector);
glUniform3fv(halfVectorLoc, 1, &halfVector[0]);
Is that the correct formulation for the halfVector? Or does it need to be done in the shaders as well?
Interpolating normals across a face can (and almost always will) result in a shortened normal. That's why the highlight is darker in the center of a face and brighter at corners and edges. Since you are doing this, just re-normalize the normal in the fragment shader:
fNormal = normalize(fNormal);
Btw, you cannot precompute the half vector as it is view dependent (that's the whole point of specular lighting). In your current scenario, the highlight will not change when you just move the camera (keeping the direction).
One way to do this in the shader is to pass an additional uniform for the eye position and then calculate the view direction as eyePosition - vertexPosition. Then continue as you did on the CPU.
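A minimal sketch of how the fragment shader might look with both suggestions applied (re-normalised normal, per-fragment half vector). It assumes an extra fPosition varying carrying the vertex position and an eyePosition uniform in the same space, neither of which exists in the original shaders, and it drops the texture sampling for brevity.
#version 330 core
in vec3 fNormal;
in vec3 fPosition;                 // assumed extra varying: vertex position (same space as eyePosition)
out vec4 color;
uniform vec4 lightColor;
uniform vec3 lightDirection;       // normalized directional light
uniform float specularIntensity;
uniform float specularShininess;
uniform vec4 objectColor;
uniform vec3 eyePosition;          // hypothetical uniform: camera position
void main()
{
vec3 N = normalize(fNormal);                   // re-normalize after interpolation
vec3 V = normalize(eyePosition - fPosition);   // per-fragment view direction
vec3 H = normalize(lightDirection + V);        // per-fragment half vector
float diffuse = max(dot(lightDirection, N), 0.0);
float specular = 0.0;
if (diffuse > 0.0)
{
specular = pow(max(dot(H, N), 0.0), specularShininess) * specularIntensity;
}
color = objectColor * diffuse * lightColor + min(specular * lightColor, vec4(1.0));
}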
I'm currently trying to get specular lighting working on a sphere using GLSL and the Phong model.
This is how my fragment shader looks like:
#version 120
uniform vec4 color;
uniform vec3 sunPosition;
uniform mat4 normalMatrix;
uniform mat4 modelViewMatrix;
uniform float shininess;
// uniform vec4 lightSpecular;
// uniform vec4 materialSpecular;
varying vec3 viewSpaceNormal;
varying vec3 viewSpacePosition;
vec4 calculateSpecular(vec3 l, vec3 n, vec3 v, vec4 specularLight, vec4 materialSpecular) {
vec3 r = -l+2*(n*l)*n;
return specularLight * materialSpecular * pow(max(0,dot(r, v)), shininess);
}
void main(){
vec3 normal = normalize(viewSpaceNormal);
vec3 viewSpacePosition = (modelViewMatrix * vec4(gl_FragCoord.x, gl_FragCoord.y, gl_FragCoord.z, 1.0)).xyz;
vec4 specular = calculateSpecular(sunPosition, normal, viewSpacePosition, vec4(0.3,0.3,0.3,0.3), vec4(0.3,0.3,0.3,0.3));
gl_FragColor = color+specular;
}
The sunPosition is not moving and is set to the value (2.0f, 3.0f, -1.0f).
The problem is that the image looks nothing like it should if the specular calculations were correct.
This is what it looks like:
http://i.imgur.com/Na2C6.png
The reason I don't have any ambient/emissive/diffuse lighting in this code is that I want to get the specular part working first.
Thankful for any help!
Edit:
@Darcy Rayner
That indeed helped a lot, though it seems something is still not right...
The current code looks like this:
Vertex Shader:
viewSpacePosition = (modelViewMatrix*gl_Vertex).xyz;
viewSpaceSunPosition = (modelViewMatrix*vec4(sunPosition,1)).xyz;
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
viewSpaceNormal = (normalMatrix * vec4(gl_Position.xyz, 0.0)).xyz;
Fragment Shader:
vec4 calculateSpecular(vec3 l, vec3 n, vec3 v, vec4 specularLight, vec4 materialSpecular) {
vec3 r = -l+2*(n*l)*n;
return specularLight * materialSpecular * pow(max(0,dot(r, v)), shininess);
}
void main(){
vec3 normal = normalize(viewSpaceNormal);
vec3 viewSpacePosition = normalize(viewSpacePosition);
vec3 viewSpaceSunPosition = normalize(viewSpaceSunPosition);
vec4 specular = calculateSpecular(viewSpaceSunPosition, normal, viewSpacePosition, vec4(0.7,0.7,0.7,1.0), vec4(0.6,0.6,0.6,1.0));
gl_FragColor = color+specular;
}
And the sphere looks like this:
-->Picture-link<--
with the sun position: sunPosition = new Vector(12.0f, 15.0f, -1.0f);
Try not using gl_FragCoord, as it is stored in screen coordinates (and I don't think transforming it by the modelViewMatrix will get it back to view coordinates). The easiest thing to do is to set viewSpacePosition in your vertex shader as:
// Change gl_Vertex to whatever attribute you are using.
viewSpacePosition = (modelViewMatrix * gl_Vertex).xyz;
This should get you viewSpacePosition in view coordinates (i.e. before projection is applied). You can then go ahead and normalise viewSpacePosition in the fragment shader. I'm not sure whether you are storing the sun vector in world coordinates, but you will probably want to transform it into view space and then normalise it as well. Give it a go and see what happens; these things tend to be very error-prone.
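A hedged sketch of that last suggestion, assuming sunPosition is a world-space uniform and that a separate viewMatrix uniform (view transform only, no model part) can be supplied; the original code only exposes modelViewMatrix, and normal handling is omitted here.
#version 120
uniform mat4 viewMatrix;           // assumed extra uniform: view transform only
uniform mat4 modelViewMatrix;
uniform vec3 sunPosition;          // assumed to be given in world space
varying vec3 viewSpacePosition;
varying vec3 viewSpaceSunPosition;
void main()
{
viewSpacePosition = (modelViewMatrix * gl_Vertex).xyz;
// The sun is not part of the model, so only the view transform applies to it.
viewSpaceSunPosition = (viewMatrix * vec4(sunPosition, 1.0)).xyz;
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
In the fragment shader, one option is then to build the light direction per fragment as normalize(viewSpaceSunPosition - viewSpacePosition) before passing it to calculateSpecular.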