Problems with flat and Phong shading - C++

(Edit): The original code I posted covered both Gouraud and Phong shading options. I've changed it so it is just Phong shading and posted it below. The mesh is too big to include here, as it is generated from a Bezier patch.
I'm having some problems with flat and Phong shading in OpenGL 3 (Mesa 9). No matter what I do, I get flat-shaded figures with tiny facets (planes), and I cannot get Blinn-Phong shading to work.
Here are my shaders:
(Vertex Shader)
//material parameters
uniform vec4 AmbientProduct, DiffuseProduct, SpecularProduct;
uniform float Shininess;
attribute vec4 vPosition;
//attribute vec4 vColor;
attribute vec4 vNormal;
attribute vec4 vControlColor;
attribute vec2 texcoord;
uniform mat4 model_view;
uniform mat4 projection;
uniform int flag;
uniform int phong_flag;
uniform vec4 eye_position;
//lighting parameters
uniform vec4 light_1; //light 1 position
uniform vec4 light_2; //light 2 position
varying vec4 control_color;
varying vec4 color;
varying vec4 position;
varying vec4 normal;
varying vec2 st;
void
main()
{
control_color = vControlColor;
position = vPosition;
normal = vNormal;
//tex_coords = texcoord; // tex_coords is not declared; st below carries the texture coordinates
st = texcoord;
gl_Position = projection*model_view*vPosition;
}
And my fragment shader:
//material parameters
uniform vec4 AmbientProduct, DiffuseProduct, SpecularProduct;
uniform float Shininess;
uniform vec4 eye_position;
uniform int phong_flag;
//lighting parameters
uniform vec4 light_1; //light 1 position
uniform vec4 light_2; //light 2 position
varying vec4 light_2_transformed; //light 2 transformed position
uniform int Control_Point_Flag;
uniform sampler2D texMap;
varying vec4 color;
varying vec4 position;
varying vec4 normal;
varying vec4 control_color;
varying vec2 st;
void
main()
{
vec4 N = normalize(normal);
vec4 E = normalize(eye_position - position);
vec4 L1 = normalize(light_1 - position);
vec4 L2 = normalize(light_2 - position);
vec4 H1 = normalize( L1 + E);
vec4 H2 = normalize( L2 + E);
//calculate ambient component
vec4 ambient = AmbientProduct;
//calculate diffuse component
float k_d_1 = max(dot(L1,N), 0.0);
float k_d_2 = max(dot(L2,N), 0.0);
vec4 diffuse1 = k_d_1*DiffuseProduct;
vec4 diffuse2 = k_d_2*DiffuseProduct;
//calculate specular component
float k_s_1 = pow(max(dot(N, H1), 0.0), Shininess);
float k_s_2 = pow(max(dot(N, H2), 0.0), Shininess);
vec4 specular1 = k_s_1*SpecularProduct;
vec4 specular2 = k_s_2*SpecularProduct;
//if the light is behind the surface, zero out the specular term
if (dot(L1, N) < 0.0) {
specular1 = vec4(0.0, 0.0, 0.0, 1.0);
}
if (dot(L2, N) < 0.0) {
specular2 = vec4(0.0, 0.0, 0.0, 1.0);
}
vec4 final_color = ambient + diffuse1 + diffuse2 + specular1 + specular2;
final_color.a = 1.0;
/* gl_FragColor = final_color; */
gl_FragColor = final_color*texture2D(texMap, st);
}
Does everything look ok for my shaders?

Things worth noting:
You have a ModelView matrix in your vertex shader, but you never use it when calculating position. Your vertex "position"s are thus whatever is passed in from your OpenGL application and are not affected by any transformations you may be trying to do, though the vertices still land in the right place on screen because you do use the matrices for gl_Position.
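For illustration, here is a minimal sketch of that fix, keeping your uniform and varying names and assuming you want the lighting done in eye space (eye_space_position is just an illustrative local name):
vec4 eye_space_position = model_view * vPosition;   // transform once
position = eye_space_position;                      // the fragment shader now receives an eye-space position
gl_Position = projection * eye_space_position;      // reuse the transformed position
If you do that, light_1, light_2 and eye_position must be supplied in (or transformed into) the same eye space, otherwise the lighting vectors will mix coordinate spaces.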
You aren't passing a Normal matrix to your shader. The Normal matrix is the transpose of the inverse of the (upper-left 3x3 of the) ModelView matrix. Calculate it outside of the shader and pass it in. If you don't multiply your normals by the Normal matrix, you'll still be able to transform your model, but the normals will all still be facing the same way, so your lighting will be incorrect.
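A minimal sketch of the shader side, where normal_matrix is an illustrative uniform name for the mat3 you would compute on the CPU (the transpose of the inverse of the upper-left 3x3 of model_view) and upload alongside the other matrices:
uniform mat3 normal_matrix;   // = transpose(inverse(mat3(model_view))), computed outside the shader
// then inside main(), instead of normal = vNormal;
normal = vec4(normalize(normal_matrix * vNormal.xyz), 0.0);   // the normal now lives in the same space as the lit position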
However, your normal vectors on the OpenGL side may well be the culprit. See this question for a good explanation of a possible source of unwanted flat shading.
As a side note, both of your shaders seem more complicated than they should be. That is to say, they have too many variables that aren't used and too much stuff that you could condense into fewer lines. It's just housekeeping, but it will make keeping track of your code easier.

Related

Normal mapping working incorrectly, weird half-light effect

We are trying to implement normal mapping in our 2D game engine and get a weird effect.
If the normal is set manually like this:
vec3 Normal = vec3(0.0, 0.0, 1.0) the light works correctly, but we don't get the "deep" effect that we want to achieve with normal mapping:
But if we take the normal from the normal map texture, vec3 Normal = texture(NormalMap, TexCoord).rgb, it doesn't work at all. What should not be illuminated is illuminated and vice versa (such as the gaps between the bricks). Besides this, there is a dark area on the bottom (or top, depending on the position of the light) side of the texture.
Although the texture of the normal map itself looks fine:
This is our fragment shader:
#version 330 core
layout (location = 0) out vec4 FragColor;
in vec2 TexCoord;
in vec2 FragPos;
uniform sampler2D OurTexture;
uniform sampler2D NormalMap;
struct point_light
{
vec3 Position;
vec3 Color;
};
uniform point_light Light;
void main()
{
vec4 Color = texture(OurTexture, TexCoord);
vec3 Normal = texture(NormalMap, TexCoord).rgb;
if (Color.a < 0.1)
discard;
vec3 LightDir = vec3(Light.Position.xy - FragPos, Light.Position.z);
float D = length(LightDir);
vec3 L = normalize(LightDir);
Normal = normalize(Normal * 2.0 - 1.0);
vec3 Diffuse = Light.Color * max(dot(Normal, L), 0);
vec3 Ambient = vec3(0.3, 0.3, 0.3);
vec3 Falloff = vec3(1, 0, 0);
float Attenuation = 1.0 /(Falloff.x + Falloff.y*D + Falloff.z*D*D);
vec3 Intensity = (Ambient + Diffuse) * Attenuation;
FragColor = Color * vec4(Intensity, 1);
}
And vertex as well:
#version 330 core
layout (location = 0) in vec2 aPosition;
layout (location = 1) in vec2 aTexCoord;
uniform mat4 Transform;
uniform mat4 ViewProjection;
out vec2 FragPos;
out vec2 TexCoord;
void main()
{
gl_Position = ViewProjection * Transform * vec4(aPosition, 0.0, 1.0);
TexCoord = aTexCoord;
FragPos = vec2(Transform * vec4(aPosition, 0.0, 1.0));
}
I googled this and found some people who get the same result, but their questions remained unanswered.
Any idea what the cause is?
What texture format are you using for the normal map? SRGB, SNORM, etc? That might be the issue. Try UNORM.
Additionally, since you are not using a tangent space, make sure the plane's Z axis aligns with the Z axis of the normals. Also, OpenGL reads Y in the reversed direction, so you need to flip the Y coordinates of the normals that you read from the normal map. Alternatively, you can use a reversed-Y normal map (green pointing down).
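For example, the unpacking in the fragment shader above could flip Y like this (a sketch only; alternatively, bake the flip into the normal map itself):
vec3 Normal = texture(NormalMap, TexCoord).rgb;
Normal = normalize(Normal * 2.0 - 1.0);   // unpack from [0,1] into [-1,1]
Normal.y = -Normal.y;                     // flip Y to account for the reversed texture direction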

How to implement flat shading in LWJGL with index and vertex buffers?

I am currently developing a program that displays buildings and terrain in 3D using Java's LWJGL framework, which I understand is very similar to OpenGL. I am trying to differentiate between the sides of buildings using flat shading. For example, a cube generated by my program should look like this:
To achieve this, I attempted to implement the method described here: https://gamedev.stackexchange.com/questions/152991/how-can-i-calculate-normals-using-a-vertex-and-index-buffer.
It seems this produces vertex shading, which colors every pixel in a sort of gradient. My implementation currently looks like this:
Obviously this doesn't look like what I desired. Here are my vertex and fragment shaders:
#version 150
in vec3 position;
in vec2 textureCoordinates;
in vec3 normal;
out vec2 pass_textureCoordinates;
out vec3 surfaceNormal;
out vec3 toLightVector;
out vec3 toCameraVector;
uniform mat4 transformationMatrix;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform vec3 lightPosition;
void main(void){
vec4 worldPosition = transformationMatrix * vec4(position, 1.0);
gl_Position = projectionMatrix * viewMatrix * worldPosition;
pass_textureCoordinates = textureCoordinates;
surfaceNormal = (transformationMatrix * vec4(normal, 0.0)).xyz;
toLightVector = lightPosition - worldPosition.xyz;
toCameraVector = (inverse(viewMatrix) * vec4(0.0,0.0,0.0,1.0)).xyz - worldPosition.xyz;
}
#version 150
in vec2 pass_textureCoordinates;
in vec3 surfaceNormal;
in vec3 toLightVector;
in vec3 toCameraVector;
out vec4 out_Color;
uniform sampler2D modelTexture;
uniform vec3 lightColour;
uniform float shineDamper;
uniform float reflectivity;
void main(void){
vec3 unitNormal = normalize(surfaceNormal);
vec3 unitLightVector = normalize(toLightVector);
float nDotl = dot(unitNormal, unitLightVector);
float brightness = max(nDotl, 0.7);
vec3 diffuse = brightness * lightColour;
vec3 unitVectorToCamera = normalize(toCameraVector);
vec3 lightDirection = -unitLightVector;
vec3 reflectedLightDirection = reflect(lightDirection, unitNormal);
float specularFactor = dot(reflectedLightDirection, unitVectorToCamera);
specularFactor = max(specularFactor, 0.0);
float dampedFactor = pow(specularFactor, shineDamper);
vec3 finalSpecular = dampedFactor * reflectivity * lightColour;
out_Color = vec4(diffuse, 1.0) * texture(modelTexture,pass_textureCoordinates) + vec4(finalSpecular, 1.0);
}
If a cube-shaped building I'm trying to render is indexed like so:
How can I create the array of normals programmatically for flat shading?
The "sort of gradient" is caused by the specular highlights. If the normal vectors are normal to the faces of the cube, then a Lambertian diffuse light will give you a flat look.
All you have to do is initialize the uniform variable reflectivity to 0.0 (which is actually the default value).
It is also possible to compute the face normal vectors from partial derivatives (dFdx, dFdy) in the fragment shader, or with a cross product in a geometry shader, but these options cost some quality and performance.
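For reference, the derivative approach could look roughly like this (a sketch only; v_worldPosition is an illustrative varying carrying the world-space position from the vertex shader, and the resulting normal's orientation depends on your winding order):
#version 150
in vec3 v_worldPosition;   // illustrative: world-space position forwarded by the vertex shader
out vec4 out_Color;
void main(void){
    // dFdx/dFdy of the interpolated position lie in the triangle's plane,
    // so their cross product is a normal that is constant across each face
    vec3 faceNormal = normalize(cross(dFdx(v_worldPosition), dFdy(v_worldPosition)));
    out_Color = vec4(faceNormal * 0.5 + 0.5, 1.0);   // visualize the face normal for debugging
}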
See
From within a fragment shader how do I get the surface - not vertex - normal
3D object is colored in a way that looks like a 2D object

Processing outputs nothing using Gouraud shading

Sorry, I am new to OpenGL ES and Processing.
The Processing code and shaders below output only the background.
PShader Gouraud,Phong;
rocket = loadShape("rocket.obj");
rocket.setFill(color(800, 0, 0));
Gouraud= loadShader("gouraudfragment.glsl","gouraudvertex.glsl");
Phong= loadShader("phongfragment.glsl","phongvertex.glsl");
background(0);
pushMatrix();
shader(Gouraud);
translate(130,height/2.0);
rotateY(rc);
rotateX(0.4);
noStroke();
fill(#800080);
box(100);
rc+=(0.02+speedCube);
rc*=dirCube;
popMatrix();
pushMatrix();
shader(Gouraud);
translate(width/2, height/2 + 100, -200);
rotateZ(PI);
rotateY(rr);
shape(rocket,100,100);
rr +=( 0.02+speedRocket);
rr*=dirRocket;
popMatrix();
Vertex shader:
varying vec3 N;
varying vec3 v;
varying vec4 diffuse;
varying vec4 spec;
attribute vec4 position;
attribute vec3 normal;
uniform mat4 modelview;
uniform mat4 projectionMatrix;
uniform mat3 normalMatrix;
uniform vec4 lightPosition;
uniform vec3 lightAmbient;
uniform vec3 lightDiffuse;
uniform vec3 lightSpecular;
uniform float SpecularPower;
void main()
{
vec4 diffuse;
vec4 spec;
vec4 ambient;
v = vec3(modelview * position);
N = normalize(normalMatrix * normal);
gl_Position = projectionMatrix * position;
vec3 L = normalize(lightPosition.xyz - v);
vec3 E = normalize(-v);
vec3 R = normalize(reflect(-L,N));
ambient = vec4(lightAmbient,100.0);
diffuse = vec4(clamp( lightDiffuse * max(dot(N,L), 0.0) , 0.0, 1.0 ) ,100.0);
spec = vec4(clamp (lightSpecular * pow(max(dot(R,E),0.0),0.3*SpecularPower) , 0.0, 1.0 ),100.0);
color = ambient + diffuse + spec;
}
Fragment shader:
void main()
{
gl_FragColor = color;
}
please help!
before applying Gouraud shading
after applying Gouraud shading
The Processing sketch loads the obj, draws a cube, and applies a Gouraud shader, but after that only the background is shown; the loaded obj and the cube are gone. Nothing is shown!
The shader doesn't even compile and link. The vertex shader has one varying output (color), so the fragment shader needs the matching input declaration:
varying vec4 color;
When you set the clip-space position, the vertex coordinate has to be transformed by both the model-view and the projection matrix:
gl_Position = projectionMatrix * modelview * position;
The type specifications of v and N are missing, and ambient, diffuse and spec are declared as vec4 rather than vec3.
Vertex shader:
attribute vec4 position;
attribute vec3 normal;
varying vec4 color;
uniform mat4 modelview;
uniform mat4 projectionMatrix;
uniform mat3 normalMatrix;
uniform vec4 lightPosition;
uniform vec3 lightAmbient;
uniform vec3 lightDiffuse;
uniform vec3 lightSpecular;
uniform float SpecularPower;
void main()
{
vec3 v = vec3(modelview * position);
vec3 N = normalize(normalMatrix * normal);
gl_Position = projectionMatrix * modelview * position;
vec3 L = normalize(lightPosition.xyz - v);
vec3 E = normalize(-v);
vec3 R = normalize(reflect(-L,N));
vec4 ambient = vec4(lightAmbient,100.0);
vec4 diffuse = vec4(clamp( lightDiffuse * max(dot(N,L), 0.0) , 0.0, 1.0 ) ,100.0);
vec4 spec = vec4(clamp (lightSpecular * pow(max(dot(R,E),0.0),0.3*SpecularPower) , 0.0, 1.0 ),100.0);
color = ambient + diffuse + spec;
}
Fragment shader:
varying vec4 color;
void main()
{
gl_FragColor = color;
}
Of course, you have to set at least an ambient light source with ambientLight().
You can use a directionalLight(), pointLight() or spotLight(), too.
But note that your shader can handle only one light source. More than one light source would produce
OpenGL error 1282 at top endDraw(): invalid operation
If you want to use more than one light source, then you have to use uniform arrays in the vertex shader for lightPosition, lightAmbient, lightDiffuse, and lightSpecular, as sketched below. See Types of shaders in Processing (https://processing.org/tutorials/pshader/).
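A rough outline of what that could look like in the corrected vertex shader above, assuming Processing's array-style light uniforms (a lightCount plus fixed-size arrays, typically of size 8) and reusing v, N and E from the code above; treat it as a sketch, not a drop-in replacement:
// replaces the single-light uniforms
uniform int lightCount;
uniform vec4 lightPosition[8];
uniform vec3 lightAmbient[8];
uniform vec3 lightDiffuse[8];
uniform vec3 lightSpecular[8];
// inside main(), after computing v, N and E as before:
vec3 acc = vec3(0.0);
for (int i = 0; i < 8; ++i) {
    if (i >= lightCount) break;   // only the lights that were actually set
    vec3 L = normalize(lightPosition[i].xyz - v);
    vec3 R = normalize(reflect(-L, N));
    acc += lightAmbient[i]
         + clamp(lightDiffuse[i]  * max(dot(N, L), 0.0), 0.0, 1.0)
         + clamp(lightSpecular[i] * pow(max(dot(R, E), 0.0), 0.3 * SpecularPower), 0.0, 1.0);
}
color = vec4(acc, 1.0);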

How do you incorporate per-pixel lighting in shaders with libGDX?

So far I've managed to write a shader using Xoppa's tutorials and use an AssetManager, and I've managed to bind a texture to the model, and it looks fine.
Now the next step, I guess, would be to create diffuse lighting (not sure if that's the word? Phong shading?) to give the bunny some form of shading. While I have a little bit of experience with GLSL shaders in LWJGL, I'm unsure how to process that same information so I can use it in libGDX and in the GLSL shaders.
I understand that this all could be accomplished using the Environment class etc. But I want to achieve this through the shaders alone or by traditional means simply for the challenge.
In LWJGL, shaders would have uniforms:
in vec3 position;
in vec2 textureCoordinates;
in vec3 normal;
out vec2 pass_textureCoordinates;
out vec3 surfaceNormal;
out vec3 toLightVector;
uniform mat4 transformationMatrix;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform vec3 lightPosition;
This would be reasonably easy for me to calculate in LWJGL.
Vertex file:
attribute vec3 a_position;
attribute vec3 a_normal;
attribute vec2 a_texCoord0;
uniform mat4 u_worldTrans;
uniform mat4 u_projViewTrans;
varying vec2 v_texCoords;
void main() {
v_texCoords = a_texCoord0;
gl_Position = u_projViewTrans * u_worldTrans * vec4(a_position, 1.0);
}
I imagine that I could implement the uniforms similarly to the LWJGL GLSL example, but I don't know how to apply these uniforms in libGDX and have it work. I am unsure what u_projViewTrans is; I'm assuming it is a combination of the projection, transformation and view matrices, and that the uniform is set with camera.combined?
If someone could help me understand the process, or point to an example of how per-pixel lighting can be implemented with just u_projViewTrans and u_worldTrans, I'd greatly appreciate your time and effort in helping me understand these concepts a bit better.
Here's my GitHub upload of my work in progress: here
You can do the light calculations in world space. A simple lambertian diffuse light can be calculated like this:
vec3 toLightVector = normalize( lightPosition - vertexPosition );
float lightIntensity = max( 0.0, dot( normal, toLightVector ));
A detailed explanation can be found in the answer to the Stack Overflow question How does this faking the light work on aerotwist?.
While Gouraud shading calculates the light in the vertex shader, Phong shading calculates the light in the fragment shader.
(see further GLSL fixed function fragment program replacement)
A Gouraud shader may look like this:
Vertex Shader:
attribute vec3 a_position;
attribute vec3 a_normal;
attribute vec2 a_texCoord0;
uniform mat4 u_worldTrans;
uniform mat4 u_projViewTrans;
uniform vec3 lightPosition;
varying vec2 v_texCoords;
varying float v_lightIntensity;
void main()
{
vec4 vertPos = u_worldTrans * vec4(a_position, 1.0);
vec3 normal = normalize(mat3(u_worldTrans) * a_normal);
vec3 toLightVector = normalize(lightPosition - vertPos.xyz);
v_lightIntensity = max( 0.0, dot(normal, toLightVector));
v_texCoords = a_texCoord0;
gl_Position = u_projViewTrans * vertPos;
}
Fragment Shader:
varying vec2 v_texCoords;
varying float v_lightIntensity;
uniform sampler2D u_texture;
void main()
{
vec4 texCol = texture2D( u_texture, v_texCoords.st );
gl_FragColor = vec4( texCol.rgb * v_lightIntensity, 1.0 );
}
A Phong shader may look like this:
Vertex Shader:
attribute vec3 a_position;
attribute vec3 a_normal;
attribute vec2 a_texCoord0;
uniform mat4 u_worldTrans;
uniform mat4 u_projViewTrans;
varying vec2 v_texCoords;
varying vec3 v_vertPosWorld;
varying vec3 v_vertNVWorld;
void main()
{
vec4 vertPos = u_worldTrans * vec4(a_position, 1.0);
v_vertPosWorld = vertPos.xyz;
v_vertNVWorld = normalize(mat3(u_worldTrans) * a_normal);
v_texCoords = a_texCoord0;
gl_Position = u_projViewTrans * vertPos;
}
Fragment Shader:
varying vec2 v_texCoords;
varying vec3 v_vertPosWorld;
varying vec3 v_vertNVWorld;
uniform sampler2D u_texture;
struct PointLight
{
vec3 color;
vec3 position;
float intensity;
};
uniform PointLight u_pointLights[1];
void main()
{
vec3 toLightVector = normalize(u_pointLights[0].position - v_vertPosWorld.xyz);
float lightIntensity = max( 0.0, dot(v_vertNVWorld, toLightVector));
vec4 texCol = texture2D( u_texture, v_texCoords.st );
vec3 finalCol = texCol.rgb * lightIntensity * u_pointLights[0].color;
gl_FragColor = vec4( finalCol.rgb, 1.0 );
}

How to make a retro/neon/glow effect using shaders?

Let's say the concept is to create a map consisting of cubes with a neon aesthetic, such as:
Currently I have this vertex shader:
// Uniforms
uniform mat4 u_projection;
uniform mat4 u_view;
uniform mat4 u_model;
// Vertex attributes
in vec3 a_position;
in vec3 a_normal;
in vec2 a_texture;
vec3 u_light_direction = vec3(1.0, 2.0, 3.0);
// Vertex shader outputs
out vec2 v_texture;
out float v_intensity;
void main()
{
vec3 normal = normalize((u_model * vec4(a_normal, 0.0)).xyz);
vec3 light_dir = normalize(u_light_direction);
v_intensity = max(0.0, dot(normal, light_dir));
v_texture = a_texture;
gl_Position = u_projection * u_view * u_model * vec4(a_position, 1.0);
}
And this pixel shader:
in float v_intensity;
in vec2 v_texture;
uniform sampler2D u_texture;
out vec4 fragColor;
void main()
{
fragColor = texture(u_texture, v_texture) * vec4(v_intensity, v_intensity, v_intensity, 1.0);
}
How would I use this to create a neon effect such as in the example for 3D cubes? The cubes are simply models with a mesh/material. The only change would be to set the material color to black and the outlines to a bright pink or blue (maybe with a glow).
Any help is appreciated. :)
You'd normally implement this as a post-processing effect: first render with bright, saturated colours into a texture, then apply a bloom effect when drawing that texture to the screen.
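As a very rough sketch, the bright-pass step of such a bloom chain could look like this in the same GLSL style as the shaders above (u_scene and u_threshold are illustrative names; the blur and the final additive-combine passes are not shown):
// bright pass: keep only the pixels above a luminance threshold;
// the result is then blurred and added back on top of the original scene
in vec2 v_texture;                 // full-screen quad texture coordinates
uniform sampler2D u_scene;         // illustrative: the scene rendered into an offscreen texture
uniform float u_threshold;         // illustrative: e.g. around 0.7, or 1.0 if you render in HDR
out vec4 fragColor;
void main()
{
    vec3 color = texture(u_scene, v_texture).rgb;
    float luma = dot(color, vec3(0.2126, 0.7152, 0.0722));   // Rec. 709 luminance
    fragColor = vec4(luma > u_threshold ? color : vec3(0.0), 1.0);
}
The blurred result of this pass is then drawn additively over the normally rendered scene, which is what produces the glow around bright edges.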