GLSL normals change with camera rotation - opengl

I know that similar questions have been asked many times, but unfortunately I am unable to find the source of my problem.
With the help of tutorials I've written a small GLSL shader. Right now it can work with ambient light and load normals from a normal map. The issue is that the directional light seems to depend on my viewing angle.
Here are my shaders:
//Vertex Shader
#version 120
attribute vec3 position;
attribute vec2 texCoord;
attribute vec3 normal;
attribute vec3 tangent;
varying vec2 texCoord0;
varying mat3 tbnMatrix;
uniform mat4 transform;
void main(){
gl_Position=transform * vec4(position, 1.0);
texCoord0 = texCoord;
vec3 n = normalize((transform*vec4(normal,0.0)).xyz);
vec3 t = normalize((transform*vec4(tangent,0.0)).xyz);
t=normalize(t-dot(t,n)*n);
vec3 btTangent=cross(t,n);
tbnMatrix=transpose(mat3(t,btTangent,n));
}
//Fragment Shader
#version 120
varying vec2 texCoord0;
varying mat3 tbnMatrix;
struct BaseLight{
vec3 color;
float intensity;
};
struct DirectionalLight{
BaseLight base;
vec3 direction;
};
uniform sampler2D diffuse;
uniform sampler2D normalMap;
uniform vec3 ambientLight;
uniform DirectionalLight directionalLight;
vec4 calcLight(BaseLight base, vec3 direction,vec3 normal){
float diffuseFactor=dot(normal,normalize(direction));
vec4 diffuseColor = vec4(0,0,0,0);
if(diffuseFactor>0){
diffuseColor=vec4(base.color,1.0)* base.intensity *diffuseFactor;
}
return diffuseColor;
}
vec4 calcDirectionalLight(DirectionalLight directionalLight ,vec3 normal){
return calcLight(directionalLight.base,directionalLight.direction,normal);
}
void main(){
vec3 normal =tbnMatrix*(255.0/128.0* texture2D(normalMap,texCoord0).xyz-255.0/256.0);
vec4 totalLight = vec4(ambientLight,0) +calcDirectionalLight(directionalLight, normal);
gl_FragColor=texture2D(diffuse,texCoord0)*totalLight;
}
"transform" matrix that I send to the shader is summarily computed this way:
viewProjection=m_perspective* glm::lookAt(CameraPosition,CameraPosition+m_forward,m_up);
glm::mat4 transform = viewProjection * object_matrix;
"object_matrix" is a matrix that I get directly from physics engine.(I think it's the matrix that defines position and rotation of the object in the world space, correct me if I'm wrong.)
And I guess that "transform" matrix is computed correctly, since all the objects are drawn in the right positions. It looks like the problem is related to normals because if I set gl_FragColor = vec4(normal, 0) the color is also changing with camera rotation.
I would greatly appreciate if anyone could point me to my mistake
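One thing that stands out in the vertex shader: the normal and the tangent are transformed by transform, which is the full projection * view * model matrix, so the camera gets baked into the TBN basis and the basis rotates with the view. Lighting vectors in world space only need the model matrix (or its inverse transpose, if the model uses non-uniform scaling). A minimal sketch of the corrected main(), assuming a second uniform is added for the model matrix alone (objectMatrix is a hypothetical name; the attribute and varying declarations stay as in the original shader) and that directionalLight.direction is given in world space:
uniform mat4 transform;    // projection * view * model, as before
uniform mat4 objectMatrix; // hypothetical new uniform: model-to-world only
void main(){
gl_Position = transform * vec4(position, 1.0);
texCoord0 = texCoord;
// build the lighting basis in world space so it no longer follows the camera
vec3 n = normalize((objectMatrix * vec4(normal, 0.0)).xyz);
vec3 t = normalize((objectMatrix * vec4(tangent, 0.0)).xyz);
t = normalize(t - dot(t, n) * n);
vec3 btTangent = cross(t, n);
// no transpose: mat3(t, btTangent, n) maps the sampled tangent-space
// normal into world space, which is what the fragment shader needs
tbnMatrix = mat3(t, btTangent, n);
}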

Related

How do you incorporate per pixel lighting in shaders with LIBGDX?

So I've currently managed to write a shader using the Xoppa tutorials and an AssetManager, and I managed to bind a texture to the model, and it looks fine.
Now the next step, I guess, would be to add diffuse lighting (not sure if that's the word? Phong shading?) to give the bunny some form of shading. While I have a little bit of experience with GLSL shaders in LWJGL, I'm unsure how to process that same information so I can use it in libGDX and its GLSL shaders.
I understand that this all could be accomplished using the Environment class etc., but I want to achieve this through the shaders alone, or by traditional means, simply for the challenge.
In LWJGL, my shaders would have these inputs and uniforms:
in vec3 position;
in vec2 textureCoordinates;
in vec3 normal;
out vec2 pass_textureCoordinates;
out vec3 surfaceNormal;
out vec3 toLightVector;
uniform mat4 transformationMatrix;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform vec3 lightPosition;
This would be reasonably easy for me to calculate in LWJGL.
Vertex file:
attribute vec3 a_position;
attribute vec3 a_normal;
attribute vec2 a_texCoord0;
uniform mat4 u_worldTrans;
uniform mat4 u_projViewTrans;
varying vec2 v_texCoords;
void main() {
v_texCoords = a_texCoord0;
gl_Position = u_projViewTrans * u_worldTrans * vec4(a_position, 1.0);
}
I imagine that I could implement the uniforms similarly to the LWJGL GLSL example, but I don't know how to apply these uniforms in libGDX and have it work. I am unsure what u_projViewTrans is; I'm assuming it is a combination of the projection, transformation and view matrices, and that I'd set the uniform with camera.combined?
If someone could help me understand the process, or point to an example of how per-pixel lighting can be implemented with just u_projViewTrans and u_worldTrans, I'd greatly appreciate your time and effort in helping me understand these concepts a bit better.
Here's my github upload of my work in progress: here
You can do the light calculations in world space. (And yes: in libGDX, u_projViewTrans is the combined projection and view matrix, set from camera.combined, while u_worldTrans holds the model's world transform.) A simple Lambertian diffuse light can be calculated like this:
vec3 toLightVector = normalize( lightPosition - vertexPosition );
float lightIntensity = max( 0.0, dot( normal, toLightVector ));
A detailed explanation can be found in the answer to the Stack Overflow question How does this faking the light work on aerotwist?.
While Gouraud shading calculates the light in the vertex shader, Phong shading calculates the light in the fragment shader.
(see further GLSL fixed function fragment program replacement)
A Gouraud shader may look like this:
Vertex Shader:
attribute vec3 a_position;
attribute vec3 a_normal;
attribute vec2 a_texCoord0;
uniform mat4 u_worldTrans;
uniform mat4 u_projViewTrans;
uniform vec3 lightPosition;
varying vec2 v_texCoords;
varying float v_lightIntensity;
void main()
{
vec4 vertPos = u_worldTrans * vec4(a_position, 1.0);
vec3 normal = normalize(mat3(u_worldTrans) * a_normal);
vec3 toLightVector = normalize(lightPosition - vertPos.xyz);
v_lightIntensity = max( 0.0, dot(normal, toLightVector));
v_texCoords = a_texCoord0;
gl_Position = u_projViewTrans * vertPos;
}
Fragment Shader:
varying vec2 v_texCoords;
varying float v_lightIntensity;
uniform sampler2D u_texture;
void main()
{
vec4 texCol = texture2D( u_texture, v_texCoords.st );
gl_FragColor = vec4( texCol.rgb * v_lightIntensity, 1.0 );
}
A Phong shader may look like this:
Vertex Shader:
attribute vec3 a_position;
attribute vec3 a_normal;
attribute vec2 a_texCoord0;
uniform mat4 u_worldTrans;
uniform mat4 u_projViewTrans;
varying vec2 v_texCoords;
varying vec3 v_vertPosWorld;
varying vec3 v_vertNVWorld;
void main()
{
vec4 vertPos = u_worldTrans * vec4(a_position, 1.0);
v_vertPosWorld = vertPos.xyz;
v_vertNVWorld = normalize(mat3(u_worldTrans) * a_normal);
v_texCoords = a_texCoord0;
gl_Position = u_projViewTrans * vertPos;
}
Fragment Shader:
varying vec2 v_texCoords;
varying vec3 v_vertPosWorld;
varying vec3 v_vertNVWorld;
uniform sampler2D u_texture;
struct PointLight
{
vec3 color;
vec3 position;
float intensity;
};
uniform PointLight u_pointLights[1];
void main()
{
vec3 toLightVector = normalize(u_pointLights[0].position - v_vertPosWorld.xyz);
float lightIntensity = max( 0.0, dot(v_vertNVWorld, toLightVector));
vec4 texCol = texture2D( u_texture, v_texCoords.st );
vec3 finalCol = texCol.rgb * lightIntensity * u_pointLights[0].color;
gl_FragColor = vec4( finalCol.rgb, 1.0 );
}

Parallax Corrected Cubemaps in Opengl?

I have been trying to implement parallax-corrected local cubemaps in OpenGL for a little while now, but I've not really managed to get anywhere. Does anybody know where to start?
Here is my current shader code:
Fragment:
#version 330
in vec2 TexCoord;
in vec3 Normal;
in vec3 Position;
in vec3 Color;
out vec4 color;
uniform samplerCube CubeMap;
uniform vec3 CameraPosition;
void main() {
vec4 OutColor = vec4(Color,1.0);
vec3 normal = normalize(Normal);
vec3 view = normalize(Position-CameraPosition);
vec3 ReflectionVector = reflect(view,normal);
vec4 ReflectionColor = texture(CubeMap,ReflectionVector);
OutColor = mix(OutColor,ReflectionColor,0.5);
color = OutColor;
}
Vertex:
#version 330
layout (location = 0) in vec3 position;
layout (location = 2) in vec3 normal;
layout (location = 1) in vec2 texCoord;
layout (location = 3) in vec3 color;
out vec2 TexCoord;
out vec3 Normal;
out vec3 Position;
out vec3 Color;
uniform mat4 ModelMatrix;
uniform mat4 ViewMatrix;
uniform mat4 ProjectionMatrix;
void main()
{
gl_Position = ProjectionMatrix * ViewMatrix * ModelMatrix * vec4(position, 1.0f);
TexCoord=texCoord;
Normal = normal;
Position = vec4(ModelMatrix * vec4(position,1.0)).xyz;
Color = color;
}
Position = vec4(ModelMatrix * vec4(position,1.0)).xyz;
This puts the Position output variable into world space. Assuming the CameraPosition uniform is also in world space, then:
vec3 view = normalize(Position-CameraPosition);
view will also be in world space.
However, Normal comes directly from the vertex attribute. So unless you are updating each vertex's Normal value every time you rotate the object, that value will probably be in model space. And therefore:
vec3 ReflectionVector = reflect(view,normal);
This statement is incoherent. view is in one space, while normal is in another. And you can't really perform reasonable operations on vectors that are in different coordinate systems.
You need to make sure that Normal, Position, and CameraPosition are all in the same space. If you want that space to be world space, then you need to transform Normal into world space. Which in the general case requires computing the inverse/transpose of your model-to-world matrix.
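In GLSL that could look like the following sketch (inverse() on a mat3 needs GLSL 1.40+, which #version 330 satisfies; in practice you would compute this matrix once per draw call on the CPU and upload it as a uniform instead of inverting per vertex):
// in the vertex shader: carry the normal into world space
mat3 normalMatrix = transpose(inverse(mat3(ModelMatrix)));
Normal = normalize(normalMatrix * normal);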

Multi textures and multi lights in OpenGL 3.3

I have a project of a castle and I send one light and one material to the shaders. I want to add one more light and one more texture, but I don't know how to do it in the shaders.
This is my fragment shader:
#version 130
out vec4 pixelColor;
in vec4 l;
in vec4 n;
in vec4 v;
uniform sampler2D textureMap0, textureMap1;
in vec2 iTexCoord;
in vec2 iTexCoord2;
uniform vec4 dragi1;
uniform vec4 dragi2;
uniform vec4 dragi3;
uniform float polysk;
uniform vec4 Light0ambient;
uniform vec4 Light0diffuse;
uniform vec4 Light0specular;
uniform vec4 Light0position;
uniform vec4 Material0emission;
uniform vec4 Material0ambient;
uniform vec4 Material0diffuse;
uniform vec4 Material0specular;
uniform float Material0shininess;
void main(void) {
vec4 ml=normalize(l);
vec4 mn=normalize(n);
vec4 mv=normalize(v);
vec4 mr=reflect(-ml,mn);
float nl=max(dot(ml,mn),0);
float rv=pow(max(dot(mr,mv),0),Material0shininess);
pixelColor=Light0ambient*Material0ambient+Light0diffuse*Material0diffuse*vec4(nl,nl,nl,1)+Light0specular*Material0specular*vec4(rv,rv,rv,0);
}
And this is my vertex shader:
#version 130
uniform mat4 P;
uniform mat4 V;
uniform mat4 M;
in vec4 vertex;
in vec4 normal;
in vec2 texCoord;
out vec4 l;
out vec4 n;
out vec4 v;
out vec2 iTexCoord;
out vec2 iTexCoord2;
uniform vec4 Light0position;
void main(void) {
gl_Position=P*V*M*vertex;
l=normalize(V*(Light0position-M*vertex));
n=normalize(V*M*normal);
v=normalize(vec4(0,0,0,1)-V*M*vertex);
iTexCoord = texCoord;
iTexCoord2 = (n.xy + 1)/2;
}
Is it simple to do?
And one more question: how do I create a spotlight? Can I do this in a simple way?
When you want to use multiple lights in your shaders, you generally store all your light and material data in arrays and process each of them in one large for-loop, with or without helper GLSL functions.
For example, you could create a light and a material struct and create a uniform array of those structs like this:
struct Light {
vec4 position;
vec4 ambient;
vec4 diffuse;
vec4 specular;
};
struct Material {
vec4 ambient;
vec4 diffuse;
vec4 specular;
vec4 emission;
float shininess;
};
uniform Light lights[4]; // Use 4 lights
uniform Material materials[2]; // Use 2 materials
Then the structure of your fragment shader could look something like this:
out vec4 color;
void main()
{
// define an output color value ('output' is a reserved word in GLSL,
// so use a different name, and initialize it before accumulating)
vec3 result = vec3(0.0);
// calculate lighting value per light source
for(int i = 0; i < 4; i++)
result += someFunctionToCalculateLight(lights[i], materials[0]);
color = vec4(result, 1.0);
}
In this case a function was created to calculate the resulting lighting color per light source and added to the final output color. This is generally how you work with multiple lights. Note that this is a very basic example.
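For completeness, here is a minimal sketch of what someFunctionToCalculateLight could look like, reusing the Blinn/Phong-style math from the question's fragment shader. It assumes the varyings n (normal) and v (to-viewer vector) from the original shaders, plus a new varying pos holding the view-space vertex position, and that each light's position has already been transformed into that same space:
vec3 someFunctionToCalculateLight(Light light, Material material)
{
vec4 ml = normalize(light.position - pos); // direction to this light
vec4 mn = normalize(n);
vec4 mv = normalize(v);
vec4 mr = reflect(-ml, mn);
float nl = max(dot(ml, mn), 0.0); // diffuse term
float rv = pow(max(dot(mr, mv), 0.0), material.shininess); // specular term
vec4 c = light.ambient * material.ambient
+ light.diffuse * material.diffuse * vec4(nl, nl, nl, 1.0)
+ light.specular * material.specular * vec4(rv, rv, rv, 0.0);
return c.rgb;
}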
Also, if you want to know how to create a spotlight, you're better off following tutorials on that subject, of which there are plenty online, instead of asking around here.
A few tutorials:
http://www.mbsoftworks.sk/index.php?page=tutorials&series=1&tutorial=20
http://www.lighthouse3d.com/tutorials/glsl-core-tutorial/spotlights/
http://www.learnopengl.com/#!Lighting/Light-casters
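For a rough idea of the core math anyway: a spotlight is essentially a point light restricted to a cone. A minimal sketch of the cone test for the fragment shader, where all of the names are hypothetical uniforms rather than anything from the question (pos would again be the view-space fragment position):
uniform vec4 spotPosition; // view-space spotlight position
uniform vec3 spotDirection; // view-space direction the cone points in
uniform float spotCosCutoff; // cosine of the cone half-angle, e.g. cos(radians(20.0))
float spotFactor(vec4 fragPos)
{
vec3 toFrag = normalize((fragPos - spotPosition).xyz);
float c = dot(toFrag, normalize(spotDirection));
// zero outside the cone, fading in smoothly toward the center
return c > spotCosCutoff ? smoothstep(spotCosCutoff, 1.0, c) : 0.0;
}
Multiplying a point light's diffuse and specular terms by this factor turns it into a spotlight.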

Problems with flat and phong shading

(Edit): The original code I posted was for both Gouraud and Phong shading options. I've changed it so it is just Phong shading and posted it below. The mesh is too big to describe here, as it is generated from a Bezier patch.
I'm having some problems with flat and Phong shading in OpenGL 3 on Mesa 9. It seems that no matter what I do I get flat-shaded figures with tiny facets (planes), and I cannot get Blinn-Phong shading to work.
Here are my shaders:
(Vertex Shader)
//material parameters
uniform vec4 AmbientProduct, DiffuseProduct, SpecularProduct;
uniform float Shininess;
attribute vec4 vPosition;
//attribute vec4 vColor;
attribute vec4 vNormal;
attribute vec4 vControlColor;
attribute vec2 texcoord;
uniform mat4 model_view;
uniform mat4 projection;
uniform int flag;
uniform int phong_flag;
uniform vec4 eye_position;
//lighting parameters
uniform vec4 light_1; //light 1 position
uniform vec4 light_2; //light 2 position
varying vec4 control_color;
varying vec4 color;
varying vec4 position;
varying vec4 normal;
varying vec2 st;
void
main()
{
control_color = vControlColor;
position = vPosition;
normal = vNormal;
st = texcoord;
gl_Position = projection*model_view*vPosition;
}
And my fragment shader:
//material parameters
uniform vec4 AmbientProduct, DiffuseProduct, SpecularProduct;
uniform float Shininess;
uniform vec4 eye_position;
uniform int phong_flag;
//lighting parameters
uniform vec4 light_1; //light 1 position
uniform vec4 light_2; //light 2 position
varying vec4 light_2_transformed; //light 2 transformed position
uniform int Control_Point_Flag;
uniform sampler2D texMap;
varying vec4 color;
varying vec4 position;
varying vec4 normal;
varying vec4 control_color;
varying vec2 st;
void
main()
{
vec4 N = normalize(normal);
vec4 E = normalize(eye_position - position);
vec4 L1 = normalize(light_1 - position);
vec4 L2 = normalize(light_2 - position);
vec4 H1 = normalize( L1 + E);
vec4 H2 = normalize( L2 + E);
//calculate ambient component
vec4 ambient = AmbientProduct;
//calculate diffuse componenent
float k_d_1 = max(dot(L1,N), 0.0);
float k_d_2 = max(dot(L2,N), 0.0);
vec4 diffuse1 = k_d_1*DiffuseProduct;
vec4 diffuse2 = k_d_2*DiffuseProduct;
//calculate specular componenent
float k_s_1 = pow(max(dot(N, H1), 0.0), Shininess);
float k_s_2 = pow(max(dot(N, H2), 0.0), Shininess);
vec4 specular1 = k_s_1*SpecularProduct;
vec4 specular2 = k_s_2*SpecularProduct;
//if specular color is behind the camera, discard it
if (dot(L1, N) < 0.0) {
specular1 = vec4(0.0, 0.0, 0.0, 1.0);
}
if (dot(L2, N) < 0.0) {
specular2 = vec4(0.0, 0.0, 0.0, 1.0);
}
vec4 final_color = ambient + diffuse1 + diffuse2 + specular1 + specular2;
final_color.a = 1.0;
/* gl_FragColor = final_color; */
gl_FragColor = final_color*texture2D(texMap, st);
}
Does everything look ok for my shaders?
Things worth noting:
You have a variable for a ModelView matrix in your vertex shader, but you never use it when calculating position. Your vertex "position"s are thus whatever is passed from your OpenGL application and are not affected by any transformations you may be trying to do, though the vertices are still placed correctly on screen because you do use the matrices for gl_Position.
You aren't passing a Normal matrix to your shader. The Normal matrix is calculated by taking the inverse transpose of the ModelView matrix. Calculate this outside of the shader and pass it in (see the sketch after these notes). If you don't multiply your normals by the Normal matrix, you'll still be able to transform your model, but the normals will all still be facing the same way, so your lighting will be incorrect.
However, your normal vectors on the OpenGL side may well be the culprit. See this question for a good explanation of a possible source of unwanted flat shading.
As a side note, both of your shaders seem more complicated than they should be. That is to say, they have too many variables that aren't used and too much stuff that you could condense into fewer lines. It's just housekeeping, but it will make keeping track of your code easier.
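Putting the first two notes together, here is a sketch of how the vertex shader's main() might look. It assumes a new normal_matrix uniform (kept as a mat4 here so it multiplies the vec4 normal directly), computed on the application side as the inverse transpose of model_view; eye_position, light_1 and light_2 would then also have to be supplied in eye space:
uniform mat4 normal_matrix; // transpose(inverse(model_view)), computed on the CPU
void
main()
{
control_color = vControlColor;
// move position and normal into eye space so every vector used for
// lighting in the fragment shader lives in one consistent space
position = model_view * vPosition;
normal = normalize(normal_matrix * vNormal);
st = texcoord;
gl_Position = projection*model_view*vPosition;
}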

Why does my OpenGL Phong shader behave like a flat shader?

I've been learning OpenGL for the past couple of weeks and I've run into some trouble implementing a Phong shader. It appears to do no interpolation between vertexes despite my use of the smooth qualifier. Am I missing something here? To give credit where credit is due, the code for the vertex and fragment shaders cribs heavily from the OpenGL SuperBible Fifth Edition. I would highly recommend this book!
Vertex Shader:
#version 330
in vec4 vVertex;
in vec3 vNormal;
uniform mat4 mvpMatrix; // mvp = ModelViewProjection
uniform mat4 mvMatrix; // mv = ModelView
uniform mat3 normalMatrix;
uniform vec3 vLightPosition;
smooth out vec3 vVaryingNormal;
smooth out vec3 vVaryingLightDir;
void main(void) {
vVaryingNormal = normalMatrix * vNormal;
vec4 vPosition4 = mvMatrix * vVertex;
vec3 vPosition3 = vPosition4.xyz / vPosition4.w;
vVaryingLightDir = normalize(vLightPosition - vPosition3);
gl_Position = mvpMatrix * vVertex;
}
Fragment Shader:
#version 330
out vec4 vFragColor;
uniform vec4 ambientColor;
uniform vec4 diffuseColor;
uniform vec4 specularColor;
smooth in vec3 vVaryingNormal;
smooth in vec3 vVaryingLightDir;
void main(void) {
float diff = max(0.0, dot(normalize(vVaryingNormal), normalize(vVaryingLightDir)));
vFragColor = diff * diffuseColor;
vFragColor += ambientColor;
vec3 vReflection = normalize(reflect(-normalize(vVaryingLightDir),normalize(vVaryingNormal)));
float spec = max(0.0, dot(normalize(vVaryingNormal), vReflection));
if(diff != 0) {
float fSpec = pow(spec, 32.0);
vFragColor.rgb += vec3(fSpec, fSpec, fSpec);
}
}
This (public domain) image from Wikipedia shows exactly what sort of image I'm getting and what I'm aiming for -- I'm getting the "flat" image but I want the "Phong" image.
Any help would be greatly appreciated. Thank you!
edit: If it makes a difference, I'm using PyOpenGL 3.0.1 and Python 2.6.
edit2:
Solution
It turns out the problem was with my geometry; Kos was correct. For anyone else that's having this problem with Blender models, Kos pointed out that doing Edit->Faces->Set Smooth does the trick. I found that Wings 3D worked "out of the box."
As an addition to this answer, here is a simple geometry shader which will let you visualize your normals. Modify the accompanying vertex shader as needed based on your attribute locations and how you send your matrices.
But first, a picture of a giant bunny head from our friend the Stanford bunny as an example of the result!
Major warning: do note that I get away with transforming the normals with the modelview matrix instead of a proper normal matrix. This won't work correctly if your modelview contains non-uniform scaling. Also, the lengths of your normals won't be correct, but that matters little if you just want to check their direction.
Vertex shader:
#version 330
layout(location = 0) in vec4 position;
layout(location = 1) in vec4 normal;
layout(location = 2) in mat4 mv;
out Data
{
vec4 position;
vec4 normal;
vec4 color;
mat4 mvp;
} vdata;
uniform mat4 projection;
void main()
{
vdata.mvp = projection * mv;
vdata.position = position;
vdata.normal = normal;
}
Geometry shader:
#version 330
layout(triangles) in;
layout(line_strip, max_vertices = 6) out;
in Data
{
vec4 position;
vec4 normal;
vec4 color;
mat4 mvp;
} vdata[3];
out Data
{
vec4 color;
} gdata;
void main()
{
const vec4 green = vec4(0.0f, 1.0f, 0.0f, 1.0f);
const vec4 blue = vec4(0.0f, 0.0f, 1.0f, 1.0f);
for (int i = 0; i < 3; i++)
{
gl_Position = vdata[i].mvp * vdata[i].position;
gdata.color = green;
EmitVertex();
gl_Position = vdata[i].mvp * (vdata[i].position + vdata[i].normal);
gdata.color = blue;
EmitVertex();
EndPrimitive();
}
}
Fragment shader:
#version 330
in Data
{
vec4 color;
} gdata;
out vec4 outputColor;
void main()
{
outputColor = gdata.color;
}
Hmm... You're interpolating the normal as a varying variable, so the fragment shader should receive the correct per-pixel normal.
The only explanation I can think of for your getting the result in the left image is that every fragment on a given face ultimately receives the same normal. You can confirm this with a fragment shader like:
void main() {
vFragColor = vec4(normalize(vVaryingNormal), 1.0);
}
If that's the case, the question remains: why? The vertex shader looks OK.
So maybe there's something wrong in your geometry? What is the data which you send to the shader? Are you sure you have correctly calculated per-vertex normals, not just per-face normals?
The orange lines are normals of the diagonal face, the red lines are normals of the horizontal face.
If your data looks like the above image, then even with a correct shader you'll get flat shading. Make sure that you have correct per-vertex normals like on the lower image. (They are really simple to calculate for a sphere.)
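For example, for a sphere the per-vertex normal is simply the normalized direction from the sphere's center to the vertex; in GLSL terms (sphereCenter being a hypothetical uniform or constant):
// valid for a sphere only: every surface normal points away from the center
vec3 sphereNormal = normalize(vertexPosition.xyz - sphereCenter);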