I use glNormalPointer() to load normal values in my code. The problem is that I can't work out which normals I should load: per vertex or per face? Interestingly, I tried both variants and could not see any difference.
Could you clarify which normals I should pass to the shader?
I use normals in the following vertex shader:
#version 120
void main() {
vec3 normal, lightDir;
vec4 diffuse, ambient, globalAmbient;
float NdotL;
normal = gl_NormalMatrix * gl_Normal;
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
// Calculate color
lightDir = normalize(vec3(gl_LightSource[0].position));
NdotL = max(abs(dot(normal, lightDir)), 0.0);
diffuse = gl_Color * gl_LightSource[0].diffuse;
ambient = gl_Color * gl_LightSource[0].ambient;
globalAmbient = gl_LightModel.ambient * gl_Color;
gl_FrontColor = NdotL * diffuse + globalAmbient + ambient;
}
All vertex attribute arrays, whether built-in or user-defined, provide data for each vertex, with a 1:1 ratio between vertex shader invocations and elements fetched from the array. OpenGL has no concept of "face normals". You can use flat interpolation and knowledge of the provoking vertex to create the effect of face normals, but your input data will still have to have a normal for each vertex.
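For example (a sketch only, not part of the question's GLSL 1.20 shader): with a #version 150 compatibility shader you can declare the normal output as flat, so every fragment of a triangle receives the value from the provoking vertex, which looks like a face normal as long as each face's provoking vertex carries that face's normal:
// Hypothetical GLSL 1.50 compatibility vertex shader sketch
#version 150 compatibility
flat out vec3 faceNormal; // "flat": no interpolation, the provoking vertex's value is used for the whole triangle
void main() {
faceNormal = gl_NormalMatrix * gl_Normal;
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
On the C side, glProvokingVertex(GL_FIRST_VERTEX_CONVENTION) or glProvokingVertex(GL_LAST_VERTEX_CONVENTION) selects which vertex of each triangle supplies the flat value.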
I've been trying to make a basic static point light using shaders for an LWJGL game, but it appears as if the light moves as the camera is translated and rotated. These shaders are slightly modified from the OpenGL 4.3 guide, so I'm not sure why they aren't working as intended. Can anyone explain why, and what I can do to get them to work?
Vertex Shader:
varying vec3 color, normal;
varying vec4 vertexPos;
void main() {
color = vec3(0.4);
normal = normalize(gl_NormalMatrix * gl_Normal);
vertexPos = gl_ModelViewMatrix * gl_Vertex;
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
Fragment Shader:
varying vec3 color, normal;
varying vec4 vertexPos;
void main() {
vec3 lightPos = vec3(4.0);
vec3 lightColor = vec3(0.75);
vec3 lightDir = lightPos - vertexPos.xyz;
float lightDist = length(lightDir);
float attenuation = 1.0 / (3.0 + 0.007 * lightDist + 0.000008 * lightDist * lightDist);
float diffuse = max(0.0, dot(normal, lightDir));
vec3 ambient = vec3(0.4, 0.4, 0.4);
vec3 finalColor = color * (ambient + lightColor * diffuse * attenuation);
gl_FragColor = vec4(finalColor, 1.0);
}
If anyone's interested, I ended up finding the solution. Removing the calls to gl_NormalMatrix and gl_ModelViewMatrix solved the problem.
The critical value here, lightPos, was being compared against vertexPos, which you have expressed in eye (camera) space (this happened because its original world-space form was multiplied by the modelview matrix). Eye space moves with the camera, not with anything in the 3D world. So to have a non-moving light source at some absolute point in world space (like [4.0, 4.0, 4.0]), you could just leave your object's points in that space, as you found out.
But getting rid of modelview is not a good idea, since the whole point of the model matrix is to place your objects where they belong (so you can re-use your vertex arrays with changes only to the model matrix, instead of burdening them with specifying every single shape's vertex positions from scratch).
A better way is to perform the modelview multiplication on both vertexPos AND lightPos. This way you treat lightPos as a quantity originally given in world space, but then do the comparison in eye space. The math to get light intensities from normals works out the same in either space, and you'll get a correct-looking light source.
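As a minimal sketch of that idea (assuming the light sits at world position (4, 4, 4) and the model matrix is identity, so the built-in modelview matrix takes it into the same eye space as vertexPos), the fragment shader could compute the light direction like this:
vec3 lightPosEye = vec3(gl_ModelViewMatrix * vec4(4.0, 4.0, 4.0, 1.0)); // light into eye space
vec3 lightDir = lightPosEye - vertexPos.xyz; // both vectors now in eye space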
I'm trying to implement phong shading in GLSL but am having some issues with the specular component.
The green light is the specular component. The light (a point light) travels in a circle above the plane. The specular highlight always points inward toward the Y axis about which the light rotates and fans out toward the diffuse reflection as seen in the image. It doesn't appear to be affected at all by the positioning of the camera and I'm not sure where I'm going wrong.
Vertex shader code:
#version 330 core
/*
 * Phong Shading with Point Light (Quadratic Attenuation)
*/
//Input vertex data
layout(location = 0) in vec3 vertexPosition_modelSpace;
layout(location = 1) in vec2 vertexUVs;
layout(location = 2) in vec3 vertexNormal_modelSpace;
//Output Data; will be interpolated for each fragment
out vec2 uvCoords;
out vec3 vertexPosition_cameraSpace;
out vec3 vertexNormal_cameraSpace;
//Uniforms
uniform mat4 mvMatrix;
uniform mat4 mvpMatrix;
uniform mat3 normalTransformMatrix;
void main()
{
vec3 normal = normalize(vertexNormal_modelSpace);
//Set vertices in clip space
gl_Position = mvpMatrix * vec4(vertexPosition_modelSpace, 1);
//Set output for UVs
uvCoords = vertexUVs;
//Convert vertex and normal into eye space
vertexPosition_cameraSpace = mat3(mvMatrix) * vertexPosition_modelSpace;
vertexNormal_cameraSpace = normalize(normalTransformMatrix * normal);
}
Fragment Shader Code:
#version 330 core
in vec2 uvCoords;
in vec3 vertexPosition_cameraSpace;
in vec3 vertexNormal_cameraSpace;
//out
out vec4 fragColor;
//uniforms
uniform sampler2D diffuseTex;
uniform vec3 lightPosition_cameraSpace;
void main()
{
const float materialAmbient = 0.025; //a touch of ambient
const float materialDiffuse = 0.65;
const float materialSpec = 0.35;
const float lightPower = 2.0;
const float specExponent = 2;
//--------------Set Colors and determine vectors needed for shading-----------------
//reflection colors- NOTE- diffuse and ambient reflections will use the texture color
const vec3 colorSpec = vec3(0,1,0); //Green spec color
vec3 diffuseColor = texture2D(diffuseTex, uvCoords).rgb; //Get color from the texture at fragment
const vec3 lightColor = vec3(1,1,1); //White light
//Re-normalize normal vectors: after interpolation they may not be unit length any longer
vec3 normVertexNormal_cameraSpace = normalize(vertexNormal_cameraSpace);
//Set camera vec
vec3 viewVec_cameraSpace = normalize(-vertexPosition_cameraSpace); //Since its view space, camera at origin
//Set light vec
vec3 lightVec_cameraSpace = normalize(lightPosition_cameraSpace - vertexPosition_cameraSpace);
//Set reflect vect
vec3 reflectVec_cameraSpace = normalize(reflect(-lightVec_cameraSpace, normVertexNormal_cameraSpace)); //reflect function requires incident vec; from light to vertex
//----------------Find intensity of each component---------------------
//Determine Light Intensity
float distance = abs(length(lightPosition_cameraSpace - vertexPosition_cameraSpace));
float lightAttenuation = 1.0/( (distance > 0) ? (distance * distance) : 1 ); //Quadratic
vec3 lightIntensity = lightPower * lightAttenuation * lightColor;
//Determine Ambient Component
vec3 ambientComp = materialAmbient * diffuseColor * lightIntensity;
//Determine Diffuse Component
float lightDotNormal = max( dot(lightVec_cameraSpace, normVertexNormal_cameraSpace), 0.0 );
vec3 diffuseComp = materialDiffuse * diffuseColor * lightDotNormal * lightIntensity;
vec3 specComp = vec3(0,0,0);
//Determine Spec Component
if(lightDotNormal > 0.0)
{
float reflectDotView = max( dot(reflectVec_cameraSpace, viewVec_cameraSpace), 0.0 );
specComp = materialSpec * colorSpec * pow(reflectDotView, specExponent) * lightIntensity;
}
//Add Ambient + Diffuse + Spec
vec3 phongFragRGB = ambientComp +
diffuseComp +
specComp;
//----------------------Putting it together-----------------------
//Out Frag color
fragColor = vec4( phongFragRGB, 1);
}
Just noting that the normalTransformMatrix seen in the Vertex shader is the inverse-transpose of the model-view matrix.
I am setting a vector from the vertex position to the light, to the camera, and the reflect vector, all in camera space. For the diffuse calculation I am taking the dot product of the light vector and the normal vector, and for the specular component I am taking the dot product of the reflection vector and the view vector. Perhaps there is some fundamental misunderstanding that I have with the algorithm?
I thought at first that the problem could be that I wasn't normalizing the normals entering the fragment shader after interpolation, but adding a line to normalize didn't affect the image. I'm not sure where to look.
I know that there a lot of phong shading questions on the site, but everyone seems to have a problem that is a bit different. If anyone can see where I am going wrong, please let me know. Any help is appreciated.
EDIT: Okay its working now! Just as jozxyqk suggested below, I needed to do a mat4*vec4 operation for my vertex position or lose the translation information. When I first made the change I was getting strange results until I realized that I was making the same mistake in my OpenGL code for the lightPosition_cameraSpace before I passed it to the shader (the mistake being that I was casting down the view matrix to a mat3 for the calculation instead of setting the light position vector as a vec4). Once I edited those lines the shader appears to be working properly! Thanks for the help, jozxqk!
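A hedged sketch of what that host-side fix might look like with GLM (the variable names here are assumptions, not the actual application code):
glm::vec3 lightPosition_cameraSpace = glm::vec3(viewMatrix * glm::vec4(lightPosition_worldSpace, 1.0f)); // full mat4 * vec4, not a mat3 cast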
I can see two parts which don't look right.
"vertexPosition_cameraSpace = mat3(mvMatrix) * vertexPosition_modelSpace" should be a mat4/vec4(x,y,z,1) multiply, otherwise it ignores the translation part of the modelview matrix.
2. distance uses the light position relative to the camera and not the vertex. Use lightVec_cameraSpace instead. (edit: missed the duplicated calculation)
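In code, point 1 is a one-line change in the vertex shader (same variable names as above):
vertexPosition_cameraSpace = vec3(mvMatrix * vec4(vertexPosition_modelSpace, 1.0)); // keeps the translation part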
I'm working on an implementation of normal mapping, calculating the tangent vectors via the ASSIMP library.
The normal mapping seems to work perfectly on objects that have a model matrix close to the identity matrix. As long as I start translating and scaling, my lighting seems off. As you can see in the picture the normal mapping works perfectly on the container cube, but the lighting fails on the large floor (direction of the specular light should be towards the player, not towards the container).
I get the feeling it somehow has something to do with the position of the light (currently traversing from x = -10 to x = 10 over time) not properly being included in the calculations as long as I start changing the model matrix (via translations/scaling). I'm posting all the relevant code and hope you guys can somehow see something I'm missing since I've been staring at my code for days.
Vertex shader
#version 330
layout(location = 0) in vec3 position;
layout(location = 1) in vec3 normal;
layout(location = 2) in vec3 tangent;
layout(location = 3) in vec3 color;
layout(location = 4) in vec2 texCoord;
// fragment pass through
out vec3 Position;
out vec3 Normal;
out vec3 Tangent;
out vec3 Color;
out vec2 TexCoord;
out vec3 TangentSurface2Light;
out vec3 TangentSurface2View;
uniform vec3 lightPos;
// vertex transformation
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;
void main()
{
mat3 normalMatrix = transpose(mat3(inverse(view * model)));
Position = vec3((view * model) * vec4(position, 1.0));
Normal = normalMatrix * normal;
Tangent = tangent;
Color = color;
TexCoord = texCoord;
gl_Position = projection * view * model * vec4(position, 1.0);
vec3 light = vec3(view * vec4(lightPos, 1.0));
vec3 n = normalize(normalMatrix * normal);
vec3 t = normalize(normalMatrix * tangent);
vec3 b = cross(n, t);
mat3 mat = mat3(t.x, b.x ,n.x, t.y, b.y ,n.y, t.z, b.z ,n.z);
vec3 vector = normalize(light - Position);
TangentSurface2Light = mat * vector;
vector = normalize(-Position);
TangentSurface2View = mat * vector;
}
Fragment shader
#version 330
in vec3 Position;
in vec3 Normal;
in vec3 Tangent;
in vec3 Color;
in vec2 TexCoord;
in vec3 TangentSurface2Light;
in vec3 TangentSurface2View;
out vec4 outColor;
uniform vec3 lightPos;
uniform mat4 view;
uniform sampler2D texture0;
uniform sampler2D texture_normal; // normal
uniform float repeatFactor = 1;
void main()
{
vec4 texColor = texture(texture0, TexCoord * repeatFactor);
vec3 light = vec3(view * vec4(lightPos, 1.0));
float dist = length(light - Position);
float att = 1.0 / (1.0 + 0.01 * dist + 0.001 * dist * dist);
// Ambient
vec4 ambient = vec4(0.2);
// Diffuse
vec3 surface2light = normalize(TangentSurface2Light);
vec3 norm = normalize(texture(texture_normal, TexCoord * repeatFactor).xyz * 2.0 - 1.0);
float contribution = max(dot(norm, surface2light), 0.0);
vec4 diffuse = contribution * vec4(0.8);
// Specular
vec3 surf2view = normalize(TangentSurface2View);
vec3 reflection = reflect(-surface2light, norm); // reflection vector
float specContribution = pow(max(dot(surf2view, reflection), 0.0), 32);
vec4 specular = vec4(0.6) * specContribution;
outColor = (ambient + (diffuse * att)+ (specular * pow(att, 3))) * texColor;
}
OpenGL Drawing Code
void Render()
{
...
glm::mat4 view, projection; // Model will be done via MatrixStack
view = glm::lookAt(position, position + direction, up); // cam pos, look at (eye pos), up vec
projection = glm::perspective(45.0f, (float)width/(float)height, 0.1f, 1000.0f);
glUniformMatrix4fv(glGetUniformLocation(basicShader.shaderProgram, "view"), 1, GL_FALSE, glm::value_ptr(view));
glUniformMatrix4fv(glGetUniformLocation(basicShader.shaderProgram, "projection"), 1, GL_FALSE, glm::value_ptr(projection));
// Lighting
lightPos.x = 0.0 + sin(time / 125) * 10;
glUniform3f(glGetUniformLocation(basicShader.shaderProgram, "lightPos"), lightPos.x, lightPos.y, lightPos.z);
// Objects (use bump mapping on this cube)
bumpShader.Use();
glUniformMatrix4fv(glGetUniformLocation(bumpShader.shaderProgram, "view"), 1, GL_FALSE, glm::value_ptr(view));
glUniformMatrix4fv(glGetUniformLocation(bumpShader.shaderProgram, "projection"), 1, GL_FALSE, glm::value_ptr(projection));
glUniform3f(glGetUniformLocation(bumpShader.shaderProgram, "lightPos"), lightPos.x, lightPos.y, lightPos.z);
MatrixStack::LoadIdentity();
MatrixStack::Scale(2);
MatrixStack::ToShader(glGetUniformLocation(bumpShader.shaderProgram, "model"));
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, resources.GetTexture("container"));
glUniform1i(glGetUniformLocation(bumpShader.shaderProgram, "img"), 0);
glActiveTexture(GL_TEXTURE1); // Normal map
glBindTexture(GL_TEXTURE_2D, resources.GetTexture("container_normal"));
glUniform1i(glGetUniformLocation(bumpShader.shaderProgram, "normalMap"), 1);
glUniform1f(glGetUniformLocation(bumpShader.shaderProgram, "repeatFactor"), 1);
cubeNormal.Draw();
MatrixStack::LoadIdentity();
MatrixStack::Translate(glm::vec3(0.0f, -22.0f, 0.0f));
MatrixStack::Scale(glm::vec3(200.0f, 20.0f, 200.0f));
MatrixStack::ToShader(glGetUniformLocation(bumpShader.shaderProgram, "model"));
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, resources.GetTexture("floor"));
glActiveTexture(GL_TEXTURE1); // Normal map
glBindTexture(GL_TEXTURE_2D, resources.GetTexture("floor_normal"));
glUniform1f(glGetUniformLocation(bumpShader.shaderProgram, "repeatFactor"), 100);
cubeNormal.Draw();
MatrixStack::LoadIdentity();
glActiveTexture(GL_TEXTURE0);
...
}
EDIT
I now load my objects using the ASSIMP library with the 'aiProcess_CalcTangentSpace' flag enabled and have changed my shaders accordingly. Since ASSIMP now automatically calculates the correct tangent vectors, I should have valid tangents and my problem should be solved (as noted by Nicol Bolas), but I still have the same issue with the specular lighting acting strangely and the diffuse lighting not really showing up. I guess there is still something else that is not working correctly. I have unmarked your answer as the correct answer for now, Nicol Bolas, and updated my code, since there is still something I'm missing.
It probably has something to do with translating. As soon as I add a translation (-22.0f in the y direction) to the model matrix, the lighting reacts strangely. As long as the floor (which is actually a cube) has no translation, the lighting looks fine.
calculating the tangent vectors in the vertex shader
Well there's your problem. That's not possible for an arbitrary surface.
The tangent and bitangent are not arbitrary vectors that are perpendicular to one another. They are model-space direction vectors that point in the direction of the texture coordinates. The tangent points in the direction of the S texture coordinate, and the bitangent points in the direction of the T texture coordinate (or U and V for the tex coords, if you prefer).
This effectively computes the orientation of the texture relative to each vertex on the surface. You need this orientation, because the way the texture is mapped to the surface matters when you want to make sense of a tangent-space vector.
Remember: tangent-space is the space perpendicular to a surface. But you need to know how that surface is mapped to the object in order to know where "up" is, for example. Take a square surface. You could map a texture so that the +Y part of the square is oriented along the +T direction of the texture. Or it could be along the +X of the square. You could even map it so that the texture is distorted, or rotated at an arbitrary angle.
The tangent and bitangent vectors are intended to correct for this mapping. They point in the S and T directions in model space. So, combined with the normal, they form a transformation matrix to transform from tangent-space into whatever space the 3 vectors are in (you generally transform the NBT to camera space or whatever space you use for lighting before using them).
You cannot compute them by just taking the normal and crossing it with some arbitrary vector. That produces a vector perpendicular to the normal, but not the right one.
In order to correctly compute the tangent/bitangent, you need access to more than one vertex. You need to be able to see how the texture coordinates change over the surface of the mesh, which is how you compute the S and T directions relative to the mesh.
Vertex shaders cannot access more than one vertex. Geometry shaders can't (generally) access enough vertices to do this either. Compute the tangent/bitangent off-line on the CPU.
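As an illustration (a sketch only, with assumed variable names, not the poster's code): the usual offline computation for one triangle with positions p0..p2 and texture coordinates uv0..uv2 looks roughly like this; ASSIMP's aiProcess_CalcTangentSpace does essentially this, plus per-vertex averaging, for you:
// Per-triangle tangent/bitangent from positions and texcoords (GLM types)
glm::vec3 e1 = p1 - p0, e2 = p2 - p0; // position edges
glm::vec2 d1 = uv1 - uv0, d2 = uv2 - uv0; // texcoord edges
float r = 1.0f / (d1.x * d2.y - d2.x * d1.y); // assumes non-degenerate UVs
glm::vec3 tangent = (e1 * d2.y - e2 * d1.y) * r; // points along +S/+U
glm::vec3 bitangent = (e2 * d1.x - e1 * d2.x) * r; // points along +T/+V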
mat3 mat = mat3(t.x, b.x ,n.x, t.y, b.y ,n.y, t.z, b.z ,n.z);
Is wrong. In order to use the TBN matrix correctly, you must transpose it, like so:
mat3 mat = transpose(mat3(t.x, b.x ,n.x, t.y, b.y ,n.y, t.z, b.z ,n.z));
then use it to transform your light and view vectors into tangent space. Alternatively (and less efficiently), pass the untransposed TBN matrix to the fragment shader, and use it to transform the sampled normal into view space. It's an easy thing to miss, but very important. See http://www.opengl-tutorial.org/intermediate-tutorials/tutorial-13-normal-mapping/ for more info.
On a side note, a minor optimisation you can do for your vertex shader is to calculate the normal matrix on the CPU, once per mesh, since it is the same for all vertices of the mesh; this avoids unnecessary per-vertex calculations.
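For instance (a sketch with an assumed uniform name, not the poster's actual code), with GLM on the CPU:
glm::mat3 normalMatrix = glm::transpose(glm::inverse(glm::mat3(view * model))); // same matrix the shader computed, now once per mesh
glUniformMatrix3fv(glGetUniformLocation(bumpShader.shaderProgram, "normalMatrix"), 1, GL_FALSE, glm::value_ptr(normalMatrix));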
Per-pixel lighting is a common requirement in many OpenGL applications, as the standard fixed-function OpenGL lighting has very poor quality.
I want to use a GLSL program to have per-pixel lighting in my OpenGL program instead of per-vertex. Just diffuse lighting, but with fog, texture and texture alpha at least.
I started with this shader:
texture.vert:
varying vec3 position;
varying vec3 normal;
void main(void)
{
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
gl_FrontColor = gl_Color;
gl_TexCoord[0] = gl_MultiTexCoord0;
normal = normalize(gl_NormalMatrix * gl_Normal);
position = vec3(gl_ModelViewMatrix * gl_Vertex);
}
texture.frag:
uniform sampler2D Texture0;
uniform int ActiveLights;
varying vec3 position;
varying vec3 normal;
void main(void)
{
vec3 lightDir;
float attenFactor;
vec3 eyeDir = normalize(-position); // camera is at (0,0,0) in ModelView space
vec4 lightAmbientDiffuse = vec4(0.0,0.0,0.0,0.0);
vec4 lightSpecular = vec4(0.0,0.0,0.0,0.0);
// iterate all lights
for (int i=0; i<ActiveLights; ++i)
{
// attenuation and light direction
if (gl_LightSource[i].position.w != 0.0)
{
// positional light source
float dist = distance(gl_LightSource[i].position.xyz, position);
attenFactor = 1.0/( gl_LightSource[i].constantAttenuation +
gl_LightSource[i].linearAttenuation * dist +
gl_LightSource[i].quadraticAttenuation * dist * dist );
lightDir = normalize(gl_LightSource[i].position.xyz - position);
}
else
{
// directional light source
attenFactor = 1.0;
lightDir = gl_LightSource[i].position.xyz;
}
// ambient + diffuse
lightAmbientDiffuse += gl_FrontLightProduct[i].ambient*attenFactor;
lightAmbientDiffuse += gl_FrontLightProduct[i].diffuse * max(dot(normal, lightDir), 0.0) * attenFactor;
// specular
vec3 r = normalize(reflect(-lightDir, normal));
lightSpecular += gl_FrontLightProduct[i].specular *
pow(max(dot(r, eyeDir), 0.0), gl_FrontMaterial.shininess) *
attenFactor;
}
// compute final color
vec4 texColor = gl_Color * texture2D(Texture0, gl_TexCoord[0].xy);
gl_FragColor = texColor * (gl_FrontLightModelProduct.sceneColor + lightAmbientDiffuse) + lightSpecular;
float fog = (gl_Fog.end - gl_FogFragCoord) * gl_Fog.scale; // compute fog intensity
fog = clamp(fog, 0.0, 1.0); // clamp
gl_FragColor = mix(gl_Fog.color, gl_FragColor, fog); // blend in the fog color
}
The comments were originally in German because the code was posted on a German site, sorry.
But all this shader does is make everything very dark. No lighting effects at all, yet the shader code compiles. If I only use GL_LIGHT0 in the fragment shader, it seems to work, but only reasonably for camera-facing polygons, and my floor polygon is just extremely dark. Also, quads with RGBA textures show no sign of transparency.
I use standard glRotate/glTranslate calls for the modelview matrix, and glVertex/glNormal for my polygons. OpenGL lighting works fine apart from the fact that it looks ugly on very large surfaces. I triple-checked my normals; they are fine.
Is there something wrong in the above code?
OR
Tell me why there is no generic lighting shader for this common task (a point-based light with distance falloff: a candle, if you will). Shouldn't there be just one correct way to do this? I don't want bump/normal/parallax/toon/blur/whatever effects. I just want my lighting to perform better with larger polygons.
All tutorials I found are only useful for lighting a single object when the camera is at (0,0,0) facing orthogonally to the object. The above is the only one I found that at least looks like the thing I want to do.
I would strongly suggest you read this article to see how the standard ADS lighting is done within GLSL. That is GL 4.0, but it is not a problem to adjust it to your version:
Also, you operate in view (camera) space, so DON'T negate the eye vector:
vec3 eyeDir = normalize(-position);
I had pretty similar issues to yours because I also negated the eye vector, forgetting that it is transformed into view space. Your diffuse and specular calculations seem to be wrong too in the current scenario. In your place I wouldn't use data from the fixed pipeline at all; otherwise, what is the point of doing it in a shader?
Here is the method to calculate diffuse and specular in per-fragment ADS point lighting:
void ads( int lightIndex, out vec3 ambAndDiff, out vec3 spec )
{
vec3 s = normalize(vec3(lights[lightIndex].Position - posOut)); // surface-to-light vector
vec3 v = normalize(posOut.xyz);
vec3 n = normalize(normOut);
vec3 h = normalize(v + s); // half vector (see Blinn-Phong for what it is)
vec3 diffuse = (Ka + lights[lightIndex].Ld) * Kd * max(0.0, dot(n, s));
spec = Ks * pow(max(0.0, dot(n, h)), Shininess);
ambAndDiff = diffuse;
/// Ka - material ambient factor
/// Kd - material diffuse factor
/// Ks - material specular factor
/// lights[lightIndex].Ld - light's diffuse factor; you may also add La and Ls if you want to have even more control of the light shading.
}
Also, I wouldn't suggest using the attenuation equation you have here; it is hard to control. If you want to add light-radius-based attenuation, there is this nice blog post:
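One common radius-based form (an assumption on my part, not necessarily the formula from that post) fades the light smoothly to zero at a chosen radius:
float atten = pow(clamp(1.0 - dist / lightRadius, 0.0, 1.0), 2.0); // zero at and beyond lightRadius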
I have got a problem with my shaders. I am trying to combine textures and Phong shading in my game using GLSL, but I can't get a good result.
I've been searching Google for a long time and can't find any info on how to connect light and texture together, so I've decided to ask here.
This is my game without texture:
and this is with texture:
What I want to achieve is to make this pinkish texture better visible, with the spot in the center just like without the texture, and also to fix the per-vertex shading on the gutters and make it per-pixel shading. I don't know what is wrong now.
I have checked about 10 Phong shaders and they also gave me per-vertex, not per-pixel, shading.
This is my fragment shader code, maybe someone can see something there?
varying vec3 N;
varying vec3 v;
uniform sampler2D myTexture;
varying vec2 vTexCoord;
void main (void)
{
vec4 finalColor = vec4(0.0, 0.0, 0.0, 0.0);
vec3 L = normalize(gl_LightSource[0].position.xyz - v);
vec3 E = normalize(-v); // we are in Eye Coordinates, so EyePos is (0,0,0)
vec3 R = normalize(-reflect(L,N));
//calculate Ambient Term:
vec4 Iamb = gl_FrontLightProduct[0].ambient;
//calculate Diffuse Term:
vec4 Idiff = gl_FrontLightProduct[0].diffuse * max(dot(N,L), 0.0);
// calculate Specular Term:
vec4 Ispec = gl_FrontLightProduct[0].specular
* pow(max(dot(R,E),0.0),0.3*gl_FrontMaterial.shininess);
finalColor+=Iamb + Idiff + Ispec;
// write Total Color:
gl_FragColor = gl_FrontLightModelProduct.sceneColor + (texture2D(myTexture, vTexCoord)) + finalColor;
}
and my vertex shader
varying vec3 N;
varying vec3 v;
varying vec2 vTexCoord;
void main(void)
{
v = vec3(gl_ModelViewMatrix * gl_Vertex);
N = normalize(gl_NormalMatrix * gl_Normal);
vTexCoord = vec2(gl_MultiTexCoord0);
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
I'll be glad for help.
EDIT:
This is how I pass my texture to the shader. It works fine (I think) because I can manipulate the texture values in the shader, in some way.
int my_sampler_uniform_location = glGetUniformLocation(brickProg, "myTexture");
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, texture);
glUniform1i(my_sampler_uniform_location,0);
CUTIL::drawBox();
glBindTexture(GL_TEXTURE_2D,0);
EDIT 2:
After Nicol Bolas' suggestion about adding the colors, I have edited my shader and changed it like this:
gl_FragColor = (texture2D(myTexture, vTexCoord)) + finalColor;
Now it is only too bright, but I just have to change the lighting on the stage a little and it will be great. But I still haven't got per-pixel shading; instead I have got per-vertex shading. This is my current screen, and I have marked on the example what I mean about the shading:
This equation:
gl_FragColor = gl_FrontLightModelProduct.sceneColor + (texture2D(myTexture, vTexCoord)) + finalColor;
Does not make any sense for most textures. You are taking the color produced by the lighting equation and adding it to the color sampled from the texture. This would only make sense if the values stored in the texture represented the light-emitting properties of the surface.
Generally, color values of a texture represent the diffuse reflectance of a surface. Which means you need to incorporate them into the lighting equation directly. The texture's color should either fully replace the diffuse color from the material or it should be combined with the material diffuse color in some way.
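For example (a sketch using the question's variable names), modulating the lit color by the texture instead of adding the texture on top:
vec4 texColor = texture2D(myTexture, vTexCoord);
gl_FragColor = texColor * (Iamb + Idiff) + Ispec; // texture acts as diffuse/ambient reflectance, specular added on top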