OpenGL, fragment shader: No such uniform named

I have an issue: I want my light to come from my object. I managed to pass its position as a uniform, but I can't pass its rotation matrix.
My fragment shader:
#version 120
varying vec3 coordonnee_3d;
varying vec3 coordonnee_3d_locale;
varying vec3 normale;
varying vec4 color;
uniform sampler2D texture;
// vec3 light=vec3(0.5,0.5,5.0);
uniform vec4 pos_lamp;
uniform mat4 rot_lamp;
vec3 light = vec3(pos_lamp.x, pos_lamp.y, pos_lamp.z);
void main (void)
{
vec3 n = normalize(normale);
vec3 d = normalize(light-coordonnee_3d_locale);
mat4 test = rot_lamp;
vec3 r = reflect(d,n);
vec3 o = normalize(-coordonnee_3d_locale);
float diffuse = 0.7*clamp(dot(n,d),0.0,1.0);
float specular = -0.2*pow(clamp(dot(r,o),0.0,1.0),128.0);
float ambiant = 0.2;
vec4 white = vec4(1.0,1.0,1.0,0.0);
vec2 tex_coord = gl_TexCoord[0].xy;
vec4 color_texture = texture2D(texture,tex_coord);
vec4 color_final = color*color_texture;
gl_FragColor = (ambiant+diffuse)*color_final+specular*white;
}
And in my display_callback function in main.cpp:
static void display_callback(){
glUniformMatrix4fv(get_uni_loc(shader_program_id, "rotation_view"), 1, false, pointeur(transformation_view.rotation));
PRINT_OPENGL_ERROR();
vec3 cv = transformation_view.rotation_center;
glUniform4f(get_uni_loc(shader_program_id, "rotation_center_view"), cv.x, cv.y, cv.z, 0.0f);
PRINT_OPENGL_ERROR();
vec3 tv = transformation_view.translation;
glUniform4f(get_uni_loc(shader_program_id, "translation_view"), tv.x, tv.y, tv.z, 0.0f);
PRINT_OPENGL_ERROR();
glUniform4f(get_uni_loc(shader_program_id, "pos_lamp"), transformation_model_1.translation.x, transformation_model_1.translation.y, transformation_model_1.translation.z, 0.0f);
PRINT_OPENGL_ERROR();
glUniformMatrix4fv(get_uni_loc(shader_program_id, "rot_lamp"), 1, false, pointeur(transformation_model_1.rotation));
PRINT_OPENGL_ERROR();
}
But I have this issue: No such uniform named "rot_lamp"

Following on from the comment, you have...
uniform sampler2D texture;
// ...
mat4 test = rot_lamp;
So rot_lamp is only ever used in the assignment to test, and test is never used after this statement. Since setting rot_lamp therefore cannot affect the output of the fragment shader, it will be removed from the shader program.

This is the only place where the rot_lamp uniform is used:
mat4 test = rot_lamp;
Since test is used nowhere else and thus doesn't affect the output of the shader, the GLSL compiler has elided the rot_lamp uniform. This is expected and is not an error (unless you intended the uniform to actually be used in the calculation of the output).
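In practice this shows up as glGetUniformLocation returning -1 for "rot_lamp". Below is a minimal sketch of a tolerant lookup, assuming a plain program id (this is not the asker's get_uni_loc helper, which presumably raises the error shown):
#include <cstdio>
// Hedged sketch: query the location and treat -1 as "inactive/optimized out"
// instead of a hard error. Assumes a GL loader header (e.g. GLEW) is included.
GLint uniform_location_or_warn(GLuint program, const char* name)
{
    GLint loc = glGetUniformLocation(program, name);
    if (loc == -1)
        std::fprintf(stderr, "uniform '%s' is inactive (unused or optimized out)\n", name);
    return loc;  // passing -1 to glUniform* is silently ignored by OpenGL
}
The real fix, of course, is to actually use rot_lamp in the lighting computation (for example to transform the light direction), so that the compiler keeps the uniform active.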

Related

Why is the light source not moving when I change its position?

I've been working with OpenGL for a while now, and I'm currently working on diffuse lighting. I've set up a way to change the light source, but my object acts as if the light were in the same place each time.
I've repeatedly checked whether my fragment shader and vertex shader are correct, but no problems seem to arise.
This is the light position sent to the core shader:
vec3 lightPos0 = vec3(0.f, 0.f, 2.f);
glUniform3fv(glGetUniformLocation(coreShader.getID(), "lightPos0"), 1, value_ptr(lightPos0));
And here are the vertex and fragment shaders (with all the usual variables implied to exist):
#version 450
layout (location = 0) in vec3 vertex_position;
layout (location = 1) in vec3 vertex_color;
layout (location = 2) in vec2 vertex_texcoord;
layout (location = 3) in vec3 vertex_normal;
out vec3 vs_position;
out vec3 vs_color;
out vec2 vs_texcoord;
out vec3 vs_normal;
uniform mat4 ModelMatrix;
uniform mat4 ViewMatrix;
uniform mat4 ProjectionMatrix;
void main()
{
vs_position = vec4(ModelMatrix * vec4(vertex_position, 1.f)).xyz;
vs_color = vertex_color;
vs_texcoord = vec2(vertex_texcoord.x, vertex_texcoord.y * -1.f);
vs_normal = mat3(ModelMatrix) * vertex_normal;
gl_Position = ProjectionMatrix * ViewMatrix * ModelMatrix * vec4(vertex_position, 1.f);
}
#version 450
in vec3 vs_position;
in vec3 vs_color;
in vec2 vs_texcoord;
in vec3 vs_normal;
out vec4 fs_color;
uniform sampler2D texture0;
uniform sampler2D texture1;
uniform vec3 lightPos0;
void main()
{
//Ambient light
vec3 ambientLight = vec3(0.1f, 0.1f, 0.1f);
//Diffuse light
vec3 posToLightDirVec = normalize(lightPos0 - vs_position);
vec3 diffuseColor = vec3(1.f, 1.f, 1.f);
float diffuse = max(dot(posToLightDirVec, vs_normal), 0.0);
vec3 diffuseFinal = diffuseColor * diffuse;
fs_color =
(texture(texture0, vs_texcoord)) * vec4(vs_color, 1.f)
* (vec4(ambientLight, 1.f) + vec4(diffuseFinal, 1.f));
}
If you take all of this and build roughly what this program looks like, you'll notice that it doesn't quite matter where you put the light point. You always have to move back slightly to actually see anything. Also, you'll see that it isn't quite diffuse, but looks more like specular lighting.
If you need any more code, ask me in the comments.
You have to install the program object as part of the current rendering state before you can set the value of the uniform variable lightPos0:
glUseProgram(coreShader.getID());
Note that glUniform3fv changes the value of a uniform in the default uniform block of the currently installed program.
Of course you can get the index of an active program resource (e.g. with glGetUniformLocation) before the program is installed. For that it is sufficient that the program is linked; it is not necessary that it is the current program.
If you compare the functions glGetUniformLocation and glUniform, you can see that the program object is a parameter of the former but not of the latter. To use glUniform, the program has to be current.
Since OpenGL 4.1, glProgramUniform is provided, which sets the value of a uniform variable for a specified program object.
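For illustration, a minimal sketch that reuses the names from the question (coreShader and lightPos0 are assumed to exist as shown above):
// Option 1: make the program current, then use glUniform*.
GLuint program = coreShader.getID();
GLint  loc     = glGetUniformLocation(program, "lightPos0");
glUseProgram(program);                              // install the program first
glUniform3fv(loc, 1, value_ptr(lightPos0));         // affects the current program
// Option 2 (OpenGL 4.1+): target a specific program without binding it.
glProgramUniform3fv(program, loc, 1, value_ptr(lightPos0));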
This is your problem:
vec3 posToLightDirVec = normalize(lightPos0 - vs_position);
vs_position needs to be in the same space as lightPos0. If it is still in local (object) space, transform it first (note that ModelMatrix would then also need to be declared as a uniform in the fragment shader):
vec3 posToLightDirVec = normalize(lightPos0 - (ModelMatrix * vec4(vs_position, 1.0)).xyz);

uniform sampler2D in Vertex Shader

I tried to implement a height map with GLSL.
For that, I need to send my picture to the vertex shader and read its grey component.
glActiveTexture(GL_TEXTURE0);
Texture.bind();
glUniform1i(mShader.getUniformLocation("heightmap"), 0);
mShader.getUniformLocation uses glGetUniformLocation and works fine for the other uniform values used in the fragment and vertex shaders, but for heightmap it returns -1...
Vertex shader code:
#version 330 core
layout (location = 0) in vec3 position;
layout (location = 1) in vec4 color;
layout (location = 2) in vec2 texCoords;
layout (location = 3) in vec3 normal;
out vec3 Normal;
out vec3 FragPos;
out vec2 TexCoords;
out vec4 ourColor;
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;
uniform sampler2D heightmap;
void main()
{
float bias = 0.25;
float h = 0.0;
float scale = 5.0;
h = scale * ((texture2D(heightmap, texCoords).r) - bias);
vec3 hnormal = vec3(normal.x*h, normal.y*h, normal.z*h);
vec3 position1 = position * hnormal;
gl_Position = projection * view * model * vec4(position1, 1.0f);
FragPos = vec3(model * vec4(position, 1.0f));
Normal = mat3(transpose(inverse(model))) * normal;
ourColor = color;
TexCoords = texCoords;
}
Maybe my algorithm for getting the height is bad, but the error with getting the uniform location stops my work.
What is wrong? Any ideas?
UPD: texCoords (not TexCoords) is of course what should be used in
h = scale * ((texture2D(heightmap, texCoords).r) - bias);
My mistake, but it doesn't solve the problem; I still get the same error.
My bet is that your variable has been optimized out by the driver, or that the shader did not compile/link properly. After trying to compile your shader (on my nVidia) I got this in the logs:
0(9) : warning C7050: "TexCoords" might be used before being initialized
You should always check the GLSL compile/link logs; see
How to debug GLSL Fragment shader
especially how glGetShaderInfoLog is used.
In the line
h = scale * ((texture2D(heightmap, TexCoords).r) - bias);
you are using TexCoords, which is an output variable and not yet set, so the behavior is undefined. Most likely your graphics driver threw that line away (and maybe others), removing TexCoords from the shader completely, but that is just my assumption.
What driver and graphics card have you got?
What do the logs return on your setup?
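For reference, here is a minimal sketch of pulling that compile log yourself, assuming a raw shader object id and an already-included GL loader header:
#include <cstdio>
#include <vector>
// Print the compile status and info log of a shader object; driver warnings
// such as "might be used before being initialized" also end up in this log.
void print_shader_log(GLuint shader)
{
    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);

    GLint len = 0;
    glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &len);
    if (len > 1) {
        std::vector<char> log(len);
        glGetShaderInfoLog(shader, len, nullptr, log.data());
        std::fprintf(stderr, "%s\n", log.data());
    }
    if (ok != GL_TRUE)
        std::fprintf(stderr, "shader compilation failed\n");
}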

Light with vertex/fragment shader. Using varying variables. (OpenGL)

I searched a lot for this problem. I found the question Passing data into different shaders, but that problem is not mine. I get the error message "The fragment shader uses varying "normal", but previous shader does not write to it."
My vertex shader code:
#version 430
in layout(location=0) vec3 position;
in layout(location=1) vec3 normal;
out vec3 norm;
uniform mat4 transformation;
void main()
{
gl_Position = transformation * vec4(position, 1.0);
norm = (transformation * vec4(normal, 0.0)).xyz;
}
And my fragment shader code:
#version 430
in vec3 normal;
out vec4 colour;
vec3 lightPos = vec3(0,50,0);
vec3 lightColor = vec3(0.5, 0, 0);
vec3 materialColor = vec3(0, 1.0, 0);
void main() {
float cosTheta = dot(-lightPos, normalize(normal));
vec3 temp = materialColor * lightColor * cosTheta;
colour = vec4(temp, 1.0);
}
What is the main problem? I don't understand this message: my vertex shader uses the normal vector and passes it into the fragment shader. I don't see the difference between the linked code and mine. Please give me some idea :\
An out variable in the vertex shader is matched to an in variable in the fragment shader by name (or by an explicit location); here the vertex shader writes norm, but the fragment shader reads normal. If you want to use different variable names for some reason, you can specify a location to match the in- and output variables.
For example, in your case:
.vert:
out layout(location = 7) vec3 norm;
.frag:
in layout(location = 7) vec3 normal;
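As an aside, the quoted message is a link-time error, so it can also be caught programmatically. A sketch, assuming a raw program object id and an already-included GL loader header:
#include <cstdio>
#include <vector>
// Print the program info log if linking failed; mismatched in/out interfaces
// between stages are reported here.
void check_program_link(GLuint program)
{
    GLint linked = GL_FALSE;
    glGetProgramiv(program, GL_LINK_STATUS, &linked);
    if (linked == GL_TRUE)
        return;

    GLint len = 0;
    glGetProgramiv(program, GL_INFO_LOG_LENGTH, &len);
    std::vector<char> log(len > 1 ? len : 1);
    glGetProgramInfoLog(program, static_cast<GLsizei>(log.size()), nullptr, log.data());
    std::fprintf(stderr, "link error: %s\n", log.data());
}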

PassThrough Geometry Shader Not Working

I am using a fragment shader and a vertex shader at present, and they work absolutely fine. I cannot get my geometry shader working. I am absolutely new to it; below is what I have tried.
I am using VBOs, lighting and textures along with some geometry, and it all works fine before using the geometry shader. The only thing I have changed is the variable names, since I had to take the input in the geometry shader and pass the output on. So I have appended 1 to the end of the variable names that go out from the geometry shader to the fragment shader.
Also, I have added headers starting with #, which were not there earlier. I am using GL_TRIANGLES to draw.
VertexShader
in vec4 position;
in vec4 color1;
in vec4 normal;
in vec2 texCoord;
uniform sampler2D Tex1;
uniform int use_texture;
out vec4 pcolor;
out vec3 N;
out vec3 L;
out vec3 R;
out vec3 V;
uniform mat4 local2clip;
uniform mat4 local2eye;
uniform mat4 normal_matrix;
uniform mat4 world2eye;
uniform vec4 light_ambient;
uniform vec4 light_diffuse;
uniform vec4 light_specular;
uniform vec4 light_pos;
#version 330 compatibility
uniform vec4 mat_ambient;
uniform vec4 mat_diffuse;
uniform vec4 mat_specular;
uniform float mat_shine;
//varying vec3 v_normal; // vertex normal
out vec4 v_color; // vertex color
out vec4 pos_in_eye; //vertex position in eye space
out vec2 FtexCoord;
void main(){
gl_Position = local2clip * position;
N = normalize(vec3(normal_matrix * normal)); //v_normal
vec4 Lpos = world2eye * light_pos; //light pos. in eye
vec4 Vpos = local2eye * position; //pos_in_eye
L = normalize(vec3(Lpos - Vpos)); //light_vector
R = normalize(reflect(-L, N));
V = normalize(vec3(-Vpos)); //eye vector
vec3 halfv = normalize(L+V);
FtexCoord = texCoord;
//pcolor = color1;
}
This is my FragmentShader
#version 330 compatibility
uniform int use_texture;
in vec4 pcolor;
in vec3 N1;
in vec3 L1;
in vec3 R1;
in vec3 V1;
uniform mat4 local2clip;
uniform mat4 local2eye;
uniform mat4 normal_matrix;
uniform mat4 world2eye;
uniform vec4 light_ambient;
uniform vec4 light_diffuse;
uniform vec4 light_specular;
uniform vec4 light_pos;
uniform vec4 mat_ambient;
uniform vec4 mat_diffuse;
uniform vec4 mat_specular;
uniform float mat_shine;
uniform sampler2D Tex1;
in vec2 FtexCoord1;
void main() {
vec4 ambient = light_ambient * mat_ambient;
float NdotL;
if (dot(N1,L1) <0.0) NdotL = 0.0;
else NdotL = dot(N1, L1);
vec4 diffuse = light_diffuse * mat_diffuse * NdotL;
float RdotV;
RdotV = dot(R1, V1);
if (NdotL == 0.0) RdotV = 0.0;
if (RdotV <0.0) RdotV = 0.0;
vec4 specular = light_specular * mat_specular * pow(RdotV,mat_shine);
vec4 texcolor;
if( use_texture == 1 ) {
texcolor = texture2D(Tex1, FtexCoord1);
gl_FragColor = texcolor;
}
else
gl_FragColor = (diffuse + ambient + specular);
}
This is my GeometryShader
#version 330
layout (triangles) in;
layout (triangles) out;
layout (max_vertices = 3) out;
out vec3 N1;
out vec3 L1;
out vec3 R1;
out vec3 V1;
in vec3 N;
in vec3 L;
in vec3 R;
in vec3 V;
uniform mat4 local2clip;
uniform mat4 local2eye;
uniform mat4 normal_matrix;
uniform mat4 world2eye;
uniform vec4 light_ambient;
uniform vec4 light_diffuse;
uniform vec4 light_specular;
uniform vec4 light_pos;
uniform vec4 mat_ambient;
uniform vec4 mat_diffuse;
uniform vec4 mat_specular;
uniform float mat_shine;
//varying vec3 v_normal; // vertex normal
out vec4 v_color1; // vertex color
out vec4 pos_in_eye1; //vertex position in eye space
out vec2 FtexCoord1;
in vec4 v_color; // vertex color
in vec4 pos_in_eye; //vertex position in eye space
in vec2 FtexCoord;
void main(void)
{
int i;
N1=N;
L1=L;
R1=R;
V1=R;
FtexCoord1=FtexCoord;
v_color1=v_color;
pos_in_eye1=pos_in_eye;
for (i = 0; i < gl_in.length(); i++)
{
gl_Position = gl_in[i].gl_Position;
EmitVertex();
}
EndPrimitive();
}
I just want whatever was there earlier to be passed from the vertex shader to the fragment shader via the geometry shader, so that I can manipulate the shader later. Currently the screen is just black.
The core of your problem is that you didn't bother to check for compilation errors when you built your Geometry Shader. I know that because I can see several syntax errors in it. In particular:
in vec3 N;
in vec3 L;
in vec3 R;
in vec3 V;
in vec4 v_color; // vertex color
in vec4 pos_in_eye; //vertex position in eye space
in vec2 FtexCoord;
Geometry Shader inputs are always aggregated into arrays. Remember: a geometry shader operates on primitives, which are defined as a collection of one or more vertices. Each GS invocation therefore gets a set of per-vertex input values, one for each vertex in the primitive type defined by your layout in qualifier.
Notice how you loop over the number of vertices in a primitive and use gl_in[i] to get the input value for each vertex in the primitive. That's how you need to access all of your Geometry Shader inputs. And you need to write each one to its corresponding output variable, then call EmitVertex. All in that loop.
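Below is a hedged sketch of what the corrected pass-through stage could look like, written here as a C++ raw string ready for glShaderSource and using the names from the question: every per-vertex input becomes an array, and each element is copied to the matching output inside the emit loop. Note also that the output primitive must be a strip type such as triangle_strip; plain triangles is not a valid output layout.
// Sketch only, not the original answer's code: a corrected pass-through
// geometry shader. Per-vertex inputs are arrays; each element is forwarded to
// the matching output before EmitVertex() so the fragment shader sees
// per-vertex values again.
const char* geometry_src = R"(
#version 330
layout (triangles) in;
layout (triangle_strip, max_vertices = 3) out;   // output must be a *_strip type

in vec3 N[];  in vec3 L[];  in vec3 R[];  in vec3 V[];
in vec4 v_color[];  in vec4 pos_in_eye[];  in vec2 FtexCoord[];

out vec3 N1;  out vec3 L1;  out vec3 R1;  out vec3 V1;
out vec4 v_color1;  out vec4 pos_in_eye1;  out vec2 FtexCoord1;

void main()
{
    for (int i = 0; i < gl_in.length(); ++i) {
        N1 = N[i];  L1 = L[i];  R1 = R[i];  V1 = V[i];
        v_color1    = v_color[i];
        pos_in_eye1 = pos_in_eye[i];
        FtexCoord1  = FtexCoord[i];
        gl_Position = gl_in[i].gl_Position;
        EmitVertex();
    }
    EndPrimitive();
}
)";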

Problems with flat and phong shading

(Edit): The original code I posted had both Gouraud and Phong shading options. I've changed it so it is just Phong shading and posted it below. The mesh is too big to describe here, as it is generated from a Bezier patch.
I'm having some problems with flat and Phong shading in OpenGL 3 / Mesa 9. It seems that no matter what I do I get flat-shaded figures with tiny facets (planes), and I cannot get Blinn-Phong shading to work.
Here are my shaders:
(Vertex Shader)
//material parameters
uniform vec4 AmbientProduct, DiffuseProduct, SpecularProduct;
uniform float Shininess;
attribute vec4 vPosition;
//attribute vec4 vColor;
attribute vec4 vNormal;
attribute vec4 vControlColor;
attribute vec2 texcoord;
uniform mat4 model_view;
uniform mat4 projection;
uniform int flag;
uniform int phong_flag;
uniform vec4 eye_position;
//lighting parameters
uniform vec4 light_1; //light 1 position
uniform vec4 light_2; //light 2 position
varying vec4 control_color;
varying vec4 color;
varying vec4 position;
varying vec4 normal;
varying vec2 st;
void
main()
{
control_color = vControlColor;
position = vPosition;
normal = vNormal;
tex_coords = texcoord;
st = texcoord;
gl_Position = projection*model_view*vPosition;
}
And my fragment shader:
//material parameters
uniform vec4 AmbientProduct, DiffuseProduct, SpecularProduct;
uniform float Shininess;
uniform vec4 eye_position;
uniform int phong_flag;
//lighting parameters
uniform vec4 light_1; //light 1 position
uniform vec4 light_2; //light 2 position
varying vec4 light_2_transformed; //light 2 transformed position
uniform int Control_Point_Flag;
uniform sampler2D texMap;
varying vec4 color;
varying vec4 position;
varying vec4 normal;
varying vec4 control_color;
varying vec2 st;
void
main()
{
vec4 N = normalize(normal);
vec4 E = normalize(eye_position - position);
vec4 L1 = normalize(light_1 - position);
vec4 L2 = normalize(light_2 - position);
vec4 H1 = normalize( L1 + E);
vec4 H2 = normalize( L2 + E);
//calculate ambient component
vec4 ambient = AmbientProduct;
//calculate diffuse componenent
float k_d_1 = max(dot(L1,N), 0.0);
float k_d_2 = max(dot(L2,N), 0.0);
vec4 diffuse1 = k_d_1*DiffuseProduct;
vec4 diffuse2 = k_d_2*DiffuseProduct;
//calculate specular componenent
float k_s_1 = pow(max(dot(N, H1), 0.0), Shininess);
float k_s_2 = pow(max(dot(N, H2), 0.0), Shininess);
vec4 specular1 = k_s_1*SpecularProduct;
vec4 specular2 = k_s_2*SpecularProduct;
//if specular color is behind the camera, discard it
if (dot(L1, N) < 0.0) {
specular1 = vec4(0.0, 0.0, 0.0, 1.0);
}
if (dot(L2, N) < 0.0) {
specular2 = vec4(0.0, 0.0, 0.0, 1.0);
}
vec4 final_color = ambient + diffuse1 + diffuse2 + specular1 + specular2;
final_color.a = 1.0;
/* gl_FragColor = final_color; */
gl_FragColor = final_color*texture2D(texMap, st);
}
Does everything look ok for my shaders?
Things worth noting:
You have a ModelView matrix variable in your vertex shader, but you never use it when calculating position. Your vertex "position"s are thus whatever is passed from your OpenGL application and are not affected by any transformations you may be trying to do, though the vertices are placed correctly on screen because you do use the matrices for gl_Position.
You aren't passing a Normal matrix to your shader. The Normal matrix is calculated by taking the transpose of the inverse of the ModelView matrix. Calculate this outside of the shader and pass it in (see the sketch below). If you don't multiply your normals by the Normal matrix, you'll still be able to transform your model, but the normals will all still be facing the same way, so your lighting will be incorrect.
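Here is a sketch of that host-side step, assuming GLM is available and a mat3 uniform hypothetically named NormalMatrix (neither of these appears in the original code):
#include <glm/glm.hpp>
#include <glm/gtc/matrix_inverse.hpp>   // glm::inverseTranspose
#include <glm/gtc/type_ptr.hpp>         // glm::value_ptr
// Build the normal matrix from the model-view matrix and upload it;
// "NormalMatrix" is a hypothetical uniform name used for illustration.
void upload_normal_matrix(GLuint program, const glm::mat4& model_view)
{
    glm::mat3 normal_matrix = glm::inverseTranspose(glm::mat3(model_view));
    glUniformMatrix3fv(glGetUniformLocation(program, "NormalMatrix"),
                       1, GL_FALSE, glm::value_ptr(normal_matrix));
}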
However, your normal vectors on the OpenGL side may well be the culprit. See this question for a good explanation of a possible source of unwanted flat shading.
As a side note, both of your shaders seem more complicated than they should be. That is to say, they have too many variables that aren't used and too much stuff that you could condense into fewer lines. It's just housekeeping, but it will make keeping track of your code easier.