uniform sampler2D in Vertex Shader - C++

I'm trying to implement a height map with GLSL.
For that, I need to send my picture to the vertex shader and read its grey component.
glActiveTexture(GL_TEXTURE0);
Texture.bind();
glUniform1i(mShader.getUniformLocation("heightmap"), 0);
mShader.getUniformLocation uses glGetUniformLocation and works fine for the other uniform values used in the fragment and vertex shaders. But for heightmap it returns -1...
Vertex shader code:
#version 330 core
layout (location = 0) in vec3 position;
layout (location = 1) in vec4 color;
layout (location = 2) in vec2 texCoords;
layout (location = 3) in vec3 normal;
out vec3 Normal;
out vec3 FragPos;
out vec2 TexCoords;
out vec4 ourColor;
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;
uniform sampler2D heightmap;
void main()
{
    float bias = 0.25;
    float h = 0.0;
    float scale = 5.0;
    h = scale * ((texture2D(heightmap, texCoords).r) - bias);
    vec3 hnormal = vec3(normal.x*h, normal.y*h, normal.z*h);
    vec3 position1 = position * hnormal;
    gl_Position = projection * view * model * vec4(position1, 1.0f);
    FragPos = vec3(model * vec4(position, 1.0f));
    Normal = mat3(transpose(inverse(model))) * normal;
    ourColor = color;
    TexCoords = texCoords;
}
Maybe my algorithm for computing the height is bad, but the error getting the uniform location is what stops my work.
What is wrong? Any ideas?
UPD: texCoords (not TexCoords) is of course what is used in
h = scale * ((texture2D(heightmap, texCoords).r) - bias);
my mistake, but it doesn't solve the problem. I'm still getting the same error.

My bet is that your variable has been optimized out by the driver, or the shader did not compile/link properly. After trying to compile your shader (on my nVidia card) I got this in the logs:
0(9) : warning C7050: "TexCoords" might be used before being initialized
You should always check the GLSL compile/link logs; see
How to debug GLSL Fragment shader
especially how glGetShaderInfoLog is used.
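For reference, a minimal sketch of such a check in C++ (this assumes an OpenGL loader header such as GLEW or glad is already included, and that the GLSL source comes from your own loading code):

#include <iostream>
#include <vector>

// Compile a shader and print its info log; the log is where the
// warnings/errors quoted above show up.
GLuint compileWithLog(GLenum type, const char* source) {
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &source, nullptr);
    glCompileShader(shader);

    GLint status = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);

    GLint logLength = 0;
    glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &logLength);
    if (logLength > 1) {
        std::vector<char> log(logLength);
        glGetShaderInfoLog(shader, logLength, nullptr, log.data());
        std::cerr << "shader log: " << log.data() << "\n";
    }
    return (status == GL_TRUE) ? shader : 0;
}

The same pattern applies after glLinkProgram, with glGetProgramiv and glGetProgramInfoLog.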
In the line
h = scale * ((texture2D(heightmap, TexCoords).r) - bias);
you are using TexCoords, which is an output variable and is not set yet, so the behavior is undefined. Most likely your graphics driver threw that line away (and maybe others), removing TexCoords from the shader completely; but that is just my assumption.
What driver and graphics card do you have? What do the logs say on your setup?

Related

Why is the light source not moving when I change its position?

I've been working with OpenGL for a while now, and I'm currently working on diffuse lighting. I've set up a way to change the light source's position, but my object acts as if the light were in the same place each time.
I've kept checking whether my fragment and vertex shaders are correct, but no problems seem to arise.
This is the light position sent to the core shader:
vec3 lightPos0 = vec3(0.f, 0.f, 2.f);
glUniform3fv(glGetUniformLocation(coreShader.getID(), "lightPos0"), 1, value_ptr(lightPos0));
And here are the vertex and fragment shaders (with all the usual variables implied to exist):
#version 450
layout (location = 0) in vec3 vertex_position;
layout (location = 1) in vec3 vertex_color;
layout (location = 2) in vec2 vertex_texcoord;
layout (location = 3) in vec3 vertex_normal;
out vec3 vs_position;
out vec3 vs_color;
out vec2 vs_texcoord;
out vec3 vs_normal;
uniform mat4 ModelMatrix;
uniform mat4 ViewMatrix;
uniform mat4 ProjectionMatrix;
void main()
{
    vs_position = vec4(ModelMatrix * vec4(vertex_position, 1.f)).xyz;
    vs_color = vertex_color;
    vs_texcoord = vec2(vertex_texcoord.x, vertex_texcoord.y * -1.f);
    vs_normal = mat3(ModelMatrix) * vertex_normal;
    gl_Position = ProjectionMatrix * ViewMatrix * ModelMatrix * vec4(vertex_position, 1.f);
}
#version 450
in vec3 vs_position;
in vec3 vs_color;
in vec2 vs_texcoord;
in vec3 vs_normal;
out vec4 fs_color;
uniform sampler2D texture0;
uniform sampler2D texture1;
uniform vec3 lightPos0;
void main()
{
    // Ambient light
    vec3 ambientLight = vec3(0.1f, 0.1f, 0.1f);
    // Diffuse light
    vec3 posToLightDirVec = normalize(lightPos0 - vs_position);
    vec3 diffuseColor = vec3(1.f, 1.f, 1.f);
    float diffuse = max(dot(posToLightDirVec, vs_normal), 0.0);
    vec3 diffuseFinal = diffuseColor * diffuse;
    fs_color =
        (texture(texture0, vs_texcoord)) * vec4(vs_color, 1.f)
        * (vec4(ambientLight, 1.f) + vec4(diffuseFinal, 1.f));
}
If you actually managed to take all of this and reproduce roughly what this program looks like, you would notice that it doesn't quite matter where you put the light point. You always have to move back slightly to actually see anything. Also, the lighting isn't quite diffuse, but looks more like specular lighting.
If you need any more code, ask me in the comments.
You have to install the program object as part of the current rendering state before you can set the value of the uniform variable lightPos0:
glUseProgram(coreShader.getID());
Note that glUniform3fv changes the value of a uniform in the default uniform block of the currently installed program.
Of course you can get the index of an active program resource (e.g. with glGetUniformLocation) before the program is installed. For that it is sufficient that the program is linked; it does not have to be the current program.
If you compare glGetUniformLocation and glUniform, you can see that the program object is a parameter of the former, but not of the latter. For glUniform to take effect, the program has to be current.
Since OpenGL 4.1, glProgramUniform is provided, which can specify the value of a uniform variable for a specified program object.
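A minimal sketch of both approaches (assuming coreShader.getID() returns the linked program object, as in your code):

// Option 1: install the program first, then set the uniform.
glUseProgram(coreShader.getID());
glUniform3fv(glGetUniformLocation(coreShader.getID(), "lightPos0"),
             1, value_ptr(lightPos0));

// Option 2 (OpenGL 4.1+): address the program object directly,
// regardless of which program is currently installed.
glProgramUniform3fv(coreShader.getID(),
                    glGetUniformLocation(coreShader.getID(), "lightPos0"),
                    1, value_ptr(lightPos0));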
This is your problem:
vec3 posToLightDirVec = normalize(lightPos0 - vs_position);
vs_position is in local space, and so is unaffected by transformations. Transform it first (note that this requires declaring uniform mat4 ModelMatrix; in the fragment shader too, and that the vec3 has to be promoted to a vec4 before multiplying by a mat4):
vec3 posToLightDirVec = normalize(lightPos0 - (ModelMatrix * vec4(vs_position, 1.f)).xyz);

Light with vertex/fragment shader, using varying variables (OpenGL)

I searched a lot for this problem. I found the question Passing data into different shaders, but that problem is not mine. I get the error message "The fragment shader uses varying "normal", but previous shader does not write to it."
My vertex shader code:
#version 430
in layout(location=0) vec3 position;
in layout(location=1) vec3 normal;
out vec3 norm;
uniform mat4 transformation;
void main()
{
    gl_Position = transformation * vec4(position, 1.0);
    norm = (transformation * vec4(normal, 0.0)).xyz;
}
And my fragment shader code:
#version 430
in vec3 normal;
out vec4 colour;
vec3 lightPos = vec3(0,50,0);
vec3 lightColor = vec3(0.5, 0, 0);
vec3 materialColor = vec3(0, 1.0, 0);
void main() {
    float cosTheta = dot(-lightPos, normalize(normal));
    vec3 temp = materialColor * lightColor * cosTheta;
    colour = vec4(temp, 1.0);
}
What is the main problem? I don't understand this message: my vertex shader uses the normal vector and passes it on to the fragment shader. I don't see a difference between the linked code and mine. Please give me some ideas. :\
Vertex shader outputs and fragment shader inputs are matched by name, so the simplest fix is to use the same name in both stages (e.g. rename norm to normal). If you want to use different variable names for some reason, you can instead specify a location to match the in- and output variables.
For example, in your case:
.vert:
out layout(location = 7) vec3 norm;
.frag:
in layout(location = 7) vec3 normal;

Simple GLSL Shader (Light) causes flickering

I'm trying to implement some basic lighting and shading following the tutorial over here and here.
Everything is more or less working but I get some kind of strange flickering on object surfaces due to the shading.
I have two images attached to show you guys how this problem looks.
I think the problem is related to the fact that I'm passing vertex coordinates from vertex shader to fragment shader to compute some lighting variables as stated in the above linked tutorials.
Here is some source code (with unrelated code stripped out).
Vertex Shader:
#version 150 core
in vec4 pos;
in vec4 in_col;
in vec2 in_uv;
in vec4 in_norm;
uniform mat4 model_view_projection;
out vec4 out_col;
out vec2 passed_uv;
out vec4 out_vert;
out vec4 out_norm;
void main(void) {
    gl_Position = model_view_projection * pos;
    out_col = in_col;
    out_vert = pos;
    out_norm = in_norm;
    passed_uv = in_uv;
}
and Fragment Shader:
#version 150 core
uniform sampler2D tex;
uniform mat4 model_mat;
in vec4 in_col;
in vec2 passed_uv;
in vec4 vert_pos;
in vec4 in_norm;
out vec4 col;
void main(void) {
    mat3 norm_mat = mat3(transpose(inverse(model_mat)));
    vec3 norm = normalize(norm_mat * vec3(in_norm));
    vec3 light_pos = vec3(0.0, 6.0, 0.0);
    vec4 light_col = vec4(1.0, 0.8, 0.8, 1.0);
    vec3 col_pos = vec3(model_mat * vert_pos);
    vec3 s_to_f = light_pos - col_pos;
    float brightness = dot(norm, normalize(s_to_f));
    brightness = clamp(brightness, 0, 1);
    gl_FragColor = out_col;
    gl_FragColor = vec4(brightness * light_col.rgb * gl_FragColor.rgb, 1.0);
}
As I said earlier, I guess the problem has to do with the way the vertex position is passed to the fragment shader. If I change the position values to something static, no more flickering occurs.
I changed all the other values to static ones, too, with the same result: no flickering as long as I don't use the vertex position data passed from the vertex shader.
So, if there is someone out there with some GL-wisdom .. ;)
Any help would be appreciated.
Side note: I'm running all this on an Intel HD 4000, in case that provides further information.
Thanks in advance!
Ivan
The names of the out variables in the vertex shader and the in variables in the fragment shader need to match. You have this in the vertex shader:
out vec4 out_col;
out vec2 passed_uv;
out vec4 out_vert;
out vec4 out_norm;
and this in the fragment shader:
in vec4 in_col;
in vec2 passed_uv;
in vec4 vert_pos;
in vec4 in_norm;
These variables are associated by name, not by order. Except for passed_uv, the names do not match here. For example, you could use these declarations in the vertex shader:
out vec4 passed_col;
out vec2 passed_uv;
out vec4 passed_vert;
out vec4 passed_norm;
and these in the fragment shader:
in vec4 passed_col;
in vec2 passed_uv;
in vec4 passed_vert;
in vec4 passed_norm;
Based on the way I read the spec, your shader program should actually fail to link. At least in the GLSL 4.50 spec, in the table on page 43, it lists "Link-Time Error" for this situation. The rules seem somewhat ambiguous in earlier specs, though.
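If you want to catch this at runtime rather than by reading the spec, you can query the link status and log after glLinkProgram; a minimal sketch in C++ (program is assumed to be your program handle; requires <iostream> and <vector>):

glLinkProgram(program);

GLint linked = GL_FALSE;
glGetProgramiv(program, GL_LINK_STATUS, &linked);
if (linked != GL_TRUE) {
    GLint logLength = 0;
    glGetProgramiv(program, GL_INFO_LOG_LENGTH, &logLength);
    std::vector<char> log(logLength > 1 ? logLength : 1);
    glGetProgramInfoLog(program, (GLsizei)log.size(), nullptr, log.data());
    std::cerr << "link failed: " << log.data() << "\n";
}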

OpenGL texture and UV mapping issue

I am currently trying to make a little game in OpenGL as an attempt to learn how to use the API. I've come to a point where I can move a camera around a simple scene, and I can render models and shade them with a simple Phong-model shader.
I'm now working on texturing the models in the scene, so I got a copy of Maya and made (with quite some struggle) a square with a texture, with the UV mapping done within Maya.
When I render the scene, the texture is applied, but far from correctly. I read the models from .obj files with a parser I wrote myself, and the textures are read using a function I found online a while back.
I'm not sure how to describe the problem in sufficient detail, nor what to look for in the code, but here are some code fragments that I suspect contain the problem.
Reading the texture
GLuint loadTexture(Image* image){
    GLuint textureId;
    glGenTextures(1, &textureId);
    glBindTexture(GL_TEXTURE_2D, textureId);
    glTexImage2D(GL_TEXTURE_2D,
                 0,
                 GL_RGB,
                 image->width, image->height,
                 0,
                 GL_RGB,
                 GL_UNSIGNED_BYTE,
                 image->pixels);
    return textureId;
}
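As an aside, and not necessarily the cause here: loadTexture never sets any texture parameters. The default GL_TEXTURE_MIN_FILTER is GL_NEAREST_MIPMAP_LINEAR, so a texture without mipmaps is incomplete and samples as black. A minimal sketch of making the texture complete, to be placed right after glTexImage2D:

// Either generate mipmaps for the default mipmapped min filter...
glGenerateMipmap(GL_TEXTURE_2D);
// ...or switch to a non-mipmapped minification filter.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);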
Setting the texture prior to rendering the mesh
// set texture
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, this->body_texture);
current_shader->setUniformint(0, "Difuse_texture");
Vertex shader
#version 410
layout(location = 0) in vec3 VertexPosition;
layout(location = 1) in vec3 VertexNormal;
layout(location = 1) in vec2 TextureCoord;
out vec3 Position;
out vec3 Normal;
out vec2 TexCoord;
uniform mat4 ModelMatrix;
uniform mat4 VeiwMatrix;
uniform mat4 ProjectionMatrix;
uniform mat3 NormalMatrix;
void main(){
    mat4 ModelVeiwMatrix = VeiwMatrix * ModelMatrix;
    mat4 MVP = ProjectionMatrix * ModelVeiwMatrix;
    TexCoord = TextureCoord;
    Normal = normalize( NormalMatrix * VertexNormal );
    Position = vec3(ModelVeiwMatrix * vec4(VertexPosition, 1.0));
    gl_Position = MVP * vec4(VertexPosition, 1.0);
}
Fragment shader
#version 410
in vec3 Position;
in vec3 Normal;
in vec2 TexCoord;
uniform vec4 LightPosition;
uniform vec3 LightIntensity;
uniform vec3 Kd;
uniform vec3 Ka;
uniform vec3 Ks;
uniform float Shininess;
uniform sampler2D Difuse_texture;
layout(location = 0) out vec4 FragColor;
vec4 ads(){
    vec3 n = normalize( Normal );
    vec3 s = normalize( vec3(LightPosition) - Position );
    vec3 v = normalize( vec3(-Position) );
    vec3 r = reflect( -s, n );
    vec3 specular_light = Ks * pow(max(dot(r, v), 0.0), Shininess);
    vec3 ad_light = Ka + Kd * max(dot(s, n), 0.0);
    vec4 TexColor = texture2D(Difuse_texture, TexCoord);
    return TexColor; // (vec4(LightIntensity, 1.0) * (vec4(ad_light, 1.0) * TexColor + vec4(specular_light, 1.0)));
}
void main() {
    FragColor = ads();
}
I know some things are written strangely, but at this point I'm starting to just try anything to get it working.
Does anyone have a suggestion on how to solve this strange UV mapping?
EDIT:
OBJ LOADING
I have made the obj loader print all vertex attributes and compared these with the indexing in the .obj file. It looks like the vertices, normals and UVs appear in the correct order.
Screenshot
The scene looks like this when using a simple red-to-green gradient as the texture image.
(By my understanding, the square should show the gradient from the texture, not just a single color?)
Alignment sounds like a possible flaw; how can I correct this? (See the sketch below.)
http://imageshack.com/a/img674/9927/y0bJ51.png
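Regarding the alignment question: glTexImage2D by default expects every pixel row to start on a 4-byte boundary (GL_UNPACK_ALIGNMENT defaults to 4). With tightly packed GL_RGB data whose row size is not a multiple of 4 bytes, the upload gets skewed. A sketch of the fix, assuming image->pixels holds tightly packed rows:

// Tell GL the rows are tightly packed (no 4-byte row padding).
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, image->width, image->height,
             0, GL_RGB, GL_UNSIGNED_BYTE, image->pixels);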
SOLUTION
I made a very simple and easy-to-overlook mistake. At the top of the vertex shader I wrote
layout(location = 0) in vec3 VertexPosition;
layout(location = 1) in vec3 VertexNormal;
layout(location = 1) in vec2 TextureCoord;
So when I sent the normal data to location 1, I overwrote the texture coordinates with the normal data, and the UV coordinates never reached the fragment shader.
Changing it to the following resolved the problem without further changes.
layout(location = 0) in vec3 VertexPosition;
layout(location = 1) in vec3 VertexNormal;
layout(location = 2) in vec2 TextureCoord;
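For completeness, the C++ side has to feed the same three distinct locations; a sketch assuming interleaved vertex data (3 position, 3 normal and 2 texture-coordinate floats per vertex) in a currently bound VBO:

// Interleaved layout: vec3 position, vec3 normal, vec2 texcoord = 8 floats.
const GLsizei stride = 8 * sizeof(GLfloat);
glEnableVertexAttribArray(0); // VertexPosition
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, stride, (void*)0);
glEnableVertexAttribArray(1); // VertexNormal
glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, stride, (void*)(3 * sizeof(GLfloat)));
glEnableVertexAttribArray(2); // TextureCoord
glVertexAttribPointer(2, 2, GL_FLOAT, GL_FALSE, stride, (void*)(6 * sizeof(GLfloat)));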

Why does my OpenGL Phong shader behave like a flat shader?

I've been learning OpenGL for the past couple of weeks and I've run into some trouble implementing a Phong shader. It appears to do no interpolation between vertices despite my use of the smooth qualifier. Am I missing something here? To give credit where credit is due, the code for the vertex and fragment shaders cribs heavily from the OpenGL SuperBible, Fifth Edition. I would highly recommend this book!
Vertex Shader:
#version 330
in vec4 vVertex;
in vec3 vNormal;
uniform mat4 mvpMatrix; // mvp = ModelViewProjection
uniform mat4 mvMatrix; // mv = ModelView
uniform mat3 normalMatrix;
uniform vec3 vLightPosition;
smooth out vec3 vVaryingNormal;
smooth out vec3 vVaryingLightDir;
void main(void) {
    vVaryingNormal = normalMatrix * vNormal;
    vec4 vPosition4 = mvMatrix * vVertex;
    vec3 vPosition3 = vPosition4.xyz / vPosition4.w;
    vVaryingLightDir = normalize(vLightPosition - vPosition3);
    gl_Position = mvpMatrix * vVertex;
}
Fragment Shader:
#version 330
out vec4 vFragColor;
uniform vec4 ambientColor;
uniform vec4 diffuseColor;
uniform vec4 specularColor;
smooth in vec3 vVaryingNormal;
smooth in vec3 vVaryingLightDir;
void main(void) {
    float diff = max(0.0, dot(normalize(vVaryingNormal), normalize(vVaryingLightDir)));
    vFragColor = diff * diffuseColor;
    vFragColor += ambientColor;
    vec3 vReflection = normalize(reflect(-normalize(vVaryingLightDir), normalize(vVaryingNormal)));
    float spec = max(0.0, dot(normalize(vVaryingNormal), vReflection));
    if(diff != 0) {
        float fSpec = pow(spec, 32.0);
        vFragColor.rgb += vec3(fSpec, fSpec, fSpec);
    }
}
This (public domain) image from Wikipedia shows exactly what sort of image I'm getting and what I'm aiming for -- I'm getting the "flat" image but I want the "Phong" image.
Any help would be greatly appreciated. Thank you!
edit: If it makes a difference, I'm using PyOpenGL 3.0.1 and Python 2.6.
edit2:
Solution
It turns out the problem was with my geometry; Kos was correct. For anyone else that's having this problem with Blender models, Kos pointed out that doing Edit->Faces->Set Smooth does the trick. I found that Wings 3D worked "out of the box."
As an addition to this answer, here is a simple geometry shader which will let you visualize your normals. Modify the accompanying vertex shader as needed based on your attribute locations and how you send your matrices.
But first, a picture of a giant bunny head from our friend the Stanford bunny as an example of the result!
Major warning: do note that I get away with transforming the normals with the modelview matrix instead of a proper normal matrix. This won't work correctly if your modelview contains non-uniform scaling. Also, the length of your normals won't be correct, but that matters little if you just want to check their direction.
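If your modelview does contain non-uniform scaling, the proper normal matrix can be computed on the CPU, for example with GLM (a sketch; mv is assumed to be your glm::mat4 modelview matrix):

#include <glm/glm.hpp>
#include <glm/gtc/matrix_inverse.hpp>

// Inverse-transpose of the upper-left 3x3 of the modelview:
// transforms normals correctly even under non-uniform scaling.
glm::mat3 normalMatrix = glm::transpose(glm::inverse(glm::mat3(mv)));
// GLM also offers the equivalent helper glm::inverseTranspose(glm::mat3(mv)).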
Vertex shader:
#version 330
layout(location = 0) in vec4 position;
layout(location = 1) in vec4 normal;
layout(location = 2) in mat4 mv;
out Data
{
    vec4 position;
    vec4 normal;
    vec4 color;
    mat4 mvp;
} vdata;
uniform mat4 projection;
void main()
{
    vdata.mvp = projection * mv;
    vdata.position = position;
    vdata.normal = normal;
}
Geometry shader:
#version 330
layout(triangles) in;
layout(line_strip, max_vertices = 6) out;
in Data
{
    vec4 position;
    vec4 normal;
    vec4 color;
    mat4 mvp;
} vdata[3];
out Data
{
    vec4 color;
} gdata;
void main()
{
    const vec4 green = vec4(0.0f, 1.0f, 0.0f, 1.0f);
    const vec4 blue = vec4(0.0f, 0.0f, 1.0f, 1.0f);
    for (int i = 0; i < 3; i++)
    {
        gl_Position = vdata[i].mvp * vdata[i].position;
        gdata.color = green;
        EmitVertex();
        gl_Position = vdata[i].mvp * (vdata[i].position + vdata[i].normal);
        gdata.color = blue;
        EmitVertex();
        EndPrimitive();
    }
}
Fragment shader:
#version 330
in Data
{
    vec4 color;
} gdata;
out vec4 outputColor;
void main()
{
    outputColor = gdata.color;
}
Hmm... You're interpolating the normal as a varying variable, so the fragment shader should receive the correct per-pixel normal.
The only explanation I can think of for your result looking like the left image is that every fragment on a given face ultimately receives the same normal. You can confirm this with a fragment shader like:
void main() {
    vFragColor = vec4(normalize(vVaryingNormal), 1.0);
}
If that's the case, the question remains: why? The vertex shader looks OK.
So maybe there's something wrong with your geometry? What data do you send to the shader? Are you sure you have correctly calculated per-vertex normals, and not just per-face normals?
The orange lines are normals of the diagonal face, the red lines are normals of the horizontal face.
If your data looks like the above image, then even with a correct shader you'll get flat shading. Make sure that you have correct per-vertex normals like on the lower image. (They are really simple to calculate for a sphere.)
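For example, for a sphere the smooth per-vertex normal is simply the direction from the sphere's center to the vertex. A sketch in C++ with GLM (the vertex container is an assumption about your data layout):

#include <glm/glm.hpp>
#include <vector>

// Smooth per-vertex normals for a sphere: normalize the vector
// from the center to each vertex.
std::vector<glm::vec3> sphereNormals(const std::vector<glm::vec3>& vertices,
                                     const glm::vec3& center)
{
    std::vector<glm::vec3> normals;
    normals.reserve(vertices.size());
    for (const glm::vec3& v : vertices)
        normals.push_back(glm::normalize(v - center));
    return normals;
}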