I have searched a lot about this problem. I found the question Passing data into different shaders, but that problem is not mine. I get the error message "The fragment shader uses varying "normal", but previous shader does not write to it."
My vertex shader code:
#version 430
in layout(location=0) vec3 position;
in layout(location=1) vec3 normal;
out vec3 norm;
uniform mat4 transformation;
void main()
{
    gl_Position = transformation * vec4(position, 1.0);
    norm = (transformation * vec4(normal, 0.0)).xyz;
}
And my fragment shader code:
#version 430
in vec3 normal;
out vec4 colour;
vec3 lightPos = vec3(0,50,0);
vec3 lightColor = vec3(0.5, 0, 0);
vec3 materialColor = vec3(0, 1.0, 0);
void main() {
    float cosTheta = dot(-lightPos, normalize(normal));
    vec3 temp = materialColor * lightColor * cosTheta;
    colour = vec4(temp, 1.0);
}
What is the main problem? I don't understand this message: my vertex shader uses the normal vector and passes it on to the fragment shader. I don't see the difference between the linked code and mine. Please give me some idea. :\
Your vertex shader writes the output variable norm, but your fragment shader reads the input variable normal. Variables passed between shader stages are matched by name, so from the fragment shader's point of view nothing ever writes to normal. The simplest fix is to use the same name in both stages. If you want to use different variable names for some reason, you can instead specify a location to match in- and output variables.
For example, in your case:
.vert:
out layout(location = 7) vec3 norm;
.frag:
in layout(location = 7) vec3 normal;
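Alternatively, the rename fix is a one-line change. Since the vertex shader already has an input named normal, it is the fragment shader's input that should be renamed to match the vertex shader's output. A minimal sketch, showing only the changed lines:
.frag:
in vec3 norm; // matches "out vec3 norm" in the vertex shader
...
float cosTheta = dot(-lightPos, normalize(norm));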
So I've currently managed to write a shader using Xoppa's tutorials and an AssetManager, and I managed to bind a texture to the model, and it looks fine.
Now the next step, I guess, would be to create diffuse (not sure if that's the word? Phong shading?) lighting to give the bunny some form of shading. While I have a little bit of experience with GLSL shaders in LWJGL, I'm unsure how to process that same information so I can use it in libGDX and in the GLSL shaders.
I understand that this all could be accomplished using the Environment class etc., but I want to achieve this through the shaders alone, or by traditional means, simply for the challenge.
In LWJGL, my shaders would have these inputs, outputs and uniforms:
in vec3 position;
in vec2 textureCoordinates;
in vec3 normal;
out vec2 pass_textureCoordinates;
out vec3 surfaceNormal;
out vec3 toLightVector;
uniform mat4 transformationMatrix;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform vec3 lightPosition;
This would be reasonably easy for me to calculate in LWJGL.
Vertex file:
attribute vec3 a_position;
attribute vec3 a_normal;
attribute vec2 a_texCoord0;
uniform mat4 u_worldTrans;
uniform mat4 u_projViewTrans;
varying vec2 v_texCoords;
void main() {
    v_texCoords = a_texCoord0;
    gl_Position = u_projViewTrans * u_worldTrans * vec4(a_position, 1.0);
}
I imagine that I could implement the uniforms similarly to the LWJGL GLSL example, but I don't know how I can apply these uniforms in libGDX and have it work. I am unsure what u_projViewTrans is; I'm assuming it is a combination of the projection, transformation and view matrices, and that I should set the uniform with camera.combined?
If someone could help me understand the process, or point to an example of how per-pixel lighting can be implemented with just u_projViewTrans and u_worldTrans, I'd greatly appreciate your time and effort in helping me understand these concepts a bit better.
Here's my GitHub upload of my work in progress: here.
You can do the light calculations in world space. A simple Lambertian diffuse light can be calculated like this:
vec3 toLightVector = normalize( lightPosition - vertexPosition );
float lightIntensity = max( 0.0, dot( normal, toLightVector ));
A detailed explanation can be found in the answer to the Stack Overflow question How does this faking the light work on aerotwist?.
While Gouraud shading calculates the light in the vertex shader, Phong shading calculates the light in the fragment shader.
(see further GLSL fixed function fragment program replacement)
A Gouraud shader may look like this:
Vertex Shader:
attribute vec3 a_position;
attribute vec3 a_normal;
attribute vec2 a_texCoord0;
uniform mat4 u_worldTrans;
uniform mat4 u_projViewTrans;
uniform vec3 lightPosition;
varying vec2 v_texCoords;
varying float v_lightIntensity;
void main()
{
    vec4 vertPos = u_worldTrans * vec4(a_position, 1.0);
    vec3 normal = normalize(mat3(u_worldTrans) * a_normal);
    vec3 toLightVector = normalize(lightPosition - vertPos.xyz);
    v_lightIntensity = max( 0.0, dot(normal, toLightVector));
    v_texCoords = a_texCoord0;
    gl_Position = u_projViewTrans * vertPos;
}
Fragment Shader:
varying vec2 v_texCoords;
varying float v_lightIntensity;
uniform sampler2D u_texture;
void main()
{
    vec4 texCol = texture2D( u_texture, v_texCoords.st );
    gl_FragColor = vec4( texCol.rgb * v_lightIntensity, 1.0 );
}
A Phong shader may look like this:
Vertex Shader:
attribute vec3 a_position;
attribute vec3 a_normal;
attribute vec2 a_texCoord0;
uniform mat4 u_worldTrans;
uniform mat4 u_projViewTrans;
varying vec2 v_texCoords;
varying vec3 v_vertPosWorld;
varying vec3 v_vertNVWorld;
void main()
{
    vec4 vertPos = u_worldTrans * vec4(a_position, 1.0);
    v_vertPosWorld = vertPos.xyz;
    v_vertNVWorld = normalize(mat3(u_worldTrans) * a_normal);
    v_texCoords = a_texCoord0;
    gl_Position = u_projViewTrans * vertPos;
}
Fragment Shader:
varying vec2 v_texCoords;
varying vec3 v_vertPosWorld;
varying vec3 v_vertNVWorld;
uniform sampler2D u_texture;
struct PointLight
{
vec3 color;
vec3 position;
float intensity;
};
uniform PointLight u_pointLights[1];
void main()
{
    vec3 toLightVector = normalize(u_pointLights[0].position - v_vertPosWorld.xyz);
    float lightIntensity = max( 0.0, dot(normalize(v_vertNVWorld), toLightVector));
    vec4 texCol = texture2D( u_texture, v_texCoords.st );
    vec3 finalCol = texCol.rgb * lightIntensity * u_pointLights[0].color;
    gl_FragColor = vec4( finalCol.rgb, 1.0 );
}
I am trying to implement a height map with GLSL.
For that, I need to send my picture to the vertex shader and get its grey component:
glActiveTexture(GL_TEXTURE0);
Texture.bind();
glUniform1i(mShader.getUniformLocation("heightmap"), 0);
mShader.getUniformLocation uses glGetUniformLocation and works fine for the other uniform values used in the fragment and vertex shaders. But for heightmap it returns -1...
VertexShader code:
#version 330 core
layout (location = 0) in vec3 position;
layout (location = 1) in vec4 color;
layout (location = 2) in vec2 texCoords;
layout (location = 3) in vec3 normal;
out vec3 Normal;
out vec3 FragPos;
out vec2 TexCoords;
out vec4 ourColor;
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;
uniform sampler2D heightmap;
void main()
{
    float bias = 0.25;
    float h = 0.0;
    float scale = 5.0;
    h = scale * ((texture2D(heightmap, texCoords).r) - bias);
    vec3 hnormal = vec3(normal.x*h, normal.y*h, normal.z*h);
    vec3 position1 = position * hnormal;
    gl_Position = projection * view * model * vec4(position1, 1.0f);
    FragPos = vec3(model * vec4(position, 1.0f));
    Normal = mat3(transpose(inverse(model))) * normal;
    ourColor = color;
    TexCoords = texCoords;
}
Maybe my algorithm for getting the height is bad, but the error with getting the uniform location stops my work.
What is wrong? Any ideas?
UPD: Of course it is texCoords (not TexCoords) that is used in
h = scale * ((texture2D(heightmap, texCoords).r) - bias);
That was my mistake, but it doesn't solve the problem. I'm still getting the same error.
My bet is that your variable has been optimized out by the driver, or that the shader did not compile/link properly. After trying to compile your shader (on my nVidia) I got this in the logs:
0(9) : warning C7050: "TexCoords" might be used before being initialized
You should always check the GLSL compile/link logs; see
How to debug GLSL Fragment shader
and especially how glGetShaderInfoLog is used.
In the line
h = scale * ((texture2D(heightmap, TexCoords).r) - bias);
you are using TexCoords, which is an output variable that has not been set yet, so the behavior is undefined. Most likely your gfx driver threw that line away (and maybe others), removing TexCoords from the shader completely, but that is just my assumption.
What driver and gfx card have you got?
What do the logs return on your setup?
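For reference, a minimal sketch of how I would write the displacement so the sampler stays live once texCoords is used. Note that in #version 330 core the function is texture() rather than the deprecated texture2D(), and displacing along the normal (position + normal * h) is the usual height-map formulation; your scale and bias values are kept:
// inside main(), using the declarations from your vertex shader
float h = scale * (texture(heightmap, texCoords).r - bias);
vec3 displaced = position + normal * h; // move the vertex along its normal
gl_Position = projection * view * model * vec4(displaced, 1.0);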
I'm trying to calculate per-face normals in a geometry shader, using the following pipeline:
//VERTEX_SHADER
#version 330
layout(location = 0) in vec4 vertex;
out vec3 vert;
uniform mat4 projMatrix;
uniform mat4 mvMatrix;
void main()
{
    vert = vertex.xyz;
    gl_Position = projMatrix * mvMatrix * vertex;
}
//GEOMETRY_SHADER
#version 330
layout ( triangles ) in;
layout ( triangle_strip, max_vertices = 3 ) out;
out vec3 normal_out;
uniform mat3 normalMatrix;
void main()
{
    vec3 A = gl_in[2].gl_Position.xyz - gl_in[0].gl_Position.xyz;
    vec3 B = gl_in[1].gl_Position.xyz - gl_in[0].gl_Position.xyz;
    normal_out = normalMatrix * normalize(cross(A,B));
    gl_Position = gl_in[0].gl_Position;
    EmitVertex();
    gl_Position = gl_in[1].gl_Position;
    EmitVertex();
    gl_Position = gl_in[2].gl_Position;
    EmitVertex();
    EndPrimitive();
}
//FRAG_SHADER
#version 330
in vec3 normal_out;
in vec3 vert;
out vec4 fColor;
uniform vec3 lightPos;
void main()
{
    highp vec3 L = normalize(lightPos - vert);
    highp float NL = max(dot(normal_out, L), 0.0);
    highp vec3 color = vec3(1, 1, 0.0);
    fColor = vec4(color*NL, 1.0);
}
However, I end up with very weird looking faces that keep flickering (I included a snapshot below). It occurred to me that it might be because I'm using 8 vertices to represent 1 cell (cube) instead of 24 vertices, but I'm not quite sure if that is what is causing the problem.
Left: using light weighting 'NL'. Right: without.
After every call to EmitVertex, the contents of all output variables are made undefined. Therefore, if you want to output the same value to multiple vertices, you must copy it to the output every time.
Also, note that each shader stage's outputs provide inputs only to the next stage. So if you have a GS, and you want to pass a value from the VS to the FS, you must have the GS explicitly pass that value through.
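Applied to the pipeline above, a sketch of a geometry shader that rewrites normal_out before every EmitVertex and explicitly passes the vertex shader's vert value through. The names vert_out and the input array vert[] are illustrative; the fragment shader would then declare in vec3 vert_out instead of in vec3 vert. It also takes the cross product on the pre-projection positions, since clip-space gl_Position values would not yield a usable lighting normal:
#version 330
layout ( triangles ) in;
layout ( triangle_strip, max_vertices = 3 ) out;
in vec3 vert[];      // vertex shader output, one entry per input vertex
out vec3 vert_out;   // passed through to the fragment shader
out vec3 normal_out;
uniform mat3 normalMatrix;
void main()
{
    // face normal from the untransformed positions, not from clip space
    vec3 A = vert[2] - vert[0];
    vec3 B = vert[1] - vert[0];
    vec3 N = normalMatrix * normalize(cross(A, B));
    for (int i = 0; i < 3; ++i)
    {
        normal_out = N;                     // written again for every vertex
        vert_out = vert[i];                 // VS value passed through explicitly
        gl_Position = gl_in[i].gl_Position;
        EmitVertex();
    }
    EndPrimitive();
}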
I'm trying to implement some basic lighting and shading following the tutorials over here and here.
Everything is more or less working but I get some kind of strange flickering on object surfaces due to the shading.
I have two images attached to show you guys how this problem looks.
I think the problem is related to the fact that I'm passing vertex coordinates from vertex shader to fragment shader to compute some lighting variables as stated in the above linked tutorials.
Here is some source code (with unrelated code stripped out).
Vertex Shader:
#version 150 core
in vec4 pos;
in vec4 in_col;
in vec2 in_uv;
in vec4 in_norm;
uniform mat4 model_view_projection;
out vec4 out_col;
out vec2 passed_uv;
out vec4 out_vert;
out vec4 out_norm;
void main(void) {
    gl_Position = model_view_projection * pos;
    out_col = in_col;
    out_vert = pos;
    out_norm = in_norm;
    passed_uv = in_uv;
}
and Fragment Shader:
#version 150 core
uniform sampler2D tex;
uniform mat4 model_mat;
in vec4 in_col;
in vec2 passed_uv;
in vec4 vert_pos;
in vec4 in_norm;
out vec4 col;
void main(void) {
    mat3 norm_mat = mat3(transpose(inverse(model_mat)));
    vec3 norm = normalize(norm_mat * vec3(in_norm));
    vec3 light_pos = vec3(0.0, 6.0, 0.0);
    vec4 light_col = vec4(1.0, 0.8, 0.8, 1.0);
    vec3 col_pos = vec3(model_mat * vert_pos);
    vec3 s_to_f = light_pos - col_pos;
    float brightness = dot(norm, normalize(s_to_f));
    brightness = clamp(brightness, 0, 1);
    gl_FragColor = out_col;
    gl_FragColor = vec4(brightness * light_col.rgb * gl_FragColor.rgb, 1.0);
}
As I said earlier, I guess the problem has to do with the way the vertex position is passed to the fragment shader. If I change the position values to something static, no more flickering occurs.
I changed all the other values to statics, too. It's the same result: no flickering as long as I am not using the vertex position data passed from the vertex shader.
So, if there is someone out there with some GL-wisdom .. ;)
Any help would be appreciated.
Side note: I'm running all this on an Intel HD 4000, in case that provides further information.
Thanks in advance!
Ivan
The names of the out variables in the vertex shader and the in variables in the fragment shader need to match. You have this in the vertex shader:
out vec4 out_col;
out vec2 passed_uv;
out vec4 out_vert;
out vec4 out_norm;
and this in the fragment shader:
in vec4 in_col;
in vec2 passed_uv;
in vec4 vert_pos;
in vec4 in_norm;
These variables are associated by name, not by order. Except for passed_uv, the names do not match here. For example, you could use these declarations in the vertex shader:
out vec4 passed_col;
out vec2 passed_uv;
out vec4 passed_vert;
out vec4 passed_norm;
and these in the fragment shader:
in vec4 passed_col;
in vec2 passed_uv;
in vec4 passed_vert;
in vec4 passed_norm;
Based on the way I read the spec, your shader program should actually fail to link. At least in the GLSL 4.50 spec, in the table on page 43, it lists "Link-Time Error" for this situation. The rules seem somewhat ambiguous in earlier specs, though.
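Putting it together, a sketch of the fragment shader body with the matched passed_* names. It also writes to the declared output col rather than the deprecated gl_FragColor, and drops the undeclared out_col from the original:
void main(void) {
    mat3 norm_mat = mat3(transpose(inverse(model_mat)));
    vec3 norm = normalize(norm_mat * vec3(passed_norm));
    vec3 light_pos = vec3(0.0, 6.0, 0.0);
    vec4 light_col = vec4(1.0, 0.8, 0.8, 1.0);
    vec3 col_pos = vec3(model_mat * passed_vert);
    vec3 s_to_f = light_pos - col_pos;
    float brightness = clamp(dot(norm, normalize(s_to_f)), 0.0, 1.0);
    col = vec4(brightness * light_col.rgb * passed_col.rgb, 1.0);
}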
I am currently trying to make a little game in OpenGL as an attempt to learn how to use the API. I've come to a point where I can move a camera around a simple scene, and I can render models and shade them with a simple phong model shader.
I'm right now working on texturing the models in the scene, so I got a copy of Maya and made (with quite some struggle) a square with a texture, with the UV mapping done within Maya.
When I render the scene, the texture is applied, but far from correctly. I read the models as .obj files with a parser I wrote myself, and the textures are read using a function I found online a while back.
I'm not sure how to describe the problem in sufficient detail, nor what to look for in the code, but here are some code fractions that I would suspect contained the problem.
Reading the texture
GLuint loadTexture(Image* image){
    GLuint textureId;
    glGenTextures(1, &textureId);
    glBindTexture(GL_TEXTURE_2D, textureId);
    glTexImage2D(GL_TEXTURE_2D,
                 0,
                 GL_RGB,
                 image->width, image->height,
                 0,
                 GL_RGB,
                 GL_UNSIGNED_BYTE,
                 image->pixels);
    return textureId;
}
Setting the texture prior to rendering the mesh
// set texture
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, this->body_texture);
current_shader->setUniformint(0, "Difuse_texture");
Vertex shader
#version 410
layout(location = 0) in vec3 VertexPosition;
layout(location = 1) in vec3 VertexNormal;
layout(location = 1) in vec2 TextureCoord;
out vec3 Position;
out vec3 Normal;
out vec2 TexCoord;
uniform mat4 ModelMatrix;
uniform mat4 VeiwMatrix;
uniform mat4 ProjectionMatrix;
uniform mat3 NormalMatrix;
void main(){
    mat4 ModelVeiwMatrix = VeiwMatrix * ModelMatrix;
    mat4 MVP = ProjectionMatrix * ModelVeiwMatrix;
    TexCoord = TextureCoord;
    Normal = normalize( NormalMatrix * VertexNormal );
    Position = vec3(ModelVeiwMatrix * vec4(VertexPosition, 1.0));
    gl_Position = MVP * vec4(VertexPosition, 1.0);
}
Fragment shader
#version 410
in vec3 Position;
in vec3 Normal;
in vec2 TexCoord;
uniform vec4 LightPosition;
uniform vec3 LightIntensity;
uniform vec3 Kd;
uniform vec3 Ka;
uniform vec3 Ks;
uniform float Shininess;
uniform sampler2D Difuse_texture;
layout(location = 0) out vec4 FragColor;
vec4 ads(){
    vec3 n = normalize( Normal );
    vec3 s = normalize( vec3(LightPosition) - Position );
    vec3 v = normalize( vec3(-Position) );
    vec3 r = reflect( -s, n );
    vec3 specular_light = Ks * pow(max(dot(r, v), 0.0), Shininess);
    vec3 ad_light = Ka + Kd * max(dot(s, n), 0.0);
    vec4 TexColor = texture2D(Difuse_texture, TexCoord);
    return TexColor; // (vec4(LightIntensity, 1.0) * (vec4(ad_light, 1.0) * TexColor + vec4(specular_light, 1.0)));
}
void main() {
    FragColor = ads();
}
I know some things are written strangely, but at this point I'm starting to just try anything to get it working.
Does anyone have a suggestion on how to solve this strange UV mapping?
EDIT:
OBJ LOADING
I have made the obj loader print all vertex attributes and compared these with the indexing in the .obj file. It looks like the vertices, normals and UVs appear in the correct order.
Screenshot
The scene looks like this using just a simple red-to-green gradient as the texture image.
(The square should, by my understanding, show the gradient from the texture, not just a single color.)
Alignment sounds like a possible flaw; how can I correct this?
http://imageshack.com/a/img674/9927/y0bJ51.png
SOLUTION
I made a very simple and easy-to-overlook mistake. At the top of the vertex shader I wrote
layout(location = 0) in vec3 VertexPosition;
layout(location = 1) in vec3 VertexNormal;
layout(location = 1) in vec2 TextureCoord;
So I guess that when I sent the normal data to location 1, I overwrote the texture coordinates with the normal data, so the UV coords never reached the fragment shader.
Changing to the following resolved the problem without further changes.
layout(location = 0) in vec3 VertexPosition;
layout(location = 1) in vec3 VertexNormal;
layout(location = 2) in vec2 TextureCoord;