Why is the light source not moving when I change its position? - c++

I've been working with OpenGL for a while now, and I'm currently working on diffuse lighting. I've set up a way to change the light source's position, but my object acts as if the light were in the same place each time.
I've repeatedly checked whether my fragment and vertex shaders are correct, but no problems seem to arise.
This is the light position sent to the core shader:
vec3 lightPos0 = vec3(0.f, 0.f, 2.f);
glUniform3fv(glGetUniformLocation(coreShader.getID(), "lightPos0"), 1, value_ptr(lightPos0));
And here are the fragment and vertex shaders (with all usual variables implied to exist):
#version 450
layout (location = 0) in vec3 vertex_position;
layout (location = 1) in vec3 vertex_color;
layout (location = 2) in vec2 vertex_texcoord;
layout (location = 3) in vec3 vertex_normal;
out vec3 vs_position;
out vec3 vs_color;
out vec2 vs_texcoord;
out vec3 vs_normal;
uniform mat4 ModelMatrix;
uniform mat4 ViewMatrix;
uniform mat4 ProjectionMatrix;
void main()
{
vs_position = vec4(ModelMatrix * vec4(vertex_position, 1.f)).xyz;
vs_color = vertex_color;
vs_texcoord = vec2(vertex_texcoord.x, vertex_texcoord.y * -1.f);
vs_normal = mat3(ModelMatrix) * vertex_normal;
gl_Position = ProjectionMatrix * ViewMatrix * ModelMatrix * vec4(vertex_position, 1.f);
}
#version 450
in vec3 vs_position;
in vec3 vs_color;
in vec2 vs_texcoord;
in vec3 vs_normal;
out vec4 fs_color;
uniform sampler2D texture0;
uniform sampler2D texture1;
uniform vec3 lightPos0;
void main()
{
//Ambient light
vec3 ambientLight = vec3(0.1f, 0.1f, 0.1f);
//Diffuse light
vec3 posToLightDirVec = normalize(lightPos0 - vs_position);
vec3 diffuseColor = vec3(1.f, 1.f, 1.f);
float diffuse = max(dot(posToLightDirVec, vs_normal), 0.0);
vec3 diffuseFinal = diffuseColor * diffuse;
fs_color =
(texture(texture0, vs_texcoord)) * vec4(vs_color, 1.f)
* (vec4(ambientLight, 1.f) + vec4(diffuseFinal, 1.f));
}
If you actually reproduced this program, you would notice that it hardly matters where you put the light point: you always have to move back slightly to actually see anything, and the result looks less like diffuse lighting and more like specular lighting.
If you need any more code, ask me in the comments.

You have to install the program object as part of the current rendering state before you can set the value of the uniform variable lightPos0:
glUseProgram(coreShader.getID());
Note that glUniform3fv sets the value of a uniform variable in the default uniform block of the currently installed program.
Of course you can query the location of an active program resource (e.g. with glGetUniformLocation) before the program is installed. For that it is sufficient that the program has been linked; it does not have to be the current program.
If you compare glGetUniformLocation and glUniform, you can see that the program object is a parameter of the former but not of the latter. To use glUniform, the program has to be current.
Since OpenGL 4.1, glProgramUniform is available, which sets the value of a uniform variable for an explicitly specified program object.
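Putting that together, a minimal C++ sketch of the fixed call order (assuming, as in the question, that coreShader.getID() returns the linked program object):
// Install the program first, then set the uniform on the current program.
glUseProgram(coreShader.getID());
vec3 lightPos0 = vec3(0.f, 0.f, 2.f);
glUniform3fv(glGetUniformLocation(coreShader.getID(), "lightPos0"), 1, value_ptr(lightPos0));
// Or, on OpenGL 4.1+, target the program explicitly without installing it:
glProgramUniform3fv(coreShader.getID(), glGetUniformLocation(coreShader.getID(), "lightPos0"), 1, value_ptr(lightPos0));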

This is your problem:
vec3 posToLightDirVec = normalize(lightPos0 - vs_position);
vs_position is in local space, and so is unaffected by transformations. Transform it first (note that this also requires declaring uniform mat4 ModelMatrix in the fragment shader, and wrapping the vec3 so a mat4 can multiply it):
vec3 posToLightDirVec = normalize(lightPos0 - (ModelMatrix * vec4(vs_position, 1.f)).xyz);

Related

OpenGL - How could I color objects in pixelated fashion through shaders?

I'm trying to figure out a way to light up my object in a pixelated fashion through the use of shaders.
To illustrate, my goal is to turn this:
Into this:
I've tried looking up ways to do this through the fragment shader; however, there is no way I can access the local position of a fragment to determine the "fake pixel" it would belong to. I also had the idea of using a geometry shader to create a vertex for each of those boxes, but I suspect there could be a better way to do this. Would it be possible?
EDIT: These are the shaders currently being used for the object illustrated by the first image:
vertex shader:
#version 330 core
layout (location = 0) in vec3 aPos;
layout (location = 1) in vec3 aColor;
layout (location = 2) in vec2 aTex;
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;
out vec3 oColor; //Output of a color
out vec2 oTex; //Output of a Texture
out vec3 oPos; //Output of Position in space for light calculation
out vec3 oNormal; //Output of Normal vector for light calculation.
void main(){
gl_Position = projection * view * model * vec4(aPos, 1.0);
oColor = aColor;
oTex = aTex;
oPos = vec3(model * vec4(aPos, 1.0));
oNormal = vec3(0, 0, -1); //Not being calculated at the moment.
}
fragment shader:
#version 330 core
in vec3 oColor;
in vec2 oTex;
in vec3 oPos;
in vec3 oNormal;
out vec4 FragColor;
uniform sampler2D tex;
uniform vec3 lightColor; //Color of the light on the scene, there's only one
uniform vec3 lightPos; //Position of the light on the scene
void main(){
//Ambient Light Calculation
float ambientStrength = 0.1;
//vec3 ambient = ambientStrength * lightColor * vec3(texture(tex, oTex));
vec3 ambient = ambientStrength * lightColor;
//Diffuse Light Calculation
float diffuseStrength = 1.0;
vec3 norm = normalize(oNormal);
vec3 lightDir = normalize(lightPos - oPos);
float diff = max(dot(norm, lightDir), 0.0);
//vec3 diffuse = diff * lightColor* vec3(texture(tex, oTex)) * diffuseStrength;
vec3 diffuse = diff * lightColor;
//Specular Light Calculation
float specularStrength = 0.25;
float shininess = 8.0;
vec3 viewPos = vec3(0, 0, -10);
vec3 viewDir = normalize(viewPos - oPos);
vec3 reflectDir = reflect(-lightDir, norm);
float spec = pow(max(dot(viewDir, reflectDir), 0.0), shininess);
vec3 specular = specularStrength * spec * lightColor;
//Result Light
vec3 result = (ambient+diffuse+specular) * oColor;
FragColor = vec4(result, 1.0f);
}
The lighting depends on oPos. You need to quantize ("snap") the position to a grid, e.g.:
vec3 pos = vec3(round(oPos.xy * 10.0) / 10.0, oPos.z);
In the following, use pos instead of oPos.
Note that this only works if oPos is a position in view space, or more precisely, if the XY plane of the oPos coordinate system is parallel to the XY plane of the view.
Alternatively, you can compute a position from gl_FragCoord.
Add a uniform variable with the resolution of the screen:
uniform vec2 resolution;
Compute pos depending on resolution and gl_FragCoord:
vec3 pos = vec3(round(20.0 * gl_FragCoord.xy/resolution.y) / 20.0, oPos.z);
If you want to align the inner squares with the object you need to introduce texture coordinates. Where the bottom left coordinate of the object is (0, 0) and the top right is (1, 1).
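On the CPU side, the resolution uniform simply mirrors the framebuffer size. A minimal C++ sketch, assuming a GLFW window (the window and program variables stand in for whatever your code uses):
int fbWidth = 0, fbHeight = 0;
glfwGetFramebufferSize(window, &fbWidth, &fbHeight); // query the actual framebuffer size
glUseProgram(program);
glUniform2f(glGetUniformLocation(program, "resolution"), (float)fbWidth, (float)fbHeight);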

How do you incorporate per pixel lighting in shaders with LIBGDX?

So far I've managed to write a shader using Xoppa's tutorials and an AssetManager, and I've managed to bind a texture to the model, and it looks fine.
Now the next step, I guess, would be to create diffuse lighting (not sure if that's the word? Phong shading?) to give the bunny some form of shading. While I have a little bit of experience with GLSL shaders in LWJGL, I'm unsure how to process that same information so I can use it in libGDX and its GLSL shaders.
I understand that this all could be accomplished using the Environment class etc. But I want to achieve this through the shaders alone or by traditional means simply for the challenge.
In LWJGL, shaders would have uniforms:
in vec3 position;
in vec2 textureCoordinates;
in vec3 normal;
out vec2 pass_textureCoordinates;
out vec3 surfaceNormal;
out vec3 toLightVector;
uniform mat4 transformationMatrix;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform vec3 lightPosition;
This would be reasonably easy for me to calculate in LWJGL.
Vertex file:
attribute vec3 a_position;
attribute vec3 a_normal;
attribute vec2 a_texCoord0;
uniform mat4 u_worldTrans;
uniform mat4 u_projViewTrans;
varying vec2 v_texCoords;
void main() {
v_texCoords = a_texCoord0;
gl_Position = u_projViewTrans * u_worldTrans * vec4(a_position, 1.0);
}
I imagine that I could implement the uniforms similarly to the LWJGL GLSL example, but I don't know how I can apply these uniforms in libGDX and have it work. I am also unsure what u_projViewTrans is; I'm assuming it is a combination of the projection, transformation and view matrices, and that the uniform is set with camera.combined?
If someone could help me understand the process, or point to an example of how per-pixel lighting can be implemented with just u_projViewTrans and u_worldTrans, I'd greatly appreciate your time and effort in helping me understand these concepts a bit better.
Here's my GitHub upload of my work in progress: here
You can do the light calculations in world space. A simple Lambertian diffuse light can be calculated like this:
vec3 toLightVector = normalize( lightPosition - vertexPosition );
float lightIntensity = max( 0.0, dot( normal, toLightVector ));
A detailed explanation can be found in the answer to the Stack Overflow question How does this faking the light work on aerotwist?.
While Gouraud shading calculates the light in the vertex shader, Phong shading calculates the light in the fragment shader.
(see further GLSL fixed function fragment program replacement)
A Gouraud shader may look like this:
Vertex Shader:
attribute vec3 a_position;
attribute vec3 a_normal;
attribute vec2 a_texCoord0;
uniform mat4 u_worldTrans;
uniform mat4 u_projViewTrans;
uniform vec3 lightPosition;
varying vec2 v_texCoords;
varying float v_lightIntensity;
void main()
{
vec4 vertPos = u_worldTrans * vec4(a_position, 1.0);
vec3 normal = normalize(mat3(u_worldTrans) * a_normal);
vec3 toLightVector = normalize(lightPosition - vertPos.xyz);
v_lightIntensity = max( 0.0, dot(normal, toLightVector));
v_texCoords = a_texCoord0;
gl_Position = u_projViewTrans * vertPos;
}
Fragment Shader:
varying vec2 v_texCoords;
varying float v_lightIntensity;
uniform sampler2D u_texture;
void main()
{
vec4 texCol = texture2D( u_texture, v_texCoords.st );
gl_FragColor = vec4( texCol.rgb * v_lightIntensity, 1.0 );
}
A Phong shader may look like this:
Vertex Shader:
attribute vec3 a_position;
attribute vec3 a_normal;
attribute vec2 a_texCoord0;
uniform mat4 u_worldTrans;
uniform mat4 u_projViewTrans;
varying vec2 v_texCoords;
varying vec3 v_vertPosWorld;
varying vec3 v_vertNVWorld;
void main()
{
vec4 vertPos = u_worldTrans * vec4(a_position, 1.0);
v_vertPosWorld = vertPos.xyz;
v_vertNVWorld = normalize(mat3(u_worldTrans) * a_normal);
v_texCoords = a_texCoord0;
gl_Position = u_projViewTrans * vertPos;
}
Fragment Shader:
varying vec2 v_texCoords;
varying vec3 v_vertPosWorld;
varying vec3 v_vertNVWorld;
uniform sampler2D u_texture;
struct PointLight
{
vec3 color;
vec3 position;
float intensity;
};
uniform PointLight u_pointLights[1];
void main()
{
vec3 toLightVector = normalize(u_pointLights[0].position - v_vertPosWorld.xyz);
float lightIntensity = max( 0.0, dot(v_vertNVWorld, toLightVector));
vec4 texCol = texture2D( u_texture, v_texCoords.st );
vec3 finalCol = texCol.rgb * lightIntensity * u_pointLights[0].color;
gl_FragColor = vec4( finalCol.rgb, 1.0 ); // finalCol already includes lightIntensity
}
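For completeness: each member of a struct-array uniform like u_pointLights[0] is its own uniform, addressed by its full name. In raw OpenGL terms this looks like the C++ sketch below, with prog standing in for the linked program handle (in libGDX, ShaderProgram.setUniformf with the same string names wraps the equivalent calls):
glUseProgram(prog);
// Each struct member has its own location.
glUniform3f(glGetUniformLocation(prog, "u_pointLights[0].position"), 0.0f, 2.0f, 2.0f);
glUniform3f(glGetUniformLocation(prog, "u_pointLights[0].color"), 1.0f, 1.0f, 1.0f);
glUniform1f(glGetUniformLocation(prog, "u_pointLights[0].intensity"), 1.0f);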

uniform sampler2D in Vertex Shader

I am trying to implement a height map with GLSL.
For that, I need to send my picture to the vertex shader and read its grey component.
glActiveTexture(GL_TEXTURE0);
Texture.bind();
glUniform1i(mShader.getUniformLocation("heightmap"), 0);
mShader.getUniformLocation uses glGetUniformLocation and works fine for the other uniform values used in the fragment and vertex shaders. But for heightmap it returns -1...
VertexShader code:
#version 330 core
layout (location = 0) in vec3 position;
layout (location = 1) in vec4 color;
layout (location = 2) in vec2 texCoords;
layout (location = 3) in vec3 normal;
out vec3 Normal;
out vec3 FragPos;
out vec2 TexCoords;
out vec4 ourColor;
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;
uniform sampler2D heightmap;
void main()
{
float bias = 0.25;
float h = 0.0;
float scale = 5.0;
h = scale * ((texture2D(heightmap, texCoords).r) - bias);
vec3 hnormal = vec3(normal.x*h, normal.y*h, normal.z*h);
vec3 position1 = position * hnormal;
gl_Position = projection * view * model * vec4(position1, 1.0f);
FragPos = vec3(model * vec4(position, 1.0f));
Normal = mat3(transpose(inverse(model))) * normal;
ourColor = color;
TexCoords = texCoords;
}
Maybe the algorithm for getting the height is bad, but the error with getting the uniform location is what stops my work.
What is wrong? Any ideas?
UPDATE: of course texCoords (not TexCoords) should be used in
h = scale * ((texture2D(heightmap, texCoords).r) - bias);
That was my mistake, but fixing it doesn't solve the problem. I'm still getting the same error.
My bet is that your variable has been optimized out by the driver, or the shader did not compile/link properly. After trying to compile your shader (on my nVidia card) I got this in the logs:
0(9) : warning C7050: "TexCoords" might be used before being initialized
You should always check the GLSL compile/link logs; see
How to debug GLSL Fragment shader
especially how the glGetShaderInfoLog is used.
In the line
h = scale * ((texture2D(heightmap, TexCoords).r) - bias);
you are using TexCoords, which is an output variable that has not been set yet, so the behavior is undefined. Most likely your graphics driver threw that line away (and maybe others), removing heightmap from the shader completely, but that is just my assumption.
What driver and graphics card have you got?
What do the logs return on your setup?
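For reference, the log-checking pattern described in the linked answer boils down to a few calls. A minimal C++ sketch (shader is assumed to be the shader object you just tried to compile):
#include <cstdio>
#include <vector>
void printCompileLog(GLuint shader)
{
    GLint status = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    GLint len = 0;
    glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &len);
    if (len > 1) // the log can contain warnings even when compilation succeeded
    {
        std::vector<char> log(len);
        glGetShaderInfoLog(shader, len, nullptr, log.data());
        std::fprintf(stderr, "compile %s:\n%s\n", status == GL_TRUE ? "ok" : "FAILED", log.data());
    }
}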

OpenGL returns -1 for a uniform that does exist and is used

I have this vertex shader:
#version 430 core
layout (location = 0) in vec3 position;
layout (location = 1) in vec2 textureCoordinate;
layout (location = 2) in vec3 normal;
layout (location = 4) in vec3 tangent;
uniform mat4 projectionMatrix;
uniform mat4 modelViewMatrix;
uniform vec3 lightPosition = vec3(0.0, 1.0, 0.0);
out vec2 texCoord;
out vec3 lightDirection;
out vec3 eyeDirection;
void main() {
vec4 P = modelViewMatrix * vec4(position, 1.0);
vec3 N = normalize(mat3(modelViewMatrix) * normal);
vec3 T = normalize(mat3(modelViewMatrix) * tangent);
vec3 B = cross(N, T);
vec3 L = lightPosition - P.xyz;
vec3 V = -P.xyz;
lightDirection = normalize(vec3(dot(V, T), dot(V, B), dot(V, N)));
eyeDirection = normalize(vec3(dot(V, T), dot(V, B), dot(V, N)));
texCoord = textureCoordinate;
gl_Position = projectionMatrix * P;
}
And this fragment shader:
#version 430 core
in vec2 texCoord;
in vec3 lightDirection;
in vec3 eyeDirection;
layout (binding = 0) uniform sampler2D tex;
layout (binding = 1) uniform sampler2D normalMap;
out vec4 color;
void main() {
vec3 ambient = vec3(0.1);
vec3 V = normalize(eyeDirection);
vec3 L = normalize(lightDirection);
vec3 N = normalize(texture(normalMap, texCoord)).rgb * 2.0 - vec3(1.0);
vec3 R = reflect(-L, N);
vec3 diffuseAlbedo = texture(tex, texCoord).rgb;
/*vec3 diffuseAlbedo = vec3(1.0);*/
vec3 diffuse = max(dot(N, L), 0.0) * diffuseAlbedo;
vec3 specular = vec3(0.0);
color = vec4(ambient + diffuse + specular, 1.0);
}
As you can see, the variable lightPosition is actually used in the calculations. But when I try to get its location via glGetUniformLocation, I get -1.
I looked through other questions here on SO, like these:
glGetActiveUniform reports uniform exists, but glGetUniformLocation returns -1
glGetUniformLocation return -1 on nvidia cards
glGetUniformLocation() returning -1 even though used in vertex shader
Especially the last one - but in my case, lightPosition is used to calculate lightDirection and this is then used in the fragment shader. So it should not be removed.
Any ideas on what is going on here? Or how to debug this?
Also: I set a default value for the uniform. If I remove the default value, it still gives -1.
I am running Ubuntu 15.10 with Nvidia 340.96 drivers and a GF 710M card. I am running OpenGL 4.4 with core profile.
... As you can see, the variable lightPosition is actually used in the calculations. ...
You would think, right. Actually you're using it once in the Vertex Shader to assign a value to vec3 L, which is not actually used after that.
The compilation process for the shaders is very meticulous. You may have used it in a calculation in the Vertex Shader, but you didn't utilize the result L anywhere and it doesn't make any contribution to the final result calculated in the Fragment Shader, so it was "optimized away".
You must give L some use. Looking at the vertex shader, lightDirection is built from V where L was presumably intended.
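A quick way to see which uniforms survived linking is to enumerate the active uniforms. A C++ sketch, with prog assumed to be the linked program handle:
GLint count = 0;
glGetProgramiv(prog, GL_ACTIVE_UNIFORMS, &count);
for (GLint i = 0; i < count; ++i)
{
    char name[256];
    GLint size = 0;
    GLenum type = 0;
    glGetActiveUniform(prog, (GLuint)i, sizeof(name), nullptr, &size, &type, name);
    printf("uniform: %s (location %d)\n", name, glGetUniformLocation(prog, name));
}
// If lightPosition does not appear in this list, it was optimized away during linking.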

OpenGL texture and UV mapping issue

I am currently trying to make a little game in OpenGL as an attempt to learn how to use the API. I've come to a point where I can move a camera around a simple scene, and I can render models and shade them with a simple Phong-model shader.
I'm right now working on texturing the models in the scene, so I got a copy of Maya and made (with quite some struggle) a square with a texture, with the UV mapping done within Maya.
When I render the scene, the texture is applied, but it is far from correct. I read the models as .obj files with a parser I wrote myself, and the textures are read using a function I found online a while back.
I'm not sure how to describe the problem in sufficient detail, nor what to look for in the code, but here are some code fragments that I suspect contain the problem.
Reading the texture
GLuint loadTexture(Image* image){
GLuint textureId;
glGenTextures(1, &textureId);
glBindTexture(GL_TEXTURE_2D, textureId);
glTexImage2D(GL_TEXTURE_2D,
0,
GL_RGB,
image->width, image->height,
0,
GL_RGB,
GL_UNSIGNED_BYTE,
image->pixels);
return textureId;
}
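An aside about this loader (not the asker's eventual fix): it never sets any texture filter or pixel-store parameters. If they aren't set elsewhere, the default minification filter GL_NEAREST_MIPMAP_LINEAR expects mipmaps and leaves the texture incomplete, and tightly packed GL_RGB rows can violate the default 4-byte unpack alignment. A sketch of the missing calls, with the glPixelStorei placed before the upload and the filters set while the texture is bound:
glPixelStorei(GL_UNPACK_ALIGNMENT, 1); // RGB rows are tightly packed; the default alignment is 4
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); // no mipmaps are generated here
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);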
Setting the texture prior to rendering the mesh
// set texture
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, this->body_texture);
current_shader->setUniformint(0, "Difuse_texture");
Vertex shader
#version 410
layout(location = 0) in vec3 VertexPosition;
layout(location = 1) in vec3 VertexNormal;
layout(location = 1) in vec2 TextureCoord;
out vec3 Position;
out vec3 Normal;
out vec2 TexCoord;
uniform mat4 ModelMatrix;
uniform mat4 VeiwMatrix;
uniform mat4 ProjectionMatrix;
uniform mat3 NormalMatrix;
void main(){
mat4 ModelVeiwMatrix = VeiwMatrix * ModelMatrix;
mat4 MVP = ProjectionMatrix * ModelVeiwMatrix;
TexCoord = TextureCoord;
Normal = normalize( NormalMatrix * VertexNormal );
Position = vec3(ModelVeiwMatrix * vec4(VertexPosition, 1.0));
gl_Position = MVP * vec4(VertexPosition, 1.0);
}
Fragment shader
#version 410
in vec3 Position;
in vec3 Normal;
in vec2 TexCoord;
uniform vec4 LightPosition;
uniform vec3 LightIntensity;
uniform vec3 Kd;
uniform vec3 Ka;
uniform vec3 Ks;
uniform float Shininess;
uniform sampler2D Difuse_texture;
layout(location = 0) out vec4 FragColor;
vec4 ads(){
vec3 n = normalize( Normal );
vec3 s = normalize( vec3(LightPosition) - Position );
vec3 v = normalize( vec3(-Position) );
vec3 r = reflect( -s, n );
vec3 specular_light = Ks * pow(max(dot(r, v), 0.0), Shininess);
vec3 ad_light = Ka + Kd * max(dot(s, n), 0.0);
vec4 TexColor = texture2D(Difuse_texture, TexCoord);
return TexColor; // (vec4(LightIntensity, 1.0) * (vec4(ad_light, 1.0) * TexColor + vec4(specular_light, 1.0)));
}
void main() {
FragColor = ads();
}
I know some things are written strangely, but at this point I'm starting to just try anything to get it working.
Does anyone have a suggestion on how to solve this strange UV mapping?
EDIT:
OBJ LOADING
I have made the obj loader print all vertex attributes and compared these with the indexing in the .obj file. It looks like the vertices, normals and UVs are listed in the correct order.
Screenshot
The scene looks like this using just a simple red-to-green gradient as the texture image.
(The square should, by my understanding, show the gradient from the texture, not just a single color.)
Alignment sounds like a possible flaw; how can I correct this?
http://imageshack.com/a/img674/9927/y0bJ51.png
SOLUTION
I made a very simple and easy-to-overlook mistake. At the top of the vertex shader I wrote
layout(location = 0) in vec3 VertexPosition;
layout(location = 1) in vec3 VertexNormal;
layout(location = 1) in vec2 TextureCoord;
So I guess that when I sent the normal data to location 1, I also set the texture coordinates to the normal data, so the UV coords never reached the fragment shader.
Changing it to the following resolved the problem without further changes.
layout(location = 0) in vec3 VertexPosition;
layout(location = 1) in vec3 VertexNormal;
layout(location = 2) in vec2 TextureCoord;