I am trying to understand how to manipulate my renderings with shaders. I haven't changed the projection matrix of the scene, but I draw a triangle with vertices {-0.5, -0.5}, {0.5, -0.5}, {0, 0.5}. I then pass the vec2 position of a "light" as a uniform to my fragment shader; I want it to essentially shine onto my triangle from the top right (lightPos = (0.5, 0.5)).
Here is a very bad drawing of where everything is located.
and this is what I aim to have in my triangle (it doesn't need to be white-to-blue exactly; it just needs to be brighter near the light and darker further away)
Here is the fragment shader:
#version 460 core
in vec3 oPos;
out vec4 fragColor;
uniform vec3 u_outputColor;
uniform vec2 u_lightPosition;
void main(){
    float intensity = 1.0 / length(oPos.xy - u_lightPosition);
    vec4 col = vec4(u_outputColor, 1.0f);
    fragColor = col * intensity;
}
Here is the basic code for compiling the shader (most of it is abstracted away, so it is fairly simple):
/* Test data for shader program. */
program.init("passthrough.vert", "passthrough.frag");
program.setUniformVec3("u_outputColor", 0.3f, 0.3f, 0.8f);
program.setUniformVec2("u_lightPosition", 0.5f, 0.5f);
GLfloat vertices[9] = { -0.5f, -0.5f, 0.0f,   0.0f, 0.5f, 0.0f,   0.5f, -0.5f, 0.0f };
Here is the vertex shader:
#version 460 core
layout (location = 0) in vec3 aPos;
out vec3 oPos;
void main(){
    gl_Position = vec4(aPos.x, aPos.y, aPos.z, 1.0);
}
Every test I have run to see why I can't get this to work shows that if there is a slight color change, it changes the entire triangle to a different shade. All tests show a triangle of ONE color across the entire surface; no gradient at all. I want the triangle to be a gradient that is brighter near the light and darker further from it. This is driving me crazy; I have been stuck on such a simple thing for three hours now, and it seems like any code I write modifies all three vertices at once, as if they were in the exact same spot. I wrote the math out and I strongly feel this should work. Any help is very appreciated.
EDIT
The triangle after the solution fixed my issue:
Try this for your vertex shader:
#version 460 core
layout (location = 0) in vec3 aPos;
out vec3 oPos;
void main(){
    oPos.xyz = aPos.xyz; // ADD THIS LINE
    gl_Position = vec4(aPos.xyz, 1.0);
}
Your version never writes to oPos, so the fragment shader receives either (a) an undefined value or, in your case, (b) vec3(0, 0, 0). Since your color calculation is based on:
float intensity = 1.0 / length(oPos.xy - u_lightPosition);
This is basically the same as
float intensity = 1.0 / length(-u_lightPosition);
So the color only depends on the light position: with lightPos = (0.5, 0.5), the intensity is 1.0 / 0.7071 ≈ 1.41 for every fragment, which is exactly the single flat color you are seeing.
You can debug and verify this by setting your fragment color to oPos:
vec4 col = vec4(oPos.xy, oPos.z + 0.5, 1.0f);
If oPos was set correctly in the vertex shader, then this line in the fragment shader would show you an RGB ramp. If oPos is not set correctly, you'll see 50% blue.
Always check for errors and logs returned from OpenGL. Many drivers put a warning about an unwritten output into the shader info log, which would have sent you straight to the problem.
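For reference, a minimal sketch of that kind of check in C++, assuming a raw shader handle (your program class presumably wraps something similar):

#include <cstdio>
#include <vector>
// Assumes an OpenGL loader (e.g. glad or GLEW) has already been
// included and initialized elsewhere.

// Print the compile log of a shader object if compilation failed.
void checkShaderCompile(GLuint shader) {
    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (ok == GL_TRUE) return;
    GLint len = 0;
    glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &len);
    std::vector<char> log(len > 1 ? len : 1);
    glGetShaderInfoLog(shader, static_cast<GLsizei>(log.size()), nullptr, log.data());
    std::fprintf(stderr, "shader compile log:\n%s\n", log.data());
}
// Link status works the same way, via glGetProgramiv(program, GL_LINK_STATUS, ...)
// and glGetProgramInfoLog; warnings can appear in the log even on success.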
Also, note that a z of 0 is not a problem here: with w = 1 it sits in the middle of the default [-1, 1] clip volume, so the triangle is not clipped.
Related
I'm doing some OpenGL work in Java (LWJGL) for a project, part of which includes importing 3D models in OBJ format. Everything looks OK until I try to displace vertices; then the models break up and you can see right through them.
Here is Suzanne from Blender, UV-mapped with a completely black texture (for visibility's sake). In the fragment shader I'm adding some white colour to the fragment depending on the angle between its normal and the world's up vector:
So far so good. But when I apply a small Y component displacement to the same vertices, I expect to see the faces 'stretch' up. Instead this happens:
Vertex shader:
#version 150
in vec3 position;
in vec2 texCoords;
in vec3 normal;
uniform mat4 transform;  // assumed declaration, missing from the snippet
out vec2 texcoordOut;    // assumed declaration, missing from the snippet
out float cosTheta;      // assumed declaration, missing from the snippet
vec3 vertPosModel;
void main()
{
    vertPosModel = position;
    texcoordOut = texCoords; // assumed pass-through for the fragment shader
    cosTheta = dot(vec3(0.0, 1.0, 0.0), normal);
    if(cosTheta > 0.0 && cosTheta < 1.0)
        vertPosModel += vec3(0.0, 0.15, 0.0);
    gl_Position = transform * vec4(vertPosModel, 1.0);
}
Fragment shader:
#version 150
uniform sampler2D objTexture;
in vec2 texcoordOut;
in float cosTheta;
out vec4 fragColor;
void main()
{
    fragColor = vec4(texture(objTexture, texcoordOut.st).rgb, 1.0) + vec4(cosTheta);
}
So your algorithm offsets a vertex's position based on a property derived from the vertex's normal. This will only produce a connected mesh if your mesh is completely smooth: wherever two triangles meet, the shared vertices must have the same normal on both sides of the edge.
If the model has discontinuous normals over the surface, breaks can appear anywhere the normal stops being continuous. That is, the edges with a normal discontinuity may become disconnected.
I'm pretty sure that Blender can generate a smooth version of Suzanne. So: did you generate a smooth mesh, or is it faceted?
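For what it's worth, here is what "smoothing" looks like on the data side: accumulate each face's normal into its three vertices, then normalize. A minimal sketch in C++ with GLM (the question is Java/LWJGL, but the idea is language-agnostic, and the names here are illustrative):

#include <vector>
#include <glm/glm.hpp>

// Compute smooth (area-weighted) per-vertex normals for an indexed mesh.
std::vector<glm::vec3> smoothNormals(const std::vector<glm::vec3>& positions,
                                     const std::vector<unsigned>& indices) {
    std::vector<glm::vec3> normals(positions.size(), glm::vec3(0.0f));
    for (std::size_t i = 0; i + 2 < indices.size(); i += 3) {
        unsigned a = indices[i], b = indices[i + 1], c = indices[i + 2];
        // Unnormalized cross product; larger triangles weight the average more.
        glm::vec3 n = glm::cross(positions[b] - positions[a],
                                 positions[c] - positions[a]);
        normals[a] += n; normals[b] += n; normals[c] += n;
    }
    for (glm::vec3& n : normals) n = glm::normalize(n);
    return normals;
}

With normals built this way, every triangle sharing a vertex sees the same normal there, so the displacement in the vertex shader moves shared vertices identically and the mesh stays connected.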
UPDATE: So it turns out this was due to a bug on the C side of things, causing some of the matrices to become malformed. The shaders are all fine. So if adding uniforms causes weird things to happen, my advice would be to use a debugger to check the values of ALL uniforms and make sure they are all being set correctly.
So I am trying to render depth to a cube map to use as a shadow map, but when I add and use a uniform in the fragment shader, everything becomes white, as if the shader isn't being used. No warnings or errors are generated when compiling/linking the shader.
The shader program I am using to render the depth map (setting the depth simply to the fragment z position as a test) is as follows:
//vertex shader
#version 430
layout(location=0) in vec4 vertexPositionModel;
uniform mat4 modelToWorldMatrix;
void main() {
    gl_Position = modelToWorldMatrix * vertexPositionModel;
}
//geometry shader
#version 430
layout (triangles) in;
layout (triangle_strip, max_vertices=18) out;
out vec4 fragPositionWorld;
uniform mat4 projectionMatrices[6];
void main() {
    // Emit the incoming triangle once per cubemap face, routing each copy
    // to the matching cubemap layer via gl_Layer.
    for (int face = 0; face < 6; face++) {
        gl_Layer = face;
        for (int i = 0; i < 3; i++) {
            fragPositionWorld = gl_in[i].gl_Position;
            gl_Position = projectionMatrices[face] * fragPositionWorld;
            EmitVertex();
        }
        EndPrimitive();
    }
}
//Fragment shader
#version 430
in vec4 fragPositionWorld;
void main() {
    gl_FragDepth = abs(fragPositionWorld.z);
}
The main shader samples from the cubemap and simply renders the depth as greyscale colour:
// The light-to-fragment direction selects the cubemap texel to sample.
vec3 lightDirection = fragPositionWorld - pointLight.position;
float closestDepth = texture(shadowMap, lightDirection).r;
finalColour = vec4(vec3(closestDepth), 1.0);
The scene is a small cube in a larger cubic room, and renders as expected, dark near z = 0 and the cube projected back onto the wall (The depth map is being rendered from the centre of the room):
Good:
I can move the small cube around and the projection projects correctly onto all the sides of the cubemap. All good so far.
The problem is when I add a uniform to the fragment shader, i.e:
#version 430
in vec4 fragPositionWorld;
uniform vec3 lightPos;
void main() {
    gl_FragDepth = min(lightPos.y, 0.5);
}
Everything renders as white, the same as if the shader had failed to compile:
Bad:
gDEBugger reports that the uniform is set correctly to (0, 4, 0), but regardless of what lightPos is, gl_FragDepth should be set to a value less than 0.5 and appear as a shade of grey (which is what happens if I set gl_FragDepth = 0.5 directly). So I can only conclude that the fragment shader is not being used for some reason and the default one is being used instead. Unfortunately I have no idea why.
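(Per the update above, the root cause was bad uniform/matrix data on the C side. One quick way to verify what a program actually received, assuming you have the raw program handle, is to read the uniform back; this sketch assumes <cstdio> and a GL loader are included:)

// Read back what the driver actually stored for a uniform (here "lightPos").
GLint loc = glGetUniformLocation(program, "lightPos");
if (loc == -1) {
    // -1 means the uniform is inactive: misspelled, or optimized out.
    std::fprintf(stderr, "lightPos is not an active uniform\n");
} else {
    GLfloat v[3] = {0.0f, 0.0f, 0.0f};
    glGetUniformfv(program, loc, v);
    std::fprintf(stderr, "lightPos = (%f, %f, %f)\n", v[0], v[1], v[2]);
}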
I have very basic OpenGL knowledge, but I'm trying to replicate the shading effect that MeshLab's visualizer has.
If you load up a mesh in MeshLab, you'll notice that a face pointing at the camera is fully lit, and as you rotate the model, the lighting changes as the face that faces the camera changes. I loaded a simple unit cube with 12 faces in MeshLab and captured these screenshots to make my point clear:
Model loaded up (notice how the face is completely gray):
Model slightly rotated (notice how the faces are a bit darker):
More rotation (notice how all faces are now darker):
Off the top of my head, I think the way it works is that it somehow assigns colors per face in the shader: if the angle between the face normal and the camera is zero, the face is fully lit (according to the color of the face); otherwise it is lit proportionally to the dot product between the normal vector and the camera vector.
I already have the code to draw meshes with shaders/VBO's. I can even assign per-vertex colors. However, I don't know how I can achieve a similar effect. As far as I know, fragment shaders work on vertices. A quick search revealed questions like this. But I got confused when the answers talked about duplicate vertices.
If it makes any difference, in my application I load *.ply files which contain vertex position, triangle indices and per-vertex colors.
Results after the answer by @DietrichEpp
I created the duplicate vertices array and used the following shaders to achieve the desired lighting effect. As can be seen in the posted screenshot, the similarity is uncanny :)
The vertex shader:
#version 330 core
uniform mat4 projection_matrix;
uniform mat4 model_matrix;
uniform mat4 view_matrix;
in vec3 in_position; // The vertex position
in vec3 in_normal; // The computed vertex normal
in vec4 in_color; // The vertex color
out vec4 color; // The vertex color (pass-through)
void main(void)
{
    gl_Position = projection_matrix * view_matrix * model_matrix * vec4(in_position, 1);
    // Compute the vertex's normal in camera space
    vec3 normal_cameraspace = normalize((view_matrix * model_matrix * vec4(in_normal, 0)).xyz);
    // Vector from the vertex (in camera space) to the camera (which is at the origin)
    vec3 cameraVector = normalize(vec3(0, 0, 0) - (view_matrix * model_matrix * vec4(in_position, 1)).xyz);
    // Compute the angle between the two vectors
    float cosTheta = clamp(dot(normal_cameraspace, cameraVector), 0, 1);
    // The coefficient will create a nice looking shining effect.
    // Also, we shouldn't modify the alpha channel value.
    color = vec4(0.3 * in_color.rgb + cosTheta * in_color.rgb, in_color.a);
}
The fragment shader:
#version 330 core
in vec4 color;
out vec4 out_frag_color;
void main(void)
{
    out_frag_color = color;
}
The uncanny results with the unit cube:
It looks like the effect is a simple lighting effect with per-face normals. There are a few different ways you can achieve per-face normals:
1. You can create a VBO with a normal attribute, and then duplicate vertex position data for faces which don't have the same normal. For example, a cube would have 24 vertices instead of 8, because the "duplicates" would have different normals.
2. You can use a geometry shader which calculates a per-face normal.
3. You can use dFdx() and dFdy() in the fragment shader to approximate the normal.
I recommend the first approach because it is simple. You can calculate the normals ahead of time in your program and then use them to calculate the face colors in your vertex shader; a sketch of the duplicated data follows.
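As an illustration (values made up for a unit cube, not taken from the question's *.ply loader): one face of a flat-shaded cube carries its own copies of the shared corners, each paired with that face's normal.

// One face (+Z) of a flat-shaded cube: four vertices private to this face.
// The same corner positions appear again in the adjacent faces, but paired
// with those faces' normals, giving 24 vertices in total instead of 8.
struct Vertex { float px, py, pz; float nx, ny, nz; };
const Vertex frontFace[4] = {
    { -0.5f, -0.5f, 0.5f,   0.0f, 0.0f, 1.0f },
    {  0.5f, -0.5f, 0.5f,   0.0f, 0.0f, 1.0f },
    {  0.5f,  0.5f, 0.5f,   0.0f, 0.0f, 1.0f },
    { -0.5f,  0.5f, 0.5f,   0.0f, 0.0f, 1.0f },
};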
This is simple flat shading. Instead of using per-vertex normals, you can evaluate the per-face normal with this GLSL snippet:
vec3 x = dFdx(FragPos);        // derivative of position along screen x
vec3 y = dFdy(FragPos);        // derivative of position along screen y
vec3 normal = cross(x, y);     // both lie in the triangle's plane
vec3 norm = normalize(normal); // unit per-face normal
then apply some diffuse lighting using norm:
// diffuse light 1
vec3 lightDir1 = normalize(lightPos1 - FragPos);
float diff1 = max(dot(norm, lightDir1), 0.0);
vec3 diffuse = diff1 * diffColor1;
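Put together, a minimal fragment shader sketch; FragPos is assumed to be the world-space position passed from the vertex shader, and lightPos1/diffColor1 are assumed uniforms:

#version 330 core
in vec3 FragPos;          // world-space position from the vertex shader
uniform vec3 lightPos1;   // point light position (assumed)
uniform vec3 diffColor1;  // diffuse light color (assumed)
out vec4 fragColor;
void main() {
    // Screen-space derivatives span the triangle's plane, so their
    // cross product is the (unnormalized) face normal.
    vec3 norm = normalize(cross(dFdx(FragPos), dFdy(FragPos)));
    vec3 lightDir1 = normalize(lightPos1 - FragPos);
    float diff1 = max(dot(norm, lightDir1), 0.0);
    fragColor = vec4(diff1 * diffColor1, 1.0);
}

Note that the sign of the cross product depends on your winding order; flip it if the lighting comes out inverted.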
I'm using C++, and when I implemented a diffuse shader, it caused every other triangle to disappear.
I can post my render code if need be, but I believe the issue is with my normal matrix (which I wrote to be the inverse transpose of the model-view matrix). Here is the shader code, which is loosely based on a lighthouse tutorial.
VERTEX SHADER
#version 330
layout(location=0) in vec3 position;
layout(location=1) in vec3 normal;
uniform mat4 transform_matrix;
uniform mat4 view_model_matrix;
uniform mat4 normal_matrix;
uniform vec3 light_pos;
out vec3 light_intensity;
void main()
{
    vec3 tnorm = normalize(normal_matrix * vec4(normal, 1.0)).xyz;
    vec4 eye_coords = transform_matrix * vec4(position, 1.0);
    vec3 s = normalize(vec3(light_pos - eye_coords.xyz)).xyz;
    vec3 light_set_intensity = vec3(1.0, 1.0, 1.0);
    vec3 diffuse_color = vec3(0.5, 0.5, 0.5);
    light_intensity = light_set_intensity * diffuse_color * max(dot(s, tnorm), 0.0);
    gl_Position = transform_matrix * vec4(position, 1.0);
}
My fragment shader just outputs the "light_intensity" in the form of a color. My model is straight from Blender and I have tried different exporting options like keeping vertex order, but nothing has worked.
This is not related to your shader.
It appears to be depth-test related: the depth ordering of triangles relative to your viewport is messed up because you do not ensure that only the pixel nearest to the camera gets drawn.
Enable depth testing and make sure you have a z buffer bound to your render target.
Read more about this here: http://www.opengl.org/wiki/Depth_Test
Only the triangles highlighted in red should be visible to the viewer. Without a valid depth test there is no way to guarantee which triangle is painted on top, so blue triangles from faces that should not be visible cover parts of previously drawn red triangles.
The depth test prevents this by comparing the depth stored in the z-buffer with the depth of the pixel being drawn at that moment. Only the color of a pixel that is closer to the viewer, i.e. has a smaller z value than the value in the buffer, is written to the framebuffer, which yields a correct result.
(Backface culling would be nice too and, if the model is exported correctly, would also make it display correctly. But it would only hide the main problem, not solve it.)
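In GL terms the fix is a couple of state calls at startup plus clearing the depth buffer each frame (a sketch; it also assumes your context was created with a depth buffer):

// Once, at initialization:
glEnable(GL_DEPTH_TEST); // discard fragments that fail the depth comparison
glDepthFunc(GL_LESS);    // keep the fragment with the smaller depth (the default)

// Every frame, before drawing:
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);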
I have been reading a pdf file on OpenGL lighting.
It says, for Gouraud shading:
• Gouraud shading
– Set vertex normals
– Calculate colors at vertices
– Interpolate colors across polygon
• Must calculate vertex normals!
• Must normalize vertex normals to unit length!
So that's what I did.
Here are my vertex and fragment shader files.
V_Shader:
#version 330
layout(location = 0) in vec3 in_Position; //declare position
layout(location = 1) in vec3 in_Color;
// mvpmatrix is the result of multiplying the model, view, and projection matrices
uniform mat4 MVP_matrix;
vec3 ambient;
out vec3 ex_Color;
void main(void) {
    // Multiply the MVP matrix by the vertex to obtain the final vertex position (MVP was created in the *.cpp)
    gl_Position = MVP_matrix * vec4(in_Position, 1.0);
    ambient = vec3(0.0f, 0.0f, 1.0f);
    ex_Color = ambient * normalize(in_Position); // instead of ex_Color = in_Color;
}
F_shader:
#version 330
in vec3 ex_Color;
out vec4 out_FragColor; // user-defined output; gl_FragColor may not be redeclared in the core profile
void main(void) {
    out_FragColor = vec4(ex_Color, 1.0);
}
The interpolation is taken care of by the fragment shader, right?
So here is my sphere (it is low-poly, by the way):
Is this the standard way of implementing Gouraud Shading?
(my sphere has a center of (0,0,0))
Thanks for your patience
ex_Color = ambient * normalize(in_Position); // instead of ex_Color = in_Color;
Allow me to quote myself, "It certainly doesn't qualify as 'lighting'." That didn't stop being true between the first time you asked this question and now.
This is not lighting. This is just normalizing the model-space position and multiplying it by the ambient color. Even if we assume that the model-space position is centered at zero and represents a point on the sphere, multiplying a light by a normal is meaningless. It is not lighting.
If you want to learn how lighting works, read this. Or this.
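For contrast, a minimal Gouraud-style diffuse sketch (per-vertex N·L, with the resulting colors interpolated across the triangle by the rasterizer). The normal attribute and the extra uniforms are assumptions, since the original shader has neither:

#version 330
layout(location = 0) in vec3 in_Position;
layout(location = 1) in vec3 in_Normal; // assumed: real per-vertex normals
uniform mat4 MVP_matrix;
uniform mat4 model_matrix; // assumed: takes normals into world space
uniform vec3 light_dir;    // assumed: normalized direction the light shines in
out vec3 ex_Color;
void main(void) {
    gl_Position = MVP_matrix * vec4(in_Position, 1.0);
    // Fine for rigid transforms; use the inverse transpose for non-uniform scale.
    vec3 n = normalize(mat3(model_matrix) * in_Normal);
    float diffuse = max(dot(n, -light_dir), 0.0);
    ex_Color = vec3(0.0, 0.0, 0.1) + diffuse * vec3(0.0, 0.0, 1.0); // ambient + blue diffuse
}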