How is Phong Shading implemented in GLSL? - c++

I am implementing a Phong shader in GLSL for an assignment, but I don't get the specular reflection I should. The way I understand it, it's the same as Gouraud shading, except that instead of doing all the calculations in the vertex shader, you do them in the fragment shader, so you interpolate the normals and then apply the Phong model at each pixel. As part of the assignment I also had to develop the Gouraud shader, and that one works as expected. I thought I just needed to move the vertex shader code into the fragment shader, but that doesn't seem to be the case.
In the vertex shader I simply transform the vertex position into view coordinates and apply the inverse transpose of the model-view matrix to the vertex normal. Then, in the fragment shader, I apply just the view transform to the light position and use these coordinates to calculate the vectors needed in the Phong model. The lighting is almost correct, but some specular light is missing. All the other parameters have been tested, so I assume it's the light's position that is wrong; however, I have no idea why this code wouldn't work when it does in the other shader. Also, I know this isn't the most efficient way of doing it (pre-computing the normal matrix would be faster); I'm just trying to get it to work correctly first.
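For reference, a minimal vertex shader along the lines described above might look like this (the uniform and attribute names are assumptions, not the original code):

#version 330 core
in vec3 in_position;       // object-space vertex position
in vec3 in_normal;         // object-space vertex normal
uniform mat4 MVMatrix;     // model-view matrix
uniform mat4 PMatrix;      // projection matrix
out vec3 v_position;       // view-space position, interpolated per fragment
out vec3 v_normal;         // view-space normal, interpolated per fragment

void main()
{
    vec4 posView = MVMatrix * vec4(in_position, 1.0);
    v_position = posView.xyz;
    // inverse transpose of the model-view matrix, computed per vertex for simplicity
    v_normal = mat3(transpose(inverse(MVMatrix))) * in_normal;
    gl_Position = PMatrix * posView;
}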

The problem is the way the light source position is transformed. The following code
vec3 l = normalize(mat3(VMatrix)*light);
treats light as a direction (it normalizes it and, by using only mat3(VMatrix), ignores the translation part of the view matrix), but light is actually a position. The correct code should be something like
vec3 l = (VMatrix * vec4(light, 1.0)).xyz;
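As a sketch of the corrected per-fragment Phong computation (assuming the v_position and v_normal varyings from a vertex shader like the one above; the material uniforms Ka, Kd, Ks and shininess are assumptions):

#version 330 core
in vec3 v_position;        // view-space position from the vertex shader
in vec3 v_normal;          // view-space normal from the vertex shader
uniform mat4 VMatrix;      // view matrix
uniform vec3 light;        // light position in world space
uniform vec3 Ka, Kd, Ks;   // assumed ambient, diffuse, specular reflectances
uniform float shininess;
out vec4 fragColor;

void main()
{
    vec3 n = normalize(v_normal);
    // transform the light *position* with the full view matrix (w = 1.0)
    vec3 lightView = (VMatrix * vec4(light, 1.0)).xyz;
    vec3 l = normalize(lightView - v_position);   // fragment-to-light direction
    vec3 v = normalize(-v_position);              // fragment-to-eye direction
    vec3 r = reflect(-l, n);
    float diff = max(dot(n, l), 0.0);
    float spec = (diff > 0.0) ? pow(max(dot(r, v), 0.0), shininess) : 0.0;
    fragColor = vec4(Ka + Kd * diff + Ks * spec, 1.0);
}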

Related

Calculating surface normals of dynamic mesh (without geometry shader)

I have a mesh whose vertex positions are generated dynamically by the vertex shader. I've been using https://www.khronos.org/opengl/wiki/Calculating_a_Surface_Normal to calculate the surface normal for each primitive in the geometry shader, which seems to work fine.
Unfortunately, I'm planning on switching to an environment where using a geometry shader is not possible. I'm looking for alternative ways to calculate surface normals. I've considered:
Using compute shaders in two passes. One to generate the vertex positions, another (using the generated vertex positions) to calculate the surface normals, and then passing that data into the shader pipeline.
Using ARB_shader_image_load_store (or related) to write the vertex positions to a texture (in the vertex shader), which can then be read from the fragment shader. The fragment shader should be able to safely access the vertex positions (since it will only ever access the vertices used to invoke the fragment), and can then calculate the surface normal per fragment.
I believe both of these methods should work, but I'm wondering if there is a less complicated way of doing this, especially considering that this seems like a fairly common task. I'm also wondering if there are any problems with either of the ideas I've proposed, as I've had little experience with both compute shaders and image_load_store.
See Diffuse light with OpenGL GLSL. If you just want the face normals, you can use the partial-derivative functions dFdx and dFdy. A basic fragment shader that calculates the normal vector (N) in the same space as the position:
in vec3 position;   // interpolated position (view space, world space, etc.)

void main()
{
    // screen-space derivatives of the position give two vectors lying in the triangle's plane
    vec3 dx = dFdx(position);
    vec3 dy = dFdy(position);
    // their cross product is the face normal, in the same space as position
    vec3 N = normalize(cross(dx, dy));
    // [...]
}
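The position varying has to be written by the vertex shader; a minimal sketch (the attribute and uniform names are assumptions), however the positions are generated:

#version 330 core
in vec3 in_position;
uniform mat4 modelView;    // assumed uniform names
uniform mat4 projection;
out vec3 position;         // consumed by the fragment shader above

void main()
{
    // generate or transform the vertex position here
    vec4 posView = modelView * vec4(in_position, 1.0);
    position = posView.xyz;    // pass the view-space position along
    gl_Position = projection * posView;
}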

Why normals rotate with camera?

I wrote shaders for diffuse lighting.
The normals are calculated in the vertex shader: normal = gl_NormalMatrix * gl_Normal;
But when I rotate the camera, the normals start to rotate with it as well. How do I fix this?
You must be generating your normal matrix incorrectly. With OpenGL's column-vector convention the model-view matrix is ViewMatrix * ModelMatrix, so:
NormalMatrix = transpose(inverse(ViewMatrix * ModelMatrix))
Also, unless you're forced to use gl_NormalMatrix and gl_Normal, you should use shader uniforms and in variables and calculate the matrices yourself rather than relying on the old fixed-function built-ins.
If you don't know how to do this, you should find a tutorial on OpenGL 4 to learn the programmable shader pipeline. OGLDev is pretty good.
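A minimal sketch of that approach (the uniform and attribute names here are assumptions, not code from the question), computing the normal matrix from the view and model matrices in the vertex shader:

#version 330 core
in vec3 in_position;
in vec3 in_normal;
uniform mat4 model;       // object-to-world transform
uniform mat4 view;        // world-to-eye transform
uniform mat4 projection;
out vec3 normal;          // eye-space normal, interpolated per fragment

void main()
{
    mat4 modelView = view * model;
    // the inverse transpose also handles non-uniform scaling correctly
    normal = mat3(transpose(inverse(modelView))) * in_normal;
    gl_Position = projection * modelView * vec4(in_position, 1.0);
}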

Lighting Primitives From Positions That Aren't Processed By Shaders

I have a shader which calculates diffuse lighting values. It consists of a vertex and a fragment shader that calculate the lighting intensity on a per-vertex basis. However, as expected, if I have a large GL_TRIANGLE with a light position, say, just above the center of the triangle, the light that should illuminate it does not appear because the light values are smoothly interpolated across the surface of the triangle based on the vertex calculations.
So my question is this- how can a primitive be lit by a light source at a position other than at one of its vertices?
When you do lighting at the vertices, you're doing Phong lighting: you compute a color at each vertex and then merely interpolate those per-vertex colors across all the pixels in the primitive (i.e., Gouraud shading). If you instead compute the lighting color at each pixel, you get Phong shading. In your scenario, if you move the lighting computations from the vertex shader to the fragment shader and interpolate the normal across the primitive, you should get much better results.
Concretely: make the vertex normal a varying variable, normalize it in the fragment shader, and then do your lighting computations there, as in the sketch below.
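As a sketch (the attribute and uniform names are assumptions), doing diffuse lighting per fragment in world space:

// vertex shader
#version 330 core
in vec3 in_position;
in vec3 in_normal;
uniform mat4 model, view, projection;
out vec3 worldPos;
out vec3 worldNormal;   // varying; gets interpolated across the primitive

void main()
{
    worldPos = (model * vec4(in_position, 1.0)).xyz;
    worldNormal = mat3(transpose(inverse(model))) * in_normal;
    gl_Position = projection * view * vec4(worldPos, 1.0);
}

// fragment shader
#version 330 core
in vec3 worldPos;
in vec3 worldNormal;
uniform vec3 lightPos;       // e.g. a point just above the centre of the triangle
uniform vec3 diffuseColor;
out vec4 fragColor;

void main()
{
    vec3 n = normalize(worldNormal);            // re-normalize after interpolation
    vec3 l = normalize(lightPos - worldPos);    // per-fragment light direction
    fragColor = vec4(diffuseColor * max(dot(n, l), 0.0), 1.0);
}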

Can't get 2D transformation to work

My buddy and I are trying to create a 2D engine, but we can't get the transformations to work; this is how it looks right now. We have created matrices for each operation like scaling, translation, etc., but we can't get the quad to move or do anything. Have we made any math errors? This is what we get now when we just render the VBO without using the shaders (by shaders I mean GLSL shaders).
http://imgur.com/T0jDSTO
Transform
http://pastebin.com/dKRW244e
Matrix3f
http://pastebin.com/GY0872k6
Matrix4f
http://pastebin.com/f1YNuM09
VBO
http://pastebin.com/5zVgWYtK
BasicShader
http://pastebin.com/RyeSxibQ
Shader
http://pastebin.com/68tJTswq
VertexShader
http://pastebin.com/ffDvsL2Y
FragmentShader
http://pastebin.com/SWT5EKAi
From where we render and update
http://pastebin.com/dTG6HHDX
I think I have messed up binding the shaders to the VBO, is that right? And if so, how do I fix it?
Also, I only want answers in modern OpenGL, so no glBegin() and glEnd(). Thank you!
In your vertex shader, you wrote gl_Position = in_Position * transform;
In GLSL this is legal, but it treats in_Position as a row vector and effectively multiplies by the transpose of transform, so the transformation comes out wrong. Given that your rectangle is plain white even though you have made provisions for colour, I would also check the shader compile and link logs; it looks like the shader may not be running at all.
gl_Position = transform * in_Position; is the correct order for transforming a column vector with a matrix.
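For reference, a vertex shader with the correct order might look like this (in_Position and transform follow the names quoted above; the colour attribute is an assumption):

#version 330 core
in vec4 in_Position;
in vec4 in_Color;          // assumed colour attribute
uniform mat4 transform;    // combined translation/rotation/scale matrix
out vec4 pass_Color;

void main()
{
    // column-vector convention: matrix on the left, vector on the right
    gl_Position = transform * in_Position;
    pass_Color = in_Color;
}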

Calculate normals for plane inside fragment shader

I have a situation where I need to do lighting. I don't have a vertex shader, so I can't interpolate normals into my fragment shader. I also have no ability to pass in a normal map. Can I generate the normals completely in the fragment shader based, for example, on the fragment coordinates? The geometry is always planar in my case.
To expand on what I am trying to do:
I am using the NV_path_rendering extension, which allows rendering pure vector graphics on the GPU. The problem is that only the fragment stage is accessible via a shader, which basically means I can't use a vertex shader with NV_path objects.
Since your shapes are flat and NV_path_rendering requires the compatibility profile, you can pass the normal through one of the built-in varyings gl_Color or gl_SecondaryColor.
The extension description says that there is some kind of interpolation:
Interpolation of per-vertex data (section 3.6.1). Path primitives have neither conventional vertices nor per-vertex data. Instead fragments generate interpolated per-fragment colors, texture coordinate sets, and fog coordinates as a linear function of object-space or eye-space path coordinate's or using the current color, texture coordinate set, or fog coordinate state directly.
http://developer.download.nvidia.com/assets/gamedev/files/GL_NV_path_rendering.txt
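A sketch of that idea (this is an assumption about the wiring, not code from the extension spec): encode the plane's normal into the current colour on the CPU side, e.g. glColor3f(0.5 * n.x + 0.5, 0.5 * n.y + 0.5, 0.5 * n.z + 0.5) before the cover call, and decode it in a compatibility-profile fragment shader:

#version 120
uniform vec3 lightDir;   // assumed: normalized light direction in the same space as the normal

void main()
{
    // gl_Color carries the encoded normal set via glColor3f
    vec3 n = normalize(gl_Color.rgb * 2.0 - 1.0);   // map [0,1] back to [-1,1]
    float diff = max(dot(n, lightDir), 0.0);
    gl_FragColor = vec4(vec3(diff), 1.0);
}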
Here's a method which "sets the normal as the face normal", without knowing anything about vertex normals (as I understand it).
https://stackoverflow.com/a/17532576/738675
I have a three.js demo working here:
http://meetar.github.io/three.js-normal-map-0/index6.html
My implementation is getting vertex position data from the vertex shader, but it sounds like you're able to get that through other means.