Why do normals rotate with the camera? - opengl

I wrote shaders for diffuse lighting.
The normals are calculated in the vertex shader: normal = gl_NormalMatrix * gl_Normal;
But when I rotate the camera, the normals start to rotate with the camera as well. How can I fix this?

You must be generating your normal matrix incorrectly.
NormalMatrix = transpose(inverse(mat3(ViewMatrix * ModelMatrix)))
(note the order: the modelview matrix is the view matrix times the model matrix, and only its upper-left 3x3 is used).
Also, unless you're forced to use gl_NormalMatrix and gl_Normal, you should use shader uniforms and in variables and calculate the matrices yourself rather than relying on the older fixed-function built-ins.
If you don't know how to do this, you should find a tutorial on OpenGL 4 to learn the programmable shader pipeline. OGLDev is pretty good.
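For illustration, a minimal modern-style vertex shader along those lines might look like the sketch below; the names modelView, projection, and normalMatrix are placeholders for uniforms you would compute and upload yourself, not anything taken from the question:

#version 330 core

in vec3 in_Position;
in vec3 in_Normal;

uniform mat4 modelView;     // view * model, built on the CPU
uniform mat4 projection;
uniform mat3 normalMatrix;  // transpose(inverse(mat3(modelView)))

out vec3 eyeNormal;

void main()
{
    // Normal in eye space; lighting can then be done in eye space too.
    eyeNormal = normalize(normalMatrix * in_Normal);
    gl_Position = projection * modelView * vec4(in_Position, 1.0);
}

On the CPU side you would compute normalMatrix as the transpose of the inverse of the upper-left 3x3 of modelView and upload it alongside the other uniforms.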

Related

How is Phong Shading implemented in GLSL?

I am implementing a Phong shader in GLSL for an assignment, but I don't get the same specular reflection as I should. The way I understood it, it's the same as Gouraud shading, but instead of doing all the calculations in the vertex shader, you do them in the fragment shader, so that you interpolate the normals and then apply the Phong model at each pixel. As part of the assignment I also had to develop the Gouraud shader, and that works as expected, so I thought you just needed to move the vertex shader code into the fragment shader, but that doesn't seem to be the case.
What I do in the vertex shader is simply transform the vertex position into view coordinates and apply the transpose inverse of the modelview matrix to the vertex normal. Then, in the fragment shader, I apply just the view transform to the light position and use these coordinates to calculate the vectors needed in the Phong model. The lighting is almost correct, but some specular light is missing. All the parameters have been tested, so I would assume it's the light's position that is wrong; however, I have no idea why my code wouldn't work when it does in the other shader. Also, I know this isn't the most efficient way of doing it (pre-computing the normal matrix would be faster), I'm just trying to get it to work properly first.
The problem is the way the light source position is calculated. The following code
vec3 l = normalize(mat3(VMatrix)*light);
treats light as a direction (by normalizing it and because the translation part of the view matrix is ignored), but it actually is a position. The correct code should be something like
vec3 l = (VMatrix * vec4(light, 1.0)).xyz;
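To show where that fits, a minimal per-fragment sketch of the diffuse and specular terms could look like the following; this assumes GLSL 1.20-style varyings and uses VMatrix, light, position, and normal as names for the view matrix, the world-space light position, and the interpolated view-space position and normal:

uniform mat4 VMatrix;   // view matrix
uniform vec3 light;     // light position in world space

varying vec3 position;  // view-space position from the vertex shader
varying vec3 normal;    // view-space normal from the vertex shader

void main()
{
    vec3 n = normalize(normal);
    // Transform the light as a position (w = 1.0), then build the direction to it.
    vec3 l = normalize((VMatrix * vec4(light, 1.0)).xyz - position);
    vec3 v = normalize(-position);               // the camera sits at the origin in view space
    vec3 r = reflect(-l, n);

    float diff = max(dot(n, l), 0.0);
    float spec = pow(max(dot(r, v), 0.0), 32.0); // 32.0 is an arbitrary shininess

    gl_FragColor = vec4(vec3(0.1 + diff + spec), 1.0); // grey-scale, 0.1 ambient, for illustration
}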

Calculating surface normals of dynamic mesh (without geometry shader)

I have a mesh whose vertex positions are generated dynamically by the vertex shader. I've been using https://www.khronos.org/opengl/wiki/Calculating_a_Surface_Normal to calculate the surface normal for each primitive in the geometry shader, which seems to work fine.
Unfortunately, I'm planning on switching to an environment where using a geometry shader is not possible. I'm looking for alternative ways to calculate surface normals. I've considered:
Using compute shaders in two passes. One to generate the vertex positions, another (using the generated vertex positions) to calculate the surface normals, and then passing that data into the shader pipeline.
Using ARB_shader_image_load_store (or related) to write the vertex positions to a texture (in the vertex shader), which can then be read from the fragment shader. The fragment shader should be able to safely access the vertex positions (since it will only ever access the vertices used to invoke the fragment), and can then calculate the surface normal per fragment.
I believe both of these methods should work, but I'm wondering if there is a less complicated way of doing this, especially considering that this seems like a fairly common task. I'm also wondering if there are any problems with either of the ideas I've proposed, as I've had little experience with both compute shaders and image_load_store.
See Diffuse light with OpenGL GLSL. If you just want the face normals, you can use the partial derivative functions dFdx and dFdy. A basic fragment shader that calculates the normal vector (N) in the same space as the position:
in vec3 position;

void main()
{
    // Screen-space partial derivatives of the interpolated position;
    // both vectors lie in the triangle's plane.
    vec3 dx = dFdx(position);
    vec3 dy = dFdy(position);

    // Their cross product is therefore the face normal.
    vec3 N = normalize(cross(dx, dy));
    // [...]
}
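For completeness, the vertex shader only has to write the (generated) position to an output that the fragment shader above can differentiate. A minimal sketch, where mvp and in_Position are assumed names and the dynamic generation itself is left out:

#version 330 core

in vec3 in_Position;
uniform mat4 mvp;

out vec3 position;   // consumed by dFdx/dFdy in the fragment shader

void main()
{
    vec3 p = in_Position;             // or whatever your dynamic generation produces
    position = p;
    gl_Position = mvp * vec4(p, 1.0);
}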

Can't get 2D transformation to work

My buddy and I are trying to create a 2D engine, but we can't get the transformation to work; this is how it looks right now. We have created matrices for each operation, like scaling, translation, etc., but we can't get the quad to move or do anything. Have we made any math errors? This is what we get now when we just render the VBO without using the shaders (by shaders I mean GLSL).
http://imgur.com/T0jDSTO
Transform
http://pastebin.com/dKRW244e
Matrix3f
http://pastebin.com/GY0872k6
Matrix4f
http://pastebin.com/f1YNuM09
VBO
http://pastebin.com/5zVgWYtK
BasicShader
http://pastebin.com/RyeSxibQ
Shader
http://pastebin.com/68tJTswq
VertexShader
http://pastebin.com/ffDvsL2Y
FragmentShader
http://pastebin.com/SWT5EKAi
Where we render and update
http://pastebin.com/dTG6HHDX
I think I have messed up binding the shaders to the VBO, is that right? And if so, how do I fix it?
Also, I only want answers in modern OpenGL, so no glBegin() and glEnd(). Thank you!
In your vertex shader, you wrote gl_Position = in_Position * transform;
In GLSL this multiplies in_Position as a row vector, which is the same as multiplying by the transpose of the matrix, so it does not apply the transform you intended. Given that your rectangle is plain white even though you have made provisions for colour, I would also hazard that your shader failed to compile at all.
gl_Position = transform * in_Position; is the correct order for transforming a vector with a matrix.
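As a sketch, assuming in_Position is a vec4 attribute and transform is a mat4 uniform (matching the names in your shader), the corrected vertex shader would be:

#version 330 core

in vec4 in_Position;
uniform mat4 transform;

void main()
{
    // Column-vector convention: the matrix goes on the left, the vector on the right.
    gl_Position = transform * in_Position;
}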

Using GLSL shaders + lighting / normals

I've got this not-so-small-anymore tile-based game, which is my first real OpenGL project. I want to render every tile as a 3D object. So at first I created some objects, like a cube and a sphere, provided them with vertex normals and rendered them in immediate mode with flat shading. But since I've got something like 10,000 objects per level, it was a bit slow. So I put my vertices and normals into VBOs.
That's where I encountered the first problem: Before using VBOs I just push()ed and pop()ed matrices for every object and used glTranslate / glRotate to place them in my scene. But when I did the same with VBOs, the lighting started to behave strangely. Instead of a fixed lighting position behind the camera, the light seemed to rotate with my objects. When moving around them 180 degrees I could see only a shadow.
So I did some research. I could not find an answer to my specific problem, but I read that instead of using glTranslate/glRotate one should implement shaders and provide them with uniform matrices.
I thought "perhaps that could fix my problem too" and implemented a first small vertex shader program which only stretched my objects a bit, just to see if I could get a shader to work before focusing on the details.
void main(void)
{
vec4 v = gl_Vertex;
v.x = v.x * 0.5;
v.y = v.y * 0.5;
gl_Position = gl_ModelViewProjectionMatrix * v;
}
Well, my objects get stretched, but now OpenGL's flat shading is broken; I just get white shades. And I can't find any helpful information. So I have a few questions:
Can I only use one shader at a time, and when I use my own shader, is OpenGL's flat shading turned off? Do I have to implement flat shading myself?
What about my vertex normals? I read somewhere that there is something like a normal matrix. Perhaps I have to apply operations to my normals as well when modifying the vertices?
That your lighting gets messed up when the matrix operations change means that your calls to glLightfv(..., GL_POSITION, ...) happen in the wrong context (not the OpenGL context, but the matrix state): the position you pass to glLightfv is transformed by the modelview matrix that is current at the moment of the call.
Well, my objects get stretched, but now OpenGL's flat shading is broken. I just get white shades
I think you mean Gouraud shading (flat shading means something different). The thing is: if you're using a vertex shader, you must do everything the fixed-function pipeline did, and that includes the lighting calculation. Lighthouse3D has a nice tutorial http://www.lighthouse3d.com/tutorials/glsl-tutorial/lighting/ as does Nicol Bolas: http://arcsynthesis.org/gltut/Illumination/Illumination.html
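As a rough sketch of what that means, a per-vertex, diffuse-only vertex shader using the old compatibility built-ins could look like this; it is only an illustration of the idea, and the tutorials above cover the full lighting model:

void main()
{
    // Transform the normal into eye space and normalize it.
    vec3 n = normalize(gl_NormalMatrix * gl_Normal);

    // Eye-space vertex position and direction towards the light (positional light assumed).
    vec4 eyePos = gl_ModelViewMatrix * gl_Vertex;
    vec3 l = normalize(gl_LightSource[0].position.xyz - eyePos.xyz);

    // Per-vertex diffuse term, roughly what the fixed pipeline would have done.
    float diff = max(dot(n, l), 0.0);
    gl_FrontColor = gl_FrontLightProduct[0].ambient
                  + diff * gl_FrontLightProduct[0].diffuse;

    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

Without a fragment shader attached, the fixed-function fragment stage uses the interpolated gl_FrontColor; in your own fragment shader you would read it back as gl_Color.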

per-fragment lighting coordinate system

I'm developing an OpenGL 2.1 application using shaders and I'm having a problem with my per-fragment lighting. The lighting is correct when my scene initially loads, but as I navigate around the scene, the lighting moves around with the "camera", rather than staying in a static location. For example, if I place my light way off to the right, the right side of objects will be illuminated. If I then move the camera to the other side of the object and point it in the opposite direction, the lighting is still on the right side of the object (rather than being on the left, like it should be now). I assume that I am calculating the lighting in the wrong coordinate system, and this is causing the wrong behavior. I'm calculating lighting the following way...
In vertex shader...
ECPosition = gl_ModelViewMatrix * gl_Vertex;
WCNormal = gl_NormalMatrix * vertexNormal;
where vertexNormal is the normal in object/model space.
In the fragment shader...
float lightIntensity = 0.2; //ambient
lightIntensity += max(dot(normalize(LightPosition - vec3(ECPosition)), WCNormal), 0.0) * 1.5; //diffuse
where LightPosition is, for example, (100.0, 10.0, 0.0), which would put the light on the right side of the world as described above.
The part that I'm unsure of is the gl_NormalMatrix part. I'm not exactly sure what this matrix is and what coordinate space it puts my normal into (I assume world space). If the normal is put into world space, then I figured the problem was that ECPosition is in eye space while LightPosition and WCNormal are in world space. Something about this doesn't seem right, but I can't figure it out. I also tried putting ECPosition into world space by multiplying it by my own modelMatrix that only contains the transformations I do to get the coordinate into world space, but this didn't work. Let me know if I need to provide other information about my shaders or code.
gl_NormalMatrix transforms your normal into eye space; it is the transpose of the inverse of the upper-left 3x3 of the modelview matrix.
I think in order for your light to be in a static world position you ought to be transforming your LightPosition into eye-space as well by multiplying it by your current view matrix.
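A minimal sketch of that fix in the fragment shader, assuming you pass the view matrix in yourself as a uniform (called viewMatrix here, a placeholder, since there is no built-in that gives you just the view part of the modelview matrix):

uniform mat4 viewMatrix;     // camera/view transform only
uniform vec3 LightPosition;  // light position in world space

varying vec4 ECPosition;     // eye-space position from the vertex shader
varying vec3 WCNormal;       // eye-space normal from the vertex shader

void main()
{
    // Move the light into the same (eye) space as ECPosition and WCNormal.
    vec3 ECLight = (viewMatrix * vec4(LightPosition, 1.0)).xyz;

    float lightIntensity = 0.2; // ambient
    lightIntensity += max(dot(normalize(ECLight - ECPosition.xyz),
                              normalize(WCNormal)), 0.0) * 1.5; // diffuse
    // [...]
}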