What is OpenGL Default Light Position?

I want to learn about default light positions in OpenGL.
I have a system set up so that if there is no light in the scene, it adds an environment light:
Gl.glEnable(Gl.GL_LIGHT0);
Gl.glLightf(Gl.GL_LIGHT0, Gl.GL_CONSTANT_ATTENUATION, 99999);
If I don't touch the light's position parameter, the light behaves like an environment light: it doesn't seem to have a direction, so my objects receive light from every angle. That's kind of cool. But if I do this:
Gl.glEnable(Gl.GL_LIGHT0);
Gl.glLightf(Gl.GL_LIGHT0, Gl.GL_CONSTANT_ATTENUATION, 99999);
Gl.glLightfv(Gl.GL_LIGHT0, Gl.GL_POSITION, new Vector4(0f, 0f, 1f, 0f).ToArray());
(which is the original value already; I verified this with Gl.glGetLightfv(Gl.GL_LIGHT0, Gl.GL_POSITION, vec) and it returns the same vector). After I touch the position, the light really does become like a directional sun light, but I want it to keep working like an environment light. What am I missing here?
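For context, the initial value of GL_POSITION for every light is (0, 0, 1, 0): w = 0 makes it a directional light along the eye-space z axis, i.e. a headlight that follows the camera, which is why everything you look at appears lit "from every angle". Whatever you pass to glLightfv(GL_POSITION, ...) is transformed by the modelview matrix current at the time of the call, so re-setting the same vector while a camera transform is on the stack likely bakes the direction into world space and turns it into a fixed "sun". (Note also that attenuation is ignored for directional lights, which is why the huge GL_CONSTANT_ATTENUATION doesn't black the light out.) A minimal C sketch of both behaviors, written against raw OpenGL rather than the Tao bindings (the calls map one-to-one); the global-ambient variant is one hedged way to get a genuinely direction-free environment light:

#include <GL/gl.h>

/* Variant 1: the default, a camera-relative "headlight".
 * With the modelview at identity when GL_POSITION is set, the
 * direction (0, 0, 1, 0) stays fixed in eye space, so whatever
 * the camera faces is lit. */
void setup_headlight(void)
{
    const GLfloat dir[4] = { 0.0f, 0.0f, 1.0f, 0.0f }; /* w = 0: directional */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();                       /* no camera transform applied */
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightfv(GL_LIGHT0, GL_POSITION, dir);
}

/* Variant 2: a direction-free "environment" light via the global
 * ambient term, which has no position at all. */
void setup_environment_light(void)
{
    const GLfloat ambient[4] = { 0.6f, 0.6f, 0.6f, 1.0f };
    glEnable(GL_LIGHTING);
    glLightModelfv(GL_LIGHT_MODEL_AMBIENT, ambient);
}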

Related

HLSL fixed lighting location

Hi, I'm trying to create a shader for my 3D models with materials and fog. Everything works fine except the light direction. I wasn't sure what to set it to, so I used a fixed value, but when I rotate my 3D model (which is a simple textured sphere) the light rotates with it. I want to change my code so that my light stays in one place relative to the camera, not the object itself. I have tried multiplying the input normals by the view matrix, but the same result occurs.
Also, should I be setting the light direction relative to the camera instead?
EDIT: removed pastebin link since that is against the rules...
Use camera-dependent values only for transforming the vertex position to view and projected space (the shader pipeline needs those for clipping and the rasterizer stage; the video card has to know where to draw your pixel).
For lighting, in addition to the camera-transformed values you normally pass the world-space position of the vertex and the world-space normal to the shader stages that need them (e.g. the pixel shader stage for Phong lighting).
So set your light position, or better the light direction, in world-space coordinates as a global variable for your shaders. That way the lighting is independent of the camera's view position.
If you want an effect like a flashlight, set the light position to the camera position and the light direction to your look direction, so the bright parts are always in the center of your viewing frustum.
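A rough CPU-side sketch of the idea in C; set_shader_vec3 is a hypothetical stand-in for whatever constant/uniform upload your framework provides (here it only logs), and the names g_lightDirWorld / g_lightPosWorld are illustrative:

#include <stdio.h>

typedef struct { float x, y, z; } vec3;

/* Hypothetical stand-in for your framework's shader-constant setter
 * (e.g. an HLSL constant-buffer upload); here it just logs the value. */
static void set_shader_vec3(const char *name, vec3 v)
{
    printf("%s = (%f, %f, %f)\n", name, v.x, v.y, v.z);
}

/* World-space light: independent of the camera and the model, so it
 * neither rotates with the mesh nor follows the view. */
static void set_world_light(void)
{
    vec3 light_dir = { 0.0f, -1.0f, 0.3f };  /* fixed in world space */
    set_shader_vec3("g_lightDirWorld", light_dir);
}

/* Flashlight: the light rides with the camera, so the bright spot
 * stays centered in the view frustum. */
static void set_flashlight(vec3 cam_pos, vec3 cam_forward)
{
    set_shader_vec3("g_lightPosWorld", cam_pos);
    set_shader_vec3("g_lightDirWorld", cam_forward);
}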
good luck

Point Light not rendering

I'm trying to render a couple of point lights in my scene, but I'm having trouble getting the lights to actually illuminate anything. The only light I got working is a directional light, which lights up the scene initially:
The two rocks are displayed at the same positions as the two point lights. I've toyed with the diffuse, color and attenuation values, but got the same results. When I changed the ambient value of either light, though, the color did change:
My calculations within GLSL are correct and my uniforms are read correctly as well, but somehow I lost my diffuse intensity. Changing m_pointLight[0].DiffuseIntensity does nothing, but if the commented AmbientIntensity line is used, the scene is tinted red.
m_scale += 0.0057f;
// changing the value makes scene red
m_pointLight[0].AmbientIntensity = 0.5f;
// changing the value does nothing
m_pointLight[0].DiffuseIntensity = 1.5f;
// light color is red
m_pointLight[0].Color = glm::vec3(1.0f, 0.0f, 0.0f);
// position of light (same as rock)
m_pointLight[0].Position = glm::vec3(3.0f, 1.0f, FieldDepth * (cosf(m_scale) + 1.0f) / 2.0f);
// Light decay
m_pointLight[0].Attenuation.Linear = 0.01f;
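For reference, fields like these usually feed the standard point-light falloff model, so it may help to sanity-check the numbers; a sketch, assuming this codebase follows the common convention:

/* Common point-light falloff: 1 / (Kc + Kl*d + Kq*d*d).
 * With Constant = 1, Linear = 0.01, Exp = 0, a light 10 units away is
 * only scaled by 1 / 1.1 ~= 0.91, so a linear term this small barely
 * dims nearby geometry -- the diffuse term should still be visible. */
float attenuation(float constant, float linear, float quadratic, float dist)
{
    return 1.0f / (constant + linear * dist + quadratic * dist * dist);
}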
Not a real answer, but I figured out part of what my problem was:
The problem was the angle I was looking from; something is quite wrong with my code. I originally set my camera position to (0, 100, 75). After moving a small block of code to where I set my perspective and other uniforms, a bit of light started to show from the point light. The problem then was that when the two rocks moved closer to the center of the floor, the point light of the first rock started to glow red. The same goes for the second rock, which was set to blue. But as the rocks moved closer to the camera, the colors faded gradually until they appeared off; then, when they moved back to the center, the colors gradually returned to full brightness.
So I thought this might have to do with the angle I was looking from. From the camera position (0, 100, 75), I decided to look from the complete opposite end of the floor (-75 on the z-axis). And this is what I got:
The only problem remaining is that, depending on where the camera is, the point light closest to the camera doesn't show (other than on the underside of the closest rock). The bottom-right image is the only one displaying all lights correctly:
If someone could explain what is going on, it would be greatly appreciated.

Per-model local rotation breaks OpenGL lighting

I'm having trouble with OpenGL lighting. My issue is this: when the object has no rotation, the lighting is fine; otherwise the lighting works, but it rotates with the object instead of staying fixed with respect to the scene.
Sounds simple, right? The OpenGL FAQ has some simple advice on this: coordinates passed to glLightfv(GL_LIGHT0, GL_POSITION...) are multiplied by the current MODELVIEW matrix. So I must be calling this at the wrong place... except I'm not. I've copied the MODELVIEW matrix into a variable to debug, and it stays the same regardless of how my object is rotated. So it has to be something else, but I'm at a loss as to what.
I draw the model using glDrawArrays, and position my model within the world using glMultMatrixf on a matrix built from a rotation quaternion and a translation. All of this takes place within glPushMatrix/glPopMatrix, so it shouldn't have any side effect on the light.
A cut down version of my rendering process looks like this:
//Setup our camera
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
cameraMatrix = translate(Vector3D(Pos.mX,Pos.mY,Pos.mZ)) * camRot.QuatToMatrix();
glMultMatrixf((GLfloat*)&cameraMatrix);
//Position the light now
glEnable(GL_LIGHTING);
glEnable(GL_LIGHT0);
GLfloat lp[4] = {lightPos.mX, lightPos.mY, lightPos.mZ, 1.0f};
glLightfv(GL_LIGHT0, GL_POSITION,(GLfloat*) lp);
//Loop, doing this for each model: (mRot, mPos, and mi are model member variables)
matrix = translate(Vector3D(mPos.mX,mPos.mY,mPos.mZ)) * mRot.QuatToMatrix();
glPushMatrix();
glMultMatrixf((GLfloat*)&matrix);
glBindBuffer(GL_ARRAY_BUFFER, mi->mVertexBufHandle); //Bind the model VBO.
glDrawArrays(GL_TRIANGLES, 0, mi->verts); //Draw the object
glPopMatrix();
I thought the normals might be messed up, but when I render them out they look fine. Is there anything else that might affect OpenGL lighting? The FAQ mentions:
If your light source is part of a light fixture, you also may need to specify a modeling transform, so the light position is in the same location as the surrounding fixture geometry.
I took this to mean that you'd need to translate the light into the scene, kind of a no-brainer... but does it mean something else?
It might be minor, but in this line:
glLightfv(GL_LIGHT0, GL_POSITION,(GLfloat*) &lp);
remove the & (address-of operator); lp will already give you the array's address.
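For clarity, that's the standard C array-decay rule rather than anything OpenGL-specific:

GLfloat lp[4] = { 1.0f, 2.0f, 3.0f, 1.0f };
/* lp decays to &lp[0], of type GLfloat*; &lp is a GLfloat(*)[4].
 * Both hold the same address, but only the former matches the
 * parameter type glLightfv expects. */
glLightfv(GL_LIGHT0, GL_POSITION, lp); /* correct: no & and no cast needed */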
This was a while back, but I did eventually figure out the problem. The issue I thought I was having was that the light's position was being translated incorrectly. Picture this: the light was located at (0, 0, 0), but then I translated and rotated my mesh. If that had been the case, I'd have had to do as suggested in the other answers and make certain I was placing my glLightfv calls in the right place.
The actual problem turned out to be much simpler, yet much more insidious. It turns out I wasn't setting glNormalPointer correctly, so it was being fed garbage data. While debugging, I'd render the normals to check that they were correct, but when doing so I'd draw them manually from the positions I'd calculated. A recommendation to future debuggers: when drawing your debug normal rays, make sure you feed the debug function the same data OpenGL gets. In my case, that meant pointing my normal-ray draw function's glVertexPointer at the same place as the model's glNormalPointer.
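A minimal sketch of a correct setup, assuming an interleaved position+normal VBO (the Vertex layout, vboHandle and vertexCount here are illustrative, not the poster's actual code):

#include <stddef.h> /* offsetof */
#include <GL/gl.h>

/* Interleaved vertex: 3 floats of position, then 3 floats of normal. */
typedef struct { GLfloat pos[3]; GLfloat normal[3]; } Vertex;

void draw_model(GLuint vboHandle, GLsizei vertexCount)
{
    glBindBuffer(GL_ARRAY_BUFFER, vboHandle);
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_NORMAL_ARRAY);

    /* The stride is the full vertex size; the offsets must point at the
     * right struct members, or lighting silently reads garbage normals. */
    glVertexPointer(3, GL_FLOAT, sizeof(Vertex), (void *)offsetof(Vertex, pos));
    glNormalPointer(GL_FLOAT, sizeof(Vertex), (void *)offsetof(Vertex, normal));

    glDrawArrays(GL_TRIANGLES, 0, vertexCount);

    glDisableClientState(GL_NORMAL_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);
}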
Basically, an OpenGL light behaves like a vertex: in your code it's transformed by cameraMatrix, while your meshes are transformed by cameraMatrix * matrix. Now, it looks like both cameraMatrix and matrix contain mRot.QuatToMatrix(); that is, there is a single rotation matrix there, and the light gets rotated once while the objects get rotated twice. That doesn't look right to me, unless your actual code is different; the mRot matrix you use for each mesh should be its own, e.g. mRot[meshIndex].

OpenGL Lighting Problem

I've been working on a game engine for a month now, and I've finished the basic OpenGL stuff. However, the one thing I can't get to work the way I expect is the lighting.
(Note: This is the first time I seriously worked with OpenGL)
What I want is a close-to-realistic lighting simulation, where surfaces facing the light are lit more than those facing away or farther from it. The basic light should have a position and a color. This is how I thought it could be implemented:
float lightv[4] = { 0.6f, 0.6f, 0.6f, 1.0f };     // diffuse light color
float positionv[4] = { 0.0f, 10.0f, 0.0f, 1.0f }; // w = 1: positional light
int lightID = GL_LIGHT0;
int attenuationType = GL_LINEAR_ATTENUATION;
float attenuationValue = 1;
glLightf(lightID, attenuationType, attenuationValue);
glLightfv(lightID, GL_DIFFUSE, lightv);
glLightfv(lightID, GL_POSITION, positionv);
glEnable(lightID);
Instead of doing what I expect it to do, it gives me lighting as if there were a light where the camera is! Every surface has the same lighting!
What am I doing wrong?
Thank you, I appreciate it.
The first thing to do is make sure you have glEnable(GL_LIGHTING) called at some point. Past that, check that your normals are correct: for lighting to work properly you need a normal set for every vertex you draw. If you have already set your normals, make sure they are all of unit length; if they do not have a length of one, lighting can act oddly. If that is all as it should be, keep in mind that when you set the position of a light, it is modified by the current modelview matrix as if it were a vertex. If none of those things are relevant, I'll see if I can think of something further.
Set your light position after you set up your GL_MODELVIEW transform, since it's affected by the current transform matrix. Otherwise you get the "headlamp" effect you have discovered.
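A minimal sketch of that ordering, assuming a gluLookAt-style camera (the eye position arguments are illustrative):

#include <GL/gl.h>
#include <GL/glu.h>

void begin_frame(double eyeX, double eyeY, double eyeZ)
{
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(eyeX, eyeY, eyeZ,  0.0, 0.0, 0.0,  0.0, 1.0, 0.0); /* camera first */

    /* The modelview now holds the view transform, so this position is
     * interpreted in world space instead of being glued to the eye. */
    const GLfloat positionv[4] = { 0.0f, 10.0f, 0.0f, 1.0f };
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightfv(GL_LIGHT0, GL_POSITION, positionv);

    /* Per-object model transforms follow, inside glPushMatrix/glPopMatrix. */
}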

OpenGL Spotlight shining through from rear-face

I have a spotlight source in OpenGL, pointing towards a texture-mapped sphere.
I rotate the light source with the sphere, such that if I rotate the sphere to the 'non-light' side, that side should be dark.
The odd part is that the spotlight seems to be shining through my sphere (it's solid, with no gaps between triangles). The light seems to be 'leaking' through to the other side.
Any thoughts on why this is happening?
Screenshots:
Front view, low light to emphasize the problem
Back view, notice the round area that is 'shining through'
It's really hard to tell from the images, but:
Check whether GL_LIGHT_MODEL_TWO_SIDE is being set (two-sided lighting), but more importantly have a look at the normals of the sphere you are rendering.
Edit: also, change the background colour to something lighter. Oh, and make sure you aren't rendering with alpha blending turned on (maybe it's a polygon sorting issue).
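A quick way to rule the two-sided suggestion in or out (sketch):

GLint twoSide = 0;
glGetIntegerv(GL_LIGHT_MODEL_TWO_SIDE, &twoSide); /* query the current state */
glLightModeli(GL_LIGHT_MODEL_TWO_SIDE, GL_FALSE); /* force one-sided lighting */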
OK, I'm a noob: I was specifying my normals, but not calling glEnableClientState(GL_NORMAL_ARRAY). Hence all normals were facing one direction (I think that's the default, no?).
Anyway, a lesson learned: always go back over the basics.
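For anyone hitting the same thing, a minimal sketch of the fix (sphereVbo and sphereVertCount are illustrative). Without the glEnableClientState(GL_NORMAL_ARRAY) call, OpenGL ignores the normal pointer and uses the current normal, which does default to (0, 0, 1), for every vertex:

glBindBuffer(GL_ARRAY_BUFFER, sphereVbo);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY); /* the missing call */
glVertexPointer(3, GL_FLOAT, 6 * sizeof(GLfloat), (void *)0);
glNormalPointer(GL_FLOAT, 6 * sizeof(GLfloat), (void *)(3 * sizeof(GLfloat)));
glDrawArrays(GL_TRIANGLES, 0, sphereVertCount);
glDisableClientState(GL_NORMAL_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);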