I want to code a little Minecraft clone. I tried to add some simple lighting, but my results are very bad. I have read a lot about it and tried different solutions, without any success.
That's what I've got.
Initializing:
GL11.glViewport(0, 0, Config.GAME_WIDTH, Config.GAME_HEIGHT);
GL11.glMatrixMode(GL11.GL_PROJECTION); // Select The Projection Matrix
GL11.glLoadIdentity(); // Reset The Projection Matrix
GL11.glMatrixMode(GL11.GL_MODELVIEW); // Select The Modelview Matrix
GL11.glLoadIdentity(); // Reset The Modelview Matrix
GL11.glEnable(GL11.GL_DEPTH_TEST);
GL11.glEnable(GL11.GL_CULL_FACE);
GL11.glFrontFace(GL11.GL_CCW);
GL11.glLightModeli(GL11.GL_LIGHT_MODEL_LOCAL_VIEWER, GL11.GL_TRUE);
GL11.glEnable(GL11.GL_LIGHTING);
GL11.glEnable(GL11.GL_LIGHT0);
FloatBuffer qaAmbientLight = floatBuffer(0.0f, 0.0f, 0.0f, 1.0f);
FloatBuffer qaDiffuseLight = floatBuffer(1.0f, 1.0f, 1.0f, 1.0f);
FloatBuffer qaSpecularLight = floatBuffer(1.0f, 1.0f, 1.0f, 1.0f);
GL11.glLight(GL11.GL_LIGHT0, GL11.GL_AMBIENT, qaAmbientLight);
GL11.glLight(GL11.GL_LIGHT0, GL11.GL_DIFFUSE, qaDiffuseLight);
GL11.glLight(GL11.GL_LIGHT0, GL11.GL_SPECULAR, qaSpecularLight);
FloatBuffer qaLightPosition = floatBuffer(lightPosition.x, lightPosition.y, lightPosition.z, 1.0f);
GL11.glLight(GL11.GL_LIGHT0, GL11.GL_POSITION, qaLightPosition);
So now, before each render, I tried this:
GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT | GL11.GL_STENCIL_BUFFER_BIT);
GL11.glEnable(GL11.GL_DEPTH_TEST);
GL11.glClearColor(0.0f, 100.0f, 100.0f, 1.0f);
GL11.glShadeModel(GL11.GL_FLAT);
GL11.glLoadIdentity();
FloatBuffer qaLightPosition = floatBuffer(lightPosition.x, lightPosition.y, lightPosition.z, 1.0f);
GL11.glLight(GL11.GL_LIGHT0, GL11.GL_POSITION, qaLightPosition);
FloatBuffer ambientMaterial = floatBuffer(0.2f, 0.2f, 0.2f, 1.0f);
FloatBuffer diffuseMaterial = floatBuffer(0.8f, 0.8f, 0.8f, 1.0f);
FloatBuffer specularMaterial = floatBuffer(0.0f, 0.0f, 0.0f, 1.0f);
GL11.glMaterial(GL11.GL_FRONT, GL11.GL_AMBIENT, ambientMaterial);
GL11.glMaterial(GL11.GL_FRONT, GL11.GL_DIFFUSE, diffuseMaterial);
GL11.glMaterial(GL11.GL_FRONT, GL11.GL_SPECULAR, specularMaterial);
GL11.glMaterialf(GL11.GL_FRONT, GL11.GL_SHININESS, 50.0f);
Of course this is not much code, but this is everything related to lighting. Did I make a mistake? I read that OpenGL is not as good as DirectX for lighting and shadowing.
That's what it looks like:
http://img199.imageshack.us/img199/7014/testrender.png
Can someone give me tips to make it look better?
I found one post with an awesome block landscape.
http://i.imgur.com/zIocp.jpg
That's how it should look :)
Neither OpenGL nor DirectX has anything to do with how good your lighting and shadowing look if you use the programmable pipeline. Normals become just another vertex attribute, which you can use for the lighting computation. The fixed-function pipeline is old and deprecated, and thus not recommended unless you are forced to use it.
Switching to shaders isn't really that hard, and you won't be limited by the fixed pipeline anymore; you then have complete control over how the lighting is computed, and you can easily output extra debug information (such as coloring surfaces based on their normals).
The screenshot you posted also shows visible ambient occlusion. Achieving this effect without shaders would be extremely hard and simply not worth the effort.
I happen to be doing a similar project myself; I wouldn't mention it if it weren't open source and publicly available. Here's a sample result:
You can find the lighting shader code here.
I'll post an excerpt to prevent links from rotting:
float CalcDirectionalLightFactor(vec3 lightDirection, vec3 normal) {
    float DiffuseFactor = dot(normalize(normal), -lightDirection);
    if (DiffuseFactor > 0) {
        return DiffuseFactor;
    }
    else {
        return 0.0;
    }
}
vec3 DiffuseColor = Light0.Color * Light0.DiffuseIntensity * CalcDirectionalLightFactor(Light0.Direction, normal);
Bartek's answer is a good one. You will want to go down the path of writing your own shaders, understanding what OpenGL provides for shadowing and lighting, and not relying on the older, deprecated lighting model. It is a lot more complex than glEnable(LIGHTING_AND_SHADOWING).
But if you just want to play with your code to see the colors change from binary black/white, one idea is turning off qaSpecularLight (which creates "glossy" all-white highlights that don't help you get a "matte" look) and setting glShadeModel to GL_SMOOTH shading.
That should help somewhat, but will not get you all the way to your goal. Follow Bartek's suggested path (or google for similar ideas).
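For example, a minimal sketch of those two tweaks, written as plain C-style GL calls (in LWJGL these map to the corresponding GL11.* methods, with FloatBuffers instead of arrays):
GLfloat noSpecular[] = { 0.0f, 0.0f, 0.0f, 1.0f };
glLightfv(GL_LIGHT0, GL_SPECULAR, noSpecular);   // no glossy highlight from the light
glMaterialfv(GL_FRONT, GL_SPECULAR, noSpecular); // and no specular response from the material
glShadeModel(GL_SMOOTH);                         // interpolate lighting across each face instead of flat shading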
Related
For practice I am setting up a 2D/orthographic rendering pipeline in OpenGL to be used for a simple game, but I am having issues related to the coordinate system.
In short, rotations distort 2D shapes, and I cannot seem to figure out why. I am also not entirely sure that my coordinate system is sound.
First I looked for previous answers; the most relevant one (2D opengl rotation causes sprite distortion) indicates that the problem was an incorrect ordering of transformations, but for now I am using just a view matrix and a projection matrix, multiplied in the correct order in the vertex shader:
gl_Position = projection * view * model * vec4(1.0); // (the model is just the identity matrix)
To summarize my setup so far:
- I am successfully uploading a quad that should stretch across the whole screen:
GLfloat vertices[] = {
-wf, hf, 0.0f, 0.0, 0.0, 1.0, 1.0, // top left
-wf, -hf, 0.0f, 0.0, 0.0, 1.0, 1.0, // bottom left
wf, -hf, 0.0f, 0.0, 0.0, 1.0, 1.0, // bottom right
wf, hf, 0.0f, 0.0, 0.0, 1.0, 1.0, // top right
};
GLuint indices[] = {
0, 1, 2, // first Triangle
2, 3, 0, // second Triangle
};
wf and hf are 1, and I am trying to use a -1 to 1 coordinate system so I don't need to scale by the resolution in shaders (though I am not sure that this is correct to do.)
My viewport and orthographic matrix:
glViewport(0, 0, SCREEN_WIDTH, SCREEN_HEIGHT);
...
glm::mat4 mat_ident(1.0f);
glm::mat4 mat_projection = glm::ortho(-1.0f, 1.0f, -1.0f, 1.0f, -1.0f, 1.0f);
... though this clearly does not factor in the screen width and height. I have seen others use width and height instead of 1s, but this seems to break the system or display nothing.
I rotate with a static method that modifies a struct containing a glm::quat (the time is divided by 1000 to get seconds):
main_cam.rotate((GLfloat)curr_time / TIME_UNIT_TO_SECONDS, 0.0f, 0.0f, 1.0f);
// which does: glm::angleAxis(angle, glm::vec3(x, y, z) * orientation)
Lastly, I pass the matrix as a uniform:
glUniformMatrix4fv(MAT_LOC, 1, GL_FALSE, glm::value_ptr(mat_projection * FreeCamera_calc_view_matrix(&main_cam) * mat_ident));
...and multiply in the vertex shader
gl_Position = u_matrix * vec4(a_position, 1.0);
v_position = a_position.xyz;
The full-screen quad rotates about its center (0, 0, as I wanted), but its width and height distort, which means I haven't set something up correctly.
My best guess is that I haven't created the right ortho matrix, but admittedly I have had trouble finding anything else on Stack Overflow or elsewhere that might help me debug this. Most answers suggest that the matrix multiplication order is wrong, but that is not the case here.
A secondary question is--should I not set my coordinates to 1/-1 in the context of a 2d game? I did so in order to make writing shaders easier. I am also concerned about character/object movement once I add model matrices.
What might be causing the issue? If I need to multiply the arguments to glm::ortho by width and height, then how do I transform coordinates so that v_position (my "in"/"varying" interpolated version of the position attribute) works in -1 to 1 as it should in a shader? What are the implications of choosing a particular coordinate system when it comes to ease of placing entities? The game will use sprites and textures, so I was considering a pixel coordinate system, but that quickly became very challenging to reason about on the shader side. I would much rather have THIS working.
Thank you for your help.
EDIT: Is it possible that my varying/interpolated v_position should be set to the calculated gl_Position value instead of the attribute position?
Try accounting for the aspect ratio of the window you are displaying on in the first two parameters of glm::ortho:
GLfloat aspectRatio = (GLfloat)SCREEN_WIDTH / (GLfloat)SCREEN_HEIGHT; // cast so the division is not done in integers
glm::mat4 mat_projection = glm::ortho(-aspectRatio, aspectRatio, -1.0f, 1.0f, -1.0f, 1.0f);
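As a concrete sketch of the consequence (reusing names from the question; this is illustrative, not the original code): with this projection one unit covers the same number of pixels in x and y, so rotations no longer distort, but the old full-screen quad now only spans the middle -1..1 band of the -aspectRatio..aspectRatio range. If you still want it to fill the window, stretch it with a model matrix instead of distorting the projection:
glm::mat4 mat_model = glm::scale(glm::mat4(1.0f), glm::vec3(aspectRatio, 1.0f, 1.0f)); // widen the quad, not the world
glm::mat4 mvp = mat_projection * FreeCamera_calc_view_matrix(&main_cam) * mat_model;
glUniformMatrix4fv(MAT_LOC, 1, GL_FALSE, glm::value_ptr(mvp));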
So I have begun learning OpenGL, reading the book "OpenGL SuperBible, 5th Edition". It explains things really well, and I have been able to create my first GL program myself! Just something simple: a rotating 3D pyramid.
Now, for some reason, one of the faces is not rendering. I checked the vertices (I plotted them on paper first) and they seemed to be right. I found out that if I changed the batch to draw a line loop, it would render; however, it would not render as a triangle. Can anyone explain why?
void setupRC()
{
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
shaderManager.InitializeStockShaders();
M3DVector3f vVerts1[] = {-0.5f,0.0f,-0.5f,0.0f,0.5f,0.0f,0.5f,0.0f,-0.5f}; // back face  (z = -0.5)
M3DVector3f vVerts2[] = {-0.5f,0.0f,-0.5f,0.0f,0.5f,0.0f,-0.5f,0.0f,0.5f}; // left face  (x = -0.5)
M3DVector3f vVerts3[] = {-0.5f,0.0f,0.5f,0.0f,0.5f,0.0f,0.5f,0.0f,0.5f};   // front face (z = +0.5)
M3DVector3f vVerts4[] = {0.5f,0.0f,0.5f,0.0f,0.5f,0.0f,0.5f,0.0f,-0.5f};   // right face (x = +0.5)
triangleBatch1.Begin(GL_LINE_LOOP, 3);
triangleBatch1.CopyVertexData3f(vVerts1);
triangleBatch1.End();
triangleBatch2.Begin(GL_TRIANGLES, 3);
triangleBatch2.CopyVertexData3f(vVerts2);
triangleBatch2.End();
triangleBatch3.Begin(GL_TRIANGLES, 3);
triangleBatch3.CopyVertexData3f(vVerts3);
triangleBatch3.End();
triangleBatch4.Begin(GL_TRIANGLES, 3);
triangleBatch4.CopyVertexData3f(vVerts4);
triangleBatch4.End();
glEnable(GL_CULL_FACE);
}
float rot = 1;
void renderScene()
{
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
GLfloat vRed[] = {1.0f, 0.0f, 0.0f, 0.5f};
GLfloat vBlue[] = {0.0f, 1.0f, 0.0f, 0.5f};
GLfloat vGreen[] = {0.0f, 0.0f, 1.0f, 0.5f};
GLfloat vWhite[] = {1.0f, 1.0f, 1.0f, 0.5f};
M3DMatrix44f transformMatrix;
if (rot >= 360)
rot = 0;
else
rot = rot + 1;
m3dRotationMatrix44(transformMatrix,m3dDegToRad(rot),0.0f,1.0f,0.0f);
shaderManager.UseStockShader(GLT_SHADER_FLAT, transformMatrix, vRed);
triangleBatch1.Draw();
shaderManager.UseStockShader(GLT_SHADER_FLAT, transformMatrix, vBlue);
triangleBatch2.Draw();
shaderManager.UseStockShader(GLT_SHADER_FLAT, transformMatrix, vGreen);
triangleBatch3.Draw();
shaderManager.UseStockShader(GLT_SHADER_FLAT, transformMatrix, vWhite);
triangleBatch4.Draw();
glutSwapBuffers();
glutPostRedisplay();
Sleep(10);
}
You've most likely defined the vertices in clockwise order for the triangle that isn't showing, and in counterclockwise order (normally the default) for those that are. Clockwise winding essentially creates an inward-facing normal, so OpenGL won't bother to render that face when culling is enabled.
The easiest way to check this is to call glCullFace(GL_FRONT); that should toggle the behavior so you see the missing triangle and no longer see the other three.
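If you want to check a face numerically rather than by toggling the cull face, here is a small standalone sketch (not part of the SuperBible code; the names and the assumed +Z viewer position are my own) that reports whether a triangle is wound counter-clockwise as seen from a given eye position:
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3  cross(Vec3 a, Vec3 b) { return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x }; }
static float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Front-facing (with the default glFrontFace(GL_CCW)) means the winding is
// counter-clockwise as seen from the eye position.
bool isFrontFacing(Vec3 v0, Vec3 v1, Vec3 v2, Vec3 eye) {
    Vec3 normal = cross(sub(v1, v0), sub(v2, v0)); // face normal implied by the winding
    Vec3 toEye  = sub(eye, v0);                    // direction from the face towards the viewer
    return dot(normal, toEye) > 0.0f;
}

int main() {
    // The face at z = +0.5 from the question, checked from a viewer on the +Z axis:
    Vec3 a{-0.5f, 0.0f, 0.5f}, b{0.0f, 0.5f, 0.0f}, c{0.5f, 0.0f, 0.5f};
    std::printf("front-facing: %s\n", isFrontFacing(a, b, c, Vec3{0.0f, 0.0f, 3.0f}) ? "yes" : "no");
}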
The only thing I see that affects polygons here is glEnable(GL_CULL_FACE);.
You shouldn't have that, because if you plot your vertices backwards, the polygon won't render.
Remove it or actually call glDisable(GL_CULL_FACE); to be sure.
In your case, it's not likely that you want to draw a polygon that you can see from one side only.
I'm using glColor4f(1.0f, 1.0f, 1.0f, alpha_); to set the transparency of the primitives I'm drawing.
However I'd like to be able to read the current opengl alpha value. Is that possible?
e.g.
float current_alpha = glGetAlpha(); //???
glColor4f(1.0f, 1.0f, 1.0f, alpha_*current_alpha);
Either you store the last alpha value you sent with glColor4f, or you use:
float currentColor[4];
glGetFloatv(GL_CURRENT_COLOR, currentColor); // the current alpha is currentColor[3]
Do you mean the alpha value of the fragment you're drawing on (which would explain why you want alpha_ * current_alpha)? If so, remember that reading a fragment back from the pipeline is expensive.
If you're rendering back to front, consider using the standard GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA blending trick.
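That blending setup, as a minimal sketch (not from the original post; it reuses the alpha_ variable from the question):
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); // out = src.a * src + (1 - src.a) * dst
// Draw opaque geometry first, then the transparent primitives sorted far to near,
// each with its own alpha supplied via glColor4f(1.0f, 1.0f, 1.0f, alpha_);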
This is a simple issue that I'm somewhat ashamed to ask for help on.
I'm making a simple call to gluSphere to render a sphere; however, it does not light properly, even though I'm pretty sure I added the normals and lighting correctly. If, however, I add a texture, the model lights normally, except the shading always seems to be SMOOTH and I cannot change it to flat.
This is the lighting code in my init() function:
gl.glLightfv( GL.GL_LIGHT0, GL.GL_AMBIENT , AMBIENT_LIGHT, 0 );
gl.glLightfv( GL.GL_LIGHT0, GL.GL_DIFFUSE , DIFFUSE_LIGHT, 0 );
gl.glLightfv( GL.GL_LIGHT0, GL.GL_POSITION, light_pos , 0 );
gl.glEnable ( GL.GL_LIGHT0 );
gl.glEnable ( GL.GL_LIGHTING );
this is my sphere code in my display() function:
gl.glColor3d(1.0, 1.0, 1.0);
glu.gluQuadricDrawStyle (quad, GLU.GLU_FILL);
glu.gluQuadricNormals (quad, GLU.GLU_FLAT);
glu.gluQuadricOrientation(quad, GLU.GLU_OUTSIDE);
glu.gluSphere(quad, 1.0, lat, lon);
Please advise.
EDIT:
light values:
public final static float[] DIFFUSE_LIGHT = { 1.0f, 1.0f, 1.0f, 1.0f };
public final static float[] AMBIENT_LIGHT = { 0.3f, 0.3f, 0.3f, 1.0f };
public float[] light_pos = { -2.0f, 2.0f, 10.0f, 0.0f };
added materials, no change:
gl.glMaterialfv(GL.GL_FRONT, GL.GL_AMBIENT , new float[]{0.5f, 0.5f, 0.5f, 1.0f}, 0);
gl.glMaterialfv(GL.GL_FRONT, GL.GL_DIFFUSE , new float[]{1.0f, 1.0f, 1.0f, 1.0f}, 0);
gl.glMaterialfv(GL.GL_FRONT, GL.GL_SPECULAR, new float[]{0.7f, 0.7f, 0.7f, 1.0f}, 0);
gl.glMaterialf (GL.GL_FRONT, GL.GL_SHININESS, 0.5f);
gl.glMaterialfv(GL.GL_FRONT, GL.GL_EMISSION, new float[]{0.3f, 0.3f, 0.3f, 0.0f}, 0);
EDIT2:
Blah, I figured out I had a:
gl.glEnable(GL.GL_TEXTURE_2D);
active somewhere, and it was causing my model to have no shading when there was no texture associated with it. -_- Carry on, good people, carry on.
I encountered a lighting problem too, and figured out that I needed to add glEnable(GL_NORMALIZE).
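For context, a small sketch of when that matters (quad, lat and lon reuse names from the question; this is not code from the thread): the modelview transform also affects the normals, so any scaling leaves them non-unit-length and the diffuse term comes out too bright or too dark. GL_NORMALIZE renormalizes them:
glEnable(GL_NORMALIZE);          // renormalize normals after the modelview transform
glPushMatrix();
glScalef(2.0f, 2.0f, 2.0f);      // without GL_NORMALIZE this scale would also scale the normals
gluSphere(quad, 1.0, lat, lon);
glPopMatrix();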
What kind of lighting are you expecting? The third parameter to glLightfv is supposed to be the values of the light you are setting.
Are there any translations/rotations you do that may affect the position of the light?
Is GL_COLOR_MATERIAL enabled or disabled? It may overwrite your material settings if it is.
Even if your code is not enabling GL_TEXTURE_2D, you should try disabling it manually right before the line is reached, just in case.
If you wish to ignore gluSphere completely, there's some code in this thread that you can use.
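Following the GL_TEXTURE_2D suggestion above (and what the question's EDIT2 eventually confirmed), a minimal sketch in C-style GL calls, reusing quad, lat and lon from the question:
glDisable(GL_TEXTURE_2D);        // leftover texture state can override the lit, untextured color
gluSphere(quad, 1.0, lat, lon);
glEnable(GL_TEXTURE_2D);         // re-enable only for the objects that actually have textures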
For my project I needed to rotate a rectangle. I thought that would be easy, but I'm getting unpredictable behavior when running it.
Here is the code:
glPushMatrix();
glRotatef(30.0f, 0.0f, 0.0f, 1.0f);
glTranslatef(vec_vehicle_position_.x, vec_vehicle_position_.y, 0);
glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBegin(GL_QUADS);
glTexCoord2f(0.0f, 0.0f);
glVertex2f(0, 0);
glTexCoord2f(1.0f, 0.0f);
glVertex2f(width_sprite_, 0);
glTexCoord2f(1.0f, 1.0f);
glVertex2f(width_sprite_, height_sprite_);
glTexCoord2f(0.0f, 1.0f);
glVertex2f(0, height_sprite_);
glEnd();
glDisable(GL_BLEND);
glDisable(GL_TEXTURE_2D);
glPopMatrix();
The problem is that my rectangle translates somewhere else in the window while rotating. In other words, the rectangle doesn't keep its position: vec_vehicle_position_.x and vec_vehicle_position_.y.
What's the problem?
Thanks
You need to flip the order of your transformations:
glRotatef(30.0f, 0.0f, 0.0f, 1.0f);
glTranslatef(vec_vehicle_position_.x, vec_vehicle_position_.y, 0);
becomes
glTranslatef(vec_vehicle_position_.x, vec_vehicle_position_.y, 0);
glRotatef(30.0f, 0.0f, 0.0f, 1.0f);
To elaborate on the previous answers.
Transformations in OpenGL are performed via matrix multiplication. In your example you have:
M_r - the rotation transform
M_t - the translation transform
v - a vertex
and you had applied them in the order:
M_r * M_t * v
Using parentheses to clarify:
( M_r * ( M_t * v ) )
We see that the vertex is transformed by the closer matrix first, which in this case is the translation. It can be a bit counterintuitive, because it requires you to specify the transformations in the order opposite to the one you want them applied in. But if you think of how the transforms are combined on the matrix stack it should hopefully make a bit more sense (each new transform is multiplied onto the right of the current matrix, so it is applied to the vertex first).
That is why, in order to get your desired result, you need to specify the transforms in the opposite order.
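To make the difference concrete, here is a small standalone sketch (it uses GLM just to do the arithmetic; the numbers are made up for illustration). The same 30° rotation and translation are applied to one vertex of the quad in both orders:
#include <cstdio>
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

int main() {
    glm::vec4 corner(10.0f, 0.0f, 0.0f, 1.0f); // a vertex of the sprite, defined around the origin
    glm::mat4 I(1.0f);
    glm::mat4 R = glm::rotate(I, glm::radians(30.0f), glm::vec3(0.0f, 0.0f, 1.0f));
    glm::mat4 T = glm::translate(I, glm::vec3(100.0f, 50.0f, 0.0f));

    glm::vec4 a = R * T * corner; // glRotatef then glTranslatef: the already-translated vertex
                                  // is swung around the world origin, so the sprite drifts away
    glm::vec4 b = T * R * corner; // glTranslatef then glRotatef: the vertex rotates in place
                                  // around the sprite's own origin, then moves to its position
    std::printf("R*T*v = (%.1f, %.1f)\n", a.x, a.y);
    std::printf("T*R*v = (%.1f, %.1f)\n", b.x, b.y);
}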
Inertiatic provided a very good response. From a code perspective, your transformations will happen in the reverse order they appear. In other words, transforms closer to the actual drawing code will be applied first.
For example:
glRotate();
glTranslate();
glScale();
drawMyThing();
Will first scale your thing, then translate it, then rotate it. You effectively need to "read your code backwards" to figure out which transforms are being applied in which order. Also keep in mind what the state of these transforms is when you push and pop the model-view stack.
Make sure the rotation is applied before the translation, i.e. call glTranslatef before glRotatef in your code.