I am trying to render 3D prisms in LWJGL OpenGL with flat shading. For example, I have a cube indexed as follows:
I only have 8 vertices in the vertex buffer, which I have indexed as above. Is there any way to implement flat normal shading on the cube such as below? I don't want to rewrite my vertex and index buffers to include duplicate vertices if possible.
If you don't need any other attributes (e.g. texture coordinates), then it is possible to create a cube mesh with face normal vectors using only 8 vertices. Use the flat interpolation qualifier for the normal vector.
Vertex shader:
flat out vec3 surfaceNormal;
Fragment shader:
flat in vec3 surfaceNormal;
When the flat qualifier is used, the output of the vertex shader is not interpolated across the primitive. The value passed to the fragment shader is the attribute of a single vertex of the primitive, the Provoking vertex.
For GL_TRIANGLES primitives this is either the last or the first vertex of each triangle. Which one can be chosen with glProvokingVertex.
Choose the first vertex:
glProvokingVertex(GL_FIRST_VERTEX_CONVENTION);
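Put together, a minimal shader pair might look like the sketch below (the attribute locations, the mvp uniform and the simple diffuse term are illustrative assumptions, not part of the question's code):

```glsl
// vertex shader
#version 330 core
layout(location = 0) in vec3 inPos;
layout(location = 1) in vec3 inNormal;
uniform mat4 mvp;
flat out vec3 surfaceNormal;      // "flat": no interpolation across the triangle
void main() {
    surfaceNormal = inNormal;     // only the provoking vertex's value is used
    gl_Position = mvp * vec4(inPos, 1.0);
}

// fragment shader
#version 330 core
flat in vec3 surfaceNormal;       // must be declared "flat" here as well
out vec4 fragColor;
void main() {
    float diffuse = max(dot(normalize(surfaceNormal),
                            normalize(vec3(1.0, 1.0, 1.0))), 0.0);
    fragColor = vec4(vec3(0.1 + 0.9 * diffuse), 1.0);
}
```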
For the order of the points of your cube mesh (image in the question)
front back
1 3 7 5
+---+ +---+
| | | |
+---+ +---+
0 2 6 4
you have to setup the following vertex coordinates and normal vectors:
// x y z nx, ny, nz
-1, -1, -1, 0, -1, 0, // 0, nv front
-1, -1, 1, 0, 0, 1, // 1, nv top
1, -1, -1, 0, 0, 0, // 2
1, -1, 1, 1, 0, 0, // 3, nv right
1, 1, -1, 0, 1, 0, // 4, nv back
1, 1, 1, 0, 0, 0, // 5
-1, 1, -1, 0, 0, -1, // 6, nv bottom
-1, 1, 1, -1, 0, 0, // 7, nv left
Define the indices in that way, that the vertices 7, 3, 0, 4, 6, 1 are the first vertex for both triangles of the left, right, front, back, bottom and top of the cube:
0, 2, 3, 0, 3, 1, // front
4, 6, 7, 4, 7, 5, // back
3, 2, 4, 3, 4, 5, // right
7, 6, 0, 7, 0, 1, // left
6, 4, 2, 6, 2, 0, // bottom
1, 3, 5, 1, 5, 7 // top
Draw 12 triangle primitives. e.g:
glDrawElements(GL_TRIANGLES, 36, GL_UNSIGNED_INT, 0);
For flat shading, it is better to use a geometry shader to compute the normals for each primitive. Although you can use the provoking-vertex method when rendering a cube, you cannot use it for certain geometric objects where the number of faces exceeds the number of vertices: e.g. consider the polyhedron obtained by gluing two tetrahedra at their base triangle. Such an object has 6 triangles but only 5 vertices (note that Euler's formula still holds: v-e+f = 5-9+6 = 2), so there are not enough vertices to send the face normals through. Even when it is possible, another reason not to use the provoking-vertex method is that it is inconvenient: you would have to enumerate the vertices in such a way that each vertex uniquely 'represents' a single face, so that you can associate the face normal with it.
In a nutshell, just use a geometry shader; it is much simpler and, more importantly, much more robust. Not to mention that the normal calculations are done on the fly inside the GPU, rather than you having to set them up on the CPU, create and bind the necessary buffers, and define attributes, which increases both the set-up cost and eats into the memory bandwidth between the CPU and the GPU.
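As a sketch of that approach (GLSL 3.3 assumed; it also presumes the vertex shader forwards positions in the space you light in, e.g. world space, since deriving a normal from clip-space coordinates would give wrong lighting), a pass-through geometry shader can derive the face normal from the triangle itself, so the vertex buffer needs no normals at all:

```glsl
#version 330 core
layout(triangles) in;
layout(triangle_strip, max_vertices = 3) out;

flat out vec3 surfaceNormal;

void main() {
    // Face normal from two triangle edges; no per-vertex normals needed.
    vec3 a = gl_in[1].gl_Position.xyz - gl_in[0].gl_Position.xyz;
    vec3 b = gl_in[2].gl_Position.xyz - gl_in[0].gl_Position.xyz;
    vec3 n = normalize(cross(a, b));
    for (int i = 0; i < 3; ++i) {
        surfaceNormal = n;         // same value for all three vertices
        gl_Position = gl_in[i].gl_Position;
        EmitVertex();
    }
    EndPrimitive();
}
```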
I created a Bézier surface in OpenGL this way:
GLfloat punktyWSP[5][5][3] = {
{ {0,0,4}, {1,0,4},{2,0,4},{3,0,4},{4,1,4}},
{ {0,0,3}, {1,1,3},{2,1,3},{3,1,3},{4,1,3} },
{ {0,1,2}, {1,2,2},{2,6,2},{3,2,2},{4,1,2} },
{ {0,0,1}, {1,1,1},{2,1,1},{3,1,1},{4,1,1} },
{ {0,0,0}, {1,0,0},{2,0,0},{3,0,0},{4,1,0} }
};
glMap2f(GL_MAP2_VERTEX_3, 0, 1, 3, 5, 0, 1, 15, 5, &punktyWSP[0][0][0]);
glEnable(GL_MAP2_VERTEX_3);
glMapGrid2f(u, 0, 1, v, 0, 1);
glShadeModel(GL_FLAT);
glEnable (GL_AUTO_NORMAL);
glEvalMesh2(GL_FILL, 0, u, 0, v);
Now I want to texture it.
Is there any way to generate texture coordinates for my surface automatically, the way normals are generated with glEnable(GL_AUTO_NORMAL)?
If there is no such function, do you have any idea how to add coordinates to my surface? Maybe glEnable(GL_MAP2_TEXTURE_COORD_2)?
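For what it's worth, the fixed-function evaluators do support this via a second 2D map. The sketch below is an assumption-laden example: a 2x2 control grid of (s,t) values maps the (u,v) domain linearly onto [0,1]x[0,1]. Once such a map is enabled, glEvalMesh2 emits texture coordinates along with vertices:

```c
/* 2x2 grid of (s,t) control points: the corners of the texture */
static const GLfloat texPoints[2][2][2] = {
    { { 0.0f, 0.0f }, { 1.0f, 0.0f } },
    { { 0.0f, 1.0f }, { 1.0f, 1.0f } }
};

glMap2f(GL_MAP2_TEXTURE_COORD_2,
        0, 1, 2, 2,   /* u range, u stride (2 floats), u order */
        0, 1, 4, 2,   /* v range, v stride (4 floats), v order */
        &texPoints[0][0][0]);
glEnable(GL_MAP2_TEXTURE_COORD_2);
/* glEvalMesh2 / glEvalCoord2f now generate texture coordinates too */
```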
I'm developing a 2D game called Spaceland and I've run into a problem with clearing the screen. Whenever I call glClear(GL_COLOR_BUFFER_BIT) every frame, it keeps my screen black until I stop calling it. I have tested this by binding glClear() to a key: while I hold it down the screen turns black, and when it is not pressed the quad that is spreading across the screen just grows until I clear again.
I am using glClearColor(0, 0, 0, 1) when I create a window. I have tried turning off and on glfwSwapInterval().
create() function in my Window class:
public void create(boolean vsync) {
GLFWErrorCallback.createPrint(System.err).set();
GLFWVidMode vid = glfwGetVideoMode(glfwGetPrimaryMonitor());
keys = new boolean[GLFW_KEY_LAST];
for (int i = 0; i < GLFW_KEY_LAST; i ++) {
keys[i] = false;
}
glfwWindowHint(GLFW_RESIZABLE, GLFW_FALSE);
ID = glfwCreateWindow(vid.width(), vid.height(), TITLE, glfwGetPrimaryMonitor(), 0);
if (ID == 0)
throw new IllegalStateException("Error whilst creating window: '" + TITLE + "'");
glfwMakeContextCurrent(ID);
createCapabilities();
glClearColor(0, 0, 0, 1);
camera = new Camera(getWidth(), getHeight());
glfwSwapInterval(vsync ? 1 : 0);
}
Sprite Class:
public class Sprite {
private VertexArray vao;
private VertexBuffer
pVbo,
iVbo;
private int vertexCount;
private float scale;
private Vector3f position;
private Vector3f rotation;
private Matrix4f tMatrix;
public Sprite(float[] pos, int[] indices) {
vertexCount = indices.length;
position = new Vector3f(0, 0, 0);
rotation = new Vector3f(0, 0, 0);
scale = 0.1f;
tMatrix = MatrixHelper.createTransformationMatrix(position, rotation, scale);
vao = new VertexArray();
pVbo = new VertexBuffer(false);
iVbo = new VertexBuffer(true);
vao.bind();
pVbo.bind();
pVbo.add(pos);
vao.add();
pVbo.unbind();
iVbo.bind();
iVbo.add(indices);
iVbo.unbind();
vao.unbind();
}
public void setPosition(float x, float y, float z) {
position.x = x;
position.y = y;
position.z = z;
}
public void setRotation(Vector3f rot) {
rotation = rot;
}
public void render(int renderType) {
MatrixHelper.setTMatrixPosition(tMatrix, position);
setPosition(getPosition().x + 0.0001f, 0, 0);
System.out.println(tMatrix);
Spaceland.shader.bind();
Spaceland.shader.editValue("transformation", tMatrix);
vao.bind();
glEnableVertexAttribArray(0);
iVbo.bind();
glDrawElements(renderType, vertexCount, GL_UNSIGNED_INT, 0);
iVbo.unbind();
glDisableVertexAttribArray(0);
vao.unbind();
Spaceland.shader.unbind();
}
public Vector3f getPosition() {
return position;
}
}
I don't think you need to see my Camera class or MatrixHelper class, as the problem occurred before I implemented them.
Main class (ignore rose[] and roseI[] it's just a cool pattern I made as a test):
public class Spaceland {
public static Window window;
public static Sprite sprite;
public static Shader shader;
public static float[] rose = {
-0.45f, 0f,
0.45f, 0f,
0f, 0.45f,
0f, -0.45f,
-0.4f, -0.2f,
-0.4f, 0.2f,
0.4f, -0.2f,
0.4f, 0.2f,
-0.2f, -0.4f,
-0.2f, 0.4f,
0.2f, -0.4f,
0.2f, 0.4f
};
public static int[] roseI = {
0, 1, 0, 2, 0, 3, 0, 4, 0, 5, 0, 6, 0, 7, 0, 8, 0, 9, 0, 10, 0, 11,
1, 2, 1, 3, 1, 4, 1, 5, 1, 6, 1, 7, 1, 8, 1, 9, 1, 10, 1, 11,
2, 3, 2, 4, 2, 5, 2, 6, 2, 7, 2, 8, 2, 9, 2, 10, 2, 11,
3, 4, 3, 5, 3, 6, 3, 7, 3, 8, 3, 9, 3, 10, 3, 11,
4, 5, 4, 6, 4, 7, 4, 8, 4, 9, 4, 10, 4, 11,
5, 6, 5, 7, 5, 8, 5, 9, 5, 10, 5, 11,
6, 7, 6, 8, 6, 9, 6, 10, 6, 11,
7, 8, 7, 9, 7, 10, 7, 11,
8, 9, 8, 10, 8, 11,
9, 10, 9, 11,
10, 11,
};
public static float[] quad = {
0.5f, 0.5f,
0.5f, -0.5f,
-0.5f, 0.5f,
-0.5f, -0.5f
};
public static int[] quadI = {
2, 0, 3,
0, 1, 3
};
public static void main(String[] args) {
init();
}
public static void loop() {
while (!window.isCloseRequested()) {
update();
render();
}
destroy(0);
}
public static void init() {
if (!glfwInit())
throw new IllegalStateException("Error whilst initialising GLFW");
window = new Window("Spaceland");
window.create(true);
shader = new Shader("src/main/java/com/spaceland/graphics/fragment.fs", "src/main/java/com/spaceland/graphics/vertex.vs");
sprite = new Sprite(quad, quadI);
loop();
}
public static void render() {
window.render();
sprite.render(GL11.GL_TRIANGLES);
}
public static void update() {
window.update();
if (window.isDown(GLFW_KEY_SPACE)) {
glClear(GL_COLOR_BUFFER_BIT);
}
}
public static void destroy(int error) {
window.destroy();
glfwTerminate();
glfwSetErrorCallback(null).free();
shader.destroy();
VertexBuffer.deleteAll();
VertexArray.destroyAll();
System.exit(error);
}
}
Please tell me if you need to see the Shader class, shader vs and fs files, or anything else.
Thanks!
glClear affects the output buffers. So it is part of rendering. If you want to clear as part of your rendering, put glClear inside your render function.
You have it inside update. I suspect that whoever is calling render and update (LWJGL, presumably?) doesn't guarantee any particular ordering between them. So each time you're asked to update, you're stomping on top of the last thing you rendered.
Updates:
adjust internal state, usually partly as a function of time.
Renders:
capture current state visually.
It is not very clear in my question, but the answer is that I was clearing the screen, swapping buffers, and then rendering, which doesn't work.
glClear(...);
glfwSwapBuffers(...);
...render...
This is how it was, and it doesn't work.
glClear(...);
...render...
glfwSwapBuffers(...);
This is how I do it now, and it works fine.
I have a shape with the following vertices and faces:
static Vec3f cubeVerts[24] = {
{ -0.5, 0.5, -0.5 }, /* backside */
{ -0.5, -0.5, -0.5 },
{ -0.3, 4.0, -0.5 },
{ -0.3, 3.0, -0.5 },
{ -0.1, 5.5, -0.5 },
{ -0.1, 4.5, -0.5 },
{ 0.1, 5.5, -0.5 },
{ 0.1, 4.5, -0.5 },
{ 0.3, 4.0, -0.5 },
{ 0.3, 3.0, -0.5 },
{ 0.5, 0.5, -0.5 },
{ 0.5, -0.5, -0.5 },
{ -0.5, 0.5, 0.5 }, /* frontside */
{ -0.5, -0.5, 0.5 },
{ -0.3, 4.0, 0.5 },
{ -0.3, 3.0, 0.5 },
{ -0.1, 5.5, 0.5 },
{ -0.1, 4.5, 0.5 },
{ 0.1, 5.5, 0.5 },
{ 0.1, 4.5, 0.5 },
{ 0.3, 4.0, 0.5 },
{ 0.3, 3.0, 0.5 },
{ 0.5, 0.5, 0.5 },
{ 0.5, -0.5, 0.5 }
};
static GLuint cubeFaces[] = {
0, 1, 3, 2, /*backfaces*/
2, 3, 5, 4,
4, 5, 7, 6,
6, 7, 9, 8,
8, 9, 11, 10,
12, 13, 15, 14, /*frontfaces*/
14, 15, 17, 16,
16, 17, 19, 18,
18, 19, 21, 20,
20, 21, 23, 22,
0, 2, 14, 12, /*topfaces*/
2, 4, 16, 14,
4, 6, 18, 16,
6, 8, 20, 18,
8, 10, 22, 20,
1, 3, 15, 13, /*bottomfaces*/
3, 5, 17, 15,
5, 7, 19, 17,
7, 9, 21, 19,
9, 11, 23, 21,
0, 1, 13, 12, /*sidefaces*/
10, 11, 23, 22
};
and I want to get its normals like this:
static Vec3f cubeNorms[] = {
{ 0, 1, 0 },
{ 0, 1, 0 },
{ 0, 1, 0 },
{ 0, 1, 0 }
};
Can someone tell me how to calculate the normals and put them inside an array so I can use them all together like this? I know something is wrong with my normals, because the lighting on my shape is not right, and I am also not sure if this is the right way of setting up the normals. Just one example is fine; I've been reading heaps about normal calculations and still can't figure out how to do it.
static void drawCube()
{
//vertexes
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, cubeVerts);
//norms
glEnableClientState(GL_NORMAL_ARRAY);
glNormalPointer(GL_FLOAT, 0, cubeNorms);
//faces
glDrawElements(GL_QUADS, 22 * 4, GL_UNSIGNED_INT, cubeFaces);
}
I'm going to assume your faces are counter-clockwise front-facing - I don't know if that's the case - and the quads are, of course, convex and planar.
For a face, take vertices {0, 1, 2}. I don't know the Vec3f specification (or if it's a class or C struct), but we can find the normal for all vertices in the quad with:
Vec3f va = v0 - v1; // quad vertex 1 -> 0
Vec3f vb = v2 - v1; // quad vertex 1 -> 2
Vec3f norm = cross(vb, va); // cross product.
float norm_len = sqrt(dot(norm, norm));
norm /= norm_len; // divide each component of norm by norm_len.
That gives you a unit normal for that face. If vertices are shared, and you want to give the model the perception of curvature using lighting, you'll have to decide what value of the normal should be 'agreed' upon. Perhaps the best starting point is to simply take an average of the face normals at that vertex - and rescale the result to unit length as required.
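A compact sketch of the whole procedure (the Vec3f struct and helper functions below are stand-ins, since the question's Vec3f definition isn't shown; the per-vertex averaging is the simple scheme just described):

```cpp
#include <cmath>
#include <cstddef>

struct Vec3f { float x, y, z; };

static Vec3f sub(Vec3f a, Vec3f b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3f add(Vec3f a, Vec3f b)   { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
static Vec3f cross(Vec3f a, Vec3f b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}
static Vec3f normalize(Vec3f v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

/* Unit normal of one quad face {i0, i1, i2, i3}, assuming CCW winding. */
static Vec3f faceNormal(const Vec3f* verts, const unsigned* quad) {
    Vec3f va = sub(verts[quad[0]], verts[quad[1]]); // quad vertex 1 -> 0
    Vec3f vb = sub(verts[quad[2]], verts[quad[1]]); // quad vertex 1 -> 2
    return normalize(cross(vb, va));
}

/* Accumulate each face's normal at its four vertices, then re-normalize,
   giving one averaged normal per shared vertex. */
static void vertexNormals(const Vec3f* verts, std::size_t nVerts,
                          const unsigned* quads, std::size_t nQuads,
                          Vec3f* out) {
    for (std::size_t i = 0; i < nVerts; ++i) out[i] = { 0, 0, 0 };
    for (std::size_t f = 0; f < nQuads; ++f) {
        Vec3f n = faceNormal(verts, quads + 4 * f);
        for (int k = 0; k < 4; ++k) {
            unsigned idx = quads[4 * f + k];
            out[idx] = add(out[idx], n);
        }
    }
    for (std::size_t i = 0; i < nVerts; ++i) out[i] = normalize(out[i]);
}
```

The resulting array has one normal per vertex, which is exactly what glNormalPointer(GL_FLOAT, 0, cubeNorms) expects. Note that averaging deliberately smooths shared edges; for hard-edged lighting you would have to duplicate vertices per face instead.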
I have to make a sphere out of smaller, uniformly distributed balls. I think the optimal way is to build a triangle-based geodesic sphere and use the vertices as the centre points of my balls, but I fail to write an algorithm generating the vertices.
An answer in C++ or pseudo-code would be better.
Example of a geodesic sphere: http://i.stack.imgur.com/iNQfP.png
Using the link #Muckle_ewe gave me, I was able to code the following algorithm:
Outside the main()
class Vector3d { // this is a pretty standard vector class
public:
double x, y, z;
...
}
void subdivide(const Vector3d &v1, const Vector3d &v2, const Vector3d &v3, vector<Vector3d> &sphere_points, const unsigned int depth) {
if(depth == 0) {
sphere_points.push_back(v1);
sphere_points.push_back(v2);
sphere_points.push_back(v3);
return;
}
const Vector3d v12 = (v1 + v2).norm();
const Vector3d v23 = (v2 + v3).norm();
const Vector3d v31 = (v3 + v1).norm();
subdivide(v1, v12, v31, sphere_points, depth - 1);
subdivide(v2, v23, v12, sphere_points, depth - 1);
subdivide(v3, v31, v23, sphere_points, depth - 1);
subdivide(v12, v23, v31, sphere_points, depth - 1);
}
void initialize_sphere(vector<Vector3d> &sphere_points, const unsigned int depth) {
const double X = 0.525731112119133606;
const double Z = 0.850650808352039932;
const Vector3d vdata[12] = {
{-X, 0.0, Z}, { X, 0.0, Z }, { -X, 0.0, -Z }, { X, 0.0, -Z },
{ 0.0, Z, X }, { 0.0, Z, -X }, { 0.0, -Z, X }, { 0.0, -Z, -X },
{ Z, X, 0.0 }, { -Z, X, 0.0 }, { Z, -X, 0.0 }, { -Z, -X, 0.0 }
};
int tindices[20][3] = {
{0, 4, 1}, { 0, 9, 4 }, { 9, 5, 4 }, { 4, 5, 8 }, { 4, 8, 1 },
{ 8, 10, 1 }, { 8, 3, 10 }, { 5, 3, 8 }, { 5, 2, 3 }, { 2, 7, 3 },
{ 7, 10, 3 }, { 7, 6, 10 }, { 7, 11, 6 }, { 11, 0, 6 }, { 0, 1, 6 },
{ 6, 1, 10 }, { 9, 0, 11 }, { 9, 11, 2 }, { 9, 2, 5 }, { 7, 2, 11 }
};
for(int i = 0; i < 20; i++)
subdivide(vdata[tindices[i][0]], vdata[tindices[i][1]], vdata[tindices[i][2]], sphere_points, depth);
}
Then in the main():
vector<Vector3d> sphere_points;
initialize_sphere(sphere_points, DEPTH); // where DEPTH should be the subdivision depth
for(const Vector3d &point : sphere_points)
const Vector3d point_tmp = point * RADIUS + CENTER; // for each sphere I want to draw, scale the precomputed unit-sphere point by RADIUS and translate it to CENTER
You actually only need to call initialize_sphere() once and can reuse the result for every sphere you want to draw.
I've done this before for a graphics project; the algorithm I used is detailed on this website:
http://www.opengl.org.ru/docs/pg/0208.html
Just ignore the OpenGL drawing calls and only code up the parts that deal with creating the actual vertices.
There are well known algorithms to triangulate surfaces. You should be able to use the GNU Triangulated Surface Library to generate a suitable mesh if you don't want to code one of them up yourself.
It depends on the number of triangles you want the sphere to have. You can potentially have infinite resolution.
First focus on creating a dome; you can complete the sphere later by taking the negative coordinates of your upper dome. You will generate the sphere by interlocking rows of triangles.
Your triangles are equilateral, so decide on a side length:
divide the circumference 2(pi)r by the number of triangles you want on the bottom row of the dome.
This will be the length of each side of each triangle.
Next you need to create a concentric circle that intersects the surface of the sphere.
Between this circle and the base of the dome will be your first row.
You will need to find the angle that each triangle is tilted. (I will post later when I figure that out)
Repeat the process for each concentric circle (generating a row) until the height of a row times the number of rows approximately equals the 2(pi)r that you started with.
I will try to program it later if I get a chance. You could also try posting in the Math forum.