Algorithm for a geodesic sphere - C++

I have to make a sphere out of smaller, uniformly distributed balls. I think the optimal way is to build a triangle-based geodesic sphere and use its vertices as the centre points of my balls, but I am failing to write an algorithm that generates those vertices.
An answer in C++ or pseudo-code would be best.
Example of a geodesic sphere: http://i.stack.imgur.com/iNQfP.png

Using the link #Muckle_ewe gave me, I was able to code the following algorithm:
Outside main():
class Vector3d { // this is a pretty standard vector class
public:
    double x, y, z;
    ...
};
void subdivide(const Vector3d &v1, const Vector3d &v2, const Vector3d &v3, vector<Vector3d> &sphere_points, const unsigned int depth) {
    if (depth == 0) {
        sphere_points.push_back(v1);
        sphere_points.push_back(v2);
        sphere_points.push_back(v3);
        return;
    }
    const Vector3d v12 = (v1 + v2).norm();
    const Vector3d v23 = (v2 + v3).norm();
    const Vector3d v31 = (v3 + v1).norm();
    subdivide(v1, v12, v31, sphere_points, depth - 1);
    subdivide(v2, v23, v12, sphere_points, depth - 1);
    subdivide(v3, v31, v23, sphere_points, depth - 1);
    subdivide(v12, v23, v31, sphere_points, depth - 1);
}
void initialize_sphere(vector<Vector3d> &sphere_points, const unsigned int depth) {
    const double X = 0.525731112119133606;
    const double Z = 0.850650808352039932;
    const Vector3d vdata[12] = {
        { -X, 0.0, Z }, { X, 0.0, Z }, { -X, 0.0, -Z }, { X, 0.0, -Z },
        { 0.0, Z, X }, { 0.0, Z, -X }, { 0.0, -Z, X }, { 0.0, -Z, -X },
        { Z, X, 0.0 }, { -Z, X, 0.0 }, { Z, -X, 0.0 }, { -Z, -X, 0.0 }
    };
    const int tindices[20][3] = {
        { 0, 4, 1 }, { 0, 9, 4 }, { 9, 5, 4 }, { 4, 5, 8 }, { 4, 8, 1 },
        { 8, 10, 1 }, { 8, 3, 10 }, { 5, 3, 8 }, { 5, 2, 3 }, { 2, 7, 3 },
        { 7, 10, 3 }, { 7, 6, 10 }, { 7, 11, 6 }, { 11, 0, 6 }, { 0, 1, 6 },
        { 6, 1, 10 }, { 9, 0, 11 }, { 9, 11, 2 }, { 9, 2, 5 }, { 7, 2, 11 }
    };
    for (int i = 0; i < 20; i++)
        subdivide(vdata[tindices[i][0]], vdata[tindices[i][1]], vdata[tindices[i][2]], sphere_points, depth);
}
Then in main():
vector<Vector3d> sphere_points;
initialize_sphere(sphere_points, DEPTH); // DEPTH is the subdivision depth
// For each sphere I want to draw, iterate over the precomputed unit-sphere
// points and map them to the sphere's CENTER with the chosen RADIUS:
for (const Vector3d &point : sphere_points)
    const Vector3d point_tmp = point * RADIUS + CENTER;
You actually only need to call initialize_sphere() once and can reuse the result for every sphere you want to draw.
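The Vector3d class above is elided with "..."; as a rough sketch (not the author's actual class), a minimal version supporting just the operations the snippets use could look like this:

```cpp
#include <cmath>

// Minimal stand-in for the elided Vector3d class: only the members the
// answer's snippets actually use (operator+, operator*, norm()).
struct Vector3d {
    double x, y, z;

    Vector3d operator+(const Vector3d &o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vector3d operator*(double s) const { return {x * s, y * s, z * s}; }

    // Return a unit-length copy of this vector.
    Vector3d norm() const {
        const double d = std::sqrt(x * x + y * y + z * z);
        return {x / d, y / d, z / d};
    }
};
```

Note that with this scheme, subdivision depth d produces 20 * 4^d triangles and therefore 60 * 4^d entries in sphere_points, with duplicates, since shared vertices are pushed once per triangle.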

I've done this before for a graphics project; the algorithm I used is detailed on this website:
http://www.opengl.org.ru/docs/pg/0208.html
Just ignore the OpenGL drawing calls and only code up the parts that create the actual vertices.

There are well-known algorithms for triangulating surfaces. You should be able to use the GNU Triangulated Surface Library to generate a suitable mesh if you don't want to code one of them up yourself.

It depends on the number of triangles you want the sphere to have; you can make the resolution arbitrarily fine.
First focus on creating a dome; you can complete the sphere later by taking the negated coordinates of your upper dome. You will generate the sphere from interlocking rows of triangles.
Your triangles are equilateral, so decide on a side length.
Divide 2(pi)r by the number of triangles you want on the bottom row of the dome; this will be the side length of each triangle.
Next you need to create a concentric circle that intersects the surface of the sphere; between this circle and the base of the dome will be your first row.
You will need to find the angle by which each triangle is tilted. (I will post later when I figure that out.)
Repeat the process for each concentric circle (each generating a row) until the height of a row times the number of rows approximately equals the 2(pi)r that you started with.
I will try to program it later if I get a chance. You could also try posting in the Math forum.
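The row-of-circles construction above is essentially a latitude/longitude tessellation. As a rough illustrative sketch of the idea (my own, leaving out the per-triangle tilt that the answer defers), the ring points of the dome could be generated like this:

```cpp
#include <cmath>
#include <vector>

struct Pt { double x, y, z; };

// Generate points on a unit hemisphere ("dome") as stacked concentric
// circles: `rows` latitude rings from the equator towards the pole, each
// ring split into `cols` points. The full sphere is obtained by mirroring
// the dome through y -> -y, as suggested above.
std::vector<Pt> domePoints(int rows, int cols) {
    const double pi = std::acos(-1.0);
    std::vector<Pt> pts;
    for (int r = 0; r < rows; ++r) {
        const double lat = (pi / 2.0) * r / rows; // 0 = equator, pi/2 = pole
        const double y = std::sin(lat);
        const double radius = std::cos(lat);      // radius of this ring
        for (int c = 0; c < cols; ++c) {
            const double lon = 2.0 * pi * c / cols;
            pts.push_back({radius * std::cos(lon), y, radius * std::sin(lon)});
        }
    }
    pts.push_back({0.0, 1.0, 0.0});               // the pole itself
    return pts;
}
```

Unlike a geodesic subdivision, the triangles between adjacent rings here are not all equilateral; the rings shrink towards the pole, which is exactly the tilt problem the answer mentions.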

Related

VTK face keeps rendering backwards

I am trying to render the faces of a cube using VTK 9.2.
The cube's vertices are ordered like so:
//    7-------6
//   /|      /|
//  4-+-----5 |
//  | |     | |      y
//  | 3-----+-2      | z
//  |/      |/       |/
//  0-------1        +--x
(Yes, I know this is a somewhat atypical ordering, but we're just rendering faces in VTK, so it shouldn't matter as long as the usage is consistent.)
While one of the faces works perfectly, the other consistently renders backwards regardless of how I define it.
I am using the VTK_PIXEL ordering for each face.
Here is the code that does the rendering:
vtkGenericOpenGLRenderWindow* renderWindow = vtkGenericOpenGLRenderWindow::New();
vtkNew<vtkUnstructuredGrid> ugrid;

// Create and insert vertices
std::vector<std::vector<double>> vertices(8);
double halfWidth = 20.0;
vertices[0] = { -halfWidth, -halfWidth, -halfWidth }; // 0
vertices[1] = { +halfWidth, -halfWidth, -halfWidth }; // 1
vertices[2] = { +halfWidth, -halfWidth, +halfWidth }; // 2
vertices[3] = { -halfWidth, -halfWidth, +halfWidth }; // 3
vertices[4] = { -halfWidth, +halfWidth, -halfWidth }; // 4
vertices[5] = { +halfWidth, +halfWidth, -halfWidth }; // 5
vertices[6] = { +halfWidth, +halfWidth, +halfWidth }; // 6
vertices[7] = { -halfWidth, +halfWidth, +halfWidth }; // 7

vtkNew<vtkPoints> points;
for (auto i = 0; i < 8; ++i)
{
    points->InsertNextPoint(vertices.at(i).at(0), vertices.at(i).at(1), vertices.at(i).at(2));
}

// Create faces
std::vector<std::array<vtkIdType, 4>> faces;
faces.push_back({ 3, 2, 7, 6 }); // +Z, works perfectly!
// -Z:
faces.push_back({ 1, 0, 5, 4 }); // backwards
//faces.push_back({ 0, 1, 4, 5 }); // backwards
//faces.push_back({ 0, 4, 1, 5 }); // backwards
//faces.push_back({ 5, 1, 4, 0 }); // backwards
//faces.push_back({ 4, 5, 0, 1 }); // backwards
//faces.push_back({ 1, 5, 0, 4 }); // backwards
//faces.push_back({ 4, 0, 5, 1 }); // backwards
//faces.push_back({ 5, 4, 1, 0 }); // also backwards

// Insert faces
for (int i = 0; i < faces.size(); i++)
{
    ugrid->InsertNextCell(VTK_PIXEL, 4, faces.at(i).data());
}
ugrid->SetPoints(points);

// Create new data mapper for this snapshot
vtkNew<vtkDataSetMapper> mapper;
mapper->SetInputData(ugrid);

// Create new actor for this data snapshot
vtkNew<vtkActor> actor;
actor->SetMapper(mapper);
addActorToScene(0, 0.0, actor);
renderWindow->Render();
The +Z face works fantastic and looks correct. However, the other face is always backwards no matter what node order I try.
This is what I see in my window: the +Z face (3, 2, 7, 6) works great, appearing white on the outside and black on the inside. But the -Z face does not work: it appears white on the inside of the cube and black on the outside.
It was a lighting issue, mea culpa. I was doing the following outside of this code:
vtkNew<vtkLight> light;
renderWindow->AddLight(light);
Once I removed that and the default lighting took over, both the inside and outside of each face appeared white, so the side that appears white does not actually indicate the direction of the face.

Normals to a cube seem to be pointing inwards

I have written the following function to draw a cube:
void drawCube() {
    Point vertices[8] = { Point(-1.0, -1.0, -1.0), Point(-1.0, -1.0, 1.0), Point(1.0, -1.0, 1.0), Point(1.0, -1.0, -1.0),
                          Point(-1.0, 1.0, -1.0), Point(-1.0, 1.0, 1.0), Point(1.0, 1.0, 1.0), Point(1.0, 1.0, -1.0) };
    int faces[6][4] = { {0, 1, 2, 3}, {0, 3, 7, 4}, {0, 1, 5, 4}, {1, 2, 6, 5}, {3, 2, 6, 7}, {4, 5, 6, 7} };
    glBegin(GL_QUADS);
    for (unsigned int face = 0; face < 6; face++) {
        Vector v = vertices[faces[face][1]] - vertices[faces[face][0]];
        Vector w = vertices[faces[face][2]] - vertices[faces[face][0]];
        Vector normal = v.cross(w).normalised();
        glNormal3f(normal.dx, normal.dy, normal.dz);
        for (unsigned int vertex = 0; vertex < 4; vertex++) {
            switch (vertex) {
                case 0: glTexCoord2f(0, 0); break;
                case 1: glTexCoord2f(1, 0); break;
                case 2: glTexCoord2f(1, 1); break;
                case 3: glTexCoord2f(0, 1); break;
            }
            glVertex3f(vertices[faces[face][vertex]].x, vertices[faces[face][vertex]].y, vertices[faces[face][vertex]].z);
        }
    }
    glEnd();
}
When the cube is rendered with a light shining on to it, it appears that as I rotate the cube, the correct shading transitions are only happening for around half the faces. The rest just remain a very dark shade, as if I had removed the normal calculations.
I then decided to remove a couple of faces to see inside the cube. The faces that are not reflecting the light correctly on the outside, are doing so correctly on the inside. How can I ensure that the normal to each face is pointing out from that face, rather than in towards the centre of the cube?
To reverse the direction of the normal, swap the order you use for the cross product:
Vector normal = w.cross(v).normalised();
Maybe there is a more efficient way, but an (IMHO) quite easy to understand way is the following:
Calculate the vector that points from the center of the face to the center of the cube; call it
m = center_cube - center_face
Then calculate the scalar product of that vector with your normal:
x = <m, n>
The scalar product is positive if the two vectors point to the same side of the face (the angle between them is less than 90 degrees) and negative if they point to opposite sides (the angle is greater than 90 degrees). Then correct your normal via
if (x > 0) n = -n;
to make sure it always points outwards.
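The dot-product test can be put directly into code. A sketch (the small Vec type here is a hypothetical stand-in, not a type from the question's codebase):

```cpp
// Hypothetical minimal vector type for illustration.
struct Vec { double x, y, z; };

Vec operator-(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Flip the face normal n if it points towards the cube's centre:
// m points from the centre of the face to the centre of the cube, so
// dot(m, n) > 0 means n leans inwards and must be negated.
Vec outwardNormal(Vec n, Vec faceCenter, Vec cubeCenter) {
    const Vec m = cubeCenter - faceCenter;
    if (dot(m, n) > 0.0)
        return {-n.x, -n.y, -n.z};
    return n;
}
```

For an axis-aligned unit cube centred at the origin this reduces to checking the sign of the normal against the face centre itself, but the general form above also works for arbitrary cube positions.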

How can I add automatic texture coordinates in OpenGL?

I created a Bezier surface in OpenGL this way:
GLfloat punktyWSP[5][5][3] = {
    { {0,0,4}, {1,0,4}, {2,0,4}, {3,0,4}, {4,1,4} },
    { {0,0,3}, {1,1,3}, {2,1,3}, {3,1,3}, {4,1,3} },
    { {0,1,2}, {1,2,2}, {2,6,2}, {3,2,2}, {4,1,2} },
    { {0,0,1}, {1,1,1}, {2,1,1}, {3,1,1}, {4,1,1} },
    { {0,0,0}, {1,0,0}, {2,0,0}, {3,0,0}, {4,1,0} }
};
glMap2f(GL_MAP2_VERTEX_3, 0, 1, 3, 5, 0, 1, 15, 5, &punktyWSP[0][0][0]);
glEnable(GL_MAP2_VERTEX_3);
glMapGrid2f(u, 0, 1, v, 0, 1);
glShadeModel(GL_FLAT);
glEnable(GL_AUTO_NORMAL);
glEvalMesh2(GL_FILL, 0, u, 0, v);
Now I want to texture it.
Is there any way to add automatic texture coordinates to my surface, as there is with normals and glEnable(GL_AUTO_NORMAL)?
If there is no such function, do you have any idea how to add coordinates to my surface? Maybe glEnable(GL_MAP2_TEXTURE_COORD_2)?

How to model ellipsoid in opengl using polygons

I want to model an ellipsoid using triangles and subdivision.
The code below, referenced from the OpenGL Programming Guide, models a sphere, but I don't know how to modify it to model an ellipsoid.
#define X .525731112119133606
#define Z .850650808352039932
static GLfloat vdata[12][3] = {
  { -X, 0.0, Z }, { X, 0.0, Z }, { -X, 0.0, -Z }, { X, 0.0, -Z },
  { 0.0, Z, X }, { 0.0, Z, -X }, { 0.0, -Z, X }, { 0.0, -Z, -X },
  { Z, X, 0.0 }, { -Z, X, 0.0 }, { Z, -X, 0.0 }, { -Z, -X, 0.0 }
};
static GLuint tindices[20][3] = {
    { 1, 4, 0 }, { 4, 9, 0 }, { 4, 5, 9 }, { 8, 5, 4 }, { 1, 8, 4 },
    { 1, 10, 8 }, { 10, 3, 8 }, { 8, 3, 5 }, { 3, 2, 5 }, { 3, 7, 2 },
    { 3, 10, 7 }, { 10, 6, 7 }, { 6, 11, 7 }, { 6, 0, 11 }, { 6, 1, 0 },
    { 10, 1, 6 }, { 11, 0, 9 }, { 2, 11, 9 }, { 5, 2, 9 }, { 11, 2, 7 }
};
// draws the triangle at the specified coordinates
void drawtriangle(float *v1, float *v2, float *v3) {
    printf("v1 = %f, v2 = %f, v3 = %f\n", *v1, *v2, *v3);
    glBegin(GL_TRIANGLES);
    glNormal3fv(v1);
    glVertex3fv(v1);
    glNormal3fv(v2);
    glVertex3fv(v2);
    glNormal3fv(v3);
    glVertex3fv(v3);
    glEnd();
}
void normalize(float v[3]) {
    GLfloat d = sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    if (d == 0.0) {
        printf("zero length vector\n");
        return;
    }
    v[0] /= d;
    v[1] /= d;
    v[2] /= d;
}
void subdivide(float *v1, float *v2, float *v3, long depth) {
    GLfloat v12[3], v23[3], v31[3];
    GLint i;
    // end recursion
    if (depth == 0) {
        drawtriangle(v1, v2, v3);
        return;
    }
    for (i = 0; i < 3; i++) {
        v12[i] = (v1[i] + v2[i]) / 2.0;
        v23[i] = (v2[i] + v3[i]) / 2.0;
        v31[i] = (v3[i] + v1[i]) / 2.0;
    }
    normalize(v12);
    normalize(v23);
    normalize(v31);
    subdivide(v1, v12, v31, depth - 1);
    subdivide(v2, v23, v12, depth - 1);
    subdivide(v3, v31, v23, depth - 1);
    subdivide(v12, v23, v31, depth - 1);
}
void display(void) {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glShadeModel(GL_FLAT);
    glRotatef(300.0, 0.5, 1.0, 0.5);
    for (int i = 0; i < 20; i++) {
        subdivide(&vdata[tindices[i][0]][0], &vdata[tindices[i][1]][0], &vdata[tindices[i][2]][0], 1);
    }
    glFlush();
}
As long as the ellipsoid is axis-aligned, it's not much more difficult than the sphere. The code you have calculates vertices on a unit sphere. For a sphere with radius r, you multiply those unit sphere points (vx, vy, vz) with the radius:
sx = r * vx
sy = r * vy
sz = r * vz
The ellipsoid is a generalization, where the radii in the 3 coordinate directions can be different. With the 3 radii rx, ry, and rz, the points are then calculated as:
sx = rx * vx
sy = ry * vy
sz = rz * vz
It gets slightly more interesting with the normals. Spheres have the convenient property that the position and normal vectors are identical. This does not apply to ellipsoids. For the normals, you have to divide by the corresponding radius (see normal matrix for non uniform scaling for the mathematical background). So the normals for the ellipsoid are calculated as:
nx = vx / rx
ny = vy / ry
nz = vz / rz
To fit this into your code with minimal changes, you can change the drawtriangle() function to:
glBegin(GL_TRIANGLES);
glNormal3f(v1[0] / rx, v1[1] / ry, v1[2] / rz);
glVertex3f(v1[0] * rx, v1[1] * ry, v1[2] * rz);
glNormal3f(v2[0] / rx, v2[1] / ry, v2[2] / rz);
glVertex3f(v2[0] * rx, v2[1] * ry, v2[2] * rz);
glNormal3f(v3[0] / rx, v3[1] / ry, v3[2] / rz);
glVertex3f(v3[0] * rx, v3[1] * ry, v3[2] * rz);
glEnd();
With these calculations, the normal vectors will generally not be normalized anymore. You can ask OpenGL to normalize them for you by adding this call to your initialization code:
glEnable(GL_NORMALIZE);
If you care about performance at all, calculating the points each time you want to render a sphere will be highly inefficient. You will want to calculate them once, and store them away for rendering. And while you're at it, you can store them in a vertex buffer, and get rid of the immediate mode rendering.
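The precompute-once advice might look like the following sketch, which bakes the scaled positions and normalized ellipsoid normals into arrays a single time (EllipsoidMesh and bakeEllipsoid are illustrative names, not part of the question's code):

```cpp
#include <cmath>
#include <vector>

struct V3 { float x, y, z; };

// Given unit-sphere points (e.g. collected from the subdivision code),
// bake ellipsoid positions and normals once instead of per frame.
// Position: component-wise multiply by the radii.
// Normal:   component-wise divide by the radii, then normalize, which is
//           what glEnable(GL_NORMALIZE) would otherwise do at draw time.
struct EllipsoidMesh {
    std::vector<V3> positions;
    std::vector<V3> normals;
};

EllipsoidMesh bakeEllipsoid(const std::vector<V3> &unitSphere,
                            float rx, float ry, float rz) {
    EllipsoidMesh mesh;
    for (const V3 &v : unitSphere) {
        mesh.positions.push_back({v.x * rx, v.y * ry, v.z * rz});
        V3 n{v.x / rx, v.y / ry, v.z / rz};
        const float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
        mesh.normals.push_back({n.x / len, n.y / len, n.z / len});
    }
    return mesh;
}
```

The resulting arrays can then be uploaded to a vertex buffer once and reused every frame.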

Incorrectly rendering Icosahedron in OpenGL

I'm trying to draw an icosahedron in OpenGL with C++. I keep getting close, but always with some missing faces. I have found 3 different sets of vertex/index data on multiple sites, most often the data listed below:
float X = 0.525731112119133606f;
float Z = 0.850650808352039932f;
float temppts[12][3] = {
    { -X, 0.0f, Z }, { X, 0.0f, Z }, { -X, 0.0f, -Z }, { X, 0.0f, -Z },
    { 0.0f, Z, X }, { 0.0f, Z, -X }, { 0.0f, -Z, X }, { 0.0f, -Z, -X },
    { Z, X, 0.0f }, { -Z, X, 0.0f }, { Z, -X, 0.0f }, { -Z, -X, 0.0f }
};
GLushort tempindicies[60] = {
    1, 4, 0, 4, 9, 0, 4, 5, 9, 8, 5, 4, 1, 8, 4,
    1, 10, 8, 10, 3, 8, 8, 3, 5, 3, 2, 5, 3, 7, 2,
    3, 10, 7, 10, 6, 7, 6, 11, 7, 6, 0, 11, 6, 1, 0,
    10, 1, 6, 11, 0, 9, 2, 11, 9, 5, 2, 9, 11, 2, 7
};
This code is adapted from a book, and multiple sites show it working, though they draw in immediate mode while I'm using a VBO/IBO. Can anyone point me to some working vertex/index data, or tell me what is going wrong when transferring this to buffer objects? The three data sets all produce differently incorrect icosahedrons, each with different faces missing.
I have checked over my bufferData calls many times and tried several drawing modes (TRIANGLES, TRIANGLE_STRIP, ...) and am convinced the index data is somehow wrong.
I used the mesh coordinates (vertices) and the triangle connectivity from Platonic Solids (scroll down to the icosahedron); I've pasted a screenshot from that file below. When calling glDrawElements I used GL_TRIANGLES.
Icosahedron
Another thing to watch out for is back-face culling; initially, switch it off:
glDisable(GL_CULL_FACE);
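Independently of culling, the index data itself can be sanity-checked: in a closed triangle mesh like the icosahedron, every undirected edge must be shared by exactly two triangles (30 distinct edges for 12 vertices and 20 faces, by Euler's formula). A small diagnostic sketch of that check, not a fix by itself:

```cpp
#include <map>
#include <utility>

// Count how often each undirected edge occurs in a 20-triangle index list.
// For a closed icosahedron there must be exactly 30 distinct edges, each
// used by exactly two triangles; anything else indicates corrupt indices.
std::map<std::pair<int, int>, int> edgeUse(const unsigned short (&idx)[60]) {
    std::map<std::pair<int, int>, int> edges;
    for (int t = 0; t < 20; ++t) {
        for (int e = 0; e < 3; ++e) {
            int a = idx[3 * t + e];
            int b = idx[3 * t + (e + 1) % 3];
            if (a > b) std::swap(a, b); // store the edge undirected
            ++edges[{a, b}];
        }
    }
    return edges;
}
```

Running this over the tempindicies array above confirms the index data itself is consistent, which points the problem back at how the buffers are uploaded or drawn.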