Rendering 3D Delaunay using C++ and DirectX

I'm trying to render lines from a CGAL 3D Delaunay triangulation using the code below, but I get incorrect indices. It was hard to extract the indices from the 3D Delaunay, and I don't know which part is wrong. When I try it with 4 vertices it looks correct, but with more than 5 vertices I get wrong triangles...
std::vector<int> triangles;
std::map<Delaunay3::Vertex_handle, int> index_of_vertex;
int j = 0;
for (Delaunay3::Finite_vertices_iterator it = dt.finite_vertices_begin();
     it != dt.finite_vertices_end(); ++it, ++j) {
    index_of_vertex[it.base()] = j;
}
for (Delaunay3::Finite_facets_iterator itFacet = dt.finite_facets_begin();
     itFacet != dt.finite_facets_end(); itFacet++) {
    triangles.push_back(index_of_vertex[itFacet->first->vertex(0)]);
    triangles.push_back(index_of_vertex[itFacet->first->vertex(1)]);
    triangles.push_back(index_of_vertex[itFacet->first->vertex(2)]);
    triangles.push_back(index_of_vertex[itFacet->first->vertex(0)]);
    triangles.push_back(index_of_vertex[itFacet->first->vertex(2)]);
    triangles.push_back(index_of_vertex[itFacet->first->vertex(3)]);
    triangles.push_back(index_of_vertex[itFacet->first->vertex(1)]);
    triangles.push_back(index_of_vertex[itFacet->first->vertex(2)]);
    triangles.push_back(index_of_vertex[itFacet->first->vertex(3)]);
}
std::vector<WORD> lineIndex;
lineIndex.resize(triangles.size() * sizeof(int) * 4);
int l, t;
for (l = 0, t = 0; t < triangles.size(); t += 4) {
    // Each triangle has 3 lines, so D3D_PRIMITIVE_TOPOLOGY_LINELIST needs 6 vertices
    // Each vertex has to be listed twice
    lineIndex[l] = triangles[t];
    l++;
    lineIndex[l] = triangles[t + 1];
    l++;
    lineIndex[l] = triangles[t + 1];
    l++;
    lineIndex[l] = triangles[t + 2];
    l++;
    lineIndex[l] = triangles[t + 2];
    l++;
    lineIndex[l] = triangles[t];
    l++;
}
// Fill in a buffer description for drawing only the triangle outlines
ZeroMemory(&bd, sizeof(bd));
bd.Usage = D3D11_USAGE_DEFAULT;
bd.ByteWidth = numTriangleVertices * 3 * sizeof(int); // <---- from the function
bd.BindFlags = D3D11_BIND_INDEX_BUFFER;
bd.CPUAccessFlags = 0;
ZeroMemory(&InitData, sizeof(InitData));
InitData.pSysMem = static_cast<void*>(lineIndex.data());
pd3dDevice->CreateBuffer(&bd, &InitData, &g_pTriOutlineIndexBuffer);

Let me explain it with a triangulation with only 4 non-coplanar vertices.
You then have 4 finite facets. Each of them is represented as a std::pair<Delaunay3::Cell_handle, int> pair, where pair.first is the cell incident to the facet and pair.second is the index of the vertex opposite to the facet in that cell.
If pair.second == 0, the indices of the vertices of the facet are 1,2,3;
If pair.second == 1, the indices of the vertices of the facet are 2,3,0; etc.
That is, in general: if pair.second == i, the indices of the vertices of the facet are (i+1)%4, (i+2)%4, (i+3)%4.
There is also a helper class, Triangulation_utils_3, which is a base class of the triangulations:
https://doc.cgal.org/latest/TDS_3/classCGAL_1_1Triangulation__utils__3.html
It allows you to write Delaunay_3::vertex_triple_index(i,j).
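For example, the facet loop from the question could then be written roughly like this (a minimal sketch, reusing the Delaunay3 typedef and the index_of_vertex map from the question):
for (Delaunay3::Finite_facets_iterator itFacet = dt.finite_facets_begin();
     itFacet != dt.finite_facets_end(); ++itFacet) {
    Delaunay3::Cell_handle cell = itFacet->first;
    int opposite = itFacet->second; // index of the vertex opposite to the facet
    // The facet is the triangle formed by the three other vertices of that cell
    for (int k = 0; k < 3; ++k) {
        triangles.push_back(
            index_of_vertex[cell->vertex(Delaunay3::vertex_triple_index(opposite, k))]);
    }
}
Equivalently, the three vertices are cell->vertex((opposite + 1) % 4), cell->vertex((opposite + 2) % 4) and cell->vertex((opposite + 3) % 4), as described above.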
I agree that this is not well documented at all, and we are going to improve that.

Related

Calculating Vertex normals weird results

I know this has been asked quite a few times, but my problem is not about how to do it. I know how this works (or at least I think so ^^), but something seems to be wrong with my implementation and I can't figure out what.
I have a procedurally generated terrain mesh and I'm trying to calculate the normals for each vertex by averaging the normals of all the triangles the vertex is connected to. When setting the normal xyz as the rgb vertex colors, it seems as if each vertex is randomly either black (0, 0, 0) or blue (0, 0, 1).
void CalculateVertexNormal(int index){ //index of the vertex in the mesh's vertex array
    std::vector<int> indices; //indices of triangles the vertex is a part of
    Vector normals = Vector(0.0f, 0.0f, 0.0f, 0.0f); //sum of all the face normals
    for(int i = 0; i < triangles.size(); i += 3){ //iterate over the triangle array in order
        if(triangles[i] == index) //to find the triangle indices
            indices.push_back(triangles[i]);
        else if(triangles[i + 1] == index)
            indices.push_back(triangles[i]);
        else if(triangles[i + 2] == index)
            indices.push_back(triangles[i]);
    }
    for(int i = 0; i < indices.size(); i++){ //iterate over the indices to calculate the normal for each tri
        int vertex = indices[i];
        Vector v1 = vertices[vertex + 1].GetLocation() - vertices[vertex].GetLocation(); //p1->p2
        Vector v2 = vertices[vertex + 2].GetLocation() - vertices[vertex].GetLocation(); //p1->p3
        normals += v1.Cross(v2); //cross product with two edges to receive face normal
    }
    vertices[index].SetNormals(normals.Normalize()); //normalize the sum of face normals and set to vertex
}
Maybe somebody could have a look and tell me what I'm doing wrong.
Thank you.
Edit:
Thanks to molbdnilo's comment I finally understood what was wrong. It was a problem with indexing the arrays and my two loops were kind of confusing as well, maybe I should get some rest ;)
I eventually came up with this, reduced to one loop:
for(int i = 0; i < triangles.size(); i += 3){
    if(triangles[i] == index || triangles[i + 1] == index || triangles[i + 2] == index){
        Vector v1 = vertices[triangles[i + 1]].GetLocation() - vertices[index].GetLocation();
        Vector v2 = vertices[triangles[i + 2]].GetLocation() - vertices[index].GetLocation();
        faceNormals += v1.Cross(v2);
    }
}
vertices[index].SetNormals(faceNormals.Normalize());
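For completeness, here is how the fix might look wrapped back into a self-contained function. This is a sketch that assumes the same vertices, triangles and Vector members as above, and it takes both edges from the triangle's own first vertex so they are never degenerate:
void CalculateVertexNormal(int index){ //index of the vertex in the mesh's vertex array
    Vector faceNormals = Vector(0.0f, 0.0f, 0.0f, 0.0f); //sum of the face normals
    for(int i = 0; i < triangles.size(); i += 3){
        //only triangles that contain this vertex contribute to its normal
        if(triangles[i] == index || triangles[i + 1] == index || triangles[i + 2] == index){
            Vector v1 = vertices[triangles[i + 1]].GetLocation() - vertices[triangles[i]].GetLocation();
            Vector v2 = vertices[triangles[i + 2]].GetLocation() - vertices[triangles[i]].GetLocation();
            faceNormals += v1.Cross(v2);
        }
    }
    vertices[index].SetNormals(faceNormals.Normalize()); //average direction of the touching faces
}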

What is the correct way to create a vertex and index buffer from a physx cloth object

I'm trying to actually RENDER the cloth I created to the screen in DirectX11.
I used the PhysX API to create a cloth object and tried to create the vertex and index buffer accordingly. As far as I know the cloth object should be okay.
Here's my code. Please note that this is in a custom engine (from school) so some things might look weird (like the gameContext object for example) but you should be able to comprehend the code.
I used the book Introduction to 3D Game Programming with DirectX 10 by Frank D. Luna as a reference for the buffers.
// create regular mesh
PxU32 resolution = 20;
PxU32 numParticles = resolution*resolution;
PxU32 numTriangles = 2*(resolution-1)*(resolution-1);
// create cloth particles
PxClothParticle* particles = new PxClothParticle[numParticles];
PxVec3 center(0.5f, 0.3f, 0.0f);
PxVec3 delta = 1.0f/(resolution-1) * PxVec3(15.0f, 15.0f, 15.0f);
PxClothParticle* pIt = particles;
for(PxU32 i=0; i<resolution; ++i)
{
    for(PxU32 j=0; j<resolution; ++j, ++pIt)
    {
        pIt->invWeight = j+1<resolution ? 1.0f : 0.0f;
        pIt->pos = delta.multiply(PxVec3(PxReal(i),
            PxReal(j), -PxReal(j))) - center;
    }
}
// create triangles
PxU32* triangles = new PxU32[3*numTriangles];
PxU32* iIt = triangles;
for(PxU32 i=0; i<resolution-1; ++i)
{
    for(PxU32 j=0; j<resolution-1; ++j)
    {
        PxU32 odd = j&1u, even = 1-odd;
        *iIt++ = i*resolution + (j+odd);
        *iIt++ = (i+odd)*resolution + (j+1);
        *iIt++ = (i+1)*resolution + (j+even);
        *iIt++ = (i+1)*resolution + (j+even);
        *iIt++ = (i+even)*resolution + j;
        *iIt++ = i*resolution + (j+odd);
    }
}
// create fabric from mesh
PxClothMeshDesc meshDesc;
meshDesc.points.count = numParticles;
meshDesc.points.stride = sizeof(PxClothParticle);
meshDesc.points.data = particles;
meshDesc.invMasses.count = numParticles;
meshDesc.invMasses.stride = sizeof(PxClothParticle);
meshDesc.invMasses.data = &particles->invWeight;
meshDesc.triangles.count = numTriangles;
meshDesc.triangles.stride = 3*sizeof(PxU32);
meshDesc.triangles.data = triangles;
// cook fabric
PxClothFabric* fabric = PxClothFabricCreate(*PhysxManager::GetInstance()->GetPhysics(), meshDesc, PxVec3(0, 1, 0));
//delete[] triangles;
// create cloth
PxTransform gPose = PxTransform(PxVec3(0,1,0));
gCloth = PhysxManager::GetInstance()->GetPhysics()->createCloth(gPose, *fabric, particles, PxClothFlags(0));
fabric->release();
//delete[] particles;
// 240 iterations per/second (4 per-60hz frame)
gCloth->setSolverFrequency(240.0f);
GetPhysxProxy()->GetPhysxScene()->addActor(*gCloth);
// CREATE VERTEX BUFFER
D3D11_BUFFER_DESC bufferDescriptor = {};
bufferDescriptor.Usage = D3D11_USAGE_DEFAULT;
bufferDescriptor.ByteWidth = sizeof( PxClothParticle* ) * gCloth->getNbParticles();
bufferDescriptor.BindFlags = D3D11_BIND_VERTEX_BUFFER;
bufferDescriptor.CPUAccessFlags = 0;
bufferDescriptor.MiscFlags = 0;
D3D11_SUBRESOURCE_DATA initData = {};
initData.pSysMem = particles;
gameContext.pDevice->CreateBuffer(&bufferDescriptor, &initData, &m_pVertexBuffer);
// BUILD INDEX BUFFER
D3D11_BUFFER_DESC bd = {};
bd.Usage = D3D11_USAGE_IMMUTABLE;
bd.ByteWidth = sizeof(PxU32) * sizeof(triangles);
bd.BindFlags = D3D11_BIND_INDEX_BUFFER;
bd.CPUAccessFlags = 0;
bd.MiscFlags = 0;
D3D11_SUBRESOURCE_DATA initData2 = {};
initData2.pSysMem = triangles;
gameContext.pDevice->CreateBuffer(&bd, &initData2, &m_pIndexBuffer);
When this is done I run this code in the "draw" part of the engine:
// Set vertex buffer(s)
UINT offset = 0;
UINT vertexBufferStride = sizeof(PxClothParticle*);
gameContext.pDeviceContext->IASetVertexBuffers( 0, 1, &m_pVertexBuffer, &vertexBufferStride, &offset );
// Set index buffer
gameContext.pDeviceContext->IASetIndexBuffer(m_pIndexBuffer,DXGI_FORMAT_R32_UINT,0);
// Set primitive topology
gameContext.pDeviceContext->IASetPrimitiveTopology( D3D10_PRIMITIVE_TOPOLOGY_TRIANGLELIST );
auto mat = new DiffuseMaterial();
mat->Initialize(gameContext);
mat->SetDiffuseTexture(L"./Resources/Textures/Chair_Dark.dds");
gameContext.pMaterialManager->AddMaterial(mat, 3);
ID3DX11EffectTechnique* pTechnique = mat->GetDefaultTechnique();
D3DX11_TECHNIQUE_DESC techDesc;
pTechnique->GetDesc( &techDesc );
for( UINT p = 0; p < techDesc.Passes; ++p )
{
    pTechnique->GetPassByIndex(p)->Apply(0, gameContext.pDeviceContext);
    gameContext.pDeviceContext->DrawIndexed(gCloth->getNbParticles(), 0, 0 );
}
I think there's something obviously wrong that I'm just totally missing. (DirectX isn't my strongest part in programming). Every comment or answer is much appreciated.
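One thing worth double-checking in the code above is the byte sizes: sizeof(PxClothParticle*) and sizeof(triangles) are the sizes of pointers, not of the data they point to. A minimal sketch of how the sizes could instead be derived from the counts used when creating the cloth (assuming the same particles, triangles, numParticles and numTriangles as above):
// Vertex buffer: numParticles PxClothParticle structs
D3D11_BUFFER_DESC vbDesc = {};
vbDesc.Usage = D3D11_USAGE_DEFAULT;
vbDesc.ByteWidth = sizeof(PxClothParticle) * numParticles; // size of the data, not of a pointer
vbDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;

// Index buffer: 3 indices per triangle
D3D11_BUFFER_DESC ibDesc = {};
ibDesc.Usage = D3D11_USAGE_IMMUTABLE;
ibDesc.ByteWidth = sizeof(PxU32) * 3 * numTriangles; // sizeof(triangles) would only be the pointer size
ibDesc.BindFlags = D3D11_BIND_INDEX_BUFFER;
The vertex stride passed to IASetVertexBuffers and the count passed to DrawIndexed would follow the same logic: sizeof(PxClothParticle) per vertex and 3 * numTriangles indices to draw.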

Terrain only rendered underside

I'm trying to render a map, but unfortunately, only the underside is rendered.
I guess I'm doing something wrong while setting up the vertex and index buffers.
This is the part I initialize the vertex and index buffers:
// Initialize vertices and indices
SimpleVertex* vertices = new SimpleVertex[(dimension + 1) * (dimension + 1)];
WORD* indices = new WORD[dimension * dimension * 6];
for (WORD i = 0; i < dimension + 1; ++i)
{
    for (WORD j = 0; j < dimension + 1; ++j)
    {
        vertices[i * (dimension + 1) + j].Pos = XMFLOAT3(i, rand() % 2, j);
        vertices[i * (dimension + 1) + j].Color = XMFLOAT4(rand() % 2, rand() % 2, rand() % 2, 1.0f);
    }
}
for (WORD i = 0; i < dimension; i++)
{
    for (WORD j = 0; j < dimension; j++)
    {
        indices[(i * dimension + j) * 6] = (WORD)(i * (dimension + 1) + j);
        indices[(i * dimension + j) * 6 + 2] = (WORD)(i * (dimension + 1) + j + 1);
        indices[(i * dimension + j) * 6 + 1] = (WORD)((i + 1) * (dimension + 1) + j + 1);
        indices[(i * dimension + j) * 6 + 3] = (WORD)(i * (dimension + 1) + j);
        indices[(i * dimension + j) * 6 + 5] = (WORD)((i + 1) * (dimension + 1) + j + 1);
        indices[(i * dimension + j) * 6 + 4] = (WORD)((i + 1) * (dimension + 1) + j);
    }
}
// Create vertex buffer
D3D11_BUFFER_DESC bd;
ZeroMemory(&bd, sizeof(bd));
bd.Usage = D3D11_USAGE_DEFAULT;
bd.ByteWidth = sizeof(SimpleVertex)* (dimension + 1) * (dimension + 1);
bd.BindFlags = D3D11_BIND_VERTEX_BUFFER;
bd.CPUAccessFlags = 0;
D3D11_SUBRESOURCE_DATA InitData;
ZeroMemory(&InitData, sizeof(InitData));
InitData.pSysMem = vertices;
hr = g_pd3dDevice->CreateBuffer(&bd, &InitData, &g_pVertexBuffer);
delete vertices;
if (FAILED(hr))
return hr;
// Set vertex buffer
UINT stride = sizeof(SimpleVertex);
UINT offset = 0;
g_pImmediateContext->IASetVertexBuffers(0, 1, &g_pVertexBuffer, &stride, &offset);
// Create indices buffer
bd.Usage = D3D11_USAGE_DEFAULT;
bd.ByteWidth = sizeof(WORD)* dimension * dimension * 6;
bd.BindFlags = D3D11_BIND_INDEX_BUFFER;
bd.CPUAccessFlags = 0;
InitData.pSysMem = indices;
hr = g_pd3dDevice->CreateBuffer(&bd, &InitData, &g_pIndexBuffer);
delete indices;
if (FAILED(hr))
return hr;
Excuse my bad English :(. Thank you for reading!
The first thing that occurred to me is you may be declaring your vertices in the wrong order. If your Direct3D context is expecting vertices to be counterclockwise, and yours are defined in clockwise order, "backface culling" will cause your polygons to be invisible unless viewed from the other side.
Specifically, D3D11_RASTERIZER_DESC::FrontCounterClockwise sets the direction. (see http://msdn.microsoft.com/en-us/library/windows/desktop/ff476198%28v=vs.85%29.aspx)
In the code where you set up your rasterizer description, try setting CullMode=D3D11_CULL_NONE and if the terrain appears, then this was your problem.
Most likely, the face culling wasn't set up properly.
In theory (thanks Google for providing links ;) ):
Face culling
Winding order
In practice:
You decide in which order to put your vertices within triangles (in reality you are manipulating indices, since your buffers are indexed): clockwise or counterclockwise.
Having made decision #1, you now decide which faces must be considered "front":
D3D11_RASTERIZER_DESC rd = {};
rd.FrontCounterClockwise = true; // counterclockwise are front
and you decide which faces the rasterizer must cull: back ones, front ones, or none:
rd.CullMode = D3D11_CULL_BACK; // back faced primitives will be stripped out
// during rasterization
// (clockwise ones in our example)
So, you can change your geometry's winding, and/or the DirectX winding option, and/or the DirectX culling option.
Note: By default, DirectX 11 uses FrontCounterClockwise = false and CullMode = D3D11_CULL_BACK. So it considers clockwise primitives front-facing and culls counterclockwise ones, which are considered back-facing.
Note: To better understand culling, draw a triangle on both sides of a piece of paper, as if it were the same triangle viewed from different sides. Put indices near each vertex (the same on both sides of the paper). Draw a circular arrow showing the winding order. Compare it with your mesh. Then it will be obvious which winding order and culling you must use.
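Putting it together, a minimal sketch of creating and binding such a rasterizer state (assuming an existing ID3D11Device* pDevice and ID3D11DeviceContext* pContext; here culling is disabled entirely, which is handy while debugging winding problems):
D3D11_RASTERIZER_DESC rd = {};
rd.FillMode = D3D11_FILL_SOLID;
rd.CullMode = D3D11_CULL_NONE;     // draw both front and back faces
rd.FrontCounterClockwise = FALSE;  // the default: clockwise triangles are front-facing
rd.DepthClipEnable = TRUE;

ID3D11RasterizerState* pRasterizerState = nullptr;
HRESULT hr = pDevice->CreateRasterizerState(&rd, &pRasterizerState);
if (SUCCEEDED(hr))
    pContext->RSSetState(pRasterizerState);
Once the terrain shows up, switch CullMode back to D3D11_CULL_BACK and fix the winding of the indices (or flip FrontCounterClockwise) so that only the faces you actually want are drawn.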
Sources:
MSDN DirectX Reference pages:
D3D11_RASTERIZER_DESC
D3D11_CULL_MODE
ID3D11Device::CreateRasterizerState()
ID3D11DeviceContext::RSSetState()

A method for indexing triangles from a loaded heightmap?

I am currently writing a method to load a noisy heightmap, but I'm stuck on generating the triangles for it. I want to make an algorithm that will take an image, its width and height, and construct a terrain node out of it.
Here's what I have so far, in somewhat pseudo-code:
Vertex* vertices = new Vertices[image.width * image.height];
Index* indices; // How do I judge how many indices I will have?
float scaleX = 1 / image.width;
float scaleY = 1 / image.height;
float currentYScale = 0;
for(int y = 0; y < image.height; ++y) {
    float currentXScale = 0;
    for (int x = 0; x < image.width; ++x) {
        Vertex* v = vertices[x * y];
        v.x = currentXScale;
        v.y = currentYScale;
        v.z = image[x,y];
        currentXScale += scaleX;
    }
    currentYScale += scaleY;
}
This works well enough for my needs; my only problem is this: how would I calculate the number of indices and their positions for drawing the triangles? I have some familiarity with indices, but not with how to calculate them programmatically; I can only do it statically.
As far as your code above goes, using vertices[x * y] isn't right - if you use that, then e.g. vert(2,3) == vert(3,2). What you want is something like vertices[y * image.width + x], but you can do it more efficiently by incrementing a counter (see below).
Here's the equivalent code I use. It's in C# unfortunately, but hopefully it should illustrate the point:
/// <summary>
/// Constructs the vertex and index buffers for the terrain (for use when rendering the terrain).
/// </summary>
private void ConstructBuffers()
{
    int heightmapHeight = Heightmap.GetLength(0);
    int heightmapWidth = Heightmap.GetLength(1);
    int gridHeight = heightmapHeight - 1;
    int gridWidth = heightmapWidth - 1;

    // Construct the individual vertices for the terrain.
    var vertices = new VertexPositionTexture[heightmapHeight * heightmapWidth];
    int vertIndex = 0;
    for(int y = 0; y < heightmapHeight; ++y)
    {
        for(int x = 0; x < heightmapWidth; ++x)
        {
            var position = new Vector3(x, y, Heightmap[y,x]);
            var texCoords = new Vector2(x * 2f / heightmapWidth, y * 2f / heightmapHeight);
            vertices[vertIndex++] = new VertexPositionTexture(position, texCoords);
        }
    }

    // Create the vertex buffer and fill it with the constructed vertices.
    this.VertexBuffer = new VertexBuffer(Renderer.GraphicsDevice, typeof(VertexPositionTexture), vertices.Length, BufferUsage.WriteOnly);
    this.VertexBuffer.SetData(vertices);

    // Construct the index array.
    var indices = new short[gridHeight * gridWidth * 6]; // 2 triangles per grid square x 3 vertices per triangle
    int indicesIndex = 0;
    for(int y = 0; y < gridHeight; ++y)
    {
        for(int x = 0; x < gridWidth; ++x)
        {
            int start = y * heightmapWidth + x;
            indices[indicesIndex++] = (short)start;
            indices[indicesIndex++] = (short)(start + 1);
            indices[indicesIndex++] = (short)(start + heightmapWidth);
            indices[indicesIndex++] = (short)(start + 1);
            indices[indicesIndex++] = (short)(start + 1 + heightmapWidth);
            indices[indicesIndex++] = (short)(start + heightmapWidth);
        }
    }

    // Create the index buffer.
    this.IndexBuffer = new IndexBuffer(Renderer.GraphicsDevice, typeof(short), indices.Length, BufferUsage.WriteOnly);
    this.IndexBuffer.SetData(indices);
}
I guess the key point is that given a heightmap of size heightmapHeight * heightmapWidth, you need (heightmapHeight - 1) * (heightmapWidth - 1) * 6 indices, since you're drawing:
2 triangles per grid square
3 vertices per triangle
(heightmapHeight - 1) * (heightmapWidth - 1) grid squares in your terrain.
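In C++, the same index construction might look roughly like this (a sketch, assuming the image object from the question and 32-bit indices; use 16-bit indices if that is what your renderer expects):
int gridWidth  = image.width  - 1;
int gridHeight = image.height - 1;
std::vector<unsigned int> indices(gridWidth * gridHeight * 6); // 2 triangles x 3 indices per grid square
int n = 0;
for (int y = 0; y < gridHeight; ++y) {
    for (int x = 0; x < gridWidth; ++x) {
        int start = y * image.width + x; // top-left vertex of this grid square
        indices[n++] = start;
        indices[n++] = start + 1;
        indices[n++] = start + image.width;
        indices[n++] = start + 1;
        indices[n++] = start + 1 + image.width;
        indices[n++] = start + image.width;
    }
}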

OpenGL Calculating Normals (Quads)

My issue is regarding OpenGL and normals; I understand the math behind them, and I am having some success.
The function I've attached below accepts an interleaved vertex array and calculates the normals for every 4 vertices. These represent QUADs that face the same direction. By my understanding, these 4 vertices should share the same normal, so long as they face the same way.
The problem I am having is that my QUADs are rendering with a diagonal gradient, much like this: Light Effect - except that the shadow is in the middle, with the light in the corners.
I draw my QUADs in a consistent fashion: TopLeft, TopRight, BottomRight, BottomLeft, and the edge vectors I use to calculate my normals are TopRight - TopLeft and BottomRight - TopLeft.
Hopefully someone can spot something I've blundered on, but I have been at this for hours to no avail.
For the record I render a Cube, and a Teapot next to my objects to check my lighting is functioning, so I'm fairly sure there is no issue regarding Light position.
void CalculateNormals(point8 toCalc[], int toCalcLength)
{
    GLfloat N[3], U[3], V[3]; //N will be our final calculated normal, U and V will be the subjects of cross-product
    float length;
    for (int i = 0; i < toCalcLength; i+=4) //Starting with every first corner QUAD vertice
    {
        U[0] = toCalc[i+1][5] - toCalc[i][5]; U[1] = toCalc[i+1][6] - toCalc[i][6]; U[2] = toCalc[i+1][7] - toCalc[i][7]; //Calculate Ux Uy Uz
        V[0] = toCalc[i+3][5] - toCalc[i][5]; V[1] = toCalc[i+3][6] - toCalc[i][6]; V[2] = toCalc[i+3][7] - toCalc[i][7]; //Calculate Vx Vy Vz
        N[0] = (U[1]*V[2]) - (U[2] * V[1]);
        N[1] = (U[2]*V[0]) - (U[0] * V[2]);
        N[2] = (U[0]*V[1]) - (U[1] * V[0]);
        //Calculate length for normalising
        length = (float)sqrt((pow(N[0],2)) + (pow(N[1],2)) + (pow(N[2],2)));
        for (int a = 0; a < 3; a++)
        {
            N[a]/=length;
        }
        for (int j = 0; i < 4; i++)
        {
            //Apply normals to QUAD vertices (3,4,5 index position of normals in interleaved array)
            toCalc[i+j][3] = N[0]; toCalc[i+j][4] = N[1]; toCalc[i+j][5] = N[2];
        }
    }
}
It seems like you are taking the vertex position values for use in calculations from indices 5, 6, and 7, and then writing out the normals at indices 3, 4, and 5. Note how index 5 is used on both. I suppose one of them is not correct.
It looks like your for-loops are biting you.
for (int i = 0; i < toCalcLength; i+=4) //Starting with every first corner QUAD vertice
{
    ...
    for (int j = 0; i < 4; i++)
    {         //    ^      ^
        // Should you be using 'j' instead of 'i' here?
        // j will never increment
        // This loop won't be called at all after the first time through the outer loop
        ...
    }
}
You use indexes 3, 4, and 5 for storing normal:
toCalc[i+j][3] = N[0]; toCalc[i+j][4] = N[1]; toCalc[i+j][5] = N[2];
AND you use indexes 5, 6 and 7 to get point coordinates:
U[0] = toCalc[i+1][5] - toCalc[i][5]; U[1] = toCalc[i+1][6] - toCalc[i][6]; U[2] = toCalc[i+1][7] - toCalc[i][7];
Those indexes overlap (normal.x shares same index as position.z), which shouldn't be happening.
Recommendations:
Put everything into structures.
Either:
Use a math library,
OR put vector arithmetic into separate, appropriately named subroutines.
Use named variables instead of indexes.
By doing so you'll reduce the number of bugs in your code. a.position.x is easier to read than quad[0][5], and it is easier to fix a typo in a vector operation when the code hasn't been copy-pasted.
You can use unions to access vector components by both index and name:
struct Vector3{
    union{
        struct{
            float x, y, z;
        };
        float v[3];
    };
};
For calculating the normal of quad ABCD:
A--B
|  |
C--D
Use formula:
normal = normalize((B.position - A.position) X (C.position - A.position)).
OR
normal = normalize((D.position - A.position) X (C.position - B.position)).
Where "X" means "cross-product".
Either way will work fine.
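As a small sketch, using the Vector3 union above (the helper names here are just illustrative), the first formula translates to:
// cross product of two Vector3s
Vector3 cross(const Vector3 &a, const Vector3 &b){
    Vector3 r;
    r.x = a.y * b.z - a.z * b.y;
    r.y = a.z * b.x - a.x * b.z;
    r.z = a.x * b.y - a.y * b.x;
    return r;
}
// difference a - b
Vector3 sub(const Vector3 &a, const Vector3 &b){
    Vector3 r;
    r.x = a.x - b.x; r.y = a.y - b.y; r.z = a.z - b.z;
    return r;
}
// scale to unit length (needs <cmath> for sqrtf)
Vector3 normalize(const Vector3 &a){
    float len = sqrtf(a.x * a.x + a.y * a.y + a.z * a.z);
    Vector3 r;
    r.x = a.x / len; r.y = a.y / len; r.z = a.z / len;
    return r;
}
// normal of quad ABCD, where A, B, C are the positions of those corners
Vector3 quadNormal(const Vector3 &A, const Vector3 &B, const Vector3 &C){
    return normalize(cross(sub(B, A), sub(C, A)));
}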