Having issues drawing a 3D cube's edges with OpenGL 3 - C++

I'm trying to draw a 3D cube's vertices (edges only) using OpenGL (4.3 core profile). I know the glPolygonMode function, but I'd like to avoid drawing the intermediate diagonal lines. I declare my vertices and my indices like so:
struct Vertex {
    glm::vec3 pos;
    glm::vec3 color;
    glm::vec3 normal;
    glm::vec2 uv;
};
Vertex vertices[8];
// Front vertices
vertices[0].pos = glm::vec3(-0.5f, -0.5f, +0.5f);
vertices[1].pos = glm::vec3(+0.5f, -0.5f, +0.5f);
vertices[2].pos = glm::vec3(+0.5f, +0.5f, +0.5f);
vertices[3].pos = glm::vec3(-0.5f, +0.5f, +0.5f);
// Back vertices
vertices[4].pos = glm::vec3(-0.5f, -0.5f, -0.5f);
vertices[5].pos = glm::vec3(+0.5f, -0.5f, -0.5f);
vertices[6].pos = glm::vec3(+0.5f, +0.5f, -0.5f);
vertices[7].pos = glm::vec3(-0.5f, +0.5f, -0.5f);
GLuint indices[36] = {
    0, 1, 2, 2, 3, 0, // Front
    1, 5, 6, 6, 2, 1, // Right
    7, 6, 5, 5, 4, 7, // Back
    4, 0, 3, 3, 7, 4, // Left
    4, 5, 1, 1, 0, 4, // Bottom
    3, 2, 6, 6, 7, 3  // Top
};
My buffers are updated accordingly:
// Bind Vertex Array
glBindVertexArray(_VAO);
// Bind VBO to GL_ARRAY_BUFFER type so that all calls to GL_ARRAY_BUFFER use VBO
glBindBuffer(GL_ARRAY_BUFFER, _VBO);
// Upload vertices to VBO
glBufferData(GL_ARRAY_BUFFER, verticesNb * sizeof(Vertex), vertices, GL_STATIC_DRAW);
// Bind EBO to GL_ELEMENT_ARRAY_BUFFER type so that all calls to GL_ELEMENT_ARRAY_BUFFER use EBO
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _EBO);
// Upload indices to EBO
glBufferData(GL_ELEMENT_ARRAY_BUFFER, indicesNb * sizeof(GLuint), indices, GL_STATIC_DRAW);
I'm using glDrawElements(GL_LINES, 36, GL_UNSIGNED_INT, 0); to draw my cube's edges, but for some reason it doesn't draw some edges, and I don't understand why.
If I use GL_TRIANGLES, it works fine when I render my 3D cube in fill mode. Does anyone know what I'm missing here? Is it an issue with the indices?
(The "cube" below has a custom size of (1.0f, 2.0f, 3.0f))

Your indices form GL_TRIANGLES primitives rather than GL_LINES primitives. See GL_LINES:
Vertices 0 and 1 are considered a line. Vertices 2 and 3 are considered a line. And so on.
The indices define the primitives. With GL_LINES, each consecutive pair of indices forms one line, so your first six indices 0, 1, 2, 2, 3, 0 produce the lines (0,1), (2,2) and (3,0): one degenerate line and only two of the face's four edges. Over the whole array, the edges 2-6 and 5-6 never appear as a pair at all, which is why they are missing from the output. Change the indices:
GLuint indices[] = {
    0, 1, 1, 2, 2, 3, 3, 0, // Front
    4, 5, 5, 6, 6, 7, 7, 4, // Back
    0, 4, 1, 5, 2, 6, 3, 7  // Connecting edges
};
glDrawElements(GL_LINES, 24, GL_UNSIGNED_INT, 0);
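If you still want the filled cube as well, note that a VAO stores only one GL_ELEMENT_ARRAY_BUFFER binding at a time, so you can keep two element buffers and rebind before each draw. A minimal sketch, assuming the _VAO/_VBO setup from the question (triangleIndices, lineIndices, triangleEBO and lineEBO are hypothetical names):
// One-time setup: upload each index set once, with the VAO bound,
// because the element buffer binding is part of the VAO's state.
GLuint triangleEBO, lineEBO;
glGenBuffers(1, &triangleEBO);
glGenBuffers(1, &lineEBO);
glBindVertexArray(_VAO);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, triangleEBO);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(triangleIndices), triangleIndices, GL_STATIC_DRAW);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, lineEBO);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(lineIndices), lineIndices, GL_STATIC_DRAW);
// At draw time: rebind whichever index buffer matches the primitive type.
glBindVertexArray(_VAO);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, triangleEBO);
glDrawElements(GL_TRIANGLES, 36, GL_UNSIGNED_INT, 0); // filled faces
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, lineEBO);
glDrawElements(GL_LINES, 24, GL_UNSIGNED_INT, 0); // edges only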

Related

How to add vec2 for UV for texture mapping when using indices

I am trying to apply texture mapping to my cubes but am unsure how to proceed. Currently I am using indices to avoid having to repeat vec3s to make a cube, along with a vertex array of the points and their normals, like so:
// Cube data as our basic building block
unsigned int indices[] = {
    10, 8, 0, 2, 10, 0, 12, 10, 2, 4, 12, 2,
    14, 12, 4, 6, 14, 4, 8, 14, 6, 0, 8, 6,
    12, 14, 8, 10, 12, 8, 2, 0, 6, 4, 2, 6
};
vec3 vertexArray[] = {
    vec3(-0.5f, -0.5f, -0.5f), vec3(-0.408248, -0.816497, -0.408248),
    vec3(0.5f, -0.5f, -0.5f), vec3(0.666667, -0.333333, -0.666667),
    vec3(0.5f, 0.5f, -0.5f), vec3(0.408248, 0.816497, -0.408248),
    vec3(-0.5f, 0.5f, -0.5f), vec3(-0.666667, 0.333333, -0.666667),
    vec3(-0.5f, -0.5f, 0.5f), vec3(-0.666667, -0.333333, 0.666667),
    vec3(0.5f, -0.5f, 0.5f), vec3(0.666667, -0.666667, 0.333333),
    vec3(0.5f, 0.5f, 0.5f), vec3(0.408248, 0.408248, 0.816497),
    vec3(-0.5f, 0.5f, 0.5f), vec3(-0.408248, 0.816497, 0.408248),
};
// convert arrays to vectors
std::vector<vec3> vertexArrayVector;
vertexArrayVector.insert(vertexArrayVector.begin(), std::begin(vertexArray), std::end(vertexArray));
std::vector<unsigned int> indicesVector;
indicesVector.insert(indicesVector.begin(), std::begin(indices), std::end(indices));
I now want to apply textures to the cube, but I am not sure how to add a vec2 for UV when using indices. I create my VBOs and VAO like this, if it helps:
GLuint vertexBufferObject;
GLuint indexBufferObject;
GLuint vertexArrayObject;
glGenVertexArrays(1, &vertexArrayObject);
glGenBuffers(1, &indexBufferObject);
glGenBuffers(1, &vertexBufferObject);
glBindVertexArray(vertexArrayObject);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBufferObject);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(vertexIndicesArray[0]) * vertexIndicesArray.size(), &vertexIndicesArray[0], GL_STATIC_DRAW);
// Upload Vertex Buffer to the GPU, keep a reference to it (vertexBufferObject)
glBindBuffer(GL_ARRAY_BUFFER, vertexBufferObject);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertexPointsArray[0]) * vertexPointsArray.size(), &vertexPointsArray[0], GL_STATIC_DRAW);
// Teach GPU how to read position data from vertexBufferObject
glVertexAttribPointer(0,        // attribute 0 matches aPos in Vertex Shader
                      3,        // size
                      GL_FLOAT, // type
                      GL_FALSE, // normalized?
                      0,        // 0 stride
                      (void*)0  // array buffer offset
);
glEnableVertexAttribArray(0);
// Teach GPU how to read normals data from vertexBufferObject
glVertexAttribPointer(1,        // attribute 1 matches normals in Vertex Shader
                      3,
                      GL_FLOAT,
                      GL_FALSE,
                      0,
                      (void*)sizeof(glm::vec3) // normal is offset by a vec3 (comes after position)
);
glEnableVertexAttribArray(1);
The vertex coordinate and the texture coordinates form a tuple with 5 components (x, y, z, u, v). If a vertex coordinate is shared by several faces but is associated with different texture coordinates, you need to duplicate it. You must specify one attribute tuple for each combination of vertex coordinate and texture coordinate required in your mesh.
It is not possible to specify different indices for the vertex coordinates and texture coordinates. See Rendering meshes with multiple indices and Why does OpenGL not support multiple index buffering?.
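As a minimal sketch of what that means in practice (hypothetical names, one face only; putting the UV at attribute location 2 is an assumption, since locations 0 and 1 are used for position and normal above): each face gets its own four vertices with interleaved position and UV, so a full cube needs 24 vertices instead of 8, and a single index buffer addresses all attributes.
// One cube face with interleaved position (x, y, z) and UV (u, v).
// A corner shared by three faces generally needs three different UVs,
// hence the duplication: 4 vertices per face, 24 for the whole cube.
float faceVertices[] = {
//    x      y     z     u     v
    -0.5f, -0.5f, 0.5f, 0.0f, 0.0f,
     0.5f, -0.5f, 0.5f, 1.0f, 0.0f,
     0.5f,  0.5f, 0.5f, 1.0f, 1.0f,
    -0.5f,  0.5f, 0.5f, 0.0f, 1.0f,
};
unsigned int faceIndices[] = { 0, 1, 2, 2, 3, 0 };
// The same indices drive every attribute; the stride is 5 floats.
GLsizei stride = 5 * sizeof(float);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, stride, (void*)0); // position
glEnableVertexAttribArray(0);
glVertexAttribPointer(2, 2, GL_FLOAT, GL_FALSE, stride, (void*)(3 * sizeof(float))); // UV
glEnableVertexAttribArray(2);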

Trouble to access attributes by using glVertexArrayAttribFormat / glVertexArrayVertexBuffer [closed]

I have this code:
Upp::Vector<float> verticesTriangle{
    1.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, -0.5f, -0.5f, 0.0f,
    0.0f, 1.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.5f, -0.5f, 0.0f,
    0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.5f, 0.0f,
};
glGenVertexArrays(1, &VAO);
glBindVertexArray(VAO);
//Setting up the VAO Attribute format
glVertexArrayAttribFormat(VAO, 0, 3, GL_FLOAT, GL_FALSE, 0); //Will be colors (R G B in float)
glVertexArrayAttribFormat(VAO, 1, 2, GL_FLOAT, GL_FALSE, 3); //Will be texture coordinates
glVertexArrayAttribFormat(VAO, 2, 3, GL_FLOAT, GL_FALSE, 5); //Normals
glVertexArrayAttribFormat(VAO, 3, 3, GL_FLOAT, GL_FALSE, 8); //Will be my position
glEnableVertexArrayAttrib(VAO, 0);
glEnableVertexArrayAttrib(VAO, 1);
glEnableVertexArrayAttrib(VAO, 2);
glEnableVertexArrayAttrib(VAO, 3);
//Generating a VBO
glGenBuffers(1, &VBOCarre);
glBindBuffer(GL_ARRAY_BUFFER, VBOCarre);
glBufferStorage(GL_ARRAY_BUFFER, sizeof(float) * verticesTriangle.GetCount(), verticesTriangle, GL_MAP_READ_BIT | GL_MAP_WRITE_BIT);
//Binding the VBO to be read by VAO
glVertexArrayVertexBuffer(VAO, 0, VBOCarre, 0 * sizeof(float), 11 * sizeof(float));
glVertexArrayVertexBuffer(VAO, 1, VBOCarre, 3 * sizeof(float), 11 * sizeof(float));
glVertexArrayVertexBuffer(VAO, 2, VBOCarre, 5 * sizeof(float), 11 * sizeof(float));
glVertexArrayVertexBuffer(VAO, 3, VBOCarre, 8 * sizeof(float), 11 * sizeof(float));
//Bind VAO
glBindVertexArray(VAO);
I have no problem retrieving the first attribute in my shader; however, when I try to retrieve the others, it doesn't work. To test it, I have set up a float array and a simple shader program, and I try to retrieve my position to draw a triangle.
My data is laid out as 11 floats per vertex, ordered as colors (3 floats), texture coordinates (2), normals (3), then positions (3).
Here is my vertex shader:
#version 400
layout (location = 0) in vec3 colors;
layout (location = 1) in vec2 textCoords;
layout (location = 2) in vec3 normals;
layout (location = 3) in vec3 positions;
out vec3 fs_colors;
void main()
{
    gl_Position = vec4(positions.x, positions.y, positions.z, 1.0);
    // gl_Position = vec4(colors.x, colors.y, colors.z, 1.0); // This line works,
    // proving my first attribute is sent to the shader correctly
    fs_colors = colors;
}
The problem is that, except for the first attribute, none of the others seem to be sent to the shader. What am I missing?
You're putting stuff in the wrong place.
glVertexArrayAttribFormat(VAO, 1, 2, GL_FLOAT, GL_FALSE, 3); //Will be texture coordinates
The "3" here is being passed as a byte offset from the start of a vertex in the array to the particular data for that vertex in the attribute. Obviously, your texture coordinate is not 3 bytes from the start of your vertex; it's 3 * sizeof(float) bytes from the start of the vertex.
Similarly:
glVertexArrayVertexBuffer(VAO, 1, VBOCarre, 3 * sizeof(float), 11 * sizeof(float));
This makes no sense either. You're only using a single buffer, and all four attributes should read from the same binding, so you should bind only a single buffer.
The offset ought to be 0, because that's where a vertex in the buffer starts. And the stride should be what you wrote.
You also never directly set the association between the attributes and the binding index with glVertexArrayAttribBinding. You probably got things to work by relying on the default, but you shouldn't be using the default here.
The correct code would be:
//Generating a VBO
glCreateBuffers(1, &VBOCarre);
//No need to call glBindBuffer(GL_ARRAY_BUFFER, VBOCarre);, since we're doing DSA.
glNamedBufferStorage(VBOCarre, sizeof(float) * verticesTriangle.GetCount(), verticesTriangle, GL_MAP_READ_BIT | GL_MAP_WRITE_BIT);
glCreateVertexArrays(1, &VAO);
//No need to glBindVertexArray(VAO);, since we're using DSA.
//Setting up the VAO Attribute format
glEnableVertexArrayAttrib(VAO, 0);
glVertexArrayAttribFormat(VAO, 0, 3, GL_FLOAT, GL_FALSE, 0); //Will be colors (R G B in float)
glEnableVertexArrayAttrib(VAO, 1);
glVertexArrayAttribFormat(VAO, 1, 2, GL_FLOAT, GL_FALSE, 3 * sizeof(float)); //Will be texture coordinates
glEnableVertexArrayAttrib(VAO, 2);
glVertexArrayAttribFormat(VAO, 2, 3, GL_FLOAT, GL_FALSE, 5 * sizeof(float)); //Normals
glEnableVertexArrayAttrib(VAO, 3);
glVertexArrayAttribFormat(VAO, 3, 3, GL_FLOAT, GL_FALSE, 8 * sizeof(float)); //Will be my position
//One buffer, one binding.
glVertexArrayVertexBuffer(VAO, 0, VBOCarre, 0, 11 * sizeof(float));
//Make all attributes read from the same buffer.
glVertexArrayAttribBinding(VAO, 0, 0);
glVertexArrayAttribBinding(VAO, 1, 0);
glVertexArrayAttribBinding(VAO, 2, 0);
glVertexArrayAttribBinding(VAO, 3, 0);
//We can glBindVertexArray(VAO); when we're about to use it, not just because we finished setting it up.
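With the VAO set up this way, drawing is then just (a sketch; program stands for your linked shader program):
glUseProgram(program);
glBindVertexArray(VAO);
glDrawArrays(GL_TRIANGLES, 0, 3); // the 3 vertices in verticesTriangle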

Drawing a cube using Indexed Draw in modern OpenGL

I'm trying to draw a cube using indexed draw in OpenGL 3.3. But it does not show up right...
Here is what I tried doing.
GLfloat vertices01[] = {
    -1.0f, 1.0f, 0.0f,
    -1.0f, -1.0f, 0.0f,
    1.0f, 1.0f, 0.0f,
    1.0f, -1.0f, 0.0f,
    -1.0f, 1.0f, -1.0f,
    -1.0f, -1.0f, -1.0f,
    1.0f, 1.0f, -1.0f,
    1.0f, -1.0f, -1.0f
};
unsigned int indices01[] = {
    0, 2, 3, 1,
    2, 6, 7, 3,
    6, 4, 5, 7,
    4, 0, 1, 5,
    0, 4, 6, 2,
    1, 5, 7, 3
};
Mesh* obj3 = new Mesh();
obj3->CreateMesh(vertices01, indices01, 24, 24);
meshList.push_back(obj3);
meshList[0]->RenderMesh();
//in mesh class
indexCount = numOfIndices;
glGenVertexArrays(1, &VAO);
glBindVertexArray(VAO);
glGenBuffers(1, &IBO);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, IBO);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices[0])*numOfIndices, indices, GL_STATIC_DRAW);
glGenBuffers(1, &VBO);
glBindBuffer(GL_ARRAY_BUFFER, VBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices[0])*numOfVertices, vertices, GL_STATIC_DRAW);
glVertexAttribPointer(0, 4, GL_FLOAT, GL_FALSE, 0, 0);
glEnableVertexAttribArray(0);
glBindBuffer(GL_ARRAY_BUFFER, 0);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
glBindVertexArray(0);
glBindVertexArray(VAO);
//bind ibo
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, IBO);
glDrawElements(GL_TRIANGLES,indexCount, GL_UNSIGNED_INT, 0);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
glBindVertexArray(0);
The output shows a partial cube where each side has only one triangle, plus a triangle going through its diagonal.
If you use a compatibility profile context, then you can keep your indices and use GL_QUADS instead of GL_TRIANGLES. But that's deprecated (Legacy OpenGL).
Since the primitive type is GL_TRIANGLES, each side of the cube has to be formed by 2 triangles. See Triangle primitives.
Change the index buffer to solve the issue:
unsigned int indices01[] = {
    0, 2, 3, 0, 3, 1,
    2, 6, 7, 2, 7, 3,
    6, 4, 5, 6, 5, 7,
    4, 0, 1, 4, 1, 5,
    0, 4, 6, 0, 6, 2,
    1, 5, 7, 1, 7, 3,
};
An alternative solution would be to use the primitive type GL_TRIANGLE_STRIP and a Primitive Restart index.
Enable primitive restart and define a restart index:
e.g.
glEnable( GL_PRIMITIVE_RESTART );
glPrimitiveRestartIndex( 99 );
Define 2 triangle strips, which are separated by the restart index:
unsigned int indices01[] = {
    0, 1, 2, 3, 6, 7, 4, 5,
    99, // 99 is the restart index
    7, 3, 5, 1, 4, 0, 6, 2
};
int indexCount = 17;
And draw the elements:
glDrawElements(GL_TRIANGLE_STRIP, indexCount, GL_UNSIGNED_INT, 0);
Note, the Index buffer binding is stored in the Vertex Array Object.
So it is sufficient to bind the index buffer once, when the VAO is set up:
glGenVertexArrays(1, &VAO);
glBindVertexArray(VAO);
// [...]
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, IBO);
// [...]
// glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0); <---- delete this
glBindVertexArray(0);
Then it is superfluous to bind the index buffer again, before the draw call:
glBindVertexArray(VAO);
// glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, IBO); <---- now this is superfluous
glDrawElements(GL_TRIANGLES,indexCount, GL_UNSIGNED_INT, 0);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
glBindVertexArray(0);

Rendering two separate objects, one object does not appear on screen

I'm trying to create random terrain, surrounded by a skybox, which I will then apply a texture to. I have 2 shader programs running interchangeably but I can't seem to make the skybox appear on screen, whereas the terrain is rendered normally.
The terrain is essentially a grid on the x-z plane and every vertex has a y value generated by a noise algorithm. The grid is square and has a side of SIDE. I have centered the terrain by using an offset on the x and z values, equal to SIDE/2.
The skybox is a cube which also has a side SIDE, so it should fit perfectly with the grid inside it, and should be visible since the y values of the terrain's vertices are small compared to SIDE or even SIDE/2.
This is the code:
GLfloat skyboxVertices[24] = {
    -HALF_SIDE,  HALF_SIDE, -HALF_SIDE, //A = 1
     HALF_SIDE,  HALF_SIDE, -HALF_SIDE, //B = 2
     HALF_SIDE, -HALF_SIDE, -HALF_SIDE, //C = 3
    -HALF_SIDE, -HALF_SIDE, -HALF_SIDE, //D = 4
    -HALF_SIDE,  HALF_SIDE,  HALF_SIDE, //E = 5
     HALF_SIDE,  HALF_SIDE,  HALF_SIDE, //F = 6
     HALF_SIDE, -HALF_SIDE,  HALF_SIDE, //G = 7
    -HALF_SIDE, -HALF_SIDE,  HALF_SIDE  //H = 8
};
//generating & binding skybox VAO
glGenVertexArrays(1, &skyboxVAO);
glBindVertexArray(skyboxVAO);
//creating & binding VBO for skybox vertices
glGenBuffers(1, &skyboxVBO);
glBindBuffer(GL_ARRAY_BUFFER, skyboxVBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(skyboxVertices),
skyboxVertices, GL_STATIC_DRAW);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, NULL);
glEnableVertexAttribArray(0);
//3 indices per triangle, 2 triangles per side, 6 sides for a cube
//for a total of 36
totalSkyboxElements = 36;
GLuint skyboxElements[36]{
    0, 1, 3, 3, 2, 0, //front
    4, 5, 1, 1, 0, 4, //up
    4, 5, 6, 6, 7, 4, //back
    7, 6, 3, 3, 2, 7, //bottom
    1, 5, 6, 6, 3, 1, //right
    0, 4, 7, 7, 2, 0  //left
};
//creating & binding VBO for skybox vertices
glGenBuffers(1, &skyboxElementsVBO);
glBindBuffer(GL_ARRAY_BUFFER, skyboxElementsVBO);
glBufferData(GL_ARRAY_BUFFER, sizeof(skyboxElements),
skyboxElements, GL_STATIC_DRAW);
The code of the vertex and fragment shaders is as follows; they compile and link without errors:
//vertex shader
#version 330 core
layout(location = 0) in vec3 position;
void main()
{
    gl_Position = vec4(position, 1.0);
}
//fragment shader
#version 330 core
out vec4 color;
void main()
{
    color = vec4(1.0, 1.0, 1.0, 1.0);
}
Finally, in the main loop the code is as follows (shaderProgramSB refers to the skybox and shaderProgram to the terrain):
glClear(GL_COLOR_BUFFER_BIT| GL_DEPTH_BUFFER_BIT);
glUseProgram(shaderProgramSB);
glBindVertexArray(skyboxVAO);
glDrawElements(GL_TRIANGLES, totalSkyboxElements, GL_UNSIGNED_INT, 0);
glUseProgram(shaderProgram);
glBindVertexArray(terrainVAO);
camera->update();
projectionMatrix = camera->projectionMatrix;
viewMatrix = camera->viewMatrix;
modelMatrix = glm::mat4(1.0);
MVP = projectionMatrix * viewMatrix * modelMatrix;
glUniformMatrix4fv(MVPLocation, 1, GL_FALSE, &MVP[0][0]);
glDrawElements(GL_TRIANGLES, totalElements, GL_UNSIGNED_INT, 0);
I have tried rendering the terrain first and the skybox after, but the result is always the same: the terrain is rendered normally, and the skybox is not rendered at all. I've been searching and searching but can't locate the problem. Any help is greatly appreciated.
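Two things in the snippet stand out; what follows is a sketch of likely fixes, not a confirmed solution. First, the skybox index data is uploaded to GL_ARRAY_BUFFER instead of GL_ELEMENT_ARRAY_BUFFER, which overwrites the vertex data that was just uploaded and leaves the VAO without an index buffer:
//creating & binding the element buffer for the skybox indices
glGenBuffers(1, &skyboxElementsVBO);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, skyboxElementsVBO);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(skyboxElements),
             skyboxElements, GL_STATIC_DRAW);
Second, the skybox vertex shader writes the raw position straight to gl_Position without applying the camera matrices the terrain gets, so vertices at ±HALF_SIDE fall outside the default clip volume whenever HALF_SIDE > 1. Assuming an MVP uniform like the terrain's:
#version 330 core
layout(location = 0) in vec3 position;
uniform mat4 MVP;
void main()
{
    gl_Position = MVP * vec4(position, 1.0);
}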

OpenGL vertices not rendering

int main() {
    using namespace game1;
    using namespace graphics;
    using namespace maths;
    using namespace util;
    using namespace std;
    Window window("Game 1", 960, 540);
    glClearColor(0.2f, 0.3f, 0.8f, 1.0f);
    GLfloat vertices[] = {
        4, 3, 5,
        12, 3, 5,
        4, 6, 5,
        4, 6, 5,
        12, 6, 5,
        4, 3, 5
    };
    Matrix4f ortho = Matrix4f::genOrtho(0.0f, 16.0f, 0.0f, 9.0f, -1, 1);
    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);
    glEnableVertexAttribArray(0);
    ShaderProgram s;
    s.attachShader(File("res/testVert.vert"), GL_VERTEX_SHADER);
    s.attachShader(File("res/testFrag.frag"), GL_FRAGMENT_SHADER);
    s.link();
    s.enable();
    s.setUniformMat4f("model", Matrix4f::identity());
    s.setUniformMat4f("world", Matrix4f::identity());
    s.setUniformMat4f("proj", ortho);
    while (!window.closed())
    {
        window.clear();
        glDrawArrays(GL_TRIANGLES, 0, 3);
        window.update();
    }
    return 0;
}
I am quite new to OpenGL, and I am not sure what I am doing wrong. I suspect it has something to do with the VBO or glVertexAttribPointer; I've tested lots of things but I can't find what is wrong. It works with a normal 3-vertex triangle, and I'm sure my error is somewhere in main.
The ortho matrix defined in the code has a visible range along the z-axis from -1 to 1. The triangle, on the other hand, is at z = 5, which means it is outside of the near-plane/far-plane range.
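A sketch of two possible fixes (assuming genOrtho follows glOrtho's convention, where the visible z range is [-far, -near]):
// Option 1: widen the depth range so z = 5 is inside the visible volume.
Matrix4f ortho = Matrix4f::genOrtho(0.0f, 16.0f, 0.0f, 9.0f, -10.0f, 10.0f);
// Option 2: keep the matrix and move the triangle into the visible range,
// e.g. use z = 0 instead of z = 5 in the vertices array.
Note also that glDrawArrays(GL_TRIANGLES, 0, 3) draws only the first 3 of the 6 vertices in the array, so only one of the two triangles can appear.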