Drawing many circles using GL_LINE_LOOP - c++

I have a problem rendering many circles on the screen with this code:
float degree = 0;
unsigned int ctr = 0;
for (int xi = -3200; xi < 3200; xi += 2 * r)
{
    for (int yi = 4800; yi > -4800; yi -= 2 * r)
    {
        for (int i = 0; i < 360; ++i)
        {
            vertices.push_back(xi + r * cos(float(degree)));
            vertices.push_back(yi + r * sin(float(degree)));
            vertices.push_back(-8);
            indices.push_back(i + ctr);
            ++degree;
        }
        ctr += 360;
        degree = 0;
    }
}
unsigned int i = 0;
for (i = 0; i < indices.size() / 360; ++i)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, &vertices[i * 360]);
    glLineWidth(1);
    glDrawElements(GL_LINE_LOOP, 360, GL_UNSIGNED_INT, &indices[i * 360]);
    glDisableClientState(GL_VERTEX_ARRAY);
}
Here is the result (screenshot).
In addition, the program crashes when I change the xi range to [-6400, 6400].

Leaving aside the questionable nature of this technique, you look to be accessing the indices incorrectly.
glVertexPointer(3, GL_FLOAT, 0, &vertices[i*360]);
glDrawElements(GL_LINE_LOOP, 360, GL_UNSIGNED_INT, &indices[i*360]);
The indices passed to glDrawElements are offsets into the array set by glVertexPointer. You've defined the indices as absolute positions from the start of the vertex buffer:
indices.push_back(i+ctr);
But you're also moving the buffer offset for each circle you draw. In your index buffer, the second circle starts at index 360; yet when you draw the second circle, you also move the vertex pointer so that index 0 already refers to vertex 360.
So when you then access index 360, you're actually reading vertex 720 (index 360 on top of a buffer start at vertex 360). That reads past the end of your data, which is likely why the larger range crashes.
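One minimal fix (a sketch based on the code above, keeping the deprecated client-array approach): leave the vertex pointer at the start of the buffer so the absolute indices address it correctly. Note that &vertices[i * 360] also advances the pointer by 360 floats, which is only 120 vertices since each vertex is 3 floats; keeping the pointer fixed sidesteps that as well.
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, vertices.data()); // pointer stays at the buffer start
glLineWidth(1);
for (unsigned int i = 0; i < indices.size() / 360; ++i)
{
    // the absolute indices stored for circle i now pick out the right vertices
    glDrawElements(GL_LINE_LOOP, 360, GL_UNSIGNED_INT, &indices[i * 360]);
}
glDisableClientState(GL_VERTEX_ARRAY);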

Related

OpenGL line drawing issue when using glDrawArrays

When I draw a line using legacy OpenGL immediate mode, the lines draw fine:
glEnable(GL_LINE_STIPPLE);
glBegin(GL_LINE_LOOP);
for (size_t idx = 0; idx < m_spline_cvs.size(); idx++) {
    glVertex2f(m_cv_positions[0][idx].x, m_cv_positions[0][idx].y);
}
glEnd();
glDisable(GL_LINE_STIPPLE);
(Screenshot: correct line loop in stipple mode.)
But when I try to use glDrawArrays, something goes wrong:
std::vector<float> cv_tracker_target_line;
for (size_t idx = 0; idx < m_spline_cvs.size(); idx++) {
    cv_tracker_target_line.push_back(m_cv_positions[0][idx].x);
    cv_tracker_target_line.push_back(m_cv_positions[0][idx].y);
}
glEnable(GL_LINE_STIPPLE);
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(2, GL_FLOAT, 0, &cv_tracker_target_line[0]);
glDrawArrays(GL_LINE_LOOP, 0, cv_tracker_target_line.size());
glDisable(GL_LINE_STIPPLE);
glDisableClientState(GL_VERTEX_ARRAY);
(Screenshot: wrong lines, not drawn as a loop.)
What am I doing wrong here?
The 3rd argument of glDrawArrays is the number of vertices, not the number of elements (floats) in the array. Each vertex coordinate here consists of 2 components (x, y), so change
glDrawArrays(GL_LINE_LOOP, 0, cv_tracker_target_line.size());
to
glDrawArrays(GL_LINE_LOOP, 0, cv_tracker_target_line.size() / 2);
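A small defensive variant (my own suggestion, not part of the answer): compute the vertex count once, so the draw call can't silently reuse the float count.
// Each vertex is 2 floats (x, y), so the vertex count is half the float count.
const GLsizei vertexCount = static_cast<GLsizei>(cv_tracker_target_line.size() / 2);
glDrawArrays(GL_LINE_LOOP, 0, vertexCount);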

c++ OpenGL: Mesh only appears on bottom half of the window

I have just begun learning OpenGL, and I think there is a problem with my index array formula.
I'm trying to render a square terrain using an IBO. When I draw with glDrawElements, the result appears only on the bottom half of the screen, tightly packed into a rectangular shape, while with glDrawArrays it works out perfectly: a square, centered mesh.
I load my vertex height values from a grayscale image; here is how I load vertices and create indices.
For vertices: right to left, bottom to top
int numVertices = image.width() * image.height() * 3;
float rowResize = image.width() / 2;
float colResize = image.height() / 2;
GLfloat* vertexData = new GLfloat[numVertices];
int counter = 0;
for (float j = 0; j < col; j++) {
    for (float i = 0; i < row; i++) {
        vertexData[counter++] = (i - rowResize) / rowResize;
        vertexData[counter++] = (j - colResize) / colResize;
        vertexData[counter++] = image.getColor(i, j) / 255.0f;
    }
}
For indices, I'm trying to follow the order {0, 1, 2, 1, 3, 2, ...}:
2     3
-------
|\    |
| \   |
|  \  |
|   \ |
|    \|
-------
0     1
int numIndices = (row - 1) * (col - 1) * 2 * 3;
unsigned short* indexData = new unsigned short[numIndices];
counter = 0;
for (short y = 0; y < col - 1; y++) {
    for (short x = 0; x < row - 1; x++) {
        // lower triangle
        short L_first  = y * row + x;
        short L_second = L_first + 1;
        short L_third  = L_first + row;
        // upper triangle
        short U_first  = L_first + 1;
        short U_second = U_first + row;
        short U_third  = U_second - 1;
        indexData[counter++] = L_first;
        indexData[counter++] = L_second;
        indexData[counter++] = L_third;
        indexData[counter++] = U_first;
        indexData[counter++] = U_second;
        indexData[counter++] = U_third;
    }
}
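(As a sanity check of the formula, my own illustration rather than part of the original post, here is what it emits for the first quad of a 3x3 vertex grid, i.e. row = col = 3:)
// Quad at x = 0, y = 0 on a 3x3 vertex grid (row = 3).
// Grid vertex numbers: (0,0) -> 0, (1,0) -> 1, (0,1) -> 3, (1,1) -> 4.
short L_first  = 0 * 3 + 0;     // 0, diagram corner 0
short L_second = L_first + 1;   // 1, diagram corner 1
short L_third  = L_first + 3;   // 3, diagram corner 2
short U_first  = L_first + 1;   // 1, diagram corner 1
short U_second = U_first + 3;   // 4, diagram corner 3
short U_third  = U_second - 1;  // 3, diagram corner 2
// Emitted order: 0 1 3  1 4 3, matching the {0, 1, 2, 1, 3, 2} corner pattern.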
I initialize the VAO, VBO and IBO: for each buffer object I gen, bind and upload the data, then unbind everything.
In the game loop I have:
glBindVertexArray(VAO);
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(GLfloat) * 3, 0);
//glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
glDrawArrays(GL_POINTS, 0, numVertices);
//glDrawElements(GL_TRIANGLE_STRIP, numIndices, GL_UNSIGNED_SHORT, 0);
glBindVertexArray(0);
glfwSwapBuffers(window);
Since drawing from vertices works and drawing from indices doesn't, what could be wrong with my indices generation?
Thank you for your help!
(Weird thing: I just tried with another grayscale image, and it worked well both when drawing from vertices (GL_POINTS) and from indices (GL_TRIANGLE_STRIP)... welp)
(Pictures: one using glDrawArrays, one using glDrawElements.)

OpenGL Vertices being clipped from the side

I'm having my vertices clipped at the edges, as shown in this album:
http://imgur.com/a/VkCrJ
When my terrain size is 400x400 I get clipping, yet at 40x40 or anything less I don't get any clipping.
This is my code to fill the position and indices:
void Terrain::fillPosition()
{
    // start from the top right and work your way down to 1,1
    double x = -1, y = 1, z = 1;
    float rowValue = static_cast<float>((1.0f / _rows) * 2.0);    // .05 if 40
    float colValue = static_cast<float>((1.0f / _columns) * 2.0); // .05 if 40
    for (y; y > -1; y -= colValue)
    {
        for (x; x < 1; x += rowValue)
        {
            _vertexPosition.emplace_back(glm::vec3(x, y, z));
        }
        x = -1;
    }
}
This properly sets my positions; I've tested it with GL_POINTS. It works fine at 400x400, at 40x40, and at other values in between.
Index code:
void Terrain::fillIndices()
{
    glm::ivec3 triangle1, triangle2;
    for (int y = 0; y < _columns - 1; y++)
    {
        for (int x = 0; x < _rows - 1; x++)
        {
            // Triangle 1
            triangle1.x = x + y * _rows;
            triangle1.y = x + (y + 1) * _rows;
            triangle1.z = (x + 1) + y * _rows;
            // Triangle 2
            triangle2.x = triangle1.y;
            triangle2.y = (x + 1) + (y + 1) * _rows;
            triangle2.z = triangle1.z;
            // add our data to the vector
            _indices.emplace_back(triangle1.x);
            _indices.emplace_back(triangle1.y);
            _indices.emplace_back(triangle1.z);
            _indices.emplace_back(triangle2.x);
            _indices.emplace_back(triangle2.y);
            _indices.emplace_back(triangle2.z);
        }
    }
}
_indices is a std::vector. I'm not sure what's causing this, but I'm pretty sure it's the way I'm filling the indices for the mesh. I've re-written my algorithm and it ends up with the same result: small values work perfectly fine, and large values over ~144 get clipped. I fill my buffers like this:
void Terrain::loadBuffers()
{
    // generate the buffers and vertex arrays
    glGenVertexArrays(1, &_vao);
    glGenBuffers(1, &_vbo);
    glGenBuffers(1, &_ebo);
    // bind the vertex array
    glBindVertexArray(_vao);
    // bind the buffer to the vao
    glBindBuffer(GL_ARRAY_BUFFER, _vbo);
    glBufferData(GL_ARRAY_BUFFER, _vertexPosition.size() * sizeof(_vertexPosition[0]), _vertexPosition.data(), GL_STATIC_DRAW);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _ebo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, _indices.size() * sizeof(_indices[0]), _indices.data(), GL_STATIC_DRAW);
    // enable the shader locations
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, nullptr);
    // unbind our data
    glBindVertexArray(0);
}
and my draw call:
void Terrain::renderTerrain(ResourceManager& manager, ResourceIdTextures id)
{
// set the active texture
glActiveTexture(GL_TEXTURE0);
// bind our texture
glBindTexture(GL_TEXTURE_2D, manager.getTexture(id).getTexture());
_shaders.use();
// send data the our uniforms
glUniformMatrix4fv(_modelLoc, 1, GL_FALSE, glm::value_ptr(_model));
glUniformMatrix4fv(_viewLoc, 1, GL_FALSE, glm::value_ptr(_view));
glUniformMatrix4fv(_projectionLoc, 1, GL_FALSE, glm::value_ptr(_projection));
glUniform1i(_textureLoc, 0);
glBindVertexArray(_vao);
// Draw our terrain;
glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
glDrawElements(GL_TRIANGLES, _indices.size(), GL_UNSIGNED_INT, 0);
glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
glBindVertexArray(0);
_shaders.unuse();
}
I thought it was because of my transformations to the model, so I removed all transformations, and it's the same result. I tried debugging by printing the glm::vec3 values with glm::to_string, but the data looks fine. My projection matrix is:
glm::perspective(glm::radians(_fov), _aspRatio, 0.1f, 1000.0f);
So I doubt it's my perspective doing the clipping. _aspRatio is 16/9.
It's really strange that it works fine with small rows x columns values and not with large ones; I'm really not sure what the problem is.
I would check the length of _vertexPosition; I suspect the problem is that you are (depending on the number of _rows) generating an extra point at the end of your inner loop (and your outer loop too, depending on _columns).
The reason is that the termination condition of your vertex loops depends on the exact behavior of your floating-point math. Specifically, you divide the range [-1, 1] into _rows segments, then repeatedly add the segment size to the coordinate and use the running value as the termination test. It is unclear whether you expect a final point (yielding _rows + 1 points per inner loop) or not (yielding a grid which doesn't cover the entire [-1, 1] range). Unfortunately, floating point is not exact, so this is a recipe for unreliable behavior: depending on the direction of your floating-point error, you might get one or the other.
For a larger number of _rows, you are adding more (and significantly smaller) numbers to the same initial value; this will aggravate your floating point error.
At any rate, in order to get reliable behavior, you should use integer loop variables to determine loop termination. Accumulate your floating point coordinates separately, so that exact accuracy is not required.
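For fillPosition(), that could look like the following sketch (my rewrite, assuming the index code expects exactly _rows x _columns vertices covering [-1, 1] inclusive, and that _rows and _columns are both greater than 1):
void Terrain::fillPosition()
{
    const float z = 1.0f;
    for (int j = 0; j < _columns; ++j)      // integer bound: exactly _columns rows
    {
        // compute each coordinate directly instead of accumulating increments
        const float y = 1.0f - 2.0f * j / (_columns - 1);
        for (int i = 0; i < _rows; ++i)     // exactly _rows points per row
        {
            const float x = -1.0f + 2.0f * i / (_rows - 1);
            _vertexPosition.emplace_back(glm::vec3(x, y, z));
        }
    }
}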

Rotating Vertex Array Object not working

I am using vertex arrays to store circle vertices and colors.
Here is the setup function:
void setup1(void)
{
    glClearColor(1.0, 1.0, 1.0, 0.0);
    // Enable two vertex arrays: co-ordinates and color.
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_COLOR_ARRAY);
    // Specify locations for the co-ordinates and color arrays.
    glVertexPointer(3, GL_FLOAT, 0, Vertices1);
    glColorPointer(3, GL_FLOAT, 0, Colors1);
}
The global declaration of the arrays is here:
static float Vertices1[500] = { 0 };
static float Colors1[500] = { 0 };
The arrays are set up here (R is the radius, (X, Y) is the center of the circle, and t is the angle parameter):
void doGlobals1()
{
    for (int i = 0; i < numVertices1 * 3; i += 3)
    {
        Vertices1[i] = X + R * cos(t);
        Vertices1[i + 1] = Y + R * sin(t);
        Vertices1[i + 2] = 0.0;
        t += 2 * PI / numVertices1;
    }
    for (int j = 0; j < numVertices1 * 3; j += 3)
    {
        Colors1[j] = (float)rand() / (float)RAND_MAX;
        Colors1[j + 1] = (float)rand() / (float)RAND_MAX;
        Colors1[j + 2] = (float)rand() / (float)RAND_MAX;
    }
}
Finally, this is where the shape is drawn.
// Window 1 drawing routine.
void drawScene1(void)
{
    glutSetWindow(win1);
    glLoadIdentity();
    doGlobals1();
    glClear(GL_COLOR_BUFFER_BIT);
    glRotatef(15, 1, 0, 0);
    glDrawArrays(GL_TRIANGLE_FAN, 0, numVertices1);
    glFlush();
}
Without the rotation, the circle draws just fine. The circle also draws fine with any scale/translate function. I suspect there is some special protocol necessary to rotate an object drawn with vertex arrays.
Can anyone tell me where I have gone wrong, what I need to do to rotate the object, or offer any advice?
glRotatef(15, 1, 0, 0);
^ why the X axis?
The default ortho projection matrix has pretty tight near/far clipping planes: -1 to 1.
Rotating your circle of X/Y coordinates outside of the X/Y plane will tend to make those points get clipped.
Rotate around the Z axis instead:
glRotatef(15, 0, 0, 1);
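If the X-axis tilt is actually intended, an alternative sketch (my addition; the deeper near/far values are an arbitrary choice) is to widen the clip volume in the projection setup so the tilted circle stays inside it:
// Extend the default -1..1 near/far range so points rotated out of the
// X/Y plane are no longer clipped.
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(-1.0, 1.0, -1.0, 1.0, -10.0, 10.0);
glMatrixMode(GL_MODELVIEW);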

Hard time understanding indices with glDrawElements

I'm trying to draw a terrain with GL_TRIANGLE_STRIP and glDrawElements, but I'm having a really hard time understanding how the indices behind glDrawElements work...
Here's what I have so far:
void Terrain::GenerateVertexBufferObjects(float ox, float oy, float oz) {
    float startWidth, startLength, *vArray;
    int vCount, vIndex = -1;
    // width = length = 256
    startWidth = (width / 2.0f) - width;
    startLength = (length / 2.0f) - length;
    vCount = 3 * width * length;
    vArray = new float[vCount];
    for (int z = 0; z < length; z++) {
        // each row advances vIndex by width * 3 = 256 * 3 = 768
        for (int x = 0; x < width; x++) {
            vArray[++vIndex] = ox + startWidth + (x * stepWidth);
            vArray[++vIndex] = oy + heights[z][x];
            vArray[++vIndex] = oz + startLength + (z * stepLength);
        }
    }
    glGenBuffers(1, &vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(float) * vCount, vArray, GL_STATIC_DRAW);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}
void Terrain::DrawVBO(unsigned int texID, float ox, float oy, float oz) {
    float terrainLight[] = { 1.0f, 1.0f, 1.0f, 1.0f };
    if (!generatedVBOs) {
        GenerateVertexBufferObjects(ox, oy, oz);
        generatedVBOs = true;
    }
    unsigned int indices[] = { 0, 768, 3, 771 };
    glGenBuffers(1, &indexBuffer);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(unsigned int) * 4, indices, GL_STATIC_DRAW);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
    glEnableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
    glVertexPointer(3, GL_FLOAT, 0, 0);
    glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBuffer);
    glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE, terrainLight);
    glDrawElements(GL_TRIANGLE_STRIP, 4, GL_UNSIGNED_INT, 0);
    glPolygonMode(GL_FRONT_AND_BACK, GL_FILL);
    glDisableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}
I believe my vArray is correct; I use the same values when drawing with glBegin(GL_TRIANGLE_STRIP)/glEnd(), which works just fine.
My guess was to use the index of the x coordinate of each vertex, but I have no idea if that's the right way to use indices with glDrawElements.
0: Index of the x coordinate from the first vertex of the triangle. Location: (-128, -128).
768: Index of the x coordinate from the second vertex of the triangle. Location: (-128, -127)
3: Index of the x coordinate from the third vertex of the triangle. Location: (-127, -128)
771: Index of the x coordinate from the fourth vertex, which will draw a second triangle. Location: (-127, -127).
I think everything is making sense so far?
What's not working is that the location values above (which I double-checked in vArray, and they are correct) are not the ones glDrawElements uses. Two triangles are drawn, but they are a lot bigger than they should be. It starts correctly at (-128, -128) but goes to something like (-125, -125) instead of (-127, -127).
I can't understand what I'm doing wrong here...
Using something like the following solves my problem:
unsigned int indices[] = { 0, 256, 1, 257 };
I think it's safe to assume that each index refers to a whole vertex, not to its x coordinate: OpenGL expects the y and z to follow automatically, so we shouldn't multiply by 3 ourselves; the server does it for us.
And now that I think about it, glDrawElements has the word element in it: an element here is a whole vertex with the 3 coordinates specified in glVertexPointer, and we need to pass indices of elements (vertices), not of individual floats.
I feel so dumb now...
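To generalize the { 0, 256, 1, 257 } pattern, here is a sketch (my illustration; stripIndicesForRow is a hypothetical helper) that builds triangle-strip vertex indices for one full row of a width x length grid:
#include <vector>

std::vector<unsigned int> stripIndicesForRow(unsigned int z, unsigned int width) {
    std::vector<unsigned int> indices;
    indices.reserve(2 * width);
    for (unsigned int x = 0; x < width; ++x) {
        indices.push_back(z * width + x);       // vertex in row z
        indices.push_back((z + 1) * width + x); // vertex directly below, in row z + 1
    }
    return indices; // for z = 0, width = 256: 0, 256, 1, 257, ...
}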