OpenGL Vertex Buffer doesn't draw anything in Go

I tried to follow this tutorial in Go: http://www.opengl-tutorial.org/beginners-tutorials/tutorial-2-the-first-triangle/
The Go version opens the window and makes the background blue, but it doesn't show the triangle. The C version does show it.
This is the code in Go:
err := glfw.Init()
if err != nil {
    log.Fatal("Failed to init GLFW: " + err.Error())
}
err = glfw.OpenWindow(1024, 768, 0, 0, 0, 0, 32, 0, glfw.Windowed)
if err != nil {
    log.Fatal("Failed to open GLFW window: " + err.Error())
}
if gl.Init() != 0 {
    log.Fatal("Failed to init GL")
}
gl.ClearColor(0.0, 0.0, 0.3, 0.0)

// create vertexbuffer
gVertexBufferData := []float32{-1.0, -1.0, 0.0, 1.0, -1.0, 0.0, 0.0, 1.0, 0.0}
vertexBuffer := gl.GenBuffer()
vertexBuffer.Bind(gl.ARRAY_BUFFER)
gl.BufferData(gl.ARRAY_BUFFER, len(gVertexBufferData), gVertexBufferData, gl.STATIC_DRAW)

for {
    // clear screen
    gl.Clear(gl.COLOR_BUFFER_BIT)

    // first attribute buffer: vertices
    var vertexAttrib gl.AttribLocation = 0
    vertexAttrib.EnableArray()
    vertexBuffer.Bind(gl.ARRAY_BUFFER)
    var f float32 = 0.0
    vertexAttrib.AttribPointer(
        3,     // size
        false, // normalized?
        0,     // stride
        &f)    // array buffer offset

    // draw the triangle
    gl.DrawArrays(gl.TRIANGLES, 0, 3)
    vertexAttrib.DisableArray()

    glfw.SwapBuffers()
}
And this is the C code, which works:
if(!glfwInit())
    return -1;
if(!glfwOpenWindow(1024, 768, 0, 0, 0, 0, 32, 0, GLFW_WINDOW))
    return -1;
if(glewInit() != GLEW_OK)
    return -1;

glClearColor(0.0f, 0.0f, 0.3f, 0.0f);

GLuint VertexArrayID;
glGenVertexArrays(1, &VertexArrayID);
glBindVertexArray(VertexArrayID);

static const GLfloat g_vertex_buffer_data[] = {
    -1.0f, -1.0f, 0.0f,
     1.0f, -1.0f, 0.0f,
     0.0f,  1.0f, 0.0f,
};

GLuint vertexbuffer;
glGenBuffers(1, &vertexbuffer);
glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(g_vertex_buffer_data), g_vertex_buffer_data, GL_STATIC_DRAW);

while(1) {
    glClear(GL_COLOR_BUFFER_BIT);

    // 1st attribute buffer: vertices
    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
    glVertexAttribPointer(
        0,
        3,        // size
        GL_FLOAT, // type
        GL_FALSE, // normalized?
        0,        // stride
        (void*)0  // array buffer offset
    );

    // Draw the triangle !
    glDrawArrays(GL_TRIANGLES, 0, 3); // From index 0 to 3 -> 1 triangle
    glDisableVertexAttribArray(0);

    // Swap buffers
    glfwSwapBuffers();
}
Maybe I'm giving vertexAttrib.AttribPointer() the wrong arguments, because I'm not sure what to pass instead of (void*)0. I tried nil, but that made the application crash, and &gVertexBufferData[0] doesn't work either.
I'm using github.com/banthar/gl as the GLEW wrapper, Go 1.0.2, and Ubuntu 12.04 amd64.
EDIT: glGetError() doesn't report any errors.

I had the same problem and I managed to fix it after looking at your post, so first of all thanks a lot.
I managed to display a triangle by using the work branch of the banthar bindings, with this call to AttribPointer:
vertexAttrib.AttribPointer(
    3,        // size
    gl.FLOAT, // type
    false,    // normalized?
    0,        // stride
    nil)      // array buffer offset
and by passing the size in bytes to BufferData.
[...]
data := []float32{0, 1, 0, -1, -1, 0, 1, -1, 0}
[...]
gl.BufferData(gl.ARRAY_BUFFER, len(data)*4, data, gl.STATIC_DRAW)
[...]
There is probably a better way to pass the right length.

I recently ran into a similar issue with the Go OpenGL bindings, and this question was one of the only references to it I could find.
However, none of the existing answers solved my problem, as the bindings appear to be slightly different now in 2015 than they looked in 2012.
The part of the solution that hasn't already been covered by the existing answers involves the gl.BufferData() call made when creating a VBO.
A problematic version of the code in question looks like this:
[...]
vertices := []float32{0, 1, 0, -1, -1, 0, 1, -1, 0}
[...]
gl.BufferData(
gl.ARRAY_BUFFER,
len(vertices)*4,
unsafe.Pointer(&vertices),
gl.STATIC_DRAW)
[...]
One of the answers already provided recommends changing this code to something like this:
[...]
vertices := []float32{0, 1, 0, -1, -1, 0, 1, -1, 0}
[...]
gl.BufferData(
gl.ARRAY_BUFFER,
len(vertices)*4,
vertices,
gl.STATIC_DRAW)
[...]
However, the bindings I used had a different function signature from the one used there, and compilation failed with:
cannot use vertices (type []float32) as type unsafe.Pointer in argument to gl.BufferData
The solution I ended up finding, which I'm putting here so nobody else has to go through the same headache, looks like this:
[...]
vertices := []float32{0, 1, 0, -1, -1, 0, 1, -1, 0}
[...]
gl.BufferData(
gl.ARRAY_BUFFER,
len(vertices)*4, //len(vertices)*int(reflect.TypeOf(vertices).Elem().Size()),
gl.Ptr(vertices),
gl.STATIC_DRAW)
[...]
I also included, as a comment, an alternative to len(vertices)*4 that produces exactly the same result but derives the '4' from the slice's element type (float32 in this case).
Footnotes
The bindings I used:
github.com/go-gl/gl/all-core/gl
github.com/go-gl/glfw/v3.1/glfw
My OpenGL context was created with these hints:
primaryMonitor := glfw.GetPrimaryMonitor()
vidMode := primaryMonitor.GetVideoMode()
glfw.WindowHint(glfw.ContextVersionMajor, 3)
glfw.WindowHint(glfw.ContextVersionMinor, 3)
glfw.WindowHint(glfw.OpenGLProfile, glfw.OpenGLCoreProfile)
glfw.WindowHint(glfw.OpenGLForwardCompatible, glfw.True)
glfw.WindowHint(glfw.RedBits, vidMode.RedBits)
glfw.WindowHint(glfw.GreenBits, vidMode.GreenBits)
glfw.WindowHint(glfw.BlueBits, vidMode.BlueBits)
glfw.WindowHint(glfw.RefreshRate, vidMode.RefreshRate)
glfw.WindowHint(glfw.Visible, glfw.False)

I was having the same problem; it turned out that calling glfw.OpenWindowHint was somehow breaking it. It would request the correct context, my OpenGL version would match, and I would get no errors at all, but nothing would draw. If I leave out the hint, I get a 4.3 context and everything seems to work.
Even if I request 4.3 in the hint, it doesn't work. If I request something else, my OpenGL version string matches the request, but once again nothing is drawn.
Hope this helps

I don't know how the OpenGL bindings to Go look exactly, but I can tell you at least this much:
The last parameter to glVertexAttribPointer should be the byte offset from the start of the buffer object, so (in your case) 0.
Note: logically, that parameter is just a byte offset into the buffer, so its type really ought to be an integer. It is void* only for legacy reasons; the parameter had a different meaning before VBOs existed.
Instead of &f, try passing either a literal 0 or, if that doesn't work, a pointer whose value is 0. How to do that in Go is for you to figure out, since I don't grok Go, but I've told you what OpenGL expects and I hope that helps.
Also: For debugging, please check glGetError() often.
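To make the glGetError() checks less tedious, here is a small, self-contained C helper (the function name is mine, not part of any library) that drains the whole error queue and prints every pending error; the same idea carries over to whichever GetError wrapper the Go bindings expose:

#include <stdio.h>
#include <GL/gl.h>

/* Drain the GL error queue and print every pending error.
 * 'where' is just a label identifying the call site being checked. */
static void checkGLErrors(const char *where)
{
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR) {
        switch (err) {
        case GL_INVALID_ENUM:      printf("%s: GL_INVALID_ENUM\n", where); break;
        case GL_INVALID_VALUE:     printf("%s: GL_INVALID_VALUE\n", where); break;
        case GL_INVALID_OPERATION: printf("%s: GL_INVALID_OPERATION\n", where); break;
        case GL_OUT_OF_MEMORY:     printf("%s: GL_OUT_OF_MEMORY\n", where); break;
        default:                   printf("%s: error 0x%04x\n", where, err); break;
        }
    }
}

/* usage: checkGLErrors("after glBufferData"); */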

Related

Issue using glTexCoordPointer()

I'm fairly new to OpenGL (and GLSL) and I have an issue using glTexCoordPointer().
I have the texture loaded in and it is rendering on the object correctly (a single quad), but I also get another quad appearing which is a single colour and not part of the loaded texture.
The arrays are defined as follows:
static const GLfloat obj_vert_buf[] = {
-1, 0, -1,
-1, 0, 1,
1, 0, 1,
1, 0, -1
};
static const GLfloat obj_tex_buf[] = {
0, 0,
0, 1,
1, 1,
1, 0
};
And the relevant excerpt from the draw function:
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY_EXT);
glGenBuffers(1, &obj_id);
glTexCoordPointer(2, GL_FLOAT, 0, obj_tex_buf);
glVertexPointer(3, GL_FLOAT, 0, obj_vert_buf);
glDrawArrays(GL_QUADS, 0, sizeof(obj_vert_buf) / sizeof(GLfloat));
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_TEXTURE_COORD_ARRAY_EXT);
To my understanding, glTexCoordPointer()'s first argument specifies the number of coordinates per vertex, which would be two, as in:
glTexCoord2f(0.0, 0.0);
The second argument is the type, GLfloat.
The third argument is the stride, i.e. the byte offset between consecutive sets of coordinates, so zero here since the data is tightly packed (I have also tried 2 * sizeof(GLfloat), with no change).
And the fourth argument is a pointer to the start of the data, i.e. obj_tex_buf.
The quad renders correctly and the texture is drawn on it correctly, but I get another random shape coming off from its centre, textured incorrectly; any thoughts would be great. The additional quad isn't visible without the glTexCoordPointer() line.
From the docs:
count: Specifies the number of indices to be rendered.
Thus you have to call glDrawArrays(GL_QUADS, 0, 4);
Please note that GL_QUADS isn't officially supported anymore as of OpenGL 3.1.
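For illustration, here is a sketch of the corrected draw path using the arrays from the question; it derives the count from the vertex array size (three floats per vertex) and assumes the standard GL_TEXTURE_COORD_ARRAY token rather than the _EXT variant:

/* 3 floats per vertex, so this evaluates to 4 */
GLsizei vertCount = (GLsizei)(sizeof(obj_vert_buf) / (3 * sizeof(GLfloat)));

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, obj_vert_buf);
glTexCoordPointer(2, GL_FLOAT, 0, obj_tex_buf);
glDrawArrays(GL_QUADS, 0, vertCount);          /* 4 vertices -> one quad */
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);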

Unpacking into an SSB

I use part of an SSB (shader storage buffer) as a 3D matrix of linked lists. Each voxel of the matrix is a uint that gives the location of the first element of its list.
Before each frame is rendered, I need to re-initialize this matrix, but not the whole SSB. So I associated the part corresponding to the matrix with a 1D texture, to be able to unpack a buffer into it.
//Storage Shader buffer
glGenBuffers(1, &m_buffer);
glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, m_buffer);
glBufferData(GL_SHADER_STORAGE_BUFFER,
headerMatrixSizeInByte + linkedListSizeInByte,
NULL,
GL_DYNAMIC_DRAW);
glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, 0);
//Texture
glGenTextures(1, &m_texture);
glBindTexture(GL_TEXTURE_1D, m_texture);
glTexBufferRange(
GL_TEXTURE_BUFFER,
GL_R32UI,
m_buffer,
0,
headerMatrixSizeInByte);
glBindTexture(GL_TEXTURE_1D, 0);
//Unpack buffer
GLubyte* clearData = new GLubyte[headerMatrixSizeInByte];
memset(clearData, 0xff, headerMatrixSizeInByte);
glGenBuffers(1, &m_clearBuffer);
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, m_clearBuffer);
glBufferData(
GL_PIXEL_UNPACK_BUFFER,
headerMatrixSizeInByte,
clearData,
GL_STATIC_COPY);
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
delete[] clearData;
So this is the initialization, now here is the clear attempt :
GLuint err;
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, m_clearBuffer);
glBindTexture(GL_TEXTURE_1D, m_texture);
err = m_pFunctions->glGetError(); //no error
glTexSubImage1D(
GL_TEXTURE_1D,
0,
0,
m_textureSize,
GL_RED_INTEGER,
GL_UNSIGNED_INT,
NULL);
err = m_pFunctions->glGetError(); //err GL_INVALID_VALUE
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
glBindTexture(GL_TEXTURE_1D, 0);
My questions are:
Is it possible to do what I'm attempting to?
If yes, where did I screw up?
Thanks again to Andon, who got half of the answer. There are two problems in the code above:
m_textureSize = 32770, which exceeds the per-dimension texture size limit on a lot of hardware. The easy workaround is to use a 2D texture. Since I don't care about the content after the linked lists in the buffer, I can write whatever I want there; it will be overwritten by the shaders in the next rendering call.
When creating the texture, one function call was missing: glTexStorage2D(GL_TEXTURE_2D, 1, GL_R32UI, width, height);
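For reference, a minimal sketch of that 2D-texture workaround under the same assumptions (GL_R32UI data, a pixel-unpack buffer already filled with the 0xff clear pattern); the texture, clearBuffer, width and height names here are illustrative, not from the original code:

/* allocate immutable storage for a 2D integer texture */
glBindTexture(GL_TEXTURE_2D, texture);
glTexStorage2D(GL_TEXTURE_2D, 1, GL_R32UI, width, height);

/* clear it by unpacking from the PBO (NULL means offset 0 into the bound unpack buffer) */
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, clearBuffer);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                GL_RED_INTEGER, GL_UNSIGNED_INT, NULL);
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
glBindTexture(GL_TEXTURE_2D, 0);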

Transparency or depth-test error in a really simple two-pass effect

I want to set up a really simple two-pass effect. The first pass draws a textured object to a texture. The second pass creates a full-screen quad in the geometry shader and textures it with the texture written in pass one.
The texture and framebuffer are set up in the following way:
gl.glGenFramebuffers(1, frameBufferHandle, 0);
gl.glBindFramebuffer(GL3.GL_FRAMEBUFFER, frameBufferHandle[0]);
texture = new Texture(gl, new TextureData(gl.getGLProfile(), GL3.GL_RGB, viewportWidth, viewportHeight,
0, GL3.GL_RGB, GL3.GL_UNSIGNED_BYTE, false, false, false, null, null));
texture.setTexParameteri(gl, GL3.GL_TEXTURE_MAG_FILTER, GL3.GL_LINEAR);
texture.setTexParameteri(gl, GL3.GL_TEXTURE_MIN_FILTER, GL3.GL_LINEAR);
gl.glFramebufferTexture(GL3.GL_FRAMEBUFFER, GL3.GL_COLOR_ATTACHMENT0, texture.getTextureObject(), 0);
int drawBuffers[] = {GL3.GL_COLOR_ATTACHMENT0};
gl.glDrawBuffers(1, drawBuffers, 0);
if (gl.glCheckFramebufferStatus(GL3.GL_FRAMEBUFFER) != GL3.GL_FRAMEBUFFER_COMPLETE)
throw new Exception("error while creating framebuffer");
The render function looks like:
// 1st pass
gl.glBindFramebuffer(GL3.GL_FRAMEBUFFER, frameBufferHandle[0]);
gl.glClearColor(0.2f, 0.2f, 0.2f, 1.0f);
gl.glClear(GL3.GL_STENCIL_BUFFER_BIT | GL3.GL_COLOR_BUFFER_BIT | GL3.GL_DEPTH_BUFFER_BIT);
texturePass.apply();
texturePass.updatePerObject(world);
texturePass.updateTexture(object.getDiffuseMap());
object.draw(gl);
// 2nd pass
gl.glBindFramebuffer(GL3.GL_FRAMEBUFFER, 0);
gl.glClearColor(0.2f, 0.2f, 0.2f, 1.0f);
gl.glClear(GL3.GL_STENCIL_BUFFER_BIT | GL3.GL_COLOR_BUFFER_BIT | GL3.GL_DEPTH_BUFFER_BIT);
fullscreenQuadPass.apply();
fullscreenQuadPass.updateTexture(texture);
gl.glDrawArrays(GL3.GL_POINTS, 0, 1);
A screenshot of the result shows the problem: one can see through the golem and see his right hand. It seems like there is some kind of depth-test or transparency error.
Everything looks fine if I comment the 2nd pass out and replace
gl.glBindFramebuffer(GL3.GL_FRAMEBUFFER, frameBufferHandle[0]);
by
gl.glBindFramebuffer(GL3.GL_FRAMEBUFFER, 0);
Does anyone have an idea what is going on here?
EDIT: In fact, I'm missing a depth attachment on the FBO used in the first pass. Thus, I've updated my initialization sequence to:
// Create framebuffer
gl.glGenFramebuffers(1, frameBufferHandle, 0);
gl.glBindFramebuffer(GL4.GL_FRAMEBUFFER, frameBufferHandle[0]);
// Set up color texture
colorTexture = new Texture(gl, new TextureData(gl.getGLProfile(),
GL4.GL_RGBA, width, height, 0, GL4.GL_RGBA, GL4.GL_UNSIGNED_BYTE,
false, false, false, null, null));
gl.glFramebufferTexture(GL4.GL_FRAMEBUFFER, GL4.GL_COLOR_ATTACHMENT0,
colorTexture.getTextureObject(), 0);
// Create and set up depth renderbuffer
gl.glGenRenderbuffers(GL4.GL_RENDERBUFFER, depthRenderBufferHandle, 0);
gl.glBindRenderbuffer(GL4.GL_RENDERBUFFER, depthRenderBufferHandle[0]);
gl.glRenderbufferStorage(GL4.GL_RENDERBUFFER, GL4.GL_DEPTH_COMPONENT,
width, height);
gl.glFramebufferRenderbuffer(GL4.GL_FRAMEBUFFER, GL4.GL_DEPTH_ATTACHMENT,
GL4.GL_RENDERBUFFER, depthRenderBufferHandle[0]);
int drawBuffers[] = {GL4.GL_COLOR_ATTACHMENT0};
gl.glDrawBuffers(1, drawBuffers, 0);
However, now my program crashes with a "fatal error" from the Java Runtime Environment. If I comment the newly added lines out, everything "works fine". What is the problem now?
EDIT2: I've no idea why I've written
gl.glGenRenderbuffers(GL4.GL_RENDERBUFFER, depthRenderBufferHandle, 0);
Of course, it should be
gl.glGenRenderbuffers(1, depthRenderBufferHandle, 0);
That solved my problem.
Your Framebuffer Object currently lacks a depth attachment.
Here is some C pseudo-code that will address your problem:
GLuint depth_rbo;
glGenRenderbuffers (1, &depth_rbo);
glBindRenderbuffer (GL_RENDERBUFFER, depth_rbo);
glRenderbufferStorage (GL_RENDERBUFFER, GL_DEPTH_COMPONENT, width, height);
glFramebufferRenderbuffer (GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depth_rbo);
In fact, it also lacks a stencil attachment, so I am not sure why you are clearing the stencil buffer.
If you have stencil operations to perform, you will need to allocate storage for it as well. Moreover, if you need both depth and stencil in an FBO, you must use a packed depth-stencil format (e.g. GL_DEPTH24_STENCIL8) bound to the GL_DEPTH_STENCIL_ATTACHMENT point.
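A minimal sketch of that combined attachment, in the same C pseudo-code style as above (width and height as before):

GLuint depth_stencil_rbo;
glGenRenderbuffers (1, &depth_stencil_rbo);
glBindRenderbuffer (GL_RENDERBUFFER, depth_stencil_rbo);
/* one packed renderbuffer provides 24 depth bits and 8 stencil bits */
glRenderbufferStorage (GL_RENDERBUFFER, GL_DEPTH24_STENCIL8, width, height);
/* attach it to the combined depth+stencil attachment point */
glFramebufferRenderbuffer (GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT, GL_RENDERBUFFER, depth_stencil_rbo);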

Something wrong with my VBO

I'm trying to emulate exactly how a game sets up a VBO and draws it to the screen. I've never set one up before and the tutorials all show how to do it with glDrawArrays but I want to use glDrawElements.
I came up with the following:
glViewport(0, 0, 765, 553);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0, 765, 553, 0.0, -1.0, 1.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
xCast(ptr_glActiveTextureARB, ptr_wglGetProcAddress("glActiveTextureARB"));
xCast(ptr_glMultiTexCoord2fARB, ptr_wglGetProcAddress("glMultiTexCoord2fARB"));
xCast(ptr_glGenBuffersARB, ptr_wglGetProcAddress("glGenBuffersARB"));
xCast(ptr_glBindBufferARB, ptr_wglGetProcAddress("glBindBufferARB"));
xCast(ptr_glBufferDataARB, ptr_wglGetProcAddress("glBufferDataARB"));
struct PointInfo
{
float Pos[3];
float Colour[3];
};
const int NumVerts = 3, NumInds = 3;
std::vector<PointInfo> Vertices;
Vertices.push_back({{0.0f, 1.0f, 0.0f}, {1, 1, 1}}); ///top left;
Vertices.push_back({{0.5f, 0.0f, 0.0f}, {1, 1, 1}}); ///bottom middle;
Vertices.push_back({{1.0f, 1.0f, 0.0f}, {1, 1, 1}}); ///top right;
std::vector<std::uint32_t> Indices = {0, 1, 2};
std::uint32_t VBO = 0, IBO = 0;
ptr_glGenBuffersARB(1, &VBO);
ptr_glGenBuffersARB(1, &IBO);
///Put Vertices In.
ptr_glBindBufferARB(GL_ARRAY_BUFFER, VBO);
ptr_glBufferDataARB(GL_ARRAY_BUFFER, sizeof(PointInfo) * NumVerts, &Vertices[0], GL_STATIC_DRAW);
Log(glGetError());
///Put Indices In.
ptr_glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER, IBO);
ptr_glBufferDataARB(GL_ELEMENT_ARRAY_BUFFER, sizeof(int) * NumInds, &Indices[0], GL_STATIC_DRAW);
Log(glGetError());
I run the above only once at the start of my program. Then in my while loop, I run:
glPushMatrix();
glClearColor(0.0f, 1.0f, 1.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
Log(glGetError());
ptr_glBindBufferARB(GL_ARRAY_BUFFER, VBO);
Log(glGetError());
glVertexPointer(3, GL_FLOAT, sizeof(PointInfo), (void*) offsetof(PointInfo, Pos));
Log(glGetError());
glColorPointer(3, GL_FLOAT, sizeof(PointInfo), (void*) offsetof(PointInfo, Colour));
Log(glGetError());
ptr_glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER, IBO);
Log(glGetError());
glDrawElements(GL_TRIANGLES, NumInds, GL_UNSIGNED_INT, 0);
Log(glGetError());
glPopMatrix();
SwapBuffers(DC);
Sleep(1);
But the only thing that happens is the screen clearing; I never see my triangle at all. I think it might be my view setup via glOrtho, but I'm not sure. Is there anything wrong with what I did? glGetError() just prints 0, so no errors.
The triangle coordinates you specified are very small. The triangle occupies only half of a pixel at the top left corner of the screen. Try scaling it by 100.
Also, I think you're missing calls to glEnableClientState with GL_VERTEX_ARRAY and GL_COLOR_ARRAY; see the sketch below.
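As a rough sketch only (reusing the VBO, IBO, PointInfo and NumInds names from the question), the per-frame code with those client states enabled could look like the following; the vertex positions themselves would also need to be scaled up to pixel-sized coordinates, as suggested above, to be visible under that glOrtho projection:

/* enable the client-side arrays before setting the pointers and drawing */
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);

ptr_glBindBufferARB(GL_ARRAY_BUFFER, VBO);
glVertexPointer(3, GL_FLOAT, sizeof(PointInfo), (void*) offsetof(PointInfo, Pos));
glColorPointer(3, GL_FLOAT, sizeof(PointInfo), (void*) offsetof(PointInfo, Colour));

ptr_glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER, IBO);
glDrawElements(GL_TRIANGLES, NumInds, GL_UNSIGNED_INT, 0);

glDisableClientState(GL_COLOR_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);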
As a general approach, I would suggest taking things one step at a time. Start with immediate-mode glVertex calls to make sure you have the coordinates and camera setup right. Then add shaders. Then convert to a position-only VBO with DrawArrays. Then add vertex colors. Then convert to DrawElements. That way you have a better sense of where problems might lie.
You might also be interested in the glload library, to get rid of these ptr_ prefixes.
You should use glVertexAttribPointer. The functions you are using are deprecated. Perhaps you could get this code to work, but if you aren't forced to use such an ancient OpenGL, chances are you'd save yourself a lot of trouble.
Also, manually loading function pointers is extremely cumbersome; I suggest you look at libraries such as GLload.
A specialized debugger such as CodeXL or gDebugger can be very helpful in solving issues like that.
As for the problems in this code, your triangle is simply too small.

Messed Up OpenGL Depth Buffer?

I have something rather strange going on at the moment with my code. I am running this on a BlackBerry PlayBook, and it is OpenGL ES 1.1.
EDIT 4: I deleted everything I had posted, to simplify my question.
I took the code and simplified it to drawing two overlapping triangles. Here is the array containing the coordinates as well as an array containing colours:
GLfloat vertices[] =
{
// front
175.0f, 200.0f, -24.0f,
225.0f, 200.0f, -24.0f,
225.0f, 250.0f, -24.0f,
// back
200.0f, 200.0f, -25.0f,
250.0f, 200.0f, -25.0f,
250.0f, 250.0f, -25.0f
};
static const GLfloat colors[] =
{
/* front */ 1.0f,0.0f,0.0f,1.0f,1.0f,0.0f,0.0f,1.0f,1.0f,0.0f,0.0f,1.0f, //Red
/* back */ 0.0f,1.0f,0.0f,1.0f,0.0f,1.0f,0.0f,1.0f,0.0f,1.0f,0.0f,1.0f //Green
};
Please note that my coordinates are 0 to 1024 in the x direction and 0 to 600 in the y direction as well as 0 to -10000 in the z direction.
Here is my setup code which reflects this:
glClearDepthf(1.0f);
glClearColor(1.0f,1.0f,1.0f,1.0f);
glEnable(GL_DEPTH_TEST);
glDepthMask(GL_TRUE);
glShadeModel(GL_SMOOTH);
glViewport(0, 0, surface_width, surface_height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrthof(0, surface_width, 0, surface_height, 0, 10000);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glEnable(GL_DEPTH_TEST);
glDepthMask(GL_TRUE);
I enable depth testing in two places because I was trying to rule out the possibility that it has to be enabled while a certain matrix mode is selected.
Lastly here is my render code:
void render()
{
//Typical render pass
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, vertices);
glEnableClientState(GL_COLOR_ARRAY);
glColorPointer(4, GL_FLOAT, 0, colors);
glDrawArrays(GL_TRIANGLES, 0 , 6);
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_COLOR_ARRAY);
//Use utility code to update the screen
bbutil_swap();
}
The issue is that no matter what I do, the green triangle is always drawn over the red one. Changing the z values either way has no effect on the finished image. I cannot figure this out.
By default, depth testing is disabled. You have to enable it with glEnable(GL_DEPTH_TEST). The reason why it is working when you enable culling is because the back facing triangles are not drawn, and since a cube is a convex polyhedron, no front-facing quad will ever overlap another front-facing quad. If you try to render a second cube, however, you will see depth problems as well, unless you enable depth testing.
I finally got it to work. The issue was with the provided EGL setup code that I used. In bbutil.c (in my case .cpp) there is some code:
if(!eglChooseConfig(egl_disp, attrib_list, &egl_conf, 1, &num_configs)) {
bbutil_terminate();
return EXIT_FAILURE;
}
(That is not all the code in the file, but it's the important bit.)
This basically bails out if the given attribute list is not supported. Higher up in the file, attrib_list is set as follows:
EGLint attrib_list[]= { EGL_RED_SIZE, 8,
EGL_GREEN_SIZE, 8,
EGL_BLUE_SIZE, 8,
EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
EGL_RENDERABLE_TYPE, 0,
EGL_NONE};
There is no depth buffer specified. If you look in the EGL spec, it says that no depth buffer is the default. Bingo, that's the problem. So I just modified it to look like this:
EGLint attrib_list[]= { EGL_RED_SIZE, 8,
EGL_GREEN_SIZE, 8,
EGL_BLUE_SIZE, 8,
EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
EGL_RENDERABLE_TYPE, 0,
EGL_DEPTH_SIZE, 24,
EGL_NONE};
Note the EGL_DEPTH_SIZE and the 24: this requests a 24-bit depth buffer. On the PlayBook, 32 throws a segmentation fault, although 32 bits is usually not supported anyway. Perhaps this will help someone out there trying to figure out why the provided utility code causes the odd result I described as my problem.
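As a quick sanity check, a sketch like this (using the egl_disp and egl_conf variables that bbutil sets up; requires <EGL/egl.h> and <stdio.h>) queries how many depth bits the chosen config actually provides:

EGLint depthBits = 0;
if (eglGetConfigAttrib(egl_disp, egl_conf, EGL_DEPTH_SIZE, &depthBits)) {
    fprintf(stderr, "depth buffer bits: %d\n", depthBits);  /* expect 24 after the fix */
} else {
    fprintf(stderr, "eglGetConfigAttrib failed: 0x%x\n", (unsigned)eglGetError());
}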