Lost OpenGL output - C++

A week ago I upgraded my OS to the latest version, so GLEW, the nVidia drivers, g++, Qt, etc. were all upgraded too. That was the day my QGLWidgets stopped showing raw OpenGL 3.3 content; only the 2D QPainter-based stuff still appears. Since no other OpenGL application on my machine (including my DE) was affected, I must have written some dodgy code in my application that was perhaps tolerated by the older versions of these libraries and no longer is.
I have a lot of heavily abstracted OpenGL code, so there are many potential places for a failure. After a few days of trying to get any sort of output from it (glGetError() was not returning errors before I started messing with things), I decided the best approach was to write the simplest OpenGL application possible and then slowly build it up until it broke.
But I can't even get a triangle to appear!
void Viewer::initializeGL()
{
    // Initialise GLEW, compile/link fragment and vertex shaders, error check, etc.
    ...

    // Create default VAO.
    glGenVertexArrays( 1, &vao_ );
    glBindVertexArray( vao_ );

    glUseProgram( shader_ );
    vVertex_ = glGetAttribLocation( shader_, "vVertex" );

    // Create the test triangle VBO.
    glGenBuffers( 1, &vbo_ );
    glBindBuffer( GL_ARRAY_BUFFER, vbo_ );
    glEnableVertexAttribArray( vVertex_ );
    glVertexAttribPointer( vVertex_, 3, GL_FLOAT, GL_FALSE, 0,
                           reinterpret_cast< GLvoid* >( 0 ) );

    // Upload the data to the GPU.
    static const float verts[9] = { 0.0f, 0.0f, -0.5f,
                                    1.0f, 0.0f, -0.5f,
                                    0.5f, 1.0f, -0.5f };
    glBufferData( GL_ARRAY_BUFFER, sizeof( verts ),
                  static_cast< const void* >( verts ), GL_STATIC_DRAW );

    glBindBuffer( GL_ARRAY_BUFFER, GL_NONE );
    glDisableVertexAttribArray( vVertex_ );

    Sy_GL::checkError();
}
void Viewer::paintGL()
{
    // Clear the buffers.
    qglClearColor( QColor( Qt::black ) );
    glClear( GL_COLOR_BUFFER_BIT );

    glPolygonMode( GL_FRONT_AND_BACK, GL_LINE );

    glBindBuffer( GL_ARRAY_BUFFER, vbo_ );
    glEnableVertexAttribArray( vVertex_ );
    glVertexAttribPointer( vVertex_, 3, GL_FLOAT, GL_FALSE, 0,
                           reinterpret_cast< GLvoid* >( 0 ) );

    glDrawArrays( GL_TRIANGLES, 0, 3 );

    glBindBuffer( GL_ARRAY_BUFFER, GL_NONE );
    glDisableVertexAttribArray( vVertex_ );

    Sy_GL::checkError();
}
I'm not using the VAO for its intended purpose, because VAOs cannot be shared across contexts - which is the scenario in my 'real' application - so I'm replicating that situation here. Sy_GL::checkError() just calls glGetError() and throws an exception if there's a problem.
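For reference, a minimal sketch of what such a helper might look like (the actual Sy_GL::checkError() is not shown here, so this is an assumption based on the description above):

#include <GL/glew.h>
#include <sstream>
#include <stdexcept>

namespace Sy_GL
{
    void checkError()
    {
        const GLenum err = glGetError();
        if( err != GL_NO_ERROR )
        {
            std::ostringstream msg;
            msg << "OpenGL error 0x" << std::hex << err;
            throw std::runtime_error( msg.str() ); // assumed behaviour: throw on any error
        }
    }
}

My two shaders could not be simpler: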
// Vertex shader.
#version 330

in vec3 vVertex;

void main( void )
{
    gl_Position = vec4( vVertex, 1.0 );
}

// Fragment shader (in a different file).
#version 330

out vec4 fragColour;

void main( void )
{
    fragColour = vec4( 1.0, 0.0, 0.0, 1.0 );
}
This should display a red wireframe triangle against a black background, but I just get the black background - no console output, no exceptions. My system really does support OpenGL 3.3 and higher; here is the top of my glxinfo output:
name of display: :0
display: :0 screen: 0
direct rendering: Yes
server glx vendor string: NVIDIA Corporation
server glx version string: 1.4
And my glewinfo output:
GLEW version 1.9.0
Reporting capabilities of display :0, visual 0x2b
Running on a GeForce GTX 560 Ti/PCIe/SSE2 from NVIDIA Corporation
OpenGL version 4.3.0 NVIDIA 310.32 is supported
So my question is: Is my code wrong? Or is my system damaged very subtly?
Edit
It appears that QGLFormat is reporting that I only have OpenGL v1.0 - what mechanism is Qt using to get that value?
In my 'real' application I perform an OpenGL version check using QGLFormat::openGLVersionFlags() & QGLFormat::OpenGL_Version_3_3, and this passes; but myQGLWidget->format().majorVersion() and myQGLWidget->format().minorVersion() return 1 and 0 respectively.
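Side by side, the two checks look like this (a sketch; myQGLWidget stands in for the widget in question):

// Static capability flags - this check passes:
if( QGLFormat::openGLVersionFlags() & QGLFormat::OpenGL_Version_3_3 )
{
    // OpenGL 3.3 is reported as available.
}

// Per-widget context format - this reports 1.0:
qDebug() << myQGLWidget->format().majorVersion()   // 1
         << myQGLWidget->format().minorVersion();  // 0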
Edit 2
Interestingly, if I set a default QGLFormat of v3.3 in my main.cpp:
QGLFormat format( QGL::DoubleBuffer |
                  QGL::DepthBuffer |
                  QGL::AlphaChannel );
format.setVersion( 3, 3 );
QGLFormat::setDefaultFormat( format );
It segfaults on the first of my OpenGL calls, specifically glGenVertexArrays( 1, &vao_ ), but myQGLWidget->format().majorVersion() and myQGLWidget->format().minorVersion() now return 3 and 3 respectively.

We recently had this at work. The cause was the latest NVIDIA drivers, which broke the major and minor version queries.
Edit: I think it may have been related to calling these functions before a valid GL context had been set up.
End edit
So you could try a slightly older driver. However, there is also an issue with some Qt versions and the GL version query. Check out this link:
http://qt-project.org/forums/viewthread/20424
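As a cross-check that is independent of Qt's reporting, you can ask the driver directly once a context is current - e.g. at the top of initializeGL() (a sketch reusing the asker's Viewer class):

#include <GL/glew.h>
#include <QtDebug>

void Viewer::initializeGL()
{
    // glGetString() queries the current context itself, bypassing
    // whatever version QGLFormat has cached.
    qDebug() << reinterpret_cast< const char* >( glGetString( GL_VERSION ) );
    // ... rest of the initialisation ...
}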

Related

How to specify vertex attributes in client-side vertex array?

I'm porting OpenGL 3.X code to OpenGL 2.1, where no VAOs are available. The code below makes my program crash inside the graphics driver, immediately at the glDrawElements call.
Part of the code is shared between the 3.X and 2.1 modes, and it works properly in OpenGL 3.X mode.
struct StrokeVertex
{
    glm::vec2 position;
    glm::vec2 tangent;
    float center_dist;
};

if (use_gl3)
    bind_vao();
bind_vbo();
bind_ibo();
send_data();
bind_program();
set_uniforms();

// setup client-side vertex array here
if (!use_gl3)
{
    glEnableClientState( GL_VERTEX_ARRAY );
    GLHELPER_CHECK;
    glEnableVertexAttribArray( 0 );
    GLHELPER_CHECK;
    glVertexAttribPointer( get_attribute_location( "tangent" ) /* got 1 here */, 2, GL_FLOAT, GL_FALSE,
                           sizeof( StrokeVertex ), (void*) offsetof( StrokeVertex, tangent ) );
    GLHELPER_CHECK;
    glEnableVertexAttribArray( 1 );
    GLHELPER_CHECK;
    glVertexAttribPointer( get_attribute_location( "center_dist" ) /* got 2 here */, 1, GL_FLOAT, GL_FALSE,
                           sizeof( StrokeVertex ), (void*) offsetof( StrokeVertex, center_dist ) );
    GLHELPER_CHECK;
    // I also tried calling this before setting the two attributes
    glVertexPointer( 2, GL_FLOAT, sizeof( StrokeVertex ), (void*) offsetof( StrokeVertex, position ) );
    GLHELPER_CHECK;
}

// immediately crashes here
glDrawElements( GL_TRIANGLES, GLsizei( n_elem ), GL_UNSIGNED_INT, nullptr );
GLHELPER_CHECK;

if (!use_gl3)
{
    glDisableClientState( GL_VERTEX_ARRAY );
}
else
{
    unbind_vao();
}
Here are the relevant parts of the vertex shaders; they are mostly the same except for the attribute declarations. OpenGL 3.X:
in vec2 logic_pos;
in vec2 tangent;
in float center_dist;
OpenGL 2.X:
// builtin gl_Vertex is used, so we omit logic_pos
attribute vec2 tangent;
attribute float center_dist;
There seems to be a mismatch between which vertex attributes get enabled and which ones the data gets bound to:
The following code tells OpenGL to enable vertex attribute 0, but the data gets bound to attribute 1 (at least according to the comment).
glEnableVertexAttribArray( 0 );
glVertexAttribPointer( get_attribute_location( "tangent" ) /* got 1 here */, 2, GL_FLOAT, GL_FALSE, sizeof( StrokeVertex ), (void*) offsetof( StrokeVertex, tangent ) );
In total, it looks as if your code enables attributes 0 and 1, but binds data to 1 and 2.
Having attributes enabled without binding any data to them can lead to exactly the crash you describe.
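One way to keep the two in sync is to use the queried locations everywhere - a sketch reusing the asker's get_attribute_location() helper:

const GLint tangent_loc = get_attribute_location( "tangent" );      // 1 here
const GLint center_loc  = get_attribute_location( "center_dist" );  // 2 here

glEnableVertexAttribArray( tangent_loc );
glVertexAttribPointer( tangent_loc, 2, GL_FLOAT, GL_FALSE,
                       sizeof( StrokeVertex ),
                       (void*) offsetof( StrokeVertex, tangent ) );

glEnableVertexAttribArray( center_loc );
glVertexAttribPointer( center_loc, 1, GL_FLOAT, GL_FALSE,
                       sizeof( StrokeVertex ),
                       (void*) offsetof( StrokeVertex, center_dist ) );

Alternatively, calling glBindAttribLocation() before linking the program pins each attribute to a known index, so hard-coded indices stay valid.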

Drawing letters using triangles in OpenGL?

I'm required to draw my name using triangles. I understand how to handle shaders; I am just confused about how to actually draw the objects and connect them to make a letter.
I've been given some code to work with:
#include "Angel.h"
const int NumPoints = 50000;
/*This function initializes an array of 3d vectors
and sends it to the graphics card along with shaders
properly connected to them.*/
void
init( void )
{
vec3 points[NumPoints];
// Specifiy the vertices for a triangle
vec3 vertices[] = {
vec3( -1.0, -1.0, 0.0 ),
vec3( 0.0, 1.0, 0.0 ),
vec3( 1.0, -1.0, 0.0 )
};
// Select an arbitrary initial point inside of the triangle
points[0] = vec3( 0.0, 1.0, 0.0 );
// compute and store NumPoints - 1 new points
for ( int i = 1; i < NumPoints; ++i ) {
int j = rand() % 3; // pick a vertex from the triangle at random
// Compute the point halfway between the selected vertex
// and the previous point
points[i] = ( points[i - 1] + vertices[j] ) / 2.0;
}
// Create a vertex array object
GLuint vao; //just an integer recognized by graphics card
glGenVertexArrays( 1, &vao ); //generate 1 buffer
glBindVertexArray( vao ); //become array buffer
// Create and initialize a buffer object //sends it to graphics card
GLuint buffer;
glGenBuffers( 1, &buffer );
glBindBuffer( GL_ARRAY_BUFFER, buffer );
glBufferData( GL_ARRAY_BUFFER, sizeof(points), points, GL_STATIC_DRAW ); //size of array glstatic draw means this point isnt gonna change its static
// Load shaders and use the resulting shader program
GLuint program = InitShader("simpleShader - Copy.vert", "simpleShader - Copy.frag");
// make these shaders the current shaders
glUseProgram( program );
// Initialize the vertex position attribute from the vertex shader
GLuint loc = glGetAttribLocation( program, "vPosition" ); //find the location in the code
glEnableVertexAttribArray( loc );
glVertexAttribPointer( loc, 3, GL_FLOAT, GL_FALSE, 0, BUFFER_OFFSET(0) );
glClearColor( 0.5, 0.5, 0.5, 1.0 ); // gray background
}
//----------------------------------------------------------------------------

/* This function handles the display and it is automatically called by GLUT
   once it is declared as the display function. The application should not
   call it directly. */
void
display( void )
{
    glClear( GL_COLOR_BUFFER_BIT );          // clear the window
    glDrawArrays( GL_POINTS, 0, NumPoints ); // draw the points
    glFlush();                               // flush the buffer
}
//----------------------------------------------------------------------------

/* This function handles the keyboard and it is called by GLUT once it is
   declared as the keyboard function. The application should not call it
   directly. */
void
keyboard( unsigned char key, int x, int y )
{
    switch ( key ) {
    case 033:                 // escape key
        exit( EXIT_SUCCESS ); // terminates the program
        break;
    }
}
//----------------------------------------------------------------------------

/* This is the main function that calls all the functions to initialize
   and set up the OpenGL environment through GLUT and GLEW. */
int
main( int argc, char **argv )
{
    // Initialize GLUT
    glutInit( &argc, argv );
    // Initialize the display mode to a buffer with Red, Green, Blue and Alpha channels
    glutInitDisplayMode( GLUT_RGBA );
    // Set the window size
    glutInitWindowSize( 512, 512 );
    // Here you set the OpenGL version
    glutInitContextVersion( 3, 2 );
    // Use only one of the next two lines
    //glutInitContextProfile( GLUT_CORE_PROFILE );
    glutInitContextProfile( GLUT_COMPATIBILITY_PROFILE );
    glutCreateWindow( "Simple GLSL example" );

    // Uncomment if you are using GLEW
    glewInit();

    // initialize the array and send it to the graphics card
    init();

    // provide the function that handles the display
    glutDisplayFunc( display );
    // provide the function that handles the keyboard
    glutKeyboardFunc( keyboard );

    glutMainLoop();
    return 0;
}
It looks like your problem is intended to familiarize you with setting up and using a vertex buffer. If so, it doesn't matter how you come up with the triangles -- the point is to understand how the setup works.
So, the first thing you need to do is to read your textbook on this topic. If you don't have a text, you will need to look up the graphics calls in an OpenGL reference.
If you run the program as-is, it should draw a bunch of randomly chosen, disconnected points (in the form of a fractal, but still...). This is because the actual draw call, glDrawArrays(), is invoked with GL_POINTS as its first argument, which tells it to draw points.
If you read the documentation for glDrawArrays(), it should list other values for this argument, some of which draw triangles in various ways. The most straightforward of these is GL_TRIANGLES, but I recommend you look up all of them to give you an idea what your options are.
Generating the triangles is up to you. If your name is short, generating them by hand should be fairly easy (a minimal sketch follows the list below). Note that you should entirely replace the random-point-generating code; options include:
with inline data
with some code to load the coordinates from a file
with something more clever so you don't have to hand-generate them
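For example, a minimal hand-made letter (this particular data is mine, not part of the assignment): the vertical stem of a letter "I" as two triangles, drawn with GL_TRIANGLES instead of GL_POINTS:

// Two triangles forming a tall rectangle - the stem of an "I".
vec3 letter[] = {
    vec3( -0.1, -0.8, 0.0 ), vec3( 0.1, -0.8, 0.0 ), vec3( 0.1, 0.8, 0.0 ),
    vec3( -0.1, -0.8, 0.0 ), vec3( 0.1,  0.8, 0.0 ), vec3( -0.1, 0.8, 0.0 )
};
// Upload "letter" instead of "points" in init():
//   glBufferData( GL_ARRAY_BUFFER, sizeof(letter), letter, GL_STATIC_DRAW );
// And in display():
//   glDrawArrays( GL_TRIANGLES, 0, 6 );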

OpenGL drawing meshes incorrectly

I'm attempting to make an OpenGL engine in C++, but cannot render meshes correctly. When rendered, meshes have faces that connect two random points on the mesh, or a random point on the mesh with the origin (0,0,0).
The problem can be seen here (I made it a wireframe to see the problem more clearly):
[screenshot omitted]
Code:
// Render all meshes (Graphics.cpp)
for( int curMesh = 0; curMesh < numMesh; curMesh++ ) {
// Save pointer of buffer
meshes[curMesh]->updatebuf();
Buffer buffer = meshes[curMesh]->buffer;
// Update model matrix
glm::mat4 mvp = Proj*View*(meshes[curMesh]->model);
// Initialize vertex array
glBindBuffer( GL_ARRAY_BUFFER, vertbuffer );
glBufferData( GL_ARRAY_BUFFER, sizeof(GLfloat)*buffer.numcoords*3, meshes[curMesh]->verts, GL_STATIC_DRAW );
// Pass information to shader
GLuint posID = glGetAttribLocation( shader, "s_vPosition" );
glVertexAttribPointer( posID, 3, GL_FLOAT, GL_FALSE, 0, (void*)0 );
glEnableVertexAttribArray( posID );
// Check if texture applicable
if( meshes[curMesh]->texID != NULL && meshes[curMesh]->uvs != NULL ) {
// Initialize uv array
glBindBuffer( GL_ARRAY_BUFFER, uvbuffer );
glBufferData( GL_ARRAY_BUFFER, sizeof(GLfloat)*buffer.numcoords*2, meshes[curMesh]->uvs, GL_STATIC_DRAW );
// Pass information to shader
GLuint uvID = glGetAttribLocation( shader, "s_vUV" );
glVertexAttribPointer( uvID, 2, GL_FLOAT, GL_FALSE, 0, (void*)(0) );
glEnableVertexAttribArray( uvID );
// Set mesh texture
glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_2D, meshes[curMesh]->texID );
GLuint texID = glGetUniformLocation( shader, "Sampler" );
glUniform1i( texID, 0 );
}
// Actiavte shader
glUseProgram( shader );
// Set MVP matrix
GLuint mvpID = glGetUniformLocation( shader, "MVP" );
glUniformMatrix4fv( mvpID, 1, GL_FALSE, &mvp[0][0] );
// Draw verticies on screen
bool wireframe = true;
if( wireframe )
for(int i = 0; i < buffer.numcoords; i += 3)
glDrawArrays(GL_LINE_LOOP, i, 3);
else
glDrawArrays( GL_TRIANGLES, 0, buffer.numcoords );
}
// Mesh Class (Graphics.h)
class mesh {
public:
    mesh();
    void updatebuf();

    Buffer buffer;
    GLuint texID;
    bool updated;
    GLfloat* verts;
    GLfloat* uvs;
    glm::mat4 model;
};
My Obj loading code is here: https://www.dropbox.com/s/tdcpg4vok11lf9d/ObjReader.txt (It's pretty crude and isn't organized, but should still work)
This looks like a primitive restart issue to me. It is hard to tell what exactly the problem is without seeing more code. It would help a lot to see roughly the 20 lines above and below, and including, the drawing calls that render the teapot - i.e. the 20 lines before the corresponding glDrawArrays, glDrawElements or glBegin call and the 20 lines after.
Subtract 1 from the indices before using them: .obj files use 1-based indices, and you will almost certainly need 0-based indices.
This happens because your triangles are not connected, so the wireframe does not look clean. If the triangles are not connected, you should construct an index buffer.
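A sketch of the index fix-up (objIndices is a hypothetical name for the 1-based indices parsed from the .obj file):

#include <vector>

std::vector< unsigned int > indices;
indices.reserve( objIndices.size() );
for( unsigned int i : objIndices )
    indices.push_back( i - 1 ); // .obj counts from 1, OpenGL from 0

glBufferData( GL_ELEMENT_ARRAY_BUFFER,
              indices.size() * sizeof( unsigned int ),
              indices.data(), GL_STATIC_DRAW );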

Weird GL_INVALID_OPERATION error in OpenGL

glClearColor( 1.0f, 1.0f, 1.0f, 1.0f );

AttachVertexShader( shader, "szescian_vs.glsl" );
AttachFragmentShader( shader, "szescian_fs.glsl" );
LinkProgram( shader );

glBindVertexArray( vertexVAO );

glGenBuffers( 1, &positionBuffer );
glGenBuffers( 1, &positionBuffer );
glBindBuffer( GL_ARRAY_BUFFER, positionBuffer );
glBufferData( GL_ARRAY_BUFFER, sizeof( position ), position, GL_STATIC_DRAW );

positionLoc = glGetAttribLocation( shader, "inPosition" );
glEnableVertexAttribArray( positionLoc );
glVertexAttribPointer( positionLoc, 3, GL_FLOAT, GL_FALSE, 0, ( void* ) 0 ); // here gDEBugger breaks on an OpenGL error
This is part of my init function, and I really don't know why gDEBugger breaks on it - can anybody explain it to me?
Break Reason: OpenGL Error
Breaked-on: glVertexAttribPointer( 0, 3, GL_FLOAT, FALSE, 0, 0x00000000 )
Error-Code: GL_INVALID_OPERATION
Error-Description: The specified operation is not allowed in the current
state. The offending function is ignored, having no side effect other
than to set the error flag.
* Stopped before function execution
This is the break information.
The possible GL_INVALID_OPERATION errors generated by glVertexAttribPointer():
GL_INVALID_OPERATION is generated if size is GL_BGRA and type is not
GL_UNSIGNED_BYTE, GL_INT_2_10_10_10_REV or GL_UNSIGNED_INT_2_10_10_10_REV.
GL_INVALID_OPERATION is generated if type is GL_INT_2_10_10_10_REV
or GL_UNSIGNED_INT_2_10_10_10_REV and size is not 4 or GL_BGRA.
GL_INVALID_OPERATION is generated if type is
GL_UNSIGNED_INT_10F_11F_11F_REV and size is not 3.
GL_INVALID_OPERATION is generated by glVertexAttribPointer if size
is GL_BGRA and normalized is GL_FALSE.
GL_INVALID_OPERATION is generated if zero is bound to the
GL_ARRAY_BUFFER buffer object binding point and the pointer argument
is not NULL.
http://www.opengl.org/sdk/docs/man/xhtml/glVertexAttribPointer.xml
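For illustration, the last case is easy to hit in a core profile - a sketch (loc is hypothetical, not from the question's code):

glBindBuffer( GL_ARRAY_BUFFER, 0 );                 // nothing bound to GL_ARRAY_BUFFER
glVertexAttribPointer( loc, 3, GL_FLOAT, GL_FALSE, 0,
                       (void*) 16 );                // non-NULL pointer
// -> GL_INVALID_OPERATION: client-side vertex arrays are not allowed
//    here, so a buffer object must be bound.

Note that in your break information the pointer is 0x00000000 (NULL), so this particular case does not apply; check the other conditions against the state the call actually sees.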

combining glVertexPointer and glVertexAttribPointer gives problems

I'm having an issue when first rendering a vertex buffer with a program, and then rendering a different vertex buffer without a program.
For the first buffer, when a program is enabled, I use code similar to:
glBindBuffer( GL_ARRAY_BUFFER, m_id );
GLint location = glGetAttribLocation( pID, "position" );
glEnableVertexAttribArray( location );
glVertexAttribPointer( location, 3, GL_FLOAT, GL_FALSE, 3 * sizeof( GLfloat ), 0 );
glDrawArrays( m_mode, 0, m_numVertices );
For the second, without a program:
glBindBuffer( GL_ARRAY_BUFFER, m_id );
glEnableClientState( GL_VERTEX_ARRAY );
glVertexPointer( 3, GL_FLOAT, 3 * sizeof( GLfloat ), 0 );
glDrawArrays( m_mode, 0, m_numVertices );
Both codepaths work fine individually, but when run in the order "with program" -> "without program", the second seems to use the buffer of the first; in the order "without program" -> "with program", the first is not drawn (in the second iteration).
This suggests to me that I'm missing some state change done by the glEnableVertexAttribArray block, but I don't understand which state change is causing the problems.
PS: the reason I'm rendering both with and without a program is that the scene graph lib I'm using lets you turn programs on or off per node.
Try adding
glDisableVertexAttribArray( location ); // location of "position"
before switching to fixed-function rendering.
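A sketch of the fuller switch between the two paths (the glUseProgram( 0 ) call is an assumption about how the scene graph disables the program):

// After the programmable-pipeline draw:
glDisableVertexAttribArray( location ); // location of "position"
glUseProgram( 0 );                      // back to fixed function

// Fixed-function draw as before:
glBindBuffer( GL_ARRAY_BUFFER, m_id );  // the second buffer's id
glEnableClientState( GL_VERTEX_ARRAY );
glVertexPointer( 3, GL_FLOAT, 3 * sizeof( GLfloat ), 0 );
glDrawArrays( m_mode, 0, m_numVertices );
glDisableClientState( GL_VERTEX_ARRAY );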