When I run my program with OpenGL 3.1 it works fine, but when I use OpenGL 3.2, glGenFramebuffers gives a segfault. I tried using glewExperimental = GL_TRUE, and this allows the program to run without an error, but the screen is completely black. I should also mention I'm using the Cg Toolkit, version 3.1. Any ideas what could be going wrong?
By definition, glGen* functions should not be able to crash - all they do is give you a free handle you can use afterwards.
Thus, this sounds like a problem with GLEW. Have you stepped into it to see that you're not calling a null pointer function?
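If in doubt, check the function pointer right after initializing GLEW; a minimal sketch, assuming a 3.2 core context is already current (names and error handling are illustrative):
#include <GL/glew.h>
#include <stdio.h>

int init_glew_and_check(void)
{
    glewExperimental = GL_TRUE;            /* needed for core contexts with older GLEW */
    GLenum err = glewInit();
    if (err != GLEW_OK) {
        fprintf(stderr, "glewInit failed: %s\n", glewGetErrorString(err));
        return 0;
    }
    if (glGenFramebuffers == NULL) {       /* a NULL entry point here is exactly what segfaults */
        fprintf(stderr, "glGenFramebuffers was not loaded\n");
        return 0;
    }
    GLuint fbo = 0;
    glGenFramebuffers(1, &fbo);            /* safe to call now */
    return 1;
}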
Whenever I call a function to swap buffers I get tons of errors from glDebugMessageCallback saying:
glVertex2f has been removed from OpenGL Core context (GL_INVALID_OPERATION)
I've tried it with both GLFW and freeglut, and neither works properly.
I haven't used glVertex2f, of course. I even went as far as deleting all my rendering code to see if I could find what was causing it, but the error is still there, right after glutSwapBuffers/glfwSwapBuffers.
Using single-buffering, on the other hand, causes no errors.
I've initialized the context to 4.3, core profile, and set the forward-compatibility flag.
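For reference, the setup I'm describing amounts to roughly this (a sketch, assuming GLFW 3; glfwInit and error handling omitted):
#include <GLFW/glfw3.h>

GLFWwindow *create_window(void)
{
    /* call glfwInit() before setting any hints */
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
    glfwWindowHint(GLFW_OPENGL_DEBUG_CONTEXT, GL_TRUE);   /* so glDebugMessageCallback fires */
    return glfwCreateWindow(640, 480, "debug", NULL, NULL);
}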
As explained in comments, the problem here is actually third-party software and not any code you yourself wrote.
When software such as the Steam overlay or FRAPS needs to draw something on top of your OpenGL output, it usually does so by hooking/injecting some code into your application's SwapBuffers call at run-time.
You are dealing with a piece of software (RivaTuner) that still uses immediate mode to draw its overlay and that is the source of the unexplained deprecated API calls on every buffer swap.
Do you have code you can share? Either the driver is buggy or something tries to call glVertex in your process. You could try using glloadgen to build a loader library that covers OpenGL 4.3 symbols only (and only those symbols), so that any use of symbols outside the 4.3 spec causes a linker error.
I'm trying to tessellate a simple triangle using the Golang OpenGL bindings.
The library doesn't claim support for tessellation shaders, but I looked through the source code, and adding the correct bindings didn't seem terribly tricky. So I branched it and tried adding the correct constants in gl_defs.go.
The bindings still compile just fine, and so does my program; it's when I actually try to use the new bindings that things go strange. The program goes from displaying a nicely circling triangle to a black screen whenever I try to include the tessellation shaders.
I'm following along with the OpenGL Superbible (6th edition) and using their shaders for this project, so I don't imagine I'm using broken shaders (they don't spit out an error log, anyway). But in case the shaders themselves could be at fault, they can be found in the setupProgram() function here.
I'm pretty sure my graphics card supports tessellation, because printing the OpenGL version returns 4.4.0 NVIDIA 331.38.
So my questions:
Is there any reason adding Go bindings for tessellation wouldn't work? The bindings seem quite straightforward.
Am I adding the new bindings incorrectly?
If it should work, why is it not working for me?
What am I doing wrong here?
Steps that might be worth taking:
Your driver and video card may support tessellation shaders, but the GL context that your binding returns might be for an earlier version of OpenGL. Try glGetString(GL_VERSION) and see what you get (see the sketch after this list).
Are you calling glGetError basically everywhere and actually checking its values? Does this binding provide error return values? If so, are you checking those?
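If it helps, a minimal sketch of both checks in C (the Go binding should expose equivalents of the same calls):
#include <GL/glew.h>
#include <stdio.h>

void report_gl_state(void)
{
    /* What version did context creation actually give us? */
    printf("GL_VERSION: %s\n", (const char *)glGetString(GL_VERSION));

    /* glGetError returns one error at a time, so drain the queue. */
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR)
        fprintf(stderr, "GL error: 0x%04x\n", err);
}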
So I have just realized that the code I was working on for 3d textures was for OpenGL 1.1 or something and is no longer supported in OpenGL 3.3. Is there another way to do this without glTexture3D? Perhaps through a library or another function in OpenGL 3.3 that I do not know about?
EDIT:
I am not sure where I read that 3d texturing was taken out of OpenGL in newer versions (been googling a lot today), but consider this:
I have been following the tutorial/guide here. The program compiles without a hitch. Now read the following quote from the article:
The potential exists that the environment the program is being run on does not support 3D texturing, which would cause us to get a NULL address back, and attempting to use a NULL pointer is A Bad Thing so make sure to check for it and respond appropriately (the provided example exits with an error).
That quote is referring to the following function:
glTexImage3D = (PFNGLTEXIMAGE3DPROC) wglGetProcAddress("glTexImage3D");
When running my program on my computer (which has OpenGL 3.3) that same function returns null for me. When my friend runs it on his computer (which has OpenGL 1.2) it does not return null.
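The check the article is referring to would look roughly like this (a sketch; Windows/WGL assumed, and a current GL context is required before calling wglGetProcAddress):
#include <windows.h>
#include <GL/gl.h>
#include <GL/glext.h>
#include <stdio.h>
#include <stdlib.h>

PFNGLTEXIMAGE3DPROC glTexImage3D_ptr;

void load_tex_image_3d(void)
{
    glTexImage3D_ptr = (PFNGLTEXIMAGE3DPROC) wglGetProcAddress("glTexImage3D");
    if (glTexImage3D_ptr == NULL) {
        /* NULL means the entry point is not available in this context */
        fprintf(stderr, "glTexImage3D not available in this context\n");
        exit(EXIT_FAILURE);
    }
}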
The way one uploads 3D textures has not changed since OpenGL-1.2. The functions for this are still named (see the sketch after the list):
glTexImage3D
glTexSubImage3D
glCopyTexSubImage3D
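A minimal upload on a 3.3 core context still looks like this (a sketch; a current context and entry points loaded via GLEW or similar are assumed):
#include <GL/glew.h>

GLuint make_volume_texture(const unsigned char *voxels, int w, int h, int d)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_3D, tex);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);
    /* w, h, d are the volume dimensions; voxels holds w*h*d RGBA bytes */
    glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA8, w, h, d, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, voxels);
    return tex;
}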
Recently my game-engine-in-progress has started throwing OpenGL errors in places where they shouldn't be possible. After rendering a few frames, I suddenly start getting errors from glColor:
print(gl.GetError()) --> nil
gl.Color(1, 1, 1, 1)
print(gl.GetError()) --> INVALID_OPERATION
If I don't call glColor here, I later get an invalid operation error from glMatrixMode.
According to the GL manual, glColor should never raise an error, and glMatrixMode only if it's called between glBegin and glEnd, which I've checked is not the case. Are there any other reasons these functions can raise an error that I'm not aware of? Maybe related to the render-to-texture/renderbuffer extensions? I've been debugging like mad and can't find anything that should cause such failures. The whole program is a bit too large and complex to post here. It's using luagl, which is just a thin wrapper around the OpenGL API, and SDL. The reported version is: 2.1 Mesa 7.10.2
glColor will result in an error if there is no active OpenGL context. If you are using multiple contexts or glBindFramebuffer, check that you always switch to ones that are valid. Also remember that using OpenGL calls from multiple threads requires special attention.
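To narrow down which call actually raises the error, drain glGetError right after each suspect call; a C sketch of the idea (luagl's gl.GetError can be used the same way):
#include <GL/gl.h>
#include <stdio.h>

/* Drain the GL error queue so errors get attributed to the right call
   instead of showing up much later. */
#define CHECK_GL(where)                                               \
    do {                                                              \
        GLenum e;                                                     \
        while ((e = glGetError()) != GL_NO_ERROR)                     \
            fprintf(stderr, "GL error 0x%04x at %s\n", e, (where));   \
    } while (0)

/* usage:
   glColor4f(1, 1, 1, 1);
   CHECK_GL("glColor4f");
*/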
https://bugs.freedesktop.org/show_bug.cgi?id=48535
Looks like this was actually a driver bug. >.>
I have a program that does some GPU computing with optional OpenGL rendering.
The usage flow is as follows:
Init function (initializing GLEW being the most relevant part).
Load mesh from file to GPU (using glGenBuffers and related functions to make a VBO; see the sketch after this list).
Process this mesh in parallel (GPU computing API).
Save mesh to file.
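The upload in step 2 boils down to something like this (a sketch of the idea, not my exact code; it assumes a current context and initialized GLEW):
#include <GL/glew.h>
#include <stddef.h>

GLuint upload_mesh(const float *vertices, size_t bytes)
{
    GLuint vbo = 0;
    glGenBuffers(1, &vbo);                 /* a NULL pointer here if GLEW/context is missing */
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, bytes, vertices, GL_STATIC_DRAW);
    return vbo;
}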
My problem is that the mesh loading uses OpenGL calls, and without a context created I just get a segmentation fault.
Edit: Evolution of the problem:
I was missing GL/glx.h; I thought that GL/glxew.h included it. Thanks to the answers, that got fixed.
I was missing glXMakeCurrent, and therefore there was no current context.
After these fixes, it works :).
Also, thanks for the tool suggestions; I would gladly use them, it is just that I needed the low-level code for this particular case.
I tried making a context with this code (I am using GLEW, so I changed the header to GL/glxew.h, but the rest of the code remains the same).
Don't do it. glxew.h is for loading GLX extension functions. You probably don't need it.
If you want to use GLEW, replace GL/gl.h with GL/glew.h and leave GL/glx.h as it is.
X11 and GLX are quite complex; consider using SDL or GLFW instead.
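In other words, something like this (a sketch; GL/glew.h has to come before any header that drags in GL/gl.h):
#include <GL/glew.h>   /* instead of GL/gl.h */
#include <GL/glx.h>    /* unchanged */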
Just wildly guessing here, but could it be that GLEW redefined glXChooseFBConfig with something custom? Something in the call of glXChooseFBConfig dereferences an invalid pointer. So either glXChooseFBConfig itself is invalid, fbcount is too small, or visual_attribs is not properly terminated.
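For comparison, a call with a properly terminated attribute list and a checked result looks roughly like this (a sketch; the attribute values are just examples):
#include <GL/glx.h>
#include <stdio.h>

GLXFBConfig pick_fbconfig(Display *dpy)
{
    static const int visual_attribs[] = {
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
        GLX_DOUBLEBUFFER,  True,
        GLX_RED_SIZE, 8, GLX_GREEN_SIZE, 8, GLX_BLUE_SIZE, 8,
        None                                /* the terminator is mandatory */
    };
    int fbcount = 0;
    GLXFBConfig *configs =
        glXChooseFBConfig(dpy, DefaultScreen(dpy), visual_attribs, &fbcount);
    if (configs == NULL || fbcount == 0) {
        fprintf(stderr, "no matching framebuffer configs\n");
        return NULL;                        /* GLXFBConfig is an opaque pointer type */
    }
    GLXFBConfig chosen = configs[0];
    XFree(configs);
    return chosen;
}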
GLEW has nothing to do with context creation. It is an OpenGL loading library; it loads OpenGL functions. It needs you to have an OpenGL context in order for it to function.
Since you're not really using this context for drawing stuff, I would suggest using an off-the-shelf tool for context creation. GLFW or FreeGLUT would be the most light-weight alternatives. Just use them to create a context, do what you need to do, then destroy the windows they create.
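A sketch of that pattern with GLFW (assuming GLFW 3 and GLEW; error handling kept minimal):
#include <GL/glew.h>
#include <GLFW/glfw3.h>

int with_gl_context(void (*work)(void))
{
    if (!glfwInit())
        return 0;
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);   /* no window shows up on screen */
    GLFWwindow *win = glfwCreateWindow(64, 64, "offscreen", NULL, NULL);
    if (!win) { glfwTerminate(); return 0; }
    glfwMakeContextCurrent(win);                /* a context is now current */
    if (glewInit() != GLEW_OK) { glfwDestroyWindow(win); glfwTerminate(); return 0; }
    work();                                     /* GL calls (glGenBuffers etc.) are legal here */
    glfwDestroyWindow(win);
    glfwTerminate();
    return 1;
}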