GL_INVALID_OPERATION in glGetIntegerv() with GLAD - C++

I use GLAD (config) to load OpenGL functions and GLFW 3.3.8 to create the context. Each time I start my program, it pops an ERROR 1282 in glGetIntegerv from GLAD's debug post-callback function (as far as I know, this callback is invoked after each gl* call and prints an error if one occurred). I figured that this happens after returning from main().
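For reference, a minimal sketch of how such a post-callback can be installed (assuming glad 1.x generated with the debug option, which exposes glad_set_post_callback; the default hook does essentially the same thing as this):

// Invoked by glad's debug wrappers after every GL call.
void post_gl_callback(const char * name, void * funcptr, int len_args, ...)
{
    // glad_glGetError is the raw loaded pointer, so this call does not
    // re-enter the debug wrapper (and thus this hook) recursively.
    GLenum err = glad_glGetError();
    if(err != GL_NO_ERROR)
        std::cerr << "ERROR " << err << " in " << name << '\n';
}
// After gladLoadGLLoader() succeeds:
// glad_set_post_callback(post_gl_callback);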
Here's the code (it loads OpenGL 3.3 and shows a red window until it is closed; pretty simple, I think):
#include <iostream>
#include <glad/glad.h>
#include <GLFW/glfw3.h>

int main()
{
    if(glfwInit() != GLFW_TRUE)
        throw std::runtime_error{"Unable to initialize GLFW."};
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    GLFWwindow * w{glfwCreateWindow(100, 100, "title", nullptr, nullptr)};
    if(w == nullptr)
        throw std::runtime_error{"Unable to create window."};
    glfwMakeContextCurrent(w);
    if(not gladLoadGLLoader(GLADloadproc(glfwGetProcAddress)))
        throw std::runtime_error{"Unable to load OpenGL functions."};
    glViewport(0, 0, 100, 100);
    while(not glfwWindowShouldClose(w))
    {
        glfwPollEvents();
        glClearColor(1.f, 0.f, 0.f, 1.f);
        glClear(GL_COLOR_BUFFER_BIT);
        glfwSwapBuffers(w);
    }
    glfwMakeContextCurrent(nullptr);
    glfwDestroyWindow(w);
    glfwTerminate();
    std::cout << "Hey!" << std::endl;
    return 0;
}
The output is:
Hey!
ERROR 1282 in glGetIntegerv
From this callstack:
#0 0x00416f91 in _post_call_callback_default_gl (name=0x446d40 <_glfwDataFormat+10036> "glGetIntegerv", funcptr=0x41c1ec <glad_debug_impl_glGetIntegerv#8>, len_args=2) at <glad.c>:45
#1 0x0041c265 in glad_debug_impl_glGetIntegerv#8 (arg0=33309, arg1=0x4526cc <num_exts_i>) at <glad.c>:1385
#2 0x00417168 in get_exts () at <glad.c>:220
#3 0x0042691f in find_extensionsGL () at <glad.c>:3742
#4 0x00426d12 in gladLoadGLLoader (load=0x402a2e <glfwGetProcAddress>) at <glad.c>:3821
#5 0x004016f8 in main () at <main.cpp>:33
Error 1282 is GL_INVALID_OPERATION, but it pops up after the program has ended (or at least after main() has returned). Even if I move the whole code into another function (i.e. create and destroy everything in a separate function) and then invoke it from main(), the error still appears after the return 0; from main().
This did not happen when I used GLEW to load the OpenGL functions, but maybe it was silenced there. I haven't found anything similar to my problem on the internet. What am I doing wrong? Do I have to unload OpenGL or something like that?
UPD: The error message actually pops up in gladLoadGLLoader(), not after the end of main().

Related

gdb Cannot find bounds of current function

I am developing an OpenGL program using Mingw32 on Windows 10 (64-bit).
The program runs without problems, but when I debug it using gdb, it shows:
(gdb) n
0x6a7706f8 in ?? ()
from C:\Windows\System32\DriverStore\FileRepository\c0310483.inf_amd64_ab6d2afa5c543409\atioglxx.dll
(gdb) n
Cannot find bounds of current function
(gdb)
Here is the code I want to debug
int main() {
    GLFWwindow * window = initGLContext();
    initImGui(window);
    int points[8] = { 0 };
    GLuint VAO, VBO;
    glGenVertexArrays(1, &VAO); // I set breakpoint here
    glGenBuffers(1, &VBO);
    GLShader curveShader("", "", "");
    while (!glfwWindowShouldClose(window)) {
        glfwPollEvents();
        useGUI(points);
        render();
        glfwSwapBuffers(window);
    }
    ImGui_ImplGlfwGL3_Shutdown();
    ImGui::DestroyContext();
    glfwTerminate();
    return 0;
}
Please let me know if more info is needed.
Thanks in advance.
Edited:
It turns out that my program lacks the debug information for glGenVertexArrays(), which is provided by atioglxx.dll, so I've decided to use printf() instead.
But when I debug my program using gdb, it shows:
(gdb) n
0x6a7706f8 in ?? () from C:\Windows\System32\DriverStore...
This is happening because you are stopped inside atioglxx.dll, which has no debugging info (or even a symbol table).
When debugging, you need to be aware of your current context (e.g. which function am I stopped in).
When you are in your own code, and assuming you compiled it with debug info, you can do next, step, info locals, etc. But when you are in somebody else's code (e.g. in a system-provided DLL), these commands will not work (and are not expected to).
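If you do stop there by accident, standard gdb commands will get you back to your own frames; a sketch (finish usually works even without symbols, and the frame number comes from backtrace):
(gdb) finish
(gdb) up
(gdb) frame 2
Once a frame with debug info is selected, next, step, and info locals behave normally again.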

Xcode 8.1 GLFWWindow "first responder" Issue

I have recently been working with OpenGL and have decided to use C++ for my latest OpenGL project. I am using Xcode 8.1 with my library paths and header paths linked correctly. Everything compiles fine, but I get this error at runtime:
2016-11-03 15:17:24.649264 Modulo[25303:14858638] [General] ERROR: Setting <GLFWContentView: 0x100343da0> as the first responder for window <GLFWWindow: 0x100222540>, but it is in a different window ((null))! This would eventually crash when the view is freed. The first responder will be set to nil.(
0 AppKit 0x00007fff85c069a4 -[NSWindow _validateFirstResponder:] + 566
1 AppKit 0x00007fff853f79eb -[NSWindow _setFirstResponder:] + 31
2 AppKit 0x00007fff8549f66a -[NSWindow _realMakeFirstResponder:] + 406
3 AppKit 0x00007fff8549f480 -[NSWindow makeFirstResponder:] + 123
4 libglfw3.3.dylib 0x000000010011194a _glfwPlatformCreateWindow + 610
5 libglfw3.3.dylib 0x000000010010d533 glfwCreateWindow + 428
6 Modulo 0x00000001000010a8 main + 296
7 libdyld.dylib 0x00007fff9c828255 start + 1
8 ??? 0x0000000000000001 0x0 + 1)
The code I run to generate this error is as follows:
#include <iostream>
#define GLEW_STATIC
#include <GL/glew.h>
#include <GLFW/glfw3.h>

int main(int argc, const char * argv[]) {
    //Engine Startup.
    std::cout << "<----- Engine Start-Up ----->" << std::endl;
    //Initialize GLFW.
    if(!glfwInit()) {
        std::cout << "- GLFW Failed to Initialize!" << std::endl;
        return -1;
    }
    std::cout << "+ GLFW Initialized!" << std::endl;
    //Create GLFWWindow
    GLFWwindow* window = glfwCreateWindow(640, 480, "Engine", nullptr, nullptr);
    if(!window) {
        std::cout << "- GLFWWindow Failed to Create!" << std::endl;
        glfwTerminate();
        return -1;
    }
    std::cout << "+ GLFWWindow Created!" << std::endl;
    return 0;
}
The program performs as it should, but this error could become an issue later and also makes the console hard to debug, so I would like to sort it out early!
Thank you in advance, and if any more information is needed please let me know! :)
I'm a beginner and I also faced this issue. I got the error but still succeeded in creating the window. How about adding:
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
before glfwCreateWindow?
Note the discussion at GLFW first responder error, which indicates this is a known bug in macOS Sierra; it has been addressed in the GLFW git repository but is not yet in a release.

Why is OpenGL version 0.0?

I was troubleshooting an OpenGL application on a new computer when I discovered that GLFW could not create a window with the specified version of OpenGL. I created a minimal version of the application to test the version of OpenGL created, and no matter what version I hint, the version I get is 0.0. Do I simply not have OpenGL? This seems impossible, since glxgears runs and glxinfo suggests that I have version 2.1.
#include <iostream>
#include <GLFW/glfw3.h>

int main(int argc, const char *argv[]) {
    if(!glfwInit()) {
        return 1;
    }
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 2);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 1);
    auto win = glfwCreateWindow(640, 480, "", NULL, NULL);
    if(!win) {
        return 1;
    }
    int major = 0, minor = 0;
    glfwMakeContextCurrent(win);
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);
    std::cout << "Initialized with OpenGL "
              << major << "." << minor << std::endl;
    glfwDestroyWindow(win);
    glfwTerminate();
}
The output of the application is "Initialized with OpenGL 0.0". A window briefly opens and closes and the application terminates without errors.
The GL_MAJOR_VERSION and GL_MINOR_VERSION queries were introduced in GL 3.0. Prior to that, they will just generate a GL_INVALID_ENUM error during the glGetIntegerv call and leave your variables untouched.
You have to use glGetString(GL_VERSION) to reliably get the version number if you can't make sure that you are on a >= 3.0 context. If you need the numbers, you'll have to parse the string manually.
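For example, a minimal sketch of that parsing (assuming a desktop GL context, where the version string is required to begin with "<major>.<minor>", optionally followed by a release number and vendor-specific text):

#include <cstdio>
// Requires a current GL context; returns false if the string can't be parsed.
bool get_gl_version(int & major, int & minor)
{
    const char * version = reinterpret_cast<const char *>(glGetString(GL_VERSION));
    return version != nullptr
        && std::sscanf(version, "%d.%d", &major, &minor) == 2;
}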

Derelict3 SDL2 and OpenGL weird SIGSEGV on DerelictGL.reload()

Trying to get set up with SDL and OpenGL on D; specifically, SDL2 and OpenGL 3.3 core/forward-compatible (although I left the last two attributes out in the example, because it breaks at the same point whether or not they're there). The equivalent of the following in GLFW works fine, so apparently I'm screwing something up on the SDL end, or SDL does some magic that breaks Derelict - which seems hard to believe, considering that Derelict-gl doesn't do much more than load a few function pointers. Still, something goes wrong somewhere; I wouldn't exclude a bug in Derelict or SDL, though it's more likely my code.
I don't see it, though, and here it is:
import std.stdio;
import std.c.stdlib;
import derelict.sdl2.sdl;
import derelict.opengl3.gl;

void fatal_error_if(Cond,Args...)(Cond cond, string format, Args args) {
    if(!!cond) {
        stderr.writefln(format,args);
        exit(1);
    }
}

void main()
{
    // set up D bindings to SDL and OpenGL 1.1
    DerelictGL.load();
    DerelictSDL2.load();
    fatal_error_if(SDL_Init(SDL_INIT_VIDEO),"Failed to initialize sdl!");
    // we want OpenGL 3.3
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION,3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION,3);
    auto window = SDL_CreateWindow(
        "An SDL2 window",
        SDL_WINDOWPOS_UNDEFINED,
        SDL_WINDOWPOS_UNDEFINED,
        800,
        600,
        SDL_WINDOW_OPENGL); // we want this window to support OpenGL
    fatal_error_if(window is null,"Failed to create SDL window!");
    auto glprof = SDL_GL_CreateContext(window); // create the actual context and make it current
    fatal_error_if(glprof is null,"Failed to create GL context!");
    DerelictGL.reload(); //<-- BOOM SIGSEGV
    // just some stuff so we actually see something if nothing exploded
    glClearColor(1,0,0,0);
    glClear(GL_COLOR_BUFFER_BIT);
    SDL_GL_SwapWindow(window);
    SDL_Delay(5000);
    SDL_DestroyWindow(window);
    SDL_Quit();
    writeln("If we got to this point everything went alright...");
}
Like the question title says, it breaks on DerelictGL.reload() (which is supposed to load OpenGL functions, similar to GLEW). Here's the stack trace...
#0 0x00007ffff71a398d in __strstr_sse2_unaligned () from /usr/lib/libc.so.6
#1 0x000000000048b8d5 in derelict.opengl3.internal.findEXT() (extname=..., extstr=0x0)
at ../../../../.dub/packages/derelict-gl3-master/source/derelict/opengl3/internal.d:74
#2 0x000000000048b8b0 in derelict.opengl3.internal.isExtSupported() (name=..., glversion=<incomplete type>)
at ../../../../.dub/packages/derelict-gl3-master/source/derelict/opengl3/internal.d:67
#3 0x0000000000487778 in derelict.opengl3.gl.DerelictGLLoader.reload() (this=0x7ffff7ec5e80)
at ../../../../.dub/packages/derelict-gl3-master/source/derelict/opengl3/gl.d:48
#4 0x0000000000473bba in D main () at source/app.d:36
#5 0x00000000004980c8 in rt.dmain2._d_run_main() ()
#6 0x0000000000498022 in rt.dmain2._d_run_main() ()
#7 0x0000000000498088 in rt.dmain2._d_run_main() ()
#8 0x0000000000498022 in rt.dmain2._d_run_main() ()
#9 0x0000000000497fa3 in _d_run_main ()
#10 0x00000000004809e5 in main ()
The error here seems to occur because glGetString(GL_EXTENSIONS) returns null, though I don't quite understand why. If I remove the call to DerelictGL.reload, the rest of the program runs, but then post-OpenGL 1.1 functions don't get loaded.
To phrase this as an actual question - am I doing something wrong? If so, what?
Additional
I confirmed that an OpenGL 3.3 context was created - glGetIntegerv returns 3 for both GL_MAJOR_VERSION and GL_MINOR_VERSION.
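A side note that may explain the null: if the 3.3 context is a core profile (on Linux, Mesa has traditionally exposed 3.2+ contexts only as core), glGetString(GL_EXTENSIONS) is itself an invalid query there and returns null; extensions must be enumerated with the indexed query instead. A minimal C++ sketch of the core-safe way:

GLint count = 0;
glGetIntegerv(GL_NUM_EXTENSIONS, &count);  // query available since GL 3.0
for(GLint i = 0; i < count; ++i)
{
    // Indexed query; never returns null for valid indices.
    const char * ext = reinterpret_cast<const char *>(glGetStringi(GL_EXTENSIONS, i));
    std::cout << ext << '\n';
}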
This seems to be a bug in Derelict-gl3 - if I change this line in gl.d
if( maxVer >= GLVersion.GL12 && isExtSupported( GLVersion.GL12, "GL_ARB_imaging" ) ) {
to
if( maxVer >= GLVersion.GL12 && isExtSupported( maxVer, "GL_ARB_imaging" ) ) {
it works fine. I'll submit an issue on the GitHub repo to see if this is actually the case (I'm not that familiar with how Derelict works, but this appears fairly obvious to me).

Segmentation fault in OpenGL under Linux

I am following some tutorials and came up with the following code:
// rendering.cpp
#include "rendering.h"
#include <GL/gl.h>
#include <GL/freeglut.h>

void DrawGLScene()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
}

int InitGL(int argc, char** argv)
{
    /*glShadeModel(GL_SMOOTH);
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClearDepth(1.0f);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);
    glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);*/
    glutInit(&argc, argv);
    glutInitWindowSize(500, 500);
    glutInitWindowPosition(100, 100);
    glutDisplayFunc(DrawGLScene);
    glutCreateWindow("Swimming Simulation");
    glutMainLoop(); // Enter GLUT's main loop
    return true;
}
My main function is very simple and only calls that function:
#include "rendering.h"
int main(int argc, char** argv)
{
InitGL(argc, argv);
return 0;
}
I am compiling with this command:
g++ -Wall -g swim.cpp rendering.cpp -lglut -lGLU -o swim
Running swim creates a window as expected. However, if I uncomment the lines in InitGL, then I get a segmentation fault when running the program:
(gdb) r
Starting program: <dir>
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib64/libthread_db.so.1".
Program received signal SIGSEGV, Segmentation fault.
0x000000335ca52ca7 in glShadeModel () from /usr/lib64/libGL.so.1
Missing separate debuginfos, use: debuginfo-install freeglut-2.6.0-6.fc15.x86_64 glibc-2.14.90-24.fc16.6.x86_64 libX11-1.4.3-1.fc16.x86_64 libXau-1.0.6-2.fc15.x86_64 libXdamage-1.1.3-2.fc15.x86_64 libXext-1.3.0-1.fc16.x86_64 libXfixes-5.0-1.fc16.x86_64 libXi-1.4.5-1.fc16.x86_64 libXxf86vm-1.1.1-2.fc15.x86_64 libdrm-2.4.33-1.fc16.x86_64 libgcc-4.6.3-2.fc16.x86_64 libstdc++-4.6.3-2.fc16.x86_64 libxcb-1.7-3.fc16.x86_64 mesa-libGL-7.11.2-3.fc16.x86_64 mesa-libGLU-7.11.2-3.fc16.x86_64
(gdb) backtrace
#0 0x000000335ca52ca7 in glShadeModel () from /usr/lib64/libGL.so.1
#1 0x0000000000401d67 in InitGL (argc=1, argv=0x7fffffffe198)
at rendering.cpp:25
#2 0x0000000000401c8c in main (argc=1, argv=0x7fffffffe198) at swim.cpp:37
What should I be doing here to get my program to run without crashing?
You fell into a tricky pitfall of GLUT. GLUT is a sort of state machine, like OpenGL (though it's not part of OpenGL), and the callback functions must be set after creating or selecting a window. In your case, move the call to glutDisplayFunc (and any other callback setters) after the call to glutCreateWindow.
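A sketch of InitGL with the ordering fixed - window (and with it the GL context) first, then callbacks; the previously commented-out GL calls also become safe once the context exists:

int InitGL(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitWindowSize(500, 500);
    glutInitWindowPosition(100, 100);
    glutCreateWindow("Swimming Simulation"); // creates the window and GL context
    glutDisplayFunc(DrawGLScene);            // callback setters after the window
    glShadeModel(GL_SMOOTH);                 // GL calls are safe from here on
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClearDepth(1.0f);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);
    glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
    glutMainLoop(); // Enter GLUT's main loop
    return true;
}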
Get rid of GLUT and use something better like GLFW. Also, a lot of those functions are deprecated, so use a modern tutorial such as
http://www.opengl-tutorial.org/
or
http://ogldev.atspace.co.uk/
OpenGL functions can be called only when there is an OpenGL context - with GLUT, that means after the glutCreateWindow call. They shouldn't crash the application, though...