Segmentation fault in OpenGL under Linux - C++

I am following some tutorials and came up with the following code:
// rendering.cpp
#include "rendering.h"
#include <GL/gl.h>
#include <GL/freeglut.h>

void DrawGLScene()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
}

int InitGL(int argc, char** argv)
{
    /*glShadeModel(GL_SMOOTH);
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClearDepth(1.0f);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);
    glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);*/
    glutInit(&argc, argv);
    glutInitWindowSize(500, 500);
    glutInitWindowPosition(100, 100);
    glutDisplayFunc(DrawGLScene);
    glutCreateWindow("Swimming Simulation");
    glutMainLoop(); // Enter GLUT's main loop
    return true;
}
My main function is very simple and only calls that function:
#include "rendering.h"
int main(int argc, char** argv)
{
InitGL(argc, argv);
return 0;
}
I am compiling with this command:
g++ -Wall -g swim.cpp rendering.cpp -lglut -lGLU -o swim
Running swim creates a window as expected. However, if I uncomment the lines in InitGL, then I get a segmentation fault when running the program:
(gdb) r
Starting program: <dir>
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib64/libthread_db.so.1".
Program received signal SIGSEGV, Segmentation fault.
0x000000335ca52ca7 in glShadeModel () from /usr/lib64/libGL.so.1
Missing separate debuginfos, use: debuginfo-install freeglut-2.6.0-6.fc15.x86_64 glibc-2.14.90-24.fc16.6.x86_64 libX11-1.4.3-1.fc16.x86_64 libXau-1.0.6-2.fc15.x86_64 libXdamage-1.1.3-2.fc15.x86_64 libXext-1.3.0-1.fc16.x86_64 libXfixes-5.0-1.fc16.x86_64 libXi-1.4.5-1.fc16.x86_64 libXxf86vm-1.1.1-2.fc15.x86_64 libdrm-2.4.33-1.fc16.x86_64 libgcc-4.6.3-2.fc16.x86_64 libstdc++-4.6.3-2.fc16.x86_64 libxcb-1.7-3.fc16.x86_64 mesa-libGL-7.11.2-3.fc16.x86_64 mesa-libGLU-7.11.2-3.fc16.x86_64
(gdb) backtrace
#0 0x000000335ca52ca7 in glShadeModel () from /usr/lib64/libGL.so.1
#1 0x0000000000401d67 in InitGL (argc=1, argv=0x7fffffffe198)
at rendering.cpp:25
#2 0x0000000000401c8c in main (argc=1, argv=0x7fffffffe198) at swim.cpp:37
What should I be doing here to get my program to run without crashing?

You fell into a tricky pitfall of GLUT. GLUT is a sort of state machine, like OpenGL (though it is not part of OpenGL), and callback functions must be set after creating or selecting a window. In your case, move the call to glutDisplayFunc (and any other callback setters) after the call to glutCreateWindow. The same goes for the commented-out OpenGL calls such as glShadeModel: they need a current OpenGL context, which only exists once glutCreateWindow has run.
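A minimal reordering of your InitGL along those lines - the only change is the order of the calls, with the commented-out state setup restored now that a context exists when it runs:

int InitGL(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitWindowSize(500, 500);
    glutInitWindowPosition(100, 100);
    glutCreateWindow("Swimming Simulation"); // creates the window and its OpenGL context
    glutDisplayFunc(DrawGLScene);            // callbacks only after a window exists
    glShadeModel(GL_SMOOTH);                 // GL calls are safe now that a context is current
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClearDepth(1.0f);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);
    glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
    glutMainLoop(); // Enter GLUT's main loop
    return true;
}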

Get rid of GLUT and use something more modern like GLFW. Also, a lot of those functions are deprecated, so follow a modern tutorial such as
http://www.opengl-tutorial.org/
or
http://ogldev.atspace.co.uk/
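For comparison, a minimal GLFW 3 equivalent of the window above might look like this (a sketch; it assumes GLFW 3 is installed and the program is linked with -lglfw -lGL):

#include <GLFW/glfw3.h>

int main()
{
    if (!glfwInit())
        return 1;
    GLFWwindow *window = glfwCreateWindow(500, 500, "Swimming Simulation", nullptr, nullptr);
    if (!window)
    {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(window); // an OpenGL context exists from this point on
    while (!glfwWindowShouldClose(window))
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glfwSwapBuffers(window);
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}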

OpenGL functions can be called only when an OpenGL context exists - with GLUT, that means after the glutCreateWindow call. They shouldn't crash the application, though...

Related

GL_INVALID_OPERATION in glGetIntegerv() with GLAD

I use GLAD (config) to load OpenGL functions and GLFW 3.3.8 to create the context. Each time I start my program, it pops an ERROR 1282 in glGetIntegerv from GLAD's debug post-callback function (as far as I know, it is invoked after each gl* function and prints an error if one occurred). I figured that this happens after returning from main().
Here's the code (it loads OpenGL 3.3 and shows a red window until it is closed - pretty simple, I think):
#include <iostream>
#include <glad/glad.h>
#include <GLFW/glfw3.h>

int main()
{
    if(glfwInit() != GLFW_TRUE)
        throw std::runtime_error{"Unable to initialize GLFW."};
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    GLFWwindow * w{glfwCreateWindow(100, 100, "title", nullptr, nullptr)};
    if(w == nullptr)
        throw std::runtime_error{"Unable to create window."};
    glfwMakeContextCurrent(w);
    if(not gladLoadGLLoader(GLADloadproc(glfwGetProcAddress)))
        throw std::runtime_error{"Unable to load OpenGL functions."};
    glViewport(0, 0, 100, 100);
    while(not glfwWindowShouldClose(w))
    {
        glfwPollEvents();
        glClearColor(1.f, 0.f, 0.f, 1.f);
        glClear(GL_COLOR_BUFFER_BIT);
        glfwSwapBuffers(w);
    }
    glfwMakeContextCurrent(nullptr);
    glfwDestroyWindow(w);
    glfwTerminate();
    std::cout << "Hey!" << std::endl;
    return 0;
}
The output is:
Hey!
ERROR 1282 in glGetIntegerv
From this callstack:
#0 0x00416f91 in _post_call_callback_default_gl (name=0x446d40 <_glfwDataFormat+10036> "glGetIntegerv", funcptr=0x41c1ec <glad_debug_impl_glGetIntegerv#8>, len_args=2) at <glad.c>:45
#1 0x0041c265 in glad_debug_impl_glGetIntegerv#8 (arg0=33309, arg1=0x4526cc <num_exts_i>) at <glad.c>:1385
#2 0x00417168 in get_exts () at <glad.c>:220
#3 0x0042691f in find_extensionsGL () at <glad.c>:3742
#4 0x00426d12 in gladLoadGLLoader (load=0x402a2e <glfwGetProcAddress>) at <glad.c>:3821
#5 0x004016f8 in main () at <main.cpp>:33
Error 1282 is GL_INVALID_OPERATION, but it pops up after the program has ended (or at least after main() has ended). Even if I move the whole code into a separate function (i.e. create and destroy everything there) and then invoke it from main(), the error still appears after the return 0; from main().
This did not happen when I used GLEW to load OpenGL functions, but maybe the error was silenced. I didn't find anything similar to my problem on the internet. What am I doing wrong? Do I have to unload OpenGL or something like that?
UPD: The error message actually pops up in gladLoadGLLoader(), not after the end of main().
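For reference, glad's debug build lets you replace that post-call hook to pinpoint exactly which call raised an error - a minimal sketch, assuming the glad v1 debug generator (which exposes glad_set_post_callback):

#include <cstdio>
#include <glad/glad.h>

// Called by glad's debug wrappers after every GL function.
static void post_call(const char *name, void *funcptr, int len_args, ...)
{
    (void)funcptr; (void)len_args;
    GLenum err = glad_glGetError(); // raw pointer, so we don't re-enter the wrapper
    if (err != GL_NO_ERROR)
        std::fprintf(stderr, "GL error 0x%04X raised by %s\n", err, name);
}

// ...after gladLoadGLLoader(...) succeeds:
//     glad_set_post_callback(post_call);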

gdb Cannot find bounds of current function

I am developing an OpenGL program using MinGW32 on Windows 10 (64-bit).
The program runs without problems, but when I debug it using gdb, it shows:
(gdb) n
0x6a7706f8 in ?? ()
from C:\Windows\System32\DriverStore\FileRepository\c0310483.inf_amd64_ab6d2afa5c543409\atioglxx.dll
(gdb) n
Cannot find bounds of current function
(gdb)
Here is the code I want to debug
int main() {
    GLFWwindow * window = initGLContext();
    initImGui(window);
    int points[8] = { 0 };
    GLuint VAO, VBO;
    glGenVertexArrays(1, &VAO); // I set breakpoint here
    glGenBuffers(1, &VBO);
    GLShader curveShader("", "", "");
    while (!glfwWindowShouldClose(window)) {
        glfwPollEvents();
        useGUI(points);
        render();
        glfwSwapBuffers(window);
    }
    ImGui_ImplGlfwGL3_Shutdown();
    ImGui::DestroyContext();
    glfwTerminate();
    return 0;
}
Please let me know if more info is needed.
Thanks in advance!
Edited:
It turns out that glGenVertexArrays() is provided by atioglxx.dll, which lacks debug information, so I decided to use printf() instead.
But when I debug my program using gdb, it shows:
(gdb) n
0x6a7706f8 in ?? () from C:\Windows\System32\DriverStore...
This is happening because you are stopped inside atioglxx.dll, which has no debugging info (or even a symbol table).
When debugging, you need to be aware of your current context (e.g. which function you are stopped in).
When you are in your own code, and assuming you compiled it with debug info, you can use next, step, info locals, etc. But when you are in somebody else's code (e.g. a system-provided DLL), these commands will not work (and are not expected to).
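In that situation, the usual way out is to get back to a frame that does have debug info rather than single-stepping through the driver, e.g.:

(gdb) finish      # run until the current driver function returns to your code
(gdb) up          # or just select the calling frame, which is yours
(gdb) info locals # works again once the selected frame has debug info

(finish relies on gdb being able to unwind out of the driver frame, which may not always succeed without symbols.)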

Any way to fix "command timed out" GDB error for an SFML-using program in CLion?

The problem is that GDB fails to get debug info from some parts of a program that uses SFML. CLion specs:
CLion 2016.1.3
MinGW-w64 3.4
GDB 7.8.1
I found an answer suggesting temporarily stopping anti-virus software, but that did not work.
Code sample:
int main(int argc, char* argv[]) {
    sf::RenderWindow window(sf::VideoMode(800, 600), "myproject");
    Interface interface(window);
    /* Setting up 'interface' */
    while (window.isOpen()) {
        sf::Event event;
        while (window.pollEvent(event)) {
            /* Capture events */
        }
        window.clear();
        interface.draw(); // Breakpoint here, information captured instantly
    }
}
Going inside interface.draw():
void draw() {
    for (FramePtr &ptr : activeFrameStack) // Debugger fails to get info here
        ptr->draw(window);
}
Get the recently released 2016.2 version, which fixes the 'command timed out' problem and updates the bundled GDB to 7.11 (https://blog.jetbrains.com/clion/2016/07/clion-2016-2-released/).

Derelict3 SDL2 and OpenGL weird SIGSEGV on DerelictGL.reload()

I'm trying to get set up with SDL and OpenGL on D - specifically, SDL2 and an OpenGL 3.3 core/forward-compatible context (although I left the last two attributes out of the example, because it breaks at the same point whether or not they're there). The equivalent code using GLFW works fine, so apparently I'm screwing something up on the SDL end, or SDL does some magic that breaks Derelict - which seems hard to believe, considering that Derelict-gl doesn't do much more than load a few function pointers. Still, something goes wrong somewhere, and while I wouldn't exclude a bug in Derelict or SDL, it's more likely my code.
I don't see it though, and here it is:
import std.stdio;
import std.c.stdlib;
import derelict.sdl2.sdl;
import derelict.opengl3.gl;

void fatal_error_if(Cond, Args...)(Cond cond, string format, Args args) {
    if(!!cond) {
        stderr.writefln(format, args);
        exit(1);
    }
}

void main()
{
    // set up D bindings to SDL and OpenGL 1.1
    DerelictGL.load();
    DerelictSDL2.load();
    fatal_error_if(SDL_Init(SDL_INIT_VIDEO), "Failed to initialize sdl!");
    // we want OpenGL 3.3
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
    auto window = SDL_CreateWindow(
        "An SDL2 window",
        SDL_WINDOWPOS_UNDEFINED,
        SDL_WINDOWPOS_UNDEFINED,
        800,
        600,
        SDL_WINDOW_OPENGL); // we want this window to support OpenGL
    fatal_error_if(window is null, "Failed to create SDL window!");
    auto glprof = SDL_GL_CreateContext(window); // Create the actual context and make it current
    fatal_error_if(glprof is null, "Failed to create GL context!");
    DerelictGL.reload(); //<-- BOOM SIGSEGV
    // just some stuff so we actually see something if nothing exploded
    glClearColor(1, 0, 0, 0);
    glClear(GL_COLOR_BUFFER_BIT);
    SDL_GL_SwapWindow(window);
    SDL_Delay(5000);
    SDL_DestroyWindow(window);
    SDL_Quit();
    writeln("If we got to this point everything went alright...");
}
Like the question title says, it breaks on DerelictGL.reload() (which is supposed to load OpenGL functions, similar to GLEW). Here's the stack trace...
#0 0x00007ffff71a398d in __strstr_sse2_unaligned () from /usr/lib/libc.so.6
#1 0x000000000048b8d5 in derelict.opengl3.internal.findEXT() (extname=..., extstr=0x0)
at ../../../../.dub/packages/derelict-gl3-master/source/derelict/opengl3/internal.d:74
#2 0x000000000048b8b0 in derelict.opengl3.internal.isExtSupported() (name=..., glversion=<incomplete type>)
at ../../../../.dub/packages/derelict-gl3-master/source/derelict/opengl3/internal.d:67
#3 0x0000000000487778 in derelict.opengl3.gl.DerelictGLLoader.reload() (this=0x7ffff7ec5e80)
at ../../../../.dub/packages/derelict-gl3-master/source/derelict/opengl3/gl.d:48
#4 0x0000000000473bba in D main () at source/app.d:36
#5 0x00000000004980c8 in rt.dmain2._d_run_main() ()
#6 0x0000000000498022 in rt.dmain2._d_run_main() ()
#7 0x0000000000498088 in rt.dmain2._d_run_main() ()
#8 0x0000000000498022 in rt.dmain2._d_run_main() ()
#9 0x0000000000497fa3 in _d_run_main ()
#10 0x00000000004809e5 in main ()
The error here seems to occur because glGetString(GL_EXTENSIONS) returns null; why, I don't quite understand. If I remove the call to DerelictGL.reload, the rest of the program runs, but that would mean that post-OpenGL-1.1 functions don't get loaded.
To phrase this as an actual question: am I doing something wrong? If so, what?
Additional
I confirmed that an OpenGL 3.3 context was created - glGet returns 3 for both GL_MAJOR_VERSION and GL_MINOR_VERSION.
This seems to be a bug in Derelict-gl3 - if I change this line in gl.d
if( maxVer >= GLVersion.GL12 && isExtSupported( GLVersion.GL12, "GL_ARB_imaging" ) ) {
to
if( maxVer >= GLVersion.GL12 && isExtSupported( maxVer, "GL_ARB_imaging" ) ) {
it works fine. I'll submit an issue on the GitHub repo to see if this is actually the case (I'm not that familiar with how Derelict works, but this appears fairly obvious to me).
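As background on the null return: in OpenGL 3.x core contexts, glGetString(GL_EXTENSIONS) was removed and returns null; extensions have to be enumerated with glGetStringi instead. A minimal C++ sketch of the core-profile way to check for an extension (assuming a current 3.x context with function pointers already loaded):

#include <cstring>

bool hasExtension(const char *name)
{
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count); // number of individual extension strings
    for (GLint i = 0; i < count; ++i)
    {
        const char *ext = reinterpret_cast<const char *>(glGetStringi(GL_EXTENSIONS, i));
        if (ext && std::strcmp(ext, name) == 0) // e.g. name = "GL_ARB_imaging"
            return true;
    }
    return false;
}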

OpenGL program doesn't start in Qt Creator

I'm using Qt Creator on Linux Mint and I'm trying to run an OpenGL program.
The build doesn't give any errors, yet when I try to run the program inside Qt Creator, a terminal window appears and nothing else happens. When I run the program directly in a terminal, I get the following output:
OpenGL version supported by this platform (3.3.0 NVIDIA 295.40):
OpenGL 3.3.0 NVIDIA 295.40, GLSL 3.30 NVIDIA via Cg compiler
Ready for OpenGL 2.0
Segmentation fault (core dumped)
My code:
#include <stdio.h>
#include <GL/glew.h>
#include <GL/gl.h>
#include <GL/glu.h>
#include <GL/glut.h>
#include <GL/glext.h>

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitWindowSize(600, 600);
    glutInitWindowPosition(100, 100);
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);
    glutCreateWindow("Aquarium");
    glutDisplayFunc(onDisplay);
    glutMouseFunc(onMouse);
    glutIdleFunc(onIdle);
    glutKeyboardFunc(onKeyboard);
    glutReshapeFunc(onReshape);
    printf("OpenGL version supported by this platform (%s): \n", glGetString(GL_VERSION));
    printf("OpenGL %s, GLSL %s\n", glGetString(GL_VERSION), glGetString(GL_SHADING_LANGUAGE_VERSION));
    glewInit();
    if (glewIsSupported("GL_VERSION_2_0"))
        printf("Ready for OpenGL 2.0\n");
    else {
        printf("OpenGL 2.0 not supported\n");
        exit(1);
    }
    onInitialization();
    glutMainLoop();
    return 0;
}
I have defined the event handlers, and I have the onInitialization() method too.
If I printf something at the beginning of onInitialization(), the program doesn't write anything apart from the rows shown earlier, so I don't think it even steps inside onInitialization(). I can't debug this program in Qt Creator either. What can cause this? And what can cause the program not to start inside Qt Creator? I have a .pro file:
QT -= gui core
TARGET = main
CONFIG += console
TEMPLATE = app
LIBS += /usr/lib/libglut.so /usr/lib/compiz/libopengl.so /usr/lib/i386-linux-gnu/libGLU.so /usr/lib/i386-linux-gnu/libGLEW.so
QMAKE_CXXFLAGS += -W -Wall -Wextra -pedantic
SOURCES += main.cpp
With the same settings I've been able to run the program under Windows (of course, the LIBS were different there).
The onInitialization():
void onInitialization() {
    printf("onInitialization");
    soft.controlpoints.push_back(Point(5., 5., 5.));
    soft.speeds.push_back(Point(0., 0., 0.));
    soft.controlpoints.push_back(Point(5, -5., 5.));
    soft.speeds.push_back(Point(0., 0., 0.));
    soft.controlpoints.push_back(Point(5., -5., -5.));
    soft.speeds.push_back(Point(0., 0., 0.));
    soft2.controlpoints.push_back(Point(5., 5., 5.));
    soft2.speeds.push_back(Point(0., 0., 0.));
    soft2.controlpoints.push_back(Point(5, -5., 5.));
    soft2.speeds.push_back(Point(0., 0., 0.));
    soft2.controlpoints.push_back(Point(5., -5., -5.));
    soft2.speeds.push_back(Point(0., 0., 0.));
    soft2.controlpoints.push_back(Point(-5., 5., -5.));
    soft2.speeds.push_back(Point(0., 0., 0.));
    soft.set_r();
    soft2.set_r();
    aquarium.objects.push_back(&water);
    aquarium.objects.push_back(&fish);
    aquarium.objects.push_back(&field);
    aquarium.objects.push_back(&soft2);
    aquarium.createMaterials();
    aquarium.createVoxelArray();
    glEnable(GL_DEPTH_TEST);
    lastMovingTime = 0.;
    setShadowMapShaders();
    setAquariumShaders();
}
It doesn't print even the "onInitialization" string. The objects in the method are global variables, and all of the methods called here are implemented. What can cause it not to print the "onInitialization" string? soft.controlpoints, soft.speeds, and aquarium.objects are public fields. Even if I comment out everything else, the string still doesn't appear, but the window is created. And it still doesn't run from inside Qt Creator.
It turns out that the problem was with the file reader method (which reads in the GLSL shader source code): Qt Creator's working directory wasn't set to the directory containing the shader source files.
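(As an aside, the missing "onInitialization" output doesn't prove the function was never entered: that printf has no trailing \n and stdout is typically line-buffered, so the text can be lost when the program crashes before a flush.) In Qt Creator, the working directory can be adjusted in the run settings under Projects mode. A file reader that fails loudly instead of crashing later makes this kind of problem much easier to spot - a minimal sketch (readShaderFile is a hypothetical helper, not part of the original code):

#include <fstream>
#include <sstream>
#include <stdexcept>
#include <string>

// Read an entire shader source file, throwing a clear error if the path
// cannot be opened (e.g. because of a wrong working directory).
std::string readShaderFile(const std::string &path)
{
    std::ifstream file(path);
    if (!file)
        throw std::runtime_error("Cannot open shader file: " + path);
    std::ostringstream ss;
    ss << file.rdbuf();
    return ss.str();
}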