Derelict3 SDL2 and OpenGL weird SIGSEGV on DerelictGL.reload()

I'm trying to get set up with SDL and OpenGL in D, specifically SDL2 and OpenGL 3.3 core/forward compatible (although I left those two attributes out of the example, because it breaks at the same point whether or not they're there). The equivalent code using GLFW works fine, so apparently I'm screwing something up on the SDL end, or SDL does something that breaks Derelict. That seems hard to believe, considering Derelict-gl doesn't do much more than load a few function pointers, but something goes wrong somewhere, and while I wouldn't exclude a bug in Derelict or SDL, it's more likely my code.
I don't see it though, and here it is:
import std.stdio;
import std.c.stdlib;
import derelict.sdl2.sdl;
import derelict.opengl3.gl;
void fatal_error_if(Cond, Args...)(Cond cond, string format, Args args) {
    if (!!cond) {
        stderr.writefln(format, args);
        exit(1);
    }
}

void main()
{
    // set up D bindings to SDL and OpenGL 1.1
    DerelictGL.load();
    DerelictSDL2.load();

    fatal_error_if(SDL_Init(SDL_INIT_VIDEO), "Failed to initialize sdl!");

    // we want OpenGL 3.3
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);

    auto window = SDL_CreateWindow(
        "An SDL2 window",
        SDL_WINDOWPOS_UNDEFINED,
        SDL_WINDOWPOS_UNDEFINED,
        800,
        600,
        SDL_WINDOW_OPENGL); // we want this window to support OpenGL
    fatal_error_if(window is null, "Failed to create SDL window!");

    auto glprof = SDL_GL_CreateContext(window); // Create the actual context and make it current
    fatal_error_if(glprof is null, "Failed to create GL context!");

    DerelictGL.reload(); // <-- BOOM SIGSEGV

    // just some stuff so we actually see something if nothing exploded
    glClearColor(1, 0, 0, 0);
    glClear(GL_COLOR_BUFFER_BIT);
    SDL_GL_SwapWindow(window);
    SDL_Delay(5000);

    SDL_DestroyWindow(window);
    SDL_Quit();
    writeln("If we got to this point everything went alright...");
}
Like the question title says, it breaks on DerelictGL.reload() (which is supposed to load OpenGL functions, similarly to GLEW). Here's the stack trace...
#0 0x00007ffff71a398d in __strstr_sse2_unaligned () from /usr/lib/libc.so.6
#1 0x000000000048b8d5 in derelict.opengl3.internal.findEXT() (extname=..., extstr=0x0)
at ../../../../.dub/packages/derelict-gl3-master/source/derelict/opengl3/internal.d:74
#2 0x000000000048b8b0 in derelict.opengl3.internal.isExtSupported() (name=..., glversion=<incomplete type>)
at ../../../../.dub/packages/derelict-gl3-master/source/derelict/opengl3/internal.d:67
#3 0x0000000000487778 in derelict.opengl3.gl.DerelictGLLoader.reload() (this=0x7ffff7ec5e80)
at ../../../../.dub/packages/derelict-gl3-master/source/derelict/opengl3/gl.d:48
#4 0x0000000000473bba in D main () at source/app.d:36
#5 0x00000000004980c8 in rt.dmain2._d_run_main() ()
#6 0x0000000000498022 in rt.dmain2._d_run_main() ()
#7 0x0000000000498088 in rt.dmain2._d_run_main() ()
#8 0x0000000000498022 in rt.dmain2._d_run_main() ()
#9 0x0000000000497fa3 in _d_run_main ()
#10 0x00000000004809e5 in main ()
The error seems to occur because glGetString(GL_EXTENSIONS) returns null, though I don't quite understand why. If I remove the call to DerelictGL.reload the rest of the program runs, but then the post-OpenGL 1.1 functions never get loaded.
To phrase this as an actual question - am I doing something wrong? If so, what?
Additional
I confirmed that an OpenGL 3.3 context was created: glGetIntegerv returns 3 for both GL_MAJOR_VERSION and GL_MINOR_VERSION.
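Derelict binds the C SDL/OpenGL APIs directly, so here is that check as a rough C-style sketch, meant to run right after SDL_GL_CreateContext above (the D calls differ only in syntax; note that the GL_MAJOR_VERSION/GL_MINOR_VERSION queries only exist on GL 3.0+ contexts):
// Sketch: ask both the driver and SDL what we actually got.
GLint major = 0, minor = 0;
glGetIntegerv(GL_MAJOR_VERSION, &major);   // what the driver actually created
glGetIntegerv(GL_MINOR_VERSION, &minor);

int sdlMajor = 0, sdlMinor = 0;            // SDL's record of the attributes it applied
SDL_GL_GetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, &sdlMajor);
SDL_GL_GetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, &sdlMinor);

printf("GL %d.%d (SDL asked for %d.%d)\n", major, minor, sdlMajor, sdlMinor);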

This seems to be a bug in Derelict-gl3 - if I change this line in gl.d
if( maxVer >= GLVersion.GL12 && isExtSupported( GLVersion.GL12, "GL_ARB_imaging" ) ) {
to
if( maxVer >= GLVersion.GL12 && isExtSupported( maxVer, "GL_ARB_imaging" ) ) {
it works fine. I'll submit an issue on the GitHub repo to see if this is actually the case (I'm not that familiar with Derelict's internals, but this looks fairly clear-cut to me).
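Side note on the null pointer itself: on a core profile context, glGetString(GL_EXTENSIONS) is an error and returns null (which would explain the extstr=0x0 in the trace); extensions have to be enumerated through glGetStringi there. A rough C-style sketch of a check that handles both cases, assuming a loader (Derelict, GLEW, ...) has populated the GL 3.x entry points:
// Sketch: extension check that works on both legacy and core profile contexts.
// Needs <string.h> and the GL 3.x declarations from your loader's header.
int has_extension(const char *name)
{
    GLint major = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);   // stays 0 on pre-3.0 contexts
    if (major >= 3) {
        GLint count = 0;
        glGetIntegerv(GL_NUM_EXTENSIONS, &count);
        for (GLint i = 0; i < count; ++i)
            if (strcmp((const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i), name) == 0)
                return 1;
        return 0;
    }
    const char *all = (const char *)glGetString(GL_EXTENSIONS);   // legacy path
    return all != NULL && strstr(all, name) != NULL;
}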

Related

GL_INVALID_OPERATION in glGetIntegerv() with GLAD

I use GLAD (config) to load OpenGL functions and GLFW 3.3.8 to create the context. Each time I start my program it pops an ERROR 1282 in glGetIntegerv from GLAD's debug post-call callback (as far as I know it is invoked after each gl* function and prints an error if one occurred). I figured that this happens after returning from main().
Here's the code (it loads OpenGL 3.3 and shows a red window until it is closed, pretty simple I think):
#include <iostream>
#include <stdexcept>
#include <glad/glad.h>
#include <GLFW/glfw3.h>

int main()
{
    if(glfwInit() != GLFW_TRUE)
        throw std::runtime_error{"Unable to initialize GLFW."};

    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

    GLFWwindow * w{glfwCreateWindow(100, 100, "title", nullptr, nullptr)};
    if(w == nullptr)
        throw std::runtime_error{"Unable to create window."};

    glfwMakeContextCurrent(w);

    if(not gladLoadGLLoader(GLADloadproc(glfwGetProcAddress)))
        throw std::runtime_error{"Unable to load OpenGL functions."};

    glViewport(0, 0, 100, 100);

    while(not glfwWindowShouldClose(w))
    {
        glfwPollEvents();
        glClearColor(1.f, 0.f, 0.f, 1.f);
        glClear(GL_COLOR_BUFFER_BIT);
        glfwSwapBuffers(w);
    }

    glfwMakeContextCurrent(nullptr);
    glfwDestroyWindow(w);
    glfwTerminate();

    std::cout << "Hey!" << std::endl;
    return 0;
}
The output is:
Hey!
ERROR 1282 in glGetIntegerv
From this callstack:
#0 0x00416f91 in _post_call_callback_default_gl (name=0x446d40 <_glfwDataFormat+10036> "glGetIntegerv", funcptr=0x41c1ec <glad_debug_impl_glGetIntegerv#8>, len_args=2) at <glad.c>:45
#1 0x0041c265 in glad_debug_impl_glGetIntegerv#8 (arg0=33309, arg1=0x4526cc <num_exts_i>) at <glad.c>:1385
#2 0x00417168 in get_exts () at <glad.c>:220
#3 0x0042691f in find_extensionsGL () at <glad.c>:3742
#4 0x00426d12 in gladLoadGLLoader (load=0x402a2e <glfwGetProcAddress>) at <glad.c>:3821
#5 0x004016f8 in main () at <main.cpp>:33
Error 1282 is GL_INVALID_OPERATION, but it pops up after the program has ended (or at least after main() has ended). Even if I move all of the code into another function (i.e. create and destroy everything in a separate function) and then invoke it from main(), the error still appears after the return 0; from main().
This did not happen when I used GLEW to load OpenGL functions, but maybe it was silenced. I didn't find anything similar to my problem on the internet. What am I doing wrong? Do I have to unload OpenGL or something like that?
UPD: the error message actually pops up in gladLoadGLLoader(), not after the end of main().
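One way to pin down where the error is actually raised, independent of GLAD's debug callback (the failing query in the trace, arg0=33309, is 0x821D, i.e. GL_NUM_EXTENSIONS), is to drain the GL error queue by hand right after the loader runs. A minimal sketch, assuming it sits directly after the gladLoadGLLoader call in the code above (the helper name is made up):
// Sketch: print everything in the GL error queue right after the loader runs,
// so the report can't be mistaken for something that happens after main().
auto drain_gl_errors = [](const char *where)
{
    // 0x502 is GL_INVALID_OPERATION (decimal 1282)
    for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
        std::cerr << "GL error 0x" << std::hex << err << std::dec << " after " << where << '\n';
};
drain_gl_errors("gladLoadGLLoader");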

gdb Cannot find bounds of current function

I am developing an OpenGL program using MinGW32 on Windows 10 (64-bit).
The program runs without problems, but when I debug it with gdb, it shows:
(gdb) n
0x6a7706f8 in ?? ()
from C:\Windows\System32\DriverStore\FileRepository\c0310483.inf_amd64_ab6d2afa5c543409\atioglxx.dll
(gdb) n
Cannot find bounds of current function
(gdb)
Here is the code I want to debug
int main() {
    GLFWwindow * window = initGLContext();
    initImGui(window);

    int points[8] = { 0 };

    GLuint VAO, VBO;
    glGenVertexArrays(1, &VAO); // I set breakpoint here
    glGenBuffers(1, &VBO);

    GLShader curveShader("", "", "");

    while (!glfwWindowShouldClose(window)) {
        glfwPollEvents();
        useGUI(points);
        render();
        glfwSwapBuffers(window);
    }

    ImGui_ImplGlfwGL3_Shutdown();
    ImGui::DestroyContext();
    glfwTerminate();
    return 0;
}
Please let me know if more info is needed.
Thanks in advance.
Edited:
It turns out that my program lacks the debug information for glGenVertexArrays(), which is provided by atioglxx.dll, so I decided to use printf() instead.
But when I debug my program using gdb, it shows:
(gdb) n
0x6a7706f8 in ?? () from C:\Windows\System32\DriverStore...
This is happening because you are stopped inside atioglxx.dll, which has no debugging info (or even a symbol table).
When debugging, you need to be aware of your current context (e.g. which function am I stopped in).
When you are in your own code, and assuming you compiled it with debug info, you can do next, step, info locals, etc. But when you are in somebody else's code (e.g. in a system-provided DLL), these commands will not work (and are not expected to).
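In practice, the quickest ways back to your own frame are finish (run until the current DLL function returns, if gdb can unwind the stack) or simply setting a breakpoint on the next line of your own source and continuing; for example (the file name and line number are placeholders for wherever your next statement is):
(gdb) finish
(gdb) break main.cpp:20
(gdb) continue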

Weird segfaults when calling TTF_OpenFont

I ported a game I am making from SDL 1.2 to SDL2. After porting the game and getting it to compile properly I get a segfault when I call TTF_OpenFont here:
bool cargararchivos(SDL_Texture* &background, SDL_Texture* &player, TTF_Font* &font,
                    SDL_Texture* &bullet, Config* placlips, SDL_Renderer* renderer)
{
    string playerss;

    //Open the font
    font = TTF_OpenFont( "lazy.ttf", 28 );
    //If there was an error in loading the font
    if(font == NULL)
    {
        return false;
    }

    try {
        playerss = placlips->lookup("filename").c_str();
    } catch(const SettingNotFoundException &nfex)
    {
        cerr << "No 'name' setting in configuration file." << endl;
        return false;
    }

    //Open background
    background = cargarimagen("fondo.png", renderer);
    if(background == NULL){
        return false;
    }
    //Open player sprites
    player = cargarimagen(playerss, renderer);
    if(player == NULL){
        return false;
    }
    bullet = cargarimagen("bullet.png", renderer);
    if(bullet == NULL)
        return false;
    return true;
}
The segfault happens before TTF_OpenFont ends. The backtrace I get is:
#0 ?? ?? () (??:??)
#1 0x7ffff7410ce5 TTF_CloseFont(font=0x8af1e0) (SDL_ttf.c:933)
#2 0x7ffff74110fd TTF_OpenFontIndexRW(src=<optimized out>, freesrc=<optimized out>, ptsize=<optimized out>, index=0) (SDL_ttf.c:489)
#3 0x409c9d cargararchivos(background=#0x7fffffffe598: 0x0, player=#0x7fffffffe590: 0x0, font=#0x7fffffffe580: 0x0, bullet=#0x7fffffffe588: 0x0, placlips=0x7fffffffe560, renderer=0x9c25b0) (/home/xxxxx/xxxxx/main.cpp:33)
#4 0x40a526 main(argc=1, args=0x7fffffffe6e8) (/home/xxxxx/xxxxx/main.cpp:173)
If I take all the SDL_ttf stuff out I still get a similar segfault, but with IMG_Load. I suspect it is an issue with my Code::Blocks setup, because I can build the Lazy Foo SDL2 tutorials fine with g++ and run them. Or maybe it is a bug? I am using Debian sid (Linux), by the way. Please help.
D'oh!
There was an include of the 1.2 version of SDL_ttf that I forgot to change to the new version. Man, I am stupid. Thanks keltar!
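For anyone else landing here with the same symptom: a jump through a bogus function pointer inside TTF_OpenFont/TTF_CloseFont is what mixing the SDL 1.2 headers with the SDL2 libraries tends to look like. A minimal sanity check that everything really comes from SDL2 (header locations and build flags may differ per distro) could look like this:
// Sketch: SDL2 + SDL2_ttf only; build with something like
//   g++ check.cpp $(sdl2-config --cflags --libs) -lSDL2_ttf
#include <SDL2/SDL.h>       // the point is that both headers are the SDL2 versions
#include <SDL2/SDL_ttf.h>
#include <cstdio>

int main()
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0) { std::printf("SDL_Init: %s\n", SDL_GetError()); return 1; }
    if (TTF_Init() != 0)               { std::printf("TTF_Init: %s\n", TTF_GetError()); return 1; }

    TTF_Font *font = TTF_OpenFont("lazy.ttf", 28);
    if (!font) std::printf("TTF_OpenFont: %s\n", TTF_GetError());
    else       TTF_CloseFont(font);

    TTF_Quit();
    SDL_Quit();
    return 0;
}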

Using SDL2 with wxWidgets 3.0

My goal is to build a Game Boy emulator. In order to do this, I would like to embed an SDL2 surface into a wxWidgets window.
I found this tutorial: http://code.technoplaza.net/wx-sdl/part1/, but my program crashes as soon as I run it. However, I suspect it was intended for SDL 1.2. Part of the program is shown below.
It seems that if I call SDL_Init() and also attempt to show a wxFrame (which, in this case, is MainWindow), the window shows for a second and then the program crashes. I have commented out all other calls to SDL in my program so far, so the problem seems to lie with calling Show() on a wxFrame and initializing SDL2 in the same program.
So the question is: can SDL2 and wxWidgets 3 work together? If not, could you suggest good alternatives for the GUI of a Game Boy emulator? Does wxWidgets have its own graphics frame like Qt does (I'd rather avoid Qt)?
Thanks very much!
#include "MainApp.h"
#include "MainWindow.h"
#include <stdexcept>
namespace GBEmu {
static void initSDL() {
//This and SDL_Quit() are the only calls to the SDL library in my code
if (SDL_Init(SDL_INIT_EVERYTHING) < 0) {
throw std::runtime_error("Fatal Error: Could not init SDL");
}
}
bool MainApp::OnInit()
{
try {
//If I comment out this line, the MainWindow wxFrame shows up fine.
//If I leave both uncommented, the window shows up quickly and then
//crashes.
initSDL();
//If I comment out this line and leave initSDL() uncommented,
//the program will not crash, but just run forever.
(new MainWindow("GBEmu", {50,50}, {640,480}))->Show();
} catch(std::exception &e) {
wxLogMessage(_("Fatal Error: " + std::string(e.what())));
}
return true;
}
int MainApp::OnExit() {
SDL_Quit();
return wxApp::OnExit();
}
}
wxIMPLEMENT_APP(GBEmu::MainApp);
EDIT: Here is more information on how it crashes: it segfaults in what appears to be pthread_mutex_lock (gdb only shows the disassembly). This is the console output with the stack trace:
Starting /home/dan/Documents/devStuff/GBEmuWx-build/GBEmuWx...
The program has unexpectedly finished.
/home/dan/Documents/devStuff/GBEmuWx-build/GBEmuWx crashed
Stack trace:
Error: signal 11:
/home/dan/Documents/devStuff/GBEmuWx-build/GBEmuWx(_ZN5GBEmu7handlerEi+0x1c)[0x414805]
/lib/x86_64-linux-gnu/libc.so.6(+0x36ff0)[0x7fb88e136ff0]
/lib/x86_64-linux-gnu/libpthread.so.0(pthread_mutex_lock+0x30)[0x7fb88c12ffa0]
/usr/lib/x86_64-linux-gnu/libX11.so.6(XrmQGetResource+0x3c)[0x7fb88d1ca15c]
/usr/lib/x86_64-linux-gnu/libX11.so.6(XGetDefault+0xc2)[0x7fb88d1a7a92]
/usr/lib/x86_64-linux-gnu/libcairo.so.2(+0x94dcf)[0x7fb88af8edcf]
/usr/lib/x86_64-linux-gnu/libcairo.so.2(+0x97110)[0x7fb88af91110]
/usr/lib/x86_64-linux-gnu/libcairo.so.2(cairo_surface_get_font_options+0x87)[0x7fb88af63e07]
/usr/lib/x86_64-linux-gnu/libcairo.so.2(+0x2b61f)[0x7fb88af2561f]
/usr/lib/x86_64-linux-gnu/libcairo.so.2(+0x2ef95)[0x7fb88af28f95]
A screenshot showed it failing at line 7 of that disassembly.
Update: In my MainWindow class, I attach a menu bar to the window. However, it seems that when I comment out the setting of the menu bar, the window shows up fine even with SDL initialized. The menu bar also shows up fine if initSDL() is commented out but the menu bar code is not. Here is where I set the menu bar:
MainWindow::MainWindow(const wxString &title, const wxPoint &pos, const wxSize &size)
    : wxFrame(nullptr, wxIDs::MainWindow, title, pos, size) {

    wxMenu *fileMenu = new wxMenu;
    fileMenu->Append(wxID_EXIT);

    wxMenuBar *menuBar = new wxMenuBar;
    menuBar->Append(fileMenu, "&File");

    //commenting this line out will allow the window to show up
    //and not crash the program
    SetMenuBar(menuBar);
}
You are experiencing an old heisenbug.
The workaround is simple: you have to initialize SDL before wxWidgets (basically, before GTK). To achieve this, you have to change
wxIMPLEMENT_APP(GBEmu::MainApp);
to
wxIMPLEMENT_APP_NO_MAIN(GBEmu::MainApp);
so that wxWidgets doesn't hijack your main().
Then you have to create main() manually. In it, initialize SDL, then call wxEntry():
// headers needed for this snippet (SDL header path may differ on your setup)
#include <SDL2/SDL.h>
#include <iostream>
#include <wx/wx.h>   // pulls in wxEntry

int main(int argc, char** argv)
{
    if (SDL_Init(SDL_INIT_EVERYTHING) < 0)
    {
        std::cerr << "Could not initialize SDL.\n";
        return 1;
    }

    return wxEntry(argc, argv);
}
More about the bug:
I have googled around a bit and found that this bug has come up in a few places over the years. There are open reports in many bug trackers that have stack traces very similar to the one you get here (with debug symbols).
The oldest report I could find is from 2005 (!!) from the cairo bug tracker (https://bugs.freedesktop.org/show_bug.cgi?id=4373).
My best guess is that the real hiding place of this bug is either in GTK, cairo, or X. Unfortunately, I do not currently have the time to look into it in more depth.

Segmentation fault in OpenGL under Linux

I am following some tutorials and came up with the following code:
// rendering.cpp
#include "rendering.h"
#include <GL/gl.h>
#include <GL/freeglut.h>

void DrawGLScene()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
}

int InitGL(int argc, char** argv)
{
    /*glShadeModel(GL_SMOOTH);
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClearDepth(1.0f);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);
    glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);*/

    glutInit(&argc, argv);
    glutInitWindowSize(500, 500);
    glutInitWindowPosition(100, 100);
    glutDisplayFunc(DrawGLScene);
    glutCreateWindow("Swimming Simulation");

    glutMainLoop(); // Enter GLUT's main loop

    return true;
}
My main function is very simple and only calls that function:
#include "rendering.h"
int main(int argc, char** argv)
{
InitGL(argc, argv);
return 0;
}
I am compiling with this command:
g++ -Wall -g swim.cpp rendering.cpp -lglut -lGLU -o swim
Running swim creates a window as expected. However, if I uncomment the lines in InitGL, then I get a segmentation fault when running the program:
(gdb) r
Starting program: <dir>
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib64/libthread_db.so.1".
Program received signal SIGSEGV, Segmentation fault.
0x000000335ca52ca7 in glShadeModel () from /usr/lib64/libGL.so.1
Missing separate debuginfos, use: debuginfo-install freeglut-2.6.0-6.fc15.x86_64 glibc-2.14.90-24.fc16.6.x86_64 libX11-1.4.3-1.fc16.x86_64 libXau-1.0.6-2.fc15.x86_64 libXdamage-1.1.3-2.fc15.x86_64 libXext-1.3.0-1.fc16.x86_64 libXfixes-5.0-1.fc16.x86_64 libXi-1.4.5-1.fc16.x86_64 libXxf86vm-1.1.1-2.fc15.x86_64 libdrm-2.4.33-1.fc16.x86_64 libgcc-4.6.3-2.fc16.x86_64 libstdc++-4.6.3-2.fc16.x86_64 libxcb-1.7-3.fc16.x86_64 mesa-libGL-7.11.2-3.fc16.x86_64 mesa-libGLU-7.11.2-3.fc16.x86_64
(gdb) backtrace
#0 0x000000335ca52ca7 in glShadeModel () from /usr/lib64/libGL.so.1
#1 0x0000000000401d67 in InitGL (argc=1, argv=0x7fffffffe198)
at rendering.cpp:25
#2 0x0000000000401c8c in main (argc=1, argv=0x7fffffffe198) at swim.cpp:37
What should I be doing here to get my program to run without crashing?
You fell into a tricky pitfall of GLUT. GLUT is sort of a state machine, like OpenGL (it's not part of OpenGL), and the callback functions must be set after creating or selecting a window. In your case, move the call to glutDisplayFunc (and any other callback setters) to after the call to glutCreateWindow, as in the sketch below.
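For reference, a sketch of InitGL with the calls from the question reordered that way (window, and with it the context, first; then callbacks and GL state):
int InitGL(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitWindowSize(500, 500);
    glutInitWindowPosition(100, 100);
    glutCreateWindow("Swimming Simulation");   // the GL context exists from here on

    glutDisplayFunc(DrawGLScene);              // callbacks need a current window

    glShadeModel(GL_SMOOTH);                   // GL calls need a current context
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClearDepth(1.0f);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);
    glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);

    glutMainLoop();   // enter GLUT's main loop (classic GLUT never returns from here)
    return true;
}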
Get rid of GLUT and use something better like GLFW. Also, a lot of those functions are deprecated, so use a modern tutorial like
http://www.opengl-tutorial.org/
or
http://ogldev.atspace.co.uk/
OpenGL functions can be called only when there is an OpenGL context, i.e. after the glutCreateWindow call if you use GLUT.
They shouldn't crash the application, though...