Using GLEW with Qt 5.x - c++

I am currently trying to get a Qt 5 window and QOpenGLContext working with GLEW. Yeah, I know Qt 5 provides its own OpenGL function wrappers, but since my rendering engine relies on GLEW and supports other window libraries as well, Qt's built-in stuff is not an option.
Now, here is what I got up and running so far:
I sub-classed QWindow and equipped it with a QOpenGLContext. The context is initialized successfully.
After initializing QOpenGLContext, I (again successfully) call glewInit() to initialize GLEW.
I am now able to render geometry to the default framebuffer in the exact same way as I do it for other window frameworks (GLFW, to be more precise).
Here comes the tricky part: I am using one of OpenGL's uniform buffer objects to transfer light data to the GPU. As soon as I call glBufferData() to initially fill it, I get a segmentation fault. When using my GLFW-based implementation and context initialization, everything works fine. I know that this kind of behavior can be expected for insufficiently initialized OpenGL contexts, but again, setting up QOpenGLContext and calling glewInit() seems to work just fine.
Here is some code to show what I'm trying to do...
QtWindow::QtWindow(QWindow *parent)
    : QWindow(parent) {
    setSurfaceType(QWindow::OpenGLSurface);

    QSurfaceFormat format;
    format.setVersion(4, 5);
    format.setOption(QSurfaceFormat::DeprecatedFunctions);
    format.setSwapBehavior(QSurfaceFormat::DoubleBuffer);
    format.setProfile(QSurfaceFormat::CoreProfile);
    setFormat(format);
}
This should be sufficient to later on get a context of the format I desire. Now, just before the first frame is rendered, I set up the context and GLEW...
void QtWindow::init_context() {
    if (!initialized_) {
        context_handler_.init(this);
        initialized_ = true;

        glewExperimental = GL_TRUE;
        auto e = glewInit();
        if (e != GLEW_OK) {
            std::cout << "Failed to initialize glew: "
                      << glewGetErrorString(e) << std::endl;
        }
        glGetError(); // clear the GL error that glewInit may leave behind
    }
}
I use a small helper class for initializing QOpenGLContext as I need to prevent Qt from un-defining GLEW macros:
void QtContextHandler::init(QWindow* parent) {
    if (!qt_context_) {
        qt_context_ = new QOpenGLContext(parent);
        qt_context_->setFormat(parent->requestedFormat());
        if (qt_context_->create()) {
            auto format(qt_context_->format());
            std::cout << "Initialized Qt OpenGL context "
                      << format.majorVersion() << "."
                      << format.minorVersion() << " successfully."
                      << std::endl;
            qt_context_->makeCurrent(parent);
        } else {
            std::cout << "Failed to initialize Qt OpenGL context!"
                      << std::endl;
        }
    }
}
Here is what I do for setting up the light UBO and what crashes when OpenGL is initialized as shown above. I am using oglplus as a GL wrapper, but since it wraps OpenGL's functions quite tightly, you should get the idea:
ubo_.Bind(oglplus::Buffer::Target::Uniform);
oglplus::Buffer::Data(oglplus::Buffer::Target::Uniform, sizeof(data), &data,
                      oglplus::BufferUsage::DynamicDraw);
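For readers not familiar with oglplus, the raw GL equivalent is roughly this (a sketch; ubo stands for the buffer name wrapped by ubo_):

// Rough raw-GL equivalent of the oglplus calls above (sketch).
GLuint ubo;
glGenBuffers(1, &ubo);
glBindBuffer(GL_UNIFORM_BUFFER, ubo);
// This is the call that segfaults with the Qt-created context:
glBufferData(GL_UNIFORM_BUFFER, sizeof(data), &data, GL_DYNAMIC_DRAW);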
Has anyone tried similar approaches and can share their experience? I would appreciate any help since I'm stuck trying to figure out what I am doing wrong. Again: The initialization seems to run smoothly and I am even able to create VBOs/VAOs/IBOs for rendering meshes! Only creating the UBO causes a segmentation fault.
EDIT:
Okay, here are some new insights. First of all, the segmentation fault only occurs if the uploaded data exceeds a certain size (~90 bytes). In other words, I can render a scene with the Qt-created context using exactly one custom light source. When querying GL_MAX_UNIFORM_BLOCK_SIZE though, the driver tells me that 64KB are available for uniform blocks (the same holds for GLFW-created contexts). Does anyone have an idea on what could possibly go wrong?
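(For reference, the limit query is just this sketch:)

GLint max_block_size = 0;
glGetIntegerv(GL_MAX_UNIFORM_BLOCK_SIZE, &max_block_size);
std::cout << "GL_MAX_UNIFORM_BLOCK_SIZE: " << max_block_size << std::endl; // prints 65536 here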

Okay, just in case anyone encounters similar difficulties: I managed to get it working by uploading the UBO data using glBufferStorage instead of glBufferData. The former is used for creating buffers of immutable size, which is sufficient for my purposes. Still, I don't know what went wrong with glBufferData and whether it is a bug in Qt or I initialized the context incorrectly.
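The working upload path looks roughly like this (a sketch, assuming a context with GL 4.4 / ARB_buffer_storage, which glBufferStorage requires):

GLuint ubo;
glGenBuffers(1, &ubo);
glBindBuffer(GL_UNIFORM_BUFFER, ubo);
// Immutable storage: the size is fixed at creation time;
// GL_DYNAMIC_STORAGE_BIT still allows later updates via glBufferSubData.
glBufferStorage(GL_UNIFORM_BUFFER, sizeof(data), &data, GL_DYNAMIC_STORAGE_BIT);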

Related

Activate quad-buffered stereo with SFML or OpenGL

My program works perfectly fine in normal 3D with a single back buffer; it is coded with SFML window management.
I would like to add quad-buffered stereo, so I changed my drawing code to the following:
glDrawBuffer(GL_BACK_LEFT);
camera->OnMouseMotion(sf::Vector2i(-1,0));
for (auto i = objects->cbegin(); i != objects->cend(); ++i)
    (*i)->draw(camera);
glFlush();

glDrawBuffer(GL_BACK_RIGHT);
camera->OnMouseMotion(sf::Vector2i(2,0));
for (auto i = objects->cbegin(); i != objects->cend(); ++i)
    (*i)->draw(camera);
glFlush();

camera->OnMouseMotion(sf::Vector2i(-1,0));
Note that my camera changes are not quite right yet and I know I will have to fix them; right now I am just focusing on displaying an image using quad-buffered stereo. I noticed that all the example programs using this kind of stereo initialise the window with something like this:
type = GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO;
glutInitDisplayMode(type);
No such function is available in SFML, so my questions are:
Can I use a low-level OpenGL function to achieve the same result? Can I use another window-managing library together with SFML? Or should I drop SFML entirely and switch my program to another library?
SFML doesn't have such an initialisation function; it creates the OpenGL context automatically. There are two things you can do:
Modify the SFML sources you are using and try to add your parameter somewhere in the window-creation function.
Change your display library to create the OpenGL context. However, you may or may not be able to keep SFML's 2D drawing functions; I am not sure they will work if you create your context with GLUT, for example.
EDIT: after a quick check, you cannot create a context with another library and still use SFML for drawing simple 2D shapes. I am afraid you are forced to let SFML go.
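If you do take the GLUT route for context creation, requesting a quad-buffered stereo visual looks roughly like this (a freeglut sketch; the window title and draw calls are placeholders):

#include <GL/glut.h>

void display() {
    glDrawBuffer(GL_BACK_LEFT);
    // ... draw the left-eye view ...
    glDrawBuffer(GL_BACK_RIGHT);
    // ... draw the right-eye view ...
    glutSwapBuffers();
}

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    // GLUT_STEREO requests a quad-buffered visual; window creation
    // fails if the driver/GPU does not expose one.
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO);
    glutCreateWindow("stereo test");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}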

How to share OpenGL context or data?

I need to share data (textures, vertex buffers, ...) across all OpenGL widgets in an application. The approaches below aren't working:
I've found some solutions where one main QGLWidget is created and the others are constructed from it. Unfortunately, I can't use this approach, because all my QGLWidgets are equal and the first (main) QGLWidget will almost certainly be destroyed before the others.
First approach:
a single shared OpenGL context between all QGLWidgets
not working: only one QGLWidget gets rendered correctly; the others behave as if they weren't rendered at all (corrupted/random data)
error printed for each QGLWidget construction except the first one:
QGLWidget::setContext: Context must refer to this widget
Second approach:
a main OpenGL context with a sub-context created for each QGLWidget
not working: context->isSharing() returns false
Here is the code I use for context creation; context1 and context2 are later passed to the QGLWidget constructors:
QGLContext *mainContext = new QGLContext(format), *context1, *context2;
mainContext->create();
context1 = new QGLContext(format);
context1->create(mainContext);
context2 = new QGLContext(format);
context2->create(mainContext);
cout << mainContext->isSharing() << " " << context1->isSharing() << endl;
With regards to the first approach, you are not setting up sharing but trying to force the same context to be used with different QGLWidgets. As pointed out above, this is wrong and will not work.
Instead, create the QGLWidgets normally and pass the first QGLWidget in the shareWidget parameter when creating the others. This way you will get a separate context for each QGLWidget but they will all share with the context of the first one (and thus with each other). See http://qt-project.org/doc/qt-4.8/qglwidget.html#QGLWidget
Destroying the first widget before the others should not be an issue, since the shared objects stay alive for as long as any of the sharing contexts exists.
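In code, the sharing setup is just this (a sketch; parent and widget names are mine):

// The first widget creates its own context.
QGLWidget *first = new QGLWidget(parent);
// The others pass it as the shareWidget parameter, so each gets its own
// context that shares objects with the first (and thus with each other).
QGLWidget *second = new QGLWidget(parent, first);
QGLWidget *third = new QGLWidget(parent, first);
// second->isSharing() and third->isSharing() should now return true.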
I realize that it has been almost a year since this question was asked, but I believe the comment above may be inaccurate.
To be more precise, while it may indeed be invalid to use a single QGLContext with multiple QGLWidgets, this is a limitation of Qt's OpenGL implementation rather than a limitation of OpenGL or the windowing system. It certainly seems valid to use the same context to render to multiple windows. For example, the functions wglMakeCurrent and SwapBuffers accept device handles alongside OpenGL context handles. To quote the wglMakeCurrent documentation:
The hdc parameter must refer to a drawing surface supported by OpenGL.
It need not be the same hdc that was passed to wglCreateContext when
hglrc was created, but it must be on the same device and have the same
pixel format.
I do not even want to go into problems with SwapBuffers, since there are several bug reports all over the web regarding Qt5, which seems to force making the OpenGL context current unnecessarily before SwapBuffers is called.
This has changed as of Qt 5.4: you should now use QOpenGLWidget instead of QGLWidget. Global sharing of contexts is built into QOpenGLWidget, so you don't have to code it yourself. You just need to set the Qt::AA_ShareOpenGLContexts attribute before you create the QGuiApplication.
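A minimal sketch of the Qt 5.4+ way (plain top-level QOpenGLWidgets, just to show where the flag goes):

#include <QApplication>
#include <QOpenGLWidget>

int main(int argc, char **argv) {
    // Must be set before the application object is constructed.
    QCoreApplication::setAttribute(Qt::AA_ShareOpenGLContexts);
    QApplication app(argc, argv);
    // Every QOpenGLWidget context now shares resources with the others.
    QOpenGLWidget w1, w2;
    w1.show();
    w2.show();
    return app.exec();
}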

Why am I not getting any screen output from MyGUI in my OpenGL engine?

I'm trying to integrate MyGUI 3.2.0 into my OpenGL 3 engine, but I'm having some problems. My OpenGL engine does not use any OpenGL functions deprecated/removed in OpenGL 3.3. I have everything built and linked, with a little mucking around to make sure the FreeType libs ended up in the right place so MyGUI could find them.
I followed the quick start guide and adjusted it to use OpenGLPlatform, but I skipped over the input sections just so I could get it displaying first. I wrote the image loader interface, which works, but I left the save function empty for now if that makes any difference. I don't get any compilation errors or crashes. There aren't any errors in the log file. I've been through the FAQ and I'm kind of in the same situation as the last two entries, but the functions they mention don't exist for OpenGLPlatform, so they probably don't apply.
At one point I had random triangles with what looked like MyGUI's textures on them sticking out from the last mesh I drew from my engine, but I figured out they were just getting caught up in the previous shaders I had bound for my meshes and they disappeared after I unbound the shaders. I checked in gDEBugger and there are vertex buffers and textures being loaded from MyGUI code, so I'm pretty sure they're loading correctly. I know the textures are loading correctly, at least, and the vertex buffers don't look corrupted or anything. I also stepped through the code and it seemed to be drawing something, but I don't get any output on the screen from MyGUI. What am I missing?
In my WindowMgr init():
if (m_platform == NULL)
{
    m_platform = new MyGUI::OpenGLPlatform();
    m_platform->initialise(&m_imageLoader);
    m_platform->getDataManagerPtr()->addResourceLocation("./data/ui/MyGUI", false);
}
if (m_GUI == NULL && m_platform != NULL)
{
    m_GUI = new MyGUI::Gui();
    m_GUI->initialise();
}
MyGUI::ButtonPtr button = m_GUI->createWidget<MyGUI::Button>("Button", 300, 10, 300, 26, MyGUI::Align::Default, "Main", "test");
button->setCaption("Test");
In WindowMgr render():
if (m_platform != NULL)
{
    renderGlobals.shaderMgr.unbindAll();
    m_platform->getRenderManagerPtr()->drawOneFrame();
}
In WindowMgr resizeWindow():
if (m_platform != NULL)
{
    m_platform->getRenderManagerPtr()->setViewSize(_width, _height);
}
In WindowMgr close():
if (m_GUI != NULL)
{
    m_GUI->shutdown();
    delete m_GUI;
    m_GUI = NULL;
}
if (m_platform != NULL)
{
    m_platform->shutdown();
    delete m_platform;
    m_platform = NULL;
}
Remember, kids: unbind your vertex array objects before you let MyGUI do its thing, or it messes everything up in the most horrible way possible: it looks like it isn't doing anything at all! I'm an idiot!
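In other words, render() should look something like this (a sketch using the names from above):

if (m_platform != NULL)
{
    // Unbind any engine VAO and shaders before MyGUI draws, so its
    // state changes can't clobber (or be clobbered by) the engine's.
    glBindVertexArray(0);
    renderGlobals.shaderMgr.unbindAll();
    m_platform->getRenderManagerPtr()->drawOneFrame();
}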

Creating an OpenGL 3.2/3.x context in SDL 1.3

I'm facing a problem where SDL says it does not support OpenGL 3.x contexts. I am trying to follow this tutorial: Creating a Cross Platform OpenGL 3.2 Context in SDL (C / SDL). I am using GLEW in this case, but I couldn't get gl3.h to work with this either. This is the code I ended up with:
#include <glew.h>
#include <SDL.h>

int Testing::init()
{
    if (SDL_Init(SDL_INIT_EVERYTHING) < 0)
    {
        DEBUGLINE("Error initializing SDL.");
        printSDLError();
        system("pause");
        return 1; // Error
    }
    // Request an OpenGL 3.2 context.
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
    // Set double buffering.
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    // Create the window.
    window = SDL_CreateWindow("OpenGL 3.2 test",
                              SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                              600, 400, SDL_WINDOW_OPENGL | SDL_WINDOW_SHOWN);
    if (window == NULL) return 3; // Error
    // Print errors to console if there are any.
    printSDLError(__LINE__);
    // Set up the OpenGL context.
    glContext = SDL_GL_CreateContext(window);
    printSDLError(__LINE__);
    if (glContext == NULL)
    {
        DEBUGLINE("OpenGL context could not be created.");
        system("pause");
        return 4;
    }
    // Initialize GLEW.
    GLenum err = glewInit();
    if (err != GLEW_OK)
    {
        DEBUGLINE("GLEW unable to be initialized: " << glewGetErrorString(err));
        system("pause");
        return 2;
    }
    return 0; // OK code, no error.
}
The only problem that is reported is after trying to call SDL_GL_CreateContext(window), where SDL reports "GL 3.x is not supported". However, both the tutorial and this sample pack (which I have not bothered to test with) report success in combining SDL 1.3 and OpenGL 3.2. I am aware that SDL 1.3 is in the middle of development, but I somewhat doubt that even unintentional support would be removed.
A context is still created, and GLEW is able to initialize just fine. (I can't figure out for the life of me how to see the version of the context that was created, since it's supposed to be the core profile, and I don't know how to find that either. According to the tutorial, SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3) doesn't actually do anything, in which case I have no clue how to get the appropriate context created or change the default context.)
EDIT: After some testing thanks to the helpful function Nicol gave me, I have found that, regardless of the parameters I pass to SDL_GL_SetAttribute, the context is always version 1.1. However, putting in any version below 3.0 doesn't spit out an error saying it is not supported. So the problem is that the "core" version SDL sees is only 1.1.
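(The check itself boils down to this sketch:)

// Query the version of the context actually created
// (run after SDL_GL_CreateContext and glewInit):
std::cout << "GL_VERSION: " << glGetString(GL_VERSION) << std::endl;
GLint major = 0, minor = 0; // the integer queries need a 3.0+ context
glGetIntegerv(GL_MAJOR_VERSION, &major);
glGetIntegerv(GL_MINOR_VERSION, &minor);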
For the record, I am using Visual C++ 2010 express, GLEW 1.7.0, and the latest SDL 1.3 revision. I am fairly new to using all three of these, and I had to manually build the SDL libraries for both 32 and 64 bit versions, so there's a lot that could go wrong. So far however, the 32 and 64 bit versions are doing the exact same thing.
EDIT: I am using an nVidia 360M GPU with the latest driver, which OpenGL Extension Viewer 4.04 reports to have full compatibility up to OpenGL 3.3.
Any help is appreciated.
UPDATE: I have managed to get SDL to stop yelling at me that it doesn't support 3.x contexts. The problem was that the SDL_GL_SetAttribute calls must be made BEFORE SDL_Init is called:
//Request OpenGL 3.2 context.
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
//Initialize SDL
if (SDL_Init(SDL_INIT_EVERYTHING) < 0)
{
    DEBUGLINE("Error initializing SDL.");
    return 1; // Error
}
Unfortunately, GLEW still refuses to acknowledge anything higher than OpenGL 1.1 (only GLEW_VERSION_1_1 returns true), which still has me puzzled. glGetString(GL_VERSION) also reports 1.1.0. It seems that my program simply doesn't know of any higher versions, as if I don't have them installed at all.
Since I don't know whether you have already found a solution, here is mine:
I struggled with this a lot today and yesterday. Advanced GL functions couldn't be used, so I even debugged into opengl32.dll just to see that it really does work and wraps the calls into the hardware-specific OpenGL DLL (nvoglnt.dll). So there must have been another cause. There were even tips on the internet to link opengl32.lib before all other libraries, because ChoosePixelFormat and some other functions can override one another.
But that wasn't the cause either. My solution was to enable accelerated visuals:
// init SDL
if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_HAPTIC | SDL_INIT_TIMER) < 0) {
    fprintf(stderr, "Could not init SDL");
    return 1;
}
// request the OpenGL version we want
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1);
because the current SDL revision (Dec 15, 2011) checks for it in SDL_windowsopengl.c:
if (_this->gl_config.accelerated >= 0) {
    *iAttr++ = WGL_ACCELERATION_ARB;
    *iAttr++ = (_this->gl_config.accelerated ? WGL_FULL_ACCELERATION_ARB :
                                               WGL_NO_ACCELERATION_ARB);
}
and this attribute is initialized to -1 if you did not define it yourself.
And: never set the version attributes before initializing SDL, because setting attributes requires the video backend to be initialized properly!
I hope this helps.
I followed this tutorial and everything works fine on Windows and Linux:
http://people.cs.uct.ac.za/~aflower/tutorials.html

How do I use Qt and SDL together?

I am building a physics simulation engine and editor in Windows. I want to build the editor part using Qt and I want to run the engine using SDL with OpenGL.
My first idea was to build the editor using only Qt and share as much code as possible with the engine (the resource manager, the renderer, the maths). But I would also like to be able to run the simulation inside the editor, which means I also have to share the simulation code, which uses SDL threads.
So, my question is this: is there a way to have SDL render OpenGL into a Qt window?
I have read on the web that it might be possible to supply SDL with a window handle to render into. Does anybody have experience doing that?
Also, the threaded part of the simulator might pose a problem, since it uses SDL threads.
This is a simplification of what I do in my project. You can use it just like an ordinary widget, but when you need to, you can use its m_Screen object to draw to the SDL surface and it will show up in the widget :)
#include "SDL.h"
#include <QWidget>
class SDLVideo : public QWidget {
Q_OBJECT
public:
SDLVideo(QWidget *parent = 0, Qt::WindowFlags f = 0) : QWidget(parent, f), m_Screen(0){
setAttribute(Qt::WA_PaintOnScreen);
setUpdatesEnabled(false);
// Set the new video mode with the new window size
char variable[64];
snprintf(variable, sizeof(variable), "SDL_WINDOWID=0x%lx", winId());
putenv(variable);
SDL_InitSubSystem(SDL_INIT_VIDEO | SDL_INIT_NOPARACHUTE);
// initialize default Video
if((SDL_Init(SDL_INIT_VIDEO) == -1)) {
std:cerr << "Could not initialize SDL: " << SDL_GetError() << std::endl;
}
m_Screen = SDL_SetVideoMode(640, 480, 8, SDL_HWSURFACE | SDL_DOUBLEBUF);
if (m_Screen == 0) {
std::cerr << "Couldn't set video mode: " << SDL_GetError() << std::endl;
}
}
virtual ~SDLVideo() {
if(SDL_WasInit(SDL_INIT_VIDEO) != 0) {
SDL_QuitSubSystem(SDL_INIT_VIDEO);
m_Screen = 0;
}
}
private:
SDL_Surface *m_Screen;
};
Hope this helps
Note: It usually makes sense to set both the min and max size of this widget to the SDL surface size.
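For example, matching the 640x480 surface above, inside the constructor:

setMinimumSize(640, 480);
setMaximumSize(640, 480);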
While you might get it to work as the first answer suggests, you will likely run into problems due to threading. There are no simple solutions when it comes to threading, and here you would have the SDL, Qt, and OpenGL main loops interacting. Not fun.
The easiest and sanest solution would be to decouple both parts, so that SDL and Qt run in separate processes and communicate via some kind of messaging (I'd recommend D-Bus here). You can have SDL render into a borderless window while your editor sends commands via messages.
Rendering OpenGL from Qt is trivial (and works very well). I have no direct experience with SDL, but there is an example app about mixing them here:
http://www.devolution.com/pipermail/sdl/2003-January/051805.html
There is also a good article about mixing Qt widgets directly with OpenGL here:
http://doc.trolltech.com/qq/qq26-openglcanvas.html
It goes a bit beyond what you strictly need, but it's rather clever!