I can't paint on QOpenGLWidget - c++

I'm new to Qt and I'm making an app that generates a 3D object from points the user provides in the GUI. The user defines the points and then, in the same window, a QOpenGLWidget paints the resulting object. However, the QOpenGLWidget doesn't draw the model (only the glClearColor background). I've tried the same functions called before app.exec() in main and they work fine.
I don't know what's happening. I've tried calling makeCurrent() before working with the VAO and VBO, but it doesn't work.
I haven't created any threads or FBOs.
When I tried makeCurrent() before working with the VAO and VBO I got:
"QOpenGLBuffer::bind: buffer is not valid in the current context"
"QOpenGLShaderProgram::bind: program is not valid in the current context."
EDIT: I retried making the context current before binding the VAO, VBO, and shader, and those messages disappeared, but the widget still isn't painting the object.
I've also tried QOpenGLWidget::update() after drawing. It doesn't draw even after resizing.
I'm using OpenGL 4.1 Core Profile and Qt 5.12.3

I've already solved it! I just forgot to make the context current before setting the uniform values in the shaders.
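For illustration, a minimal sketch of that kind of fix, assuming a QOpenGLWidget subclass (here called ModelWidget, with a hypothetical m_program member of type QOpenGLShaderProgram* and a made-up "u_color" uniform); the point is to call makeCurrent() before touching any GL state outside initializeGL()/resizeGL()/paintGL(), and doneCurrent() afterwards:

void ModelWidget::setModelColor(const QVector3D& color)
{
    // Called from a GUI slot, i.e. outside paintGL(), so the widget's
    // context is not current automatically.
    makeCurrent();
    m_program->bind();
    m_program->setUniformValue("u_color", color);
    m_program->release();
    doneCurrent();
    update(); // schedule a repaint
}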

Related

Displaying Qt Quick content inside OpenSceneGraph scene

I'd like to show my Qt Quick content on a virtual screen inside my OpenSceneGraph scene.
The approach I'm using right now is highly inefficient:
Render Qt Quick to an offscreen surface using an FBO (framebuffer object)
Download pixels with QOpenGLFramebufferObject::toImage()
Upload pixels to the OSG
So it's a GPU-CPU-GPU transfer. Source code
A proper solution should somehow utilize existing FBO and be able to transfer data solely inside the GPU.
There are two options:
Create FBO on the Qt side and use its texture on the OSG side
Create FBO on the OSG side and feed it to the Qt Quick renderer
The Qt part is OK, but I'm completely lost with the OSG part. Could anyone provide me with some pointers?
Finally made it.
General idea:
Render Qt Quick to a texture using an FBO; there are some examples available on the internet.
Use this texture inside OpenSceneGraph
The whole thing includes several tricks.
Context Sharing
To perform certain graphics operations, we must initialize OpenGL global state, also known as a context.
When a texture is created, the context stores its id. Ids are not globally unique, so when another texture is created within another context, it may get the same id, but with a different resource behind it.
If you just pass your texture's id to another renderer (operating within a different context), expecting it to show your texture, you end up showing another texture, a black screen, or a crash.
The remedy is context sharing, which effectively means sharing ids.
The OpenSceneGraph and Qt abstractions are not compatible, so you need to tell OSG not to use its own context abstraction. This is done by calling setUpViewerAsEmbeddedInWindow.
Code:
OsgWidget::OsgWidget(QWidget* parent, Qt::WindowFlags flags)
    : QOpenGLWidget(parent, flags)
    , m_osgViewer(new osgViewer::Viewer)
{
    setFormat(defaultGraphicsSettings());
    // ...
    m_osgGraphicsContext = m_osgViewer->setUpViewerAsEmbeddedInWindow(x(), y(), width(), height());
}
// osg::ref_ptr<osgViewer::GraphicsWindowEmbedded> m_osgGraphicsContext;
From now on, the existing QOpenGLContext instance will be used as an OpenGL context for OSG rendering.
You will need to create another context for Qt Quick rendering and make it share with the widget's context:
void Widget::initializeGL()
{
    QOpenGLContext* qmlGLContext = new QOpenGLContext(this);
    // ...
    qmlGLContext->setShareContext(context());
    qmlGLContext->create();
}
Remember, there can be only one active context at a time.
Your scheme is (a sketch of the loop follows these steps):
0. create an osg::Texture out of QOpenGLFramebufferObject::texture()
1. make QtQuick context active
2. render QtQuick to texture
3. make primary (OSG) context active
4. render OSG
5. goto 1
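A minimal sketch of that per-frame loop, assuming hypothetical m_qmlContext, m_offscreenSurface, m_qmlFbo and m_quickRenderControl members (the QQuickRenderControl setup, polishItems()/sync() calls and resource creation are omitted); only the context switching and the ordering matter here:

void OsgWidget::paintGL()
{
    // Steps 1-2: render Qt Quick into the shared FBO using its own context.
    m_qmlContext->makeCurrent(m_offscreenSurface);
    m_qmlFbo->bind();
    m_quickRenderControl->render();
    m_qmlFbo->release();
    m_qmlContext->doneCurrent();

    // Steps 3-4: switch back to the widget's (OSG) context and draw the scene,
    // which samples the osg::Texture2D created from m_qmlFbo->texture().
    makeCurrent();
    m_osgViewer->frame();
}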
Making a proper osg::Texture
Since the OSG and Qt APIs are incompatible, you can't simply link a QOpenGLFramebufferObject to an osg::Texture2D as it is.
QOpenGLFramebufferObject has a QOpenGLFramebufferObject::texture() method which returns the OpenGL texture id, but osg::Texture manages all of its OpenGL state on its own.
Something like osg::Texture2D(uint textureId); could help us, but it just doesn't exist.
Let's make one ourselves.
osg::Texture is backed by an osg::Texture::TextureObject, which stores the OpenGL texture id and some other data as well. If we construct a TextureObject with a given texture id and pass it to an osg::Texture, the latter will use it as its own.
Code:
void Widget::createOsgTextureFromId(osg::Texture2D* texture, int textureId)
{
    osg::Texture::TextureObject* textureObject = new osg::Texture::TextureObject(texture, textureId, GL_TEXTURE_2D);
    textureObject->setAllocated();
    osg::State* state = m_osgGraphicsContext->getState();
    texture->setTextureObject(state->getContextID(), textureObject);
}
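A hedged usage sketch for the function above, assuming m_qmlFbo is the QOpenGLFramebufferObject that Qt Quick renders into, m_quickTexture is a hypothetical osg::ref_ptr<osg::Texture2D> member, and m_screenGeode is the node acting as the virtual screen:

// Somewhere after the FBO has been created:
m_quickTexture = new osg::Texture2D;
createOsgTextureFromId(m_quickTexture.get(), m_qmlFbo->texture());
m_quickTexture->setResizeNonPowerOfTwoHint(false);

// Attach the texture to the node that should display the Qt Quick content:
osg::StateSet* stateSet = m_screenGeode->getOrCreateStateSet();
stateSet->setTextureAttributeAndModes(0, m_quickTexture.get(), osg::StateAttribute::ON);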
Complete demo project here

glCreateShader gives same ID

Background :
I have a Shader class in my C++/OpenGL 3.1/GLSL/Qt program. My program uses several shaders based on different GLSL source files.
My application can run many different 3D renderers based on the QGLWidget implementation and each one creates its own shaders.
When I create my first 3D renderer and initialize my shaders, shader IDs are generated with glCreateShader & glCreateProgram without any problem.
Problem :
But when I create a second 3D renderer, the OpenGL functions return exactly the same IDs, while I expect new ones. It means that my two renderers send their data to the same GPU program...
The consequence is that the uniform variables in the GPU program get mixed up, and when the second renderer runs, the first one displays a weird rendering.
Indeed, when I close one of the two renderers, all shaders are destroyed... and the remaining renderer cannot display anything.
Idea ?
I'm completely lost. My logical deduction is that glCreateShader & glCreateProgram return IDs tied to their own thread, and since QGLWidget probably runs its own thread to call the rendering functions, that might break the persistence...
Any idea of how to solve this problem ?
If both of your QOpenGLWidgets share the same top-level window, then by default their contexts are shared. If you don't want that, the easiest fix is probably to put your second QOpenGLWidget in a new top-level widget (any QWidget with no parent).
Please note that this is different from the older QGLWidget class. From the Qt documentation:
When multiple QOpenGLWidgets are added as children to the same top-level widget, their contexts will share with each other. This does not apply for QOpenGLWidget instances that belong to different windows.
This means that all QOpenGLWidgets in the same window can access each other's sharable resources, like textures, and there is no need for an extra "global share" context, as was the case with QGLWidget.
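For illustration, a small sketch (assuming two hypothetical QOpenGLWidget-based views, viewA and viewB, that have both already been shown so their contexts exist) to check whether their contexts really share resources:

void reportSharing(QOpenGLWidget* viewA, QOpenGLWidget* viewB)
{
    // QOpenGLWidget::context() returns nullptr before the widget is initialized.
    const bool sharing = QOpenGLContext::areSharing(viewA->context(), viewB->context());
    qDebug() << "contexts share resources:" << sharing;
}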
There is no problem to solve.
Each QGLWidget has its own OpenGL context. And, unless you are explicitly sharing objects between them, each context has its own separate list of objects.
You can only use an OpenGL object with the OpenGL context that created it. So long as you keep the objects separate, and only use them with the context that created it, you should be fine.

Global OpenGL context for rendering in fbo to display in many QOpenGLWidget

I want to display a scene from different points of view in several QOpenGLWidgets that are in the same window (arranged with a QSplitter layout).
The important thing is that I want to store my scene data (geometry and textures) only once on the GPU.
But each QOpenGLWidget has its own QOpenGLContext.
My idea was to create an independent context that renders the scene into a framebuffer object (a raw OpenGL FBO, not a Qt one) and to use the resulting texture in each QOpenGLWidget concerned.
More technically, my approach was to create a new QOpenGLContext and a QOffscreenSurface.
But when I want to use the QOpenGLFunctions given by my QOpenGLContext, the program stops with a segmentation fault, even though I've checked that the QOpenGLContext I created is valid.
More generally, it's hard for me to understand the roles of the QOpenGLContext, the surface (a QOffscreenSurface in my case) and the makeCurrent function.
What I'd like to understand is the right way to do this and why I get a segmentation fault.
I'm coming back with an answer to my problem.
In fact, I used the context of the first initialized QOpenGLWidget as the global context, to avoid creating a new one.
The tricky part is to make the global context current every time it's needed (when uploading data to the GPU or using the FBO).
The paintGL steps are simple (see the sketch after this list):
- Make the global context current.
- Start fbo recording.
- Render the scene.
- Stop the fbo recording.
- Make the QOpenGLWidget context current.
- Render the fbo result on the screen.
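A minimal sketch of those steps, assuming hypothetical m_globalContext, m_globalSurface and m_sceneFbo members shared by all the widgets, and hypothetical renderScene()/drawTexturedQuad() helpers:

void ViewportWidget::paintGL()
{
    // Render the scene once into the shared FBO under the global context.
    m_globalContext->makeCurrent(m_globalSurface);
    m_sceneFbo->bind();      // start FBO recording
    renderScene();
    m_sceneFbo->release();   // stop FBO recording
    m_globalContext->doneCurrent();

    // Back to this widget's own context: draw the FBO texture on screen.
    makeCurrent();
    drawTexturedQuad(m_sceneFbo->texture());
}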
I've checked the memory usage with gDEBugger and all the data lives in the global context (stored only once).

Interactions between Onscreen and Offscreen rendering in Qt5 with QOpenGL* classes

Objective:
To do some onscreen and offscreen rendering via the Qt5 OpenGL framework, such that resources can be easily shared between both rendering parts. Specifically:
the rendering work is done through the offscreen part (the framebuffer might be larger than the display screen);
the results of the offscreen rendering can be displayed in multiple onscreen parts (say, QOpenGLWidgets) under different settings, e.g. different sizes, for simplicity;
the results of the offscreen rendering can also be extracted from GPU and saved into a QImage or cv::Mat object;
the above tasks can be executed asynchronously (doing the second offscreen rendering, while displaying or extracting the first offscreen result).
Current solution:
Since I don't know how to share resources between both parts, the actual rendering work is done redundantly in both parts in my current solution:
The onscreen part:
A QMainWindow containing multiple objects of a QOpenGLWidget subclass;
The offscreen part:
A custom class holding QOffscreenSurface, QOpenGLContext, and QOpenGLFramebufferObject pointers, as well as a QOpenGLFunctions pointer used to invoke the OpenGL calls that do the actual rendering work, much like this link.
The actual renderer:
For the reason above, the actual rendering work is extracted into a separate class, and both parts (onscreen and offscreen) hold a handle to it.
Questions:
There are two QOpenGLContexts:
When doing the offscreen work in a background thread (for asynchronous rendering), it says that the QWindow-based QOffscreenSurface is not allowed to exist outside the GUI thread;
When doing this in the main (GUI) thread, it says the QOpenGLContext is invalid.
So my questions are:
Should I do the offscreen and onscreen work in the same GUI thread or not?
What is the best way of communicating and sharing resources between the offscreen and onscreen parts?
A brief, concrete code example doing some simple rendering work (say, drawing a triangle via the shading language) would be much appreciated.
Assuming that QOpenGLContext *main_ctx is the context that was created by QOpenGLWidget for actual rendering, you can create another context ctx in any thread and make it share textures and buffers with the first one:
ctx = std::make_unique<QOpenGLContext>();
ctx->setFormat(main_ctx->format());
ctx->setShareContext(main_ctx);
ctx->create();
I don't think that a QOffscreenSurface has to be QWindow-based.
offscreen_surface = std::make_unique<QOffscreenSurface>();
offscreen_surface->setFormat(ctx->format());
offscreen_surface->create();
ctx->makeCurrent(offscreen_surface);
Then create a QOpenGLFramebufferObject and render into it from the second context (second thread).
Then use its texture in the main context: glBindTexture(GL_TEXTURE_2D, fbo->texture());. Some synchronization may be needed when doing this.
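A minimal sketch of that remaining part, continuing with the ctx and offscreen_surface from above; the FBO size, the CombinedDepthStencil attachment and the glFinish() call used as a crude synchronization point are my own assumptions:

// In the offscreen thread, with ctx current on offscreen_surface:
auto fbo = std::make_unique<QOpenGLFramebufferObject>(
    QSize(1024, 1024), QOpenGLFramebufferObject::CombinedDepthStencil);
fbo->bind();
// ... issue the actual draw calls here ...
fbo->release();
ctx->functions()->glFinish(); // crude sync so the texture is complete before it is sampled

// Later, in the main (widget) context, e.g. inside paintGL():
glBindTexture(GL_TEXTURE_2D, fbo->texture());
// ... draw a textured quad with it ...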

Draw content of QOpenGLFramebufferObject onto QOpenGLWidget

I am currently porting from Qt4 to Qt5. While Qt5 still provides the QGLWidget etc. classes, these are to be replaced by QOpenGLWidget etc.
Currently I am drawing complicated stuff into a framebuffer object once. Then I draw it on the screen (into the GL widget) using the drawTexture() method:
target->drawTexture(rect, fb->texture());
Afterwards, I overdraw with other temporary things. So whenever I need to update the temporary stuff, I can just re-use the framebuffer object instead of redrawing the complicated stuff.
With QOpenGLWidget, as well as QOpenGLContext, however, there is no convenient drawTexture() method anymore. Drawing the texture by hand with OpenGL would take a minimum of 20 lines of complicated code. Textbook stuff, I admit.
Is there any elegant/Qt way of getting my FBO contents onto the screen? For example, it is documented that QOpenGLWidget also uses an FBO as a backend. Instead of drawing the texture, we could also blit the FBOs. But how?
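One possible direction, sketched under the assumption that a plain blit is acceptable: QOpenGLFramebufferObject has a static blitFramebuffer() helper, and passing a null target makes it blit into the currently bound default framebuffer, which inside QOpenGLWidget::paintGL() is the widget's backing FBO. Assuming fb is the existing QOpenGLFramebufferObject and MyGLWidget is the widget subclass:

void MyGLWidget::paintGL()
{
    // Blit the cached FBO into this widget's backing framebuffer
    // (a null target means the default framebuffer).
    QOpenGLFramebufferObject::blitFramebuffer(
        nullptr, QRect(0, 0, width(), height()),
        fb, QRect(0, 0, fb->width(), fb->height()),
        GL_COLOR_BUFFER_BIT, GL_NEAREST);

    // ... then overdraw the temporary things on top ...
}

Note that on high-DPI displays the target rectangle may need to be scaled by devicePixelRatio(), since the widget's backing FBO is sized in device pixels, and that blitting requires a context with framebuffer-blit support.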