Draw content of QOpenGLFramebufferObject onto QOpenGLWidget - opengl

I am currently porting from Qt4 to Qt5. While Qt5 still provides the QGLWidget etc. classes, these are to be replaced by QOpenGLWidget etc.
Currently I am drawing complicated stuff into a framebuffer object once. Then I draw it on the screen (into the GL widget) using the drawTexture() method:
target->drawTexture(rect, fb->texture());
Afterwards, I overdraw with other temporary things. So whenever I need to update the temporary stuff, I can just re-use the framebuffer object instead of redrawing the complicated stuff.
With QOpenGLWidget, as well as QOpenGLContext, however, there is no convenient drawTexture() method anymore. Drawing the texture by-hand with OpenGL would be a minimum of 20 lines of complicated stuff. Textbook stuff, I admit.
Is there any elegant/Qt way of getting my fbo contents onto the screen? For example, it is documented that QOpenGLWidget also uses an fbo as a backend. Instead of drawing the texture, we could also blit the FBOs. But how?
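For illustration, a minimal sketch of such a blit inside paintGL(), assuming fbo is the cached QOpenGLFramebufferObject member (names are hypothetical):

void MyGLWidget::paintGL()
{
    // Blit the cached FBO into the widget's backing framebuffer. A null
    // target means "the default framebuffer", which QOpenGLWidget redirects
    // to its own backing FBO while paintGL() runs. Check
    // QOpenGLFramebufferObject::hasOpenGLFramebufferBlit() for support.
    QOpenGLFramebufferObject::blitFramebuffer(
        nullptr, QRect(0, 0, width(), height()),
        fbo, QRect(0, 0, fbo->width(), fbo->height()));
    // ... overdraw the temporary things on top here ...
}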

Related

Displaying Qt Quick content inside OpenSceneGraph scene

I'd like to show my Qt Quick content on a virtual screen inside my OpenSceneGraph scene.
The approach I'm using right now is highly inefficient:
Render Qt Quick to the offscreen surface using FBO (FrameBufferObject)
Download pixels with QOpenGLFramebufferObject::toImage()
Upload pixels to the OSG
So it's a GPU-CPU-GPU transfer. Source code
A proper solution should somehow utilize existing FBO and be able to transfer data solely inside the GPU.
There are two options:
Create FBO on the Qt side and use its texture on the OSG side
Create FBO on the OSG side and feed it to the Qt Quick renderer
The Qt part is OK. And I'm completely lost with the OSG. Could anyone provide me with some pointers?
Finally made it.
General idea:
Render QtQuick to texture using FBO - there are some examples available over the internet.
Use this texture inside OpenSceneGraph
The whole thing includes several tricks.
Context Sharing
To perform certain graphics operations, we must initialize OpenGL global state, also known as a context.
When a texture is created, the context stores its id. Ids are not globally unique, so when another texture is created within another context, it may get the same id, but with a different resource behind it.
If you just pass your texture's id to another renderer (operating within a different context), expecting it to show your texture, you end up with another texture, a black screen, or a crash.
The remedy is context sharing, which effectively means sharing ids.
OpenSceneGraph and Qt abstractions are not compatible, so you need to tell OSG not to use its own context abstraction. This is done by calling setUpViewerAsEmbeddedInWindow:
Code:
OsgWidget::OsgWidget(QWidget* parent, Qt::WindowFlags flags)
    : QOpenGLWidget(parent, flags)
    , m_osgViewer(new osgViewer::Viewer)
{
    setFormat(defaultGraphicsSettings());
    // ...
    m_osgGraphicsContext = m_osgViewer->setUpViewerAsEmbeddedInWindow(x(), y(), width(), height());
}
// osg::ref_ptr<osgViewer::GraphicsWindowEmbedded> m_osgGraphicsContext;
From now on, the existing QOpenGLContext instance will be used as an OpenGL context for OSG rendering.
You will need to create another context for QtQuick rendering and set them shared:
void Widget::initializeGL()
{
    QOpenGLContext* qmlGLContext = new QOpenGLContext(this);
    // ...
    qmlGLContext->setShareContext(context());
    qmlGLContext->create();
}
Remember, there can be only one active context at a time.
Your scheme is:
0. create an osg::Texture out of QOpenGLFramebufferObject::texture()
1. make QtQuick context active
2. render QtQuick to texture
3. make primary (OSG) context active
4. render OSG
5. goto 1
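A sketch of that loop in code, under the assumption that Qt Quick is driven through a QQuickRenderControl; m_qmlContext, m_offscreenSurface, and m_renderControl are hypothetical members:

void OsgWidget::paintGL()
{
    // Steps 1-2: activate the Qt Quick context and render into its FBO.
    m_qmlContext->makeCurrent(m_offscreenSurface);
    m_renderControl->polishItems();
    m_renderControl->sync();
    m_renderControl->render();   // draws into the QOpenGLFramebufferObject
    m_qmlContext->doneCurrent();

    // Steps 3-4: reactivate the widget's (OSG) context and draw the scene.
    makeCurrent();
    m_osgViewer->frame();

    // Step 5: schedule the next iteration.
    update();
}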
Making of a proper osg::Texture
Since the OSG and Qt APIs are incompatible, you can hardly link a QOpenGLFramebufferObject to an osg::Texture2D as is.
QOpenGLFramebufferObject has a QOpenGLFramebufferObject::texture() method which returns the OpenGL texture id, but osg::Texture manages all the OpenGL stuff on its own.
Something like osg::Texture2D(uint textureId); could help us, but it just doesn't exist.
Let's make one by ourselves.
osg::Texture is backed by osg::TextureObject which stores OpenGL texture id and some other data as well. If we construct osg::TextureObject with a given texture id and pass it to osg::Texture, the latter will use it as its own.
Code:
void Widget::createOsgTextureFromId(osg::Texture2D* texture, int textureId)
{
    osg::Texture::TextureObject* textureObject = new osg::Texture::TextureObject(texture, textureId, GL_TEXTURE_2D);
    textureObject->setAllocated();
    osg::State* state = m_osgGraphicsContext->getState();
    texture->setTextureObject(state->getContextID(), textureObject);
}
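Hypothetical usage (m_qmlFbo and geometry are assumptions, not from the demo project): bind the wrapped texture to the state set of the geometry acting as the virtual screen:

osg::ref_ptr<osg::Texture2D> texture = new osg::Texture2D;
createOsgTextureFromId(texture, m_qmlFbo->texture());
texture->setResizeNonPowerOfTwoHint(false);
// Show the Qt Quick texture on the virtual screen's geometry.
geometry->getOrCreateStateSet()->setTextureAttributeAndModes(
    0, texture, osg::StateAttribute::ON);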
Complete demo project here

Interactions between Onscreen and Offscreen rendering in Qt5 with QOpenGL* classes

Objective:
To do some onscreen and offscreen rendering via the Qt5 OpenGL framework, such that resources can be easily shared between both rendering parts. Specifically,
the rendering work is done through the offscreen part (the framebuffer might be larger than the display screen);
the results of the offscreen rendering can be displayed in multiple onscreen parts (say, QOpenGLWidgets) under different settings, e.g. different sizes, for simplicity;
the results of the offscreen rendering can also be extracted from GPU and saved into a QImage or cv::Mat object;
the above tasks can be executed asynchronously (doing the second offscreen rendering, while displaying or extracting the first offscreen result).
Current solution:
Since I don't know how to share resources between both parts, the actual rendering work is done redundantly in both parts in my current solution:
The onscreen part:
A QMainWindow containing multiple custom widget objects (subclasses of QOpenGLWidget);
The offscreen part:
A custom class involving members of QOffscreenSurface, QOpenGLContext, and QOpenGLFramebufferObject pointers, as well as a QOpenGLFunctions pointer to invoke the OpenGL functions that do the actual rendering work, much like this link.
The actual renderer:
For the reason above, the actual rendering work is extracted into a separate class, and both parts (onscreen and offscreen) hold a handle to it.
Questions:
There are two QOpenGLContexts:
When doing the offscreen work in a background thread (for asynchronous rendering), it complains that a QWindow-based QOffscreenSurface is not allowed to exist outside the GUI thread;
When doing this in the main (GUI) thread, it says the QOpenGLContext is invalid.
So my questions are:
Should I do the offscreen and onscreen work in the same GUI thread or not?
What is the best way of communicating and sharing resources between the offscreen and onscreen parts?
A brief, concrete code example doing some simple rendering work (say, drawing a triangle via the shading language) would be much appreciated.
Assuming that QOpenGLContext *main_ctx is the context that was created by QOpenGLWidget for actual rendering, you can create another context ctx in any thread and make it share textures and buffers with the first one:
ctx = std::make_unique<QOpenGLContext>();
ctx->setFormat(main_ctx->format());
ctx->setShareContext(main_ctx);
ctx->create();
I don't think that a QOffscreenSurface must be QWindow-based.
offscreen_surface = std::make_unique<QOffscreenSurface>();
offscreen_surface->setFormat(ctx->format());
offscreen_surface->create();
ctx->makeCurrent(offscreen_surface);
Then create a QOpenGLFramebufferObject and render into it from the second context (second thread).
Then use its texture in the main context: glBindTexture(GL_TEXTURE_2D, fbo->texture());. Maybe there is a need for some synchronization when doing this.
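As a sketch of that offscreen pass (the size and the glFlush-based hand-off are assumptions; a fence sync, e.g. glFenceSync/glWaitSync, would be more precise):

// Worker thread: render into an FBO under the shared context.
ctx->makeCurrent(offscreen_surface.get());
QOpenGLFramebufferObject fbo(1024, 1024,
                             QOpenGLFramebufferObject::CombinedDepthStencil);
fbo.bind();
QOpenGLFunctions* f = ctx->functions();
f->glViewport(0, 0, fbo.width(), fbo.height());
f->glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
f->glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
// ... issue the actual draw calls here ...
f->glFlush();   // submit the commands before the main context samples fbo.texture()
fbo.release();
ctx->doneCurrent();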

Rendering over DirectX window with Awesomium (semi-transparent & rounded elements)

I wonder if it's possible to use Awesomium to render the GUI over a DirectX 11 game (I do NOT use .NET; it's a C++/DirectX 11 game)?
It would involve:
Rendering the scene on the window with DirectX 11 (just as I am doing it now).
Rendering the GUI with Awesomium from HTML/CSS over the previously rendered scene.
Note that some GUI elements should be semi-transparent or rounded - so it's not only rendering on some rect, but also blending.
Is it possible? Or maybe I could make it another way (e.g. telling Awesomium to use DirectX for rendering somehow)?
Or maybe I could draw a semi-transparent DirectX texture in Awesomium, and then render it over the scene with DirectX? I know that rendering to a texture resource is possible with Awesomium, but does it support transparency & semi-transparency?
If not, are there good alternatives for what I want to achieve with Awesomium?
Yes. It can be done.
If you look at the documentation of the Awesomium WebView class, it has a surface() method which will return the view's backing bitmap.
Here is some C++ documentation for the class.
http://awesomium.com/docs/1_7_0/cpp_api/class_awesomium_1_1_web_view.html
You can copy this bitmap to a texture in DirectX and render it as a layer on top of your game, creating your UI.
You also have to route and translate input into Awesomium. You can style your UI however you like using HTML, CSS and Javascript. You can make it rounded in this way and introduce transparency.
I won't repeat a perfectly good tutorial on doing this. You can find one here.
http://www.gamedev.net/blog/32/entry-2260646-sweet-snippets-rendering-web-pages-to-texture-using-awesomium-and-direct3d/
How you render your texture after it is written doesn't have anything to do with Awesomium. Choose your blend modes and/or use shaders with output texture for desired effect.
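For example, a typical premultiplied-alpha blend state for compositing the UI texture over the scene might look like this (a generic D3D11 sketch, not Awesomium-specific; device and context are the usual ID3D11Device and immediate ID3D11DeviceContext):

// Assumes <d3d11.h>. Premultiplied alpha: color = src + dst * (1 - src.a).
D3D11_BLEND_DESC desc = {};
desc.RenderTarget[0].BlendEnable           = TRUE;
desc.RenderTarget[0].SrcBlend              = D3D11_BLEND_ONE;
desc.RenderTarget[0].DestBlend             = D3D11_BLEND_INV_SRC_ALPHA;
desc.RenderTarget[0].BlendOp               = D3D11_BLEND_OP_ADD;
desc.RenderTarget[0].SrcBlendAlpha         = D3D11_BLEND_ONE;
desc.RenderTarget[0].DestBlendAlpha        = D3D11_BLEND_INV_SRC_ALPHA;
desc.RenderTarget[0].BlendOpAlpha          = D3D11_BLEND_OP_ADD;
desc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;

ID3D11BlendState* blendState = nullptr;
device->CreateBlendState(&desc, &blendState);

const float blendFactor[4] = { 0.f, 0.f, 0.f, 0.f };
context->OMSetBlendState(blendState, blendFactor, 0xffffffff);
// ... draw the full-screen UI quad here ...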

Hide GLUT window

Is it possible to hide an OpenGL window while the rendering keeps running? I use glutHideWindow, which then never triggers the display function.
If that is not possible, is it possible for the program to change the focus of the current window? I want to run an OpenGL program but I don't need its window. In fact, I want to use the framebuffer that OpenGL updates each frame in another program. But it's always annoying to toggle between the two programs. (They both have a window.)
Is it possible to hide an OpenGL window while the rendering keeps running?
Yes and No to both parts of the question.
If you hide a window, all the pixels of the window's viewport will fail the pixel ownership test when rendering. So you can't use a hidden window as a drawable for OpenGL to operate on.
What you need is an off-screen drawable to draw to.
The modern variant is Framebuffer Objects (FBOs), which you can create on a regular OpenGL context; that might even work on a hidden window. FBOs take some drawable attachments (render buffers, textures) and allow OpenGL to draw to these instead of to the window.
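A minimal FBO setup in raw OpenGL might look like this (width and height are placeholders):

GLuint fbo = 0, colorTex = 0;
// Color attachment: the texture the scene will be rendered into.
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    // handle the error
}
// Rendering now goes to colorTex instead of the (hidden) window.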
An older method is PBuffers, also widely supported, but not as easy to use as FBOs.
Note that if you want to perform off-screen rendering on Linux/X11, the X server must be active, i.e. owning the VT, so that the GPU actually processes the commands. So you can't just start an X server "in the background" while another X server uses the display device.
After creating the window, you can use glutHideWindow() to go offscreen. Then you still render as normal and use glReadPixels to read the buffer back so you can use it later.

Lazy rendering of Qt on OpenGL

I came across this problem and knew it could be done better.
The problem:
When overlaying a QGLWidget (a Qt OpenGL context view) with Qt widgets, Qt redraws those widgets after every Qt frame.
Qt isn’t built to redraw entire windows at >60 fps constantly, so that’s enormously slow.
My idea:
Make Qt use something else to draw upon: a transparent texture. Make OpenGL use this texture whenever it redraws and draw it on top of everything else. Make Qt redirect all interaction with the OpenGL context view to the widgets drawn onto the texture.
The advantage would be that Qt only has to redraw whenever it has to (e.g. a widget is hovered or clicked, or the text cursor in a text field blinks), and can do partial redraws which are faster.
My Question:
How to approach this? How can I tell Qt to draw to a texture? How can I redirect interaction with a widget to another one (e.g. if I move the mouse above the region in the context view where a checkbox sits in the drawn-to-texture widget, Qt should register this event to the checkbox and repaint to reflect its hovered state)?
I separate my 2D and 3D rendering out for my CAD-like app for the very same reasons you have, although in my case the 2D stuff is not widgets - but it shouldn't make a difference. This is how I would approach the problem:
When your widget changes, render it onto a QGLFramebufferObject; do this by using the FBO as the QPaintDevice for a QPainter in your QGLWidget::paintEvent(..) and calling myWidget->render( myQPainter, ...). Repeat this for however many widgets you have, but only onto the same FBO - don't create an FBO for each one... Remember to clear it first, like a 'normal' framebuffer.
When your current OpenGL background changes, render it onto another QGLFramebufferObject using standard OpenGL calls, in the same way.
Create a pass-through vertex shader (the 'camera' will just be a unit cube), and a very simple fragment shader that can layer the two textures on top of each other.
At the end of the QGLWidget::paintEvent(..), activate your shader program, bind your framebuffers as textures for it (myFBO->texture() gets the handle), and render a unit quad. Because your camera is a unit square and the viewport size defined the FBO size, it will fill the viewport pixel-perfect.
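A sketch of step 1 (m_widgetFbo is the shared QGLFramebufferObject and myWidget one of the 'real' widgets; both names are assumptions):

void MyGLWidget::paintEvent(QPaintEvent*)
{
    // Step 1: paint the proxy widgets into the shared FBO.
    QPainter painter(m_widgetFbo);             // the FBO is a QPaintDevice
    painter.setCompositionMode(QPainter::CompositionMode_Source);
    painter.fillRect(rect(), Qt::transparent); // clear to transparent first
    painter.setCompositionMode(QPainter::CompositionMode_SourceOver);
    myWidget->render(&painter, myWidget->pos()); // repeat per widget
    painter.end();
    // ... steps 2-4: render the 3D FBO, then composite both textures ...
}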
However, that's the easy part... The hard part is the widget interaction. Because you are essentially rendering a 'proxy', you're going to have to relay the interaction between the 'real' and 'proxy' widget, whilst keeping the 'real' widget invisible. Here's how I would start:
Some operating systems are a bit weird about rendering widgets without ever showing them, so you may have to show and then hide the widget after instantiation - because of the clever painting queue in Qt, it's unlikely to actually make it to the screen.
Catch all mouse events in the viewport, work out which 'proxy' widget the cursor is over (if any), and then offset it to get the relative position for the 'real' hidden widget - this value will depend on what parent object the 'real' widget has, if any. Then pass the event onto the 'real' widget before redrawing the widget framebuffer.
I should state that I also had to create a 'flagging' system to handle redraws nicely. You don't want every widget event to trigger a widget FBO redraw, because there could be many simultaneous events (don't just think about the mouse) - but you would only want one redraw. So I created a system where, if anything in the application could change anything in the viewport visually, it would flag the viewport as 'dirty'. Then set up a QTimer for however many fps you are aiming for (in my situation the scene could get very heavy, so I also timed how long a frame took and then used that value +10% as the timer delay for the next check; this way the system isn't bombarded when rendering gets laggy). And then check the dirty status: if it's dirty, redraw; otherwise don't. I found life got easier with two dirty flags, one for the 3D stuff and one for the 2D - but if you need to maintain a constant draw rate for the OpenGL drawing there's probably no need for two.
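A sketch of the dirty-flag check (m_viewportDirty and m_redrawTimer are assumed members; the flag is set wherever the viewport can change visually):

// In the viewport's constructor: poll the dirty flag at the target rate.
m_redrawTimer = new QTimer(this);
m_redrawTimer->setInterval(1000 / 60);   // aim for ~60 fps
connect(m_redrawTimer, &QTimer::timeout, this, [this]() {
    if (m_viewportDirty) {
        m_viewportDirty = false;
        update();                        // schedules a repaint
    }
});
m_redrawTimer->start();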
I imagine what I did wasn't the easiest way to do it, but it provides plenty of scope for tuning and profiling - which makes life easier in the long run. Definitely not all the answers are in this post, but hopefully it will get you on the way to a strategy.