OpenGL draws over widgets in Qt - C++

I'm developing a Qt app that uses Cocoa on the Mac, and I'm using the PowerVR SDK to enable OpenGL ES 2.0 on the Mac desktop.
I've managed to get it working and everything renders perfectly. The problem is that when I create a widget in that window, OpenGL renders over it. For example, I create a QLabel and the OpenGL output covers it, making the label invisible.
I tried calling the QLabel's repaint() method after rendering a single OpenGL frame, but that didn't help.
Has anyone encountered this or a similar issue and has any suggestions?
Thanks!

If you want widgets to interoperate with OpenGL content, you must use QOpenGLWidget. It draws to an offscreen buffer that then gets composited with the widgets.
Alternatively, you can render the label into a texture yourself and apply the texture to a quad.
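As a rough illustration of the first approach, here is a minimal sketch (assuming Qt 5.4+, where QOpenGLWidget is available; the GLCanvas class name is just a placeholder). A QLabel created as a child of the GL widget stays visible because Qt composites child widgets over the GL widget's offscreen framebuffer:

    #include <QApplication>
    #include <QLabel>
    #include <QOpenGLWidget>
    #include <QOpenGLFunctions>

    class GLCanvas : public QOpenGLWidget, protected QOpenGLFunctions
    {
    protected:
        void initializeGL() override
        {
            initializeOpenGLFunctions();
            glClearColor(0.1f, 0.2f, 0.3f, 1.0f);
        }

        void paintGL() override
        {
            // Rendering goes into an FBO that Qt then composites with child widgets.
            glClear(GL_COLOR_BUFFER_BIT);
            // ... draw the scene here ...
        }
    };

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);

        GLCanvas canvas;
        canvas.resize(640, 480);

        // A plain child widget remains visible on top of the GL content.
        QLabel *label = new QLabel("Overlay label", &canvas);
        label->move(20, 20);

        canvas.show();
        return app.exec();
    }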

Related

Render a Qt overlay window over an OpenGL child window

I am looking for some information about rendering child windows, specifically about how OpenGL interoperates with GDI. The problem I have is basically that I have two windows: the main window is created in Qt, and inside it a child window is hosted that uses an OpenGL renderer.
What I want to do is host an overlay on top of my OpenGL window. The problem I am having is that when I render with OpenGL, the OpenGL-generated graphics seem to cover the whole graphics area and effectively undo the graphics composited by Qt.
In the image below the blue area is the Qt overlay. In that picture I'm using GDI (BeginPaint/EndPaint), and the windows seem to interact fine: the window order seems correct and the client region is correct. The moment I start to render with OpenGL, the blue area gets replaced with whatever OpenGL renders.
What I did to create the overlay was basically to create a second frameless, topmost QMainWindow, and once the platform HWND was initialized I reparented it: I changed the new window's parent to be the same parent as my OpenGL window.
What I believed this would do is that every window gets drawn separately and the desktop composition manager makes the final composition, basically avoiding the infamous airspace problem as documented by Microsoft for their WPF framework.
What I would like to know is what could cause these issues. At this point I don't understand why, once I render with OpenGL, the pixels drawn by the Qt overlay are obscured, even though the window hierarchy should make them composite correctly. What could I do to accomplish what I want?
Mixing OpenGL and GDI drawing on a shared drawable (and that includes sibling / child windows without the CS_OWNDC window class style flag) has never been supported. That's not a Qt thing, but simply how OpenGL and GDI interact.
But the more important issue is: why aren't you using the OpenGL support built right into Qt in the first place? Ever since Qt 5, Qt uses OpenGL (if available) to draw everything, i.e. all the UI elements, and it makes it trivial to mix Qt drawing and OpenGL drawing.
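To illustrate what that looks like, here is a hedged sketch (Qt 5.4+; the GameView class name is a placeholder) that draws the scene with raw OpenGL and then paints a 2D overlay with QPainter in the same widget, so no separate overlay window or GDI drawing is needed:

    #include <QOpenGLWidget>
    #include <QOpenGLFunctions>
    #include <QPainter>

    class GameView : public QOpenGLWidget, protected QOpenGLFunctions
    {
    protected:
        void initializeGL() override { initializeOpenGLFunctions(); }

        void paintGL() override
        {
            QPainter painter(this);

            // Raw OpenGL goes between beginNativePainting()/endNativePainting().
            painter.beginNativePainting();
            glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            // ... native GL scene rendering here ...
            painter.endNativePainting();

            // Qt 2D overlay drawn on top through the same paint device.
            painter.setPen(Qt::white);
            painter.drawText(20, 30, QStringLiteral("Overlay drawn with QPainter"));
        }
    };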

Rendering over DirectX window with Awesomium (semi-transparent & rounded elements)

I wonder if it's possible to use Awesomium to render the GUI over a DirectX 11 game (I do NOT use .NET; it's a C++/DirectX 11 game)?
It would involve:
Rendering the scene on the window with DirectX 11 (just as I am doing it now).
Rendering the GUI with Awesomium from HTML/CSS over the previously rendered scene.
Note that some GUI elements should be semi-transparent or rounded - so it's not only rendering into some rect, but also blending.
Is it possible? Or maybe I could make it another way (e.g. telling Awesomium to use DirectX for rendering somehow)?
Or maybe I could draw a semi-transparent DirectX texture in Awesomium and then render it over the scene with DirectX? I know that rendering to a texture resource is possible with Awesomium, but does it support transparency & semi-transparency?
If not, are there good alternatives for what I wanted to achieve with Awesomium?
Yes. It can be done.
If you look at the documentation of the Awesomium WebView class, it has a surface() method which will return the view's backing bitmap.
Here is the C++ documentation for the class:
http://awesomium.com/docs/1_7_0/cpp_api/class_awesomium_1_1_web_view.html
You can copy this bitmap to a texture in DirectX and render it as a layer on top of your game, creating your UI.
You also have to route and translate input to Awesomium. You can style your UI however you like using HTML, CSS and JavaScript. You can make it rounded this way and introduce transparency.
I won't repeat a perfectly good tutorial on doing this. You can find one here:
http://www.gamedev.net/blog/32/entry-2260646-sweet-snippets-rendering-web-pages-to-texture-using-awesomium-and-direct3d/
How you render your texture after it is written has nothing to do with Awesomium. Choose your blend modes and/or use shaders with the output texture for the desired effect.
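As a rough sketch of the copy step (assuming the Awesomium 1.7 C++ API and a dynamic DXGI_FORMAT_B8G8R8A8_UNORM texture; the function name is made up here, and the exact Awesomium method names should be checked against the 1.7 docs), you could upload the view's BitmapSurface into the texture each frame and then draw it as an alpha-blended quad:

    #include <Awesomium/WebCore.h>
    #include <Awesomium/BitmapSurface.h>
    #include <d3d11.h>
    #include <cstring>

    // Copies the WebView's backing bitmap (32-bit BGRA) into a dynamic D3D11 texture.
    // Call view->SetTransparent(true) beforehand if you want transparent page backgrounds.
    void UploadWebViewToTexture(Awesomium::WebView* view,
                                ID3D11DeviceContext* ctx,
                                ID3D11Texture2D* dynamicTex)
    {
        Awesomium::BitmapSurface* surface =
            static_cast<Awesomium::BitmapSurface*>(view->surface());
        if (!surface || !surface->buffer())
            return;

        D3D11_MAPPED_SUBRESOURCE mapped = {};
        if (FAILED(ctx->Map(dynamicTex, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped)))
            return;

        const unsigned char* src = surface->buffer();
        unsigned char* dst = static_cast<unsigned char*>(mapped.pData);
        const int rowBytes = surface->width() * 4;

        // Copy row by row because the source and destination row pitches may differ.
        for (int y = 0; y < surface->height(); ++y)
            std::memcpy(dst + y * mapped.RowPitch,
                        src + y * surface->row_span(),
                        rowBytes);

        ctx->Unmap(dynamicTex, 0);

        // Afterwards: bind the texture's shader resource view and draw a full-screen
        // quad with an alpha blend state to composite the UI over the scene.
    }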

QML OpenGL game redraw loop

I am making a 3D game with OpenGL ES 2.0 and want to use QML as an overlay for the user interfaces.
I know that I can embed my OpenGL code inside a QGLWidget, but how can I force QML to redraw the view as often as possible?
Additionally, will I get performance issues because I am embedding the OpenGL view?
If you want to render your game with OpenGL and use QML for the UI, consider using the Qt Quick Scene Graph:
Mixing Scene Graph and OpenGL
Scene Graph - OpenGL Under QML
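A hedged sketch of the pattern those examples describe (Qt 5; the GameRenderer class and slot names are placeholders): hook the game's rendering into QQuickWindow::beforeRendering so it is drawn under the QML UI, and request a new frame after every swap to get a continuous redraw loop:

    #include <QObject>
    #include <QQuickWindow>
    #include <QOpenGLFunctions>

    class GameRenderer : public QObject, protected QOpenGLFunctions
    {
        Q_OBJECT
    public:
        explicit GameRenderer(QQuickWindow *window) : m_window(window)
        {
            // Keep Qt Quick from clearing the backbuffer so the game stays visible.
            window->setClearBeforeRendering(false);

            // Render the game right before the scene graph draws the QML UI on top.
            connect(window, &QQuickWindow::beforeRendering,
                    this, &GameRenderer::renderGame, Qt::DirectConnection);

            // Schedule the next frame as soon as the previous one was presented.
            connect(window, &QQuickWindow::frameSwapped,
                    window, &QQuickWindow::update, Qt::QueuedConnection);
        }

    public slots:
        void renderGame()
        {
            if (!m_initialized) {
                initializeOpenGLFunctions();
                m_initialized = true;
            }

            glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            // ... OpenGL ES 2.0 game rendering here ...

            // Restore the GL state Qt Quick expects before it renders the UI.
            m_window->resetOpenGLState();
        }

    private:
        QQuickWindow *m_window;
        bool m_initialized = false;
    };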

Combining GLX (OpenGL) and X11 graphics

I have to write an application on Linux using X11 for the interface (in C++). The application uses GLX to render some OpenGL graphics, but I also need to write some custom UI for this app within the same window.
When I create the window I create a GC and a GLX context. Ideally I'd need to draw the OpenGL into a region of the window (say the left part) and draw the UI next to the GL viewport.
How can I do that?
How can I combine GLX and GC drawing calls, such as XDrawString for example?
What would be the best way for me to create a layout within the same window, reserving a region of the window in which I draw the GL content, and having another region in which I draw the UI using X calls? Do I need to create sub-windows for that?
I actually found a useful answer here:
Create GLX context in specific region of a window
The idea is to spawn a sub-window from the current window and draw the GL content to it.
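A minimal sketch of that sub-window idea (the function name is a placeholder; choosing the XVisualInfo with glXChooseVisual/glXChooseFBConfig and event handling are omitted): create a child window for the GL area, make the GLX context current on it, and keep drawing the UI with core X calls in the parent:

    #include <X11/Xlib.h>
    #include <GL/glx.h>

    Window createGLSubWindow(Display *dpy, Window parent, XVisualInfo *vi,
                             int x, int y, unsigned int w, unsigned int h)
    {
        XSetWindowAttributes attrs;
        attrs.colormap = XCreateColormap(dpy, parent, vi->visual, AllocNone);
        attrs.event_mask = ExposureMask | StructureNotifyMask;

        Window glWin = XCreateWindow(dpy, parent, x, y, w, h, 0,
                                     vi->depth, InputOutput, vi->visual,
                                     CWColormap | CWEventMask, &attrs);
        XMapWindow(dpy, glWin);
        return glWin;
    }

    // Usage: GL goes into the sub-window, core X/GC drawing stays in the parent.
    //   Window glWin = createGLSubWindow(dpy, mainWin, vi, 0, 0, 400, 600);
    //   glXMakeCurrent(dpy, glWin, glxContext);           // GL draws here
    //   XDrawString(dpy, mainWin, gc, 420, 40, "UI", 2);  // X11 UI next to it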
I'd suggest using the texture-from-pixmap (BindTexImage) extension: draw your X11 content (via core X, XRender, whatever) to an offscreen pixmap and then composite it later as a texture.
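A rough sketch of that approach (assuming the GLX_EXT_texture_from_pixmap extension is available and an FBConfig with GLX_BIND_TO_TEXTURE_RGBA_EXT has already been chosen; drawing the textured quad is omitted and the function name is a placeholder):

    #include <X11/Xlib.h>
    #include <GL/glx.h>
    #include <GL/glxext.h>

    // Binds an X11 pixmap (already drawn with core X / XRender calls) as a GL texture.
    GLuint bindPixmapAsTexture(Display *dpy, GLXFBConfig fbconfig, Pixmap pixmap)
    {
        static PFNGLXBINDTEXIMAGEEXTPROC glXBindTexImageEXT =
            (PFNGLXBINDTEXIMAGEEXTPROC) glXGetProcAddress(
                (const GLubyte *) "glXBindTexImageEXT");

        const int pixmapAttribs[] = {
            GLX_TEXTURE_TARGET_EXT, GLX_TEXTURE_2D_EXT,
            GLX_TEXTURE_FORMAT_EXT, GLX_TEXTURE_FORMAT_RGBA_EXT,
            None
        };
        GLXPixmap glxPixmap = glXCreatePixmap(dpy, fbconfig, pixmap, pixmapAttribs);

        GLuint tex = 0;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        // The pixmap's pixels now back this texture; rebind after each X11 update.
        glXBindTexImageEXT(dpy, glxPixmap, GLX_FRONT_EXT, NULL);
        return tex;
    }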

Menus don't overlay my OpenGL canvas after I open another frame on Mac only

I have an OpenGL canvas and lightweight menus set to false, and everything works fine. Then I open a second frame from the first that has some 2D drawing etc. The menus over the OpenGL canvas in the first frame no longer draw where they overlap the canvas. This only happens on my Mac, not on Linux or Windows. Any ideas?
It turns out that if you use the UIManager to set the look and feel, it must do something to the previous setting that turned off the lightweight menus (which is required for JOGL).