pyqt: Why is QGLWidget influenced by Maya events? - opengl

OpenGL with Maya
I made an OpenGL view with QGLWidget. It has a problem when working with Maya. As you can see in the video, when I click and drag on a modelPanel inside Maya, my QGLWidget gets broken, and I found that the marquee rectangle is drawn on my QGLWidget instead of in Maya. Why does this happen?
To datenwolf
I tried to edit my code as you suggested, but it doesn't call makeCurrent() and doneCurrent() at all. I expected that when I clicked on my Maya modelPanel it would print the message, but it didn't. What did I miss? Sorry for that.
def makeCurrent(self):
    import OpenGL.WGL as wgl
    print "MAKE CURRENT!!!"
    # remember whichever DC/context (Maya's) was current before switching
    self.prevHDC = wgl.wglGetCurrentDC()
    self.prevHRC = wgl.wglGetCurrentContext()
    super(GLWidget, self).makeCurrent()

def doneCurrent(self):
    import OpenGL.WGL as wgl
    print "DONE CURRENT!!!"
    super(GLWidget, self).doneCurrent()
    # restore the previously current DC/context
    wgl.wglMakeCurrent(self.prevHDC, self.prevHRC)
I wrote it just like the above, but it never even prints the message.

Most likely Qt's and Maya's event loops are battling to process events. Qt's paintGL does the right thing and makes the OpenGL context current whenever it is called. Maya, however, does not, so Maya's drawing commands end up in your OpenGL context.
Playing along with Maya is going to be tricky, because it requires storing which OpenGL context/DC was active before switching, and restoring it once you're finished with your own operations. You'll probably have to subclass QGLWidget and QGLContext to do this.
Update due to comment
Derive from QGLWidget, add two members HDC m_prevHDC and HGLRC m_prevHRC, and override makeCurrent and doneCurrent:
void QMyGLWidget::makeCurrent()
{
    this->m_prevHDC = wglGetCurrentDC();
    this->m_prevHRC = wglGetCurrentContext();
    QGLWidget::makeCurrent();
}

void QMyGLWidget::doneCurrent()
{
    QGLWidget::doneCurrent();
    wglMakeCurrent(this->m_prevHDC, this->m_prevHRC);
}
Then derive your actual GLWidget from this intermediary class.
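For example, a minimal sketch of such a derived class (the class name and the drawing code inside paintGL() are illustrative, not taken from the question):

// Illustrative final widget deriving from the intermediary class above.
class MyViewport : public QMyGLWidget
{
protected:
    virtual void paintGL()
    {
        // QGLWidget makes this widget's context current before paintGL() is called,
        // so the commands below go into our context, not Maya's.
        glClearColor(0.2f, 0.2f, 0.2f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        // ... draw your scene here ...
    }
};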

Related

QOpenGLWidget: retrieving window handle for 3rdparty library

I'm creating an application that interacts with OpenGL via the QOpenGL* classes. The graphics are shown through a QOpenGLWidget, which is placed in a UI form.
Now, there is a library for CAD purposes (Open CASCADE) whose OpenGL interface requires a handle to the render window. The question is: can I somehow tell the library to render everything into the aforementioned widget?
In other words, is there a way to interpret the widget as a native, probably platform-specific window (HWND here), so that the library renders its own stuff exactly there?
Thanks
QOpenGLWidget is not the same thing as QGLWidget.
The classical approach for embedding the OCCT 3D viewer, which you can find in the Qt IESample coming with OCCT, creates a QWidget with its own native window handle, takes this window handle and asks OCCT to take care of OpenGL context creation for that window. This is the most straightforward, robust and portable way, with one limitation: Qt will not be able to draw semitransparent widgets on top of this QWidget. This is not a limitation of OCCT, but rather a limitation of the Qt Widgets design.
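Roughly, that classical approach looks like the following on Windows (a sketch only, assuming OCCT 7.x; the helper function name and the exact widget attributes are illustrative choices, not copied from IESample):

// Sketch of the classical IESample-style embedding on Windows (OCCT 7.x assumed).
#include <QWidget>
#include <Aspect_DisplayConnection.hxx>
#include <OpenGl_GraphicDriver.hxx>
#include <V3d_Viewer.hxx>
#include <V3d_View.hxx>
#include <WNT_Window.hxx>

void initOcctView(QWidget* widget, Handle(V3d_Viewer)& viewer, Handle(V3d_View)& view)
{
    // Give the widget its own native window and let OCCT own the paint surface.
    widget->setAttribute(Qt::WA_NativeWindow);
    widget->setAttribute(Qt::WA_PaintOnScreen);
    widget->setAttribute(Qt::WA_NoSystemBackground);

    Handle(Aspect_DisplayConnection) disp = new Aspect_DisplayConnection();
    Handle(OpenGl_GraphicDriver) driver = new OpenGl_GraphicDriver(disp);
    viewer = new V3d_Viewer(driver);   // keep the viewer handle alive alongside the view
    view   = viewer->CreateView();

    // Hand the native handle to OCCT; OCCT creates the OpenGL context for it.
    Handle(WNT_Window) wnd = new WNT_Window((Aspect_Handle)widget->winId());
    view->SetWindow(wnd);
    if (!wnd->IsMapped())
        wnd->Map();
}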
QOpenGLWidget was intended to overcome this limitation by allowing custom OpenGL rendering to be mixed with normal widgets. The integration of an external OpenGL graphics engine, however, became more complicated and fragile. Stealing winId() from QOpenGLWidget is not very helpful, because the rendered content is expected to be drawn not into the window itself, but rather into an OpenGL framebuffer object (FBO) created by QOpenGLWidget - see the QOpenGLWidget::defaultFramebufferObject() property.
The external renderer is expected to render into this FBO for proper composition with the Qt widgets. Luckily, OCCT is flexible enough to allow such an integration. Unluckily, such an integration requires some knowledge of OpenGL, as well as of its usage by Qt and OCCT.
For that, you need to ask OCCT to wrap the OpenGL context already created by Qt (V3d_View::SetWindow() provides an optional argument of type Aspect_RenderingContext, which corresponds to HGLRC on Windows and can be fetched using wglGetCurrentContext() within the rendering thread) as well as the FBO created by QOpenGLWidget (for that, OCCT provides the OpenGl_FrameBuffer::InitWrapper() and OpenGl_Context::SetDefaultFrameBuffer() methods, as well as the OpenGl_Caps::buffersNoSwap flag to leave window buffer swap management to Qt).
OCCT doesn't yet come with a sample using QOpenGLWidget, but you can also find the qt/AndroidQt sample, which implements a similar thing for embedding the OCCT 3D Viewer into a QtQuick application.
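Putting those pieces together, a very rough sketch of the QOpenGLWidget route might look like this (an assumption of how the calls named above fit together rather than official sample code; resizing, input handling and cleanup are left out):

// Sketch only: letting OCCT draw into the FBO owned by QOpenGLWidget (Windows).
#include <QOpenGLWidget>
#include <windows.h>                       // wglGetCurrentContext()
#include <Aspect_DisplayConnection.hxx>
#include <OpenGl_GraphicDriver.hxx>
#include <OpenGl_Context.hxx>
#include <OpenGl_FrameBuffer.hxx>
#include <V3d_Viewer.hxx>
#include <V3d_View.hxx>
#include <WNT_Window.hxx>

class OcctGlWidget : public QOpenGLWidget
{
protected:
    void initializeGL() override
    {
        // Let Qt manage buffer swapping of the widget's FBO, not OCCT.
        myDriver = new OpenGl_GraphicDriver(new Aspect_DisplayConnection(), false);
        myDriver->ChangeOptions().buffersNoSwap = true;

        myViewer = new V3d_Viewer(myDriver);
        myView   = myViewer->CreateView();

        // Wrap the native window and the GL context Qt has already made current.
        Handle(WNT_Window) wnd = new WNT_Window((Aspect_Handle)winId());
        myView->SetWindow(wnd, (Aspect_RenderingContext)wglGetCurrentContext());
    }

    void paintGL() override
    {
        // Wrap the FBO currently bound by QOpenGLWidget so OCCT renders into it.
        Handle(OpenGl_Context) glCtx = myDriver->GetSharedContext();
        Handle(OpenGl_FrameBuffer) qtFbo = new OpenGl_FrameBuffer();
        qtFbo->InitWrapper(glCtx);
        glCtx->SetDefaultFrameBuffer(qtFbo);
        myView->Redraw();
    }

private:
    Handle(OpenGl_GraphicDriver) myDriver;
    Handle(V3d_Viewer)           myViewer;
    Handle(V3d_View)             myView;
};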
After some investigation, I found that the QOpenGLWidget::winId() method returns the correct handle. I only found this out now because the rendered scene disappeared immediately, leaving a black picture instead. However, when the viewport is resized, the scene comes back (and then disappears again). It looks like Open CASCADE has problems with the Qt 5 OpenGL implementation, since QGLWidget didn't have such problems, as far as I know.

How to draw to "parent TLW backing store"?

This might be a very complicated question that not many people know the answer to, but I will still ask.
I have a QWindow-derived class with an overloaded event(), which uses a QBackingStore and fills the whole window with a colour, let's say black.
Now, in my Qt QML app, when I create my window and set its parent to the main view of my app, I get a window sized 1x1 px! This is driving me crazy.
I dug through the Qt source code and found this:
void QQnxRasterWindow::adjustBufferSize()
{
    // When having a raster window we don't need any buffers, since
    // Qt will draw to the parent TLW backing store.
    const QSize windowSize = window()->parent() ? QSize(1,1) : window()->size();
    if (windowSize != bufferSize())
        setBufferSize(windowSize);
}

void QQnxRasterWindow::setParent(const QPlatformWindow *wnd)
{
    QQnxWindow::setParent(wnd);
    adjustBufferSize();
}
This is kind of a bummer, because now I have no idea how I am supposed to use the TLW and draw into my window.
Any ideas?
First of all, what is a TLW?
Second, how do I draw into the parent TLW in a way that ends up in my window's buffer?
Thank you
Qt 5.3.1
Edit:
Not renderNow() - my mistake; I overload the event() function and handle the UpdateRequest event to draw my background.
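For reference, this is roughly the setup I'm describing (a minimal sketch; the class name and details are illustrative):

// Minimal sketch of a QWindow that fills itself with black via QBackingStore.
#include <QWindow>
#include <QBackingStore>
#include <QPainter>
#include <QEvent>
#include <QExposeEvent>

class BlackWindow : public QWindow
{
public:
    BlackWindow() : m_backingStore(this) {}

protected:
    bool event(QEvent *event) override
    {
        if (event->type() == QEvent::UpdateRequest) {
            renderBackground();
            return true;
        }
        return QWindow::event(event);
    }

    void exposeEvent(QExposeEvent *) override
    {
        if (isExposed())
            renderBackground();
    }

private:
    void renderBackground()
    {
        const QRect rect(QPoint(0, 0), size());
        m_backingStore.resize(size());
        m_backingStore.beginPaint(rect);
        QPainter painter(m_backingStore.paintDevice());
        painter.fillRect(rect, Qt::black);
        painter.end();
        m_backingStore.endPaint();
        m_backingStore.flush(rect);
    }

    QBackingStore m_backingStore;
};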
Edit 2:
Also, this is only a problem when I set a parent; when no parent is set, I can do whatever I want with my QWindow and it has its own buffer. Kind of weird.

Is it possible to create child window in opengl parent?

In my OpenGL project, I need to create some child windows of the OpenGL window to provide a user interface while OpenGL draws the graphics in my program, and I have actually managed to make it all work. The problem is: when I move or resize a child window, my OpenGL background falls apart; once the resizing or moving is done, the OpenGL background goes back to normal.
If possible, could you explain to me step by step how to do it?
What should I do to solve this?

How to display QWidget above data stream from device handled by external library

I'm creating an application to analyze data from a device and display it on the screen. The SDK for handling this device has a function to display the current frame of data in a specific window by setting a window handle (HWND).
In my Qt GUI application I'm trying to display my own widget over the video stream, which a DLL shows in my QGLWidget (it's set via the winId function and HWND). MainWidget is the parent of the QGLWidget where the data stream is displayed, and a QWidget (or some graphic marker, for example a circle) should be displayed above the data stream from the QGLWidget.
Everything works almost perfectly, but I'm getting a blinking effect (twinkle effect?) - my circle widget hides and shows at a frequency the human eye can see, and I want to avoid that. The only option that eliminates it is to create this circle as a widget and set the Qt::Popup flag on it, but that has a big disadvantage - I don't have access to the rest of the interface (I know, it's the Popup flag's fault). I've tried other options like:
setting Qt::WindowStaysOnTopHint and a few other flags,
creating a layout object whose parent is the QGLWidget where I'm displaying the data from the device, and then setting the circle widget as an item of that layout, but here a black background shades the displayed data (I even turned the background off, but then I realized that Qt can't know what is under my widget because it's handled by an external library).
In the documentation I found information that I could create my own DirectShow/COM object (or interface, am I right?) to handle the video stream, but I don't really know these technologies and I very much want to avoid this option!
[Edit] I found that I can get the current frame of data as an IPictureDisp interface, but as I said earlier I don't really know COM. I've tried to find something about how to work with IPictureDisp, but I don't have basic knowledge of COM. Does anybody have a good tutorial about it?
Try this widget hierarchy:
MainWidget
|-QGLWidget
|-MarkerWidget
MarkerWidget should be a small square widget.
Make sure that MarkerWidget is above QGLWidget by calling MarkerWidget::raise().
But call MarkerWidget::lower() before calling MarkerWidget::raise(). Qt has a bug with QWidget::raise(); it is worked around by calling lower(); raise();.
After starting your application, check the actual widget hierarchy; use Spy++ to do it. Spy++ can be found in the Visual Studio installation, or downloaded here.
Pseudocode:
MainWidget* mainWidget = new MainWidget;
QGLWidget* glWidget = new QGLWidget(mainWidget);
device->setHwnd(glWidget->winId());
mainWidget->show();
...
MarkerWidget* marker = new MarkerWidget(mainWidget);
marker->resize(30, 30);
marker->move(10, 10);
marker->lower();
marker->raise();

Trying to update only a rectangle in a widget in Qt, but the entire widget's area is updated instead

This happens in Qt Simulator (for phones). I'm trying to update only a portion of a widget's area, but the entire widget is updated instead.
To illustrate, the following code:
void Widget::mousePressEvent(QMouseEvent *event)
{
    update(0, 0, 10, 10);
}

void Widget::paintEvent(QPaintEvent *event)
{
    qDebug() << event->rect();
}
Gives the following debug output when I click on the widget:
QRect(0,0 458x832)
Which is the entire area of the widget.
What am I doing wrong here?
Edit
I ran the same code on Linux, and it worked as it should; the debug output was
QRect(0,0 10x10)
In most GUI frameworks I've seen, you can't update just a part of the application window/widget. Even if there's a function in the API to update a given rectangle - like the update(x, y, w, h) you're using - it only informs the framework that it needs to update at least that rectangle, and the framework may update a bigger part of the screen.
I'm not sure how it works in Qt on phones, but it's done this way in the desktop version because in most OSes a GUI application doesn't store its 'image' anywhere, and if you minimize and then show your window, you need to recreate the entire surface.
All this means that you can't rely on the assumption that you'll paint something once and then only add further paint operations in some custom rectangle when needed. You should implement a general 'paint' function that can redraw everything from scratch, and leave the painting optimization to the framework, as in the sketch below.
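A minimal sketch of that idea (drawScene() is a hypothetical function standing in for whatever actually renders the widget's contents):

void Widget::paintEvent(QPaintEvent *event)
{
    QPainter painter(this);
    // Painting is already clipped to the region Qt decided to update;
    // event->rect() is only a hint, so redraw everything and let the
    // clip region discard whatever falls outside of it.
    painter.fillRect(rect(), Qt::white);
    drawScene(&painter);   // hypothetical: redraws the full widget contents
}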
I am not familiar with Qt on phones, but maybe something else is triggering the update of the whole widget. Qt sends one paintEvent for all the update() calls made during one pass of the event loop. So your code may be asking for a partial update, but something else in the window system may get touched and ask for a full update.
Try repaint() and see if that sends your paintEvent the right region.
It turned out that this bug was present only in the Qt Simulator. On the actual phone, the update region was being passed correctly. I tested this by displaying a QMessageBox with the coordinates of event->rect().