No touch events available when QGraphicsView is visible - python-2.7

I have a PyQt4 application which is deployed on a touch device. The central widget is a QTabWidget. All push buttons are touch-enabled using the setAttribute() method. Now the application is getting new functionality that uses a QGraphicsView component, placed on a new tab.
As long as the graphics view is not visible, touch events are properly created and processed by the buttons. Once the graphics view becomes visible, no more touch events are delivered. Instead, every touch on the screen results in a mouse press event.
Which properties have to be set so that touches are still delivered as touch events rather than as mouse press events only?
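
For concreteness, here is a minimal PyQt4 sketch of the kind of setup the question describes; the class name, tab labels and button texts are assumptions, not the original code, and the viewport attribute line is only a guess at what the real application might need:

```python
# Minimal sketch of the described setup (assumed, not the original code).
from PyQt4 import QtCore, QtGui

class MainWindow(QtGui.QMainWindow):
    def __init__(self):
        super(MainWindow, self).__init__()
        tabs = QtGui.QTabWidget()
        self.setCentralWidget(tabs)

        # Touch-enabled buttons on the first tab.
        buttons_page = QtGui.QWidget()
        layout = QtGui.QVBoxLayout(buttons_page)
        for text in ("Start", "Stop"):
            button = QtGui.QPushButton(text)
            button.setAttribute(QtCore.Qt.WA_AcceptTouchEvents)
            layout.addWidget(button)
        tabs.addTab(buttons_page, "Controls")

        # New tab with the QGraphicsView; once this tab becomes visible,
        # touches reportedly arrive as mouse press events only.
        scene = QtGui.QGraphicsScene()
        view = QtGui.QGraphicsView(scene)
        view.setAttribute(QtCore.Qt.WA_AcceptTouchEvents)
        # The viewport is the widget that actually receives input, so the
        # attribute may need to be set there as well (assumption).
        view.viewport().setAttribute(QtCore.Qt.WA_AcceptTouchEvents)
        tabs.addTab(view, "Graphics")

if __name__ == "__main__":
    import sys
    app = QtGui.QApplication(sys.argv)
    window = MainWindow()
    window.show()
    sys.exit(app.exec_())
```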

Related

Non-clickable QPushButton

I've been trying to make a Qt test app that puts two different 'scenes' in one main widget, which is the window and can hide and show the two other widgets that are the different scenes. In scene 1 there is a QPushButton, which is connected to a slot that shows scene 2. However, when I click the button, nothing happens. I tried running in debug mode: the slot isn't fired, and the animation of the button being pushed down isn't triggered either. Is this something to do with the fact that the button is inside a widget inside a widget? I'll add the code if necessary.
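
For illustration, a minimal PyQt4 sketch of the structure being described, with one main widget that hides and shows two child 'scene' widgets; all names here are made up:

```python
# Sketch of the described layout (names are assumptions).
from PyQt4 import QtGui

class MainWidget(QtGui.QWidget):
    def __init__(self):
        super(MainWidget, self).__init__()
        layout = QtGui.QVBoxLayout(self)

        # Scene 1 holds the button that should switch to scene 2.
        self.scene1 = QtGui.QWidget()
        scene1_layout = QtGui.QVBoxLayout(self.scene1)
        button = QtGui.QPushButton("Go to scene 2")
        scene1_layout.addWidget(button)

        self.scene2 = QtGui.QWidget()

        layout.addWidget(self.scene1)
        layout.addWidget(self.scene2)
        self.scene2.hide()

        # Slot that hides scene 1 and shows scene 2.
        button.clicked.connect(self.show_scene2)

    def show_scene2(self):
        self.scene1.hide()
        self.scene2.show()
```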

How to change UI without opening a new window in Qt?

In my program, I'm moving from a QMainWindow to a QDialog at the press of a button.
I want to do the same without opening a new window and be able to move between the UIs.
The target device will have a very small touchscreen, so I want my UI to sit still and require minimal repositioning.
Please point me in the right direction or give me an example of how to do it.
To do that, you can use a QStackedWidget.
From the documentation:
The QStackedWidget class provides a stack of widgets where only one widget is visible at a time.
Instead of opening a new window, push its content on top of the stack, and pop it when you want to, so to speak, close the window.
Each widget is a page of your application and no separate window is required. You can design them as you would design a central widget of a normal window or dialog.
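
As an illustration of that idea, here is a small PyQt4 sketch (page contents and names are invented) where each 'window' becomes a page of a QStackedWidget and switching the current widget replaces opening a new window:

```python
from PyQt4 import QtGui

class Pages(QtGui.QStackedWidget):
    def __init__(self):
        super(Pages, self).__init__()

        # First page: what used to be the QMainWindow content.
        main_page = QtGui.QWidget()
        main_layout = QtGui.QVBoxLayout(main_page)
        open_button = QtGui.QPushButton("Open settings")
        main_layout.addWidget(open_button)

        # Second page: what used to be the QDialog content.
        dialog_page = QtGui.QWidget()
        dialog_layout = QtGui.QVBoxLayout(dialog_page)
        back_button = QtGui.QPushButton("Back")
        dialog_layout.addWidget(back_button)

        self.addWidget(main_page)    # index 0
        self.addWidget(dialog_page)  # index 1

        # "Push" and "pop" by switching the current page.
        open_button.clicked.connect(lambda: self.setCurrentWidget(dialog_page))
        back_button.clicked.connect(lambda: self.setCurrentWidget(main_page))

if __name__ == "__main__":
    import sys
    app = QtGui.QApplication(sys.argv)
    pages = Pages()
    pages.show()
    sys.exit(app.exec_())
```

The "push/pop" here is just setCurrentWidget(); QStackedWidget keeps every page alive, so each page retains its state while hidden.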

How to make CDockablePane unfocusable?

I create a CDockablePane as well as a CView in the main frame of my application. I need the pane not to get focus, because when it does, the active view loses focus and no longer receives keyboard events. The pane consists of tool boxes that are dragged into the view, but the pane and its children must never get focus.
How can this be achieved? I tried ModifyStyleEx after creating the pane to set WS_EX_NOACTIVATE, but it didn't change the pane's behavior. I also tried ModifyStyle to set WS_DISABLED, but that completely disables dragging tools from the pane.

MFC floating CDialog control clipping issue

I am making an SDI MFC application that uses a form view to provide the user with a set of controls (buttons, edit boxes and such). The view also owns a set of CDialogs used to display additional controls that can be shown or hidden via a tab control and other means. Until recently the dialogs have been statically placed at creation in their proper location on the screen, but I wanted to add a dialog that the user can move around while it is still a child of the view.

I created a dialog with a caption and sysmenu that the user can move around. The issue I am running into is that when this dialog is placed over another control owned by the view (let's say a button), the button draws over the dialog whenever its paint method is called. The dialog is still on top and its controls can still be interacted with, but the button is drawn over them until the dialog is repainted.

I have tried changing the WS_CLIPCHILDREN and WS_CLIPSIBLINGS styles of the dialog and have been able to get the dialogs to properly clip each other, but I cannot seem to get the child dialog to properly clip the parent view's controls. Does anyone have any ideas on what setting might fix this clipping issue?

Qt popup grabMouse for all children

I'm trying to create a popup-menu-like control that contains custom widgets. I need to capture the mouse, but I also need the children in the widget to still get their mouse messages. It appears that grabMouse() sends events only to the widget that grabbed the mouse, not to its children.
The popup is simply a series of buttons (using a QGridLayout). The control should work like this: the user presses the right mouse button, the popup appears, they move to an item and release the mouse button. Ideally it would work exactly like a QMenu popup, but with custom widgets and a custom layout.
How can I achieve this?
It appears that simply setting the Qt::Popup window flag is enough to get the fundamental behaviour required.
Installing an event filter on all children is also necessary: all mouse events and enter/leave/hover events must be captured. Qt has a defect with grabMouse(), so that won't work; the filter must be used to get the expected behaviour.
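
A minimal PyQt4 sketch of that approach, combining the Qt.Popup window flag with an event filter installed on every child; the widget, its contents and the selection handling are assumptions for illustration:

```python
from PyQt4 import QtCore, QtGui

class ButtonPopup(QtGui.QWidget):
    """Popup of custom widgets laid out in a grid."""

    def __init__(self, parent=None):
        super(ButtonPopup, self).__init__(parent)
        # The Qt.Popup window flag gives menu-like behaviour:
        # the widget closes when the user clicks outside it.
        self.setWindowFlags(QtCore.Qt.Popup)

        layout = QtGui.QGridLayout(self)
        for row in range(2):
            for col in range(3):
                button = QtGui.QPushButton("%d,%d" % (row, col))
                layout.addWidget(button, row, col)
                # Watch the children's mouse/enter/leave events ourselves,
                # since grabMouse() on the popup would bypass them.
                button.installEventFilter(self)

    def eventFilter(self, obj, event):
        if event.type() == QtCore.QEvent.MouseButtonRelease:
            # Releasing over an item "selects" it and closes the popup.
            print("selected: %s" % obj.text())
            self.close()
        elif event.type() in (QtCore.QEvent.Enter, QtCore.QEvent.Leave,
                              QtCore.QEvent.HoverMove):
            pass  # highlight handling for the hovered item would go here
        return super(ButtonPopup, self).eventFilter(obj, event)
```

The Qt.Popup flag is what makes a click outside the widget dismiss it; the event filter is what lets the popup react to releases and hovers over its children without calling grabMouse().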