How to enable mouse tracking on a QWindow - C++

I'm using a QWindow (not a QMainWindow) with OpenGL. I need to use a QWindow to correctly control the OpenGL context.
I'm trying to follow the Scribble example to implement something similar to panning, but I can't find an idiomatic way to get mouseMoveEvent() triggered.
How can I get a "tooltip" effect where mouseMoveEvent() is constantly triggered, similar to setMouseTracking()?

It works fine for me. I created a test program with a MainWindow that inherits QWindow instead of QMainWindow, and handles the mouse move event to print the cursor position:
void MainWindow::mouseMoveEvent(QMouseEvent *e)
{
    qDebug("%d, %d", e->pos().x(), e->pos().y());
}
It works: as I move the mouse, I get events even without pressing any mouse buttons.
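For reference, a minimal, self-contained version of that test program might look like the following sketch (the MainWindow class name comes from the answer; everything else is illustrative):

#include <QGuiApplication>
#include <QMouseEvent>
#include <QWindow>

class MainWindow : public QWindow
{
public:
    MainWindow() { resize(640, 480); }

protected:
    void mouseMoveEvent(QMouseEvent *e) override
    {
        // Fires for every move while the cursor is over the window,
        // with no button pressed and no tracking setup required.
        qDebug("%d, %d", e->pos().x(), e->pos().y());
    }
};

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);
    MainWindow w;
    w.show();
    return app.exec();
}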

If mouse tracking is disabled (the default), the widget only receives mouse move events when at least one mouse button is pressed while the mouse is being moved.
You can call hasMouseTracking() or setMouseTracking() to query or change the mouse tracking state.
When mouse tracking is enabled, mouseMoveEvent() will be called, and you can reimplement mouseMoveEvent() to get the mouse position, just as @sashoalm did.
By the way, mouse events are always delivered to your app, but they can be filtered by the widget itself or its parent. You can reimplement eventFilter() to write your own filter.
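Note that setMouseTracking() is a QWidget API. For the widget case this answer describes, a minimal sketch might look like this (the TrackingWidget name is illustrative):

#include <QMouseEvent>
#include <QWidget>

class TrackingWidget : public QWidget
{
public:
    explicit TrackingWidget(QWidget *parent = nullptr) : QWidget(parent)
    {
        setMouseTracking(true); // tracking is off by default
    }

protected:
    void mouseMoveEvent(QMouseEvent *e) override
    {
        // Called on every move once tracking is on, even with no button pressed.
        qDebug("%d, %d", e->pos().x(), e->pos().y());
    }
};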

Related

Qt transfer mouse move events to new window

I'm handling mouse events (through an event filter) on a QTabBar to detect when the user clicks and drags a tab to tear it off. When this happens, I remove the current widget from the QTabWidget and create a new top level widget that I add it to so that it's detached and floating, just like when you tear off a tab in Chrome. This new floating window is a custom frameless widget I made that has a custom titlebar that I handle mouse events on to allow the user to drag the window around the desktop.
The problem I'm having is that when you click and drag the tab to pull it off and the new top level window is created, I can't seem to get the application to continue dragging the new window without the user clicking and dragging on my titlebar. I'd like the original drag motion to just transfer to the new widget so that the user can keep dragging it until they release the mouse button.
I've tried creating a "fake" QMouseEvent to pass to my title bar (by calling QCoreApplication::sendEvent(object, event)) to make it think it's been clicked on, but it doesn't receive any mouse move events unless you actually click on it. I'm open to other ideas.
Update: I added some debugging statements and it looks like once I detach the tab and create the new floating window, the QMainWindow continues to receive the mouse move events until I release the mouse button. I'll try adding some code to forward these mouse events on to the new floating window, but that feels kinda hacky.
Correction: The QMainWindow is not receiving the mouse move events; an object named "MainWindowWindow" is. It's a QWidgetWindow, which I guess is a private type used to manage top-level windows?
Ok, I got it working, but I don't like it. As I said in my correction, once the tab is detached, the QWidgetWindow starts receiving the mouse move events. This is a private type that's not exposed via the API.
What I ended up doing was installing an event filter on the application and looking for mouse events on an object that inherits from QWidgetWindow. I then directly call new drag() and endDrag() methods that I added to my floating window class. In my eventFilter method, it looks like this.
if (watched->inherits("QWidgetWindow"))
{
    if (floater) // <- the floating window that was recently detached
    {
        if (event->type() == QEvent::MouseMove)
        {
            floater->drag(QCursor::pos()); // <-- pass the global cursor pos
        }
        else if (event->type() == QEvent::MouseButtonRelease)
        {
            floater->endDrag();
            floater = nullptr;
        }
    }
}
Like I said, it feels dirty because A) I'm checking for events on a private type that I'm not really supposed to know about, and B) I'm telling another window to drag itself around when it has code to do this itself. But it works, and I've spent enough time working on this. If someone has a more elegant solution, I'm happy to hear it.
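For anyone who wants to see the plumbing around that snippet, here is a hedged sketch of how the filter might be packaged and installed; FloatingWindow and its drag()/endDrag() methods stand in for the poster's own class and are only stubbed out here:

#include <QApplication>
#include <QCursor>
#include <QEvent>
#include <QObject>
#include <QPoint>

// Stub standing in for the poster's custom floating window.
struct FloatingWindow
{
    void drag(const QPoint &globalPos) { Q_UNUSED(globalPos); } // move window here
    void endDrag() {}                                           // finish the drag
};

class TearOffFilter : public QObject
{
public:
    FloatingWindow *floater = nullptr; // set when a tab is detached

protected:
    bool eventFilter(QObject *watched, QEvent *event) override
    {
        // QWidgetWindow is private to Qt; matching it by class name is
        // exactly the "dirty" part described above.
        if (watched->inherits("QWidgetWindow") && floater)
        {
            if (event->type() == QEvent::MouseMove)
            {
                floater->drag(QCursor::pos()); // global cursor position
            }
            else if (event->type() == QEvent::MouseButtonRelease)
            {
                floater->endDrag();
                floater = nullptr;
            }
        }
        return QObject::eventFilter(watched, event); // don't swallow anything
    }
};

// Installed once, e.g. in main():
//   qApp->installEventFilter(new TearOffFilter);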

QQuickItem not receiving mouse events after using grabMouse() when mouse is outside app window

I have a QGuiApplication with a single QQuickItem, and I want the QQuickItem to receive mouse events when the mouse pointer is outside the main window; the pointer can be anywhere on the screen.
So in the end the QQuickItem should receive mouse events (e.g. mouse move) while the pointer is outside the application window.
I used grabMouse() [void QQuickItem::grabMouse()] to do this, but I don't see the desired effect. The QQuickItem receives events only while the pointer is inside the app window, and it stops getting any mouse events as soon as the pointer leaves the QGuiApplication's boundary.
From what I've read, grabMouse() should solve my problem, but somehow it doesn't.
Can anyone point out whether anything extra needs to be done, or whether something is wrong here?
Note: after using grabMouse(), the QQuickItem does start getting mouse move events when the mouse is passed over the application.
I believe you cannot receive mouse events when the mouse is outside your application window. That is how the OS usually works.

Change QWidget Parent During Mouse Event

I'm trying to create a detachable type style widget, like in the way Chrome tabs are detachable (class is called Tab). I have everything working, except for a bug where sometimes (maybe 50% of the time), the Tab object never gets the mouse release event, and stops getting mouse move events.
Essentially, the detaching system works by allowing drags in the mouse press/move/release functions, just like normal. The mouseMoveEvent checks the total distance moved from the start, and if over a certain amount, will start the "detaching" process. The detaching process involves setting the parent widget to 0 (top level widget, undecorated window), so the Tab object is pretty much floating above everything, under the mouse, and continues to be dragged along with it until released.
I ran through all the QEvent items being delivered, and I found that when this issue occurs, the QEvent::MouseMove items (and all mouse events after this) are being sent to the TabBar (the Tab object's original parent). This occurs directly after calling setParent(0) on the Tab.
Basic mouse handling overview:
void Tab::mousePressEvent(*) {
    [set up some booleans, start positions, etc.]
}
void Tab::mouseMoveEvent(*) {
    [track the updated position]
    if (positionChange > STATIC_AMOUNT)
        detachTab();
}
void Tab::mouseReleaseEvent(*) {
    [return the Tab to its original position, and set the parent back to the TabBar]
}
void Tab::detachTab() {
    QPoint mappedPos = mapToGlobal(QPoint(0, 0));
    setParent(0); // The loss of MouseMove events occurs when this returns.
    move(mappedPos);
    show();
    raise();
}
Here are the events that the Tab object receives (by QEvent type):
[Tab::detachTab() started]
[setParent(0) started]
QEvent::Hide
QEvent::Leave
qApp QEvent::MouseMove [ TabBar ] <-- now the TabBar is soaking up the mouse events
QEvent::HideToParent
QEvent::ParentAboutToChange
QEvent::ParentChange
[setParent(0) returned]
....
Summed up: my draggable QWidget loses QEvent::MouseMove and QEvent::MouseButtonRelease events after having its parent set to 0.
Any advice would be really appreciated!
A bit of a tricky workaround. I didn't test it; it's just an idea.
When the mouse hovers over the draggable part of a widget, you can create a topmost widget (let's call it Shade) with Qt::FramelessWindowHint (and possibly with Qt::WA_TranslucentBackground). You can control the Shade's appearance by reimplementing paintEvent(): for example, draw the content of the original widget, draw some transparent preview, etc.
Then you can resize the Shade during dragging, to show the user that the widget will be detached. You will not lose mouse capture.
When the user releases the mouse, you remember the Shade's position, destroy it, and detach and move the original widget.
Feel free to ask if you want more details.
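A rough sketch of what creating such a Shade might look like, assuming a plain QWidget-based setup (the class name and the painting here are purely illustrative):

#include <QPainter>
#include <QWidget>

class Shade : public QWidget
{
public:
    Shade()
        : QWidget(nullptr, Qt::FramelessWindowHint | Qt::WindowStaysOnTopHint)
    {
        setAttribute(Qt::WA_TranslucentBackground);
    }

protected:
    void paintEvent(QPaintEvent *) override
    {
        // Draw whatever preview of the dragged widget you want here.
        QPainter p(this);
        p.fillRect(rect(), QColor(0, 0, 0, 80)); // simple translucent preview
    }
};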
Here is a similar question.
The suggestion there is to use QDockWidget and enforce stacking of these widgets using tabifyDockWidget().
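For reference, a minimal sketch of that QDockWidget approach might look like this (the dock names and contents are illustrative):

#include <QDockWidget>
#include <QMainWindow>

void setupTabbedDocks(QMainWindow *mainWindow)
{
    auto *dock1 = new QDockWidget("First", mainWindow);
    auto *dock2 = new QDockWidget("Second", mainWindow);
    mainWindow->addDockWidget(Qt::RightDockWidgetArea, dock1);
    mainWindow->addDockWidget(Qt::RightDockWidgetArea, dock2);
    mainWindow->tabifyDockWidget(dock1, dock2); // stacks the docks as tabs
}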

Qt mouse leave event while button is pressed

I need to detect if the mouse pointer leaves my custom widget even if a mouse button is pressed.
According to this post, Qt does not cause a leaveEvent in case a button is pressed, at least not in version 4.4. I am working with 4.7.3, but still do not get a leaveEvent in the described case. I also tried with the various drag-related events, but no luck. Does anyone have an idea how to deal with this?
Actually there is an even better option than using the mouse events of the parent widget: you can implement the mouseMoveEvent() function of the child widget and get the QMouseEvent::pos() position, which is relative to the top-left corner of the widget. That means that if you know the size of the widget (use, for example, QWidget::rect()), you can compute within the widget whether the mouse pointer is still on the widget or not, without having to touch the parent widget.
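A sketch of that idea, assuming a plain QWidget subclass (LeaveAwareWidget and handleManualLeave() are illustrative names):

#include <QMouseEvent>
#include <QWidget>

class LeaveAwareWidget : public QWidget
{
protected:
    void mouseMoveEvent(QMouseEvent *e) override
    {
        // While a button is held, move events keep arriving even when the
        // cursor is outside the widget, so an out-of-rect position means
        // the cursor has left.
        if (!rect().contains(e->pos()))
            handleManualLeave();
        QWidget::mouseMoveEvent(e);
    }

private:
    void handleManualLeave()
    {
        // React to the "left while a button is pressed" case here.
    }
};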
Well, I did something similar to @gregseth.
Write your own "mouse leave" handler that is invoked from mouseMoveEvent()
when the mouse's position is outside of the widget's boundary.
As Qt's documentation says, "Mouse move events will occur only when a mouse button is pressed down, unless mouse tracking has been enabled with setMouseTracking()".
I had the same problem. The only workaround I found was to handle the mouseMoveEvent of the parent widget and check whether the cursor position is inside or outside the boundaries of the widget you want the event on.

Qt -- pass events to multiple objects?

I basically have 3 layers (Window > Scene > View) that each need to handle a mouseMove event without blocking the others. It seems only the youngest child is getting the event though. I was hoping I could process the event and then call event->ignore() to pass the event back up the stack, but it doesn't seem to be working.
Some relevant code if you need it:
void EditorWindow::createScene() {
    m_scene = new EditorScene(this);
    m_view = new EditorView(m_scene);
    // ...
}
void EditorScene::mouseMoveEvent(QGraphicsSceneMouseEvent* mouseEvent) {
    printf("B\n");
    // ...
}
void EditorView::mouseMoveEvent(QMouseEvent* event) {
    printf("C\n");
    event->ignore();
}
Only "C" is being printed. Note that EditorScene and EditorView receive different types of mouse events so it's not completely trivial to pass them around.
The EditorWindow also needs the mouse coordinates; currently I'm sending a signal from one of the children which is caught by the window... but it shouldn't really be necessary to relay it that way, should it?
Found this nice article. Calling ignore() tells Qt to find another receiver. It sounds like it should work, but perhaps it means an unrelated receiver. The proper way to propagate the event is actually to call the base class implementation, like so:
void EditorView::mouseMoveEvent(QMouseEvent* event) {
    QGraphicsView::mouseMoveEvent(event); // propagate to parent widget
    printf("C\n");
}
Now it's printing BCBCBC... which is great, but I can't seem to nudge it up one more level...
Another edit: It was being propagated up properly; I just didn't have setMouseTracking enabled.
QGraphicsView::mouseMoveEvent(event);
Doesn't propagate up to the parent -- it actually propagates down to the scene.
Here is what happens: QGraphicsView receives the QMouseEvent, translates it into a QGraphicsSceneMouseEvent and passes it to the scene. The scene then passes it to the appropriate item or, in your case, prints "B". The event handler then returns to EditorView and prints "C".
Then, if you explicitly ignore the event (mouse move is accepted by default), the Qt event handler will pass the event to EditorView's parent. So try ignoring after you print "C".
Another thing about mouse move is this:
If mouse tracking is switched off, mouse move events only occur if a mouse button is pressed while the mouse is being moved. If mouse tracking is switched on, mouse move events occur even if no mouse button is pressed.
So make sure you have tracking enabled on the parent of EditorView (or that you press buttons :)).
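Putting those two suggestions together, EditorView's handler might look like the sketch below (assuming EditorView derives from QGraphicsView, as the question implies; where exactly to enable tracking depends on your widget hierarchy):

void EditorView::mouseMoveEvent(QMouseEvent* event) {
    QGraphicsView::mouseMoveEvent(event); // hands the event down to the scene ("B")
    printf("C\n");
    event->ignore();                      // then let it bubble up to the parent widget
}

// And somewhere in the setup, so move events arrive without a button held:
//   m_view->viewport()->setMouseTracking(true);
//   m_view->setMouseTracking(true);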
EDIT:
By the way, EditorScene is not a parent of EditorView. Well, it is in your code, but only in the QObject sense of parenthood (memory management only).
QGraphicsScene and the view don't have a normal parent-child relationship: a scene can have multiple views, and those views can be children of unrelated parents.
For window event propagation purposes you must have a QWidget-based parent. In fact, I'm pretty sure you reparent EditorView to EditorWindow, or one of its children, when you add it into a layout.
INSTAEDIT:
For coordinates, you want the view itself to emit a signal, both for decoupling reasons and because you probably want to show the view's local coordinates, not those of the parent window and not screen coordinates (right?). If you actually want scene coordinates, the view is the right choice too, because it knows the transformation matrix.
Coordinates go like this:
Screen -> EditorWindow local -> EditorView local -> Scene transformed -> whatever item local transformed.
Calling QGraphicsView::mousePressEvent(e) in my mousePressEvent did the trick!