Qt GraphicsView mouseMoveEvent shadowing GraphicsItem mouseMoveEvent - c++

I'm having trouble with overriding mouseMoveEvent.
I have subclassed QGraphicsView and overridden mousePressEvent, mouseMoveEvent and mouseReleaseEvent. I am using these events to draw a custom QGraphicsItem, which is a Line. (mousePress sets the start point of the line, mouseMove makes the line follow the cursor, a second mousePress sets the end point of the line, and mouseRelease stops the drawing of the line.)
I have also created another custom item, Node. The node is drawn with a mousePress event. I have two flags to differentiate between drawing Lines and Nodes. The ItemIsMovable flag of the Node is set to true, and I have reimplemented mouseMoveEvent in the Node class to make it move (I change its coordinates and repaint it), which worked fine.
The problem is that when I implemented mouseMoveEvent in my subclass of QGraphicsView (for drawing the Line), the mouseMoveEvent of the Node class stopped working and the Nodes aren't moving anymore. How can I fix this?
Thank you for your time, your help will be appreciated.

You need to call the base class (QGraphicsView) implementation from your implementation. Otherwise mouse events will not be processed by QGraphicsView and will not be passed on to the scene and its items.
void MyView::mousePressEvent(QMouseEvent* e) {
    QGraphicsView::mousePressEvent(e);
    // your implementation
}
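The same applies to the other handlers you've overridden. A minimal sketch, assuming a view subclass called MyView (the class name and comments are illustrative):
void MyView::mouseMoveEvent(QMouseEvent* e) {
    QGraphicsView::mouseMoveEvent(e); // lets the scene deliver the move to items, e.g. your Node
    // your Line-drawing logic (make the line follow the cursor)
}
void MyView::mouseReleaseEvent(QMouseEvent* e) {
    QGraphicsView::mouseReleaseEvent(e);
    // your Line-drawing logic (stop drawing the line)
}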

Related

Change QWidget Parent During Mouse Event

I'm trying to create a detachable tab-style widget, like the way Chrome tabs are detachable (the class is called Tab). I have everything working, except for a bug where sometimes (maybe 50% of the time) the Tab object never gets the mouse release event and stops getting mouse move events.
Essentially, the detaching system works by allowing drags in the mouse press/move/release functions, just like normal. The mouseMoveEvent checks the total distance moved from the start, and if over a certain amount, will start the "detaching" process. The detaching process involves setting the parent widget to 0 (top level widget, undecorated window), so the Tab object is pretty much floating above everything, under the mouse, and continues to be dragged along with it until released.
I ran through all the QEvent items being delivered, and I found that when this issue occurs, the QEvent::MouseMove items (and all mouse events after this) are being sent to the TabBar (the Tab object's original parent). This occurs directly after calling setParent(0) on the Tab.
Basic mouse handling overview:
void Tab::mousePressEvent(QMouseEvent*) {
    [set up some booleans, start positions, etc.]
}
void Tab::mouseMoveEvent(QMouseEvent*) {
    [track the updated position]
    if (positionChange > STATIC_AMOUNT)
        detachTab();
}
void Tab::mouseReleaseEvent(QMouseEvent*) {
    [return the Tab to its original position, and set the parent back to the TabBar]
}
void Tab::detachTab() {
    QPoint mappedPos = mapToGlobal(QPoint(0, 0)); // QWidget::mapToGlobal takes a QPoint
    setParent(0); // The loss of MouseMove events occurs when this returns.
    move(mappedPos);
    show();
    raise();
}
Here are the events that the Tab object receives (each line is the QEvent type):
[Tab::detachTab() started]
[setParent(0) started]
QEvent::Hide
QEvent::Leave
qApp QEvent::MouseMove [ TabBar ] <-- now the TabBar is soaking up the mouse events
QEvent::HideToParent
QEvent::ParentAboutToChange
QEvent::ParentChange
[setParent(0) returned]
....
Summed up: my draggable QWidget loses QEvent::MouseMove and QEvent::MouseButtonRelease events after having its parent set to 0.
Any advice would be really appreciated!
A somewhat tricky workaround. I haven't tested it; it's just an idea.
When the mouse hovers over the draggable part of a widget, you could create a topmost widget (let's call it Shade) with Qt::FramelessWindowHint (and possibly Qt::WA_TranslucentBackground). You can control the Shade's appearance by reimplementing paintEvent: for example, draw the content of the original widget, or draw some transparent preview, etc.
Then you can resize the Shade during dragging, to show the user that the widget will be detached. You will not lose mouse capture.
When the user releases the mouse, remember the position of the Shade, destroy it, and detach and move the original widget.
Feel free to ask if you want more details.
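A rough sketch of the Shade idea (untested; the class name and the use of QWidget::grab() for the preview are my assumptions, not from the original answer):
#include <QWidget>
#include <QPainter>
#include <QPixmap>

class Shade : public QWidget {
public:
    explicit Shade(const QPixmap& preview)
        : QWidget(nullptr, Qt::FramelessWindowHint | Qt::WindowStaysOnTopHint),
          m_preview(preview)
    {
        setAttribute(Qt::WA_TranslucentBackground);
        resize(preview.size());
    }
protected:
    void paintEvent(QPaintEvent*) override {
        QPainter p(this);
        p.setOpacity(0.6);                // draw a transparent preview of the dragged widget
        p.drawPixmap(rect(), m_preview);
    }
private:
    QPixmap m_preview;
};

// In Tab::mouseMoveEvent, once the drag distance exceeds the threshold:
//   m_shade = new Shade(grab());                // grab() renders the tab into a pixmap
//   m_shade->move(mapToGlobal(QPoint(0, 0)));
//   m_shade->show();
// The Tab keeps the mouse grab, so move/release events keep arriving.
// In Tab::mouseReleaseEvent: remember m_shade->pos(), delete the Shade,
// then reparent and move the real Tab to that position.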
Here is a similar question.
It suggests using QDockWidget and enforcing stacking of these widgets with tabifyDockWidget.
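For reference, a small sketch of the dock-widget approach (names are illustrative, assuming mainWindow is a QMainWindow*):
QDockWidget* first  = new QDockWidget("First", mainWindow);
QDockWidget* second = new QDockWidget("Second", mainWindow);
mainWindow->addDockWidget(Qt::RightDockWidgetArea, first);
mainWindow->addDockWidget(Qt::RightDockWidgetArea, second);
mainWindow->tabifyDockWidget(first, second); // the two docks now share a tab bar and can be dragged out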

Qt: Custom QGraphicsItem not showing when boundingRect() center is out of view

I'm making a diagram (flowchart) program, and for days I've been stuck with this issue:
I have a custom QGraphicsScene that expands horizontally whenever I place an item in its rightmost area. The problem is that my custom arrows (they inherit QGraphicsPathItem) disappear from the scene whenever their boundingRect() center is scrolled off the view. Every time the scene expands, both its sceneRect() and the view's sceneRect() are updated as well.
I've:
- set ui->graphicsView->setViewportUpdateMode(QGraphicsView::FullViewportUpdate)
- set the item flags QGraphicsItem::ItemIgnoresTransformations and QGraphicsItem::ItemSendsGeometryChanges, and called setActive(true) on the item as well
- called update(sceneRect()) every time I add an arrow to the scene
Still, every time I scroll the view, as soon as the arrow's boundingRect() center moves out of the view, the whole arrow disappears. If I scroll back and the boundingRect() center re-enters the view, the whole arrow appears again.
Can someone give me a tip about what I might be missing? I've been using Qt's example project diagramscene as a reference, so a lot of my code is similar (the "press item toolButton -> click on the scene" flow to insert items, the way the arrows are placed to connect the objects, ...).
In the meantime I'll try to put together a minimal running example that shows the issue.
Your Arrow object inherits from QGraphicsPathItem, which I expect also implements the QGraphicsItem::shape function.
Override the shape function in your Arrow class to return the shape of the item. This, along with the boundingRect, is used for collision detection and for detecting whether an item is on-screen.
In addition, before changing the shape of an item by changing its boundingRect, you need to call prepareGeometryChange.
As the docs state:
Prepares the item for a geometry change. Call this function before changing the bounding rect of an item to keep QGraphicsScene's index up to date.
So, in the Arrow class, store a QRectF called m_boundingRect, and in the constructor:
prepareGeometryChange();
m_boundingRect = QRectF(-x, -y, x*2, y*2);
Then return m_boundingRect in the boundingRect() function.
If this is still an issue, I expect it's something in QGraphicsPathItem that's causing the problem, in which case you can simply inherit from QGraphicsItem, store a QPainterPath with which you draw in the item's paint function, and also return the painter path in shape().
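To illustrate the advice above, a minimal sketch (untested; the member function name is illustrative) of an arrow item whose geometry stays in sync with its path and whose shape() has some thickness for hit and visibility tests:
#include <QGraphicsPathItem>
#include <QPainterPathStroker>

class Arrow : public QGraphicsPathItem {
public:
    void setEndpoints(const QPointF& from, const QPointF& to) {
        prepareGeometryChange();      // notify the scene before the bounding rect changes
        QPainterPath p(from);
        p.lineTo(to);
        setPath(p);                   // QGraphicsPathItem derives boundingRect() from the path
    }

    QPainterPath shape() const override {
        QPainterPathStroker stroker;  // widen the thin line so the item has some area
        stroker.setWidth(6);
        return stroker.createStroke(path());
    }
};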
You are making your life too complicated. Do not subclass QGraphicsPathItem; just use it directly and update its path every time the position of the anchors (from, to) changes.

Qt C++ Let multiple QGraphicsItem handle one mouse event

I've created an object Chartblock that implements QGraphicsItem. My goal is to create a grid of these objects, and when the mouse button is pressed (and held) and dragged over each block, perform something on each block as the cursor enters it.
Since a QGraphicsItem grabs the mouse when it is clicked, other items will not receive the mouseMoveEvent. I then created an object based on QGraphicsItemGroup to handle all the mouse events, but then I would need some way to pass mousePressEvent/mouseReleaseEvent as well as mouseMoveEvent to each child the cursor is over.
Am I overthinking how to do this? It seems like such a simple action shouldn't be that difficult to create, but with QGraphicsItems holding onto the mouse events for themselves, I'm not sure how to get around it. I've read about similar situations, but nothing seems to give a straightforward answer.
Edit: I suppose one way to do this would be to keep track of the coordinates/sizes of every single QGraphicsItem I create in an array, then get the position of the cursor in the group's mouseMoveEvent and see if there's a hit.
I was able to pull together a few similar answers to create a solution; I dropped the idea of placing all my QGraphicsItems in a group and placed them directly on a scene. With the scene grabbing all mouse events, I have the mouseMoveEvent check whether the current position is on top of a QGraphicsItem, and if so, perform something.
I still need to get itemAt() working for my own classes that implement QGraphicsItem, as itemAt() returns only QGraphicsItem pointers, but I'm sure some cast will get it working.
void ChartScene::mouseMoveEvent(QGraphicsSceneMouseEvent *event)
{
    QPointF mousePosition = event->scenePos();
    // In Qt 5, QGraphicsScene::itemAt() also takes the device transform.
    QGraphicsItem* pItem = itemAt(mousePosition, QTransform());
    if (pItem) {
        // the cursor is over an item; perform something on it
    }
}
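For the cast mentioned above, qgraphicsitem_cast is one option; a sketch, assuming Chartblock declares a custom type():
// In Chartblock:
//   enum { Type = QGraphicsItem::UserType + 1 };
//   int type() const override { return Type; }
if (Chartblock* block = qgraphicsitem_cast<Chartblock*>(pItem)) {
    // the cursor is over a block; perform something on it
}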

QGraphicsItem doesn't receive mouse hover events

I have a class derived from QGraphicsView, which contains QGraphicsItem-derived elements. I want these elements to change color whenever the mouse cursor hovers over them, so I implemented hoverEnterEvent (and hoverLeaveEvent):
void MyGraphicsItem::hoverEnterEvent(QGraphicsSceneHoverEvent* event)
{
    update(boundingRect());
}
However, this event handler code is never executed. I've explicitly enabled mouse tracking:
MyGraphicsView::MyGraphicsView(MainView *parent) :
    QGraphicsView(parent)
{
    setMouseTracking(true);
    viewport()->setMouseTracking(true);
    ...
}
Still, no luck. What am I doing wrong?
Fixed it. I needed to call setAcceptHoverEvents(true) in the constructor of my QGraphicsItem-derived class.
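For completeness, a minimal sketch of that constructor, assuming MyGraphicsItem derives directly from QGraphicsItem:
MyGraphicsItem::MyGraphicsItem(QGraphicsItem* parent)
    : QGraphicsItem(parent)
{
    setAcceptHoverEvents(true); // without this, hoverEnterEvent/hoverLeaveEvent are never delivered
}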
In my case, hover events wouldn't work if I overrode mouseMoveEvent in my implementation of the QGraphicsView class. I fixed this by adding a call to
QGraphicsView::mouseMoveEvent(event);
which forwards the event to the base class, which in turn passes it on to the scene and its items.

Qt -- pass events to multiple objects?

I basically have 3 layers (Window > Scene > View) that each need to handle a mouseMove event without blocking the others. It seems only the youngest child is getting the event though. I was hoping I could process the event and then call event->ignore() to pass the event back up the stack, but it doesn't seem to be working.
Some relevant code if you need it:
void EditorWindow::createScene() {
    m_scene = new EditorScene(this);
    m_view = new EditorView(m_scene);
    // ...
}
void EditorScene::mouseMoveEvent(QGraphicsSceneMouseEvent* mouseEvent) {
    printf("B\n");
    // ...
}
void EditorView::mouseMoveEvent(QMouseEvent* event) {
    printf("C\n");
    event->ignore();
}
Only "C" is being printed. Note that EditorScene and EditorView receive different types of mouse events so it's not completely trivial to pass them around.
The EditorWindow also needs the mouse coordinates; currently I'm sending a signal from one of the children which is caught by the window... but it shouldn't really be necessary to relay it that way, should it?
Found this nice article. Calling ignore() tells Qt to find another receiver. That sounds like it should work, but perhaps it means an unrelated receiver. The proper way to propagate the event is actually to call the base class's handler, like so:
void EditorView::mouseMoveEvent(QMouseEvent* event) {
    QGraphicsView::mouseMoveEvent(event); // propagate to parent widget
    printf("C\n");
}
Now it's printing BCBCBC..., which is great, but I can't seem to nudge it up one more level...
Another edit: It was being propagated up properly; I just didn't have setMouseTracking enabled.
QGraphicsView::mouseMoveEvent(event);
Doesn't propagate up to the parent -- it actually propagates down to the scene.
Here is what happens: QGraphicsView receives the QMouseEvent, translates it into a QGraphicsSceneMouseEvent and passes it to the scene. The scene then passes it to the appropriate item or, in your case, prints "B". The event handler then returns to EditorView and prints "C".
Then, if you explicitly ignore the event (mouse move is accepted by default), Qt's event handling will pass the event on to the parent of EditorView. So try ignoring after you print "C".
Another thing about mouse move is this:
If mouse tracking is switched off, mouse move events only occur if a mouse button is pressed while the mouse is being moved. If mouse tracking is switched on, mouse move events occur even if no mouse button is pressed.
So make sure you have tracking enabled on the parent of EditorView (or that you press buttons :)).
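Putting that together, a sketch of the handler with the explicit ignore (untested):
void EditorView::mouseMoveEvent(QMouseEvent* event) {
    QGraphicsView::mouseMoveEvent(event); // goes down to the scene, which prints "B"
    printf("C\n");
    event->ignore();                      // now let Qt pass the event on to EditorView's parent
}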
EDIT:
BTW, EditorScene is not a parent of EditorView. Well, it is in your code, but only in the QObject sense of parenthood (memory management only).
QGraphicsScene and QGraphicsView don't have a normal parent-child relationship -- a scene can have multiple views, and those views can be children of unrelated parents.
For window event propagation purposes you must have a QWidget-based parent. In fact, I'm pretty sure you reparent EditorView to EditorWindow, or to one of its children, when you add it to a layout.
INSTAEDIT:
For the coordinates, you want the view itself to emit a signal, both for decoupling reasons and because you probably want to show the local coordinates of the view, not those of the parent window and not screen coordinates (right?). If you actually want scene coordinates, the view is the right choice too, because it knows the transformation matrix.
Coordinates go like this:
Screen -> EditorWindow local -> EditorView local -> Scene transformed -> whatever item local transformed.
Calling QGraphicsView::mousePressEvent(e) in my mousePressEvent did the trick!