How to force Qt to update the widget under the mouse? - c++

On Windows 7 to be specific, while I don't think it matters.
We have all seen this issue in countless desktop applications, especially games that tend not to use OS-supplied controls: when the screen changes programmatically under a motionless mouse cursor (as opposed to user moving the cursor to a new widget), they go out of sync. Either the cursor does not change or the widget is not painted as it should be with the cursor inside it - obviously the widget's mouse enter event is not triggered. If you shake the mouse a bit without even leaving the widget, the thing fixes itself.
Sadly, Qt 5.7 shares this widespread problem. The first solution to come to mind is to move the mouse programmatically to (0, 0) and back by Windows means. However, it's not cross-platform(ish). Any better ideas?

I don't know whether your question is still relevant.
You can override the underMouse() method with the following code (note that QWidget::underMouse() is not virtual, so this only affects calls made through your own class):
bool MyWidget::underMouse()
{
    return rect().contains(mapFromGlobal(QCursor::pos()));
}
When a splitter resizes a panel, moveEvent() or resizeEvent() will definitely be triggered on the other panels. All you need to do is check whether the widget is under the mouse and then invoke enterEvent() manually, as in the sketch below.
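A minimal sketch of this idea, assuming Qt 5 (where enterEvent() still takes a plain QEvent*); the class name MyPanel and the resizeEvent() trigger are illustrative, not part of the original answer:

#include <QCursor>
#include <QEvent>
#include <QResizeEvent>
#include <QWidget>

class MyPanel : public QWidget
{
public:
    using QWidget::QWidget;

protected:
    void resizeEvent(QResizeEvent *event) override
    {
        QWidget::resizeEvent(event);
        // The splitter has just resized us; the cursor may now be inside
        // our geometry even though it has not moved.
        if (rect().contains(mapFromGlobal(QCursor::pos()))) {
            QEvent enter(QEvent::Enter);
            enterEvent(&enter);   // run the hover handling manually
        }
    }
};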

I found two solutions.
The first one is a simple hack: call widget->hide() and then immediately widget->show(). This will re-evaluate and update the widget's visual state depending on whether it is under the mouse cursor or not, even if the cursor has not moved. But I would not recommend this solution because it might have some unwanted side effects, though I have not encountered any yet.
The second solution is better because it does not look like a hack and it probably does not have any side effects:
widget->setAttribute(Qt::WA_UnderMouse, qApp->widgetAt(QCursor::pos()) == widget);
widget->update();
The code is assumed to be called from the widget's parent, but you can adjust it and call it from any other place. Note: it is better to use QApplication::widgetAt() than widget->rect().contains(), which is suggested in another answer, because the latter gives false positives for widgets that are overlaid by other widgets.
What is actually complicated is finding the place in the code from which to call this, because there can be many sources of the widget's motion: moving within its parent, moving of the parent(s), resizing, resizing of the parent(s), scrolling, etc. This is probably why it would be too complicated to implement in the standard Qt widget library; my guess is it would also be a performance killer in some scenarios.
Just to show my usage: I am moving a whole container widget which contains many child widgets. Subsequently one of the child widgets may get under the cursor after the container is moved. So I call:
containerWidget->move(dx, dy); // this moves the container
for (QWidget *child : containerWidget->findChildren<QWidget*>())
{
    child->setAttribute(Qt::WA_UnderMouse, qApp->widgetAt(QCursor::pos()) == child);
    child->update();
}

Related

QGraphicsScene mouseMoveEvent does not work until QGraphicsView wheelEvent

I have a strange issue whose cause I have been unable to determine. Basically, I created a 2D view with pan and zoom functionality and a scene with items that can be moved with grid snapping. To move an item in the scene, I extended Scene::mousePressEvent to get a pointer to the item and Scene::mouseMoveEvent to keep the item tracked on the cursor. To drop the item, I used Scene::mousePressEvent again. To pan, I extended View::mousePressEvent, View::mouseReleaseEvent, and View::mouseMoveEvent, and to zoom I extended View::wheelEvent.
Now for the symptoms:
I start the application with an item in the Scene. If I click and hold, then move the mouse, the item moves as intended. As soon as I release the mouse button, the item stops moving. I can click to drop and the item is placed according to the drop code in Scene::mousePressEvent. Try again and still the item only moves when the mouse button is pressed.
Then comes the strange part: If I use the mouse wheel to zoom the View, everything performs as expected after that event. The mouse is clicked to select an item, it moves as I move the mouse and drops when I click again.
So the obvious solution:
wheelEvent(new QWheelEvent(QPointF(0,0),0,Qt::NoButton,Qt::NoModifier));
called at the creation of the View, and everything works fine. It calls the extended View::wheelEvent with no change to the view and before the scene is even created, but afterwards the program behaves as expected.
So I'm here to see if any of the excellent Qt experts out there can explain this strange behavior. Any comments or direction are appreciated.
In case it helps, here is the View::wheelEvent override code. tform is a QTransform with which I maintain zoom. Also, I've tried with and without the call to the base method but there is no change in behavior.
void SchematicView::wheelEvent(QWheelEvent* event)
{
    // Scale the view / do the zoom
    double scaleFactor = 1.1;
    if (event->delta() > 0 && tform.m11() < max_zoom) {
        tform.scale(scaleFactor, scaleFactor);
    } else if (event->delta() < 0 && tform.m11() > min_zoom) {
        tform.scale(1.0 / scaleFactor, 1.0 / scaleFactor);
    }
    setTransformationAnchor(QGraphicsView::AnchorUnderMouse);
    setTransform(tform);
    QGraphicsView::wheelEvent(event);
}
Without a SSCCE to look at and test with, it's hard to say for sure, but what you're describing sounds a lot like your mouseMoveEvent() callback is only getting called when the mouse button is being held down during the move. That, in turn, sounds a lot like the expected behavior for mouseMoveEvent(), as documented in QWidget::mouseMoveEvent():
If mouse tracking is switched off, mouse move events only occur if a mouse button is pressed while the mouse is being moved. If mouse tracking is switched on, mouse move events occur even if no mouse button is pressed.
If that is indeed the problem, then a call to setMouseTracking(true) may get you the behavior you are looking for.
On a broader level, note that there are easier ways of obtaining the behavior you are trying to implement -- for example, to allow the user to drag and drop items in a QGraphicsScene, all you really have to do is call setFlags(QGraphicsItem::ItemIsMovable) on any QGraphicsItems that you want the user to be able to drag around. No hand-coding of event handlers is necessary unless you are trying to obtain some non-standard behaviors.
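A minimal, hedged sketch of both suggestions combined; the scene contents and the simplified setup here are illustrative and not the asker's actual SchematicView code:

#include <QApplication>
#include <QGraphicsRectItem>
#include <QGraphicsScene>
#include <QGraphicsView>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QGraphicsScene scene(0, 0, 400, 300);

    // Built-in drag support: no hand-coded mouse handlers required.
    QGraphicsRectItem *item = scene.addRect(0, 0, 40, 40);
    item->setFlag(QGraphicsItem::ItemIsMovable);

    QGraphicsView view(&scene);
    // Deliver mouseMoveEvent() even when no button is pressed; in a
    // QGraphicsView the viewport widget is the one that actually receives
    // mouse events, so enable tracking there as well to be safe.
    view.setMouseTracking(true);
    view.viewport()->setMouseTracking(true);
    view.show();

    return app.exec();
}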

Is it possible to disable the light-blue mouse over highlighting on a QTreeWidgetItem?

I have a QTreeWidget and need to disable the mouse-over highlighting on the child items but not the click selection. The point here is that I need to set this per item because some are selectable. I was thinking about using the QTreeWidget::itemEntered signal to check whether the item should be highlighted or not, but I can't get it to work because the description says
QTreeWidget mouse tracking needs to be enabled for this feature to work.
and I can't figure out how.
So my questions are: How can I enable mouse tracking?
Is there an easier way to disable the highlighting?
Simply invoke setMouseTracking() to enable mouse tracking for a specific widget.
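A small sketch of that suggestion, assuming the goal is to receive QTreeWidget::itemEntered while the mouse merely hovers; the item texts and the lambda body are illustrative:

#include <QApplication>
#include <QDebug>
#include <QStringList>
#include <QTreeWidget>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QTreeWidget tree;
    tree.setColumnCount(1);
    for (int i = 0; i < 5; ++i)
        tree.addTopLevelItem(new QTreeWidgetItem(QStringList() << QString("Item %1").arg(i)));

    // Without this, itemEntered only fires while a mouse button is held down.
    tree.setMouseTracking(true);

    QObject::connect(&tree, &QTreeWidget::itemEntered,
                     [](QTreeWidgetItem *item, int column) {
        qDebug() << "hovered" << item->text(column);
    });

    tree.show();
    return app.exec();
}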
I ran into this problem (I know this is an old post, but I might as well post my solution, since it can be useful for others).
I could not properly disable the mouse feedback while keeping the mouse tracking enabled, but I could make this feedback invisible. I'm using qss stylesheets, and I set the mouse-hover feedback color to transparent:
MyTreeWidget::item:hover {
    background-color: transparent;
}
It did the trick for me. Sadly, it makes the feedback invisible all the time, rather than allowing me to turn it off and on.
So as a next step, for when I needed it, I implemented my own feedback by using a delegate and overriding the paint function.
The QTreeView overrides mouseMoveEvent and sends the mouse coordinates to the delegate. This way, the delegate can adapt what it does in paint() to this position. It feels pretty heavy, and a bit dirty, but it works. A delegate should also allow different behavior for different items.
PS: If you're using a delegate, in most cases that should be enough without the qss change. In my case it wasn't, because I call QStyledItemDelegate::paint in my overridden paint method, so I inherited some unwanted behavior.
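For completeness, a hedged sketch of one possible delegate along these lines: it simply strips the hover flag from the style option for items that opt out, before calling the base implementation. The custom data role used to decide per item is purely illustrative, not the answerer's actual rule:

#include <QPainter>
#include <QStyle>
#include <QStyledItemDelegate>
#include <QStyleOptionViewItem>

class NoHoverDelegate : public QStyledItemDelegate
{
public:
    using QStyledItemDelegate::QStyledItemDelegate;

    void paint(QPainter *painter, const QStyleOptionViewItem &option,
               const QModelIndex &index) const override
    {
        QStyleOptionViewItem opt(option);
        // Hypothetical per-item rule: items that set this role opt out of hover.
        if (index.data(Qt::UserRole + 1).toBool())
            opt.state &= ~QStyle::State_MouseOver;   // drop the hover highlight
        QStyledItemDelegate::paint(painter, opt, index);
    }
};

// Usage (illustrative): treeWidget->setItemDelegate(new NoHoverDelegate(treeWidget));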

Qt MouseMoveEvent only triggers with a mouse button press

I have an odd problem here.
I'm working on an application, and within one of my classes I'm monitoring my mouse events.
The weird thing is, my mouse move event will only get called if any mouse button is pressed.
I'm not even filtering for any button presses within the method; the method itself doesn't even get called unless I click on this object itself (the one that's monitoring it).
What generally causes this type of error to happen?
I'm not sure if it's relevant, but I have 2 different things monitoring my mouse inputs: 1) the main program monitoring the global mouse coordinates, and 2) an object within my program monitoring the mouse coordinates within itself.
Edit
So the problem has to be that the mouse move event is normally used when people are dragging the cursor along the screen, right?
My reason for not needing it like that is because I'm building a custom context menu of sorts, and I need to know when an item is hovered over.
It turns out that I didn't truly set everything within my class to enable mouse tracking.
I somehow thought that if the class itself had it enabled, I wouldn't need to set it on all the sub-objects, but now I see how that wouldn't make any sense at all.
So just to clarify my solution:
The items that needed to track my cursor's position needed to have
setMouseTracking(true);
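A small sketch of that fix, assuming tracking has to be enabled on the container and on every descendant, since each widget has its own mouseTracking property; the helper name is made up:

#include <QWidget>

static void enableMouseTrackingRecursively(QWidget *root)
{
    root->setMouseTracking(true);
    // findChildren() is recursive by default, so this covers all sub-objects.
    const auto children = root->findChildren<QWidget *>();
    for (QWidget *child : children)
        child->setMouseTracking(true);
}

// Usage (illustrative): enableMouseTrackingRecursively(myContextMenuWidget);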

Resizing a gtk2 widget with the mouse inside an area (window) with scroll?

I wanted to develop a widget container (still on Python 2.7 and gtk2), which would be placed in a scrolled window and could be freely moved and resized in the window, such that: click & drag within the widget would move it inside the window; and when the widget's corners/edges are visible in the window, it would expose drag handles for resizing - otherwise, if it is bigger than the window area, it would scale up and down (zoom in and out) on middle-click.
Of course, I want to keep the amount of custom coding of this widget to a minimum, so I thought I'd look into what's available in gtk2 first. It turns out the only element exposing resize drag handles is gtk.Window - and at that, only if it is a main (or root) window; otherwise, if a window is placed inside a widget, its size is apparently set by the widget, and so there are no drag handles (nor menus, titlebars, etc.). I was wondering why this is - and it seems it is due to multiple document interface (MDI) being considered evil by gtk developers, see e.g. Re: [gtk-list] Resizing widgets with a mouse or Does GTK support MDI? - Linux/BSD whirlpool.net.au.
Just to demonstrate the behavior that I want, I used PyQt4 code from Python PyQt/PySide QMdiArea subwindows scroll not working in TabbedView - Stack Overflow, since, as it turns out, Qt does have an MDI area. So here's the gist of it - if a corner is visible, a resize drag handle appears, and a resize drag action can be started:
When you then drag the corner outside of the window, the scrollbars automatically indicate the new size/position of the inner widget (note the window also got moved a bit in the screenshot below; that was manual and unintended):
Again, I don't really need a window (as in titlebars, menus) - just a widget container that would behave in this way, so I could put e.g. a table (e.g. TreeView) or an image in it, as the situation demands - and at least not worry about recalculating the "outer" scrollbars (naturally, I'd expect I'd have to code the rest of my custom behavior myself). Also, I just need a single widget placed in a window like that for now (so no "multiple document"s).
While Qt seems to offer this in a way, I don't currently have the option of getting into it deeply enough to do something like this; and the same goes for WxWindows (see e.g. wxPython-users - How to resize Widgets? - possible, but as there is no code there, I cannot see whether geometry calculation coding is required or not).
So I was wondering - is there a widget I may have missed, that would implement the above behavior, and that I could take as a base for customization? If not, what options do I have to implement something like the above on gtk2 (eventually with Python)?
I'm not sure this would work, but I suggest looking into GtkOffscreenWindow; put the inner widget into that, and render it to a GtkDrawingArea inside a GtkScrolledWindow. This would probably make the scrollbars behave properly depending on the size of the drawing area.
What you won't get:
window titlebars, you'll have to render those yourself because GTK doesn't know about them, they're part of the window manager. (Note, the inner window in the Qt example has a different titlebar than the outer window - I suspect this is the same thing.)
drag handles to resize the window, you'll have to code those yourself, as you expected.
You might also want to look at how the Glade GUI designer does this.

mouse over transparency in Qt

I am trying to create an app in Qt/C++ with Qt 4.5 and want any active window to change opacity on a mouseover event...
As I understand it, there is no explicit mouseover event in Qt.
However, I got rudimentary functioning by reimplementing QWidget's mouseMoveEvent() in the class that declares my main window. But the main window's mouseMoveEvent is not called whenever the mouse travels over any of the group boxes I have created in there (understandably, since QGroupBox has its own reimplementation of mouseMoveEvent).
So as a cheap workaround, I am still using the mouseMoveEvent of my main window, but I query the global mouse position, and based on the (x, y) position of the main window (obtained through ->pos()) and the window size (->size() with rHeight and rWidth), I check whether the mouse is within the bounds of the main window's area and change the opacity accordingly.
This has had very limited success. The right border works fine, but the left changes opacity 4 pixels early. The top does not work (presumably because the mouse goes through the menubar and title bar) and the bottom changes way too early.
I thought of creating an empty container QWidget class and then placing all the rest in there, but I felt that it would still not solve the big issue of the base widget not receiving the mouseMoveEvent if it has already been implemented in a child widget.
Please suggest any corrections/errors I have made in my method or any alternate methods to achieve this.
p.s. I doubt this matters, but I am working in the Qt Creator IDE, not the Qt integration in VS2008 (it's the same classes anyway - different compiler though, mingw)
Installing event filters for each of your child widgets might do the trick. This will allow your main window to receive child events such as the ones from your group boxes. You can find example code here.
You may be interested in event filters. QObject provides a way to intercept all events zipping around your application.
http://doc.trolltech.com/4.5/eventsandfilters.html#event-filters
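A hedged sketch of the event-filter idea from the two answers above: the main window installs itself as a filter on its child widgets so that mouse moves over a group box still reach the window's own opacity logic. Class name, opacity values, and the watchChildren() helper are illustrative, not from the answers:

#include <QEvent>
#include <QMainWindow>
#include <QMouseEvent>

class FadingWindow : public QMainWindow
{
public:
    explicit FadingWindow(QWidget *parent = nullptr) : QMainWindow(parent)
    {
        setMouseTracking(true);
    }

    // Call once after the UI has been built.
    void watchChildren()
    {
        const auto children = findChildren<QWidget *>();
        for (QWidget *child : children) {
            child->setMouseTracking(true);
            child->installEventFilter(this);
        }
    }

protected:
    bool eventFilter(QObject *watched, QEvent *event) override
    {
        // Mouse movement over any child counts as "mouse over the window".
        if (event->type() == QEvent::MouseMove)
            setWindowOpacity(0.5);
        return QMainWindow::eventFilter(watched, event);
    }

    void mouseMoveEvent(QMouseEvent *event) override
    {
        setWindowOpacity(0.5);                 // mouse over the window itself
        QMainWindow::mouseMoveEvent(event);
    }

    void leaveEvent(QEvent *event) override
    {
        setWindowOpacity(1.0);                 // cursor left the window entirely
        QMainWindow::leaveEvent(event);
    }
};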
If I understand what you are attempting to do, I would reimplement the widget's enterEvent() and leaveEvent(). The mouse enter event would trigger the fade-in and the leaveEvent would trigger the fade-out.
EDIT: After re-reading several times, I'm still not sure what you are trying to accomplish. Sorry if my suggestion doesn't help. :-)
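A compact sketch of the enterEvent()/leaveEvent() suggestion above (Qt 4/5 signatures), assuming the desired effect is simply "fade while the cursor is anywhere over the window"; hovering child widgets such as group boxes should not end the effect, since the window itself only receives Leave when the cursor exits it entirely. The class name and opacity values are illustrative:

#include <QEvent>
#include <QMainWindow>

class TranslucentOnHoverWindow : public QMainWindow
{
public:
    explicit TranslucentOnHoverWindow(QWidget *parent = nullptr)
        : QMainWindow(parent) {}

protected:
    void enterEvent(QEvent *event) override
    {
        setWindowOpacity(0.5);              // fade when the mouse comes in
        QMainWindow::enterEvent(event);
    }

    void leaveEvent(QEvent *event) override
    {
        setWindowOpacity(1.0);              // restore when the mouse leaves
        QMainWindow::leaveEvent(event);
    }
};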