How can I selectively make QWidget accept focus from a mouse click? - c++

I'm creating an application that provides a visual representation of nodes and lines connecting them together. Both nodes and lines are represented by custom QWidgets, WidgetNode and WidgetLine, say.
I've implemented WidgetLine as a transparent widget that is large enough to contain the start and end point of the line, and a custom function to draw the line itself.
I would like it so that if the user clicks on or right next to the line then the WidgetLine receives focus, but if they click further away from the line (but still over the rectangular area covered by WidgetLine's geometry) then the click is completely ignored by WidgetLine and passed on to the widget below.
I first tried doing this with a custom focusInEvent() function on WidgetLine, but found the mouse clicks weren't propagating below. I then tried setting the focus policy to Qt::NoFocus and using a custom mousePressEvent() that calls setFocus() to set focus manually when appropriate, but the mouse events are still not propagated to the widgets below, even when I call ignore() on them.
Finally, I've tried installing an event filter to reject mouse events, with this event filter function
bool WidgetLineFilter::eventFilter(QObject* object, QEvent* event)
{
    assert(object == mCord);
    if (event->type() == QEvent::MouseButtonPress)
    {
        QMouseEvent* e = dynamic_cast<QMouseEvent*>(event);
        assert(e);
        if (e)
        {
            QPoint mouseRelativeToParent = mCord->mapToParent(e->pos());
            // calculate distance of mouse click to patch cord
            QLineF line(mCord->line());
            float distance = distanceFromPointToLine(QVector2D(line.p1()), QVector2D(line.p2()), QVector2D(mouseRelativeToParent));
            qDebug() << distance;
            const float distanceThreshold = 2.f;
            if (distance < distanceThreshold)
            {
                qDebug() << "consuming mouse click for focus";
                mCord->setFocus(Qt::MouseFocusReason);
                return true;
            }
            else
            {
                qDebug() << "mousepressevent too far for focus";
                return QObject::eventFilter(object, event);
            }
        }
    }
    return false;
}
But this still does not propagate the mouse events to the parent in the "mousepressevent too far for focus" case. I've also tried returning both false and true from here, and calling ignore() on e, but the widgets below are still not receiving the click.
(NB the above approaches do work in the sense that WidgetLine only gets focus at the right time; it's just that the widgets below aren't receiving the press events when it doesn't take focus.)
Any ideas on how to fix this?
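(Aside: distanceFromPointToLine isn't shown above; it's a standard point-to-segment distance. A minimal sketch of one possible implementation, in case it helps reproduce the setup:)
#include <QVector2D>
#include <QtGlobal>

// Hypothetical implementation of the helper used in the filter above:
// perpendicular distance from point p to segment a-b, clamped to the ends.
float distanceFromPointToLine(const QVector2D& a, const QVector2D& b, const QVector2D& p)
{
    const QVector2D ab = b - a;
    const float lengthSquared = ab.lengthSquared();
    if (lengthSquared == 0.f)
        return (p - a).length();     // degenerate segment: a == b
    float t = QVector2D::dotProduct(p - a, ab) / lengthSquared;
    t = qBound(0.f, t, 1.f);         // clamp the projection onto the segment
    return (p - (a + t * ab)).length();
}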

Store the mouse position in a global variable. Give all of your widgets enter/leave event handlers like the following, and use those to check which widget the cursor is in or near when you run the function.
void QGLWidget::enterEvent(QEvent *)
{
    setFocus();
}

void QGLWidget::leaveEvent(QEvent *)
{
    clearFocus();
}

In the end, I created an event filter to selectively intercept mouse events and installed it on the base window and, recursively, on every child widget of the base window (installing it on new child widgets as they are created). The filter passes each mouse press event to the base window, which iterates through the WidgetLines, testing whether each should be selected by the press and setting focus on it if so. If every test fails, the filter releases the event; otherwise it consumes it.
WidgetLines themselves are made transparent to mouse events with
setAttribute(Qt::WA_TransparentForMouseEvents);
It's messier than it ought to be, but it does the trick.
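A minimal sketch of the recursive installation (hypothetical helper; the original code isn't shown):
#include <QWidget>

// Install one filter instance on a widget and all of its current children.
// QEvent::ChildAdded can be handled in the same filter to cover widgets
// that are created later.
void installFilterRecursively(QWidget* root, QObject* filter)
{
    root->installEventFilter(filter);
    // findChildren() is recursive by default, so this covers grandchildren too
    const auto children = root->findChildren<QWidget*>();
    for (QWidget* child : children)
        child->installEventFilter(filter);
}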

Related

How to simulate a mouse click with touch data?

I'm working on a Qt5 application to attempt to make use of raw input data from a touchscreen because touch is not fully supported by my Linux kernel (2.6.32). I wrote a parser for the raw data coming from /dev/input/eventX because even though X doesn't support touch, the events are still there.
I'm able to read the events just fine, and wrote a wrapper called "TouchPoint" which contains the ID and (x,y) coordinates of each touch point, as well as a boolean indicating whether or not the touch point is currently "active" (ie: the user currently has that finger on the screen). It's worth noting that I also output the number of points of touch currently on the screen, and the value is accurate.
My issue is that I can't seem to figure out how to accurately simulate a mouse click with it. With multi-touch events in Linux, each touch point is assigned a "tracking ID" when the user presses a finger to the screen, and when that point is lifted, an event setting that slot's tracking ID to -1 is generated. I use this change of ID from some value to -1 to indicate that a touch point is "down" or not:
void TouchPoint::setID(int nid) {
    if((id == -1) && (nid >= 0)) down = true;
    else if(nid == -1) down = false;
    id = nid;
}
So to try and simulate a mouse click, I do this when a tracking ID event is read:
else if(event.code == ABS_MT_TRACKING_ID) {
    bool before = touchPoints[cSlot].isDown(); // cSlot is the slot the events refer to
    touchPoints[cSlot].setID(event.value);
    bool after = touchPoints[cSlot].isDown();
    if(!before && after) touch(touchPoints[cSlot]);
}
And then the touch method does this:
void MainWindow::touch(TouchPoint tp) {
    if(ui->touchMe->rect().contains(QPoint(tp.getX(), tp.getY()))) {
        on_touchMe_clicked();
    }
}
The button does not respond when I touch it directly, but sometimes, if I flail my fingers wildly around the screen, the message box that should appear on a press does appear; when it does, my fingers are always somewhere else on the screen.
Can anyone think of something I might be doing wrong here? Is my method of checking to see if the touch point is within the button's bounds wrong? Or is my logic for keeping track of the "down" status of the touch points wrong? Or something else entirely?
UPDATE: I just noticed two things by outputting the screen coordinates of both the touch location and the button location.
The digitizer resolution is larger than the screen resolution, so I needed to scale the X and Y coordinates coming from the raw events.
My coordinates are offset because they are screen coordinates.
UPDATE 2: I've now dealt with the two issues mentioned in my last update, but I'm still not able to figure out how to accurately simulate a touch/mouse event to be able to interact with the interface. I also modified my TouchPoint class to have a boolean flag indicating whether or not the touch point was just updated, which is set to true when the tracking ID is updated and reset when an EV_SYN event is raised. This is so I can get the X and Y position events before creating the event for the application. I tried the following:
Using the QApplication class to post a mouse event to the QApplication::desktop()->screen() widget that has the position of the touch point.
Using the QTest class to raise a touchEvent to the QApplication::desktop()->screen() widget using the press and release methods for the given slot and position of the touch point and then using the bool event(QEvent*) method to try to catch the event. I also enabled the WA_AcceptTouchEvents attribute on the main window.
Neither of these works. For some reason, when I have the "bool event(QEvent*)" method in place, the signal emitted from the thread reading /dev/input/eventX doesn't trigger the slot in the main window class, and I can't seem to find any other method to accomplish simulating the events. Anyone have any ideas?
You wrote that you sometimes get your message box "clicked" when your fingers are in a "wrong position", so it sounds like you are mixing global screen coordinates and widget-local coordinates. In MainWindow::touch you check whether a point in global screen coordinates, e.g. (530, 815), is inside a widget's geometry in its local coordinates. QWidget::rect() returns the internal geometry of the button (widget), i.e. a rectangle of the widget's width and height, e.g. (0, 0, 60, 100).
You have to move the rect to the right position in global screen coordinates. QWidget::pos() returns the widget's position relative to its parent; QWidget::mapToGlobal translates a widget-local position to global screen coordinates. Then your rect is something like (500, 850, 60, 100), and you should get a hit and have your slot called.
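A minimal sketch of that corrected check (assuming tp.getX()/getY() are already scaled to screen coordinates):
// Translate the button's rect into global screen coordinates before testing.
void MainWindow::touch(TouchPoint tp) {
    const QPoint screenPos(tp.getX(), tp.getY());
    const QRect globalRect(ui->touchMe->mapToGlobal(QPoint(0, 0)),
                           ui->touchMe->size());
    if (globalRect.contains(screenPos))
        on_touchMe_clicked();
}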
However, a better approach is to use QApplication::widgetAt to get the widget at a specific screen position and generate a mouse click for it.
You can generate a mouse click for a widget by posting a mouse press and release to it, something like below:
QPoint screenPos = QPoint(tp.getX(), tp.getY());
QWidget *targetWidget = QApplication::widgetAt(screenPos);
if (targetWidget) // widgetAt returns nullptr if no widget is at that position
{
    QPoint localPos = targetWidget->mapFromGlobal(screenPos);
    // postEvent takes ownership of the heap-allocated events
    QMouseEvent *eventPress = new QMouseEvent(QEvent::MouseButtonPress, localPos, screenPos, Qt::LeftButton, Qt::LeftButton, Qt::NoModifier);
    QApplication::postEvent(targetWidget, eventPress);
    QMouseEvent *eventRelease = new QMouseEvent(QEvent::MouseButtonRelease, localPos, screenPos, Qt::LeftButton, Qt::LeftButton, Qt::NoModifier);
    QApplication::postEvent(targetWidget, eventRelease);
}

catch mouse motion in gtkmm

I am trying to catch the mouse motion while I hold the middle mouse button. The goal is to implement a rotation feature in an STL viewer.
I found the event mask BUTTON2_MOTION_MASK, but I'm having a hard time figuring out which signal catches it.
Here are the two lines I use to create and hook up the event. They are inside a GtkApplicationWindow constructor.
glWidget.add_events(Gdk::BUTTON2_MOTION_MASK);
glWidget.signal_motion_notify_event().connect(sigc::mem_fun(*this,&mainWindow::rotate));
Here's the function I am trying to connect.
bool mainWindow::rotate(GdkEventMotion* motion_event)
{
    cout << "test" << endl;
    return true; // a bool-returning handler must return a value
}
Am I using the correct method? The code does not react when I hold the middle mouse button and move the mouse.
I managed to get the glArea widget to react to scrolling this way:
glWidget.add_events(Gdk::SMOOTH_SCROLL_MASK);
glWidget.signal_scroll_event().connect(sigc::mem_fun(*this, &mainWindow::zoom));
The function I connected:
bool mainWindow::zoom(GdkEventScroll *eventScroll)
{
    cout << "test" << endl;
    return true;
}
I figured it out. You need to add both Gdk::BUTTON1_MOTION_MASK and Gdk::BUTTON_PRESS_MASK:
glWidget.add_events(Gdk::BUTTON1_MOTION_MASK | Gdk::BUTTON_PRESS_MASK);
This catches the signal when the left mouse button is pressed with the pointer positioned over the widget.
BUTTON2_MOTION_MASK will require that button 2 is pressed; for some reason, that's the left mouse button (I want the middle button).
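If the middle button is what you want, the same pattern should work with the button-2 masks; a sketch (untested, assuming the same glWidget and rotate handler as above):
// Ask for motion-while-button-2 events plus plain button presses,
// then check the button state inside the handler.
glWidget.add_events(Gdk::BUTTON2_MOTION_MASK | Gdk::BUTTON_PRESS_MASK);
glWidget.signal_motion_notify_event().connect(sigc::mem_fun(*this, &mainWindow::rotate));

bool mainWindow::rotate(GdkEventMotion* motion_event)
{
    if (motion_event->state & GDK_BUTTON2_MASK) // middle button held
    {
        // update the rotation from motion_event->x / motion_event->y here
    }
    return true;
}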

Change QWidget Parent During Mouse Event

I'm trying to create a detachable type style widget, like in the way Chrome tabs are detachable (class is called Tab). I have everything working, except for a bug where sometimes (maybe 50% of the time), the Tab object never gets the mouse release event, and stops getting mouse move events.
Essentially, the detaching system works by allowing drags in the mouse press/move/release functions, just like normal. The mouseMoveEvent checks the total distance moved from the start, and if over a certain amount, will start the "detaching" process. The detaching process involves setting the parent widget to 0 (top level widget, undecorated window), so the Tab object is pretty much floating above everything, under the mouse, and continues to be dragged along with it until released.
I ran through all the QEvent items being delivered, and I found that when this issue occurs, the QEvent::MouseMove items (and all mouse events after this) are being sent to the TabBar (the Tab object's original parent). This occurs directly after calling setParent(0) on the Tab.
Basic mouse handling overview:
void Tab::mousePressEvent(*) {
    [set up some booleans, start positions, etc.]
}

void Tab::mouseMoveEvent(*) {
    [track the updated position]
    if (positionChange > STATIC_AMOUNT)
        detachTab();
}

void Tab::mouseReleaseEvent(*) {
    [return the Tab to its original position, and set the parent back to the TabBar]
}

void Tab::detachTab() {
    QPoint mappedPos = mapToGlobal(QPoint(0, 0));
    setParent(0); // The loss of MouseMove events occurs when this returns.
    move(mappedPos);
    show();
    raise();
}
Here are the events that the Tab object receives (listed by QEvent type):
[Tab::detachTab() started]
[setParent(0) started]
QEvent::Hide
QEvent::Leave
qApp QEvent::MouseMove [ TabBar ] <-- now the TabBar is soaking up the mouse events
QEvent::HideToParent
QEvent::ParentAboutToChange
QEvent::ParentChange
[setParent(0) returned]
....
Summed up: my draggable QWidget loses QEvent::MouseMove and QEvent::MouseButtonRelease events after having its parent set to 0.
Any advice would be really appreciated!
A slightly tricky workaround. I didn't test it, it's just an idea.
When the mouse hovers over the draggable part of a widget, you can create a topmost widget (let's call it Shade) with Qt::FramelessWindowHint (and possibly Qt::WA_TranslucentBackground). You can control the Shade's appearance by reimplementing paintEvent, for example drawing the content of the original widget, or some transparent preview, etc.
Then you can resize the Shade during the drag to show the user that the widget will be detached. You will not lose mouse capture.
When the user releases the mouse, you remember the position of the Shade, destroy it, and detach+move the original widget.
Feel free to ask if you want more details.
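A minimal sketch of the Shade idea (hypothetical class, not from the original answer):
#include <QPainter>
#include <QPixmap>
#include <QWidget>

// Frameless, translucent top-level widget that stands in for the Tab during
// the drag. Because the Tab itself never changes parent mid-drag, it never
// loses mouse capture.
class Shade : public QWidget
{
public:
    explicit Shade(const QPixmap& preview)
        : QWidget(nullptr, Qt::FramelessWindowHint | Qt::WindowStaysOnTopHint)
        , mPreview(preview)
    {
        setAttribute(Qt::WA_TranslucentBackground);
        resize(preview.size());
    }

protected:
    void paintEvent(QPaintEvent*) override
    {
        QPainter p(this);
        p.setOpacity(0.5); // semi-transparent preview of the dragged tab
        p.drawPixmap(0, 0, mPreview);
    }

private:
    QPixmap mPreview;
};
The Tab's mouseMoveEvent would move the Shade along with the cursor, and mouseReleaseEvent would record the Shade's position, delete it, and reparent the real Tab there.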
Here is a similar question.
So you are supposed to use QDockWidget and enforce stacking of these widgets using tabifyDockWidget.

Qt get mouse events outside of the application window

First, I'm not certain this is even possible without some sort of hacking of X11 input, but the discussions I'd seen online made me think it was possible.
Allow me to explain what I hope to do. I want a Qt application that will most likely just be a small window sitting at the side of the screen, sort of like a widget. The application does nothing until the user drags another application window over the top of it. The way I was hoping to detect this was to track the mouse: if the left button is down and the mouse is over the Qt window while Qt is not the active window, then do some action. However, I currently haven't been able to get mouse events when my Qt application is not the active window. I think some of the posts I linked refer to 'window' as a QWindow inside the QApp.
What I mean by window, however, is an X11 Window: any application opened in X. I hope my screenshots highlight my current plight. I've attached my code as well and am happy to take any suggestions. I would also appreciate being told of any other hacks known to help achieve this.
The red shows where my cursor has clicked, and the mouse event is recorded outside of the Qt window. This was triggered by the 'FocusOut' event, however, and is the last event I have managed to detect.
As we can see in the console, the mouse has moved but no events are caught. I really want to detect when the mouse crosses over onto the position of the Qt application window, regardless of whether it is on top of another window or not.
bool MainWindow::eventFilter(QObject *obj, QEvent *event)
{
    if (event->type() == QEvent::MouseMove)
    {
        QMouseEvent *mouseEvent = static_cast<QMouseEvent*>(event);
        statusBar()->showMessage(QString("Mouse move (%1,%2)").arg(mouseEvent->pos().x()).arg(mouseEvent->pos().y()));
        qDebug() << QString::number(mouseEvent->pos().x());
        qDebug() << QString::number(mouseEvent->pos().y());
    }
    if (event->type() == QEvent::FocusOut)
    {
        QFocusEvent *focusEvent = static_cast<QFocusEvent*>(event);
        focusEvent->accept();
        qDebug() << "event Filter Mouse Move111" << QCursor::pos();
    }
    return false;
}
void MainWindow::initWindow()
{
    // Makes the window frameless and always on top
    //setWindowFlags(Qt::FramelessWindowHint|Qt::WindowStaysOnTopHint);
    // Makes the window transparent
    //setAttribute(Qt::WA_TranslucentBackground);
    // Allows 'mouseMoved' events to be sent; not sure yet if this will be useful, I think we want mouseDragged
    setMouseTracking(true);
    grabMouse();
    // Set this up as an event filter for mouse events
    qApp->installEventFilter(this);
}
Alright, here's how I solved this problem. The event system in Qt (in any application, I assume) won't register events when the window is not active. However, the process is obviously still running, so data you can access while the window is active you can also access while it is not.
Use a timed poll to get the mouse position every n seconds:
// Method used to track the mouse regardless of whether it is inside the active window
void MainWindow::pollMouse(unsigned long sec)
{
    // Loop forever
    while ( true )
    {
        QPoint mouseLoc = QCursor::pos();
        qDebug() << "Mouse position global: x,y" << mouseLoc.x() << mouseLoc.y();
        QThread::sleep(sec);
    }
}
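Note that the busy loop above blocks whichever thread runs it. A sketch of the same poll driven by a QTimer on the GUI thread instead (my alternative, not from the original post; startMousePolling is a hypothetical name):
#include <QCursor>
#include <QDebug>
#include <QTimer>

// Poll QCursor::pos() once per second without blocking the event loop.
void MainWindow::startMousePolling()
{
    auto* timer = new QTimer(this);
    connect(timer, &QTimer::timeout, this, []() {
        const QPoint mouseLoc = QCursor::pos();
        qDebug() << "Mouse position global: x,y" << mouseLoc.x() << mouseLoc.y();
    });
    timer->start(1000); // interval in milliseconds
}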

C++/CLI GUI Event Handling

I am working on a C++/CLI application and am having some difficulty with events. I am wondering if I can get events to fire while the mouse button is held down. For example, I want to check whether the mouse has moved to the next square over, but only while the button is held: if the user clicks on square 1, they should be able to hold that click and move to square 2, and my program should recognize this.
I have tried a number of different mouse events, including the Click event, but neither the hover, mouse-enter, nor mouse-down events fire while the button is pressed; the MouseClick event does the same. I tried using just the MouseDown event, but it does not let another MouseDown, MouseEnter, or Hover event fire.
Short of checking the mouse position I do not know what I can do, and I would like to avoid mouse-position checking.
If anyone has any ideas, they would be greatly appreciated.
Clearly you'll want to pay attention to the MouseMove event so you can see the mouse moving into another square. Roughly:
void panel1_MouseMove(Object^ sender, MouseEventArgs^ e) {
    if ((e->Button & System::Windows::Forms::MouseButtons::Left) ==
        System::Windows::Forms::MouseButtons::Left) {
        int square = MapPosToSquare(e->Location);
        if (square != currentSquare) {
            currentSquare = square;
            OnSquareClicked(currentSquare);
        }
    }
}
If these "squares" are actually controls then you have a different problem. You have to set the control's Capture property to false in the MouseDown event handler so it doesn't capture the mouse.