I am trying to remote-control a Qt application via UDP messages from my host computer.
Currently, the remote-controlled computer runs a Qt application that receives messages and creates mouse events. The host computer sends the type, button, modifier, and position (normalized between 0 and 1) whenever there is a new mouse event. I am trying to recreate these events on the remote-controlled computer so that they behave exactly as if a user had produced them.
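For context, the shape of the UDP control message might look like the following; this is a sketch based only on the fields used in the slots below (the actual wire layout is not shown in the question):

```cpp
#include <cstdint>

// Assumed shape of the UDP control message, based on the fields used by
// the slots below. The exact layout and types are illustrative.
struct MouseMsg {
    std::int32_t type;     // a QEvent::Type value (press/release/move/...)
    std::int32_t button;   // a Qt::MouseButton value
    std::int32_t buttons;  // a Qt::MouseButtons bitmask
    std::int32_t modifier; // a Qt::KeyboardModifiers bitmask
    double x;              // horizontal position, normalized to 0..1
    double y;              // vertical position, normalized to 0..1
};
```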
void MainWindow::slotNewMouseEvent(MouseMsg msg)
{
    auto point = scalePoint(msg.x, msg.y); // basic conversion for coordinates
    QWidget *target = this->childAt(point);
    if (!target) // childAt() returns nullptr if no child covers the point
        return;
    // QTest expects the position relative to the target widget, not the window
    QPoint local = target->mapFrom(this, point);
    switch (msg.type)
    {
    case QEvent::MouseButtonPress:
        QTest::mousePress(target, (Qt::MouseButton) msg.button,
                          (Qt::KeyboardModifiers) msg.modifier, local);
        break;
    case QEvent::MouseButtonRelease:
        QTest::mouseRelease(target, (Qt::MouseButton) msg.button,
                            (Qt::KeyboardModifiers) msg.modifier, local);
        break;
    case QEvent::MouseButtonDblClick:
        QTest::mouseDClick(target, (Qt::MouseButton) msg.button,
                           (Qt::KeyboardModifiers) msg.modifier, local);
        break;
    case QEvent::MouseMove:
        QTest::mouseMove(target, local);
        break;
    default:
        break;
    }
}
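The scalePoint helper is not shown above; for illustration, a minimal version, assuming it maps the normalized 0..1 coordinates sent by the host to pixel coordinates inside the receiving window, could look like this (Point and all parameter names are hypothetical):

```cpp
// Hypothetical sketch of a scalePoint helper: maps normalized 0..1
// coordinates to pixel coordinates inside a width x height area.
struct Point { int x; int y; };

Point scalePoint(double nx, double ny, int width, int height)
{
    // Clamp so an out-of-range message cannot address pixels outside the area
    if (nx < 0.0) nx = 0.0;
    if (nx > 1.0) nx = 1.0;
    if (ny < 0.0) ny = 0.0;
    if (ny > 1.0) ny = 1.0;
    return { static_cast<int>(nx * (width - 1)),
             static_cast<int>(ny * (height - 1)) };
}
```

In the real application the width and height would come from the window's current size, so both machines can have different resolutions.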
void MainWindow::slotNewMouseEvent(MouseMsg msg)
{
    auto point = scalePoint(msg.x, msg.y); // basic conversion for coordinates
    QWidget *target = this->childAt(point);
    if (!target)
        return;
    QEvent::Type type = (QEvent::Type) msg.type;
    // The event position must be local to the receiving widget
    QMouseEvent *evt = new QMouseEvent(type,
                                       target->mapFrom(this, point),
                                       (Qt::MouseButton) msg.button,
                                       (Qt::MouseButtons) msg.buttons,
                                       (Qt::KeyboardModifiers) msg.modifier);
    // postEvent() takes ownership of evt, so don't dereference it afterwards
    qApp->postEvent(target, evt);
    // I was expecting functionality like this to be handled automatically
    if (type == QEvent::MouseButtonDblClick || type == QEvent::MouseButtonPress)
    {
        target->setFocus();
    }
    qApp->processEvents();
}
Both of these functions give almost the same result, which is not what I want. I expected that sending newly created events to the top widget on screen would work. However, some functionality does not: setting focus on the clicked element, opening a context menu with a right click, or selecting items of a combo box (the popup list is not a child widget).
Is there a solution that keeps these events inside the remote-controlled computer's Qt application? I would prefer not to use system libraries like "windows.h", both to avoid harming the system and to keep my application cross-platform. Do I have to use them?
I've implemented a Qt3DWidget which works fairly well by letting Qt3D draw to an offscreen texture and using the texture's ID to draw on a quad in a QOpenGLWidget. The input source on the QInputSettings is set to this, i.e. the widget itself.
One issue that remains is that mouse hover events (without clicking) are not properly processed, and a QObjectPicker never fires its moved event when only hovering over an object. Clicking and dragging, on the other hand, works. For a couple of hours now I have tried to track down where the event gets eaten; I'm sure this happens somewhere, because clicking and moving does emit the moved event. The latter indicates (in my view) that the event filter (a PickEventFilter, a private Qt3D class) has been successfully installed. One failure case would have been that the event filter doesn't get installed.
Now I'm kind of stuck, because it seems impossible to figure out where the event dies. I've got the Qt debugging symbols and stepped through the code (which is sometimes a bit flaky, maybe due to code optimization). I figured out that the PickEventFilter gets the leave event when moving the mouse, by setting a breakpoint in this method:
bool PickEventFilter::eventFilter(QObject *obj, QEvent *e)
{
Q_UNUSED(obj);
switch (e->type()) {
case QEvent::MouseButtonPress:
case QEvent::MouseButtonRelease:
case QEvent::MouseMove: {
QMutexLocker locker(&m_mutex);
m_pendingMouseEvents.push_back({obj, QMouseEvent(*static_cast<QMouseEvent *>(e))});
} break;
case QEvent::HoverMove: {
QMutexLocker locker(&m_mutex);
QHoverEvent *he = static_cast<QHoverEvent *>(e);
m_pendingMouseEvents.push_back({obj, QMouseEvent(QEvent::MouseMove,
he->pos(), Qt::NoButton, Qt::NoButton,
he->modifiers())});
} break;
case QEvent::KeyPress:
case QEvent::KeyRelease: {
QMutexLocker locker(&m_mutex);
m_pendingKeyEvents.push_back(QKeyEvent(*static_cast<QKeyEvent *>(e)));
}
default:
break;
}
return false;
}
And also on this line in QWidgetWindow:
QApplicationPrivate::sendMouseEvent(receiver, &translated, widget, m_widget,
&qt_button_down, qt_last_mouse_receiver);
When sendMouseEvent sends the move event, it never reaches the event filter. The only thing that arrives is a leave event. When you use createWindowContainer with a Qt3DWindow, the mouse events work. I really don't know what the difference is.
I don't think it's feasible to post code related to this issue but I'd hope that some of you can provide some ideas what to try out.
Thanks to Scheff in the comments I was able to make it work by calling setMouseTracking(true) on the QWidget. Mouse tracking being disabled was what blocked the hover events.
I'm working on determining if a certain touchscreen will be compatible with an application and recently got a loaner model of an Elo 2402L touchscreen. I've installed the driver the company provides and was able to see multi-touch events using the evtest utility (parser for /dev/input/eventX).
The thing is that I'm running Scientific Linux 6.4, which uses Linux kernel 2.6.32. I've seen a lot of mixed information on touchscreen compatibility for Linux kernels before 3.x.x. Elo says that their driver only supports single-touch for 2.6.32. Also, I've seen people say that the majority of the compatibility issues with touch events in this kernel version are with Xorg interfaces.
I developed a very simple Qt5 application to test whether Qt could detect the touch events or not, because I'm not sure whether Qt applications are X-based and if they read events directly from /dev/input or something else.
However, while a simple mouse event handler registers mouse events correctly, a simple touch event handler I created does nothing when I touch the main screen. There is a beep, since part of the driver Elo provides beeps when the screen is touched, so I know that SOMETHING is registering the touch, but neither the desktop nor this application seems to recognize the touch event.
Also, yes, the WA_AcceptTouchEvents attribute is set to true in the window's constructor.
I have a simple mainwindow.h:
...
protected:
int touchEvent(QTouchEvent *ev);
...
And mainwindow.cpp:
MainWindow::MainWindow(QWidget *parent) {
...
setAttribute(Qt::WA_AcceptTouchEvents, true);
touchPoints = 0;
}
...
int MainWindow::touchEvent(QTouchEvent *ev) {
switch(ev->type()) {
case QEvent::TouchBegin:
touchPoints++;
break;
case QEvent::TouchEnd:
touchPoints--;
break;
}
ui->statusBar->showMessage("Touch Points: " + touchPoints);
}
Is there something wrong with the way I'm using the touch event handler? Or is there some issue with the device itself? Does Qt read input events directly from /dev/input, or does it get its input events from X?
Very confused here, as I haven't used Qt before and want to narrow down the cause before I say that it's the device causing the issue.
Also, if anyone has any insight into the device / kernel compatibility issue, that would be extremely helpful.
The QTouchEvent documentation says:
Touch events occur when pressing, releasing, or moving one or more
touch points on a touch device (such as a touch-screen or track-pad).
To receive touch events, widgets have to have the
Qt::WA_AcceptTouchEvents attribute set and graphics items need to have
the acceptTouchEvents attribute set to true.
Probably you just need to call setAttribute(Qt::WA_AcceptTouchEvents, true) inside the MainWindow constructor.
Is there something wrong with the way I'm using the touch event handler?
There is no touch event handler. If you change:
int touchEvent(QTouchEvent *ev);
to:
int touchEvent(QTouchEvent *ev) override;
(which you should always do when you are trying to override virtual functions so you can catch exactly this kind of mistake), you'll see that there is no such function for you to override. What you need to override is the event() handler:
protected:
bool event(QEvent *ev) override;
You need to check for touch events there:
bool MainWindow::event(QEvent *ev)
{
    switch (ev->type()) {
    case QEvent::TouchBegin:
        touchPoints++;
        break;
    case QEvent::TouchEnd:
        touchPoints--;
        break;
    default:
        return QMainWindow::event(ev);
    }
    ui->statusBar->showMessage("Touch Points: " + QString::number(touchPoints));
    return true;
}
However, it might be better to work with gestures instead of touch events. But I don't know what kind of application you're writing. If you wanted to let Qt recognize gestures rather than implementing them yourself through touch events, you would first grab the gestures you want, in this case pinching:
setAttribute(Qt::WA_AcceptTouchEvents);
grabGesture(Qt::PinchGesture);
and then handle it:
bool MainWindow::event(QEvent *ev)
{
    if (ev->type() != QEvent::Gesture) {
        return QMainWindow::event(ev);
    }
    auto *gestEv = static_cast<QGestureEvent *>(ev);
    if (auto *gest = gestEv->gesture(Qt::PinchGesture)) {
        auto *pinchGest = static_cast<QPinchGesture *>(gest);
        auto sf = pinchGest->scaleFactor();
        // You could use the pinch scale factor sf here to zoom an image,
        // for example.
        ev->accept();
        return true;
    }
    return QMainWindow::event(ev);
}
Working with gestures instead of touch events has the advantage of using the platform's gesture recognition facilities, such as those of Android and iOS. But again, I don't know what kind of application you're writing or what platform you're targeting.
I'm trying to write simple mouse clicker for ubuntu via x11.
For first i wrote first variant (based on XSendEvent) of clicking procedure:
#include <unistd.h>
#include <cstring>   // memset
#include <cstdlib>   // exit
#include <iostream>
#include <X11/Xlib.h>
#include <X11/Xutil.h>
void mouseClick(int button)
{
    Display *display = XOpenDisplay(NULL);
    XEvent event;
    if (display == NULL)
    {
        std::cout << "clicking error 0" << std::endl;
        exit(EXIT_FAILURE);
    }
    memset(&event, 0x00, sizeof(event));
    event.type = ButtonPress;
    event.xbutton.button = button;
    event.xbutton.same_screen = True;
    XQueryPointer(display, RootWindow(display, DefaultScreen(display)),
                  &event.xbutton.root, &event.xbutton.window,
                  &event.xbutton.x_root, &event.xbutton.y_root,
                  &event.xbutton.x, &event.xbutton.y, &event.xbutton.state);
    event.xbutton.subwindow = event.xbutton.window;
    while (event.xbutton.subwindow)
    {
        event.xbutton.window = event.xbutton.subwindow;
        XQueryPointer(display, event.xbutton.window,
                      &event.xbutton.root, &event.xbutton.subwindow,
                      &event.xbutton.x_root, &event.xbutton.y_root,
                      &event.xbutton.x, &event.xbutton.y, &event.xbutton.state);
    }
    if (XSendEvent(display, PointerWindow, True, 0xfff, &event) == 0)
        std::cout << "clicking error 1" << std::endl;
    XFlush(display);
    event.type = ButtonRelease;
    event.xbutton.state = 0x100;
    if (XSendEvent(display, PointerWindow, True, 0xfff, &event) == 0)
        std::cout << "clicking error 2" << std::endl;
    XFlush(display);
    XCloseDisplay(display);
}
This code works fine with every application except Chrome (with Firefox it works fine too).
So I wrote a second variant, based on XTestFakeButtonEvent:
#include <X11/Xlib.h>
#include <X11/extensions/XTest.h>
void SendClick(int button, Bool down)
{
    Display *display = XOpenDisplay(NULL);
    if (display == NULL)
        return;
    XTestFakeButtonEvent(display, button, down, CurrentTime);
    XFlush(display);
    XCloseDisplay(display);
}
And this code works fine everywhere, including Chrome.
Calling these functions is very simple:
// XSendEvent variant
mouseClick(1);
// XTestFakeButtonEvent variant
SendClick(1, true); // press lmb
SendClick(1, false); // release lmb
1: Help me understand what I'm doing wrong in the first variant (or what might be different about Chrome).
1.1: I think I may be sending the event to the wrong window when I open the display with XOpenDisplay(NULL). Does Chrome have a different connection setup with the X11 server?
2: Is it a good idea to use the second variant in applications? It's pretty short and works fine with every app I have.
P.S. To compile this code you need to add the -lX11 -lXtst libs.
XSendEvent produces events that are marked as sent. Events sent by the server are not marked.
typedef struct {
int type;
unsigned long serial;
Bool send_event; // <----- here
Display *display;
Window window;
} XAnyEvent;
Some applications ignore events that have this flag set, for security reasons. Think of malware that somehow gets access to your X11 server: it could trick any application into doing whatever it wants by sending such events.
It is perfectly OK to use the second variant on your own machine, but it relies on an extension that can be disabled (again, for security reasons), so it will not necessarily work on other people's X11 servers.
On XCB, you can use the following function to verify whether an event was sent via the XSendEvent() / xcb_send_event() API:
static bool fromSendEvent(const void *event)
{
// From X11 protocol: Every event contains an 8-bit type code. The most
// significant bit in this code is set if the event was generated from
// a SendEvent request.
const xcb_generic_event_t *e = reinterpret_cast<const xcb_generic_event_t *>(event);
return (e->response_type & 0x80) != 0;
}
AFAICT, there is no way to tell whether an event was sent via the XTest extension.
You should use XTest, as it will work better; XSendEvent doesn't know anything about internal X server state. From the XSendEvent manual:
"The contents of the event are otherwise unaltered and unchecked by the X server except to force send_event to True."
So with XSendEvent you might run into unexpected issues in some situations.
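The top-bit convention that the fromSendEvent() snippet above relies on can be exercised without any xcb headers, since it is purely a property of the X11 wire protocol:

```cpp
#include <cstdint>

// X11 protocol: the most significant bit of the 8-bit event type code is
// set when the event was generated by a SendEvent request.
constexpr bool fromSendEventCode(std::uint8_t response_type)
{
    return (response_type & 0x80) != 0;
}
```

For example, a core ButtonPress event (code 4) arrives as 4 when generated by the server, and as 0x84 when injected through a SendEvent request.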
Although not by using Xlib directly, but through a Python Xlib wrapper library as a proxy to Xlib, the first approach currently works on all windows I have open on my desktop, except IntelliJ.
In this first approach, you are sending the event directly to a target window, and, as others have noted, your event is also marked (tainted) with an attribute identifying it as a synthetic one. The receiving window might act on it all the same, as many application windows do.
With the second approach, however, you are emulating the actual thing happening: per my understanding it is virtually indistinguishable from a user-initiated event. The event goes through the full X11 input-handling flow (rather than being blindly dispatched directly to the target window), which means it trickles down to the window (or Gnome desktop widget) under the pointer, as in the natural flow of real user input.
As such, the second approach appears to be more broadly applicable than the first: it also has the desired effect on windows that opt not to act on events sent through the first approach, as well as on e.g. Gnome desktop elements which are not ordinary windows per se (such as the language and power widgets). You supply the coordinates without any mention of a window, and the click goes through.
If I had to explain this duality of routes, I would say that XSendEvent is more of a general-purpose event-sending facility, whereas XTEST specifically provides a means of simulating user input events.
So, I have an application where, if a particular button is kept pressed, it plays an audio device; when the button is released, it stops the audio device. I use keyPressEvent and keyReleaseEvent to implement this, similar to the code below:
void MainWindow::keyPressEvent(QKeyEvent *event)
{
if(event->isAutoRepeat())
{
event->ignore();
}
else
{
if(event->key() == Qt::Key_0)
{
qDebug() << "key_0 pressed";
}
else
{
QWidget::keyPressEvent(event);
}
}
}
void MainWindow::keyReleaseEvent(QKeyEvent *event)
{
if(event->isAutoRepeat())
{
event->ignore();
}
else
{
if(event->key() == Qt::Key_0)
{
qDebug() << "key_0 released";
}
else
{
QWidget::keyReleaseEvent(event);
}
}
}
But apparently the isAutoRepeat function isn't working, as I can see a continuous printout of key_0 pressed and key_0 released even though I haven't released the 0 key after pressing it. Is my code wrong, or is something else wrong?
Thanks.
EDIT
I think this is happening because the MainWindow loses keyboard focus. How can I find out which widget actually has focus? I'm using some widgets when Qt::Key_0 is pressed, but I thought I had set all of those widgets to Qt::NoFocus; I guess it's not working.
I'm trying to know which widget has the focus by doing the following:
QWidget *widget = QApplication::activeWindow();
qDebug() << widget->accessibleName();
but it always prints an empty string. How can I make it print the name of the widget that has keyboard focus?
So as I also stumbled over this issue (and grabKeyboard didn't really help), I began digging in qtbase. Qt is connected to X11 via xcb, and by default, for a repeated key, X11 sends a release event immediately followed by a key press event for each repeat. So holding down a key results in a sequence of XCB_KEY_RELEASE/XCB_KEY_PRESS events being sent to the client (try it out with xev, or the source at the end of this page).
Then qt (qtbase/src/plugins/platforms/xcb/qxcbkeyboard.cpp) tries to figure out from these events whether it's an auto-repeat case: when a release is received, it uses a lookahead feature to check whether it's followed by a press (with timestamps close enough), and if so it assumes auto-repeat.
This does not always work, at least not on all platforms. In my case (an old, worn-out, slow laptop, an Intel Celeron N2830 @ 2.16GHz x 2, running ubuntu 16.04), it helped to just put a usleep(500) before that check, allowing the press event following the release event to arrive... it's around line 1525 of qxcbkeyboard.cpp:
// look ahead for auto-repeat
KeyChecker checker(source->xcb_window(), code, time, state);
usleep(500); // Added; 100 is too small, 200 is OK (for me)
xcb_generic_event_t *event = connection()->checkEvent(checker);
if (event) {
...
Filed this as QTBUG-57335.
Nb: The behaviour of X can be changed by using
Display *dpy=...;
Bool result;
XkbSetDetectableAutoRepeat (dpy, true, &result);
Then it won't send these release/press sequences for a held-down key, but using it would require more changes to the auto-repeat detection logic.
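The lookahead heuristic described above can be sketched in plain C++ to make the failure mode clearer: a release is only reclassified as auto-repeat if a press of the same keycode arrives within some small time window, so on a slow machine the press can miss the window and the release is treated as real. The threshold value and field names here are assumptions for illustration, not Qt's actual implementation:

```cpp
#include <cstdint>

// Sketch of the lookahead idea: a key release immediately followed by a
// press of the same keycode, with timestamps only a few milliseconds
// apart, is assumed to be X11 auto-repeat rather than a real release.
// The 10 ms threshold is an illustrative assumption.
struct KeyEv {
    bool press;             // true = press, false = release
    int code;               // hardware keycode
    std::uint32_t time_ms;  // server timestamp in milliseconds
};

bool looksLikeAutoRepeat(const KeyEv &release, const KeyEv &nextPress,
                         std::uint32_t threshold_ms = 10)
{
    return !release.press && nextPress.press
        && release.code == nextPress.code
        && nextPress.time_ms - release.time_ms <= threshold_ms;
}
```

If the matching press has not yet been read from the connection when the check runs, it cannot be paired with the release at all, which is why delaying the check (as in the usleep workaround) helps.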
Anyway, solved it.
The problem was that I have a widget which is a subclass of QGLWidget, which I use to show augmented reality images from a Kinect. This widget takes over the keyboard focus whenever a keyboard button is pressed.
To solve the problem, I needed to call grabKeyboard() from the MainWindow class (MainWindow is a subclass of QMainWindow). So this->grabKeyboard() is the line I needed to add when the Key_0 button is pressed, so that MainWindow doesn't lose keyboard focus; when the key is released, I call this->releaseKeyboard() to resume normal behaviour, i.e. other widgets can take keyboard focus again.
Let's say we have an input device which is a controller. In addition, let's say we have some construct which checks for events using SDL_PollEvent(...).
In my tests, SDL_PollEvent(...) was checked every 3 seconds. If a button is tapped quickly just once, or even tapped multiple times during the delay, SDL generates neither the button press nor the button release event. If the very same controller button is instead pressed and held down, SDL does generate those events.
In the very same test, if a keyboard key is tapped quickly just once during the extreme SDL_Delay(...) interval, a key press event is always generated.
Quick button presses and releases from the controller do generate events with SDL_WaitEvent(...). But SDL_INIT_VIDEO and SDL_INIT_EVENTTHREAD must be in the same thread, as they can't be separated across multiple SDL_Thread[s] in my case.
Given the delay between event polls and how brief a button press can be, my application needs to know that the button was pressed at least once regardless.
What can be done in this situation, so that SDL generates/polls controller events the same way it does for the keyboard?
Is SDL 1.2 not capable of this? Are there other libraries better able to guarantee that a button tap event is generated? Thank you!
bool activity = true;
while(activity) // event polling used in testing
{
std::cout << "\nWaiting...";
SDL_Delay(3000U); // <--- simulating delay for purpose of test
while(SDL_PollEvent(&event_))
{
switch(event_.type)
{
case SDL_JOYBUTTONDOWN :
switch(event_.jbutton.button)
{
case 0U :
...
break;
case 1U :
...
break;
}
break;
case SDL_KEYDOWN :
switch(event_.key.keysym.sym)
{
case SDLK_a :
...
break;
}
break;
}
}
}
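Until the underlying delivery problem is solved, one workaround is to latch presses at whatever point they can first be observed, so a tap that falls entirely between two polls is not lost. Below is a minimal sketch of that idea in plain C++; it is independent of SDL and assumes presses can be reported promptly from somewhere (for example a reader thread, or a fast inner loop that only records state):

```cpp
#include <atomic>

// Latches a controller button press so that a tap which begins and ends
// between two polls is still observed exactly once by the polling loop.
// Presses are recorded as they happen (e.g. from a reader thread) and the
// slow poller consumes them at its own pace.
class ButtonLatch {
public:
    void onPress() { m_latched.store(true, std::memory_order_release); }

    // Returns true if at least one press happened since the last call,
    // then clears the latch.
    bool consume() { return m_latched.exchange(false, std::memory_order_acq_rel); }

private:
    std::atomic<bool> m_latched{false};
};
```

The poller calls consume() once per iteration; it returns true if at least one press occurred since the previous poll, no matter how brief the tap was.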