How to send XInput button? - gamepad

I want to send a button-pressed signal to a game, and I have read the XInput APIs on MSDN.
But the APIs there only provide functions that read the gamepad state; they do not send a signal to the PC.
Can anyone help me?

There is no API for sending a button-pressed signal. You will have to write code that monitors the gamepad state every frame, and when a button press is detected, you send the signal. (Note: you detect the button press by checking the current button flags against the previous frame's button flags. If they differ, the button has either been pressed or released, depending on the flag state.)
A few more details here: Which event to listen for during XInput events
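Something like this (a minimal sketch assuming XInput and controller index 0; call PollGamepad once per frame from your own loop):
#include <windows.h>
#include <Xinput.h>
#pragma comment(lib, "Xinput.lib")

WORD g_prevButtons = 0; // button flags from the previous frame

void PollGamepad()
{
    XINPUT_STATE state = {};
    if (XInputGetState(0, &state) != ERROR_SUCCESS)
        return; // controller 0 is not connected

    WORD buttons  = state.Gamepad.wButtons;
    WORD changed  = buttons ^ g_prevButtons;  // bits that flipped this frame
    WORD pressed  = changed & buttons;        // 0 -> 1: newly pressed
    WORD released = changed & g_prevButtons;  // 1 -> 0: newly released

    if (pressed & XINPUT_GAMEPAD_A) {
        // A was just pressed: fire your "button pressed" signal here
    }
    if (released & XINPUT_GAMEPAD_A) {
        // A was just released
    }
    g_prevButtons = buttons;
}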

Related

How to retrieve key press/release events while a GLFW window is in focus, but keep the OS from processing the key event?

With the Win32 API you can hook keyboard events at different stages of the keyboard input model.
If I were using the Win32 API, I could create a hook, retrieve the message from the system message queue, do "some stuff", and then not forward the message to the thread message queue. (I only want to cancel sending the message to the thread message queue if a specific GLFW window is in focus.)
But since I need my program to be cross-platform, I'm just using a GLFW key callback to retrieve keyboard input while the GLFW window is in focus.
But if, for example, the user presses a system shortcut (Alt+Tab or the Windows key on Windows, Command+Space on macOS, etc.), I need to do "some stuff" with the keys they pressed without the user actually tabbing out of the GLFW window.
The only way the user should be able to drop focus from the GLFW window is by moving the cursor out of the window's bounds and clicking outside it.
If the explanation of my problem is unclear, please let me know.
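For reference, a minimal Windows-only sketch of the hook approach described above (a low-level WH_KEYBOARD_LL hook; g_ourWindow is a hypothetical handle you could obtain via glfwGetWin32Window()):
#include <windows.h>

HHOOK g_hook = nullptr;
HWND  g_ourWindow = nullptr; // hypothetical: set to your GLFW window's HWND

LRESULT CALLBACK KeyboardProc(int code, WPARAM wParam, LPARAM lParam)
{
    if (code == HC_ACTION && GetForegroundWindow() == g_ourWindow) {
        const KBDLLHOOKSTRUCT* kb = reinterpret_cast<KBDLLHOOKSTRUCT*>(lParam);
        // do "some stuff" with kb->vkCode here ...
        return 1; // non-zero: swallow the event; it never reaches the OS or app queue
    }
    return CallNextHookEx(g_hook, code, wParam, lParam);
}

void InstallHook()
{
    // The installing thread needs a message loop; a GLFW app pumping
    // glfwPollEvents() already has one.
    g_hook = SetWindowsHookExW(WH_KEYBOARD_LL, KeyboardProc,
                               GetModuleHandleW(nullptr), 0);
}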

QPushButton: How to know whether a "released" signal will be followed by a "clicked" signal?

In my application, there are a few QPushButtons for which I need to handle an "invalid release" and a "click" differently.
By "invalid release" I mean a release that happens outside the button (following a press inside the button).
So I'm trying to inherit from QPushButton and implement my own signal void released(bool validClick);.
I'm thinking of using mouseReleaseEvent to check whether the release is inside the button's rect(), to infer whether a clicked() signal will follow the released() signal. Is this a good way to do it?
Update: background:
I have a group of three buttons, each of which can start the same action on the backend (with different configurations). However, the backend cannot handle multiple successive start commands unless the previously started ones have been cleaned up, so I have to prevent that case in the button group.
This application is multi-touch, which makes it very easy to click all three buttons at the same time.
What I want to do:
1) When one of the buttons is pressed, disable the others. This is essential because this is a multi-touch GUI, and multiple buttons being clicked at the same time is the first thing I need to avoid.
2) When the backend informs the button group that all previously started services have been closed and cleaned up and it is ready for the next button click, the button group unlocks all buttons.
3) If the user triggers an invalid release (press on the button, release outside it), all buttons should be unlocked immediately.
4) If the user triggers a valid release (a click), the button group should also disable the button that the user has clicked, so that it cannot be clicked again until allowed by the backend.
If I cannot differentiate between valid and invalid releases, then I have to treat case 4) and case 2) the same way as case 3). This is not desirable.
You don't care about presses or releases, only about clicks and indications that tasks are done by the backend. You have two states: idle and busy. When idle, the buttons are enabled. When any button is clicked() you transition to the busy state, until the backend signals that it's not busy anymore. This is trivial to implement using the state machine framework.
You could also have three states to make failure handling easier: idle, pending, and busy. When any button is clicked() you transition to the pending state and request the work from the backend. When the backend signals that it has accepted the request, transition to busy; once the backend signals that it has finished processing the request, transition back to idle.
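A minimal sketch of the two-state version using Qt's state machine framework (the Backend class and its done() signal are hypothetical stand-ins for your real service interface):
#include <QPushButton>
#include <QStateMachine>
#include <QState>
#include <initializer_list>

// Hypothetical backend stub standing in for the real service interface.
class Backend : public QObject {
    Q_OBJECT
signals:
    void done(); // emitted once all started services are cleaned up
};

void setup(QPushButton *b1, QPushButton *b2, QPushButton *b3, Backend *backend)
{
    auto *machine = new QStateMachine(b1);
    auto *idle = new QState(machine);
    auto *busy = new QState(machine);

    for (QPushButton *b : {b1, b2, b3}) {
        // Buttons are enabled only while idle.
        idle->assignProperty(b, "enabled", true);
        busy->assignProperty(b, "enabled", false);
        idle->addTransition(b, &QPushButton::clicked, busy); // any click -> busy
    }
    busy->addTransition(backend, &Backend::done, idle); // backend done -> idle

    machine->setInitialState(idle);
    machine->start();
}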

MFC button cannot receive touch events

The list control can receive the touch event, but the push button cannot; it always receives mouse move events instead.
I want to forward the touch event to the button's parent. How do I resolve this?
The code below can tell whether an event came from the mouse or from touch, but it cannot forward the event to the parent so that the parent handles the touch event.
#define MOUSEEVENTF_FROMTOUCH 0xFF515700

if ((GetMessageExtraInfo() & MOUSEEVENTF_FROMTOUCH) == MOUSEEVENTF_FROMTOUCH) {
    // Click was generated by wisptis / Windows Touch.
} else {
    // Click was generated by the mouse.
}
By the way, how do I stop touch events from being converted to mouse events?
With WM_TOUCH/WM_GESTURE you get a handle to the touch input event list. The TOUCHINPUT data isn't tied to a specific window, unlike GESTUREINFO, but it shouldn't be complicated to translate the info.
An unhandled WM_GESTURE message passed to DefWindowProc will be propagated to the parent window. When forwarding gesture messages between windows, avoid sending messages from parent to child windows, in order to prevent closed loops.
http://msdn.microsoft.com/en-us/library/ee220935.aspx
For a WM_TOUCH message you can use a user-defined message and post it, with the original lParam value, to the button's parent. But you must handle that message there and call CloseTouchInputHandle on the handle.
If you don't pass the WM_TOUCH/WM_GESTURE message to DefWindowProc, no further translation is done.
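A sketch of that forwarding idea (WM_APP_TOUCHFWD is a hypothetical user-defined message, and the button is assumed to have been registered for touch with RegisterTouchWindow):
#define WM_APP_TOUCHFWD (WM_APP + 1) // hypothetical user-defined message

// In the button's window procedure:
case WM_TOUCH:
    // Hand the touch handle to the parent; do NOT call DefWindowProc,
    // so no touch-to-mouse translation happens for this message.
    PostMessage(GetParent(hWnd), WM_APP_TOUCHFWD, wParam, lParam);
    return 0;

// In the parent's window procedure:
case WM_APP_TOUCHFWD: {
    UINT cInputs = LOWORD(wParam);
    TOUCHINPUT inputs[16];
    if (cInputs > 16) cInputs = 16; // sketch: cap to the fixed buffer
    if (GetTouchInputInfo((HTOUCHINPUT)lParam, cInputs, inputs, sizeof(TOUCHINPUT))) {
        // inputs[i].x / inputs[i].y are screen coordinates in 1/100 of a pixel
    }
    CloseTouchInputHandle((HTOUCHINPUT)lParam); // the parent owns the handle now
    return 0;
}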
Touch events are converted to mouse messages in the DefWindowProc processing.
Which gestures are supported and converted to mouse messages is listed here.
The list control can receive the touch event, but the push button and the edit control cannot; they always receive mouse move events.
The same MOUSEEVENTF_FROMTOUCH check shown above can tell whether it is a mouse or a touch event, but after flicking up and down for a while, the control stops receiving the event.
Flicking left and right, it always receives the mouse move event.
I put this button in a scroll view that has a vertical scroll bar.

"?" help button triggers WM_HELP _and_ WM_LBUTTONUP

I have a Windows application that registers one CALLBACK procedure to handle WM_HELP messages for the dialog, and one CALLBACK procedure to handle WM_LBUTTONUP messages for a custom button.
Now, when the user clicks the "?" button and then clicks the custom button, the help opens as expected (on mouse down), BUT if the help window is not occluding the button, a WM_LBUTTONUP message is also delivered to the custom button (on mouse up). This causes the button to trigger when the user was only asking for help.
Is there any way to stop the WM_LBUTTONUP message from being sent when the button press is for help?
EDIT: The custom button is implemented using a STATIC control, I believe because it needs to have an image and no border. So it does not send BN_CLICKED notifications; in fact, it does not seem to trigger WM_COMMAND in the parent at all.
Thanks
This is normal. Use the button's BN_CLICKED notification to see the difference: it is generated when the user clicks the button, but not when the user uses the help cursor. The button still sees the normal button up/down messages, so those are not a good trigger for detecting a click. As an extra bonus, the button can then also be clicked with the keyboard (space bar).
A good class library takes care of these nasty little details.
A better way would be to create the "?" as a custom control with the BS_CHECKBOX | BS_PUSHLIKE style and capture the mouse. After that, this custom control receives all WM_LBUTTONDOWN messages; you can then use WindowFromPoint to find the window where the WM_LBUTTONDOWN happened and send a custom notification to the parent window. The parent window can then show a tooltip, open the help doc, or discard the notification.
The advantage is that you create the control only once and can use it in multiple places.
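A rough sketch of that capture-based approach (WM_APP_HELPCLICK is a hypothetical notification, and the control is assumed to have called SetCapture() when toggled on):
#define WM_APP_HELPCLICK (WM_APP + 2) // hypothetical notification to the parent

// In the "?" control's (subclassed) window procedure, while the mouse is captured:
case WM_LBUTTONDOWN: {
    POINT pt;
    GetCursorPos(&pt);
    HWND hit = WindowFromPoint(pt); // the window the user wants help for
    SendMessage(GetParent(hWnd), WM_APP_HELPCLICK, 0, (LPARAM)hit);
    ReleaseCapture(); // leave help mode after one click
    return 0;
}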
Okay, I fixed it by creating the custom button (static control) with the SS_NOTIFY style, then handling the STN_CLICKED notification in the WM_COMMAND message. (SS_NOTIFY causes WM_COMMAND to trigger in the parent when it is clicked.) This does not trigger when using the "?" button. Thanks!
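For reference, a sketch of that fix (IDC_HELPBTN is a hypothetical control ID; hParent and hInstance are assumed to be in scope):
#define IDC_HELPBTN 1001 // hypothetical control ID

HWND hBtn = CreateWindowW(L"STATIC", nullptr,
                          WS_CHILD | WS_VISIBLE | SS_NOTIFY | SS_BITMAP,
                          10, 10, 32, 32, hParent,
                          (HMENU)(INT_PTR)IDC_HELPBTN, hInstance, nullptr);
// Set the button's image with STM_SETIMAGE as usual.

// In the parent's window procedure:
case WM_COMMAND:
    if (LOWORD(wParam) == IDC_HELPBTN && HIWORD(wParam) == STN_CLICKED) {
        // Normal click: run the button's action. Per the fix above, the
        // "?" help-mode click does not generate STN_CLICKED.
    }
    break;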

How to send mouse click event to a game application?

I am trying to send a mouse click event to a game application. First, I used Spy++ to see which messages the application receives. I see things like WM_MOUSEACTIVATE, WM_WINDOWPOSCHANGING, WM_ACTIVATEAPP, WM_ACTIVATE, WM_SETFOCUS, ...
I tried to send the same messages I saw in Spy++, but it doesn't work. How can I send a mouse click to a game application without giving it focus? It runs in windowed mode. Thanks in advance.
You want WM_LBUTTONDOWN. You can always check MSDN for documentation on which messages mean what.
The best way to automate applications and games is via SendInput. While in theory it should be possible to drive an application via WM_LBUTTONDOWN etc., many applications read the input state directly via lower-level APIs (such as GetAsyncKeyState), which don't change their state to reflect messages processed from the message queue.
Using SendInput requires actually bringing the game to the foreground, as the input events are synthesized at a low level and are thus delivered to the active/focused window.
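A minimal SendInput sketch that synthesizes a left click at the current cursor position (move the cursor first with SetCursorPos if you need a specific point):
#include <windows.h>

void SendLeftClick()
{
    INPUT in[2] = {};
    in[0].type = INPUT_MOUSE;
    in[0].mi.dwFlags = MOUSEEVENTF_LEFTDOWN; // press at the current cursor position
    in[1].type = INPUT_MOUSE;
    in[1].mi.dwFlags = MOUSEEVENTF_LEFTUP;   // then release
    SendInput(2, in, sizeof(INPUT));
}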