Windows keyboard display - C++

I am trying to make an on-screen keyboard for Windows in C++ that, rather than sending input (e.g. with SendInput()), intercepts the user's input, so that the user can work in another window while their key presses show up on the on-screen keyboard.
I plan to use this for making tutorials in programs such as Unity, and it could also serve as an overlay so people watching gameplay can see which keys are pressed. To do this I need to read the input without stopping it from reaching its destination, but I don't know how.
Any help would be appreciated.
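One standard technique for this (not mentioned in the question itself) is a system-wide low-level keyboard hook: SetWindowsHookEx with WH_KEYBOARD_LL lets you observe every key event and, by returning CallNextHookEx, pass it along unchanged to its destination. A minimal, untested sketch (Win32 build assumed):

```cpp
// Low-level keyboard hook that observes key presses system-wide and
// forwards them unchanged, so the focused application still receives them.
#include <windows.h>
#include <cstdio>

static HHOOK g_hook = nullptr;

static LRESULT CALLBACK KeyboardProc(int nCode, WPARAM wParam, LPARAM lParam) {
    if (nCode == HC_ACTION) {
        const KBDLLHOOKSTRUCT* kb = reinterpret_cast<const KBDLLHOOKSTRUCT*>(lParam);
        if (wParam == WM_KEYDOWN || wParam == WM_SYSKEYDOWN) {
            // Here you would light up the matching key on the on-screen keyboard.
            std::printf("vk=0x%02lX down\n", kb->vkCode);
        } else if (wParam == WM_KEYUP || wParam == WM_SYSKEYUP) {
            std::printf("vk=0x%02lX up\n", kb->vkCode);
        }
    }
    // Crucial: forward the event so it still reaches its destination.
    return CallNextHookEx(g_hook, nCode, wParam, lParam);
}

int main() {
    g_hook = SetWindowsHookExW(WH_KEYBOARD_LL, KeyboardProc, GetModuleHandleW(nullptr), 0);
    if (!g_hook) return 1;

    // A low-level hook needs a message loop on the installing thread.
    MSG msg;
    while (GetMessageW(&msg, nullptr, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessageW(&msg);
    }
    UnhookWindowsHookEx(g_hook);
    return 0;
}
```

Because the hook returns CallNextHookEx rather than a non-zero value, the input is only observed, never swallowed, which is exactly the "show it without stopping it" behaviour asked for.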

Related

Sending mouse and keyboard input to a unity3d game (Rust)

I would like a Python option, but that doesn't seem likely. I looked into C++, but I'm not intimately familiar with it and the methods I tried were not working. I need to be able to move the mouse (which I have been able to do) inside the game (which never works), and to press and/or hold keys. AutoHotkey doesn't work for moving the mouse inside of popup dialogs, only for controlling recoil and such. I have permission from the admin (I admin on one of his other servers); I'm not looking to release hacks for the game, it's just a project I dabbled with for a while and would like to see out.
Does anyone have experience with this or ideas as to how I can simulate input from mouse or keyboard?
I have experience in AHK and game-bot making:
https://www.youtube.com/user/FloowSnaake/videos
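A common reason simulated input "never works" inside games is that titles using raw input or DirectInput ignore plain virtual-key events; the usual workaround is to send hardware scan codes and relative mouse motion through SendInput(). A hedged, untested sketch (the scan code value is just an illustration for a US layout):

```cpp
// Sending input in the form raw-input games usually accept:
// hardware scan codes for keys, relative motion for the mouse.
#include <windows.h>

// Tap a key by hardware scan code (e.g. 0x11 is 'W' on a US layout).
void TapScanCode(WORD scanCode) {
    INPUT in[2] = {};
    in[0].type = INPUT_KEYBOARD;
    in[0].ki.wScan = scanCode;
    in[0].ki.dwFlags = KEYEVENTF_SCANCODE;                      // key down
    in[1] = in[0];
    in[1].ki.dwFlags = KEYEVENTF_SCANCODE | KEYEVENTF_KEYUP;    // key up
    SendInput(2, in, sizeof(INPUT));
}

// Move the mouse by a relative offset, which is what games reading
// raw mouse deltas expect (absolute SetCursorPos moves are often ignored).
void MoveMouseRelative(LONG dx, LONG dy) {
    INPUT in = {};
    in.type = INPUT_MOUSE;
    in.mi.dx = dx;
    in.mi.dy = dy;
    in.mi.dwFlags = MOUSEEVENTF_MOVE;  // relative, not MOUSEEVENTF_ABSOLUTE
    SendInput(1, &in, sizeof(INPUT));
}
```

To hold a key, send only the key-down INPUT, wait, then send the key-up.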

Interact with another program from C++ (POSIX??)

I made my first GUI for Unix with Qt. At the end of my GUI I start another program, which is based on Python. There the user needs to press one button to complete the process.
Now I want to automate that last click. I have already read a little about POSIX, but I'm not quite sure whether it can help me. I was thinking that, if I won't be able to access the program directly, maybe I could at least move the mouse to a certain position and simulate a click? I know this solution is very dirty, but it might work because I will be using the GUI on one particular touch screen only.
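POSIX itself has no notion of mouse input; on a typical Unix desktop the pointer belongs to the X server, and the XTest extension is the usual way to warp it and synthesize a click, matching the "dirty but workable" idea above. An untested sketch (link with -lX11 -lXtst; assumes an X11 session, and the coordinates are placeholders):

```cpp
// Synthesize a click at a fixed screen position via the XTest extension.
#include <X11/Xlib.h>
#include <X11/extensions/XTest.h>

int main() {
    Display* dpy = XOpenDisplay(nullptr);
    if (!dpy) return 1;

    // Move the pointer to an absolute position (-1 = current screen).
    XTestFakeMotionEvent(dpy, -1, 400, 300, CurrentTime);
    // Press and release mouse button 1 (left click).
    XTestFakeButtonEvent(dpy, 1, True, CurrentTime);
    XTestFakeButtonEvent(dpy, 1, False, CurrentTime);
    XFlush(dpy);

    XCloseDisplay(dpy);
    return 0;
}
```

Since the button's position is fixed on one known touch screen, hard-coded coordinates like these are less fragile than they would normally be.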

How to know if computer is in gaming mode

Background
I'm implementing a simple Win32 application consisting of a window.
The user may show/hide the window using a special hotkey. I register the hotkey using RegisterHotKey and respond to WM_HOTKEY.
Problem
If the user plays a game and accidentally (or not accidentally) presses the hotkey combination, my window pops up and as a result the game is minimized.
Question
Is there a (native) way to know that the user is in gaming mode, or any other special mode, so that I could disable the hotkey response?
Note
I would also like it if Windows made this a built-in feature while I play games: for example, not responding to WinKey+D while I'm in gaming mode.
You can use the SHQueryUserNotificationState function to determine whether the user is playing a full screen D3D game. It will report QUNS_RUNNING_D3D_FULL_SCREEN.
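A small, untested sketch of that suggestion (SHQueryUserNotificationState lives in shellapi.h; link with shell32.lib):

```cpp
// Check whether a full-screen Direct3D application currently has the
// foreground, per SHQueryUserNotificationState.
#include <windows.h>
#include <shellapi.h>

bool IsFullScreenGameRunning() {
    QUERY_USER_NOTIFICATION_STATE state;
    if (SUCCEEDED(SHQueryUserNotificationState(&state)))
        return state == QUNS_RUNNING_D3D_FULL_SCREEN;
    return false;  // on failure, assume no game so the hotkey still works
}
```

Calling this at the top of the WM_HOTKEY handler and returning early when it reports true gives the "disable the hotkey while gaming" behaviour without unregistering the hotkey.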

Trying to write a c++ console program to change a setting controlled by a windows checkbox

Is it possible to create a keyboard shortcut to switch between the monitor and portion selection of this Wacom preferences window, via a C++ console program?
Sorry if this is poorly worded, I've had trouble trying to find the right words to search for ways to do it.
I think it should be possible, although a bit tedious. You should be able to use the Windows API and try EnumWindows/EnumDesktopWindows to identify the respective application window and its controls (which are also windows).
You should identify the window title and class IDs for the app window and the checkbox button controls; then, when you enumerate through all the desktop windows, you can identify the ones you are interested in.
Then you can use the SendMessage() API to send messages to the controls (windows) of interest to manipulate them.
It's a bit tedious, but it sounds possible.
An example of use here to get an idea:
http://www.cplusplus.com/forum/windows/25280/
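An untested sketch of those steps. Note the window title and control class/text below are made-up placeholders; a tool like Spy++ is the usual way to discover the real ones for the Wacom preferences window:

```cpp
// Locate a top-level window, find a button control inside it,
// and click the control programmatically.
#include <windows.h>

int main() {
    // 1. Find the application window by title (class name left as nullptr).
    HWND app = FindWindowW(nullptr, L"Wacom Tablet Properties");  // hypothetical title
    if (!app) return 1;

    // 2. Find the checkbox/radio control among the window's children.
    HWND box = FindWindowExW(app, nullptr, L"Button", L"Portion");  // hypothetical text
    if (!box) return 1;

    // 3. Send it a click message, as if the user had pressed it.
    SendMessageW(box, BM_CLICK, 0, 0);
    return 0;
}
```

FindWindow/FindWindowEx is a shortcut that works when titles are stable; the EnumWindows approach described above is more robust when they are not.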

How to simulate a mouse click without interfering with actual mouse in Python

I've made a simple click bot to automatically play an android game on my Windows PC. It currently identifies when certain things change on the screen then moves the mouse and clicks the correct button.
Currently I am using the following win32api functions to achieve this:
win32api.SetCursorPos(position)
win32api.mouse_event(win32con.MOUSEEVENTF_LEFTDOWN, 0, 0)
win32api.mouse_event(win32con.MOUSEEVENTF_LEFTUP, 0, 0)
These work great; however, when I use the bot it takes over my computer's mouse and I basically have to let it run. Is there any way I could simulate a click without it actually using my mouse, or is there a way I could isolate the bot on one of my other screens and work freely on the other?
There is a library specifically for dealing with user-interaction components and peripherals: pyautogui.
Here is a short and easy-to-follow piece of documentation for performing/simulating mouse click DOWN & UP:
https://pyautogui.readthedocs.org/en/latest/mouse.html#the-mousedown-and-mouseup-functions
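Note that pyautogui still drives the real cursor, so it does not address the "without interfering with my actual mouse" part. In the spirit of the SendMessage() answer earlier in this thread, one hedged alternative is to post click messages directly to the target window, which leaves the real cursor untouched; sketched in C++ for concreteness (the same messages can be posted from Python via win32gui.PostMessage). The window title and coordinates are made-up placeholders, and be aware that many applications, games especially, ignore synthetic messages like these:

```cpp
// Deliver a left click to a specific window at client coordinates,
// without moving the real mouse cursor.
#include <windows.h>

int main() {
    HWND target = FindWindowW(nullptr, L"BlueStacks");  // hypothetical emulator title
    if (!target) return 1;

    // Click at client coordinates (120, 240) inside the target window.
    LPARAM pos = MAKELPARAM(120, 240);
    PostMessageW(target, WM_LBUTTONDOWN, MK_LBUTTON, pos);
    PostMessageW(target, WM_LBUTTONUP, 0, pos);
    return 0;
}
```

If the target ignores posted messages, the other option from the question, isolating the bot on a separate desktop or virtual machine, is the more reliable route.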