When using a Windows 7 touch device, Windows shows a little touch-keyboard indicator (tapping it brings up the on-screen touch keyboard) when you tab into or focus a textbox or similar input field (Notepad, etc.).
I want to write an application that is notified whenever exactly that happens: a textbox (or similar field) gains focus, no matter in which application.
Are applications informed about focus changes in other applications, or do I need to hook something?
Is there a way to do this in C++?
I believe the SetWinEventHook function, and specifically the EVENT_OBJECT_FOCUS event, is what you are looking for.
From the MSDN description:
An object has received the keyboard focus. The system sends this event for the following user interface elements: list-view control, menu bar, pop-up menu, switch window, tab control, tree view control, and window object. Server applications send this event for their accessible objects.
The hwnd parameter of the WinEventProc callback function identifies the window that receives the keyboard focus.
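For example, a minimal out-of-context hook might look like the sketch below (FocusEventProc is a name I made up; SetWinEventHook, EVENT_OBJECT_FOCUS and WINEVENT_OUTOFCONTEXT are the real API). Because the hook is out-of-context, the callback runs in your own process and no hook DLL is required:

    #include <windows.h>
    #include <wchar.h>

    // Called whenever any window in any process receives keyboard focus.
    void CALLBACK FocusEventProc(HWINEVENTHOOK hHook, DWORD event, HWND hwnd,
                                 LONG idObject, LONG idChild,
                                 DWORD idEventThread, DWORD dwmsEventTime)
    {
        wchar_t className[256];
        if (GetClassNameW(hwnd, className, 256) > 0 &&
            wcscmp(className, L"Edit") == 0)
        {
            // A standard edit control gained focus somewhere on the system.
        }
    }

    int main()
    {
        HWINEVENTHOOK hHook = SetWinEventHook(
            EVENT_OBJECT_FOCUS, EVENT_OBJECT_FOCUS, // only focus events
            NULL,                                   // out-of-context: no hook DLL
            FocusEventProc,
            0, 0,                                   // all processes, all threads
            WINEVENT_OUTOFCONTEXT);

        // SetWinEventHook requires a message loop on the registering thread.
        MSG msg;
        while (GetMessage(&msg, NULL, 0, 0) > 0)
        {
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }

        UnhookWinEvent(hHook);
        return 0;
    }

Checking the window class only covers standard Edit controls; a real implementation would use UI Automation or MSAA (IAccessible) on the hwnd to decide whether the focused object is actually an editable field.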
In my application, when I receive a WM_RBUTTONDOWN message (handled via ON_WM_RBUTTONDOWN()) in a certain window, I create a CMenu, populate it with some items, and display it with TrackPopupMenu(xxx). It requires no other interaction with Windows messages to be created. It defaults to accepting left clicks to select items, and I can see these messages coming in when I use the mouse.
I'm trying to allow use of this context menu on a touch screen. The parent window can receive WM_GESTURENOTIFY messages (for other functionality), and in all other aspects of my app, such as other windows and dialogs, it handles touch gestures fine: Spy++ shows gesture messages and a WM_LBUTTONDOWN, which gives me normal behaviour across the app. I can touch-select menu items when this menu is opened with a physical mouse right-click, with the touch input coming through as a WM_LBUTTONDOWN.
I've tried creating and displaying this menu by calling that same creation code again from a touch message, or just sending the window a WM_RBUTTONDOWN message manually after a touch, with the same flags. The menu is created fine and works as normal with a mouse, and as far as the app is concerned nothing is different. However, the CMenu is not receiving any touch messages at all: I get the touch-style cursor showing where I'm tapping, but nothing is being piped through to the menu.
I've tried switching from gestures to registered touch while this interaction happens, and ensuring the original gesture handle is closed in case it was blocking for whatever reason.
My assumption is that Windows is doing something additional behind the scenes beyond what my app is aware of to allow these messages to be sent through, so I'm a bit stuck for a solution.
I was able to get around this issue by enabling the tablet press-and-hold gesture (it's normally disabled), which is treated as a right click and yields a properly interactable context menu, rather than sending the right-click message myself. This works on a desktop with a touch screen and on a Windows tablet.
https://learn.microsoft.com/en-us/troubleshoot/developer/visualstudio/cpp/language-compilers/mfc-enable-tablet-press-and-hold-gesture
Adding ULONG CMyView::GetGestureStatus(CPoint /*ptTouch*/) { return 0; } was what enabled this to work.
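For context, a minimal sketch of where that override lives (CMyView stands for the asker's CView-derived class; GetGestureStatus is a real CWnd virtual):

    // In the view's class declaration:
    class CMyView : public CView
    {
        // ...
    protected:
        virtual ULONG GetGestureStatus(CPoint ptTouch);
    };

    // In the implementation file. Returning 0 clears the default
    // TABLET_DISABLE_PRESSANDHOLD flag, so Windows re-enables the
    // press-and-hold gesture and delivers it as a right click.
    ULONG CMyView::GetGestureStatus(CPoint /*ptTouch*/)
    {
        return 0;
    }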
Are notification/alert windows (that appear above the Windows system tray), like the examples below, just standard windows, owner-drawn HMENUs, or are they implemented using NOTIFYICONDATA? Note: I know that the actual system tray icon is implemented using NOTIFYICONDATA, but are the notification windows also implemented using this structure?
In my WinAPI C++ application I want to show a similar notification that appears above the system tray icon and has buttons, horizontal scroll bars, etc. I know I can just create a new HWND, position it above the system tray, and show that, but if there is a specific WinAPI 'system tray notification' class/function I would prefer to use that, hence my question.
Are notification/alert windows (that appear above the Windows system tray), like the examples below, just standard windows, owner-drawn HMENUs, or are they implemented using NOTIFYICONDATA?
These are custom dialogs displayed when needed. They are not implemented using NOTIFYICONDATA. You can use Shell_NotifyIconGetRect() to get the current location of your tray icon when needed.
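As a sketch, positioning a custom popup above your own icon might look like this (hPopup, hMainWnd and uIconId are placeholders; the hWnd/uID pair must match the NOTIFYICONDATA you registered with Shell_NotifyIcon()):

    #include <windows.h>
    #include <shellapi.h>

    // Show an already-created popup window just above our tray icon.
    void ShowPopupAboveTrayIcon(HWND hPopup, HWND hMainWnd, UINT uIconId)
    {
        NOTIFYICONIDENTIFIER nii = { sizeof(nii) };
        nii.hWnd = hMainWnd;   // window that owns the icon
        nii.uID  = uIconId;    // icon ID used in NOTIFYICONDATA

        RECT iconRect;
        if (SUCCEEDED(Shell_NotifyIconGetRect(&nii, &iconRect)))
        {
            RECT wr;
            GetWindowRect(hPopup, &wr);
            int width  = wr.right - wr.left;
            int height = wr.bottom - wr.top;

            // Right-align the popup with the icon and place it above.
            SetWindowPos(hPopup, HWND_TOPMOST,
                         iconRect.right - width, iconRect.top - height,
                         0, 0, SWP_NOSIZE | SWP_SHOWWINDOW);
        }
    }

Note that Shell_NotifyIconGetRect() is available from Windows 7 onwards.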
Tray notifications are all about notifying the process that owns the icon that a click event has occurred so it can then do its thing, whatever that may be.
There is no specialised mechanism or framework for anything GUI related in this case.
Best Practices: When a user right-clicks the icon, it should bring up a normal shortcut menu. However, the result of a single click with the left mouse button will vary with the function of the icon. It should display what the user would expect to see in the form best suited to that content: a popup window, a dialog box, or the program window itself. For instance, it could show status text for a status icon, or a slider for the volume control.
I am currently developing a user interface DLL that uses the Win32 API. The DLL must work on numerous platforms: XP, Windows CE, etc. I have managed to incorporate docking, anchoring, and so on, but appear to have a problem regarding owner-drawn buttons. I can draw the button's correct state: focused, clicked, default. However, I cannot receive key notifications. I specifically want to perform a click operation on a button that currently has focus, should the user press Enter.
Note that I am using a Windows message loop rather than a dialog message loop. I use Windows hooks to hook into window creation and set the user data to 'point' to my control instance. If I test for WM_KEYDOWN in the main message loop, I can get a handle to my button control instance and could forward the message to the relevant control. Unfortunately, I am dealing with a lot of legacy code, and this may not be an ideal solution.
So, my question is what is the best way forward. Is subclassing the button control's window procedure a viable option or is there an easier way?
Many thanks in advance.
The comments above are correct. The button with focus should be getting the key messages. But buttons don't (by themselves) respond to Enter; they respond to Space. It sounds like what you're missing is the typical dialog keyboard navigation, like the Tab key moving the focus and Enter activating the default button.
If you've got a typical Windows message pump, and you want the keyboard behavior normally associated with dialogs, then you need to use the IsDialogMessage API in your message loop. This means your window is essentially a "modeless dialog".
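A typical pump with IsDialogMessage added would look something like this (hMainWnd stands for your top-level window):

    #include <windows.h>

    // Message loop for a "modeless dialog"-style window.
    void RunMessageLoop(HWND hMainWnd)
    {
        MSG msg;
        while (GetMessage(&msg, NULL, 0, 0) > 0)
        {
            // IsDialogMessage handles Tab navigation, Enter for the default
            // button, Space for the focused button, and arrow keys within a
            // group; messages it consumes must not be dispatched again.
            if (!IsDialogMessage(hMainWnd, &msg))
            {
                TranslateMessage(&msg);
                DispatchMessage(&msg);
            }
        }
    }

For this to work, the child controls also need the WS_TABSTOP style (and WS_GROUP where appropriate).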
Looks like standard window proc subclassing should do the trick. See http://msdn.microsoft.com/en-us/library/windows/desktop/ms633591(v=vs.85).aspx for details.
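If you go the subclassing route, the comctl32 helpers are the safest way on desktop Windows; a hedged sketch (ButtonSubclassProc is an illustrative name) that makes Enter act like a click:

    #include <windows.h>
    #include <commctrl.h>
    #pragma comment(lib, "comctl32.lib")

    LRESULT CALLBACK ButtonSubclassProc(HWND hWnd, UINT uMsg, WPARAM wParam,
                                        LPARAM lParam, UINT_PTR uIdSubclass,
                                        DWORD_PTR dwRefData)
    {
        switch (uMsg)
        {
        case WM_KEYDOWN:
            if (wParam == VK_RETURN)
            {
                // Notify the parent exactly as a real button click would.
                SendMessage(GetParent(hWnd), WM_COMMAND,
                            MAKEWPARAM(GetDlgCtrlID(hWnd), BN_CLICKED),
                            (LPARAM)hWnd);
                return 0;
            }
            break;
        case WM_NCDESTROY:
            RemoveWindowSubclass(hWnd, ButtonSubclassProc, uIdSubclass);
            break;
        }
        return DefSubclassProc(hWnd, uMsg, wParam, lParam);
    }

    // Install with: SetWindowSubclass(hButton, ButtonSubclassProc, 1, 0);

Note that SetWindowSubclass() lives in comctl32 and isn't available on Windows CE, so older targets would need classic SetWindowLong(GWL_WNDPROC) subclassing instead.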
I have a dynamically created toolbar on a plain Win32 dialog. My buttons are added with & shortcuts, which correctly underlines the characters following the ampersand, but pressing Alt+(char) causes a beep and the button is not clicked.
It has been a while since I have done Win32 API development. Is there something that needs to be done to a dynamically created child window (toolbar) in order for the accelerator keys to work?
This could be something really obvious that I am missing...
Well... You have to write code to handle these keypresses and convert them into WM_COMMAND messages. The traditional way to do this is to define an accelerator table and process them using TranslateAccelerator() - but of course, you're free to do it however you like... Just make sure the keys you handle jibe with the keys you underline!
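A minimal sketch of that approach, building the table at runtime to match dynamically created buttons (the command IDs and hMainWnd are placeholders):

    #include <windows.h>

    #define ID_FILE_OPEN 101   // hypothetical command IDs that your
    #define ID_FILE_SAVE 102   // toolbar buttons already send

    void RunMessageLoop(HWND hMainWnd)
    {
        // Alt+O -> ID_FILE_OPEN, Alt+S -> ID_FILE_SAVE, matching the
        // characters underlined with & in the button text.
        ACCEL accels[] = {
            { FALT | FVIRTKEY, 'O', ID_FILE_OPEN },
            { FALT | FVIRTKEY, 'S', ID_FILE_SAVE },
        };
        HACCEL hAccel = CreateAcceleratorTable(accels, ARRAYSIZE(accels));

        // TranslateAccelerator() converts the keypress into a WM_COMMAND
        // message sent to hMainWnd before normal dispatching.
        MSG msg;
        while (GetMessage(&msg, NULL, 0, 0) > 0)
        {
            if (!TranslateAccelerator(hMainWnd, hAccel, &msg))
            {
                TranslateMessage(&msg);
                DispatchMessage(&msg);
            }
        }

        DestroyAcceleratorTable(hAccel);
    }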
You might also find this KB article helpful: How to use accelerator keys within a modal dialog box in Visual C++... Or, for a more in-depth (and MFC-free) look at implementing custom message processing in dialogs, check out Raymond Chen's articles on the dialog manager, specifically part 4: The dialog loop and part 9: Custom accelerators in dialog boxes (but seriously, read the whole thing, you know you want to...)
The beep indicates that the command isn't handled by any window in your app.
Since you created the toolbar dynamically, I would guess that the toolbar window isn't set up properly as a child window of your main window (i.e., its parent and owner windows are not set).
To test: click on the toolbar so it has the focus, then press Alt+(char); it should work.
Scenario: I would like a window control which is a sub-window in my dialog (a subwindow of a subwindow) to propagate its notification messages out to the dialog window.
e.g. A COMBOBOX contains an EDIT control. I have a circumstance where I would really like to know when the EDIT field gains and loses focus (mainly because the stupid COMBOBOX doesn't claim focus or give me notifications if it happens to its embedded EDIT).
But I can see how this could be a general issue: a Control issues a message to its parent WM_NOTIFY... which the direct parent doesn't care about, but maybe its parent does.
Is there a generic way to ask a Windows window to propagate notification messages from its subwindows?
e.g. if dialog D has a control C which has a sub-control C', then is there a way to ensure that D receives WM_NOTIFY messages from C'?
I believe you need to subclass the window; see http://msdn.microsoft.com/en-us/library/ms997565.aspx (note: the content has since been removed).
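For example, you could subclass the intermediate control C so that notifications it receives from C' are forwarded up to the dialog D as well. A sketch using the comctl32 helpers (ForwardNotifyProc is an illustrative name; note that an embedded EDIT actually reports focus changes to its parent via WM_COMMAND with EN_SETFOCUS/EN_KILLFOCUS rather than WM_NOTIFY):

    #include <windows.h>
    #include <commctrl.h>
    #pragma comment(lib, "comctl32.lib")

    LRESULT CALLBACK ForwardNotifyProc(HWND hWnd, UINT uMsg, WPARAM wParam,
                                       LPARAM lParam, UINT_PTR uIdSubclass,
                                       DWORD_PTR dwRefData)
    {
        if (uMsg == WM_NOTIFY || uMsg == WM_COMMAND)
        {
            // dwRefData holds the dialog's HWND, stashed at install time.
            SendMessage((HWND)dwRefData, uMsg, wParam, lParam);
        }
        if (uMsg == WM_NCDESTROY)
            RemoveWindowSubclass(hWnd, ForwardNotifyProc, uIdSubclass);
        return DefSubclassProc(hWnd, uMsg, wParam, lParam);
    }

    // Install on the combo (control C), passing the dialog D as reference data:
    // SetWindowSubclass(hCombo, ForwardNotifyProc, 1, (DWORD_PTR)hDlg);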