In my application, when I receive a WM_RBUTTONDOWN message (via an ON_WM_RBUTTONDOWN() message-map entry) in a certain window, I create a CMenu, populate it with some items, and then display it with TrackPopupMenu(xxx). No other interaction with Windows messages is needed to create it. It defaults to accepting left clicks to select items, and I can see those messages coming in when I use the mouse.
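For reference, the creation code is essentially this pattern (simplified here; the handler class, item IDs and strings are illustrative stand-ins, not my actual items):

void CMyWnd::OnRButtonDown(UINT nFlags, CPoint point)
{
    CMenu menu;
    menu.CreatePopupMenu();
    menu.AppendMenu(MF_STRING, ID_ITEM_ONE, _T("First item"));  // illustrative IDs
    menu.AppendMenu(MF_STRING, ID_ITEM_TWO, _T("Second item"));
    ClientToScreen(&point); // TrackPopupMenu expects screen coordinates
    menu.TrackPopupMenu(TPM_LEFTALIGN | TPM_LEFTBUTTON, point.x, point.y, this);
}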
I'm trying to allow use of this context menu on a touch screen. The parent window can receive WM_GESTURENOTIFY messages (for other functionality), and in all other parts of my app, such as other windows and dialogs, touch gestures work fine: Spy++ shows gesture messages and a WM_LBUTTONDOWN, which gives me normal behaviour across the app. I CAN touch-select menu items when this menu is opened with a physical mouse right click, with the touch input coming through as a WM_LBUTTONDOWN.
I've tried creating and displaying this menu by calling that same creation code again from a touch message, and by simply sending the window a WM_RBUTTONDOWN message manually after a touch, with the same flags. The menu is created fine and works as normal with a mouse, and as far as the app is concerned nothing is different. However, the CMenu is not receiving any touch messages at all: I get the touch-style cursor showing where I'm tapping, but nothing is piped through to the menu.
I've also tried switching from gestures to registering for touch input while this interaction happens, and ensuring the original gesture handle is closed in case it was blocking things for whatever reason.
My assumption is that Windows is doing something additional behind the scenes, beyond what my app is aware of, to allow these messages through, so I'm a bit stuck for a solution.
I was able to get around this issue by enabling the tablet press-and-hold gesture (it's normally disabled), which gets treated as a right click and produces a properly interactable context menu, rather than my sending the right-click message myself. This works on a desktop with a touch screen and on a Windows tablet.
https://learn.microsoft.com/en-us/troubleshoot/developer/visualstudio/cpp/language-compilers/mfc-enable-tablet-press-and-hold-gesture
Adding the following override was what enabled this to work:

ULONG CMyView::GetGestureStatus(CPoint /*ptTouch*/) { return 0; }
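For context, a minimal sketch of where the override lives, assuming an MFC CView-derived class as in the snippet above (GetGestureStatus is a CWnd virtual):

class CMyView : public CView
{
    // ...
protected:
    // Returning 0, instead of the default TABLET_DISABLE_PRESSANDHOLD,
    // re-enables the press-and-hold right-click gesture for this window.
    virtual ULONG GetGestureStatus(CPoint ptTouch);
};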
I am trying to make an Xlib tray bar for X11 that embeds tray icons using XEMBED, as described in the system tray spec. However, when I close the application that owns a tray icon, the icon is merely removed from the container window: the black container-window rectangle and the entry in my code still exist.
The XEMBED documentation says:
It is the responsibility of the embedder to keep track of all forwarded accelerators and to remove them when the client window dies.
However, my application does not get any events or any indication when an embedded window dies.
I basically only receive a dock request event and nothing else afterwards. When a dock request comes in, I create a child window for my panel that contains the tray window, and reparent the icon into it like this:
enum trayIconSize = 24; // dimensions of icon
icon.trayWindow = XCreateWindow(x.display, panel.window, 0, 0, ...);
icon.ownerHandle = event.data.l[2]; // window id of icon which wants to dock
XReparentWindow(x.display, icon.ownerHandle, icon.trayWindow, 0, 0);
XMoveResizeWindow(x.display, icon.ownerHandle, 0, 0, trayIconSize, trayIconSize);
Adding it to the panel works without any problems, but I don't know how to tell when to remove it again.

How do I make my application receive close events for those tray icons, or how do I check whether the reparented window still exists?
I have actually done this before myself: https://github.com/adamdruppe/taskbar. It has hacks for my specific setup in the width handling, but most of it should be reasonably usable, and the code might help guide you.
But what you want to do is ask for events on the icon window. It has been a while, so I'm partly using my own code as a guide here, but when I got the dock request, I called XSelectInput(dd, id, EventMask.StructureNotifyMask);
StructureNotifyMask subscribes you to events including MapNotify and DestroyNotify; you probably see where this is going :)
Once you have selected that input on the icon window, your regular event loop can check for the DestroyNotify and UnmapNotify events (my code checks both; to be honest, I'm not sure which one actually fires when the icon is removed) and compare the .window member of the event to your icon's window ID. If it matches, go ahead and remove it from your list, because it is gone now.
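A minimal sketch of that, assuming a single icon window and a hypothetical removeIcon() cleanup helper:

XSelectInput(display, iconWindow, StructureNotifyMask); // at dock-request time

XEvent ev;
for (;;) {
    XNextEvent(display, &ev);
    if (ev.type == DestroyNotify && ev.xdestroywindow.window == iconWindow)
        removeIcon(iconWindow); // hypothetical: destroy the container, drop the entry
    else if (ev.type == UnmapNotify && ev.xunmap.window == iconWindow)
        removeIcon(iconWindow); // some icons unmap rather than destroy
}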
My taskbar does seem to have a bug when the application crashes, as opposed to being closed normally, so I might still be missing something, but checking these events works in most cases.
Is it possible to add custom controls to a console window? You can use GetConsoleWindow() to get the window's handle, and then add your own menu items or consume all its events. I can't find any examples of people adding extra controls, though.
I am developing a small, high-performance serial terminal app. It is a console application; RichTextBox is too slow and has issues that make it unsuitable for VT100 terminal emulation.
I want to add some little graphics to show the state of the serial control lines (RTS/CTS/DTR/RI etc.) and perhaps a capture on/off toggle button. Ideally I'd like to add them to the window title bar. Bitmaps are all that are required.
After lots of research, I found that it isn't easy, or really even possible.
You can add controls to the window with CreateWindow(), but of course only in the client area, which is taken up entirely by the console text. However, you can at least create floating controls that way, which hover over the text to provide status information and the like.
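As a rough illustration of the floating-control idea (the geometry, text, and dedicated message loop here are my assumptions, not a tested recipe):

HWND hConsole = GetConsoleWindow();
HWND hStatus  = CreateWindowW(L"STATIC", L"RTS CTS DTR",
    WS_CHILD | WS_VISIBLE | SS_CENTER,
    4, 4, 120, 20,                 // hovers over the top-left of the text
    hConsole, NULL, GetModuleHandleW(NULL), NULL);

// The control lives on our thread, so something has to pump messages
// for it to paint (e.g. run this on a dedicated thread):
MSG msg;
while (GetMessage(&msg, NULL, 0, 0) > 0) {
    TranslateMessage(&msg);
    DispatchMessage(&msg);
}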
You can put controls in the window borders, but only with some hacking on XP, or with a new API that was introduced with Vista. The problem with that API is that it requires you to draw your own program icon and title text, and the console window doesn't seem to cope with it very well.
You can't add your own menu items because the console window doesn't pass the messages.
In the end I used the function keys for everything and gave a status indication by changing the console window icon.
I have an application that hooks into another application via an API. My application launches a modal window, which prevents keypresses from reaching the parent, as one would expect.
However, due to limitations in the API, I need to click one of the parent's toolbar buttons from time to time (yes, it's a kludge).
I wonder if this is possible while the modal window of my application is still active? Is it perhaps possible to send the required command directly into the parent's command queue?
Clicking the button programmatically with no modal window open should not be a problem; one could go by this link, for example: http://forums.codeguru.com/showthread.php?307633-How-to-run-a-very-long-SQL-statement. But I would prefer not having to close my window each time I have to click the button.
The fifth answer there is the one I find interesting, as I'm thinking it could make it possible to send the command without having to close my modal window first. It also feels ever so slightly less ugly.
First of all, when a modal dialog is shown, it runs its own message pump, so any attempt to fake input messages will land in the modal dialog's message pump, which is no good to you. So you'd have to send a message rather than fake input.
However, when a modal dialog is shown, its owning windows are disabled, which means that those windows will not respond to any messages you send. So I guess that means you could do the following (sketched after the steps):
1. Enable the owning top-level window that hosts the toolbar in question.
2. Send the message to the toolbar button.
3. Disable the owning window again.
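Something like this, where hOwner and ID_TOOLBUTTON are stand-ins for the owner window and the command ID that the toolbar button would normally generate:

EnableWindow(hOwner, TRUE);                                        // step 1
SendMessage(hOwner, WM_COMMAND, MAKEWPARAM(ID_TOOLBUTTON, 0), 0);  // step 2
EnableWindow(hOwner, FALSE);                                       // step 3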
Not the prettiest way to go about things, but you did ask!
I am currently developing a user-interface DLL that uses the Win32 API. The DLL must work on numerous platforms: XP, Windows CE, etc. I have managed to incorporate docking, anchoring, and so on, but I appear to have a problem regarding owner-drawn buttons. I can draw the button's correct state (focused, clicked, default); however, I cannot receive key notifications. Specifically, I want to perform a click on the button that currently has focus when the user presses Enter.
Note that I am using a windows message loop rather than a dialog message loop. I use Windows hooks to hook into window creation and set the user data to point to my control instance. If I test for WM_KEYDOWN in the main message loop, I can get a handle to my button control instance and could forward the message to the relevant control. Unfortunately, I am dealing with a lot of legacy code, and this may not be an ideal solution.
So, my question is: what is the best way forward? Is subclassing the button control's window procedure a viable option, or is there an easier way?
Many thanks in advance.
The comments above are correct: the button with focus should be getting the key messages. But buttons don't (by themselves) respond to Enter; they respond to Space. It sounds like what you're missing is the typical dialog keyboard navigation, such as the Tab key moving focus and Enter activating the "default" button.
If you've got a typical Windows message pump and you want the keyboard behavior normally associated with dialogs, then you need to use the IsDialogMessage API in your message loop. This means your window is essentially a "modeless dialog".
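For example, a plain message pump extended this way (hWndMain standing in for the window that hosts your buttons):

MSG msg;
while (GetMessage(&msg, NULL, 0, 0) > 0) {
    // IsDialogMessage handles Tab/Shift+Tab navigation and lets Enter
    // activate the default button (which your window procedure can
    // identify by responding to DM_GETDEFID).
    if (!IsDialogMessage(hWndMain, &msg)) {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
}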
It looks like standard window-proc subclassing should do the trick. See http://msdn.microsoft.com/en-us/library/windows/desktop/ms633591(v=vs.85).aspx for details.
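A minimal sketch of that approach, with hButton standing in for the control to subclass (BM_CLICK works for standard buttons; a custom owner-drawn control may need its own click handling):

WNDPROC g_origButtonProc;

LRESULT CALLBACK ButtonSubclassProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
{
    if (msg == WM_KEYDOWN && wp == VK_RETURN) {
        SendMessage(hwnd, BM_CLICK, 0, 0); // treat Enter as a click
        return 0;
    }
    return CallWindowProc(g_origButtonProc, hwnd, msg, wp, lp);
}

// Install the subclass (SetWindowLongPtr is the 64-bit-safe variant of
// the SetWindowLong call described in the linked documentation):
g_origButtonProc = (WNDPROC)SetWindowLongPtr(hButton, GWLP_WNDPROC,
                                             (LONG_PTR)ButtonSubclassProc);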
On a Windows 7 touch device, Windows shows a little touch-keyboard indicator (tapping it brings up the on-screen touch keyboard) when you tap/focus a textbox or any kind of input field (in Notepad, etc.).
I want to write an application that gets notified exactly when that happens: when a textbox (or similar) gets focused, no matter in which application.
Are applications informed about focus changes in other applications, or do I need to hook something?
Is there a way of doing this in C++?
I believe the SetWinEventHook function, and specifically the EVENT_OBJECT_FOCUS event, is what you are looking for.
From the MSDN description:
An object has received the keyboard focus. The system sends this event for the following user interface elements: list-view control, menu bar, pop-up menu, switch window, tab control, tree view control, and window object. Server applications send this event for their accessible objects.
The hwnd parameter of the WinEventProc callback function identifies the window that receives the keyboard focus.
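A minimal sketch, assuming an out-of-context hook so no DLL injection is needed (the class-name check is just an illustration of filtering for edit-like controls):

#include <windows.h>

void CALLBACK OnWinEvent(HWINEVENTHOOK hHook, DWORD event, HWND hwnd,
                         LONG idObject, LONG idChild,
                         DWORD idEventThread, DWORD dwmsEventTime)
{
    wchar_t cls[64] = L"";
    if (hwnd && GetClassNameW(hwnd, cls, 64)) {
        // hwnd is the window that just received focus, in any process;
        // e.g. compare cls against L"Edit" to spot plain textboxes.
    }
}

int main()
{
    HWINEVENTHOOK hook = SetWinEventHook(
        EVENT_OBJECT_FOCUS, EVENT_OBJECT_FOCUS,
        NULL, OnWinEvent, 0, 0,        // all processes, all threads
        WINEVENT_OUTOFCONTEXT);

    MSG msg;                           // the hook needs a message loop
    while (GetMessage(&msg, NULL, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }

    UnhookWinEvent(hook);
    return 0;
}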