I have a window that accepts a dropped file. What I would like is to be able to change the appearance of the window when the user starts dragging something, for example on the desktop (so not over the window itself).
For that I need to catch a global event from Windows. I think this event is called GiveFeedback (https://msdn.microsoft.com/en-us/library/system.windows.forms.control.givefeedback(v=vs.110).aspx)? But how can I detect it in Qt?
Thanks
You could set a Windows hook for mouse events and watch for potential drag/drops, or alternatively hook the message procedures of windows from other running processes (not particularly nice).
UAC may stop this working in some cases.
See:
SetWindowsHookEx
There may also be some COM interfaces for this. But you may not get events when the drag starts. See RegisterDragDrop.
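For what it's worth, here is a minimal sketch of the low-level mouse hook approach. It is not from the original answer: the "button held down while the cursor moves" heuristic and all names are my own assumptions.

    #include <windows.h>

    // Low-level mouse hook: treat "button down, then movement while held"
    // as the start of a potential drag anywhere on the desktop.
    static bool g_buttonDown = false;

    LRESULT CALLBACK MouseHookProc(int code, WPARAM wParam, LPARAM lParam)
    {
        if (code == HC_ACTION)
        {
            if (wParam == WM_LBUTTONDOWN)
                g_buttonDown = true;
            else if (wParam == WM_LBUTTONUP)
                g_buttonDown = false;
            else if (wParam == WM_MOUSEMOVE && g_buttonDown)
            {
                // A drag may be in progress: notify your application here,
                // e.g. post a custom message to your Qt window's HWND.
            }
        }
        return CallNextHookEx(NULL, code, wParam, lParam);
    }

    // Installation (WH_MOUSE_LL does not require a separate DLL):
    //   HHOOK hook = SetWindowsHookExW(WH_MOUSE_LL, MouseHookProc,
    //                                  GetModuleHandleW(NULL), 0);
    //   ... run a message loop ...
    //   UnhookWindowsHookEx(hook);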
Related
I need to create a window that will follow an external program (e.g. Notepad.exe). When the user moves Notepad.exe to a new position on the desktop, I want my window to move as well.
I did some research :-
Using SetParent (where the parent is Notepad): my window renders initially, but moving Notepad does not move or redraw my window.
Using SetWindowPos and SetWindowsHookEx on Notepad.
Using SetWindowSubclass. This doesn't work; I get an error code, probably because Notepad is a different process.
I am thinking option 2 is the path to dig into. Do you think this is the right approach? Is it overkill?
See the SetWinEventHook function.
Also see the SetWindowsHookEx function, specifically the WH_GETMESSAGE hook may be of use.
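As a rough illustration of the SetWinEventHook route (not tested against Notepad specifically; g_target and g_myWnd are placeholders):

    #include <windows.h>

    // Follow another process's top-level window.  EVENT_OBJECT_LOCATIONCHANGE
    // fires whenever the target moves or resizes; we then reposition our own
    // window next to it.
    static HWND g_target = NULL;   // e.g. found with FindWindowW
    static HWND g_myWnd  = NULL;   // your own top-level window

    void CALLBACK WinEventProc(HWINEVENTHOOK hHook, DWORD event, HWND hwnd,
                               LONG idObject, LONG idChild,
                               DWORD idEventThread, DWORD dwmsEventTime)
    {
        if (hwnd == g_target && idObject == OBJID_WINDOW)
        {
            RECT rc;
            if (GetWindowRect(g_target, &rc))
                SetWindowPos(g_myWnd, NULL, rc.right, rc.top, 0, 0,
                             SWP_NOSIZE | SWP_NOZORDER | SWP_NOACTIVATE);
        }
    }

    // WINEVENT_OUTOFCONTEXT means no DLL injection; the callback runs on the
    // thread that called SetWinEventHook, which therefore needs a message loop.
    //   HWINEVENTHOOK hook = SetWinEventHook(EVENT_OBJECT_LOCATIONCHANGE,
    //                                        EVENT_OBJECT_LOCATIONCHANGE,
    //                                        NULL, WinEventProc, 0, 0,
    //                                        WINEVENT_OUTOFCONTEXT);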
You can possibly use Windows Hooks to monitor Window movements and Mouse Input. Perhaps that can be an alternative?
http://msdn.microsoft.com/en-gb/library/windows/desktop/ms644960(v=vs.85).aspx
I am currently developing a user-interface DLL that uses the Win32 API. The DLL must work on numerous platforms: XP, Windows CE, etc. I have managed to incorporate docking, anchoring and so on, but I appear to have a problem with owner-drawn buttons. I can draw the button's states correctly (focused, clicked, default), but I cannot receive key notifications. Specifically, I want to perform a click on the button that currently has focus when the user presses Enter.
Note that I am using a windows message loop rather than a dialog message loop. I use windows hooks to hook into the window creation and set the user data to 'point' to my control instance. If I test for WM_KEYDOWN in the main message loop I can get a handle to my button control instance and could forward the message to the relevant control. Unfortunately, I am dealing with a lot of legacy code and this may not be an ideal solution.
So, my question is: what is the best way forward? Is subclassing the button control's window procedure a viable option, or is there an easier way?
Many thanks in advance.
The comments above are correct. The button with focus should be getting the key messages. But buttons don't (by themselves) respond to Enter; they respond to Space. It sounds like what you're missing is the typical dialog keyboard navigation, like the Tab key moving the focus and Enter activating the "default" button.
If you've got a typical Windows message pump, and you want the keyboard behavior normally associated with dialogs, then you need to use the IsDialogMessage API in your message loop. This means your window is essentially a "modeless dialog".
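A typical loop would look something like this (just a sketch; hMainWnd stands for your already-created top-level window):

    #include <windows.h>

    // Message pump that gives a plain (modeless) top-level window dialog-style
    // keyboard behavior: TAB navigation, arrow keys, Enter on the default button.
    int RunMessageLoop(HWND hMainWnd)
    {
        MSG msg;
        while (GetMessageW(&msg, NULL, 0, 0) > 0)
        {
            // Let IsDialogMessage handle TAB/arrow/Enter navigation first.
            if (!IsDialogMessageW(hMainWnd, &msg))
            {
                TranslateMessage(&msg);
                DispatchMessageW(&msg);
            }
        }
        return (int)msg.wParam;
    }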
Looks like standard window proc subclassing should do the trick. See http://msdn.microsoft.com/en-us/library/windows/desktop/ms633591(v=vs.85).aspx for details.
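Something along these lines, assuming hButton is the owner-drawn button and you want Enter to act as a click (the names are placeholders, not from the question):

    #include <windows.h>

    // Classic window-procedure subclassing of a button so Enter acts like a click.
    static WNDPROC g_oldButtonProc = NULL;

    LRESULT CALLBACK ButtonSubclassProc(HWND hwnd, UINT msg,
                                        WPARAM wParam, LPARAM lParam)
    {
        if (msg == WM_KEYDOWN && wParam == VK_RETURN)
        {
            // Tell the parent the button was clicked, just as Space would.
            SendMessageW(GetParent(hwnd), WM_COMMAND,
                         MAKEWPARAM(GetDlgCtrlID(hwnd), BN_CLICKED), (LPARAM)hwnd);
            return 0;
        }
        return CallWindowProcW(g_oldButtonProc, hwnd, msg, wParam, lParam);
    }

    // Installation:
    //   g_oldButtonProc = (WNDPROC)SetWindowLongPtrW(hButton, GWLP_WNDPROC,
    //                                                (LONG_PTR)ButtonSubclassProc);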
I'm trying to send a mouse click event to a game application. First, I used Spy++ to find out what messages the application receives. I see something like: WM_MOUSEACTIVATE, WM_WINDOWPOSCHANGING, WM_ACTIVATEAPP, WM_ACTIVATE, WM_SETFOCUS, ...
I tried sending the same messages I saw in Spy++, but it doesn't work. How can I send a mouse click to a game application without giving it focus? It runs in windowed mode. Thanks in advance.
You want WM_LBUTTONDOWN. You can always check MSDN for the documentation on which messages mean what.
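If you want to try posting the message without activating the window, the shape of it is roughly this (the window title and coordinates are made up):

    #include <windows.h>

    int main()
    {
        // "Game Window Title" is a placeholder; use the real caption or class name.
        HWND hwnd = FindWindowW(NULL, L"Game Window Title");
        if (hwnd == NULL)
            return 1;

        // Click at client coordinates (100, 200) without activating the window.
        LPARAM pos = MAKELPARAM(100, 200);
        PostMessageW(hwnd, WM_LBUTTONDOWN, MK_LBUTTON, pos);
        PostMessageW(hwnd, WM_LBUTTONUP, 0, pos);
        return 0;
    }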
The best way to automate applications and games is via SendInput. While in theory it should be possible to drive an application via WM_LBUTTONDOWN etc., many applications read the input state directly via lower-level APIs (such as GetAsyncKeyState), which don't change their state to reflect the messages processed from the message queue.
Using SendInput requires actually setting the game to the foreground as the input events are synthesized at a low level and are thus delivered to the active/focused window.
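A minimal SendInput click might look like this (sketch only; remember the target window has to be in the foreground):

    #include <windows.h>

    // Move the cursor to an absolute screen position and click there.
    // SendInput is delivered to the foreground window, so bring the game to the
    // foreground (e.g. with SetForegroundWindow) before calling this.
    void ClickAt(int screenX, int screenY)
    {
        SetCursorPos(screenX, screenY);

        INPUT inputs[2] = {};
        inputs[0].type       = INPUT_MOUSE;
        inputs[0].mi.dwFlags = MOUSEEVENTF_LEFTDOWN;
        inputs[1].type       = INPUT_MOUSE;
        inputs[1].mi.dwFlags = MOUSEEVENTF_LEFTUP;
        SendInput(2, inputs, sizeof(INPUT));
    }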
Here is the situation:
1) I have two toplevel windows, A and B
2) A is in front of B
How can I send keyboard focus to window B while keeping window A in front of B?
I'm assuming you control both windows, and this is on an X11 system like Linux. If not, it's much more challenging. I've done things like this within a single app, and here are some recollections.
You've probably figured out you can't just use gtk_widget_grab_focus() to do it. That only works for determining which widget within a window has focus when the window itself has focus.
It's X11 that determines which window gets a keyboard event, based on the window hierarchy, info from the window manager, etc. However, you can monkey around with that via GDK to get the result you want.
You'll have to learn about GDK event propagation, and probably read some of the GDK sources. But I believe that, generally, what you'll need to do is this (a simplified sketch follows the list):
Use gdk_event_handler_set() to install your own event handler. You'll need to do this after GTK+ is initialized, and chain to gtk_main_do_event().
When you get a keyboard event (GdkEventKey), look at the X event structure. If it has the XID for window A, replace that with the XID for window B, and pass it on to GTK+. You might need to duplicate the event, and not modify the original one.
If the windows belong to different apps, you can look at gdk_event_send_client_message(), but I've never used it.
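To make the list above concrete, here is a simplified sketch. It retargets the GdkEvent's window pointer rather than the raw X event, assumes both toplevels live in the same process on an X11/GTK+ 2 setup, and all variable names are placeholders:

    #include <gtk/gtk.h>

    /* Fill these in after the windows are realized,
     * e.g. with gtk_widget_get_window() on each toplevel. */
    static GdkWindow *g_gdkwin_a = NULL;   /* window A (currently focused)   */
    static GdkWindow *g_gdkwin_b = NULL;   /* window B (should get the keys) */

    /* Custom handler installed with gdk_event_handler_set(). */
    static void redirect_keys(GdkEvent *event, gpointer data)
    {
        (void)data;   /* unused */

        if ((event->type == GDK_KEY_PRESS || event->type == GDK_KEY_RELEASE)
            && event->key.window == g_gdkwin_a
            && g_gdkwin_b != NULL)
        {
            /* Retarget the key event at window B before GTK+ dispatches it.
             * A more careful version would use gdk_event_copy() instead of
             * editing the event in place. */
            g_object_ref(g_gdkwin_b);
            g_object_unref(event->key.window);
            event->key.window = g_gdkwin_b;
        }
        gtk_main_do_event(event);   /* chain to the default GTK+ dispatcher */
    }

    /* After gtk_init():
     *   gdk_event_handler_set(redirect_keys, NULL, NULL);
     */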
If you don't mind an indirect approach, you could forward the keyboard events from the top-level window to the one behind it. Of course, that assumes both windows are created by you, rather than writing a program that hovers in the background and reads keyboard input meant for a separate program.
gtk_window_set_keep_above(a) followed by gtk_window_present(b)?
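In code that would be something like this, with window_a and window_b being your two toplevels:

    gtk_window_set_keep_above(GTK_WINDOW(window_a), TRUE);   /* keep A above B   */
    gtk_window_present(GTK_WINDOW(window_b));                /* give B the focus */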
I'm creating a plugin framework where my application loads a series of plugin DLLs, then creates a new window and passes this new window's handle to the plugin. The plugin can then use this handle to create its own GUI.
Everything seems to be working very well. The only problem is that when I press TAB in a plugin widget (an edit box, for example), focus doesn't jump to another widget. I figured out that some Windows messages are passed and some others aren't. WM_KEYDOWN is passed for other keys, because I can type in the edit box, but it doesn't come through for the TAB key.
Hope somebody has a hint.
I'm using Borland VCL with CBuilder, but I think I could use any framework under WIN32 to create these plugins, since they never know how their parent windows were created.
It's a very complex matter indeed.
When you hit TAB, focus jumps to another control only when the controls belong to a modal dialog box. There are certain keys (ESC, LEFT, RIGHT, DOWN, UP, TAB) that the modal dialog's message function treats in a special way. If you want these keys to behave the same way in a modeless dialog box or any other window, you should change your message-processing loop and use IsDialogMessage inside it. You'll find more information about the IsDialogMessage function in MSDN; to better understand this, you may also want to check the Dialog Boxes section.
And, as was mentioned before, you should set WS_TABSTOP and WS_GROUP styles when needed.
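For example, a plugin-created edit control would need WS_TABSTOP so that IsDialogMessage-based navigation stops on it (hPluginParent and the control ID below are placeholders, not from the question):

    #include <windows.h>

    // Create a child edit control that participates in TAB navigation.
    HWND CreatePluginEdit(HWND hPluginParent)
    {
        return CreateWindowExW(WS_EX_CLIENTEDGE, L"EDIT", L"",
                               WS_CHILD | WS_VISIBLE | WS_TABSTOP,
                               10, 10, 200, 24,
                               hPluginParent, (HMENU)(INT_PTR)1001,
                               GetModuleHandleW(NULL), NULL);
    }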
Good luck!
I believe you'll have to take the following steps (a rough sketch follows the list):
Subclass your edit controls (and other controls as needed).
Capture the WM_KEYDOWN message in your edit control's WndProc.
Check to see if the shift key is currently held down (using GetKeyState or similar).
Call GetWindow, passing in a handle to your edit control and either GW_HWNDPREV or GW_HWNDNEXT depending on whether shift is held down. This will give you the handle to the window that should receive focus.
Call SetFocus and pass in the window handle you got in step 4.
Make sure you handle the case where your edit controls are multiline, as you might want to have a real tab character appear instead of moving to the next control.
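Putting the steps together, the subclassed window procedure might look roughly like this (g_oldEditProc is the procedure you saved when installing the subclass; the multiline case is skipped):

    #include <windows.h>

    // Subclassed EDIT window procedure that moves the focus on TAB / Shift+TAB.
    // Installed with SetWindowLongPtrW(hEdit, GWLP_WNDPROC, ...), saving the
    // original procedure in g_oldEditProc.
    static WNDPROC g_oldEditProc = NULL;

    LRESULT CALLBACK EditSubclassProc(HWND hwnd, UINT msg,
                                      WPARAM wParam, LPARAM lParam)
    {
        if (msg == WM_KEYDOWN && wParam == VK_TAB)
        {
            bool shiftDown = (GetKeyState(VK_SHIFT) & 0x8000) != 0;
            HWND next = GetWindow(hwnd, shiftDown ? GW_HWNDPREV : GW_HWNDNEXT);
            if (next != NULL)
                SetFocus(next);
            return 0;   // swallow the TAB so the edit control doesn't beep
        }
        return CallWindowProcW(g_oldEditProc, hwnd, msg, wParam, lParam);
    }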
Hope that helps!
I believe you suffer from having a different instance of the VCL in each of your DLLs and your EXE. Classes from the DLL are not the same as the ones from your EXE, even if they have the same name. Global variables (Application, Screen) are not shared between them either, and neither is the memory, since each module has its own memory manager.
The solution is to have the DLLs and the EXE share the VCL library and the memory manager. I am not a BCB developer but a Delphi developer; in Delphi we would just use the RTL and the VCL as runtime packages. Maybe you could do the BCB equivalent.
A DLL has its own TApplication object. To provide uniform key handling, make the DLL's Application point at the EXE's TApplication when the DLL loads.
Be sure to do the reverse on exit.
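In C++Builder terms, the idea would be something like the following sketch. The exported function names and the exact hand-over mechanism are my own assumptions, not part of the original answer:

    #include <vcl.h>

    static TApplication *SavedDllApplication = NULL;

    // Called by the EXE right after it loads the DLL, passing its own Application.
    extern "C" __declspec(dllexport) void __stdcall AttachToHost(TApplication *HostApp)
    {
        SavedDllApplication = Application;   // remember the DLL's own instance
        Application = HostApp;               // share the EXE's TApplication
    }

    // Called before the DLL is unloaded.
    extern "C" __declspec(dllexport) void __stdcall DetachFromHost()
    {
        Application = SavedDllApplication;   // restore the DLL's own instance
    }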
--
Michael