Platform: Windows
I created a control using the Windows API (CreateWindowExW) and set its parent to a panel HWND.
But the control does not seem to handle arrow keys, Enter and Tab properly.
Is there any flag in wxWidgets that gives a control created by CreateWindowExW the same ability as edit controls to capture arrow, Enter and Tab keys?
The problem might be due to not using WS_EX_CONTROLPARENT for your control when creating it; this style is needed for the built-in tab navigation to work.
And while I don't think it's going to help with your particular problem, I'd still like to point out that embedding a native control in an application using wxWidgets is not quite as simple as just giving it the HWND of an existing control as parent. You may want to look at wxNativeWindow (new in wxWidgets 3.1.0) for how to do it correctly.
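For reference, a minimal sketch of passing WS_EX_CONTROLPARENT (plus WS_TABSTOP) when creating the native control; the class name and the panel handle parameter are placeholders, not the poster's actual code:

```cpp
#include <windows.h>

// panelHwnd: the HWND of the panel used as parent (placeholder names).
HWND CreateEmbeddedControl(HWND panelHwnd)
{
    return ::CreateWindowExW(
        WS_EX_CONTROLPARENT,                 // needed for built-in TAB navigation
        L"MyNativeControlClass",             // placeholder: your registered window class
        L"",
        WS_CHILD | WS_VISIBLE | WS_TABSTOP,
        0, 0, 200, 100,
        panelHwnd,                           // parent: the panel's HWND
        nullptr, ::GetModuleHandleW(nullptr), nullptr);
}
```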
Related
I am writing a keylogger for Windows. I am planning to get the pressed key with GetAsyncKeyState(KEY) and a hidden console. After a key press has been identified, I will get the currently focused window with GetForegroundWindow and identify which program was on top when the key was pressed. I also want to be able to differentiate between key presses for passwords and other kinds of input. Is there a way to do it? How?
I am not writing a malicious software. This is for an assignment in Advanced Programming course.
If the foreground app is using standard Win32 UI controls, then try this:
use GetForegroundWindow() to get the HWND of the foreground window.
then use GetWindowThreadProcessId() and GetGUIThreadInfo() to get the foreground window's currently focused child control.
then use GetClassName() to check if it is a standard EDIT control (this check might fail in some UI frameworks! You might have to use the UI Automation API to detect the control type)
then use GetWindowLong/Ptr() to check if it has the ES_PASSWORD style applied.
However, if the foreground app is not using standard Win32 UI controls, and/or is custom-drawing them, or the like, then all bets are off.
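A rough sketch of those steps, assuming standard Win32 controls (function and variable names are mine, and error handling is minimal):

```cpp
#include <windows.h>
#include <wchar.h>

// Returns true when the focused control of the foreground window is a
// standard EDIT control with the ES_PASSWORD style.
bool ForegroundControlIsPasswordEdit()
{
    HWND fg = GetForegroundWindow();
    if (!fg)
        return false;

    DWORD tid = GetWindowThreadProcessId(fg, nullptr);

    GUITHREADINFO gti = { sizeof(gti) };
    if (!GetGUIThreadInfo(tid, &gti) || !gti.hwndFocus)
        return false;

    wchar_t cls[64] = {};
    GetClassNameW(gti.hwndFocus, cls, 64);
    if (_wcsicmp(cls, L"EDIT") != 0)
        return false;                       // not a standard Win32 edit control

    LONG_PTR style = GetWindowLongPtrW(gti.hwndFocus, GWL_STYLE);
    return (style & ES_PASSWORD) != 0;      // password-style edit box
}
```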
Is it possible to create a keyboard shortcut to switch between the monitor and portion selection of this Wacom preferences window, via a C++ console program?
Sorry if this is poorly worded; I've had trouble finding the right words to search for ways to do this.
I think it should be possible, although a bit tedious. You should be able to use the Windows API and EnumWindows/EnumDesktopWindows to identify the respective application window and its controls (which are also windows).
You should identify the window title and class IDs for the app window and the checkbox/button controls; then, when you enumerate through all the desktop windows, you can identify the ones you are interested in.
Then you can use the SendMessage() API to send messages to the controls (windows) of interest to manipulate them.
It's a bit tedious, but sounds possible.
An example of use here to get an idea:
http://www.cplusplus.com/forum/windows/25280/
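A rough sketch of the idea; the window title and the control caption below are placeholders that you would have to replace with the real ones (found with a tool like Spy++):

```cpp
#include <windows.h>
#include <wchar.h>

BOOL CALLBACK FindTargetControl(HWND child, LPARAM lparam)
{
    wchar_t text[256] = {};
    GetWindowTextW(child, text, 256);
    if (wcscmp(text, L"Portion") == 0)       // placeholder control caption
    {
        *reinterpret_cast<HWND*>(lparam) = child;
        return FALSE;                        // stop enumerating
    }
    return TRUE;
}

int main()
{
    // Placeholder title: the real Wacom window caption must be looked up.
    HWND wacom = FindWindowW(nullptr, L"Wacom Tablet Properties");
    if (!wacom)
        return 1;

    HWND ctrl = nullptr;
    EnumChildWindows(wacom, FindTargetControl, reinterpret_cast<LPARAM>(&ctrl));
    if (ctrl)
        SendMessageW(ctrl, BM_CLICK, 0, 0);  // "click" the button/radio control
    return 0;
}
```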
I am developing an application for Windows 7 devices and I'm using an embedded web browser (WebKit). Normally, touching an edit control on a tablet device causes a little keyboard icon to appear. However, since my edit control is in the browser, it's not a real window with an HWND, and Windows doesn't bring up the icon you can click on to bring up the on-screen keyboard.
Is there an API I can use to cause the little keyboard icon to appear as it normally would when focus goes to an edit control?
I tried searching MSDN, no success.
I looked at the Windows keyboard API. No dice.
I tried running OSK.exe. This could bring up multiple instances of the keyboard, and it's just sloppy. I want to get the same effect a user would get when tapping a Windows edit control, so the UI is consistent.
There must be an API that can bring up that on screen keyboard.
Thanks.
David
Not sure if you have this answered already. I have been looking at doing a similar thing, although it is part of a larger application and the keyboard is rarely used (but nevertheless had to be supported). I assigned a shortcut key (right-click the Windows 7 On-Screen Keyboard application and choose Properties; in the Shortcut tab, assign any shortcut you'd like). When I touch a SurfaceTextEdit control, I emulate the shortcut key from my C++ code using SendInput(). I know this is a hack, but it worked well for me because I rarely used the on-screen keyboard in my application.
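For illustration, a sketch of emulating such a shortcut with SendInput(); the Ctrl+Alt+K combination is only an assumption, use whatever you assigned in the shortcut's Properties:

```cpp
#include <windows.h>

// Assumes the shortcut assigned to the On-Screen Keyboard is Ctrl+Alt+K.
void SendOskShortcut()
{
    WORD keys[3] = { VK_CONTROL, VK_MENU, 'K' };
    INPUT in[6] = {};

    for (int i = 0; i < 3; ++i)              // press Ctrl, Alt, K
    {
        in[i].type = INPUT_KEYBOARD;
        in[i].ki.wVk = keys[i];
    }
    for (int i = 0; i < 3; ++i)              // release in reverse order
    {
        in[3 + i].type = INPUT_KEYBOARD;
        in[3 + i].ki.wVk = keys[2 - i];
        in[3 + i].ki.dwFlags = KEYEVENTF_KEYUP;
    }
    SendInput(6, in, sizeof(INPUT));
}
```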
I have a simple console application written in C++ that acts as a stub for launching another application through its jump list. The purpose is to add jump-list abilities to applications that do not support them. Call it stub.exe. When running stub.exe, it creates a custom jump list using these steps (taken right from the MS samples):
create an ICustomDestinationList
ICustomDestinationList::BeginList()
create an IObjectCollection
for_each item_to_add:
    create an IShellLink, set its path/arguments/title/icon
    add IShellLink to the IObjectCollection
get the IObjectArray interface from the IObjectCollection
call ICustomDestinationList::AddUserTasks( IObjectArray interface )
ICustomDestinationList::CommitList()
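For reference, a condensed sketch of those steps in code, loosely following the pattern of the MS samples; paths, arguments and titles are placeholders, and error handling and interface releases are mostly omitted:

```cpp
#include <windows.h>
#include <shobjidl.h>
#include <shlguid.h>
#include <objectarray.h>
#include <propkey.h>
#include <propvarutil.h>

// COM must already be initialized; link with propsys.lib for the PropVariant helpers.
void BuildJumpList()
{
    ICustomDestinationList* pList = nullptr;
    if (FAILED(CoCreateInstance(CLSID_DestinationList, nullptr,
                                CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&pList))))
        return;

    UINT minSlots = 0;
    IObjectArray* pRemoved = nullptr;
    pList->BeginList(&minSlots, IID_PPV_ARGS(&pRemoved));

    IObjectCollection* pColl = nullptr;
    CoCreateInstance(CLSID_EnumerableObjectCollection, nullptr,
                     CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&pColl));

    // One task shown here; repeat for each item_to_add.
    IShellLinkW* pLink = nullptr;
    CoCreateInstance(CLSID_ShellLink, nullptr, CLSCTX_INPROC_SERVER,
                     IID_PPV_ARGS(&pLink));
    pLink->SetPath(L"C:\\path\\to\\target.exe");   // placeholder target
    pLink->SetArguments(L"--some-arg");            // placeholder arguments

    // The title shown in the jump list is stored as PKEY_Title.
    IPropertyStore* pStore = nullptr;
    pLink->QueryInterface(IID_PPV_ARGS(&pStore));
    PROPVARIANT pv;
    InitPropVariantFromString(L"Launch target", &pv);
    pStore->SetValue(PKEY_Title, pv);
    pStore->Commit();
    PropVariantClear(&pv);

    pColl->AddObject(pLink);

    IObjectArray* pArray = nullptr;
    pColl->QueryInterface(IID_PPV_ARGS(&pArray));
    pList->AddUserTasks(pArray);
    pList->CommitList();

    // Release pArray, pColl, pStore, pLink, pRemoved and pList here.
}
```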
When pinning stub.exe to the taskbar and right-clicking it, the jump list appears and it contains all the IShellLinks that were added. When clicking an item, it will launch the corresponding process.
Now I'd like a process launched through this jump list to have its window(s) grouped under stub.exe's taskbar icon, instead of having its own group. The key to getting this to work seems to be the AppUserModelID. This is what I tried so far:
just for testing, create a couple of shortcuts and set the id through IPropertyStore->SetValue( PKEY_AppUserModel_ID, "id" ). Indeed, when launching these shortcuts, they will all group under the same taskbar icon.
since the shortcuts do what I want, I tried adding shortcuts to stub.exe's jump list: no effect. The shortcuts don't even show up in the jump list (maybe one cannot have a shortcut to a shortcut?), yet all methods return S_OK
setting the PKEY_AppUserModel_ID on each of the IShellLinks that get added to the jump list: no effect
calling ICustomDestinationList->SetAppID(): no effect
instead of using user tasks, I tried SHAddToRecentDocs: no effect. The recent-docs list does not show up. But now things get messy. After setting the AppUserModelID on the shortcut that is responsible for the pinned taskbar item (the one in %APPDATA%/Roaming/Microsoft/Internet Explorer/Quick Launch/User Pinned/TaskBar), the jump list changed: it does not show the 'Tasks' section anymore, but does show 'Recent' and the items I added using SHAddToRecentDocs. Now when clicking them I get a dialog box with a title that starts with 'd:\desktop' followed by Chinese characters. Hovering over the items in the jump list also shows Chinese characters instead of the description I set.
Questions:
What's with the Chinese characters in the jump list?
How come setting the app ID on the taskbar shortcut toggles between the 'Tasks' and 'Recent' sections; why are they not both there?
What would be the way, if it is even possible, to achieve what I actually want: a custom jump list whose items, when launched, group under its taskbar icon? (Note that the processes I plan to launch there do not currently have their app ID set.)
Not many reactions here ;]
In the meantime I managed to solve the main problem myself. It's not quite a straightforward solution, but it fulfills the requirements: a program runs in the background and installs a CBT hook. Each time an application creates a window (HookProc code = HCBT_CREATEWND), the hook checks the application's path against a map containing paths and desired application IDs. If a match is found, the application ID of the HWND is set. Since this occurs before the window is actually shown, and is combined with the custom task list, from a user's point of view the application behaves just like one that does support a recent/pinned document list.
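A rough sketch of what such a hook procedure could look like; the path and the AppUserModelID below are hypothetical, and the real code does a map lookup instead of a single comparison:

```cpp
#include <windows.h>
#include <shellapi.h>
#include <propsys.h>
#include <propkey.h>
#include <propvarutil.h>
#include <wchar.h>

// Lives in the hook DLL, so it runs inside the process that creates the window.
LRESULT CALLBACK CbtProc(int code, WPARAM wParam, LPARAM lParam)
{
    if (code == HCBT_CREATEWND)
    {
        HWND hwnd = reinterpret_cast<HWND>(wParam);

        wchar_t path[MAX_PATH] = {};
        GetModuleFileNameW(nullptr, path, MAX_PATH);   // path of the creating process

        // Hypothetical single match; the real code looks the path up in a map.
        if (_wcsicmp(path, L"C:\\apps\\target.exe") == 0)
        {
            IPropertyStore* store = nullptr;
            if (SUCCEEDED(SHGetPropertyStoreForWindow(hwnd, IID_PPV_ARGS(&store))))
            {
                PROPVARIANT pv;
                InitPropVariantFromString(L"MyCompany.Stub", &pv);  // desired AppUserModelID
                store->SetValue(PKEY_AppUserModel_ID, pv);
                PropVariantClear(&pv);
                store->Release();
            }
        }
    }
    return CallNextHookEx(nullptr, code, wParam, lParam);
}

// Installed from the background program with something like:
//   SetWindowsHookExW(WH_CBT, CbtProc, hHookDll, 0);
```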
I'm creating a plugin framework where my application loads a series of plugin DLLs, then creates a new window and passes this new window's handle to the plugin. The plugin can then use this handle to create its own GUI.
Everything seems to be working very well. The only problem is that when I press TAB on a plugin widget (an edit box, for example), it doesn't jump to another widget. I figured out that some Windows messages are passed and some others aren't. WM_KEYDOWN is passed for other keys, because I can type in the edit box, but it doesn't arrive for the TAB key.
Hope somebody has a hint.
I'm using Borland VCL with C++Builder, but I think I could use any framework under Win32 to create these plugins, since they never know how their parent windows were created.
It's a very complex matter indeed.
When you hit TAB, focus jumps to another control only when these controls belong to a modal dialog box. In fact, there are some keys like ESC, LEFT, RIGHT, DOWN, UP and TAB which the modal dialog message function treats in a special way. If you want these keys to behave in a similar way with a modeless dialog box or any other window, you should change your message-processing function and use IsDialogMessage inside it. You'll find more information about the IsDialogMessage function on MSDN; to better understand this stuff, you may also check the Dialog Boxes section.
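A typical modified message loop would look roughly like this; hDlg stands for the modeless dialog or container window whose children should get the dialog navigation keys:

```cpp
#include <windows.h>

// hDlg: the modeless dialog / container window whose children should
// receive TAB / arrow-key navigation.
void RunMessageLoop(HWND hDlg)
{
    MSG msg;
    while (GetMessage(&msg, nullptr, 0, 0) > 0)
    {
        if (!IsDialogMessage(hDlg, &msg))   // handles TAB, arrows, ESC, etc.
        {
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
    }
}
```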
And, as was mentioned before, you should set WS_TABSTOP and WS_GROUP styles when needed.
Good luck!
I believe you'll have to take the following steps:
1. Subclass your edit controls (and other controls as needed).
2. Capture the WM_KEYDOWN message in your edit control's WndProc.
3. Check to see if the shift key is currently held down (using GetKeyState or similar).
4. Call GetWindow, passing in a handle to your edit control and either GW_HWNDPREV or GW_HWNDNEXT depending on whether shift is held down. This will give you the handle to the window that should receive focus.
5. Call SetFocus and pass in the window handle you got in step 4.
6. Make sure you handle the case where your edit controls are multiline, as you might want to have a real tab character appear instead of moving to the next control.
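A sketch of those steps using comctl32 subclassing (SetWindowSubclass); the attachment call at the end and the subclass ID are illustrative:

```cpp
#include <windows.h>
#include <commctrl.h>
#pragma comment(lib, "comctl32.lib")

LRESULT CALLBACK EditSubclassProc(HWND hwnd, UINT msg, WPARAM wParam,
                                  LPARAM lParam, UINT_PTR /*id*/, DWORD_PTR /*refData*/)
{
    if (msg == WM_KEYDOWN && wParam == VK_TAB)
    {
        bool shift = (GetKeyState(VK_SHIFT) & 0x8000) != 0;
        HWND next = GetWindow(hwnd, shift ? GW_HWNDPREV : GW_HWNDNEXT);
        if (next)
            SetFocus(next);
        return 0;                            // swallow the TAB keystroke
    }
    return DefSubclassProc(hwnd, msg, wParam, lParam);
}

// Attach it to an edit control (illustrative subclass ID):
//   SetWindowSubclass(hwndEdit, EditSubclassProc, 1, 0);
```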
Hope that helps!
I believe you suffer from having a different instance of the VCL in each of your DLLs and your EXE. Classes from the DLL are not the same as the ones from your EXE, even if they have the same name. Also, global variables (Application, Screen) are not shared between them. Neither is the memory, since they both have their own memory manager.
The solution is to have the DLLs and the EXE share the VCL library and the memory manager. I am not a BCB developer but a Delphi developer; in Delphi we would just use the RTL and the VCL as runtime packages. Maybe you could do the BCB equivalent.
A DLL has its own TApplication object. To provide uniform key handling, when the DLL loads, point the DLL's TApplication at the EXE's TApplication. Be sure to do the reverse on exit.
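A very rough sketch of that idea for C++Builder; the exported function names are hypothetical, and whether reassigning the global Application pointer is safe depends on your VCL version:

```cpp
#include <vcl.h>

static TApplication* SavedDllApp = nullptr;

// Exported from the plugin DLL; the host EXE calls it right after LoadLibrary.
extern "C" __declspec(dllexport) void __stdcall UseHostApplication(TApplication* hostApp)
{
    SavedDllApp = Application;   // remember the DLL's own TApplication
    Application = hostApp;       // route handling through the EXE's TApplication
}

// Called by the host just before FreeLibrary.
extern "C" __declspec(dllexport) void __stdcall RestoreDllApplication()
{
    Application = SavedDllApp;   // restore before the DLL unloads
}
```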
--
Michael