I would like to allow the user to insert text that has been selected in any other application into my Qt application using the middle mouse button.
This functionality is already available with the help of the QClipboard class on operating systems that use the X11 Window System. According to the Qt documentation, this does not work on Windows because Windows does not support a global mouse selection.
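For reference, a minimal sketch of how the X11 behaviour is typically handled with QClipboard's Selection mode (the PasteTarget widget name is made up for illustration):

#include <QApplication>
#include <QClipboard>
#include <QMouseEvent>
#include <QPlainTextEdit>

// Illustrative widget that pastes the global X11 selection on a middle click.
class PasteTarget : public QPlainTextEdit
{
protected:
    void mousePressEvent(QMouseEvent *event) override
    {
        if (event->button() == Qt::MiddleButton) {
            // QClipboard::Selection is only populated on X11; on Windows
            // supportsSelection() returns false and nothing is inserted.
            QClipboard *clipboard = QApplication::clipboard();
            if (clipboard->supportsSelection())
                insertPlainText(clipboard->text(QClipboard::Selection));
            return;
        }
        QPlainTextEdit::mousePressEvent(event);
    }
};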
Is there a way to make this functionality available in Windows as well? Can this be achieved with a Qt and C++ implementation? Is there possibly a C++ library that I could integrate which offers this functionality?
On devices with touch input, Windows 10 displays nice touch-friendly menus, so I'm looking for a way to add this to my application. Is there any new flag or method to show such a menu without making it completely owner-drawn?
The control you are referring to is a MenuFlyout. Depending on whether you're using a mouse or touch, Windows 10 will automatically space the menu items closer together or further apart to be optimal for the input you're using.
To use this control (the easy way), your app should be a Windows 10 Universal (UWP) app, which can be written in C#, VB, or C++.
You can find official samples for UWP apps on GitHub.
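As a rough illustration, here is a minimal C++/CX sketch of building and showing a MenuFlyout; the ShowMenu method and the myButton element are assumptions for the example, not part of the official samples:

// Assumes a UWP XAML page with an element named myButton.
using namespace Windows::UI::Xaml::Controls;

void MainPage::ShowMenu()
{
    auto flyout = ref new MenuFlyout();

    auto openItem = ref new MenuFlyoutItem();
    openItem->Text = L"Open";
    flyout->Items->Append(openItem);

    auto closeItem = ref new MenuFlyoutItem();
    closeItem->Text = L"Close";
    flyout->Items->Append(closeItem);

    // Windows 10 spaces the items for touch or mouse automatically.
    flyout->ShowAt(myButton);
}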
How do I programmatically make a Metro app go fullscreen in Windows 8.1 with C++?
I have tried this but it doesn't work and always returns false:
Windows::UI::ViewManagement::ApplicationView::GetForCurrentView()->TryUnsnap();
Also, the docs say it's deprecated.
Thanks.
I do not believe this is possible; it is always the user's choice through interaction with the Metro shell to view your application at the size they desire. Your application cannot programmatically change that (otherwise you could implement a program that behaved maliciously).
Is there a way to show the virtual keyboard in Windows 8 styled apps programmatically with C++?
I'd rather not have any XAML involved, since I'm using DirectX (even though I guess they could be used together, it is easier just to drop keyboard support than to learn XAML and try to fit it in with DirectX).
If you mean without user action, then the answer is no. See the "Invocation and dismissal logic" and "User-driven invocation" sections in The touch keyboard.
Microsoft has an "Input: Touch keyboard" sample that demonstrates how to bring up the touch keyboard when the user moves the input focus to a custom control. Note that the sample does not implement the Text Services Framework (TSF), so there is no full input method support.
Update: the sample has been updated for Windows 8.1. The old Windows 8 samples can be downloaded from http://code.msdn.microsoft.com/windowsapps/Windows-8-app-samples-3bea89c8
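While you cannot show the keyboard from code on Windows 8.x, you can at least react when the user invokes or dismisses it, e.g. to move DirectX content out of the occluded area. A minimal C++/CX sketch using the InputPane events (the function name is illustrative, and it must run on the view's UI thread):

using namespace Windows::Foundation;
using namespace Windows::UI::ViewManagement;

void RegisterKeyboardHandlers()
{
    auto inputPane = InputPane::GetForCurrentView();

    inputPane->Showing += ref new TypedEventHandler<InputPane^, InputPaneVisibilityEventArgs^>(
        [](InputPane^, InputPaneVisibilityEventArgs^ args)
        {
            // args->OccludedRect is the part of the view covered by the keyboard.
            Rect occluded = args->OccludedRect;
            // ... resize or scroll the DirectX viewport here ...
        });

    inputPane->Hiding += ref new TypedEventHandler<InputPane^, InputPaneVisibilityEventArgs^>(
        [](InputPane^, InputPaneVisibilityEventArgs^)
        {
            // ... restore the original layout ...
        });
}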
I'm developing an application in C++ that's partially driven by touch-screen on Windows XP Embedded. Some text entry will be necessary for the user. So far we've been using the standard Windows On-Screen Keyboard (osk.exe), but there are two main problems:
It's rather small on a higher resolution screen which will probably make it hard for users to hit the right keys
It's too "ugly" for the customer, who'd like a slicker on-screen keyboard that integrates better with the custom look-and-feel of the application so far.
Therefore I'm looking for alternatives for the Windows On-Screen Keyboard (osk.exe) that allow a larger size of buttons and can be skinned. Ideally it would have a BSD-like license for unburdened integration into a commercial app, but a royalty-free commercial solution could work.
Do you know of any such applications, or have you had a similar project where you solved the issue in another way?
We are using Click-N-Type for our systems. It is completely resizable. It has some customization possibilities, but I never tried them. We use it on "normal" Windows XP, but it should also work on Windows XP Embedded.
I know this question is tagged 'c++', but here's an option for .Net that I found and integrated with less than 5 minutes work. (I've looked, and there isn't a .Net flavour of this question, and I guess it could be ported to C++ with very little effort too).
It uses the standard Windows On-Screen Keyboard (osk.exe), resizes it, docks it to the bottom of the screen and removes the title and menu bars, all from one call in your application.
The Code Project - Manage Windows XP On Screen Keyboard
The download is a single VB.Net class.
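For a C++ port, the same idea boils down to a few Win32 calls; here is a rough sketch (the "OSKMainClass" window class name and the fixed one-second wait are assumptions you should verify, e.g. with Spy++):

#include <windows.h>
#include <shellapi.h>

// Rough C++ equivalent of the VB.Net class: start osk.exe, strip its caption
// and menu, then dock it to the bottom of the primary screen.
bool DockOnScreenKeyboard()
{
    ShellExecuteW(nullptr, L"open", L"osk.exe", nullptr, nullptr, SW_SHOWNORMAL);
    Sleep(1000);  // crude wait for the keyboard window to appear

    // Assumed window class name; verify it for your Windows version.
    HWND osk = FindWindowW(L"OSKMainClass", nullptr);
    if (!osk)
        return false;

    // Remove the title bar and system menu so it blends into the application.
    LONG_PTR style = GetWindowLongPtrW(osk, GWL_STYLE);
    style &= ~(WS_CAPTION | WS_SYSMENU);
    SetWindowLongPtrW(osk, GWL_STYLE, style);

    // Resize the keyboard and dock it to the bottom third of the screen.
    int screenW = GetSystemMetrics(SM_CXSCREEN);
    int screenH = GetSystemMetrics(SM_CYSCREEN);
    int kbdH = screenH / 3;
    return SetWindowPos(osk, HWND_TOPMOST, 0, screenH - kbdH, screenW, kbdH,
                        SWP_SHOWWINDOW | SWP_FRAMECHANGED) != 0;
}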
Please check this WPF component (http://fpscomponents.com/Product.aspx?id=8); it is fully customizable via its built-in editor, so the programmer can fill it with their own language and define the layout!
Check johngnazzo's code:
http://www.daniweb.com/forums/thread4548.html#
Why not write your own keyboard UI? This would (should) be relatively trivial and give you complete control over its look and feel.
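If you go that route, the main pitfall is focus: the keyboard window must not steal focus from the control being typed into. A minimal Win32 sketch of that core trick, using WS_EX_NOACTIVATE and SendInput (the single hard-coded "A" key stands in for a real key layout):

#include <windows.h>

// The keyboard window is created with WS_EX_NOACTIVATE so clicking it never
// steals focus from the target application; SendInput() injects the character
// into whatever control currently has the focus.

static void SendChar(wchar_t ch)
{
    INPUT in[2] = {};
    in[0].type = INPUT_KEYBOARD;
    in[0].ki.wScan = ch;
    in[0].ki.dwFlags = KEYEVENTF_UNICODE;                     // key down as a Unicode character
    in[1] = in[0];
    in[1].ki.dwFlags = KEYEVENTF_UNICODE | KEYEVENTF_KEYUP;   // matching key up
    SendInput(2, in, sizeof(INPUT));
}

static LRESULT CALLBACK KbdProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
{
    switch (msg) {
    case WM_MOUSEACTIVATE:
        return MA_NOACTIVATE;          // never take activation on a click
    case WM_LBUTTONUP:
        SendChar(L'A');                // demo: every click types "A"
        return 0;
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProcW(hwnd, msg, wp, lp);
}

int WINAPI wWinMain(HINSTANCE hInst, HINSTANCE, PWSTR, int)
{
    WNDCLASSW wc = {};
    wc.lpfnWndProc = KbdProc;
    wc.hInstance = hInst;
    wc.hbrBackground = (HBRUSH)(COLOR_BTNFACE + 1);
    wc.lpszClassName = L"MyOnScreenKeyboard";
    RegisterClassW(&wc);

    CreateWindowExW(WS_EX_NOACTIVATE | WS_EX_TOPMOST,   // stay on top, never activate
                    wc.lpszClassName, L"Keyboard", WS_POPUP | WS_VISIBLE,
                    100, 600, 400, 150, nullptr, nullptr, hInst, nullptr);

    MSG msg;
    while (GetMessageW(&msg, nullptr, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessageW(&msg);
    }
    return 0;
}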
I programmed an on-screen keyboard in Java.
It works very well when you want to type into Java components and Java frames.
If you want to type into any open window, you have to send the key events via the Robot class. The problem I had is that the focus owner receives the sent key, and when you open the keyboard, the keyboard itself takes the focus.
As far as I know, you cannot really implement a global on-screen keyboard in Java.
If you only want to use the keyboard with Java applications, use Java.
Otherwise you should use another language.
You should use a native language where you can handle the OS focus owner, or a language where you can completely disable the keyboard's focus but still bring the keyboard to the front of the screen.
Take a look at the Chessware virtual keyboard.
http://hot-virtual-keyboard.com/