Display touch-friendly menu on Windows 10 - C++

On devices with touch input Windows 10 displays nice touch-friendly menus, so I'm looking for a way to add this to my application. Is there any new flag or a method to show such a menu without making it completely owner-drawn?

The control you are referring to is a MenuFlyout. Depending on whether you're using mouse or touch, Windows 10 automatically spaces the menu items closer together or further apart to suit the input you're using.
To use this control (the easy way), your app should be a Windows 10 Universal (UWP) app, which can be written in C#, VB, or C++.
You can find official samples for UWP apps on GitHub.
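For illustration, here is a minimal sketch of building a MenuFlyout in code and showing it at the tap position (written in C++/WinRT syntax here; the same XAML classes are available from C++/CX and C#). The helper name, menu item texts, and click handler are placeholders, not taken from any sample:

    #include <winrt/Windows.Foundation.h>
    #include <winrt/Windows.Foundation.Collections.h>
    #include <winrt/Windows.UI.Xaml.h>
    #include <winrt/Windows.UI.Xaml.Controls.h>
    #include <winrt/Windows.UI.Xaml.Input.h>

    using namespace winrt;
    using namespace Windows::UI::Xaml;
    using namespace Windows::UI::Xaml::Controls;
    using namespace Windows::UI::Xaml::Input;

    // Hypothetical helper: call this from a RightTapped handler (raised by
    // press-and-hold on touch) to pop a context menu on 'target'.
    void ShowContextMenu(UIElement const& target, RightTappedRoutedEventArgs const& e)
    {
        MenuFlyout flyout;

        MenuFlyoutItem open;
        open.Text(L"Open");
        open.Click([](auto&&, auto&&) { /* handle the command */ });
        flyout.Items().Append(open);

        MenuFlyoutItem save;
        save.Text(L"Save");
        flyout.Items().Append(save);

        // The system chooses mouse- or touch-optimized item spacing on its own
        // when the flyout opens; nothing extra is needed for touch friendliness.
        flyout.ShowAt(target, e.GetPosition(target));
    }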

Related

Override Close Box on Windows 10 Universal Apps UWP

I'm trying to prevent the app from being closed by clicking the Close box on the app window.
For example, with a text editor that has unsaved changes, upon pressing the Close box I would first display, "Do you want to save changes before exiting?"
How can I detect that the app is about to close and prevent that from happening?
I'm using C++, and this needs to be for Windows 10 Universal Apps UWP.
I already know how to do this for Win32.
The comments are correct. There is currently no way for a regular Store app to do this.
However, with the Creators Update (and corresponding SDK) we have included a preview API that you can now check out for this functionality:
The Windows.UI.Core.Preview.SystemNavigationManagerPreview class provides a CloseRequested event that an app can mark as handled. For the event to work, the app needs to declare the restricted 'confirmAppClose' capability per:
https://learn.microsoft.com/en-us/windows/uwp/packaging/app-capability-declarations
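For reference, a rough sketch of that pattern (C++/WinRT syntax; the API is equally usable from C++/CX and C#), taking the deferral so the close waits on an asynchronous confirmation dialog. The function name, dialog text, and command labels are placeholders:

    #include <winrt/Windows.Foundation.h>
    #include <winrt/Windows.Foundation.Collections.h>
    #include <winrt/Windows.UI.Core.Preview.h>
    #include <winrt/Windows.UI.Popups.h>

    using namespace winrt;
    using namespace Windows::Foundation;
    using namespace Windows::UI::Core::Preview;
    using namespace Windows::UI::Popups;

    // Call once during view initialization.
    void HookCloseConfirmation()
    {
        SystemNavigationManagerPreview::GetForCurrentView().CloseRequested(
            [](IInspectable const&, SystemNavigationCloseRequestedPreviewEventArgs const& e)
        {
            // Take a deferral so the close waits for the asynchronous dialog below.
            auto deferral = e.GetDeferral();

            MessageDialog dialog(L"Do you want to save changes before exiting?");
            dialog.Commands().Append(UICommand(L"Exit anyway"));
            dialog.Commands().Append(UICommand(L"Cancel"));

            dialog.ShowAsync().Completed(
                [e, deferral](IAsyncOperation<IUICommand> const& op, AsyncStatus)
            {
                if (op.GetResults().Label() == L"Cancel")
                {
                    e.Handled(true);      // veto the close
                }
                deferral.Complete();
            });
        });
    }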
Please let us know your feedback.
Thanks,
Stefan Wick - Windows Developer Platform

Usage of software input panel in DirectX based app

I am trying to add text input functionality to an existing DirectX application that is being ported to Windows Store / Phone 8.1. The problem is that I can't even get the sample code provided by Microsoft in this article to compile:
http://msdn.microsoft.com/en-us/library/windows/apps/jj247546%28v=vs.105%29.aspx
I am using a Universal App project as a base, and I have no access to Windows::Phone::UI::Core (there is no Core namespace at all!).
I tried adding a dummy TextBox and hiding it somewhere, but without luck: to make the software keyboard appear you have to focus the TextBox, and the moment it receives focus it shows up on screen (we draw our own controls, so I don't want the system one), even though its foreground and background are set to Transparent and its width to 0.
How can I manipulate SIP to show/hide and retrieve input from the keyboard without having to hack my way through XAML and stuff?
On Windows Phone (but not Windows), apps can show and hide the InputPane programmatically by calling InputPane.TryShow and InputPane.TryHide.
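A tiny sketch of those calls (C++/WinRT syntax here; from C++/CX it is InputPane::GetForCurrentView()->TryShow()); the helper name is made up:

    #include <winrt/Windows.UI.ViewManagement.h>
    using namespace winrt::Windows::UI::ViewManagement;

    // Show or hide the software keyboard explicitly. TryShow/TryHide return
    // false when the platform refuses the request (as desktop Windows does).
    void SetSoftwareKeyboardVisible(bool visible)
    {
        auto pane = InputPane::GetForCurrentView();
        if (visible)
            pane.TryShow();
        else
            pane.TryHide();
    }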
If you want the InputPane to show automatically, then you need to set focus to a control which identifies itself as a text control to the automation system (see the Touch Keyboard documentation on MSDN; Windows Phone works essentially the same as Windows 8 here).
There are two ways to do this in a DirectX app:
As the other thread describes and as you've tried, you can use a Xaml TextBox on top of your DirectX surface. This has the advantage of being easy as the Xaml controls already implement the accessibility and IME interfaces needed for full text support. It has the disadvantage of being external to the DX scene so it can require some care to place it nicely. You can't really hide the TextBox and divert the input, but need to use the TextBox for input. I prefer to do the full interactive form in Xaml rather than trying to merge a single TextBox into a full scene.
The other option is to implement a text control in DirectX. Windows uses the UI Automation API to identify and interact with text controls. If you implement the TextPattern and focus for your control within DirectX then the keyboard will automatically invoke when the user sets focus to it. There's a sample at Input: Touch keyboard sample which demonstrates the necessary interfaces within a custom Xaml control context. It won't apply directly to DX, but will give the general idea. The UI Automation Provider Programmer's Guide has more in-depth information on implementing UI Automation interfaces. Again, while these docs target Windows they will also apply to Windows Phone.
I'm not sure exactly which code didn't compile for you. The linked pages are a bit out of date (SwapChainPanel is now preferred over SwapChainBackgroundPanel), but the classes and techniques involved should be valid for Windows Phone Runtime apps.

Showing virtual keyboard on Windows 8

Is there a way to show the virtual keyboard in Windows 8 styled apps programmatically with C++?
I'd rather not have any XAML involved, since I'm using DirectX (even though I guess they could be used together, but it is easier just to drop the keyboard support than to learn XAML and try to fit it in with DirectX).
If you mean without user action, then the answer is no. See the Invocation and dismissal logic and User-driven invocation sections in The touch keyboard documentation.
Microsoft has an Input: Touch keyboard sample that demonstrates how to bring up the touch keyboard when the user moves the input focus to a custom control. Note that the sample does not implement TSF, so there is no full input method support.
Update: the sample is updated for Windows 8.1. Old Windows 8 samples can be downloaded from http://code.msdn.microsoft.com/windowsapps/Windows-8-app-samples-3bea89c8
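Even where the keyboard can only be invoked by the user, a DirectX app usually still wants to react when it appears. A small sketch of subscribing to the InputPane's Showing/Hiding events (C++/WinRT syntax for brevity; the function name is illustrative) so the scene can be rearranged around the keyboard:

    #include <winrt/Windows.Foundation.h>
    #include <winrt/Windows.UI.ViewManagement.h>

    using namespace winrt;
    using namespace Windows::UI::ViewManagement;

    // Hypothetical setup call: subscribe to the InputPane so the DirectX scene
    // can make room for the keyboard when the user focuses a text control.
    void HookInputPaneEvents()
    {
        auto pane = InputPane::GetForCurrentView();

        pane.Showing([](InputPane const&, InputPaneVisibilityEventArgs const& args)
        {
            // OccludedRect is the part of the view the keyboard will cover;
            // scroll or rescale the scene so the focused element stays visible.
            Windows::Foundation::Rect covered = args.OccludedRect();
            (void)covered;
            args.EnsuredFocusedElementInView(true);  // we handle repositioning ourselves
        });

        pane.Hiding([](InputPane const&, InputPaneVisibilityEventArgs const&)
        {
            // Restore the original layout here.
        });
    }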

Windows 7 Explorer.exe

I want to extend Windows 7 Explorer.exe (32/64-bit); where does one start? I want to modify the UI by adding new menu items and perhaps adding new windows directly inside the current window. Is it related to "Basic Folder Object Interfaces"? Any helpful URLs, books, anything is appreciated!
http://www.codeproject.com/KB/shell/shellextguideindex.aspx
If you don't like that ATL (or even C++) stuff, you can combine it with:
http://www.codeproject.com/KB/COM/com_in_c1.aspx
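For a feel of what those guides build, here is a stripped-down sketch of a context-menu handler: a COM object implementing IContextMenu that inserts one item into Explorer's right-click menu. The class name and menu text are placeholders, and the plumbing a real extension needs (IShellExtInit to receive the selection, a class factory, DllGetClassObject, and CLSID/registry registration) is deliberately left out:

    #include <windows.h>
    #include <shlobj.h>

    class DemoContextMenu : public IContextMenu
    {
        LONG m_refs = 1;
    public:
        // IUnknown
        IFACEMETHODIMP QueryInterface(REFIID riid, void** ppv)
        {
            if (riid == IID_IUnknown || riid == IID_IContextMenu)
            {
                *ppv = static_cast<IContextMenu*>(this);
                AddRef();
                return S_OK;
            }
            *ppv = nullptr;
            return E_NOINTERFACE;
        }
        IFACEMETHODIMP_(ULONG) AddRef()  { return InterlockedIncrement(&m_refs); }
        IFACEMETHODIMP_(ULONG) Release()
        {
            ULONG refs = InterlockedDecrement(&m_refs);
            if (refs == 0) delete this;
            return refs;
        }

        // IContextMenu: add our command to the menu Explorer is about to show.
        IFACEMETHODIMP QueryContextMenu(HMENU hMenu, UINT indexMenu,
                                        UINT idCmdFirst, UINT /*idCmdLast*/, UINT uFlags)
        {
            if (uFlags & CMF_DEFAULTONLY)
                return MAKE_HRESULT(SEVERITY_SUCCESS, 0, 0);

            InsertMenuW(hMenu, indexMenu, MF_BYPOSITION | MF_STRING,
                        idCmdFirst, L"My Explorer command");
            // Return the largest command offset used, plus one.
            return MAKE_HRESULT(SEVERITY_SUCCESS, 0, 1);
        }

        IFACEMETHODIMP InvokeCommand(CMINVOKECOMMANDINFO* pici)
        {
            // Only handle invocation by command offset, not by verb string.
            if (HIWORD(pici->lpVerb) != 0)
                return E_INVALIDARG;
            if (LOWORD(pici->lpVerb) == 0)
            {
                MessageBoxW(pici->hwnd, L"Invoked from Explorer", L"Demo", MB_OK);
                return S_OK;
            }
            return E_INVALIDARG;
        }

        IFACEMETHODIMP GetCommandString(UINT_PTR, UINT, UINT*, CHAR*, UINT)
        {
            return E_NOTIMPL;   // no verb or help text in this sketch
        }
    };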

Which On-Screen Keyboard for Touch Screen Application?

I'm developing an application in C++ that's partially driven by a touch screen on Windows XP Embedded. Some text entry will be necessary for the user. So far we've been using the standard Windows On-Screen Keyboard (osk.exe), but there are two main problems:
It's rather small on a higher-resolution screen, which will probably make it hard for users to hit the right keys
It's too "ugly" for the customer, who'd like a slicker on-screen keyboard that integrates better with the custom look-and-feel of the application so far.
Therefore I'm looking for alternatives for the Windows On-Screen Keyboard (osk.exe) that allow a larger size of buttons and can be skinned. Ideally it would have a BSD-like license for unburdened integration into a commercial app, but a royalty-free commercial solution could work.
Do you know of any such applications, or have you had a similar project where you solved the issue in another way?
We are using Click-N-Type for our systems. It is completely resizable. It has some customization possibilities, but I never tried them. We use it on "normal" Windows XP, but it should work on Windows XP Embedded as well.
I know this question is tagged 'c++', but here's an option for .Net that I found and integrated with less than 5 minutes work. (I've looked, and there isn't a .Net flavour of this question, and I guess it could be ported to C++ with very little effort too).
It uses the standard Windows On-Screen Keyboard (osk.exe), resizes it, docks it to the bottom of the screen and removes the title and menu bars, all from one call in your application.
The Code Project - Manage Windows XP On Screen Keyboard
The download is a single VB.Net class.
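If you'd rather stay in native C++, a rough equivalent of what that class does is sketched below. It assumes osk.exe's top-level window class is "OSKMainClass" (that was the name on XP-era systems, but verify it on your target image, e.g. with Spy++), and it simply launches the keyboard, strips the caption, and docks it along the bottom of the screen:

    #include <windows.h>

    // Launch osk.exe, remove its caption, and dock it to the bottom of the
    // primary monitor. "OSKMainClass" is the window class name seen on XP-era
    // systems; confirm it on your target before relying on it.
    bool DockOnScreenKeyboard()
    {
        ShellExecuteW(nullptr, L"open", L"osk.exe", nullptr, nullptr, SW_SHOWNOACTIVATE);
        Sleep(1000);   // crude: give the process time to create its window

        HWND osk = FindWindowW(L"OSKMainClass", nullptr);
        if (!osk)
            return false;

        // Strip the caption and system menu so it blends into a kiosk-style UI.
        LONG_PTR style = GetWindowLongPtrW(osk, GWL_STYLE);
        style &= ~(WS_CAPTION | WS_SYSMENU);
        SetWindowLongPtrW(osk, GWL_STYLE, style);

        // Resize to the full screen width and a third of its height, at the bottom.
        int cx = GetSystemMetrics(SM_CXSCREEN);
        int cy = GetSystemMetrics(SM_CYSCREEN);
        int kbHeight = cy / 3;
        SetWindowPos(osk, HWND_TOPMOST, 0, cy - kbHeight, cx, kbHeight,
                     SWP_NOACTIVATE | SWP_FRAMECHANGED | SWP_SHOWWINDOW);
        return true;
    }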
Please check this WPF component (http://fpscomponents.com/Product.aspx?id=8), which is fully customizable via its built-in editor, so the programmer can fill it with their own language and define the layout.
Check johngnazzo's code:
http://www.daniweb.com/forums/thread4548.html#
Why not write your own keyboard UI? This would (should) be relatively trivial and give you complete control over its look and feel.
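To make that suggestion concrete, here is a bare-bones sketch of a custom Win32 keyboard window: it is created with WS_EX_NOACTIVATE so it never steals focus from the target application, and clicking a key sends a keystroke to whatever window currently has the input focus via SendInput. The layout is just one demo row of letters; a real keyboard would draw its own skinned keys:

    #include <windows.h>
    #include <windowsx.h>   // GET_X_LPARAM

    static const wchar_t kKeys[] = L"QWERTYUIOP";          // one demo row of keys
    static const int kKeyCount = (int)(sizeof(kKeys) / sizeof(kKeys[0])) - 1;
    static const int kKeyW = 64, kKeyH = 64;

    // Send a press/release pair for a virtual-key code to the focused application.
    static void SendKey(WORD vk)
    {
        INPUT in[2] = {};
        in[0].type = INPUT_KEYBOARD; in[0].ki.wVk = vk;
        in[1].type = INPUT_KEYBOARD; in[1].ki.wVk = vk; in[1].ki.dwFlags = KEYEVENTF_KEYUP;
        SendInput(2, in, sizeof(INPUT));
    }

    static LRESULT CALLBACK KeyboardProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        switch (msg)
        {
        case WM_MOUSEACTIVATE:
            return MA_NOACTIVATE;                           // never steal focus from the target app
        case WM_LBUTTONDOWN:
        {
            int key = GET_X_LPARAM(lParam) / kKeyW;         // which key rectangle was hit
            if (key >= 0 && key < kKeyCount)
                SendKey((WORD)kKeys[key]);                  // 'A'..'Z' are their own VK codes
            return 0;
        }
        case WM_PAINT:
        {
            PAINTSTRUCT ps;
            HDC dc = BeginPaint(hwnd, &ps);
            SetBkMode(dc, TRANSPARENT);
            for (int i = 0; i < kKeyCount; ++i)
            {
                RECT r = { i * kKeyW, 0, (i + 1) * kKeyW, kKeyH };
                DrawEdge(dc, &r, EDGE_RAISED, BF_RECT);
                DrawTextW(dc, &kKeys[i], 1, &r, DT_CENTER | DT_VCENTER | DT_SINGLELINE);
            }
            EndPaint(hwnd, &ps);
            return 0;
        }
        case WM_DESTROY:
            PostQuitMessage(0);
            return 0;
        }
        return DefWindowProcW(hwnd, msg, wParam, lParam);
    }

    int WINAPI wWinMain(HINSTANCE hInstance, HINSTANCE, PWSTR, int)
    {
        WNDCLASSW wc = {};
        wc.lpfnWndProc = KeyboardProc;
        wc.hInstance = hInstance;
        wc.hCursor = LoadCursorW(nullptr, IDC_ARROW);
        wc.hbrBackground = (HBRUSH)(COLOR_BTNFACE + 1);
        wc.lpszClassName = L"DemoOnScreenKeyboard";
        RegisterClassW(&wc);

        // WS_EX_NOACTIVATE keeps the keyboard from taking the foreground, so the
        // keystrokes land in whichever window currently has the input focus.
        CreateWindowExW(WS_EX_NOACTIVATE | WS_EX_TOPMOST, L"DemoOnScreenKeyboard",
                        L"Demo keyboard", WS_POPUP | WS_VISIBLE | WS_BORDER,
                        100, 600, kKeyW * kKeyCount, kKeyH,
                        nullptr, nullptr, hInstance, nullptr);

        MSG msg;
        while (GetMessageW(&msg, nullptr, 0, 0))
        {
            TranslateMessage(&msg);
            DispatchMessageW(&msg);
        }
        return 0;
    }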
I programmed an on-screen keyboard in Java.
This works fine when you want to type into Java components and Java frames.
If you want to type into any open window, you have to send the key events via a Robot. The problem I have is that the focus owner receives the sent keys, and when you open the keyboard, the keyboard itself has the focus.
You cannot really implement a global keyboard in Java, as far as I know.
If you only want to use the keyboard for Java, use Java.
Otherwise you should use another language.
You should use a native language where you can handle the OS focus owner, or a language where you can completely disable the keyboard's focus but still bring the keyboard to the front of the screen.
Take a look at the Chessware virtual keyboard.
http://hot-virtual-keyboard.com/