The KDE pin button API - C++

You have probably already noticed the pin button in KDE that pins a window across multiple virtual desktops. I would like to know which API the pin button functionality is part of: is it X, KDE, or something else? The pin button is second from the left in the image below.

On KDE SC 4.x it is part of the KWin API: KWin::Workspace::slotWindowOnAllDesktops. It is also available through the KWin JavaScript scripting API as slotWindowOnAllDesktops().
On KDE Frameworks 5 it is part of the KWindowSystem API; specifically, it is KWindowSystem::setOnAllDesktops.
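For illustration, a minimal KDE Frameworks 5 sketch (assuming the project links against KF5::WindowSystem) that pins the currently active window to all desktops, the same effect as clicking the titlebar pin button:

    // Pin the active window to all virtual desktops via KWindowSystem (KF5).
    #include <KWindowSystem>
    #include <QApplication>

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);

        // Window id of the window we want to pin; here we simply take the
        // currently active window as an example.
        WId win = KWindowSystem::activeWindow();

        // true = show on all desktops, false = restore to a single desktop.
        KWindowSystem::setOnAllDesktops(win, true);

        return 0;
    }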

Related

Use Apple Pencil through Qt framework

I want to port my Qt application to use the Apple Pencil on an iPad Pro. Currently, my app uses QTabletEvent to draw to a QGraphicsScene using a Wacom-enabled device. I'm planning on handling events from the Apple Pencil with Objective-C++ and feeding them into Qt's event system. I've never used Objective-C++; what are some good tutorials for solving this problem? I'm specifically looking for how to pass events from Objective-C++ to Qt.
You can call any Qt code from methods in Objective-C++ code. You can easily create new events and post them to your application.
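As a rough sketch of the bridging idea (the function name, parameters, and target widget are illustrative assumptions; the real values would come from your Apple Pencil touch callback in the .mm file):

    // Bridge function compiled as Objective-C++ (.mm): takes values from a
    // native Apple Pencil sample and posts a synthetic QTabletEvent to Qt.
    #include <QCoreApplication>
    #include <QTabletEvent>
    #include <QWidget>

    void forwardPencilSample(QWidget *target, const QPointF &pos,
                             qreal pressure, int xTilt, int yTilt)
    {
        // Fields the Pencil does not report (rotation, z, tangential
        // pressure) are simply zeroed here.
        auto *event = new QTabletEvent(QEvent::TabletMove,
                                       pos,                                 // widget-local position
                                       target->mapToGlobal(pos.toPoint()),  // global position
                                       QTabletEvent::Stylus,                // device
                                       QTabletEvent::Pen,                   // pointer type
                                       pressure, xTilt, yTilt,
                                       0.0,                                 // tangential pressure
                                       0.0,                                 // rotation
                                       0,                                   // z
                                       Qt::NoModifier,
                                       0,                                   // unique id
                                       Qt::NoButton, Qt::NoButton);

        // postEvent takes ownership and delivers the event asynchronously on
        // the receiver's thread, which is the safe way to feed events in
        // from native callbacks.
        QCoreApplication::postEvent(target, event);
    }

Drawing code that already handles QTabletEvent from a Wacom device should then receive these synthesized events unchanged.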

display touch friendly menu on Windows 10

On devices with touch input, Windows 10 displays nice touch-friendly menus, so I'm looking for a way to add this to my application. Is there any new flag or method to show such a menu without making it completely owner-drawn?
The control you are referring to is a MenuFlyout. Depending on whether you're using a mouse or touch, Windows 10 will automatically space the menu items closer together or further apart to suit the input you're using.
To use this control (the easy way), your app should be a Windows 10 Universal (UWP) app, which can be written in C#, VB and C++.
You can find official samples for UWP apps on GitHub.
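For example, a small C++/CX sketch of building such a menu in code (the helper function and item labels are made up for illustration):

    // Attach a MenuFlyout to a XAML Button in a UWP app (C++/CX).
    using namespace Windows::UI::Xaml::Controls;

    void AttachTouchFriendlyMenu(Button^ button)
    {
        auto flyout = ref new MenuFlyout();

        auto openItem = ref new MenuFlyoutItem();
        openItem->Text = L"Open";
        flyout->Items->Append(openItem);

        auto saveItem = ref new MenuFlyoutItem();
        saveItem->Text = L"Save";
        flyout->Items->Append(saveItem);

        // Windows 10 spaces the items more generously when the flyout is
        // opened by touch than when it is opened with the mouse.
        button->Flyout = flyout;
    }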

Usage of software input panel in DirectX based app

I am trying to add text input functionality to an existing DirectX application that is being ported to Windows Store / Phone 8.1. The problem is that I can't even get the sample code provided by Microsoft in this article to compile:
http://msdn.microsoft.com/en-us/library/windows/apps/jj247546%28v=vs.105%29.aspx
I am using a Universal App project as a base and I have no access to Windows::Phone::UI::Core (there is no Core namespace at all!).
I tried adding a dummy TextBox and hiding it somewhere, but without luck: to make the software keyboard appear you need to focus the TextBox, and the moment it receives focus it appears on the screen (we draw our own controls, so I don't want the system one), despite both its foreground and background being set to Transparent and its width set to 0.
How can I manipulate SIP to show/hide and retrieve input from the keyboard without having to hack my way through XAML and stuff?
On Windows Phone (but not Windows), apps can show and hide the InputPane programmatically by calling InputPane.TryShow and InputPane.TryHide.
If you want the InputPane to show automatically then you need to set focus to a control which identifies itself as a text control to the automation system (see the Touch Keyboard documentation on MSDN. Windows Phone works essentially the same as Windows 8 here).
There are two ways to do this in a DirectX app:
As the other thread describes and as you've tried, you can use a Xaml TextBox on top of your DirectX surface. This has the advantage of being easy as the Xaml controls already implement the accessibility and IME interfaces needed for full text support. It has the disadvantage of being external to the DX scene so it can require some care to place it nicely. You can't really hide the TextBox and divert the input, but need to use the TextBox for input. I prefer to do the full interactive form in Xaml rather than trying to merge a single TextBox into a full scene.
The other option is to implement a text control in DirectX. Windows uses the UI Automation API to identify and interact with text controls. If you implement the TextPattern and focus for your control within DirectX then the keyboard will automatically invoke when the user sets focus to it. There's a sample at Input: Touch keyboard sample which demonstrates the necessary interfaces within a custom Xaml control context. It won't apply directly to DX, but will give the general idea. The UI Automation Provider Programmer's Guide has more in-depth information on implementing UI Automation interfaces. Again, while these docs target Windows they will also apply to Windows Phone.
I'm not sure exactly which code didn't compile for you. The linked pages are a bit out of date (SwapChainPanel is now preferred over SwapChainBackgroundPanel), but the classes and techniques involved should be valid for Windows Phone Runtime apps.
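For completeness, the programmatic route on Windows Phone 8.1 is only a couple of lines in C++/CX (a sketch; the helper name is illustrative):

    // Show or hide the software keyboard programmatically (Windows Phone 8.1).
    using namespace Windows::UI::ViewManagement;

    void SetSoftwareKeyboardVisible(bool visible)
    {
        InputPane^ pane = InputPane::GetForCurrentView();
        if (visible)
            pane->TryShow();   // returns true if the pane will be shown
        else
            pane->TryHide();
    }

As noted above, TryShow/TryHide are not available to Windows (non-Phone) 8.1 apps, where the keyboard only appears when a text control gains focus.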

Popup notifiers in C/C++

I've been working on a project that will need a notifier in the system tray (sorry, "System Notification Area"). It will be a simple app that just generates popup notifications when it receives a message via a ZeroMQ socket.
I am not having any luck finding anything other than .NET resources and examples. Does anyone have a sample in C/C++?
I would start with this section of MSDN: Notifications and the Notification Area.
Then I'd check the NotificationIcon Sample in the Windows SDK.
What framework are you using? There are probably several implementations for MFC, but there may be different implementations for WTL and other frameworks. If you want to use the Windows API with no object orientation, you won't need any wrapper library, but you can still look at these libraries for examples.
Here's one that has MFC and non-MFC version from CodeProject:
http://www.codeproject.com/KB/shell/systemtray.aspx
What you want here is probably the ShowBalloon() function, which shows a balloon notification, but I'm pretty sure you must create a tray icon for that (you can't have a notification balloon without a tray icon).
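A minimal Win32 sketch without any framework (the window handle, icon id, and callback message are illustrative; error handling omitted):

    // Add a tray icon and show a balloon notification with Shell_NotifyIcon.
    #include <windows.h>
    #include <shellapi.h>

    void ShowTrayBalloon(HWND hwnd, const wchar_t *title, const wchar_t *text)
    {
        NOTIFYICONDATAW nid = {};
        nid.cbSize = sizeof(nid);
        nid.hWnd   = hwnd;                    // window that owns the icon
        nid.uID    = 1;                       // app-defined icon identifier
        nid.uFlags = NIF_ICON | NIF_MESSAGE | NIF_TIP | NIF_INFO;
        nid.uCallbackMessage = WM_APP + 1;    // message sent on icon interaction
        nid.hIcon  = LoadIcon(nullptr, IDI_APPLICATION);
        wcscpy_s(nid.szTip, L"My notifier");

        // Balloon title, body text and stock info icon.
        wcscpy_s(nid.szInfo, text);
        wcscpy_s(nid.szInfoTitle, title);
        nid.dwInfoFlags = NIIF_INFO;

        // The tray icon must exist before a balloon can be shown, hence
        // NIM_ADD first and then NIM_MODIFY to display the balloon.
        Shell_NotifyIconW(NIM_ADD, &nid);
        Shell_NotifyIconW(NIM_MODIFY, &nid);
    }

The ZeroMQ receive loop could call a function like this whenever a message arrives (marshalled to the thread that owns hwnd).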

Which On-Screen Keyboard for Touch Screen Application?

I'm developing an application in C++ that's partially driven by touch-screen on Windows XP Embedded. Some text entry will be necessary for the user. So far we've been using the standard Windows On-Screen Keyboard (osk.exe), but there are two main problems:
It's rather small on a higher-resolution screen, which will probably make it hard for users to hit the right keys.
It's too "ugly" for the customer, who'd like a slicker on-screen keyboard that integrates better with the custom look-and-feel of the application so far.
Therefore I'm looking for alternatives for the Windows On-Screen Keyboard (osk.exe) that allow a larger size of buttons and can be skinned. Ideally it would have a BSD-like license for unburdened integration into a commercial app, but a royalty-free commercial solution could work.
Do you know of any such applications, or have you had a similar project where you solved the issue in another way?
We are using Click-N-Type for our systems. It is completely resizable. It has some customization possibilities, but I never tried them. We use it on "normal" Windows XP, but it should work on Windows XP Embedded as well.
I know this question is tagged 'c++', but here's an option for .Net that I found and integrated with less than 5 minutes work. (I've looked, and there isn't a .Net flavour of this question, and I guess it could be ported to C++ with very little effort too).
It uses the standard Windows On-Screen Keyboard (osk.exe), resizes it, docks it to the bottom of the screen and removes the title and menu bars, all from one call in your application.
The Code Project - Manage Windows XP On Screen Keyboard
The download is a single VB.Net class.
Please check this WPF component (http://fpscomponents.com/Product.aspx?id=8), which is fully customizable via a built-in editor, so the programmer can fill it with their own language and define the layout!
Check johngnazzo's code:
http://www.daniweb.com/forums/thread4548.html#
Why not write your own keyboard UI? This would (should) be relatively trivial and give you complete control over its look and feel.
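If you go that route, the core mechanism on Windows is injecting keystrokes into whichever window currently has focus; a hypothetical sketch (the keyboard window itself would be created with WS_EX_NOACTIVATE so its buttons do not steal focus):

    // Send a single Unicode character to the focused application via SendInput.
    #include <windows.h>

    void SendCharacter(wchar_t ch)
    {
        INPUT inputs[2] = {};

        // Key down, delivered as a Unicode code unit (no virtual-key mapping).
        inputs[0].type = INPUT_KEYBOARD;
        inputs[0].ki.wScan = ch;
        inputs[0].ki.dwFlags = KEYEVENTF_UNICODE;

        // Matching key up.
        inputs[1] = inputs[0];
        inputs[1].ki.dwFlags = KEYEVENTF_UNICODE | KEYEVENTF_KEYUP;

        SendInput(2, inputs, sizeof(INPUT));
    }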
I programmed an on-screen keyboard in Java.
This works fine when you want to type into Java components and Java frames.
If you want to type into any open window you have to send the key events using a Robot. The problem I had is that the focus owner receives the sent keys, and when you open the keyboard, the keyboard itself has the focus.
You cannot really implement a global Java keyboard, as far as I know.
If you only want to use the keyboard for Java, use Java.
Otherwise you should use another language.
You should use a native language where you can control the OS focus owner, or a language where you can completely disable the keyboard's focus but still bring the keyboard to the front of the screen.
Take a look at the Chessware virtual keyboard.
http://hot-virtual-keyboard.com/