I'm part of the SFML Team and we're currently looking into a feature to "request" window focus. The goal is to get very similar behavior across Windows, OS X and Linux.
On Windows there's the rather simple SetForegroundWindow function in the WinAPI, which has a few conditions as to when the window actually gets focus. The most important one to notice here is that the window only gets focus if the request comes from the current foreground process.
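Just to illustrate the Windows side, the check boils down to something like the following sketch (hwnd being the window handle; this is illustrative only, not SFML's actual code):

```cpp
#include <windows.h>

// Illustrative sketch only: request focus if we are the foreground process,
// otherwise fall back to a taskbar flash as a notification.
void requestFocus(HWND hwnd)
{
    DWORD foregroundPid = 0;
    GetWindowThreadProcessId(GetForegroundWindow(), &foregroundPid);

    if (foregroundPid == GetCurrentProcessId())
        SetForegroundWindow(hwnd); // same process: switching focus is allowed
    else
        FlashWindow(hwnd, TRUE);   // another process is active: just notify
}
```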
On OS X it's possible to take the focus only for the active app, and otherwise let the icon bounce, i.e. show a notification.
Now here's the problem: we'd like to get the same behavior on Linux as well, meaning the window should get focus if it belongs to the active/foreground process, and otherwise it should generate a notification. What would be the closest thing to that with X11?
There are already a few suggestions on the issue tracker of SFML, but none of them actually implement this behavior.
"User Story"
I guess developers can think of different things when confronted with different technical names, so here's the issue from a user's perspective.
There are mainly two situations in which requesting focus is needed:
Sometimes when starting an application that uses a console window in the background, the console window can end up with the focus instead of the actual GUI window. When this happens it's rather annoying for the user to have to click on the GUI window first. Since the console window and the GUI window belong to the same application, there's no harm done in switching the focus to the GUI window.
When one is writing an application that supports multiple windows, there might be situations where the application should decide which window gets the focus, and again, since the windows belong to the same application, there's no harm done in switching the focus from one GUI window to the other.
Furthermore, if a different application has the focus/is being used, then it's not okay to steal the focus, and as such we just want to get the user's attention. On Windows that might be a blinking taskbar entry, and on OS X a jumping icon.
The current implementation seems to work fine on OS X and Windows, but we're unsure about the X11 implementation. Thus the question is: how would one go about switching the window focus if the currently focused window was created by the same application that makes the focus request, and otherwise creating some kind of notification? For the notification we're/I'm not even sure whether there's a generic way of doing it with X11.
In X11, "focus" means "the keyboard focus", that is, the window that gets the keyboard input. The window that has the focus is not necessarily in the foreground. This depends on your window manager focus policy. Most can be configured to have "click-to-focus" or "point-to-focus" policy. If you are interested in the keyboard focus, use XSetInputFocus. If you want to bring your window to the foreground, use XRaiseWindow.
It is OK to call XRaiseWindow and XSetInputFocus once, when the application starts. It is also OK to bring a window to the foreground or set the focus as a response to a user interaction with that or some other window of the same application. But it's not OK to do so as a response to some background event (time passed, file downloaded etc).
The standard X11 method of drawing attention to a window is setting the urgency hint. This will normally flash or bounce the icon, depending on your window manager. Do not forget to unset the hint when the user finally interacts with the window.
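A sketch of setting and clearing the urgency hint (assuming the same display and window as above):

```cpp
#include <X11/Xlib.h>
#include <X11/Xutil.h>

// Sketch: toggle the urgency hint so the window manager can flash or bounce
// the window's icon. Clear it again once the user interacts with the window.
void setUrgency(Display* display, Window window, bool urgent)
{
    XWMHints* hints = XGetWMHints(display, window);
    if (!hints)
        hints = XAllocWMHints();

    if (urgent)
        hints->flags |= XUrgencyHint;
    else
        hints->flags &= ~XUrgencyHint;

    XSetWMHints(display, window, hints);
    XFree(hints);
}
```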
I think all of this has been discussed in the thread you have linked. I'm not quite sure which concerns are still left unanswered. Nothing can implement the exact same behaviour as with the other windowing systems, simply because X11 is not those windowing systems, and it's totally OK. X11, Mac OS X and Windows all behave differently and the users know and expect that. It would annoy me to no end if some application on X11 decided to behave exactly like it does on Windows, instead of toeing the X11 party line.
Related
While answering a different StackOverflow question I realized that a 100% correct solution would need to know when the Qt application's window was being dragged or resized by the user's mouse, so that it would refrain from moving or resizing itself during that period, and thereby avoid "fighting with the user" for control of the window.
However, I don't know of any mechanism for a Qt app to be notified when the user presses down the left mouse button on the window's title bar, or when the user releases the left mouse button afterwards... I assume that is because that functionality is handled by the OS's window manager rather than by the Qt library. That said, is there any secret way to do it? Cross-platform would be best, but OS-specific solutions are also of interest.
I found a bug yesterday in one of my Windows applications, which is built in a high-level framework that in the end calls Windows APIs like CreateWindow and ShowWindow in order to display its user interface.
On one machine so far, and only one, which happened to be a customer machine, I observed the following behaviour:
For only one window in my entire application, when I first call ShowWindow(Handle, SW_SHOW) for this window, the size it previously received via SetWindowPos is overridden.
Reading the MSDN Win32 API documentation on ShowWindow(Handle, SW_SHOW), I cannot see any reference to it moving the window bounds. I can work around this surprising result by having my window-show routine record the bounds before it calls the Win32 ShowWindow routine and reapply them afterwards.
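Roughly speaking, the workaround looks like this (an illustrative sketch, not my framework's actual code):

```cpp
#include <windows.h>

// Illustrative workaround: remember the bounds before ShowWindow and put them
// back if the call moved or resized the window.
void showWindowKeepingBounds(HWND hwnd)
{
    RECT before;
    GetWindowRect(hwnd, &before);

    ShowWindow(hwnd, SW_SHOW);

    RECT after;
    GetWindowRect(hwnd, &after);

    if (!EqualRect(&before, &after))
    {
        SetWindowPos(hwnd, NULL,
                     before.left, before.top,
                     before.right - before.left,
                     before.bottom - before.top,
                     SWP_NOZORDER | SWP_NOACTIVATE);
    }
}
```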
My question is, has anyone ever seen behaviour like this? I think it must be one of the following:
An obscure bug in Windows 7 Service Pack 1 that does not reproduce on all systems, and only reproduces perhaps for a particular version of a particular video card driver. (This affected system has dual AMD/ATI FireGL video cards)
An obscure problem caused by a side effect of some other software running on the system, which may be hooking window handles, installing trampolined code hooks somewhere (perhaps even inside my own process, thanks to some DLL or something that I am not aware of).
Something my 4 million line application is doing to me, through some weird code somewhere I have not yet identified.
I am hitting an application compatibility shim within the Win32 API layer.
If anyone who has worked in C++, C, Delphi, or any other language has ever seen anything like this and can think of a reason why ShowWindow would have this amazing and unexpected side effect of moving the window bounds back to a certain original position (in my case x=175, y=175, width=320, height=240, which appears to have been the window bounds right after the initial CreateWindow call), I'd like to know what it is.
Here is a sequence of events:
Application starts up, and creates a few top level windows parented to the desktop.
The first window created is the main application window and the second is a tool window; both have full window grabber bars and are conventional top-level Win32 windows (Forms) that are sizeable, draggable, and parented to the desktop.
The second window's position is loaded from disk, and the form is shown.
During the form show process, its bounds are set so that the window is at some x and y top/left position, and some height/width is given.
If I query the Win32 window handle immediately before I call ShowWindow, its bounds are where I expect.
If I query the Win32 window handle immediately after I call ShowWindow, its bounds have been reset.
According to the MSDN help, SW_SHOW means "Activates the window and displays it in its current size and position."
This is indeed what occurs on over 100 client PCs I have observed. Only on a single customer-owned Windows 7 PC is this behaviour different.
"This affected system has dual AMD/ATI FireGL video cards"
I'm not sure about FireGL, but for consumer video cards built on the same chip lineage, the video drivers do ship with an add-on that repositions windows as it thinks is more convenient for the operator.
It is called the HydraVision Package for the Catalyst Software Suite.
I'm trying to use SendMessage to post mouse clicks to a background window (Chrome), which works fine, but it brings the window to the front after every click. Is there any way to avoid that?
Before anyone says this is a duplicate question, please make sure that the other topic actually mentions not activating the target window, because I couldn't find any.
Update: aha, hiding the window does the trick, almost. It receives simulated mouse/keyboard events as intended, and doesn't show up on screen. However, I can just barely use my own mouse to navigate around the computer, and keyboard input is completely disrupted.
So my question is, how does sending messages to a window affect other applications? Since I'm not actually simulating mouse/keyboard events, shouldn't the other windows be completely oblivious to this?
Is it possibly related to the window calling SetCapture when it receives WM_LBUTTONDOWN? And how would I avoid that, other than hooking the API call (which would be very, very ugly for such a small task)?
The default handling provided by the system (via DefWindowProc) causes windows to come to the front (when clicked on) as a response to the WM_MOUSEACTIVATE message, not WM_LBUTTONDOWN.
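For a window you control yourself, that default behaviour can be overridden by answering WM_MOUSEACTIVATE instead of passing it on. A hypothetical window procedure fragment, for illustration only (you can't simply do this to Chrome's window from the outside):

```cpp
#include <windows.h>

// Hypothetical window procedure: handle clicks without activating the window.
LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_MOUSEACTIVATE:
        // Let the click through, but don't activate or raise the window.
        return MA_NOACTIVATE;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}
```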
The fact that Chrome comes to the front in response to WM_LBUTTONDOWN suggests that it's something Chrome is specifically doing, rather than default system behaviour that you might be able to prevent in some way.
The source code to Chrome is available; I suggest you have a look at it and see if it is indeed something Chrome is doing itself. If so, the only practical way you would be able to prevent it (short of compiling your own version of Chrome) is to inject code into Chrome's process and sub-class its main window procedure.
I'm developing an application and use FlashWindowEx to flash the update window; however, it always manages to steal focus from full-screen applications such as games.
This is not what I want and it is very annoying. Is there any way to work out what's causing it to steal focus (I tried commenting out FlashWindowEx but it still did), or a way to tell it not to steal focus?
This happens on all versions of Windows (including 7), and the game is launched separately from the application.
Check out the WS_EX_NOACTIVATE window style perhaps?
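Something along these lines, perhaps (a sketch; the class name is a placeholder and the flash flags are just one possible combination):

```cpp
#include <windows.h>

// Sketch: create the notification window so it never takes the foreground,
// then flash its taskbar button without forcing it to the front.
HWND createNonActivatingWindow(HINSTANCE instance)
{
    return CreateWindowEx(
        WS_EX_NOACTIVATE,          // the window won't become the foreground window
        TEXT("MyWindowClass"),     // placeholder: an already-registered class
        TEXT("Updates"),
        WS_OVERLAPPEDWINDOW,
        CW_USEDEFAULT, CW_USEDEFAULT, 400, 300,
        NULL, NULL, instance, NULL);
}

void flashWithoutStealingFocus(HWND hwnd)
{
    FLASHWINFO fi = { sizeof(fi) };
    fi.hwnd = hwnd;
    fi.dwFlags = FLASHW_TRAY | FLASHW_TIMERNOFG; // flash the taskbar button only
    FlashWindowEx(&fi);
}
```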
I am developing an application for PocketPC. When the application starts the custom function SetScreenOrientation(270) is called which rotates the screen. When the application closes the function SetScreenOrientation(0) is called which restores the screen orientation.
This way the screen orientation isn't restored if the user minimizes the application, which is not acceptable.
Does anyone know where (in which event handlers) should SetScreenOrientation(int angle) be called to set the screen orientation on application start, restore orientation on minimize, set the orientation on maximize and restore the orientation on close?
Actually, I don't know which event handler handles the Minimize and Maximize events.
The correct message is WM_SIZE, but Daemin's answer points to the wrong WM_SIZE help topic. Check the wParam. Be careful as your window may be maximized but hidden.
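A rough sketch of that check (SetScreenOrientation is the asker's own function; the IsWindowVisible guard for the "maximized but hidden" case is my assumption):

```cpp
#include <windows.h>

void SetScreenOrientation(int angle); // the asker's function, defined elsewhere

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_SIZE)
    {
        if (wParam == SIZE_MINIMIZED)
            SetScreenOrientation(0);       // restore orientation when minimized
        else if ((wParam == SIZE_RESTORED || wParam == SIZE_MAXIMIZED)
                 && IsWindowVisible(hwnd)) // guard: maximized but hidden
            SetScreenOrientation(270);     // re-apply the rotation
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}
```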
Going from my Windows CE experience you should handle either the WM_SIZE or WM_WINDOWPOSCHANGED messages. If you're working on PocketPC I would suggest you take a look at the WM_WINDOWPOSCHANGED message first because I'm not sure the WM_SIZE has the right parameters that you need.
From the WM_WINDOWPOSCHANGED message's WINDOWPOS structure take a look at the flags member, specifically SWP_SHOWWINDOW and SWP_HIDEWINDOW.
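A sketch of checking those flags (again, SetScreenOrientation is the asker's own function):

```cpp
#include <windows.h>

void SetScreenOrientation(int angle); // the asker's function, defined elsewhere

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_WINDOWPOSCHANGED)
    {
        const WINDOWPOS* pos = reinterpret_cast<const WINDOWPOS*>(lParam);
        if (pos->flags & SWP_SHOWWINDOW)
            SetScreenOrientation(270); // window is being shown
        else if (pos->flags & SWP_HIDEWINDOW)
            SetScreenOrientation(0);   // window is being hidden
    }
    // Still let DefWindowProc generate WM_SIZE/WM_MOVE from this message.
    return DefWindowProc(hwnd, msg, wParam, lParam);
}
```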
The specific version of the messages that you need to look at vary with what operating system you're using. The Pocket PC OS is built on Windows CE 3.0 (and lower), while Windows Mobile is now built on Windows CE 5.0 (even Windows Mobile 6), but was also built on Windows CE 4. (Source)
So just look under the relevant section in MSDN for the OS that you're writing for.
I don't know what these are called in the C++ world, but in the .NET Compact Framework your application form's Resize event would be called when you minimize/maximize a window, and then in the event code you would check the form's WindowState property to see if it's minimized or maximized.
Altering the state of your PDA from within your application is risky (although there are lots of good reasons to do it), because if your app crashes it will leave the PDA in whatever state it was in. I've done a lot of kiosk-type (full-screen) apps in Windows Mobile, and one of the tricks to doing this effectively is to hide the WM title bar (the top row with the Windows start button) to keep it from flashing up for a split second every time you open a new form. If the app crashes, the windows bar remains invisible until you reset the device, which isn't good. At least with screen rotation the user can restore it manually.
It really depends on the platform, but I'd go with WM_WINDOWPOSCHANGED or OnShow. It's not WM_SIZE; that one is not always sent on all platforms. Casio devices don't send the size event when you'd expect them to. TDS and Symbol devices do.
Even though MSDN is a great source of info, remember that not all OSes are created equal. In the PPC world the hardware provider gets to create their own OS, and sometimes they miss things or purposefully ignore things.
I've got a platform here (name withheld to protect... well, me) that has left and right buttons. When you press them, you'd expect to be able to catch VK_LEFT and VK_RIGHT. You'd be wrong. You actually get ';' or ':'. How's that for a kick in the pants.