How do I have my app's dialog box pop up and take focus from the currently running app? - c++

I know this type of thing is looked upon negatively, but I write software for people with disabilities, and sometimes good GUI practices don't make sense. In this case, the user interacts with an assistive interface, and under certain conditions my control app needs to prompt the user with a question. My background process creates a dialog (I'm using the wxWidgets wxDialog class) and calls Show(). The dialog box appears but it does not have focus (the application that the user was previously using keeps it). Since my users can't use mice, they can't simply click on the window. I've tried calling Show() followed by SetFocus(HWND), but that doesn't do it. What's the problem? Is this even possible? This is on Windows 7. I'm thinking it might have something to do with it being a dialog rather than a full window (wxFrame). Any help is greatly appreciated.

Try using SetWindowPos(hWnd, HWND_TOPMOST, 0, 0, 0, 0, SWP_NOSIZE|SWP_NOMOVE)
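If you go this route from wxWidgets, a minimal sketch might look like the following. It assumes wxMSW, where the dialog's GetHandle() can be cast to the native HWND:

#include <wx/dialog.h>
#include <windows.h>

void ShowOnTop(wxDialog* dlg)
{
    dlg->Show();
    // Assumption: running on wxMSW, so the native handle is an HWND.
    HWND hWnd = (HWND)dlg->GetHandle();
    // Pin the dialog above non-topmost windows without moving or resizing it.
    ::SetWindowPos(hWnd, HWND_TOPMOST, 0, 0, 0, 0, SWP_NOSIZE | SWP_NOMOVE);
}

Note that topmost placement affects z-order; whether the dialog also ends up with keyboard focus is a separate question.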

Unfortunately, not only is it 'looked negatively upon', but it is not possible. There's no getting around this; ask yourself what would happen if every application could do this? Obviously, if you can put your dialog on top of the other application, it can do exactly the same back to you.
http://blogs.msdn.com/b/oldnewthing/archive/2010/11/25/10096329.aspx
The only thing I can think of would be for you to put a notification icon in the system tray and then have it display a notification balloon.
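A rough sketch of that tray-balloon suggestion, assuming a (possibly hidden) window owned by the background process; the strings and the callback message ID are placeholders:

#include <windows.h>
#include <shellapi.h>

void ShowBalloon(HWND hWnd)
{
    NOTIFYICONDATA nid = { sizeof(nid) };
    nid.hWnd   = hWnd;                        // window owned by the background process
    nid.uID    = 1;
    nid.uFlags = NIF_ICON | NIF_MESSAGE | NIF_INFO;
    nid.hIcon  = LoadIcon(nullptr, IDI_INFORMATION);
    nid.uCallbackMessage = WM_APP + 1;        // clicks on the icon/balloon come back here
    lstrcpy(nid.szInfoTitle, TEXT("Question"));
    lstrcpy(nid.szInfo, TEXT("The control app needs your input."));
    nid.dwInfoFlags = NIIF_INFO;
    Shell_NotifyIcon(NIM_ADD, &nid);          // adds the icon and shows the balloon
}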

I had to do something like this before. Simply calling functions like SetForegroundWindow or SetWindowPos didn't do the trick.
I ended up using this ForceForegroundWindow function (1st one) and it works pretty well.
I know this is Delphi code, but the API is the same and Delphi is a pretty simple language.
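For reference, the usual shape of that technique in C++ is below. This is a sketch of the AttachThreadInput approach that ForceForegroundWindow implementations of this kind are based on, not a literal translation of the linked code:

#include <windows.h>

bool ForceForegroundWindow(HWND hWnd)
{
    HWND  hFore      = GetForegroundWindow();
    DWORD foreThread = GetWindowThreadProcessId(hFore, nullptr);
    DWORD ourThread  = GetCurrentThreadId();

    // Temporarily share input state with the thread that owns the foreground window.
    if (foreThread != ourThread)
        AttachThreadInput(foreThread, ourThread, TRUE);

    BOOL ok = SetForegroundWindow(hWnd);
    BringWindowToTop(hWnd);
    SetFocus(hWnd);

    if (foreThread != ourThread)
        AttachThreadInput(foreThread, ourThread, FALSE);   // detach again

    return ok != FALSE;
}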

Related

Issue attaching functionality to MFC Button

I'm just getting started using MFC to make Windows applications and was hoping someone could help me get the ball rolling on some code with a button I'm trying to write.
What I'm trying to do is build an application that will have a button, that when pressed, will open a modal dialog with some functionality that's not relevant to my question. I'm having trouble getting started because I can't seem to catch when the button is pressed and attach code to that event.
The event added to my message map:
ON_BN_CLICKED(1, OnBnClicked)
This is OnBnClicked:
void CMainFrame::OnBnClicked()
{
    CPaintDC dc(this);
    dc.TextOutW(0, 100, _T("SUp dawg"));
    MessageBox(_T("Hey Dawg"));
}
Here's the button creation:
BOOL bCreated = myButton.Create(_T("Hey Dawg"), WS_CHILD | WS_VISIBLE,
                                CRect(40, 40, 190, 90), this, 1);
I just can't seem to figure out why it won't do anything when I click on it. I'd appreciate any help with this; it's not particularly well documented online.
Warning: I'm not going to try to answer your question directly, simply because it's almost unanswerable without seeing (quite a bit) more of what you've done, such as where you put the code that creates the button. Even when/if I did answer the question directly, I doubt the result would be particularly useful. Therefore, I'm going to describe how I'd handle the same basic situation instead.
The basic point I'd make goes back to the single responsibility principle. A window should typically be represented by a class, and that class should have a single responsibility. If that window acts as a control container, then the sole responsibility of that class should be to act as a control container.
Most of the time, the class/window that contains the button shouldn't do (much of) anything other than hold buttons (or other controls). In other words, buttons mostly belong in things like dialogs and toolbars. If you need a display that hosts buttons, you usually want that view to be a CFormView. In this case, the form is built from a dialog template. You can put a button on that dialog template with the dialog editor, just like you would with any other dialog. You attach functionality to it the same way you would to any button in any dialog (shift-click or right-click and select "Add Event Handler...").
Yes, it's possible to do things other ways--neither Visual Studio nor MFC even tries to do much to limit what you can do. That doesn't mean you should ignore basic design though--the single responsibility principle still applies, and it's still important. It's just up to you to enforce it as you design your code--and one of its implications is that you don't just create buttons at random places in the code.
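For completeness, the conventional wiring looks roughly like this; the class name and the IDD_MY_DIALOG / IDC_MY_BUTTON resource IDs are illustrative, not taken from the question:

// MyDialog.h (assumes the usual MFC headers via stdafx.h)
class CMyDialog : public CDialog
{
public:
    CMyDialog() : CDialog(IDD_MY_DIALOG) {}      // dialog template created in the resource editor

protected:
    afx_msg void OnBnClickedMyButton();          // handler for the button on the template
    DECLARE_MESSAGE_MAP()
};

// MyDialog.cpp
BEGIN_MESSAGE_MAP(CMyDialog, CDialog)
    ON_BN_CLICKED(IDC_MY_BUTTON, OnBnClickedMyButton)
END_MESSAGE_MAP()

void CMyDialog::OnBnClickedMyButton()
{
    MessageBox(_T("Button clicked"));            // react to BN_CLICKED from the button
}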

Simulate mouse click in background window

I'm trying to use SendMessage to post mouse clicks to a background window (Chrome), which works fine, but brings the window to front after every click. Is there any way to avoid that?
Before anyone says this is a duplicate question, please make sure that the other topic actually mentions not activating the target window, because I couldn't find any.
Update: aha, hiding the window does the trick, almost. It receives simulated mouse/keyboard events as intended, and doesn't show up on screen. However, I can just barely use my own mouse to navigate around the computer, and keyboard input is completely disrupted.
So my question is, how does sending messages to a window affect other applications? Since I'm not actually simulating mouse/keyboard events, shouldn't the other windows be completely oblivious to this?
Is it possibly related to the window calling SetCapture when it receives WM_LBUTTONDOWN? And how would I avoid that, other than hooking the API call (which would be very, very ugly for such a small task)?
The default handling provided by the system (via DefWindowProc) causes windows to come to the front (when clicked on) as a response to the WM_MOUSEACTIVATE message, not WM_LBUTTONDOWN.
The fact that Chrome comes to the front in response to WM_LBUTTONDOWN suggests that it's something Chrome is specifically doing, rather than default system behaviour that you might be able to prevent in some way.
The source code to Chrome is available; I suggest you have a look at it and see if it is indeed something Chrome is doing itself. If so, the only practical way you would be able to prevent it (short of compiling your own version of Chrome) is to inject code into Chrome's process and sub-class its main window procedure.
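For what it's worth, this is what the WM_MOUSEACTIVATE mechanism looks like from the receiving window's side. It only helps for a window whose code you control; it is not something you can apply to Chrome from the outside:

LRESULT CALLBACK WndProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_MOUSEACTIVATE:
        // Deliver the mouse message without activating (raising) the window.
        return MA_NOACTIVATE;
    }
    return DefWindowProc(hWnd, msg, wParam, lParam);
}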

Stealing focus (for a good reason)

I'm working on a clone of Yakuake and, if you have used it, you'd know that one of its features is stealing the focus for ease of use.
Basically, you hit the "show" hotkey, the app appears and you can write on it.
You could be doing anything with any app (with Yakuake hidden), but as soon as you hit the hotkey, Yakuake appears and steals the focus. I want to do the same with my app.
I know there are some window manager rules that prevent applications from doing this, but Yakuake manages to do it, so why am I not able to?
Also, this application is meant to be compatible with Windows, Linux and Mac, so no KDE or Gnome or < insert_your_favourite_window_manager_here > hacks; I won't go the detect-WM-and-do-hack way.
PS: I'm writing the app in C++ and Qt 4.
EDIT:
Just to make it clear, I'm not asking for any code (though if you actually have an example, I'd appreciate it). I'm asking for a way of doing it: what should I do to make the WM give focus to my app? Is there any standard way of doing so?
There is the Qt::WindowStaysOnTopHint....
The solution is simpler than I thought. I did an animation with a duration of 0s and, at the end of the animation, I just set the focus. That did the trick.
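The answer doesn't show the code, but presumably the "focus" step at the end of the zero-length animation boils down to the usual Qt calls for requesting activation, something like the following (the class name is illustrative):

void TerminalWindow::showAndFocus()
{
    show();
    raise();            // move the window to the top of the stacking order
    activateWindow();   // ask the window manager to give it input focus
}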
If you want to do it with a "show" hotkey or shortcut, you'll have to create and use a hook on the keyboard.
Qt doesn't provide such things, so you'll have to do it yourself.
You can have a look at this post: QT background process
I don't know about other OSes.
When you get the right keyboard event from your hook, you can create a window with the "always on top" hint, and that should be OK.
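On Windows, a hook of the kind described here would typically be a low-level keyboard hook. A minimal sketch follows; the F12 "show" key is just an example:

#include <windows.h>

static HHOOK g_hook = nullptr;

LRESULT CALLBACK LowLevelKeyboardProc(int code, WPARAM wParam, LPARAM lParam)
{
    if (code == HC_ACTION && wParam == WM_KEYDOWN)
    {
        const KBDLLHOOKSTRUCT* kb = reinterpret_cast<const KBDLLHOOKSTRUCT*>(lParam);
        if (kb->vkCode == VK_F12)
        {
            // Hotkey seen: tell the application to show and focus its window,
            // e.g. by posting an event to the Qt event loop.
        }
    }
    return CallNextHookEx(g_hook, code, wParam, lParam);
}

void InstallHook()
{
    // WH_KEYBOARD_LL does not require a separate DLL, but the installing
    // thread must run a message loop.
    g_hook = SetWindowsHookEx(WH_KEYBOARD_LL, LowLevelKeyboardProc,
                              GetModuleHandle(nullptr), 0);
}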

Retrieving Menu in Explorer

As the context menu for the desktop and explorer windows is disabled, I wanted to make a little something to bring back some functionality. My idea was to just list out things in a context menu (copy, paste, new, open with, etc) whenever a user right-clicks one of these windows, and then just simulate the appropriate event in the actual menu (file->new, edit->copy, etc). It wouldn't look perfectly pretty, but it would hopefully allow for the use of right-clicking.
The problem is that I cannot seem to get the actual menu. I opened My Documents and tried walking down the child window list towards SysListView32, calling GetMenuItemCount on each window. Most calls returned -1, and the only other return value was 0.
How am I supposed to get a handle to the (file, edit, view...) menu?
If this isn't possible, is there a way I could simulate the user clicking something on the normal context menu, even if it's disabled?
Also, is there a way of making this work for the desktop? You can get the same type of thing if you view it in the explorer window, so I figured there might be a way.
I'm running Windows XP and any help is appreciated.
As per David Heffernan's comment:
"As for your question, you are on the wrong track. GetMenuItemCount needs an HMENU but you've been feeding it an HWND. That won't work. It also won't work from a different process. You could possibly write a program that uses the shell COM APIs to show a context menu for a shell item. But your basic problem is the bone-headed group policy. You really need to get that fixed. Tell the IT guy that takes the decision that I said he was a fool and was stopping you doing any useful work. ;-)"
This led me onto the path of using the correct alternative method to achieve my goal.
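For reference, the shell COM route that comment points at looks roughly like this. It is a heavily abbreviated sketch: error handling is omitted, COM is assumed to be initialized already, and the function and parameter names are illustrative:

#include <windows.h>
#include <shlobj.h>

void ShowShellContextMenu(HWND hwndOwner, PCWSTR path, POINT pt)
{
    PIDLIST_ABSOLUTE pidl = nullptr;
    SHParseDisplayName(path, nullptr, &pidl, 0, nullptr);

    IShellFolder* folder = nullptr;
    PCUITEMID_CHILD child = nullptr;
    SHBindToParent(pidl, IID_PPV_ARGS(&folder), &child);

    IContextMenu* menu = nullptr;
    folder->GetUIObjectOf(hwndOwner, 1, &child, IID_IContextMenu, nullptr,
                          reinterpret_cast<void**>(&menu));

    HMENU hMenu = CreatePopupMenu();
    menu->QueryContextMenu(hMenu, 0, 1, 0x7FFF, CMF_NORMAL);

    // Let the user pick an item, then invoke the chosen verb.
    UINT cmd = TrackPopupMenu(hMenu, TPM_RETURNCMD, pt.x, pt.y, 0, hwndOwner, nullptr);
    if (cmd != 0)
    {
        CMINVOKECOMMANDINFO info = { sizeof(info) };
        info.hwnd   = hwndOwner;
        info.lpVerb = MAKEINTRESOURCEA(cmd - 1);   // offset by idCmdFirst (1 above)
        menu->InvokeCommand(&info);
    }

    DestroyMenu(hMenu);
    menu->Release();
    folder->Release();
    CoTaskMemFree(pidl);
}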

How to mix C++ and external buttons on a separate window?

I want to create a C++ button on another application's window (the Start > Run dialog, for example), but when I do, I never receive the click event.
Sorry, I can see that the question wasn't clear. Basically, I create a button with CreateWindowEx() and then move it onto a different window with SetParent(), which I have already done. Now the button doesn't work from my program's point of view, so I need my program to somehow find out when it is clicked in, for example, the Run window.
And yes, you've got it: the problem isn't creating the button, it's finding out from my program when it has been clicked, since the button doesn't belong to it anymore.
You need to apply the ancient but still-supported technique known in Windows as subclassing; it is well explained here (a 15-year-old article, but still quite valid ;-). As the article puts it,
Subclassing is a technique that allows an application to intercept messages destined for another window. An application can augment, monitor, or modify the default behavior of a window by intercepting messages meant for another window.
You'll want "instance subclassing", since you're interested only in a single window (either your new button, or the one you've SetParented your new button to); if you decide to subclass a window belonging to another process, you'll also need to use the injection techniques explained in the article, such as injecting your DLL into system processes and watching over events with a WH_CBT hook, and the like. But I think you could keep the button in your own process even though you're SetParenting it to a window belonging to a different process, in which case you can do your instance subclassing entirely within your own process, which is much simpler if feasible.
"Superclassing" is an alternative to "subclassing", also explained in the article, but doesn't seem to offer that many advantages when compared to instance subclassing (though it may compared with global subclassing... but, that's not what you need here anyway).
You'll find other interesting articles on such topics here, here, and here (developing a big, rich C++ library for subclassing -- but also showing a simpler approach based on hooks, which you might prefer). Each article has a rather different stance and different code examples, so I think that having several to look at may help you find the right mindset and code for your specific needs!
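To make the "instance subclassing" option concrete, here is a minimal in-process sketch (all names illustrative). As noted above, this form only works for windows owned by your own process; crossing process boundaries requires the injection techniques from the article:

#include <windows.h>

static WNDPROC g_oldButtonProc = nullptr;

LRESULT CALLBACK ButtonSubclassProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_LBUTTONUP)
    {
        // The injected button was clicked; notify our own code however we like.
        MessageBeep(MB_OK);
    }
    // Everything else goes to the button's original window procedure.
    return CallWindowProc(g_oldButtonProc, hWnd, msg, wParam, lParam);
}

void SubclassButton(HWND hButton)
{
    g_oldButtonProc = reinterpret_cast<WNDPROC>(
        SetWindowLongPtr(hButton, GWLP_WNDPROC,
                         reinterpret_cast<LONG_PTR>(ButtonSubclassProc)));
}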
OK, I'll do my very best. As I understand it, you're trying to inject a button into some existing window; that is, your tool creates a button in some window that does not belong to your application. Now, you want to be notified when that button is pressed. Am I correct so far?
To be notified about the button being pressed, you need to get the respective window message, which will only work if you also "inject" a different WndProc into the window. Actually I have no idea how that should work, but I faintly remember functions like GetWindowLong and SetWindowLong. Maybe they will help?
EDIT
I've searched MSDN a little: while you can get the address of a window's WndProc using GetWindowLong, you cannot set the WndProc with SetWindowLong for a window that belongs to a different process (documented for Windows NT/2000/XP, and up I suppose). See here (MSDN).
So what you could do is install a global message hook that intercepts all window messages, filter them for the window you've injected the button into, and then find your message. If you have trouble with this, however, I'm the wrong person to ask, because it's been years since I've done anything like that; it would be material for a new question.
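A sketch of the hook side of that idea, assuming a WH_GETMESSAGE hook (which sees posted messages such as button clicks). The procedure has to live in a DLL so the system can load it into the target process; how the DLL learns the target window handle is left as illustrative plumbing:

#include <windows.h>

static HWND g_targetButton = nullptr;   // handed to the DLL by some IPC mechanism (illustrative)

LRESULT CALLBACK GetMsgHook(int code, WPARAM wParam, LPARAM lParam)
{
    if (code == HC_ACTION && wParam == PM_REMOVE)
    {
        const MSG* msg = reinterpret_cast<const MSG*>(lParam);
        if (msg->hwnd == g_targetButton && msg->message == WM_LBUTTONUP)
        {
            // The injected button was clicked; report back to the controlling
            // process, e.g. with PostMessage to a window it owns.
        }
    }
    return CallNextHookEx(nullptr, code, wParam, lParam);
}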
EDIT 2
Please see Alex Martinelli's answer for how to define the hook. I think he's describing the technique I was referring to when I talked about defining global message hooks to intercept the window messages for the window you injected your button into.