I am trying to deploy a cross-platform Qt application written in C++. It works fine on my Ubuntu Linux machine, but when I run it on Windows the application's main window gets positioned at the very top-left point of the screen, with the upper frame (the one that holds the minimize, maximize and close buttons) missing.
That is, until I resize the main window (in this case by making the width smaller from the right). When this happens, the upper frame and the control buttons appear, as in the visualization I provided.
Note: I've removed all widgets from the app so they do not appear as a distraction.
Note 2: It appears the maximize button is disabled, which is not the case on Ubuntu. I have not set any window flags.
How do I get the upper frame to show at the very start of the application, without the need to resize the window? I understand it's OS-specific behaviour. Setting the main window's geometry with a starting point with a higher y value does NOT help; it still pops up at the very top left of the screen.
Try using QWidget::move to set the window position after setGeometry. As the Qt documentation puts it:
If the widget is a window, the position is that of the widget on the
desktop, including its frame.
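For illustration, a minimal sketch of that suggestion inside the MainWindow constructor might look like this (a sketch only, not the code from the question):

setGeometry(0, 0, 1336, 600);   // sets the geometry of the client area only
move(0, 0);                     // repositions the window so that the frame's
                                // top-left corner (title bar included) is at (0, 0)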
You ask a question about cross-platform UI code, and then you don't show the full code. Please show the full code.
The one line of code you do show does something the wrong way: if you want to maximize a window, call the appropriate function (QWidget::showMaximized()) instead of setting an absolute size that you think makes the window look maximized. Windows, their decorations and their placement are very, very platform-specific, and you should prefer the cross-platform abstractions over trying to do them yourself.
Concretely: the window positioning handles the decorations (title bar) differently on Windows and on Ubuntu. There is absolutely nothing you can do about it except not position your windows absolutely like this.
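For example, if all you want is a maximized main window, a minimal sketch (a hypothetical main(), not the question's code) would be:

#include <QApplication>
#include <QMainWindow>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    QMainWindow window;
    // Let Qt and the platform window manager decide on frame and placement:
    window.showMaximized();
    return app.exec();
}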
In the MainWindow constructor, at the end: this->setGeometry(0, 0, 1336, 600);
That's the problem. setGeometry deals with the geometry of the client area; this is well documented. You should use move to change the position of the widget's frame. Setting the size of the frame requires knowing the frame's incremental width and height:
bool setWidgetFrameGeometry(QWidget *w, const QRect &r) {
    // Difference between the frame geometry and the client geometry,
    // i.e. the size taken up by the window decorations.
    auto frame = w->frameGeometry().size();
    auto client = w->size();
    auto delta = frame - client;
    auto maxDelta = 128;
    // Only proceed once the platform has reported a sane frame size.
    if (delta.width() > 0 && delta.width() < maxDelta
            && delta.height() > 0 && delta.height() < maxDelta) {
        w->move(r.topLeft());          // position of the frame's top-left corner
        w->resize(r.size() - delta);   // client size = frame size minus decorations
        return true;
    }
    return false;
}
The call may need to be deferred until the event loop has had a chance to run:
auto setter = [this]{ return setWidgetFrameGeometry(this, {0, 0, 1336, 600}); };
if (!setter())
    QTimer::singleShot(0, this, setter);
Related
I've got an app that programmatically moves its window around the user's screen. My problem:
User has two physical monitors
User starts app in primary monitor
App moves window, causing more of it to "overflow" into the secondary monitor than is on the primary monitor
This causes the app's window to entirely jump to that secondary monitor and disappear from the primary monitor.
I would like the window to stay on the primary monitor unless I want it to go to the secondary monitor, or until some defined point (e.g. the center) goes into the secondary monitor; it would even be OK if the window were split and shown on both. Is there any way to prevent this jump from happening as soon as the window intersects more with the secondary monitor?
To reiterate, I'm moving the window programmatically and am using Electron on macOS.
Also, this doesn't appear to happen when I move the window manually, but I think that's because a manual drag doesn't use the percentage rule and instead goes by whether or not the mouse pointer has entered the secondary monitor.
Also, I'm open to any kind of solution, including C/C++ or Swift.
EDIT: Here's how I'm currently moving the window:
win.setPosition(x, y);
where win is an instance of Electron's BrowserWindow.
The desired behavior is to be able to move the window anywhere on a display. Currently, if enough of the window goes off the edge of the current display, it jumps to another display. This is because Apple's default behavior is to automatically move a window to the display with which it overlaps the most and hide it on all other displays. It is worth noting that this is not the behavior when dragging a window manually, only when moving it programmatically.
I don't know how you are moving this window, or how you want the window to act when it hits the edge, but I will do my best to give you some solutions.
My first thought was for you to capture the bounds of the screen when the program starts.
So maybe use GetSystemMetrics(), screen.getPrimaryDisplay(), or perhaps this pseudo-code helps:
var canvas = GetWindowSize()
var canvasWidth = canvas.width
var canvasHeight = canvas.height

if myWindow.x + myWindow.width > canvasWidth
    // if you want it to bounce off the edge
    then myWindow.direction = oppositeDirection
    // if you want it to wrap to the other side of the screen
    then myWindow.x -= canvasWidth
You should check out this Bouncing DVD Logo code.
Or look at my Asteroids game to see how the ship and asteroids move around the screen.
Furthermore, you can use the distance formula to have the window react to the edge when the center gets close. (Although, this would also require you to get the screen bounds x and y.)
Hope that helps!
EDIT:
I think if you were to somehow use the entire screen (both monitors), your window wouldn't even recognize that there is an edge. That being said, I still don't know what your code actually does.
It's not working when you do win.setPosition(x, y); because of macOS behaviour.
(I also tried with the electron-quick-start project https://github.com/electron/electron-quick-start, even with the same electron version 6.0.5.)
What I did is programmatically simulate a mouse drag of your app to the right of the screen. To do it you can use robotjs. You will not see the dragging effect, so it's pretty close to doing a setPosition(x, y).
When you install it you could have an issue when starting your app; in that case you have to rebuild robotjs for your version of electron: https://github.com/octalmage/robotjs/wiki/Electron
So this is what you will do programmatically.
Working code
// in main.js; 'mainWindow' is your BrowserWindow instance
const electron = require('electron');
const robot = require('robotjs');

// get your app bounds and screen bounds
const windowBounds = mainWindow.getBounds();
const displayBounds = electron.screen.getPrimaryDisplay().bounds;

setTimeout(() => {
  // 1. move the mouse onto your app's title bar, near the top-left corner
  robot.moveMouse(windowBounds.x + 100, windowBounds.y + 5);
  // 2. press the left mouse button down
  robot.mouseToggle("down", "left");
  // 3. drag the mouse (and the window with it) to the top-right of the primary display
  robot.dragMouse(displayBounds.width, 0);
  // 4. release the left mouse button
  robot.mouseToggle("up", "left");
}, 100);
Example in main.js: https://playcode.io/electron
I used these versions:
"devDependencies": {
"electron": "^6.0.5"
},
"dependencies": {
"robotjs": "^0.6.0"
}
and my version of nodejs is v10.20.0.
Maybe it's not quite what you asked for, but here's a thought: why not resize the window instead of letting it "overflow" to other screens?
When you predict that the window will go out of the bounds of the current display, just reduce its size.
It can be done by something along these lines:
const { screen } = require('electron');

const winWidth = 800;
const winHeight = 600;
const SAFETY_MARGINS = 10;

let mainWindow;

// setPosition: clamp the window to the primary display, shrinking it if needed
function setPosition(x, y) {
  const displayBounds = screen.getPrimaryDisplay().bounds;
  // clamp the requested position to the display bounds
  const fixedX = Math.min(
    displayBounds.x + displayBounds.width,
    Math.max(x, displayBounds.x)
  );
  const fixedY = Math.min(
    displayBounds.y + displayBounds.height,
    Math.max(y, displayBounds.y)
  );
  mainWindow.setPosition(fixedX, fixedY);
  // shrink the window so it still fits between the requested position and the display edge
  const fixedWidth = Math.max(SAFETY_MARGINS,
    Math.min(winWidth, displayBounds.width + displayBounds.x - x, winWidth - displayBounds.x + x));
  const fixedHeight = Math.max(SAFETY_MARGINS,
    Math.min(winHeight, displayBounds.height + displayBounds.y - y, winHeight - displayBounds.y + y));
  mainWindow.setSize(fixedWidth, fixedHeight);
}
This version of setPosition will always resize your window so that it won't go beyond display bounds.
You can modify/extend it to allow the window to go out of bounds just a bit and to scroll the window's content (if needed).
I believe that with some tweaks, for most users, resizing and moving the window just slightly off the screen would look more-or-less like actually moving the window off the screen.
Here's a demo moving the window randomly using this code:
https://github.com/yoava/so-61092503
(I didn't test it with dual screens as I only have my laptop with me right now.)
I have a Windows system with two monitors connected to it that together extend the Windows desktop. Now I want to start two Qt applications, but I need to force each of them to a specific monitor: application A always has to open its window on monitor 1, and application B always has to open its window on monitor 2 (no matter where they were opened the last time and no matter where the mouse is located at the moment).
How can this be done automatically? Can it only be done via the screen coordinates of the desktop? If yes: how can I force my QWidget-based window to a specific coordinate? If no: how else can this be done?
To get the number of screens at runtime you can use:
int screenCount = QApplication::desktop()->screenCount();
To get the geometry of a screen, you can use:
QRect screenRect = QApplication::desktop()->screenGeometry(1); // 0-indexed, so this would get the second screen
Moving a window to that position (or resizing it) is then trivial:
yourWindow->move(QPoint(screenRect.x(), screenRect.y()));
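On newer Qt versions, where QDesktopWidget is deprecated, the same idea can be sketched with QScreen (my own illustration, not part of the original answer; moveToScreen is a made-up helper name):

#include <QGuiApplication>
#include <QScreen>
#include <QWidget>

// Move 'yourWindow' to the top-left corner of the screen with the given index.
void moveToScreen(QWidget *yourWindow, int screenIndex)
{
    const QList<QScreen *> screens = QGuiApplication::screens();
    if (screenIndex >= 0 && screenIndex < screens.size()) {
        const QRect screenRect = screens.at(screenIndex)->geometry();
        yourWindow->move(screenRect.topLeft());
    }
}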
I have a little wxWidgets application which can save a few preferences into a simple XML file. Amongst those preferences, I store the position, size and maximized state of my top-level window so that I can restore it on the next launch.
By default, when you maximize a window and then click the maximize button again, you get back your initial (non-maximized) position/size. But when I save my preferences, the only position and size I can get are the maximized ones. So when the user restarts the application and wants to "un-maximize" it, the window still occupies the whole screen.
On Windows XP, I did a little trick which was to call SetMaximize(false) before getting the position and size. This was working just fine. But now I'm on Seven, and this doesn't work anymore. It seems that the SetMaximize(false) is deferred: when I break in the debugger, it works, but during normal execution I always end up with the maximized position/size, as if the unmaximize operation is done in another thread.
So I tried to add a Sleep() just after my "SetMaximize(false)" call, but I need to use a really high value to ensure it's always working, and I don't like that.
So, my question is: is there any way to get the position and size of the non-maximized window? (I also tried to catch resize events, but that only works for the size, and I need the position as well... and I didn't find any "window moved" event.)
Thanks in advance for any help!
I do it with:
wxPoint pos = GetPosition();
wxSize size = GetSize();
and it works with Win7/XP and Linux.
It's easy.
The "window-moved" class is wxMoveEvent and the event-type for catching any move is wxEVT_MOVE. So define a function in your top-level window class,
void MyFrame::OnMove(wxMoveEvent& evt );
Bind it like so:
Bind(wxEVT_MOVE, &MyFrame::OnMove, this);
In both the OnMove function and the OnSize function, check to see if the window is maximized by calling the member-function IsMaximized(). When it returns true, do not change the position and size data.
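For illustration, the handlers could look roughly like this (MyFrame, m_lastPos and m_lastSize are made-up names for your frame class and for the members you write to the preferences file):

void MyFrame::OnMove(wxMoveEvent& evt)
{
    if (!IsMaximized() && !IsIconized()) {
        m_lastPos = GetPosition();   // position of the non-maximized frame
        m_lastSize = GetSize();
    }
    evt.Skip();
}

void MyFrame::OnSize(wxSizeEvent& evt)
{
    if (!IsMaximized() && !IsIconized()) {
        m_lastPos = GetPosition();
        m_lastSize = GetSize();      // size of the non-maximized frame
    }
    evt.Skip();
}

When saving the preferences, write m_lastPos/m_lastSize together with IsMaximized(), so both the maximized state and the last non-maximized geometry can be restored.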
I'm currently trying to enable alt-tabbing out of my fullscreen Xlib OpenGL window, but am having some difficulties. I've tried XUnmapWindow(..), which kind of works, but the resolution does not reset (unless I should be doing that manually?) and my Xlib window does not appear as a minimized window (i.e. I can't alt-tab back into the window, even though the app still seems to be running in the background).
The next thing I tried was changing my window from fullscreen to windowed mode (i.e. re-creating the window over again in windowed mode), but obviously, I'd rather not have to do that.
I'm listening to FocusOut and FocusIn events, and the FocusOut seems to be called when I alt-tab, but I'm just not sure how to get my app to minimize properly. If I don't do anything in my code when a FocusOut event is called, my app doesn't do anything (i.e. I can't minimize the window).
Any help would be appreciated!
Edit: Unfortunately, I've been unable to get X Windows to properly minimize a fullscreen window. So, to work around this problem I've decided to destroy() the fullscreen window and then create() a new window in windowed mode. Seems to work well.
XUnmapWindow() completely removes the window from the display. Minimizing a window happens through EWMH/ICCCM state, so that the window manager knows that the window is still there in some form. And as you already assumed, you're responsible for resetting the screen resolution. This is, BTW, the very same on Windows.
EDIT:
Minimizing a window in Xlib is done with XIconifyWindow, which takes care of setting the right ICCCM properties and unmaps the window. Both must be done to interact properly with the WM. However, X11 only defines the mechanisms, not the policy, so when unmapping a fullscreen window you're also responsible for resetting the screen resolution, as I already wrote above.
On a side note: I suggest you don't change the resolution at all, but instead, if such is available, render to a Framebuffer Object of the target size and map the final result to the full, native screen size. If you combine this with native-resolution text/HUD overlays (I assume this is for a game or similar), you get much higher perceived quality and save the resolution switching. You may even combine this with taking a screenshot of the desktop and gradually fading to your content.
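A rough sketch of that render-to-FBO idea, assuming an OpenGL 3.0+ context and made-up targetW/targetH (render resolution) and nativeW/nativeH (desktop resolution) variables; depth-buffer setup and error checking are omitted:

// One-time setup: a colour texture and an FBO at the lower render resolution.
GLuint fbo = 0, colorTex = 0;
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, targetW, targetH, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);

// Each frame: draw the scene into the FBO at the low resolution ...
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glViewport(0, 0, targetW, targetH);
// ... render the scene here ...

// ... then blit it, scaled, to the native-resolution default framebuffer,
// and draw the HUD/text on top at full resolution afterwards.
glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
glViewport(0, 0, nativeW, nativeH);
glBlitFramebuffer(0, 0, targetW, targetH, 0, 0, nativeW, nativeH,
                  GL_COLOR_BUFFER_BIT, GL_LINEAR);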
EDIT 2, for reference:
XIconifyWindow is just a helper/convenience function; its source code is:
/*
 * This function instructs the window manager to change this window from
 * NormalState to IconicState.
 */
Status XIconifyWindow(Display *dpy, Window w, int screen)
{
    XClientMessageEvent ev;
    Atom prop;

    prop = XInternAtom(dpy, "WM_CHANGE_STATE", False);
    if (prop == None)
        return False;

    ev.type = ClientMessage;
    ev.window = w;
    ev.message_type = prop;
    ev.format = 32;
    ev.data.l[0] = IconicState;
    return XSendEvent(dpy, RootWindow(dpy, screen), False,
                      SubstructureRedirectMask | SubstructureNotifyMask,
                      (XEvent *)&ev);
}
You can try to do it like this:
/* _NET_WM_STATE_ADD is defined by the EWMH spec as 1; there is no header constant for it */
#define _NET_WM_STATE_ADD 1

XEvent xev;
Atom wm_state    = XInternAtom(dpy, "_NET_WM_STATE", False);
Atom wm_hide_win = XInternAtom(dpy, "_NET_WM_STATE_HIDDEN", False);

memset(&xev, 0, sizeof(xev));
xev.type = ClientMessage;
xev.xclient.window = win;
xev.xclient.message_type = wm_state;
xev.xclient.format = 32;
xev.xclient.data.l[0] = _NET_WM_STATE_ADD;
xev.xclient.data.l[1] = wm_hide_win;
XSendEvent(dpy, DefaultRootWindow(dpy), False, SubstructureNotifyMask, &xev);
EDIT
If you have access to the GNOME API, you can use wnck_window_minimize(), or take a look at the source of that function.
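If you do go the libwnck route, a hedged sketch might look like this (my_xlib_window stands for the Window handle of your fullscreen window; it assumes GDK/GTK has been initialized, e.g. via gtk_init, and that you compile and link against libwnck):

#define WNCK_I_KNOW_THIS_IS_UNSTABLE   // libwnck requires this before including its header
#include <libwnck/libwnck.h>

void minimize_via_wnck(unsigned long my_xlib_window)
{
    WnckScreen *screen = wnck_screen_get_default();
    wnck_screen_force_update(screen);   // populate the window list

    for (GList *l = wnck_screen_get_windows(screen); l != NULL; l = l->next) {
        WnckWindow *w = WNCK_WINDOW(l->data);
        if (wnck_window_get_xid(w) == my_xlib_window) {
            wnck_window_minimize(w);
            break;
        }
    }
}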
After many months of trying, searching, reviewing code, etc., I'm unable to find a solution for properly positioning a new window in Qt. In my most basic case I simply want to get the final size of the window and center it under the mouse, shifted so that no part of the window is outside of the screen. I do not wish to have the window appear and then move into position; that produces visual jarring, particularly with desktop effects turned on.
The problems I've encountered, not all of which have proper solutions:
frameGeometry is not always populated before the window has been shown.
frameGeometry is sometimes just completely wrong, in particular on Windows 7.
Prior to display it is not possible to know whether sizeHint or size will be applied, or something else in between. That is, the size policy does not appear predictable.
Note that I know how to save/restore the geometry of a previously created window. Despite Qt defects here as well, I have a working solution.
Also note that I cannot use the window manager's default placement. For non-MDI apps on a multi-monitor setup that placement is terrible (often not even ending up on the same monitor as the mouse).
I'd also like to avoid sub-classing all widgets and dialogs just to implement the solution, as it would not be generic. If this is the only possible way then I'd be willing to consider it (should event filters also not be an option).
Does anybody have good workable solutions?
Edited to look more scientific: I have replaced the arbitrary number of calls to processEvents with a loop that checks the return value.
Edited again: It seems that the new version is not safe: it can get stuck in the loop. So I've put a limit on the number of iterations.
Original:
Tell me about it. If I may be permitted to quote from my own code:
// BIG PAIN: We want to get the dialog box to calculate its own size. But there is
// no simple way to do this. The following seems to work, but only if processEvents
// is called at least twice. God knows why:

setAttribute (Qt::WA_DontShowOnScreen, true) ;   // Prevent screen flicker
show() ;
QEventLoop EventLoop (this) ;
for (int i = 0 ; i < 10 ; i++)
    if (!EventLoop.processEvents()) break ;
hide() ;
setAttribute (Qt::WA_DontShowOnScreen, false) ;

int x = 99 ;   // whatever
int y = 99 ;   // whatever

// Make sure it fits on the screen
QRect ScreenRect = qApp -> desktop() -> availableGeometry (ApplicationData -> mainWindow) ;
if (x + frameGeometry().width() > ScreenRect.right())
    x = ScreenRect.right() - frameGeometry().width() ;
if (x < ScreenRect.x()) x = ScreenRect.x() ;
if (y + frameGeometry().height() > ScreenRect.bottom())
    y = ScreenRect.bottom() - frameGeometry().height() ;
if (y < ScreenRect.y()) y = ScreenRect.y() ;

move (x, y) ;
Try this, with varying numbers of calls to processEvents. (In these calls, the various sub-widgets and sub-sub-widgets size themselves recursively.)
Regarding the problem of not being able to query a window for its size before it's been shown, there is a simple workaround: you can move the window somewhere far outside the screen before showing it. For instance, to center a main window on the primary screen without flickering, I do the following:
MainWindow mainWindow;
QRect primaryScreenGeometry(QApplication::desktop()->screenGeometry());
mainWindow.move(-50000,-50000);
mainWindow.show();
mainWindow.move((primaryScreenGeometry.width() - mainWindow.width()) / 2.0,
(primaryScreenGeometry.height() - mainWindow.height()) / 2.0);
I've only tested this code on Windows XP and Qt 4.8.x. Hopefully it works on other platforms as well.
Have you tried activating the layout? It forces it to calculate sizes and positions:
QLayout::activate()
After this call, your widgets should have their correct size.
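Putting that together with the original goal (center under the mouse without showing the window first), a minimal sketch might be the following, assuming 'dialog' is a hypothetical pointer to a dialog whose layout is already set up; the clamping against availableGeometry from the answers above still applies:

dialog->layout()->activate();   // force the layout to compute sizes now
dialog->adjustSize();           // resize the dialog to its computed sizeHint

const QPoint cursor = QCursor::pos();
dialog->move(cursor.x() - dialog->width() / 2,
             cursor.y() - dialog->height() / 2);
// ...clamp the resulting position against availableGeometry() as shown above, then...
dialog->show();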