We have an application that uses an OpenGL render context in a subwindow to display a large bitmap. However, when a user connects remotely to a box running this app, the OpenGL display stops working, most likely because the remote session's driver supports a smaller maximum texture size.
While we can detect the remote desktop connection starting and ending via WTS_REMOTE_CONNECT, the OpenGL context does not switch to the virtual driver, so we cannot determine the new maximum texture size.
Completely restarting the OpenGL subthread hangs in ChoosePixelFormat, which does not return until I am logged in locally again; otherwise this would be the "bad" fallback solution.
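For reference, the session-change detection described above is typically wired up roughly like this (a minimal sketch; registration calls and error handling are the application's responsibility):

    #include <windows.h>
    #include <wtsapi32.h>   // link with Wtsapi32.lib

    // After creating the window:
    //     WTSRegisterSessionNotification(hWnd, NOTIFY_FOR_THIS_SESSION);
    // and before destroying it:
    //     WTSUnRegisterSessionNotification(hWnd);

    LRESULT CALLBACK WndProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        if (msg == WM_WTSSESSION_CHANGE) {
            if (wParam == WTS_REMOTE_CONNECT) {
                // Session just went remote: the GL context is now backed
                // by a different (virtual) display driver.
            } else if (wParam == WTS_REMOTE_DISCONNECT) {
                // Back on the local console.
            }
            return 0;
        }
        return DefWindowProc(hWnd, msg, wParam, lParam);
    }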
It seems that the application is badly written: the code responsible for detecting context changes and reacting to them accordingly either does not exist or is buggy. Either way, there is not much you can do unless you have access to the source code. You can also report it as a bug to the vendor or the provider you bought it from.
Related
I have written an OpenGL application that always renders full-screen. Occasionally I switch displays while the app is running (physically connecting a different monitor or projector), which may also change the resolution. How can I detect when the display has changed resolution, so that I can update my OpenGL output window and adapt the content to the new resolution?
What I'm looking for is an event or signal of some kind that I can observe when the display mode changes due to a different physical display being connected.
I expect there's some way of getting notified; maybe it's an Xlib thing? I'm just not sure where the event or signal would come from.
I'm working in OpenGL ES, C++, Linux ARM (aarch64).
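If the app runs under X11 (the question hints at Xlib), the XRandR extension delivers exactly this kind of notification. A minimal sketch, assuming an X server is present (on a bare DRM/KMS setup you would watch udev hotplug events instead); build with -lX11 -lXrandr:

    #include <cstdio>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main()
    {
        Display* dpy = XOpenDisplay(nullptr);
        if (!dpy) { std::fprintf(stderr, "cannot open display\n"); return 1; }

        int rrEventBase = 0, rrErrorBase = 0;
        if (!XRRQueryExtension(dpy, &rrEventBase, &rrErrorBase)) {
            std::fprintf(stderr, "RandR extension not available\n"); return 1;
        }

        // Ask RandR to send screen-change events for the root window.
        XRRSelectInput(dpy, DefaultRootWindow(dpy), RRScreenChangeNotifyMask);

        for (;;) {
            XEvent ev;
            XNextEvent(dpy, &ev);
            if (ev.type == rrEventBase + RRScreenChangeNotify) {
                XRRUpdateConfiguration(&ev);  // refresh Xlib's cached geometry
                const XRRScreenChangeNotifyEvent* sc =
                    reinterpret_cast<const XRRScreenChangeNotifyEvent*>(&ev);
                std::printf("display changed: %dx%d\n", sc->width, sc->height);
                // Resize the EGL surface / GL viewport and redraw here.
            }
        }
    }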
With a single monitor my program works in both windowed and fullscreen mode (using any resolution chosen from EnumAdapterModes). But when I plug in my second monitor and run the same code, I can still create a fullscreen device at any resolution from EnumAdapterModes, yet only at the native resolution (1600 x 900) does it display the scene; otherwise the screen is just black, among other problems listed below.
What I've discovered so far:
This problem does not occur in windowed or multihead mode
I can still render to a texture (I had to switch modes to display it though)
All function calls return success codes (including TestCooperativeLevel)
If I try to draw to the back buffer using Clear or the DrawPrimitive functions, or call Present (which still leaves a black screen), then calls to GetRenderTargetData fail, and attempting to lock a volume texture returns different slice pitches at sub-levels
Commercial games that use Direct3D9 (e.g. Portal) have no problem switching between resolutions with my second monitor plugged in, so there must be a solution
The problem seems to be related to the back buffer created by the Direct3D9 runtime, but the only solution I can come up with is to force multihead mode on devices with multiple monitors. Any ideas?
Question that seems to have the same problem but lacks a solution: How do I render a fullscreen frame with a different resolution than my display?
Finally figured it out: it seems to be a driver bug in Windows Vista and later, and using Direct3D9Ex fixed the problem.
I didn't want to use Direct3D9Ex because it was only introduced in Windows Vista and I wanted to support Windows XP as a minimum, but MSDN has some sample code showing how to support both, so it's all good.
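The usual pattern (roughly what the MSDN sample does) is to resolve Direct3DCreate9Ex at run time, so the same binary still loads on XP, where d3d9.dll does not export it. A minimal sketch, with error handling trimmed:

    #include <windows.h>
    #include <d3d9.h>

    typedef HRESULT (WINAPI* PFN_Direct3DCreate9Ex)(UINT, IDirect3D9Ex**);

    IDirect3D9* CreateD3D(bool& isEx)
    {
        isEx = false;
        HMODULE d3d9 = LoadLibraryW(L"d3d9.dll");
        if (!d3d9)
            return NULL;

        PFN_Direct3DCreate9Ex create9Ex = reinterpret_cast<PFN_Direct3DCreate9Ex>(
            GetProcAddress(d3d9, "Direct3DCreate9Ex"));

        if (create9Ex) {
            IDirect3D9Ex* d3dEx = NULL;
            if (SUCCEEDED(create9Ex(D3D_SDK_VERSION, &d3dEx))) {
                isEx = true;
                return d3dEx;   // IDirect3D9Ex derives from IDirect3D9
            }
        }
        // Vista+ path unavailable: fall back to the classic interface.
        return Direct3DCreate9(D3D_SDK_VERSION);
    }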
We are trying to implement WebGL for a 3D user interface. We have some users who need access through a Remote Desktop connection; however, WebGL is disabled there. When loading a page with WebGL over Remote Desktop, the error "Oops... Sorry, experimental-webgl context is not supported on this machine!... " appears and the 3D rendering fails. The 2D rendering works. We were wondering if there is a way around this problem.
OpenGL doesn't work over Remote Desktop because Remote Desktop opens another desktop that isn't attached to the hardware display, so I would assume WebGL has the same limitation. You will need to use a different kind of remote access; VNC, or even join.me, seems to work fine.
I am writing a game in C++ using SDL 1.2.14 and the OpenGL bindings included with it.
However, if the game is in fullscreen and I Alt-Tab out and then back into the game, the results are unpredictable. The game logic still runs, but rendering stops: I only see the last frame that was drawn before the Alt-Tab.
I've made sure to re-initialize the OpenGL context and reload all textures when I get an SDL_APPACTIVE = 1 event, and that works, but only for one Alt-Tab; all subsequent Alt-Tabs stop rendering (I've verified that SDL_APPACTIVE is handled properly each time and the context is set accordingly).
I'd hazard a guess that SDL does something under the hood when minimizing the application that I'm not aware of.
Any ideas?
It's good practice to "slow down" your fullscreen application when it loses focus, for two reasons:
The user may need to Alt-Tab and do something important (like closing a heavy application that's hogging resources). When they switch, the new application takes control, and the OS must release resources from your app as needed.
A modern OS uses the GPU heavily itself, which means it needs to reclaim some graphics memory to keep working.
Try shutting down every GL resource you use when APPACTIVE = 0 and allocating them again on APPACTIVE = 1. If that fixes it, it was "your fault"; if it does not, it's an SDL (or GL, or OS) bug.
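A minimal sketch of that free/reload dance with SDL 1.2, where FreeGLResources() and ReloadGLResources() are placeholders for the application's own teardown and re-upload code:

    #include <SDL.h>

    void FreeGLResources();    // app-specific: glDeleteTextures(), etc.
    void ReloadGLResources();  // app-specific: re-create state, re-upload data

    void PumpEvents(bool& running, bool& active)
    {
        SDL_Event e;
        while (SDL_PollEvent(&e)) {
            if (e.type == SDL_ACTIVEEVENT && (e.active.state & SDL_APPACTIVE)) {
                if (e.active.gain && !active) {        // APPACTIVE = 1: restored
                    ReloadGLResources();
                    active = true;
                } else if (!e.active.gain && active) { // APPACTIVE = 0: minimized
                    FreeGLResources();
                    active = false;
                }
            } else if (e.type == SDL_QUIT) {
                running = false;
            }
        }
    }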
When trying to implement a simple OpenGL application, I was surprised that while it is easy to find plenty of examples and documentation on advanced rendering topics, the Win32 framework side is poorly documented, and most samples and tutorials do not implement it properly even for basic cases, let alone advanced ones like multiple monitors. Despite several hours of searching, I was unable to find a way to handle Alt-Tab reliably.
How should an OpenGL fullscreen application respond to Alt-Tab? Which messages should the app react to (WM_ACTIVATE, WM_ACTIVATEAPP)? And what should the reaction be: change the display resolution, destroy/create the rendering context, or destroy/create some OpenGL resources?
If the application uses an animation loop, suspend the loop, then just minimize the window. If it changed the display resolution or gamma, revert to the settings that were in effect before the change.
There's no need to destroy the OpenGL context or its resources; OpenGL uses an abstract resource model: if another program requires the GPU's RAM or other resources, your program's resources will be swapped out transparently.
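As a rough sketch, a WM_ACTIVATE handler along these lines covers the common case; g_fullscreen, g_animating, and g_devMode (the DEVMODE used when switching modes) are assumed application globals:

    // Inside the window procedure.
    case WM_ACTIVATE:
        if (LOWORD(wParam) == WA_INACTIVE) {
            g_animating = false;                 // suspend the animation loop
            if (g_fullscreen) {
                ChangeDisplaySettings(NULL, 0);  // revert resolution (and gamma)
                ShowWindow(hWnd, SW_MINIMIZE);
            }
        } else {
            if (g_fullscreen)
                ChangeDisplaySettings(&g_devMode, CDS_FULLSCREEN);
            g_animating = true;                  // resume rendering
        }
        return 0;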
EDIT:
Changing the window's visibility status may require resetting the OpenGL viewport, so it's a good idea to either call glViewport appropriately in the display/rendering function, or at least set it in the resize handler, followed by a complete redraw.
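For example, in the window procedure (a sketch; assumes the GL context is current on this thread):

    case WM_SIZE:
        glViewport(0, 0, LOWORD(lParam), HIWORD(lParam));
        InvalidateRect(hWnd, NULL, FALSE);  // force a complete redraw
        return 0;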