Current state and solutions for OpenGL over Windows Remote [closed]

OpenGL and Windows Remote don't play along nicely.
Solutions for this are dependent on the use case and answers are fragmented across the vast depths of the net.
This is a write-up I wish existed when I started researching this, both for coders and non-coders.
Problem:
An RDP session of Windows does not expose the graphics card, at least not directly. For instance, you cannot change the desktop resolution, and graphics card drivers usually just disable their settings menus. Creating an OpenGL context higher than v1.1 fails because of this. The advice "don't use Windows Remote", often given especially in support IRCs, is unfortunately not an option for many: in many corporate environments Windows Remote is a constantly used tool, and an app has to work there as well.
Non-Coder workarounds
You can start the OpenGL program while it can still see the graphics card, let it create its OpenGL context, and then connect via Windows Remote. This always works, as Windows Remote just transfers the window content. This can be accomplished by:
A batch script that closes the session and starts the program, allowing you to reconnect to the already running program. (Source)
Using VNC or another remote-access tool to remote into the machine, starting the program, and then switching to Windows Remote. (Simple VNC program, also with a portable client)
Coder workarounds
(Only for OpenGL ES) Translate OpenGL to DirectX. DirectX works flawlessly under Windows Remote, and DX11 even has a built-in software rendering fallback (WARP) if something fails.
Use the ANGLE project to do this at run time. This is what Qt officially suggests you do and how Chrome and Firefox implement WebGL. (Source)
Switch to software rendering as a fallback. Some CAD software, 3ds Max for instance, does this:
Under SDL2 you can use SDL_CreateSoftwareRenderer; see the sketch after this list. (Source)
GLFW will ship OSMesa support (Mesa's off-screen rendering) with version 3.3; in the meantime you can build the GitHub version with -DGLFW_USE_OSMESA=TRUE, but I personally still struggle to get that running. (Source)
Directly use Mesa's llvmpipe driver for a fast software OpenGL implementation. (Source)
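To illustrate the SDL2 route mentioned above, here is a minimal sketch (an example only, assuming SDL 2.0; window title and size are arbitrary). Note that SDL_CreateSoftwareRenderer gives you SDL's 2D rendering API drawn on the CPU rather than a real OpenGL context, which is precisely why it keeps working inside an RDP session:

// Minimal SDL2 software-rendering sketch (assumes SDL 2.0 headers/libs).
#include <SDL.h>
#include <cstdio>

int main(int argc, char* argv[]) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        std::fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }
    SDL_Window* win = SDL_CreateWindow("Software fallback",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, 0);
    SDL_Surface* surface = SDL_GetWindowSurface(win);     // CPU-side surface
    SDL_Renderer* ren = SDL_CreateSoftwareRenderer(surface);

    SDL_SetRenderDrawColor(ren, 32, 32, 32, 255);          // draw without touching the GPU
    SDL_RenderClear(ren);
    SDL_RenderPresent(ren);
    SDL_UpdateWindowSurface(win);                          // blit the surface to the window

    SDL_Delay(2000);
    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}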
Misc:
Use OpenGL 1.1: Windows has a built-in implementation of OpenGL 1.1 and earlier. Some game engines have a built-in fallback to this and thus work under Windows Remote (a runtime version check for this case is sketched after this list).
Apparently there is a middleware that allows even OpenGL 4 over Windows Remote, but it is part of a bigger package and is a commercial solution. (Source)
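To decide at run time whether you only got Windows' generic OpenGL 1.1 implementation and should take one of the fallback paths above, a simple version check is enough. A minimal sketch, assuming a context has already been created and made current; the helper name is mine:

// Assumes an OpenGL context is already current (e.g. via wglMakeCurrent).
#include <windows.h>   // must come before GL/gl.h on Windows
#include <GL/gl.h>
#include <cstdio>

bool NeedsSoftwareFallback()
{
    const char* version =
        reinterpret_cast<const char*>(glGetString(GL_VERSION));
    if (!version)
        return true;                        // no usable context at all
    int major = 0, minor = 0;
    std::sscanf(version, "%d.%d", &major, &minor);
    std::printf("GL_VERSION: %s\n", version);
    // The generic Windows implementation inside an RDP session reports 1.1.
    return major < 2;
}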
Any other solutions or corrections are greatly appreciated.
[10] NVIDIA provides OpenGL-accelerated Remote Desktop for GeForce: https://www.khronos.org/news/permalink/nvidia-provides-opengl-accelerated-remote-desktop-for-geforce-5e88fc2035e342.98417181

According to this article it seems that now RDP handles newer versions of Direct3D and OpenGL on Windows 10 and Windows Server 2016, but by default it is disabled by Group Policy.
I suppose that for performance reasons, using a hardware graphics card is disabled, and RDP uses a software-emulated graphics card driver that provides only some baseline features.
I stumbled upon this problem when trying to run Ultimaker CURA over standard Remote Desktop from a Windows 10 client to a Windows 10 host. Cura shouted "cannot initialize OpenGL 2.0 context". I also noticed that Repetier Host's "preview" window runs terribly slow, and Repetier detects only an OpenGL 1.1 card. Pretty much fits the "only baseline features" description.
By running gpedit.msc then navigating to
Local Computer Policy\Computer Configuration\Administrative Templates\Windows Components\Remote Desktop Services\Remote Desktop Session Host\Remote Session Environment
and enabling
Use hardware graphics adapters for all Remote Desktop Services sessions
I was able to run Ultimaker Cura over RDP with no issues, and Repetier-Host now reports OpenGL 4.6; everything finally runs as fast as it should.
Note from genpfault:
As usual, this policy is stored in the HKLM registry hive under
HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services
Set the REG_DWORD value bEnumerateHWBeforeSW to 1 to turn on GPU use in RDP sessions.
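An application can also detect this situation at run time and point the user at the policy above. A hedged diagnostic sketch: the key path and value name are the ones from the note above, GetSystemMetrics(SM_REMOTESESSION) reports whether we are inside an RDP session, and the program structure is only an example:

// Diagnostic sketch: are we in an RDP session, and is the GPU policy enabled?
// Link against user32.lib and advapi32.lib.
#include <windows.h>
#include <cstdio>

int main()
{
    if (GetSystemMetrics(SM_REMOTESESSION) == 0) {
        std::puts("Not running inside a Remote Desktop session.");
        return 0;
    }

    DWORD value = 0, size = sizeof(value);
    LSTATUS rc = RegGetValueW(
        HKEY_LOCAL_MACHINE,
        L"SOFTWARE\\Policies\\Microsoft\\Windows NT\\Terminal Services",
        L"bEnumerateHWBeforeSW",
        RRF_RT_REG_DWORD, nullptr, &value, &size);

    if (rc == ERROR_SUCCESS && value == 1)
        std::puts("RDP session with the hardware-adapter policy enabled.");
    else
        std::puts("RDP session without the policy - expect OpenGL 1.1 only.");
    return 0;
}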

OpenGL works great over RDP with professional NVIDIA cards, without needing anything like virtual machines or RemoteFX. For Quadro (Quadro 4000 tested) you need driver 377.xx; you can use the same driver for the M60. If you want to use the latest driver with the M60, you have to switch the driver to WDDM mode (see c:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.1.pdf). There may be licensing issues in that last case.

Some people recommend:
using "tscon.exe" if you can: https://stackoverflow.com/a/45723167/32453
using a scheduler to do it on the native hardware: https://stackoverflow.com/a/41839102/32453
creating a group policy: https://community.esri.com/thread/225251-enabling-gpu-rendering-on-windows-server-2016-windows-10-rdp
or copying a software-rendering opengl32.dll (or opengl64.dll) into your executable's directory: https://blender.stackexchange.com/a/73014 (a newer build of the DLL: https://fdossena.com/?p=mesa/index.frag)

Remote Desktop and OpenGL do not play well together. When you connect to a Windows box, the OpenGL driver is unloaded and you end up with software emulation of OpenGL.
When you disconnect from the Windows box, the OpenGL driver is not reloaded. This causes issues when you are running tests on the machine, as you have to physically log in to the machine to reset the drivers.
The solution I ended up using was to:
Disable Remote Desktop.
Delete all other remote desktop software, because if it is used to log in remotely, the currently loaded set of drivers may end up in a bad state.
Install NoMachine
NoMachine is my personal favourite (when it does not play up) for a number of reasons:
Hardware-accelerated compression of the desktop video stream.
Works on Windows and Linux.
Works well on low-bandwidth connections especially if the client and server have the necessary hardware for compression of the data stream.
On Linux you get your desktop as you last left it when you were sitting in front of the machine.
On Windows it does not affect OpenGL.
Currently free for personal and commercial use; do check the licence in case that has changed.
When NoMachine does play up it hogs the CPU, but this happens rarely, and it is under active development.
Others to consider:
TurboVNC
TightVNC
TeamViewer - only free for personal use.

Related

DirectX11 Desktop duplication not working with NVIDIA

I'm trying to use the DirectX desktop duplication API.
I tried running examples from
http://www.codeproject.com/Tips/1116253/Desktop-Screen-Capture-on-Windows-via-Windows-Desk
And from
https://code.msdn.microsoft.com/windowsdesktop/Desktop-Duplication-Sample-da4c696a
Both of these are examples of screen capture using DXGI.
The machine has an NVIDIA GeForce GTX 1060, runs Windows 10 Pro, and has an Intel Core i7-6700HQ processor.
These examples work perfectly fine when NVIDIA Control Panel > 3D Settings is set to "Auto-select processor".
However, if I set it manually to the NVIDIA graphics card, the samples stop working.
The error occurs at the following line:
//IDXGIOutput1* DxgiOutput1
hr = DxgiOutput1->DuplicateOutput(m_Device, &m_DeskDupl);
The HRESULT in hr is DXGI_ERROR_UNSUPPORTED (0x887A0004).
I'm new to DirectX and I don't know the issue here. Is DirectX desktop duplication not supported on NVIDIA?
If that's the case, is there a way to select a particular GPU at the start of the program so that it can run with any settings?
Edit:
After looking around, I asked the developer (Evgeny Pereguda) of the second sample project on CodeProject.
Here's a link to the discussion
https://www.codeproject.com/Tips/1116253/Desktop-Screen-Capture-on-Windows-via-Windows-Desk?msg=5319978#xx5319978xx
A screenshot of the discussion was posted with the original question in case the link goes down (not reproduced here).
I also found an answer on Stack Overflow which unequivocally suggests that it cannot be done with the Desktop Duplication API, referring to an article on Microsoft's support site: https://support.microsoft.com/en-us/help/3019314/error-generated-when-desktop-duplication-api-capable-application-is-ru
Quote from the ticket
This issue occurs because the DDA does not support being run against the discrete GPU on a Microsoft Hybrid system. By design, the call fails together with error code DXGI_ERROR_UNSUPPORTED in such a scenario.
However, there are some applications which efficiently duplicate the desktop on Windows in both modes (integrated and discrete graphics) on my machine. (https://www.youtube.com/watch?v=bjE6qXd6Itw)
I have looked into the installation folder of Virtual Desktop on my machine and can see the following DLLs of interest:
SharpDX.D3DCompiler.dll
SharpDX.Direct2D1.dll
SharpDX.Direct3D10.dll
SharpDX.Direct3D11.dll
SharpDX.Direct3D9.dll
SharpDX.dll
SharpDX.DXGI.dll
SharpDX.Mathematics.dll
It's probably an indication that this application uses DXGI to duplicate the desktop, or maybe the application is able to select a specific GPU before it starts.
Anyway, the question remains: is there any other efficient method of duplicating the desktop in both modes?
The likely cause is an internal limitation of the Desktop Duplication API, described in Error generated when Desktop Duplication API-capable application is run against discrete GPU:
... when the application tries to duplicate the desktop image against the discrete GPU on a Microsoft Hybrid system, the application may not run correctly, or it may generate one of the following errors:
Failed to create windows swapchain with 0x80070005
CDesktopCaptureDWM: IDXGIOutput1::DuplicateOutput failed: 0x887a0004
The article does not suggest any workaround other than using a different GPU (without any more specific detail as to whether this is achievable programmatically):
To work around this issue, run the application on the integrated GPU instead of on the discrete GPU on a Microsoft Hybrid system.
Microsoft introduced a registry value that can be set programmatically to control which GPU an application runs on. Full answer here.
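Beyond the registry value, one programmatic route is to stop relying on the control-panel preference and create the D3D11 device explicitly on the adapter that owns the output being duplicated; on a hybrid system that is normally the integrated GPU driving the display, which matches the workaround the article describes. A sketch (not taken from the article; the helper name is mine and error handling is trimmed):

// Create the duplication against the adapter that actually owns an output.
// Link against dxgi.lib and d3d11.lib.
#include <dxgi1_2.h>
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

HRESULT DuplicateFirstAvailableOutput(ComPtr<IDXGIOutputDuplication>& dupl)
{
    ComPtr<IDXGIFactory1> factory;
    HRESULT hr = CreateDXGIFactory1(IID_PPV_ARGS(&factory));
    if (FAILED(hr)) return hr;

    // Walk the adapters until one exposes an output (i.e. drives a monitor).
    for (UINT a = 0; ; ++a) {
        ComPtr<IDXGIAdapter1> adapter;
        if (factory->EnumAdapters1(a, &adapter) == DXGI_ERROR_NOT_FOUND)
            break;

        ComPtr<IDXGIOutput> output;
        if (FAILED(adapter->EnumOutputs(0, &output)))
            continue;   // this adapter drives no display (typical for the dGPU)

        // Create the device on *this* adapter so DuplicateOutput matches it.
        ComPtr<ID3D11Device> device;
        hr = D3D11CreateDevice(adapter.Get(), D3D_DRIVER_TYPE_UNKNOWN, nullptr,
                               0, nullptr, 0, D3D11_SDK_VERSION,
                               &device, nullptr, nullptr);
        if (FAILED(hr)) continue;

        ComPtr<IDXGIOutput1> output1;
        if (FAILED(output.As(&output1))) continue;

        hr = output1->DuplicateOutput(device.Get(), &dupl);
        if (SUCCEEDED(hr)) return S_OK;   // duplication is up and running
    }
    return DXGI_ERROR_UNSUPPORTED;
}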

OpenGL Shader exception in Vncviewer [closed]

I connect to a remote Ubuntu 12.04 64-bit machine with the vncviewer application. But when I run an OpenGL application, it shows this exception:
Caught exception GLShader::GLShader: GL_ARB_shader_objects not supported while initializing rendering windows
But if I connect a monitor to the remote computer, it works well and shows the OpenGL application.
Is there any way to make the OpenGL application run in the remote window through vncviewer? Thanks!
UPDATE: Screenshots of the ~/.vnc/xstartup file on the remote Ubuntu 12.04 64-bit server and of the VNC Viewer client on the Windows 7 32-bit machine were attached to the original question (not reproduced here).
Usually on Linux the VNC server is a dedicated variant of the Xorg X11 server (Xvnc) that uses a software-based renderer backend and has no GPU acceleration. I guess you're using an NVIDIA GPU with the NVIDIA proprietary drivers, or an AMD GPU with the AMD proprietary drivers, because otherwise Mesa's softpipe implementation would have kicked in.
If you really want to use the GPU, you'll have to VNC into a running X11 session in which you start the x11vnc server.
Update
First things first: for the GPU to work, an X server must be running and have its output sent to the display connectors. Sorry, the current driver model doesn't allow for a purely off-screen, GPU-accelerated X11 server; this is not a limitation of the hardware, just of the Xorg X11 server implementation. This also means that whatever you're doing will be visible to whoever connects a monitor to the machine. At least we can make sure that nobody messes with the mouse and keyboard.
Create a custom /etc/X11/xorg.vnc.conf consisting of this:
Section "ServerFlags"
Option "AllowEmptyInput" "true"
Option "AutoAddDevices" "off"
Option "DontZap" "false"
Option "DontVTSwitch" "true"
Option "HandleSpecialKeys" "Never"
EndSection
Section "Device"
Identifier "DeviceGPU"
Driver "nvidia"
EndSection
Next, implement a script that starts everything you want to run in that particular X11 session. Most of the time this will be something that launches the x11vnc server and then execs into the desktop environment, e.g.
#!/bin/sh
x11vnc -display $DISPLAY &
exec startxfce4 # or whatever
I refer you to the manpage of x11vnc on how to configure the authentication to use.
Lastly, you should check that the Xorg server binary is SUID root; the NVIDIA driver still does not make full use of KMS and depends on the X server being started with full privileges.
Once these prerequisites are met, you can start an X11 session that supports VNC using
xinit $FULL_PATH_TO_YOUR_SESSION_SCRIPT -- $DISPLAY -config xorg.vnc.conf
where $DISPLAY is a free X11 display number.

OpenGL (v >=3) application on remote machine

Is there a way to start an application with OpenGL >= 3 on a remote machine?
Local and remote machine run on Linux.
More precisely, I have the following problem:
I have an application that uses Qt for GUI stuff and OpenGL for 3D rendering.
I want to start this application on several remote machines because the program does some very time consuming computation.
Thus, I created a version of my program that does not raise a window. I use QGuiApplication, QOffscreenSurface, and a framebuffer object as rendertarget.
BUT: When I start the application on a remote machine (ssh -Y remotemachine01 myapp) I only have OpenGL version 2.1.2. When I start the application locally on the same machine, I have OpenGL 4.4. I suppose X forwarding is the problem.
So I need a way to avoid X forwarding.
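For reference, the offscreen setup described above looks roughly like the following minimal sketch (assuming Qt 5; the FBO size is arbitrary and the version check is only there to show what the forwarded connection reports):

// Offscreen Qt rendering sketch: QGuiApplication + QOffscreenSurface + FBO.
#include <QGuiApplication>
#include <QOffscreenSurface>
#include <QOpenGLContext>
#include <QOpenGLFramebufferObject>
#include <QDebug>

int main(int argc, char** argv)
{
    QGuiApplication app(argc, argv);

    QOpenGLContext ctx;
    if (!ctx.create()) {
        qWarning("OpenGL context creation failed");
        return 1;
    }

    QOffscreenSurface surface;
    surface.setFormat(ctx.format());
    surface.create();
    ctx.makeCurrent(&surface);

    // Locally this reports 4.4; over "ssh -Y" GLX indirect rendering
    // caps out at OpenGL 2.1, which is the problem described above.
    qDebug() << "OpenGL" << ctx.format().majorVersion()
             << "." << ctx.format().minorVersion();

    // Render into an FBO instead of a window.
    QOpenGLFramebufferObject fbo(1024, 1024,
        QOpenGLFramebufferObject::CombinedDepthStencil);
    fbo.bind();
    // ... expensive rendering goes here ...
    fbo.release();

    ctx.doneCurrent();
    return 0;
}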
Right now there's no clean solution, sorry.
GLX (the OpenGL extension to X11 which does the forwarding) is only specified up to OpenGL 2.1, hence your inability to forward an OpenGL 3 context. This is actually a ridiculous situation, because the "OpenGL 3 way" is much better suited for indirect rendering than old-fashioned OpenGL 2.1 and earlier. Khronos really needs to get their act together and specify GLX 3.
Your best bet would be either to fall back to a software renderer on the remote side combined with some form of X compression, or to use Xpra backed by an on-GPU X11 server; however, that works for only a single user at a time.
In the not-too-far future, the upcoming Linux graphics driver model will allow remote GPU rendering by multiple users sharing graphics resources, but we're not there yet.

What is the effectiveness of using the compatibility feature for older operating systems in Windows XP, Vista, 7 and 8?

I've been trying to research why certain compatibility features differ between operating systems so I can program a patch. I'm using the Windows 95 compatibility settings in the registry to run a game (the OS the game was originally produced for) on each system. In Windows XP the game runs perfectly: none of the scenes lag, and the sound works just as well as the scenes. I'm unsure how it runs in Windows Vista, but in Windows 7 and 8 the compatibility feature breaks the game. I used a VM to run XP, but that doesn't affect the game's playability; real XP users have tested it. Whenever I play the game using the Win95 compatibility setting on 7 and 8, everything lags. The music doesn't slow down during gameplay, but the graphics do. During cutscenes they literally break: everything pixelates, white noise and static increase in volume, and the video lags every two seconds.
I then tested it in Ubuntu Linux via WINE, and it runs better than it does in XP; I just had to use the ALSA sound driver. What changed, and is it programmatically fixable? I'm using an amalgamation of C++, Batch and Java.
If it is necessary, the video game is entitled "The Neverhood."
Thanks.
The compatibility feature available in the shell is just scratching the surface of the "Application Compatibility" subject in Windows.
There is a tool called the Microsoft Application Compatibility Toolkit (ACT) (which has existed since Windows XP, I believe) that has much more to offer, so maybe that can help.
For example, here are some compatibility settings for graphics control issues.
I currently play "The Neverhood" on Win7 x64 without any visual problems. You are right: when I played it on Win7 for the first time (4 years ago) it was a headache, and it was a little tricky to work out the correct compatibility flags for each Windows version, but I finally wrote this reg code for Win7 and it has worked for me for 4 years; surely it will work for you too:
Windows Registry Editor Version 5.00
[HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers]
"C:\\Folder\\nhc.exe"="# WIN95 256COLOR 640X480 DISABLEDWM"
Where "C:\\Folder\\nhc.exe" of course is the path to your Neverhood. (Notice the double backslashes)
that flags means: Change Display color to 256 colors, change display resolution to 640x480, disable Themes service (DWM Service).
I hope this help you.
This may not answer the question directly, but if you want to improve the performance of The Neverhood, set compatibility mode to Windows 95 and then switch all the other options on, except the bottom three. This helps make the game as fast and smooth as possible.

How to fallback to software rendering in Java3D?

We are having some weird problems using Java3D over Windows Remote Desktop. The remote machine is a virtualized server which cannot use the physical server's graphics card. When I run the app, the following error pops up:
Unable to create DirectX D3D context.
Neither Hardware and Software Renderer are available.
Please update your video card drivers
and get the latest DirectX available at http://microsoft.com/directx
After switching to OpenGL (starting the JVM with -Dj3d.rend=ogl) the same error appears! What could possibly be happening? How can I fall back to software rendering, either with OpenGL or DirectX, when the error appears?
EDIT: I've already tried using another OpenGL vendor, using Mesa3D's DLLs instead of the native ones, but it made no difference. I also installed the DirectX SDK and tried to start Java3D with the reference driver (-Dj3d.d3ddevice=reference), but it didn't work either.
The same error appears because if OpenGL fails, Java3D tries to use DirectX. If that fails too, the pop-up is shown.
I didn't manage to solve it at first because, instead of trying to change things on the remote server, I tried to reproduce the problem on my own machine by disabling the video driver. I still don't know why the two problems aren't equivalent, but after I returned to work on the server and put DirectX's d3dref9.dll into Java's \bin, it worked.
Now I have an entirely new problem, as the JVM can't find the DLL if I place it in java.library.path or Tomcat's \bin :) Problems just can't not exist.
Try the following:
Under Windows:
First, open the Display Properties pane by right-clicking on the desktop and choosing the Properties item in the menu. In that pane, open the Settings tab and click the Advanced button. Then, in the Troubleshoot tab of the pane that opens, check that the Hardware acceleration slider is at its maximum (Full), confirm your choice, and try to run your program again.
If the previous operation didn't resolve your problem, update the OpenGL and DirectX drivers of your graphics card to the latest available ones, and try to run your program again.