Enabling OpenGL triple buffering + vsync in AMD CCC breaks our app - c++

We've got a desktop Windows app written in C++ which uses an OpenGL rendered view.
On some AMD cards, if you open Catalyst Control Center and force Triple Buffering and V-sync on, it breaks our app: nothing renders at all, it's just a grey screen (on some other driver versions, it crashes on creating the context instead). Turning off either triple buffering or V-sync restores it to normal.
We use wglSwapIntervalEXT to enable V-sync in our app (see the sketch below). Thinking it might conflict with the forced driver settings, I removed that code; it made no change.
Is this definitely a driver bug or is there anything different we have to do to handle triple buffering?
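
For reference, here is a minimal sketch of the kind of vsync code the question describes, assuming the usual WGL_EXT_swap_control extension; EnableVSync is a hypothetical helper name:

#include <windows.h>
#include <GL/gl.h>

// Signature from the WGL_EXT_swap_control extension.
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

void EnableVSync()
{
    // Must be called while a GL rendering context is current, because
    // wglGetProcAddress resolves extension entry points per context.
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (wglSwapIntervalEXT)
        wglSwapIntervalEXT(1);  // 1 = wait for one vertical retrace per swap
}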

I have run into this same issue in my own application, and it's been maddening to track down. Here's the additional information I can provide, based on a minimal test application I built to replicate the problem:
1) All of your calls to set the pixel format and create a GL rendering context (RC) will succeed. However, GLDebugger shows that the RC never actually acquires its static buffers.
2) When you try to make the RC current, the call returns FALSE, and GetLastError() reports an invalid handle.
3) I can only replicate this problem in MFC. Is that what you're using? When I built a testbed application using the straight Win32 API, it worked fine, so there must be some obscure interaction at play here.
4) If I delay RC creation until after OnCreate, then things work fine.
I'm afraid my answer leans towards "driver bug", but point #4 suggests a workaround: rather than doing your GL window creation in OnCreate, try doing it as a one-off in OnInitialUpdate (see the sketch below); so far this is working in my tests!
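
A minimal sketch of that workaround, assuming an MFC CView-derived class; CMyGLView and InitGLContext are hypothetical names:

void CMyGLView::OnInitialUpdate()
{
    CView::OnInitialUpdate();

    // By this point MFC has resized the view from its initial dimensions,
    // so pixel-format selection and wglCreateContext see a real window.
    static bool s_initialized = false;
    if (!s_initialized)
    {
        InitGLContext();  // hypothetical: SetPixelFormat + wglCreateContext
        s_initialized = true;
    }
}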
UPDATE: I've contacted AMD about this issue, and it turns out it is a result of MFC initially creating the window with zero width/height and then resizing it. If you assign nonzero dimensions in your PreCreateWindow override, everything works, as the sketch below shows.
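
A minimal sketch of the AMD-suggested fix, again assuming a hypothetical CView-derived class:

BOOL CMyGLView::PreCreateWindow(CREATESTRUCT& cs)
{
    if (!CView::PreCreateWindow(cs))
        return FALSE;

    // MFC creates views at zero size and resizes them afterwards; some AMD
    // drivers fail to allocate buffers for a 0x0 window, so force a real size.
    if (cs.cx == 0) cs.cx = 100;
    if (cs.cy == 0) cs.cy = 100;
    return TRUE;
}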
Hopefully this will be a good resource for everyone trying to figure out what's going on with this!

Related

Windows 10 Desktop Window Manager swap timing?

I have a few questions regarding the Desktop Window Manager (aka DWM) in Windows 10:
Background: For an OpenGL application I wrote in C++, I need precise timing of when the front and back buffers are swapped in OpenGL and of when those commands actually take effect at the OS level. (I know Windows 10, or Windows in general, is a bad choice for this, but there are other limiting factors.)
Question 1: My internet research showed that the DWM manages a third buffer (making visualization a triple buffered system) which I cannot control and therefore creates an unpredictable delay. The investigation also showed that this can be bypassed by opening an OpenGL context in fullscreen mode. Is this information correct?
Question 2: Is this delay caused by the fact that the OS randomly instructs the DWM to copy the buffer?
Question 3: How long is the actual delay? My investigation turned up numbers between under 1 ms and up to 50 ms, but there was no trustworthy source.
In fact, apart from the mere existence of the delay, I was unable to find a trustworthy source on the internet for any of these claims. I therefore kindly ask anyone who has an answer to these questions to include, if possible, a reference for their statement.
I don't know if this is important, but I'm using OpenGL via GLFW and GLEW.
Although I was unable to find an answer to questions 2 and 3, contacting Nvidia support provided the answer to question 1.
Nvidia stated that an application rendered in a fullscreen context is not handled by the DWM; only applications rendered in windowed mode are composited by it.
Warning: they also said that this is by design. Given that Microsoft is pushing users and programmers towards the DWM, there is no guarantee how long this design decision will remain unchanged.
Original mail from Nvidia:
[...]
After checking your request with our specialized department, please note that when a game or anything is in Full Screen you cannot access this Windows Feature [annot.: DWM]. This is by design. It needs to be windowed mode if you want to access this feature.
[...]
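
Since the question already uses GLFW, here is a minimal sketch of requesting a fullscreen window on the primary monitor, which is the mode Nvidia's answer says bypasses the DWM; whether the driver actually promotes it to direct flipping is up to the driver:

#include <GLFW/glfw3.h>

int main()
{
    if (!glfwInit())
        return -1;

    GLFWmonitor* monitor = glfwGetPrimaryMonitor();
    const GLFWvidmode* mode = glfwGetVideoMode(monitor);

    // Passing a monitor makes the window fullscreen; matching the current
    // video mode avoids a display mode switch.
    GLFWwindow* window = glfwCreateWindow(mode->width, mode->height,
                                          "Fullscreen GL", monitor, NULL);
    if (!window) { glfwTerminate(); return -1; }

    glfwMakeContextCurrent(window);
    glfwSwapInterval(1);  // vsync: one swap per vertical retrace

    while (!glfwWindowShouldClose(window))
    {
        // ... render ...
        glfwSwapBuffers(window);
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}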

C++ Code for Screen Capturing

I need to write code that does screen sharing, like WebEx or TeamViewer, for a Windows PC. The constraint is that I don't have admin access and cannot install any application or software. I know the techniques below, but none of them works for me. I have tried all the samples from this CodeProject article: http://www.codeproject.com/Articles/5051/Various-methods-for-capturing-the-screen
(1) GetDC(NULL) and BitBlt with SRCCOPY: this does not capture transparent windows, and it hangs GDI (try drawing in Paint: your pencil stalls for a while whenever the BitBlt operation is performed).
(2) GetDC(NULL) and BitBlt with SRCCOPY and the CAPTUREBLT option: this hides the cursor while BitBlt is called, and GDI still hangs during the operation (see the sketch after this list).
(3) I also tried DirectX with GetFrontBufferData, but this causes my transparent window to flicker.
(4) I tried the Windows Media API, but it requires Windows Media Encoder to be installed.
(5) I also tried a mirror driver, but the driver must be installed with admin access.
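
For reference, a minimal sketch of approaches (1) and (2), using only documented GDI calls; it reproduces the limitations described above rather than solving them:

#include <windows.h>

// Captures the whole virtual desktop into a bitmap; caller must DeleteObject().
HBITMAP CaptureScreen()
{
    int x = GetSystemMetrics(SM_XVIRTUALSCREEN);
    int y = GetSystemMetrics(SM_YVIRTUALSCREEN);
    int w = GetSystemMetrics(SM_CXVIRTUALSCREEN);
    int h = GetSystemMetrics(SM_CYVIRTUALSCREEN);

    HDC screenDC = GetDC(NULL);                  // DC for the whole desktop
    HDC memDC    = CreateCompatibleDC(screenDC); // off-screen destination
    HBITMAP bmp  = CreateCompatibleBitmap(screenDC, w, h);
    HGDIOBJ old  = SelectObject(memDC, bmp);

    // CAPTUREBLT includes layered (transparent) windows, but hides the
    // cursor and can stall other GDI drawing while the blit runs.
    BitBlt(memDC, 0, 0, w, h, screenDC, x, y, SRCCOPY | CAPTUREBLT);

    SelectObject(memDC, old);
    DeleteDC(memDC);
    ReleaseDC(NULL, screenDC);
    return bmp;
}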
Can anyone please suggest an API with which I can capture the entire screen without installing anything, and without flicker or GDI hangs?
Thanks in advance.
The problem is that whatever method you use, you have to hook into the system (intercept some OS-to-driver call) so that the system gives you the time to do your operation safely, and that requires the software to run with administrative rights.
All the methods above fail because some internal call lacks sufficient privileges.
If you think about it: if an exe running at user level could tap into such a system call, even for non-system users, the system would have a serious security breach. I would just have to deliver an application that shares your screen without you noticing.
So, instead of trying to fool your company's security policies, just ask your admins: if you need this software for business purposes, they will do what is needed.

How to fallback to software rendering in Java3D?

We are having some weird problems using Java3D over Windows Remote Desktop. The remote machine is a virtualized server which can't use the physical server's graphics card. When I run the app, the following error pops up:
Unable to create DirectX D3D context.
Neither Hardware and Software Renderer are available.
Please update your video card drivers
and get the latest DirectX available at http://microsoft.com/directx
After switching to OpenGL (starting the JVM with -Dj3d.rend=ogl), the same error appears! What could be happening? How can I fall back to software rendering, with either OpenGL or DirectX, when the error appears?
EDIT: I've already tried another OpenGL vendor, using Mesa3D's DLLs instead of the native ones, but that made no difference. I also installed the DirectX SDK and tried to start Java3D with the reference driver (-Dj3d.d3ddevice=reference), but that didn't work either.
The same error appears because if OpenGL fails, Java3D tries to use DirectX; if that fails too, this popup is shown.
I didn't manage to solve it at first because, instead of changing things on the remote server, I tried to reproduce the problem on my own machine by disabling the video driver. I still don't know why the two situations aren't equivalent, but once I returned to working on the server and put DirectX's d3dref9.dll into Java's \bin, it worked.
Now I have an entirely new problem, as the JVM can't find the DLL if I place it in java.library.path or in Tomcat's \bin :) Problems just can't not exist.
Try the following:
Under Windows:
First, open the Display Properties pane by right-clicking the desktop and choosing Properties from the menu. In that pane, open the Settings tab and click the Advanced button. Then, in the Troubleshoot tab of the pane that opens, check that the Hardware acceleration slider is at its maximum (Full), confirm your choice, and try to run your program again.
If the previous operation didn't resolve your problem, update the OpenGL and DirectX drivers of your graphics card to the latest available ones, and try to run your program again.

openGL screensaver causes problems

I have an application written in Qt 4 that uses an OpenGL window. It has been running happily for months on Windows XP, Service Pack 3.
Recently I was fiddling with my screensaver and happened to select the 3D Text choice. When I previewed it, the Qt 4 application seg-faulted immediately. Running under the debugger showed it crashing in ig4dev32.dll, which is an Intel graphics accelerator driver for OpenGL.
When I do a similar test on a machine with an NVIDIA card, I (not surprisingly) see no problems.
I'm not really sure whether I'm asking for help, or insight, or whatever: has anybody ever seen this? Google tells me others have seen it happen in games, but I find no references to developers hitting it themselves. Obviously I can avoid that screensaver, but I suspect the problem is bigger than that. Ideas?
I would start by reporting this to Intel. No doubt they will not resolve it by the end of the week, but eventually they might. In the meantime, I'd also report it to Qt Software to see if they can troubleshoot it as well.
In the meantime, you know the issue and how to avoid it (no OpenGL screensavers), so all you have to do is inform your customers. Ideally the application itself would warn them, but detecting whether a screensaver uses OpenGL does not seem feasible.
Perhaps you could run some additional tests. For instance, what happens if your application runs in parallel with, say, Google Earth in OpenGL mode?

Off screen rendering when laptop shuts screen down?

I have a lengthy number-crunching process that makes heavy use of OpenGL off-screen rendering. It all works well, but when I leave it running on its own while I go make a sandwich, I usually return to find that it crashed while I was away.
I was able to determine that the crash occurs very close to the moment the laptop I'm using decides to turn off the screen to conserve energy. The crash itself is deep inside the NVIDIA DLLs, so there is no hope of knowing exactly what's going on.
The obvious solution is to turn off the power-management feature that shuts down the screen and video card, but I'm looking for something more user-friendly.
Is there a way to do this programmatically?
I know there's a SETI@home implementation that takes advantage of GPU processing. How does it keep the video card from going to sleep?
I'm not sure what OS you're on, but Windows sends a message when it is about to enter a new power state. You can listen for that message and then either move processing to the CPU or deny the request to enter the lower-power state, as in the sketch below.
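
A minimal Win32 sketch of that approach, assuming a classic window procedure; on Windows XP, returning BROADCAST_QUERY_DENY to PBT_APMQUERYSUSPEND vetoes the suspend (Vista and later no longer send the query message):

#include <windows.h>

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_POWERBROADCAST)
    {
        switch (wParam)
        {
        case PBT_APMQUERYSUSPEND:   // system asks permission to suspend
            return BROADCAST_QUERY_DENY;
        case PBT_APMSUSPEND:        // suspend is happening: stop GPU work
            // pause off-screen rendering here
            break;
        case PBT_APMRESUMESUSPEND:  // back from suspend: safe to resume
            // revalidate GL resources and continue here
            break;
        }
        return TRUE;                // grant any request we don't deny
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}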
For the benefit of Linux users encountering a similar issue: you can obtain similar notifications and inhibit power-state changes using the DBUS API. An example script in Python, taken from the link, that inhibits power-state changes:
#!/usr/bin/python
import dbus
import time

# Connect to the session bus and look up the PowerManagement service.
bus = dbus.Bus(dbus.Bus.TYPE_SESSION)
devobj = bus.get_object('org.freedesktop.PowerManagement',
                        '/org/freedesktop/PowerManagement')
dev = dbus.Interface(devobj, 'org.freedesktop.PowerManagement.Inhibit')

# Inhibit power-state changes; the returned cookie identifies our request.
cookie = dev.Inhibit('Nautilus', 'Copying files from /media/SANVOL')
time.sleep(10)  # the work that must not be interrupted goes here

# Release the inhibition so the system may sleep again.
dev.UnInhibit(cookie)
According to MSDN, there is an API that lets an application tell Windows that it is still working and that Windows should not go to sleep or turn off the display.
The function is called SetThreadExecutionState (MSDN). It works for me using the flags ES_SYSTEM_REQUIRED and ES_CONTINUOUS; a sketch follows.
Note, however, that using this function does not stop the screen saver from running, which might interfere with your OpenGL app if the screen saver also uses OpenGL (or Direct3D).
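
A minimal sketch of that call; the two helper names are hypothetical, the flags are the documented ones:

#include <windows.h>

void BeginLongComputation()
{
    // ES_CONTINUOUS keeps the setting in force until it is cleared;
    // ES_SYSTEM_REQUIRED prevents sleep. Add ES_DISPLAY_REQUIRED if the
    // display must also stay on.
    SetThreadExecutionState(ES_CONTINUOUS | ES_SYSTEM_REQUIRED);
}

void EndLongComputation()
{
    // Passing ES_CONTINUOUS alone restores normal power behavior.
    SetThreadExecutionState(ES_CONTINUOUS);
}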