CUDA Dynamic Parallelism debugging is not supported in preemption mode. Breakpoints will be disabled

Now, I realize there is already a "solution" to this problem, but that solution doesn't work for me.
My setup is very close to the one in this post: Can't debug CUDA: CUDA dynamic parallelism debugging is not supported in preemption mode. I'm also cognizant of this link: https://devtalk.nvidia.com/default/topic/536202/debugging-dynamic-parallelism-and-preemption-mode/
I'm on VS2012, Windows 7 64-bit, driver version 331.65, with two GTX Titans (Device 0 driving the display, Device 1 headless) and Nsight 3.2. I've followed the instructions in that post and turned off the forcing of SW preemption for Desktop & Headless GPUs. I've run deviceQuery and both my Titans show up. Additionally, my monitors are plugged into the top Titan on the mobo, which I'm pretty sure is Device 0, so I've specified cudaSetDevice(1); in my code. I've disabled Windows Aero and have no idea what else to do to prevent this from happening. I am toying with putting yet another GPU in my system, a GTX 580, to drive the display, but I don't feel that should be necessary. I've tried changing the cudaSetDevice argument to 0 (same error) and to 2 (can't find a CUDA device). Can anyone help me out here? I've got some beastly debugging to do.

Following the instructions in the link I mentioned, CUDA debugging eventually started working, and working well. I don't know exactly what changed since I posted this question, but follow the guidance in that earlier solution and it should work.
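For anyone else hitting this: rather than hard-coding cudaSetDevice(1), it may be more robust to pick the headless GPU programmatically. This is a minimal sketch (my own addition, not from the original posts), relying on the fact that under Windows a display-driving GPU has the kernel execution timeout (TDR watchdog) enabled:

#include <cuda_runtime.h>
#include <cstdio>

// Select the first CUDA device that is not driving a display.
// kernelExecTimeoutEnabled == 0 means no TDR watchdog, i.e. a headless GPU.
int pickHeadlessDevice()
{
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        if (!prop.kernelExecTimeoutEnabled) {
            printf("Using headless device %d: %s\n", dev, prop.name);
            cudaSetDevice(dev);
            return dev;
        }
    }
    return -1; // no headless device found
}

This way the right device is chosen even if the device ordering changes between driver versions or machines.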

Related

DirectX11 Desktop duplication not working with NVIDIA

I'm trying to use the DirectX desktop duplication API.
I tried running the examples from
http://www.codeproject.com/Tips/1116253/Desktop-Screen-Capture-on-Windows-via-Windows-Desk
and from
https://code.msdn.microsoft.com/windowsdesktop/Desktop-Duplication-Sample-da4c696a
Both are examples of screen capture using DXGI.
I have an NVIDIA GeForce GTX 1060 on a Windows 10 Pro machine with an Intel™ Core i7-6700HQ processor.
These examples work perfectly when the preferred graphics processor in NVIDIA Control Panel > 3D Settings is set to Auto-select.
However, if I set it manually to the NVIDIA graphics card, the samples stop working.
The error occurs at the following line:
//IDXGIOutput1* DxgiOutput1
hr = DxgiOutput1->DuplicateOutput(m_Device, &m_DeskDupl);
The HRESULT in hr is DXGI_ERROR_UNSUPPORTED (0x887A0004).
I'm new to DirectX and don't know the issue here. Is DirectX desktop duplication not supported on NVIDIA?
If that's the case, is there a way to select a particular processor at the start of the program, so the program can run with any settings?
Edit:
After looking around, I asked the developer (Evgeny Pereguda) of the second sample project on codeproject.com.
Here's a link to the discussion:
https://www.codeproject.com/Tips/1116253/Desktop-Screen-Capture-on-Windows-via-Windows-Desk?msg=5319978#xx5319978xx
I'm posting a screenshot of the discussion in case the original link goes down.
I also found an answer on Stack Overflow which unequivocally suggested that it could not be done with the desktop duplication API, referring to a support ticket on Microsoft's support site: https://support.microsoft.com/en-us/help/3019314/error-generated-when-desktop-duplication-api-capable-application-is-ru
Quote from the ticket:
This issue occurs because the DDA does not support being run against
the discrete GPU on a Microsoft Hybrid system. By design, the call
fails together with error code DXGI_ERROR_UNSUPPORTED in such a
scenario.
However, some applications duplicate the desktop efficiently on Windows in both modes (integrated and discrete graphics) on my machine (https://www.youtube.com/watch?v=bjE6qXd6Itw).
I have looked into the installation folder of Virtual Desktop on my machine and can see the following DLLs of interest:
SharpDX.D3DCompiler.dll
SharpDX.Direct2D1.dll
SharpDX.Direct3D10.dll
SharpDX.Direct3D11.dll
SharpDX.Direct3D9.dll
SharpDX.dll
SharpDX.DXGI.dll
SharpDX.Mathematics.dll
It's probably an indication that this application uses DXGI to duplicate the desktop, or maybe that it is capable of selecting a specific processor before it starts.
Anyway, the question remains: is there any other efficient method of duplicating the desktop in both modes?
The likely cause is a known internal limitation of the Desktop Duplication API, described in Error generated when Desktop Duplication API-capable application is run against discrete GPU:
... when the application tries to duplicate the desktop image against the discrete GPU on a Microsoft Hybrid system, the application may not run correctly, or it may generate one of the following errors:
Failed to create windows swapchain with 0x80070005
CDesktopCaptureDWM: IDXGIOutput1::DuplicateOutput failed: 0x887a0004
The article does not suggest any workaround other than using a different GPU (without more specific detail as to whether that is achievable programmatically):
To work around this issue, run the application on the integrated GPU instead of on the discrete GPU on a Microsoft Hybrid system.
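A minimal sketch of what "run against the integrated GPU" can mean in code (my addition; the function name is illustrative and error handling is trimmed): create the D3D11 device on whichever adapter actually owns a desktop output, which on a hybrid system is the integrated GPU, and call DuplicateOutput on that device:

#include <dxgi1_2.h>
#include <d3d11.h>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d11.lib")

HRESULT DuplicatePrimaryOutput(ID3D11Device** outDevice, IDXGIOutputDuplication** outDupl)
{
    IDXGIFactory1* factory = nullptr;
    HRESULT hr = CreateDXGIFactory1(IID_PPV_ARGS(&factory));
    if (FAILED(hr)) return hr;

    // Find the adapter that drives a monitor (has at least one output).
    IDXGIAdapter1* adapter = nullptr;
    IDXGIOutput* output = nullptr;
    for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
        if (adapter->EnumOutputs(0, &output) == S_OK)
            break;
        adapter->Release(); adapter = nullptr;
    }
    factory->Release();
    if (!adapter || !output) return DXGI_ERROR_NOT_FOUND;

    // Create the device on *that* adapter, not on a default/discrete one.
    hr = D3D11CreateDevice(adapter, D3D_DRIVER_TYPE_UNKNOWN, nullptr, 0,
                           nullptr, 0, D3D11_SDK_VERSION, outDevice, nullptr, nullptr);
    if (SUCCEEDED(hr)) {
        IDXGIOutput1* output1 = nullptr;
        hr = output->QueryInterface(IID_PPV_ARGS(&output1));
        if (SUCCEEDED(hr)) {
            hr = output1->DuplicateOutput(*outDevice, outDupl);
            output1->Release();
        }
    }
    output->Release();
    adapter->Release();
    return hr;
}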
Microsoft introduced a registry value that can be set programmatically to control which GPU an application runs on. Full answer here.
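A sketch of how setting that value could look (my addition, assuming the documented format: a per-executable string value under HKCU\Software\Microsoft\DirectX\UserGpuPreferences, where GpuPreference=1 means power-saving/integrated and GpuPreference=2 means high-performance/discrete; requires Windows 10 1803 or later):

#include <windows.h>
#pragma comment(lib, "advapi32.lib")

// Ask Windows to run the given executable on the power-saving (integrated) GPU,
// which is the one the Desktop Duplication API supports on hybrid systems.
bool PreferIntegratedGpu(const wchar_t* exePath)
{
    const wchar_t* data = L"GpuPreference=1;"; // 2 would request the discrete GPU
    LSTATUS st = RegSetKeyValueW(HKEY_CURRENT_USER,
                                 L"Software\\Microsoft\\DirectX\\UserGpuPreferences",
                                 exePath, REG_SZ, data,
                                 (DWORD)((wcslen(data) + 1) * sizeof(wchar_t)));
    return st == ERROR_SUCCESS;
}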

OpenGL glslDevil fails debugging

I started using glslDevil to debug my OpenGL engine's shaders. When I start the program from inside glslDevil, all I get in the GLTrace window is calls to:
wglGetPixelFormatAttribivARB()
printed something like 1002 times, after which the debugged app freezes, and that's it. No debug info or anything else. Maybe glslDevil doesn't support newer OpenGL versions?
I am using OpenGL 4.2 in compatibility mode (but with a fully programmable pipeline), running Win7 64-bit. The tested software is 32-bit.
wglGetPixelFormatAttribivARB is a function of the Windows OpenGL subsystem that retrieves the attributes of a particular pixel format configuration. There can be a lot of different configurations available (several hundred to several thousand), and those 1002 calls are most likely caused by a loop that enumerates all of them to find the best match.
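For illustration, here is the kind of enumeration loop that produces exactly that call pattern, one query per available format (my sketch, not glslDevil's actual code; it assumes a valid HDC with a current OpenGL context so the extension can be loaded):

#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>  // WGL_* tokens, PFNWGLGETPIXELFORMATATTRIBIVARBPROC

int CountAcceleratedFormats(HDC hdc)
{
    // wglGetProcAddress only works with a current OpenGL context.
    PFNWGLGETPIXELFORMATATTRIBIVARBPROC getAttribs =
        (PFNWGLGETPIXELFORMATATTRIBIVARBPROC)wglGetProcAddress("wglGetPixelFormatAttribivARB");
    if (!getAttribs) return -1;

    int query = WGL_NUMBER_PIXEL_FORMATS_ARB;
    int total = 0;
    getAttribs(hdc, 0, 0, 1, &query, &total);  // how many formats exist?

    int matches = 0;
    for (int fmt = 1; fmt <= total; ++fmt) {   // one call per format -> ~1000 calls
        int attribs[2] = { WGL_ACCELERATION_ARB, WGL_DRAW_TO_WINDOW_ARB };
        int values[2]  = { 0, 0 };
        getAttribs(hdc, fmt, 0, 2, attribs, values);
        if (values[0] == WGL_FULL_ACCELERATION_ARB && values[1])
            ++matches;
    }
    return matches;
}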
Other than giving you that information I can't help you much, though.

OpenGL game runs fine in Win7, drops to 5fps in Windows 8?

I have just recently installed Windows 8, and I tried to compile and build a simple C++ game project in VS 2010, but when I did, it ran at 5 fps. On Windows 7 it runs at a solid 60 fps. Nothing has changed in the code, there is just horrible slowdown.
I have updated my video drivers, but there is still horrible lag. I thought the problem had to do with compatibility issues between Windows 8 and OpenGL, but I can't find anything to confirm this. I was wondering if anyone else has had this problem, and whether you have solved it.
I would recommend you test your graphics card / drivers first. All sorts of driver issues can arise when you upgrade operating systems. One of the best tests would be to download Cinebench and see how it performs; Cinebench will evaluate your OpenGL performance. If you get poor results, you know it's a hardware / driver issue and not an issue with your application.
If the Cinebench results are good, then you can move on to the recommendations made by @Robert Rouhani (see the comments).
http://www.maxon.net/products/cinebench/overview.html
What sort of video card do you have in the Win8 machine?
If it's a laptop you might be battling against nVidia Optimus (or an equivalent technology). Basically, programs have to tell the OS in advance that they want to use the discrete video card, or they get defaulted to the low-power GPU embedded in the CPU (note: an over-simplification).
If this is the case, there are some options in the nVidia control panel that let you create a profile telling the OS to run your app on the discrete GPU rather than the embedded one.
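There is also a programmatic variant of that profile (my addition, based on NVIDIA's published "Optimus Rendering Policies" document and AMD's equivalent mechanism): export these globals from your executable and the driver should pick the discrete GPU without a manually created profile:

// Exported symbols the NVIDIA Optimus and AMD switchable-graphics drivers
// look for when the process starts; nonzero requests the discrete GPU.
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}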

OpenGL GLUT on VirtualBox Ubuntu 11.10 segmentation fault

DISCLAIMER:
I see that some suggestions for the exact same question come up; however, that (similar) post was migrated to Super User and seems to have been removed. I would still like to post my question here because I consider it software/programming-related enough not to belong on Super User (the line between a software and a hardware issue is sometimes vague).
I am running a very simple OpenGL program in Code::Blocks in VirtualBox, with Ubuntu 11.10 installed on an SSD. Whenever I build and run the program I get these errors:
OpenGL Warning: XGetVisualInfo returned 0 visuals for 0x232dbe0
OpenGL Warning: Retry with 0x802 returned 0 visuals
Segmentation fault
From what I have gathered so far, this is VirtualBox related. I need to set
LIBGL_ALWAYS_INDIRECT=1
in other words, enable indirect rendering via X.org rather than communicating directly with the hardware. The issue is probably not caused by my ATI card as such, since I have a laptop with an ATI card that runs the same program flawlessly.
Still, I don't dare say that my GPU being an ATI plays no role at all. Nor am I sure the drivers are correctly installed (under System Info -> Graphics it says graphics driver: Chromium).
Any help on HOW to set LIBGL_ALWAYS_INDIRECT=1 would be greatly appreciated. I simply lack the knowledge of where to put this command, or where/how to execute it in the terminal.
Sources:
https://forums.virtualbox.org/viewtopic.php?f=3&t=30964
https://www.virtualbox.org/ticket/6848
EDIT: in the terminal, type:
export LIBGL_ALWAYS_INDIRECT=1
(no spaces around the =, otherwise the shell rejects it). To verify that direct rendering is off:
glxinfo | grep direct
which should report direct rendering: No.
However, the problem persists. I still get mentioned OpenGL warnings and the segmentation fault.
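One thing worth checking (my addition): export only affects programs launched from that same shell, so an app started from the Code::Blocks GUI may never see the variable. It can also be set from inside the program, before the first GLX/GLUT call; a minimal sketch, assuming freeglut on Linux:

#include <cstdlib>    // setenv
#include <GL/glut.h>

int main(int argc, char** argv)
{
    // Must happen before glutInit, which is where GLX first talks to the X server.
    setenv("LIBGL_ALWAYS_INDIRECT", "1", 1);
    glutInit(&argc, argv);
    // ... rest of the usual GLUT setup ...
    return 0;
}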
I ran into this same problem running the Bullet Physics OpenGL demos on Ubuntu 12.04 inside VirtualBox. Rather than using indirect rendering, I was able to solve the problem by modifying the glut window creation code in my source as described here: https://groups.google.com/forum/?fromgroups=#!topic/comp.graphics.api.opengl/Oecgo2Fc9Zc.
This entailed replacing the original
...
glutCreateWindow(title);
...
with
...
// glutGet(GLUT_DISPLAY_MODE_POSSIBLE) queries whether the requested display
// mode can be realized; if not, bail out instead of letting glutCreateWindow
// crash in XGetVisualInfo.
if (!glutGet(GLUT_DISPLAY_MODE_POSSIBLE))
{
    exit(1);
}
glutCreateWindow(title);
...
as described in the link. It's not clear to me why this should correct the segfault issue; apparently glutGet has some side effects beyond retrieving state values. It could be a quirk of freeglut's implementation of glut.
If you look at the /etc/environment file, you can see a couple of variables exposed there; this will give you an idea of how to expose an environment variable across the entire system. You could also put it in either ~/.profile or ~/.bash_profile, depending on your needs.
The real question in my mind is: did you install the VirtualBox Guest Additions for Ubuntu? You shouldn't need to install any ATI drivers in your guest, as VirtualBox won't expose the actual physical graphics hardware to your VM. You can enable 3D acceleration in the virtual machine settings (make sure you shut down the VM first) under the Display section. You will probably also want to boost the allocated video memory; 64 MB or 128 MB should be plenty, depending on your needs.

OpenGL screensaver causes problems

I have an application written in Qt4 that uses an OpenGL window. It has been running happily for months on Windows XP, Service Pack 3.
Recently I was fiddling with my screensaver and happened to select the 3D Text choice. When I previewed it, the Qt4 application seg-faulted immediately. When I ran it in the debugger, it crashed in ig4dev32.dll, which is an Intel graphics accelerator driver for OpenGL.
When I do a similar test on a machine with an NVIDIA card, I (not surprisingly) get no problems.
I'm not really sure whether I'm asking for help, or insight, or whatever--has anybody ever seen it? Google tells me others have seen it happen in gaming applications, but I see no references to developers having it happen to them. Obviously, I can not use that screensaver, but I suspect the problem is "bigger" than that. Ideas?
I would start by reporting this to Intel. No doubt they will not resolve it by the end of the week, but eventually they might. In the meantime, I'd also report it to Qt Software to see if they can troubleshoot it as well.
Also in the meantime, you know the issue and how to resolve it (no OpenGL screensavers), so all you have to do is inform your customers. It would be best if the application itself could inform them, but detecting whether a screensaver uses OpenGL does not seem feasible.
Perhaps you could do some additional tests. For instance, what happens if your application runs in parallel with, say, Google Earth in OpenGL mode?