Quad-buffer OpenGL for 4.2 core-profile? - opengl

I'm developing a 3D stereoscopic OpenGL app specifically for Windows 7 and nVidia Quadro K5000 cards. Rendering the scene from left and right-eye perspectives using glDrawBuffer(GL_BACK_LEFT) and glDrawBuffer(GL_BACK_RIGHT) works fine, and the 3D effect is displayed nicely.
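For context, the per-eye loop is roughly the sketch below (a minimal sketch, assuming a stereo-capable pixel format; setEyeProjection, drawScene, eyeSeparation, and hdc are placeholders rather than names from the actual application):

    // Render one stereo frame into the quad-buffered default framebuffer.
    void renderStereoFrame()
    {
        glDrawBuffer(GL_BACK_LEFT);                         // left eye
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        setEyeProjection(-0.5f * eyeSeparation);            // placeholder: left-eye frustum
        drawScene();                                        // placeholder: scene geometry

        glDrawBuffer(GL_BACK_RIGHT);                        // right eye
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        setEyeProjection(+0.5f * eyeSeparation);            // placeholder: right-eye frustum
        drawScene();

        SwapBuffers(hdc);                                   // presents both back buffers at once
    }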
While this works, I'd like to use nVidia's Nsight Graphics local debugging. However, I get the error "Cannot enter frame debugging. Nsight only supports frame debugging for ... OpenGL 4.2. Reason: glDrawBuffer(bufs[i] = 0x00000402)".
If the calls to glDrawBuffer are removed, Nsight local debugging works.
Going through the OpenGL 4.2 spec, DrawBuffer is described in section 4.2.1.
So, two questions:
1) Is there some other way (besides DrawBuffer) to specify BACK_RIGHT or BACK_LEFT buffers for drawing to quad-buffers?
2) Is nSight capable of doing frame-level debugging on quad-buffered stereoscopic setups? If so, how?

Related

Installed OpenGL wrongly

I'm trying to run OpenGL 3 programs, but I'm not sure which implementation I'm using and probably set it up wrong (I'm a DirectX programmer). While trying to run these demos:
https://github.com/tomdalling/opengl-series/archive/master.zip.
I get this exception:
ERROR: WGL: OpenGL profile requested but WGL_ARB_create_context_profile is unavailable
This machine is Windows 7 with a 1023MB NVIDIA GeForce GT 520M (Dell) card. Has anyone else seen this error?
I think I know exactly what the issue is.
Most laptops have two graphics cards: A dedicated card (GeForce GT 520M), and an integrated card (Intel HD).
Your integrated card only supports up to OpenGL 3.1, while these demos are requesting OpenGL 3.2.
All you need to do is go into the Nvidia control panel, go into 'Manage 3D Settings', then set the preferred graphics processor to 'High-Performance Nvidia processor'.
After that, the demos should run correctly.
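If you'd rather have the application request the dedicated GPU itself instead of relying on the control panel setting, NVIDIA's Optimus documentation describes an exported global the driver looks for; a minimal sketch:

    // Exporting this symbol from the .exe asks the Optimus driver to run the
    // process on the high-performance NVIDIA GPU instead of the integrated one
    // (see NVIDIA's "Optimus Rendering Policies" application guide).
    extern "C" {
        __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    }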

Nvidia Nsight 4.0 cannot profile code in OpenGL 4.3

I am using Visual Studio 2013 with Nvidia Nsight 4.0. In my application I do a mix of different types of rendering, but for the purpose of testing the profiler I did a simple rendering of a scene. I opened the graphics debugger and, when I open the GUI and press the spacebar to capture the frame, I get this error:
Cannot enter frame debugger. Nsight only supports frame debugging for
D3D9, D3D10, D3D11, and OpenGL 4.2.
Reason: glEnd
I am using a GT 540M, and I checked my OpenGL version: it is 4.3.
If I then try to use the performance analysis tool and trace OpenGL (following the instructions), I always get some percentage of CPU frames and 0 GPU frames.
I have no idea what I am doing wrong. Is there any solution to this, or an alternative way to profile OpenGL?
Are you using immediate mode drawing, i.e. glBegin(...); glVertex*(...); glEnd()?
From the Nsight User Guide's Supported OpenGL Functions page:
NVIDIA® Nsight™ Visual Studio Edition 4.0 frame debugging supports the set of OpenGL operations, which are defined by the OpenGL 4.2 core profile. Note that it is not necessary to create a core profile context to make use of the frame debugger. An application that uses a compatibility profile context, but restricts itself to using the OpenGL 4.2 core subset, will also work. A few OpenGL 4.2 compatibility profile features, such as support for alpha testing and a default vertex array object, are also supported.
So, replace the immediate mode rendering with newer drawing functions like glDrawArrays and glDrawElements that use vertex array objects.
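For example, a glBegin/glEnd triangle could be replaced with something along these lines (a sketch only; the attribute location and names are illustrative, and a GL 3.x+ context with a loader such as GLEW or GLAD is assumed):

    // One-time setup: vertex array object + vertex buffer object
    const GLfloat verts[] = {   // x, y per vertex
        -0.5f, -0.5f,
         0.5f, -0.5f,
         0.0f,  0.5f,
    };
    GLuint vao, vbo;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
    glEnableVertexAttribArray(0);                    // matches layout(location = 0) in the vertex shader
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, (void*)0);

    // Per frame: no glBegin/glEnd, just bind and draw
    glUseProgram(program);                           // 'program' is your compiled shader program
    glBindVertexArray(vao);
    glDrawArrays(GL_TRIANGLES, 0, 3);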
Better yet, create a core profile context to ensure you aren't using deprecated functionality.
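If you happen to use GLFW, requesting such a context is just a few window hints (a sketch under that assumption; freeglut has the equivalent glutInitContextVersion/glutInitContextProfile calls):

    #include <GLFW/glfw3.h>

    // Create a window whose context is an OpenGL 4.2 core profile.
    GLFWwindow* createCoreProfileWindow()
    {
        if (!glfwInit())
            return nullptr;
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
        GLFWwindow* window = glfwCreateWindow(1280, 720, "core profile", nullptr, nullptr);
        if (window)
            glfwMakeContextCurrent(window);
        return window;
    }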
My advice: stay away from outdated tutorials online and read the latest edition of the Red book (OpenGL Programming Guide), which only covers modern OpenGL.
You can also try the more basic GPUView tool, which can be found in the Windows 8 SDK.
UPDATE:
As for why 0 GPU frames are retrieved: are you sure that your GPU is on the list of supported hardware? I had the same problem where Nsight was mostly working (I was able to profile other aspects) but 0 GPU frames were collected. I later realized that my card was not officially supported.
Nsight 4.5 RC1 is now available; it works with CUDA SDK 7 RC and, among its features, now supports OpenGL 4.3!

gl_PointCoord has incorrect/uninitialized value

I am having trouble getting valid values in my fragment shader's gl_PointCoord variable. I use libgdx, a cross-platform Java framework that allows the same application to run on the desktop as well as on Android. The shader works fine with OpenGL ES on Android; only the desktop seems not to provide a correctly interpolated value, and instead always gives zero.
Could this be an issue with libgdx or with the graphics driver?
NVidia Quadro 3000M
Driver 275.33
Win 7 64-bit (Service Pack 1)
libgdx-0.9.6
FYI: I haven't done much research yet, but it seems to be a bug in LWJGL or the driver that gl_PointCoord is only available when enabling point-sprite mode via
Gdx.gl20.glEnable(GL11.GL_POINT_SPRITE_OES);
This is not available in OpenGL 4.2 or OpenGL ES 2.0, but it seems to need to be set explicitly on the desktop.
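In raw desktop GL terms (outside libgdx), the workaround amounts to something like the following in a compatibility context (pointCount is a placeholder; core profiles always treat points as sprites, so there the enable is unnecessary and in fact invalid):

    // On desktop compatibility contexts gl_PointCoord is only generated for
    // point sprites, so enable point-sprite mode before drawing GL_POINTS.
    glEnable(GL_POINT_SPRITE);        // not needed (and an error) in core profile / ES 2.0
    glEnable(GL_PROGRAM_POINT_SIZE);  // let the vertex shader set gl_PointSize
    glDrawArrays(GL_POINTS, 0, pointCount);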

How to do stereoscopic 3D with OpenGL on GTX 560 and later?

I am using the open source haptics and 3D graphics library Chai3D running on Windows 7. I have rewritten the library to do stereoscopic 3D with Nvidia nvision. I am using OpenGL with GLUT, and using glutInitDisplayMode(GLUT_RGB | GLUT_DEPTH | GLUT_DOUBLE | GLUT_STEREO) to initialize the display mode. It works great on Quadro cards, but on GTX 560m and GTX 580 cards it says the pixel format is unsupported. I know the monitors are capable of displaying the 3D, and I know the cards are capable of rendering it. I have tried adjusting the resolution of the screen and everything else I can think of, but nothing seems to work. I have read in various places that stereoscopic 3D with OpenGL only works in fullscreen mode. So, the only possible reason for this error I can think of is that I am starting in windowed mode. How would I force the application to start in fullscreen mode with 3D enabled? Can anyone provide a code example of quad buffer stereoscopic 3D using OpenGL that works on the later GTX model cards?
What you are experiencing has no technical reason; it is simply NVidia product policy. Quadbuffer stereo is considered a professional feature, so NVidia offers it only on their Quadro cards, even though the GeForce GPUs could do it as well. This is not a recent development; it was already like this back in 1999. For example, I had (and still have) a GeForce2 Ultra back then. Technically it was the very same chip as the Quadro; the only difference was the PCI ID reported back to the system. One could trick the driver into thinking you had a Quadro by tinkering with the PCI IDs (either by patching the driver or by soldering an additional resistor onto the graphics card PCB).
The stereoscopic 3D driver hack for Direct3D was already supported by my GeForce2 back then. The driver duplicated the rendering commands, but applied a translation to the modelview matrix and a skew to the projection matrix. These days it's implemented with a shader and multi-rendertarget trick.
The NVision3D API does allow you to blit images for specific eyes (this is meant for movie players and image viewers). But it also allows you to emulate quadbuffer stereo: instead of GL_BACK_LEFT and GL_BACK_RIGHT buffers, create two Framebuffer Objects, which you bind and use as if they were quadbuffer stereo buffers. Then, after rendering, you blit the resulting images (as textures) to the NVision3D API.
With as little as 50 lines of management code you can build a program that seamlessly works on both NVision3D and quadbuffer stereo. What NVidia does is pointless; they should just stop it and properly support quadbuffer stereo pixel formats on consumer GPUs as well.
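A rough sketch of that FBO-based emulation (handle names and sizes are illustrative; the final hand-off to the NVision3D API, or a blit to GL_BACK_LEFT/GL_BACK_RIGHT on a Quadro, is left out):

    // Create one offscreen framebuffer per eye and render into it as if it
    // were GL_BACK_LEFT / GL_BACK_RIGHT.
    GLuint eyeFBO[2], eyeTex[2], eyeDepth[2];
    glGenFramebuffers(2, eyeFBO);
    glGenTextures(2, eyeTex);
    glGenRenderbuffers(2, eyeDepth);
    for (int eye = 0; eye < 2; ++eye) {
        glBindTexture(GL_TEXTURE_2D, eyeTex[eye]);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
        glBindRenderbuffer(GL_RENDERBUFFER, eyeDepth[eye]);
        glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);
        glBindFramebuffer(GL_FRAMEBUFFER, eyeFBO[eye]);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, eyeTex[eye], 0);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, eyeDepth[eye]);
    }

    // Per frame: render each eye into its FBO, then present.
    for (int eye = 0; eye < 2; ++eye) {
        glBindFramebuffer(GL_FRAMEBUFFER, eyeFBO[eye]);
        glViewport(0, 0, width, height);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        drawSceneForEye(eye);   // placeholder: per-eye projection + scene
    }
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    // Quadro path: blit/draw eyeTex[] to GL_BACK_LEFT and GL_BACK_RIGHT.
    // GeForce path: hand the two textures to the NVision3D API instead.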
Simple: you can't. Not the way you're trying to do it.
There is a difference between having a pre-existing program do things with stereoscopic glasses and doing what you're trying to do. What you are attempting to do is use the built-in stereo support of OpenGL: the ability to create a stereoscopic framebuffer, where you can render to the left and right framebuffers arbitrarily.
NVIDIA does not allow that with their non-Quadro cards. It has hacks in the driver that will force stereo on applications with nVision and the control panel. But NVIDIA's GeForce drivers do not allow you to create stereoscopic framebuffers.
And before you ask, no, I have no idea why NVIDIA doesn't let you control stereo.
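If you want to detect this situation at startup rather than just getting a failure from GLUT, you can probe whether the driver actually offers a stereo pixel format; a minimal WGL sketch (hdc is assumed to be your window's device context):

    #include <windows.h>

    // Ask for a double-buffered stereo format and check whether the driver
    // kept the PFD_STEREO flag in the format it actually picked.
    bool hasQuadBufferStereo(HDC hdc)
    {
        PIXELFORMATDESCRIPTOR pfd = {};
        pfd.nSize      = sizeof(pfd);
        pfd.nVersion   = 1;
        pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER | PFD_STEREO;
        pfd.iPixelType = PFD_TYPE_RGBA;
        pfd.cColorBits = 32;
        pfd.cDepthBits = 24;

        int format = ChoosePixelFormat(hdc, &pfd);
        if (format == 0)
            return false;
        DescribePixelFormat(hdc, format, sizeof(pfd), &pfd);
        return (pfd.dwFlags & PFD_STEREO) != 0;
    }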
Since I was looking into this issue for my own game, I found this link where somebody hacked the USB protocol: http://users.csc.calpoly.edu/~zwood/teaching/csc572/final11/rsomers/
I didn't follow it through, but at the time I was researching this it didn't look too hard to make use of that information. So you might have to implement your own code in order to support it in your app, which should be possible. Unfortunately, a generic solution would be harder, because then you would have to hack the driver or somehow hook into the OpenGL library and intercept the calls.

Can't figure out why this OpenGL program is not rendering

I compiled Joe Groff's "An intro to modern OpenGL: Hello World: The Slideshow".
I have compiled it using Mingw-w64 with freeglut, GLUT 3.7, and a version that creates my own context.
However, when I run the program, the image doesn't fade back and forth like it's supposed to, and I can't figure out why (I've spent a whole day on it).
Also, I have examined most of the inputs and outputs except for the shaders and can't find anything wrong. Does anyone have any ideas?
Most likely, your OpenGL version doesn't support shaders. Are you by any chance running in a virtual machine or via remote desktop? These tend to only support OpenGL 1.1 even if the graphics card/drivers are much more recent, and OpenGL 1.1 does not support shaders. It's also possible, if you're using an older laptop with an integrated Intel GPU, that shaders are not (properly) supported.
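One quick way to confirm what the program is actually getting is to print the context's version strings right after context creation; a minimal check (assumes a current OpenGL context):

    #include <cstdio>

    // A remote-desktop or VM session often reports version "1.1.0" with
    // renderer "GDI Generic", which has no shader support at all.
    void reportContextInfo()
    {
        printf("GL_VERSION  : %s\n", (const char*)glGetString(GL_VERSION));
        printf("GL_RENDERER : %s\n", (const char*)glGetString(GL_RENDERER));
        printf("GL_VENDOR   : %s\n", (const char*)glGetString(GL_VENDOR));
        printf("GLSL        : %s\n", (const char*)glGetString(GL_SHADING_LANGUAGE_VERSION));
    }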