Binding a stencil render buffer to a frame buffer in OpenGL

Has anyone done this successfully? It seems that whatever index format I use in the stencil render buffer, glCheckFramebufferStatus(...) returns GL_FRAMEBUFFER_UNSUPPORTED.
I've successfully bound both depth and color render buffers, but whenever I try to do the same thing with my stencil buffer I get (as I said) GL_FRAMEBUFFER_UNSUPPORTED.
Here are snippets of my code:
// Create frame buffer
GLuint fb;
glGenFramebuffers(1, &fb);
// Create stencil render buffer (note that I create the depth buffer the exact same way, and it works)
GLuint sb;
glGenRenderbuffers(1, &sb);
glBindRenderbuffer(GL_RENDERBUFFER, sb);
glRenderbufferStorage(GL_RENDERBUFFER, GL_STENCIL_INDEX8, w, h);
// Bind the frame buffer before attaching anything to it
glBindFramebuffer(GL_FRAMEBUFFER, fb);
// Attach color
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, cb, 0);
// Attach stencil buffer
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_STENCIL_ATTACHMENT, GL_RENDERBUFFER, sb);
// And here I get a GL_FRAMEBUFFER_UNSUPPORTED when doing glCheckFramebufferStatus()
Any ideas?
Note: The color attachment is a texture, not a renderbuffer.

Never use a free-standing stencil buffer. If you need stencil, always use a depth+stencil image format. Note that the stencil index formats are not required image formats.
Even though you're not using a depth buffer here, you should still use GL_DEPTH24_STENCIL8 and attach it to GL_DEPTH_STENCIL_ATTACHMENT.
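A minimal sketch of that approach, using the w, h, and fb names from the question (dsb is an illustrative name):
GLuint dsb;
glGenRenderbuffers(1, &dsb);
glBindRenderbuffer(GL_RENDERBUFFER, dsb);
// Allocate a combined depth+stencil renderbuffer instead of a stencil-only one
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH24_STENCIL8, w, h);
// Attach it to both the depth and stencil slots in a single call
glBindFramebuffer(GL_FRAMEBUFFER, fb);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT, GL_RENDERBUFFER, dsb);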

You can use stencil-only attachments on recent NVIDIA hardware/drivers:
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_STENCIL_ATTACHMENT_EXT, GL_TEXTURE_2D, fboStencilTexture, 0);
There is still no support for separate depth and stencil attachments, though.

Related

GL_DEPTH_TEST does not work in OpenGL 3.3

I am writing a simple rendering program using OpenGL 3.3.
I have the following lines in my code (which should enable depth testing and culling):
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LESS);
ExitOnGLError("ERROR: Could not set OpenGL depth testing options");
glEnable(GL_CULL_FACE);
glCullFace(GL_BACK);
glFrontFace(GL_CCW);
ExitOnGLError("ERROR: Could not set OpenGL culling options");
However, after rendering, the depth test does not seem to work (the result screenshot is not reproduced here). What am I doing wrong? Where should I look for the problem?
Some information that may be useful:
In the projection matrix I have the near clipping plane set to 0.2 and the far plane to 3.2 (so the near plane is not zero).
I render a mesh and texture it using a simple method with glDrawArrays and two buffers for vertex and texture coordinates. Shaders are then used to display these arrays properly.
I do not calculate or draw normals.
Context creation code: http://pastebin.com/mRMUxPL1
UPDATE:
Finally got it working! As it turns out, I was not creating the buffer for depth rendering.
When I replaced this code (buffer initialization):
glGenFramebuffers(1, &mainFrameBufferId);
glGenRenderbuffers(1, &renderBufferId);
glBindRenderbuffer(GL_RENDERBUFFER, renderBufferId);
glRenderbufferStorage(GL_RENDERBUFFER,
                      GL_RGBA8,
                      camera.imageSize.width,
                      camera.imageSize.height);
glBindFramebuffer(GL_FRAMEBUFFER, mainFrameBufferId);
glFramebufferRenderbuffer(GL_FRAMEBUFFER,
                          GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER,
                          renderBufferId);
CV_Assert(glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE);
with this one:
glGenFramebuffers(1, &mainFrameBufferId);
glGenRenderbuffers(1, &renderBufferId);
glBindRenderbuffer(GL_RENDERBUFFER, renderBufferId);
glRenderbufferStorage(GL_RENDERBUFFER,
                      GL_RGBA8,
                      camera.imageSize.width,
                      camera.imageSize.height);
glBindFramebuffer(GL_FRAMEBUFFER, mainFrameBufferId);
glFramebufferRenderbuffer(GL_FRAMEBUFFER,
                          GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER,
                          renderBufferId);
CV_Assert(glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE);
glGenRenderbuffers(1, &depthRenderBufferId);
glBindRenderbuffer(GL_RENDERBUFFER, depthRenderBufferId);
glRenderbufferStorage(GL_RENDERBUFFER,
                      GL_DEPTH24_STENCIL8,
                      camera.imageSize.width,
                      camera.imageSize.height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRenderBufferId);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_STENCIL_ATTACHMENT, GL_RENDERBUFFER, depthRenderBufferId);
everything started to work fine!
In the context creation code:
static int visualAttribs[] = { None };  // <-- no depth bits requested
int numberOfFramebufferConfigurations = 0;
GLXFBConfig* fbConfigs = glXChooseFBConfig( display, DefaultScreen(display), visualAttribs, &numberOfFramebufferConfigurations );
CV_Assert(fbConfigs != 0);
From the glXChooseFBConfig() man page:
GLX_DEPTH_SIZE: Must be followed by a nonnegative minimum size specification. If this value is zero, frame buffer configurations with no depth buffer are preferred. Otherwise, the largest available depth buffer of at least the minimum size is preferred. The default value is 0.
Try setting visualAttribs[] to something like { GLX_DEPTH_SIZE, 16, None }
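For instance, a sketch of the suggested attribute list (16 is just an example minimum depth size):
static int visualAttribs[] = {
    GLX_DEPTH_SIZE, 16,  // prefer the largest available depth buffer of at least 16 bits
    None
};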

What's wrong with my framebuffer?

I'm creating a framebuffer object to be my gbuffer for deferred shading. I mainly learned from http://ogldev.atspace.co.uk/, and modified it to be a little more... sane. Here's the source code where I create the framebuffer:
/* Create the FBO */
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
/* Create the gbuffer textures */
glGenTextures(GBUFFER_NUM_TEXTURES, tex);
/* Create the color buffer */
glBindTexture(GL_TEXTURE_2D, tex[GBUFFER_COLOR]);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tex[GBUFFER_COLOR], 0);
/* Create the normal buffer */
glBindTexture(GL_TEXTURE_2D, tex[GBUFFER_NORMAL]);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RG16F, width, height, 0, GL_RG, GL_FLOAT, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1, GL_TEXTURE_2D, tex[GBUFFER_NORMAL], 0);
/* Create the depth-stencil buffer */
glBindTexture(GL_TEXTURE_2D, tex[GBUFFER_DEPTH_STENCIL]);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH24_STENCIL8, width, height, 0, GL_DEPTH_STENCIL, GL_FLOAT, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT, GL_TEXTURE_2D, tex[GBUFFER_DEPTH_STENCIL], 0);
GLenum drawBuffers[] = {GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1};
glDrawBuffers(2, drawBuffers);
glReadBuffer(GL_NONE);
/* Check for errors */
GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE)
{
error("In GBuffer::init():\n");
errormore("Failed to create Framebuffer, status: 0x%x\n", status);
fbo = 0;
return;
}
// restore default FBO
glBindFramebuffer(GL_FRAMEBUFFER, 0);
When I run this, however, status returns GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT. If it's not clear, I'm trying to create 3 gbuffers:
a 32-bit RGBA color buffer (I'd use 24-bit but I'm scared of alignment penalties),
a 32-bit RG normal buffer (each component using a 16-bit float, but I might get away with a signed short?)
a 24-bit Depth buffer packed with an 8-bit Stencil buffer
(total of 96 bits, or 12 bytes)
Possible problem areas that I can see might be using GL_FLOAT for the normal buffer, and GL_FLOAT for the depth-stencil buffer. I'd imagine GL_HALF_FLOAT would be more appropriate for normals, but that's not on the list of types that I can use with glTexImage2D. Similarly, I have no idea what type is most appropriate to use for a depth-stencil buffer.
What am I doing wrong?
Your use of GL_FLOAT is mostly irrelevant, since no pixel transfer actually happens: you can supply anything you want there as long as it is a meaningful data type. But even though no pixel transfer happens when you pass NULL for data, GL still validates the pixel transfer data type against the set of valid types and will raise an error if you do something wrong. To that end, if it raises an error the texture will be incomplete and thus cannot be used as an FBO attachment.
Here is where the problem lies: GL_FLOAT is not a meaningful data type for pixel transfer into a packed GL_DEPTH_STENCIL image format. It expects a packed data type such as GL_UNSIGNED_INT_24_8, or something really exotic like GL_FLOAT_32_UNSIGNED_INT_24_8_REV (a 64-bit packed floating-point depth + stencil format).
In any event, there are actually two components that need to be packed into your data type. GL_FLOAT can only describe one of the two components, because floating-point stencil buffers are meaningless.
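In other words, the depth-stencil allocation from the question validates once you give it a packed type (same variables as in the question):
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH24_STENCIL8, width, height, 0,
             GL_DEPTH_STENCIL, GL_UNSIGNED_INT_24_8, NULL);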
By the way, this whole confusing mess about the pixel transfer data type can be completely avoided if you use something like glTexStorage2D (...) to only allocate storage for the texture. glTexImage2D (...) does double duty, allocating storage for a texture LOD and providing a mechanism to initialize it with data. You really do not care about the latter part if you are drawing into the texture with an FBO, since that is the only place it gets any data from.
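A sketch of the gbuffer allocations using immutable storage (assumes GL 4.2 or ARB_texture_storage; names as in the question):
glBindTexture(GL_TEXTURE_2D, tex[GBUFFER_COLOR]);
glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA8, width, height);  // 1 mip level, no pixel transfer type needed
glBindTexture(GL_TEXTURE_2D, tex[GBUFFER_NORMAL]);
glTexStorage2D(GL_TEXTURE_2D, 1, GL_RG16F, width, height);
glBindTexture(GL_TEXTURE_2D, tex[GBUFFER_DEPTH_STENCIL]);
glTexStorage2D(GL_TEXTURE_2D, 1, GL_DEPTH24_STENCIL8, width, height);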

glReadPixels() sets GL_INVALID_OPERATION error

I'm trying to implement color picking with FBOs. I have a multisampled FBO (fbo[0]) which I use to render the scene, and a non-multisampled FBO (fbo[1]) which I use for color picking.
The problem: when I try to read pixel data from fbo[1], everything goes well until the glReadPixels call, which sets the GL_INVALID_OPERATION flag. I've checked the manual and can't find the reason why.
The code to create FBO:
glBindRenderbuffer(GL_RENDERBUFFER, rbo[0]);
glRenderbufferStorageMultisample(GL_RENDERBUFFER, numSamples, GL_RGBA8, resolution[0], resolution[1]);
glBindRenderbuffer(GL_RENDERBUFFER, rbo[1]);
glRenderbufferStorageMultisample(GL_RENDERBUFFER, numSamples, GL_DEPTH24_STENCIL8, resolution[0], resolution[1]);
glBindRenderbuffer(GL_RENDERBUFFER, rbo[2]);
glRenderbufferStorage(GL_RENDERBUFFER, GL_R32UI, resolution[0], resolution[1]);
glBindRenderbuffer(GL_RENDERBUFFER, rbo[3]);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH24_STENCIL8, resolution[0], resolution[1]);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, fbo[1]);
glFramebufferRenderbuffer(GL_DRAW_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT, GL_RENDERBUFFER, rbo[3]);
glFramebufferRenderbuffer(GL_DRAW_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, rbo[2]);
OGLChecker::checkFBO(GL_DRAW_FRAMEBUFFER);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, fbo[0]);
glFramebufferRenderbuffer(GL_DRAW_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT, GL_RENDERBUFFER, rbo[1]);
glFramebufferRenderbuffer(GL_DRAW_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, rbo[0]);
OGLChecker::checkFBO(GL_DRAW_FRAMEBUFFER);
My checker stays silent, so the FBOs are complete. Next, the picking code:
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, fbo[1]);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
// bla, bla, bla
// do the rendering
unsigned int result;
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo[1]);
int sb;
glReadBuffer(GL_COLOR_ATTACHMENT0);
glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
glGetIntegerv(GL_SAMPLE_BUFFERS, &sb);
// glBindFramebuffer(GL_READ_FRAMEBUFFER, 0);
OGLChecker::getGlError();
std::cerr << "Sample buffers " << sb << std::endl;
glReadPixels(pos.x(), resolution.y() - pos.y(), 1, 1, GL_RED, GL_UNSIGNED_INT, &result);
OGLChecker::getGlError();
return result;
the output:
Sample buffers 0
OpenGL Error : Invalid Operation
The interesting fact is that if I uncomment glBindFramebuffer(GL_READ_FRAMEBUFFER, 0); then no error happens and pixels are read from the screen (but that's not what I need).
What may be wrong here?
Your problem is the format parameter. For an attachment that has a one-channel integer internal format, the correct parameter isn't GL_RED, but GL_RED_INTEGER:
glReadPixels(pos.x(), resolution.y() - pos.y(), 1, 1, GL_RED_INTEGER, GL_UNSIGNED_INT, &result);
Look at the OpenGL documentation wiki (emphasis mine):
...
format
Specifies the format of the pixel data. For transfers of depth, stencil, or depth/stencil data, you must use GL_DEPTH_COMPONENT, GL_STENCIL_INDEX, or GL_DEPTH_STENCIL, where appropriate. For transfers of normalized integer or floating-point color image data, you must use one of the following: GL_RED, GL_GREEN, GL_BLUE, GL_RG, GL_RGB, GL_BGR, GL_RGBA, and GL_BGRA. For transfers of non-normalized integer data, you must use one of the following: GL_RED_INTEGER, GL_GREEN_INTEGER, GL_BLUE_INTEGER, GL_RG_INTEGER, GL_RGB_INTEGER, GL_BGR_INTEGER, GL_RGBA_INTEGER, and GL_BGRA_INTEGER. Even if no actual pixel transfer is made (data is NULL and no buffer is bound to GL_PIXEL_UNPACK_BUFFER), you must set this parameter correctly for the internal format of the destination image.
...
Note: the official reference page is incomplete/wrong.
Given that it's "fixed" if you uncomment that line of code, I wonder if your driver is lying to you about GL_SAMPLE_BUFFERS being 0. From http://www.opengl.org/sdk/docs/man/xhtml/glReadPixels.xml:
GL_INVALID_OPERATION is generated if GL_READ_FRAMEBUFFER_BINDING is non-zero, the read framebuffer is complete, and the value of GL_SAMPLE_BUFFERS for the read framebuffer is greater than zero.
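For reference, if the read framebuffer really were multisampled, the usual workaround is to resolve it into a single-sampled FBO with a blit before reading (a sketch; fboMsaa, fboResolve, w, and h are illustrative names):
glBindFramebuffer(GL_READ_FRAMEBUFFER, fboMsaa);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, fboResolve);
glBlitFramebuffer(0, 0, w, h, 0, 0, w, h, GL_COLOR_BUFFER_BIT, GL_NEAREST);  // resolves the samples
glBindFramebuffer(GL_READ_FRAMEBUFFER, fboResolve);
// now glReadPixels can read from the single-sampled attachment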
If you're using NVIDIA's binary driver on Linux and have switched to a non-graphical virtual console (e.g. CTRL+ALT+F1) then any attempt to glReadPixels() will return GL_INVALID_OPERATION (0x502).
Solution: Switch back to the graphical console (usually CTRL+ALT+F7).

Problems outputting gl_PrimitiveID to custom frame buffer object (FBO)

I have a very basic fragment shader, from which I want to output 'gl_PrimitiveID' to a framebuffer object (FBO) which I have defined. Below is my fragment shader:
#version 150
uniform vec4 colorConst;
out vec4 fragColor;
out uvec4 triID;
void main(void)
{
fragColor = colorConst;
triID.r = uint(gl_PrimitiveID);
}
I setup my FBO like this:
GLuint renderbufId0;
GLuint renderbufId1;
GLuint depthbufId;
GLuint framebufId;
// generate render and frame buffer objects
glGenRenderbuffers( 1, &renderbufId0 );
glGenRenderbuffers( 1, &renderbufId1 );
glGenRenderbuffers( 1, &depthbufId );
glGenFramebuffers ( 1, &framebufId );
// setup first renderbuffer (fragColor)
glBindRenderbuffer(GL_RENDERBUFFER, renderbufId0);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA, gViewWidth, gViewHeight);
// setup second renderbuffer (triID)
glBindRenderbuffer(GL_RENDERBUFFER, renderbufId1);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGB32UI, gViewWidth, gViewHeight);
// setup depth buffer
glBindRenderbuffer(GL_RENDERBUFFER, depthbufId);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT32, gViewWidth, gViewHeight);
// setup framebuffer
glBindFramebuffer(GL_FRAMEBUFFER, framebufId);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, renderbufId0);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1, GL_RENDERBUFFER, renderbufId1);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthbufId );
// check if everything went well
GLenum stat = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if(stat != GL_FRAMEBUFFER_COMPLETE) { exit(0); }
// setup color attachments
const GLenum att[] = {GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1};
glDrawBuffers(2, att);
// render mesh
RenderMyMesh();
// copy second color attachment (triID) to local buffer
glReadBuffer(GL_COLOR_ATTACHMENT1);
glReadPixels(0, 0, gViewWidth, gViewHeight, GL_RED, GL_UNSIGNED_INT, data);
For some reason glReadPixels gives me a 'GL_INVALID_OPERATION' error. However, if I change the internal format of renderbufId1 from 'GL_RGB32UI' to 'GL_RGB' and use 'GL_FLOAT' in glReadPixels instead of 'GL_UNSIGNED_INT', then everything works fine. Does anyone know why I am getting the 'GL_INVALID_OPERATION' error and how I can solve it?
Is there an alternative way of outputting 'gl_PrimitiveID'?
PS: The reason I want to output 'gl_PrimitiveID' like this is explained here: Picking triangles in OpenGL core profile when using glDrawElements
glReadPixels(0, 0, gViewWidth, gViewHeight, GL_RED, GL_UNSIGNED_INT, data);
As stated on the OpenGL Wiki, you need to use GL_RED_INTEGER when transferring true integer data. Otherwise, OpenGL will try to use floating-point conversion on it.
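That is, the read from the question becomes:
glReadPixels(0, 0, gViewWidth, gViewHeight, GL_RED_INTEGER, GL_UNSIGNED_INT, data);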
BTW, make sure you're using glBindFragDataLocation to set up which buffers those fragment shader outputs go to. Alternatively, you can set it up explicitly in the shader if you're using GLSL 3.30 or above.
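A sketch of both options (the output names come from the shader above; 'program' stands for your linked program object):
// Option 1: call before linking the program
glBindFragDataLocation(program, 0, "fragColor");  // fragColor -> GL_COLOR_ATTACHMENT0
glBindFragDataLocation(program, 1, "triID");      // triID     -> GL_COLOR_ATTACHMENT1
// Option 2: explicit locations in the shader, GLSL 3.30+
// layout(location = 0) out vec4 fragColor;
// layout(location = 1) out uvec4 triID;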

Frame Buffer Object (FBO) and Render & Depth Buffers Relation

I saw many examples on the web (for example) which do the following:
Create and Bind the FBO
Create and Bind the BUFFERS (texture, render, depth, stencil)
Then, UnBind the BUFFERS
To work with the FBO: Bind the FBO, do the work, then UnBind the FBO
However, the texture BUFFER is also re-Bound for reads, writes, etc.
BUT I have NEVER EVER seen a re-Bind of the other BUFFERS (render, depth, stencil). Why?
Example of BUFFER creation and bind/unbind (the code below is just an example to show what I explained, and it works perfectly):
// create a framebuffer object, you need to delete them when program exits.
glGenFramebuffersEXT(1, &fboId);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fboId);
// create color buffer object and attached to fbo
glGenRenderbuffersEXT(1, &rboId);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, rboId);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGB, TEXTURE_WIDTH, TEXTURE_HEIGHT);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, 0); //UnBind
if(useDepthBuffer) {
glGenRenderbuffersEXT(1, &rboIdDepth);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, rboIdDepth);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT, TEXTURE_WIDTH, TEXTURE_HEIGHT);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, 0); //UnBind
}
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_RENDERBUFFER_EXT, rboId);
if(useDepthBuffer)
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, rboIdDepth);
// check FBO status
printFramebufferInfo();
bool status = checkFramebufferStatus();
if(!status)
fboUsed = false;
// then,
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fboId);
// Do the work
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
1. Why don't we need to bind all the BUFFERS again (I mean while working with / drawing objects to the FBO)?
2. What is going on under the hood here?
EDIT: attach -> Bind and detach -> UnBind
I don't know if I completely understood you, but the renderbuffers bound to the attachment points (GL_COLOR_ATTACHMENT...) are per-FBO state, and this FBO state remains unchanged. You only need to bind the FBO to tell OpenGL that this FBO is now in use, and all its state (that you set earlier) will take effect.
I think by
Then, detach BUFFERS
You refer to this
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, 0); //UnBind
However, this is not a detach, this is an unbind, which means something different. The renderbuffer is still attached to the FBO. The binding merely selects the buffer object on which the following buffer object operations are performed. It's kind of like the with statement found in some languages.
The actual attaching of a buffer object (which doesn't have to be bound at the time) happens here:
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT,
GL_COLOR_ATTACHMENT0_EXT,
GL_RENDERBUFFER_EXT,
rboId ); // rboID != 0
It would be detached by a matching call with renderbuffer 0:
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT,
GL_COLOR_ATTACHMENT0_EXT,
GL_RENDERBUFFER_EXT,
0 );