Error when attaching depth/stencil buffer to the framebuffer - OpenGL

I create a depth/stencil buffer by issuing this series of commands to OpenGL:
glBindTexture(GL_TEXTURE_2D, 0);
glGenTextures(1, &TextureId);
glBindTexture(GL_TEXTURE_2D, TextureId);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_DEPTH_TEXTURE_MODE, GL_INTENSITY);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_R_TO_TEXTURE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC, GL_LEQUAL);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 640, 480, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glBindTexture(GL_TEXTURE_2D, 0);
Then I try to attach it to the framebuffer with:
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT, GL_TEXTURE_2D, TextureId, 0);
But a call to glCheckFramebufferStatusEXT returns GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT. If I don't attach the depth/stencil buffer, the same check passes (but obviously the display is screwed).
Do you guys have any clue?
EDIT:
Changed: I have simplified the texture creation to a basic texture format.

Okay, the problem disappeared when I removed all the trailing EXT…

The texture you're attaching to GL_DEPTH_STENCIL_ATTACHMENT has neither depth nor stencil information in it. If you want to attach a texture to GL_DEPTH_STENCIL_ATTACHMENT, you must make sure its image format contains both depth and stencil data.
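As a minimal sketch of what the answer describes (assuming GL 3.0+ or ARB_framebuffer_object, and reusing the 640x480 size from the question), a texture that actually carries depth and stencil data would look like:
// GL_DEPTH24_STENCIL8 packs 24 depth bits and 8 stencil bits per texel.
glGenTextures(1, &TextureId);
glBindTexture(GL_TEXTURE_2D, TextureId);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH24_STENCIL8, 640, 480, 0, GL_DEPTH_STENCIL, GL_UNSIGNED_INT_24_8, NULL);
glBindTexture(GL_TEXTURE_2D, 0);
// Attach it; the image format now matches the attachment point.
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT, GL_TEXTURE_2D, TextureId, 0);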

Newer OpenGL versions no longer need the EXT suffix.

Related

OpenGL texture gets worse when moving camera away from object

The texture gets progressively worse the farther I move the camera away from the object; I need to be really close to the object for the texture to look right. Does anyone know what causes this and how to fix it?
This is how I load the texture:
stbi_set_flip_vertically_on_load(1);
m_Local_buffer = stbi_load(path.c_str(), &m_width, &m_height, &m_bpp, 4);
glGenTextures(1, &m_rendererID);
glBindTexture(GL_TEXTURE_2D, m_rendererID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, m_width, m_height, 0, GL_RGBA, GL_UNSIGNED_BYTE, m_Local_buffer);
glGenerateMipmap(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, 0);
[image of the bad texture]
Edit: updated the code.
For mipmapping you have to use one of the mipmap minification filters, e.g. GL_LINEAR_MIPMAP_LINEAR (see glTexParameter):
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); // no mipmap filter
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR); // mipmap filter
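Note that a mipmap minification filter also requires the mipmap levels to actually exist; the updated code in the question already handles that by calling glGenerateMipmap after glTexImage2D.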
The problem was that faces in the model were too close to each other, which caused Z-fighting, and that caused the flickering. It was not caused by the texture or by my OpenGL code as I originally assumed. When I switched to a model that didn't have this issue, everything rendered correctly without any flickering.
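(Not from the original answer, but for anyone who can't swap the model: a standard code-side mitigation for Z-fighting is polygon offset.)
// Push the offending coplanar faces slightly back in depth so they
// no longer tie with the geometry in front of them.
glEnable(GL_POLYGON_OFFSET_FILL);
glPolygonOffset(1.0f, 1.0f); // factor and units are scene-dependent
// ... draw the geometry ...
glDisable(GL_POLYGON_OFFSET_FILL);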

Setting up GL_TEXTURE_2D_ARRAY, framebuffer incomplete

For cascaded shadow mapping I'm trying to use a GL_TEXTURE_2D_ARRAY for the individual shadow maps. However, following tutorials found online and even looking things up in a textbook, I can't seem to create a working framebuffer: it always fails with GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT.
The code:
glGenFramebuffers(1, &m_DepthMap.m_FrameBuffer);
glGenTextures(1, &m_DepthMap.m_DepthTexture);
glBindTexture(GL_TEXTURE_2D_ARRAY, m_DepthMap.m_DepthTexture);
glTexImage3D(
GL_TEXTURE_2D_ARRAY,
0,
GL_DEPTH_COMPONENT32F,
Renderer::m_ShadowMapResolution,
Renderer::m_ShadowMapResolution,
3,
0,
GL_DEPTH_COMPONENT,
GL_FLOAT,
nullptr);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_BORDER);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_BORDER);
glBindFramebuffer(GL_FRAMEBUFFER, m_DepthMap.m_FrameBuffer);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, m_DepthMap.m_DepthTexture, 0);
glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);
const int status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE)
{
std::cout << "ERROR::FRAMEBUFFER:: Framebuffer is not complete!";
throw 0;
}
Sources that tell me that this should be correct:
https://people.inf.elte.hu/plisaai/pdf/OpenGL%20Insights.pdf , page 264
https://johanmedestrom.wordpress.com/2016/03/18/opengl-cascaded-shadow-maps/
What am I doing wrong here? Why is the framebuffer incomplete?
The tutorial you cited has led you astray.
A 2D texture is not the same thing as a 2D array texture. You can either attach a specific array layer (of a specific mipmap level) to a framebuffer, or attach all of the array's images in a mipmap level as a layered attachment. In neither case can you use glFramebufferTexture2D for an array texture (which is why you should be checking for OpenGL errors in one way or another, as this call should have errored out).
In any case, if you want to attach a specific array layer to the framebuffer, then you want to use glFramebufferTextureLayer. If you want to attach it as a layered attachment (because you're doing layered rendering), then you'd use glFramebufferTexture.
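As a rough sketch of both options, reusing the question's member names (note also that the glTexParameteri calls in the question target GL_TEXTURE_2D rather than the GL_TEXTURE_2D_ARRAY that is actually bound):
// Option 1: attach one layer of the array per cascade pass.
glBindFramebuffer(GL_FRAMEBUFFER, m_DepthMap.m_FrameBuffer);
for (int cascade = 0; cascade < 3; ++cascade)
{
    glFramebufferTextureLayer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, m_DepthMap.m_DepthTexture, 0, cascade);
    // ... render this cascade's depth here ...
}
// Option 2: attach the whole array as a layered attachment and select
// the layer with gl_Layer in a geometry shader.
glFramebufferTexture(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, m_DepthMap.m_DepthTexture, 0);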

Framebuffer Incomplete Attachment for Texture with Internal Format?

So I am trying to create a framebuffer where I render to a texture, but I can't seem to get it to work with the format I need, which is GL_RGB32F. It works for GL_RGB16F and GL_RGBA32F, so I don't understand why GL_RGB32F gives me GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT from glCheckFramebufferStatus. I get no errors from the calls creating the texture either. Is there a special requirement for using that internal format? Can I check whether I have support for it?
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glGenTextures(1, &positionTexture);
glBindTexture(GL_TEXTURE_2D, positionTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32F, width, height, 0, GL_RGB, GL_FLOAT, nullptr);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, positionTexture, 0);
GL_RGB16F and GL_RGB32F are not required color buffer formats in the GL, while GL_RGBA32F is. You can have a look at Table 8.12 of the OpenGL 4.5 core profile specification (pages 198-200, assuming the June 2017 revision of that document). It tells you that all of these formats are color-renderable, but only GL_RGBA32F out of that set is a required render format. Implementations may optionally support the others, but you can't rely on that.
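To the "Can I check whether I have support for it?" part: on GL 4.3+ (or with ARB_internalformat_query2) you can ask the implementation directly. A minimal sketch:
// Returns GL_FULL_SUPPORT, GL_CAVEAT_SUPPORT, or GL_NONE.
GLint supported = GL_NONE;
glGetInternalformativ(GL_TEXTURE_2D, GL_RGB32F, GL_FRAMEBUFFER_RENDERABLE, 1, &supported);
if (supported != GL_FULL_SUPPORT)
{
    // Fall back to the always-required GL_RGBA32F here.
}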

Bind Texture in Repeat Mode

I am trying to store an octree in a 3D texture in OpenGL for use on the GPU using Cg, from a chapter in GPU Gems 2 found here http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter37.html. However the results I am getting are incorrect. I think this is because of how I create the octree.
In the appendix of that chapter, it says "If we bind the indirection pool texture (octree texture) in repeat mode (GL_REPEAT)...".
Does this simply mean setting the filters and wrapping to repeat, or do I need to do something else? This is my code so far:
glGenTextures(1, &octree_texture);
glBindTexture(GL_TEXTURE_3D, octree_texture);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_REPEAT);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_REPEAT);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_R, GL_REPEAT);
glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA, WIDTH, HEIGHT, DEPTH, 0, GL_RGBA, GL_UNSIGNED_BYTE, octreeData);
Thanks for the help :)
The filters can't be GL_REPEAT; that will generate a GL error. Only the wrap modes can be GL_REPEAT, and that's probably what the book means.
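A corrected version of the setup from the question might look like this (GL_NEAREST is my assumption, but filtering between octree indirection pointers wouldn't make sense anyway):
glGenTextures(1, &octree_texture);
glBindTexture(GL_TEXTURE_3D, octree_texture);
// The filter parameters take filter modes, not wrap modes.
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
// The "repeat mode" from the book refers to these wrap modes.
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_R, GL_REPEAT);
glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA, WIDTH, HEIGHT, DEPTH, 0, GL_RGBA, GL_UNSIGNED_BYTE, octreeData);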

OpenGL GL_BGR not working for texture internal format

From what I've read in the OpenGL documentation, OpenGL prefers BGR format over RGB. However, when I set the texture's internal format to GL_BGR, the texture renders all white. When the internal format is set to GL_RGB or GL_RGBA, it displays properly. The source data is BGR to begin with (loaded directly from a DIB).
glBindTexture(GL_TEXTURE_2D, id);
glPixelStorei(GL_UNPACK_ALIGNMENT, 4);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, nClamp);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, nClamp);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, nMagFilter);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, nMinFilter);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
The following line is the problem... setting the internal format to GL_BGR makes everything white, while changing it to GL_RGB makes it render correctly:
glTexImage2D(GL_TEXTURE_2D, 0, GL_BGR, pTex->nWidth, pTex->nHeight, 0, GL_BGR, GL_UNSIGNED_BYTE, pTex->pBuffer);
From the glTexImage2D man page, GL_BGR is not an internalFormat, just a format.
You should check for GL errors, since that would have caught this mistake. What you read about OpenGL "preferring BGR" is false.
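So the fix is to keep GL_BGR only as the pixel-transfer format and use a proper internal format, e.g.:
// internalFormat (GL_RGB8) is how GL stores the texels; format/type
// (GL_BGR / GL_UNSIGNED_BYTE) describe the client data being uploaded.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, pTex->nWidth, pTex->nHeight, 0, GL_BGR, GL_UNSIGNED_BYTE, pTex->pBuffer);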