When rendering the depth buffer in the iOS simulator (getting the true z value from the depth buffer), all is fine.
But rendering on a real device gives a bad result: the depth is drawn with only a few distinct values (there is no smooth fade), as if the image were quantized to a 256-value color range.
Here is the code for the FBO generation:
glGenFramebuffers(1, &sceneFBO);
glGenTextures(2, textures_scene);
glBindTexture(GL_TEXTURE_2D, textures_scene[0]); // color attachment
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, widthResolution, heightResolution, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glBindTexture(GL_TEXTURE_2D, textures_scene[1]); // depth attachment
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, widthResolution, heightResolution, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
// TODO: GL_DEPTH_COMPONENT cannot be GL_UNSIGNED_BYTE?
glBindFramebuffer(GL_FRAMEBUFFER, sceneFBO);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, textures_scene[0], 0);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, textures_scene[1], 0);
It seems that the only type that works for the GL_DEPTH_ATTACHMENT texture is GL_UNSIGNED_INT. But that is not enough to render the depth...
PS: for performance reasons, I don't want to render the scene a second time into GL_COLOR_ATTACHMENT0 with the z-values just to get a correct depth, nor use OpenGL ES 3.0 or the new Metal (I need iPhone 4S support).
Any idea?
There is no support for reading the depth buffer in ES 1 or 2. See the answer to this question: glReadPixels doesn't read depth buffer values on iOS
It seems that it's not possible to use GL_UNSIGNED_BYTE as the type parameter together with GL_DEPTH_COMPONENT as the internalformat; only GL_UNSIGNED_SHORT and GL_UNSIGNED_INT are allowed, according to the OES_depth_texture extension spec: https://www.khronos.org/registry/gles/extensions/OES/OES_depth_texture.txt
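In other words, sticking to the extension's allowed types, a 16-bit depth texture would be allocated like this (a sketch mirroring the question's setup):
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, widthResolution, heightResolution, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_SHORT, NULL); // 16-bit; GL_UNSIGNED_INT requests 24/32 bits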
I finally put the z value in the alpha channel of the scene (I don't use transparency in this FBO):
color.w = (gl_Position.z - near) / (far - near);
A bit of a MacGyver hack, but it does the job.
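For reference, here is a minimal sketch of the shader side of that trick; the uniform and attribute names are made up, and it linearizes eye-space z rather than using gl_Position.z directly, which avoids the perspective non-linearity:
// Vertex shader (GLSL ES 1.00) -- hypothetical names; u_near/u_far hold the frustum planes
uniform mat4 u_modelViewMatrix;
uniform mat4 u_projectionMatrix;
uniform float u_near;
uniform float u_far;
attribute vec4 a_position;
varying float v_depth;
void main() {
    vec4 eyePos = u_modelViewMatrix * a_position;
    gl_Position = u_projectionMatrix * eyePos;
    // eye-space z is negative in front of the camera, hence the minus sign
    v_depth = (-eyePos.z - u_near) / (u_far - u_near);
}
// Fragment shader: pack the linear depth into the otherwise unused alpha channel
precision mediump float;
varying float v_depth;
void main() {
    vec4 color = vec4(1.0); // ...normal shading result goes here...
    color.a = v_depth;
    gl_FragColor = color;
}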
Related
I render the color (GL_COLOR_ATTACHMENT0) and depth (GL_DEPTH_ATTACHMENT) of my scene into an FBO, which works fine on a PC with OpenGL 4+.
But on my smartphone with OpenGL ES 3.1+, glCheckFramebufferStatus(GL_FRAMEBUFFER) always returns status 36054 for the depth attachment.
glGenTextures(1, &m_textCol);
glBindTexture(GL_TEXTURE_2D, m_textCol);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, wth, hgt, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glBindTexture(GL_TEXTURE_2D, 0);
glGenTextures(1, &m_textDepth);
glBindTexture(GL_TEXTURE_2D, m_textDepth);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, wth, hgt, 0, GL_DEPTH_COMPONENT, GL_FLOAT, NULL);
glBindTexture(GL_TEXTURE_2D, 0);
GLuint framebuffer;
glGenFramebuffers(1, &framebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, m_textCol, 0);
GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER); // always 36053 -> ok
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, m_textDepth, 0);
status = glCheckFramebufferStatus(GL_FRAMEBUFFER); // always 36054 -> not ok
I also tried it with
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, wth, hgt, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, NULL);
but no change.
Does anyone have an idea what I'm doing wrong? Thanks.
Status 36054 is GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT.
If we look at Section 9.4 of the OpenGL ES 3.1 specification, it lists the following valid internalformat values for depth attachments:
DEPTH_COMPONENT16
DEPTH_COMPONENT24
DEPTH_COMPONENT32F
DEPTH24_STENCIL8
DEPTH32F_STENCIL8
Note that the last two also contain a stencil part. You can still use these formats for the depth attachment and simply ignore the stencil part if you do not need it. DEPTH24_STENCIL8 in particular has a good chance of being widely supported.
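So the question's depth allocation could be switched to a sized format, for instance (a sketch; GL_DEPTH_COMPONENT24 pairs with GL_UNSIGNED_INT as the client type):
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, wth, hgt, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
// or, with stencil bits that can simply be ignored:
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH24_STENCIL8, wth, hgt, 0, GL_DEPTH_STENCIL, GL_UNSIGNED_INT_24_8, NULL);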
I was under the impression that if you set your sampler uniforms to the correct texture unit, it doesn't matter whether the currently bound texture target is 0 or not. For example,
glActiveTexture(GL_TEXTURE1);
glGenTextures(1, &mytexture);
glBindTexture(GL_TEXTURE_2D, mytexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, my_data);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glBindTexture(GL_TEXTURE_2D, 0); // This is the line I'm wondering about
Sometime later when drawing ...
glUniform1i(glGetUniformLocation(program, "mysampler"), 1);
//draw_stuff
Unfortunately, the screen is all black unless I keep GL_TEXTURE_2D bound to mytexture. Is it illegal to sample when GL_TEXTURE_2D is bound to 0?
Exactly. Think of GL_TEXTUREn as a slot holding one texture per target type (GL_TEXTURE_2D, GL_TEXTURE_3D, etc.). By activating GL_TEXTURE1 and binding a texture to GL_TEXTURE_2D, you're telling the driver that the 2D texture in slot 1 is "mytexture".
Then you need to pass this information to your shader as well:
glUniform1i(glGetUniformLocation(program, "mysampler"), 1);
This simply tells the sampler2D in your shader to look for a GL_TEXTURE_2D in slot 1. If you unbind the texture, it has nothing to sample from.
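Putting it together, the draw-time state would look roughly like this (a sketch reusing the question's names):
glUseProgram(program);
glActiveTexture(GL_TEXTURE1);            // select slot 1...
glBindTexture(GL_TEXTURE_2D, mytexture); // ...and put the texture back into its 2D target
glUniform1i(glGetUniformLocation(program, "mysampler"), 1); // sampler2D reads from slot 1
// draw_stuff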
I'm having trouble with the render-to-texture output from this OpenGL example: http://en.wikibooks.org/wiki/OpenGL_Programming/Modern_OpenGL_Tutorial_Text_Rendering_01
My framebuffer is set up as follows:
glGenTextures(1, &back_text);
glBindTexture(GL_TEXTURE_2D, back_text);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, WINDOW_WIDTH, WINDOW_HEIGHT, 0, GL_BGRA, GL_UNSIGNED_BYTE, 0);
glGenFramebuffersEXT(1, &font_buffer);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, font_buffer);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_TEXTURE_2D, back_text, 0);
In the best case I get a filled color rectangle where a letter should appear (as if alpha were always 1.0).
I've tried the original example and it works correctly.
My question is: do I need to enable something to make this work, or do I need to take a different approach?
Edit:
It turns out that I need to use
glutInitContextVersion(2, 0);
instead of
glutInitContextVersion(3, 2);
and the world is a happy place!! :)
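For context, the (free)glut initialization then looks something like this; everything except the context version is a placeholder here:
glutInit(&argc, argv);
glutInitContextVersion(2, 0); // requesting 2.0 instead of 3.2 is what fixed it
glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);
glutInitWindowSize(800, 600);
glutCreateWindow("text rendering");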
In my program I need to do off-screen rendering, and for that purpose I use an FBO. To check that the image I draw is correct, for testing purposes I copy it from the FBO into a texture and then render that texture on a quad. The problem is that when I copy from the FBO into the texture and render it, the image appears dark (almost black), although the shapes are correct. If I instead use a texture as an attachment of the FBO and render it directly (without copying it into another texture), the colors are correct.
Below is the code for texture creation
// initial texture, which works when rendered to a quad
glGenTexturesEXT(3, &textureID[0]);
glBindTextureEXT(GL_TEXTURE_2D, textureID[0]);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 600, 512, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);
// second one, which should be a copy of the above but shows the dark color mentioned
glBindTextureEXT(GL_TEXTURE_2D, textureID[1]);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE); // automatic mipmap
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 600, 512, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);
glBindTextureEXT(GL_TEXTURE_2D, 0);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT,GL_COLOR_ATTACHMENT0_EXT,GL_TEXTURE_2D,textureID[0],0);
//Attach depth buffer to FBO
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, depth_rb);
st1=glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
Now, in the rendering function:
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb); // binding the FBO
glClear(GL_COLOR_BUFFER_BIT|GL_DEPTH_BUFFER_BIT);
//render code
unsigned char *pixels = new unsigned char[600*512*4];
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0); // unbinding
glBindTextureEXT(GL_TEXTURE_2D, textureID[1]); // if I change this to textureID[0] the result is fine
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 600, 512, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
glBindTextureEXT(GL_TEXTURE_2D,0);
glViewport(0, 0, 600, 512);
glClearColor(1.0,1.0,1.0,1.0);
glClear(GL_COLOR_BUFFER_BIT|GL_DEPTH_BUFFER_BIT);
glBindTextureEXT(GL_TEXTURE_2D, textureID[1]);
// setting the correct matrices + rendering a quad
// before rendering the quad I set the color with
glColor3f(1.0, 1.0, 1.0);
delete[] pixels;
glutSwapBuffers();
I'm using GLUT for the setup. I have tried other functions, such as glGetTexImage after unbinding the FBO and binding the texture, as an alternative to glReadPixels(...), but with no success.
I don't understand this call to glTexImage2D in your second code snippet. pixels will contain just garbage. What do you expect it to do?
Textures have been core OpenGL for a very, very long time. It's just glGenTextures and glBindTexture, not ...EXT. Also, I recommend either using the ...ARB versions of the FBO functionality, or just the core framebuffer support of later OpenGL versions.
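As for the copy itself, one way to sidestep the garbage upload entirely is to let the GL copy from the bound FBO into the texture; a sketch reusing the question's names:
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb);
glBindTexture(GL_TEXTURE_2D, textureID[1]);
// copies the FBO's current read buffer into the texture, no CPU round trip through 'pixels'
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, 600, 512);
glBindTexture(GL_TEXTURE_2D, 0);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);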
I would like to use the depth buffer to store the depth values of particles in eye space in a 2D texture by using OpenGL 2.1 / GLSL 1.2.
So far I have found a way to use the color buffer:
// create texture
glGenTextures(1, &g_hDepthTexture);
glBindTexture(GL_TEXTURE_2D, g_hDepthTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F_ARB, g_windowWidth, g_windowHeight, 0, GL_RGBA, GL_FLOAT, 0);
// create framebuffer
glGenFramebuffersEXT(1, &g_hFBO);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, g_hFBO);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, g_hDepthTexture, 0);
However, I don't need the GBA components. Hence I tried to use the depth buffer instead, but the following code clamps every value in the texture to [0, 1]:
// create texture
glGenTextures(1, &g_hDepthTexture);
glBindTexture(GL_TEXTURE_2D, g_hDepthTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, g_windowWidth, g_windowHeight, 0, GL_DEPTH_COMPONENT, GL_FLOAT, 0);
// create framebuffer
glGenFramebuffersEXT(1, &g_hFBO);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, g_hFBO);
glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, g_hDepthTexture, 0);
I would like to know how to use the depth buffer (probably by choosing the correct internal format / format) so that the texture values are not clamped.
Normalized integer image formats are always clamped; that's why they're normalized integers. If you want an unclamped format, then you need floating-point values.
I would suggest using an actual 1-channel floating-point image format, such as GL_R32F, or maybe GL_R16F, depending on how much precision you need. If you don't have GL 3.x hardware, you may be able to use GL_LUMINANCE32F_EXT, depending on what extensions are available.
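A sketch of that setup with GL_R32F, reusing the question's variable names (a 1-channel format pairs with GL_RED as the pixel-transfer format):
glGenTextures(1, &g_hDepthTexture);
glBindTexture(GL_TEXTURE_2D, g_hDepthTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_R32F, g_windowWidth, g_windowHeight, 0, GL_RED, GL_FLOAT, 0);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, g_hDepthTexture, 0);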
BTW, if you're doing this for deferred rendering, don't bother. You can actually calculate the eye-space point directly from the regular depth buffer. Yes, really.
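The reconstruction that last paragraph alludes to boils down to inverting the projection's depth mapping. For a standard perspective projection it looks like this in GLSL (a sketch, with hypothetical u_near/u_far uniforms holding the frustum planes):
uniform float u_near;
uniform float u_far;
float eyeZ(float depth) {           // depth: value sampled from the depth buffer, in [0, 1]
    float zNdc = 2.0 * depth - 1.0; // back to normalized device coordinates
    // result is negative in front of the camera, matching eye space
    return 2.0 * u_far * u_near / (zNdc * (u_far - u_near) - (u_far + u_near));
}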