This is totally driving me nuts!
I've created my fbo by
QOpenGLFramebufferObjectFormat format;
format.setAttachment(QOpenGLFramebufferObject::CombinedDepthStencil);
format.setMipmap(false);
format.setSamples(4);
format.setTextureTarget(GL_TEXTURE_2D);
format.setInternalTextureFormat(GL_RGBA32F_ARB);
qfbo = new QOpenGLFramebufferObject(QSize(w, h), format);
OK, it seems to be working and Qt doesn't complain about it.
The problem is that I need the QOpenGLFramebufferObject to be readable as well as writable. Writing to it seems easy: qfbo->bind() does the job. But I cannot bind its texture; if I ask it for the GLuint handle, it always returns 0 either way:
qDebug()<<qfbo->texture();
qDebug()<<qfbo->takeTexture();
Obviously, my intention is to bind the texture myself, like:
glBindTexture(GL_TEXTURE_1D, fbo->texture());
Any tips about this? I've been googling for a long time without luck :(
I forgot to say that if I don't use setSamples(4), I get a non-zero GLuint, but the texture is totally black.
If you use multisampling, there's no texture allocated; a multisampled renderbuffer is used instead.
You need to create another, non multisampled FBO of matching dimensions and use blitFramebuffer to resolve the multisampled data into single sampled.
Then you'll have the texture you can bind (to the 2D target, not to the 1D!). If this texture is still full black, run apitrace on your code and see what you're doing wrong.
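A minimal sketch of that resolve step using Qt's static QOpenGLFramebufferObject::blitFramebuffer() helper (assuming qfbo is the 4x multisampled FBO from the question and a current OpenGL context):

```cpp
// Create a single-sampled FBO of matching size to resolve into.
QOpenGLFramebufferObjectFormat resolveFormat;
resolveFormat.setInternalTextureFormat(GL_RGBA32F_ARB);
// No setSamples() call: samples defaults to 0, so a real texture is allocated.
QOpenGLFramebufferObject *resolved =
    new QOpenGLFramebufferObject(qfbo->size(), resolveFormat);

// Resolve the multisampled contents into the single-sampled FBO.
QOpenGLFramebufferObject::blitFramebuffer(resolved, qfbo);

// resolved->texture() is now a non-zero GLuint you can bind.
glBindTexture(GL_TEXTURE_2D, resolved->texture());
```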
I want to render into a framebuffer with DSA. However, I only got it working when manually binding the framebuffer. Is there a way to do it without binding?
This is how I thought it would work:
glNamedFramebufferDrawBuffer(m_fbo, GL_COLOR_ATTACHMENT0);
drawCall();
This only works if I use glBindFramebuffer(GL_FRAMEBUFFER, m_fbo); before. How do I do it correctly?
Also, what is the equivalent to glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); for framebuffers with DSA? Again, I can only clear if I previously bound the framebuffer.
The array of draw buffers is not a global state, but rather it is stored per-framebuffer. You are probably familiar with the mechanics of Vertex Array Objects, which maintain separate sets of vertex attribute pointers; draw buffers are analogous to attribute pointers in this situation.
When you make a call to glNamedFramebufferDrawBuffer (m_fbo, ...), you are modifying the state of m_fbo's array without first having to bind m_fbo. You are not actually telling OpenGL to source its color buffer from m_fbo's GL_COLOR_ATTACHMENT0 - that only happens when you bind m_fbo.
In fact, if you think about this critically, this is the only logical way it can work. If you could arbitrarily source buffers from different framebuffer objects, then that would violate validation (completeness). For instance, FBO0 might have a multi-sampled color attachment with 4 samples and FBO1 might have a single-sampled depth attachment. Those are two incompatible buffers, but the only time that is validated is when you try to attach those two images to the same FBO.
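As for the glClear() part of the question: OpenGL 4.5 provides the glClearNamedFramebuffer* functions for clearing a framebuffer's attachments without binding it. A sketch, assuming m_fbo is complete with one color and one depth attachment:

```cpp
// DSA equivalent of glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT):
const GLfloat clearColor[4] = { 0.0f, 0.0f, 0.0f, 1.0f };
const GLfloat clearDepth    = 1.0f;
glClearNamedFramebufferfv(m_fbo, GL_COLOR, 0, clearColor); // drawbuffer index 0
glClearNamedFramebufferfv(m_fbo, GL_DEPTH, 0, &clearDepth);

// Drawing, however, still requires a bind: draw calls always target the
// framebuffer currently bound to GL_DRAW_FRAMEBUFFER.
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, m_fbo);
drawCall();
```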
I succeeded in rendering to a texture ("Texturebuffer"), using a VAO and shaders.
But an FBO has another option for the color buffer: a Renderbuffer. I searched a lot on the internet, but cannot find any example of drawing a Renderbuffer with shaders the way you draw a Texturebuffer.
If I ain't wrong, Renderbuffer is released in OpenGL 3.30, and it's faster than Texturebuffer.
Can I use Renderbuffer as Texturebuffer? (stupid question huh? I think it should be absolutely, isn't it?)
If yes, please point me to (or give) an example of drawing a renderbuffer as a texture buffer.
My goal is just to study, but I'd also like to know: is that a better way to draw textures? Should we use it frequently?
First of all, don't use the term "texture buffer" when you really just mean texture. A "buffer texture"/"texture buffer object" is a different concept, completely unrelated here.
If I ain't wrong, Renderbuffer is released in OpenGL 3.30, and it's faster than Texturebuffer.
No. Renderbuffers were there when FBOs were first invented. One being faster than the other is not generally true either; these are implementation details, and it is also irrelevant here.
Can I use Renderbuffer as Texturebuffer? (stupid question huh? I think it should be absolutely, isn't it?)
Nope. You can't use the contents of a renderbuffer directly as a source for texture mapping. Renderbuffers are just abstract memory regions the GPU renders to, and they are not in the format required for texturing. You can read back the results to the CPU using glReadPixels, or you can copy the data into a texture object, e.g. via glCopyTexSubImage2D - but that would be much slower than rendering directly into textures.
So renderbuffers are good for a different set of use cases:
offscreen rendering (e.g. where the image results will be written to a file, or encoded to a video)
as helper buffers during rendering, like the depth buffer or stencil buffer, where you do not care about the final contents of these buffers anyway
as an intermediate buffer when the image data can't be directly used by the following steps, e.g. when using multisampling and then copying the result to a non-multisampled framebuffer or texture
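To illustrate the second use case, here is a sketch of a typical FBO with a texture color attachment and a renderbuffer depth/stencil attachment (w and h are placeholder dimensions):

```cpp
GLuint fbo, colorTex, depthRbo;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

// Color attachment as a texture, so it can be sampled in later passes.
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);

// Depth/stencil as a renderbuffer: we never sample it, only render with it.
glGenRenderbuffers(1, &depthRbo);
glBindRenderbuffer(GL_RENDERBUFFER, depthRbo);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH24_STENCIL8, w, h);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT,
                          GL_RENDERBUFFER, depthRbo);
```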
It appears that you have your terminology mixed up.
You attach images to Framebuffer Objects. Those images can either be a Renderbuffer Object (this is an offscreen surface that has very few uses besides attaching and blitting) or they can be part of a Texture Object.
Use whichever makes sense. If you need to read the results of your drawing in a shader then obviously you should attach a texture. If you just need a depth buffer, but never need to read it back, a renderbuffer might be fine. Some older hardware does not support multisampled textures, so that is another situation where you might favor renderbuffers over textures.
Performance wise, do not make any assumptions. You might think that since renderbuffers have a lot fewer uses they would somehow be quicker, but that's not always the case. glBlitFramebuffer (...) can be slower than drawing a textured quad.
I wanted to know if it's possible to give a non-NULL texture to a framebuffer and render on it, i.e. just draw on top of it so it becomes the background of the final texture.
From what I have tried, it just keeps the texture I give it and renders it directly; there's no drawing on it (as if the drawing part had been useless).
If I give a NULL texture, the drawing works.
So I wanted to know: is it possible, or am I just doing it wrong?
All the examples of FBO use I've seen only pass a NULL texture.
What you're trying to do is not as common as the use case where content in an FBO attachment is rendered from scratch. That's why you won't find as many examples.
It's still perfectly legal, though, and should work. The only difference should really be that you don't call glClear() between attaching the texture to the FBO and starting to render.
One case where you'll have to be careful is if you use depth buffering for the rendering you do on top of the original texture content. You will then need a depth buffer attachment (typically a renderbuffer) in your FBO, as usual, and you will need to clear the depth buffer, but not the color buffer, before starting to render:
glClear(GL_DEPTH_BUFFER_BIT);
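Put together, the flow might look like this (a sketch; existingTex, fbo, depthRbo and drawScene() are placeholder names):

```cpp
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
// Attach the texture that already contains the background image.
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, existingTex, 0);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, depthRbo);

// Clear only the depth buffer; the color attachment keeps its contents.
glClear(GL_DEPTH_BUFFER_BIT);
drawScene(); // draws on top of the original texture content
```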
I want to do a depth only pass in my renderer. I have created a FBO for this pass, and I attach only a depth buffer to it. I also pass GL_NONE to both glDrawBuffer and glReadBuffer, as per this thread:
Rendering multiple depth information with FBOs
However, after this frame draws, the screen goes completely white. If I disable that pass, then I see the expected output (resulting from the other passes).
Any ideas on why this is happening? At first, I was reusing an existing FBO (that handles offscreen rendering) for the depth only pass. I then created another FBO for just this scenario. Both approaches still result in a white screen.
Does this article offer a possible source of the problem?
http://www.opengl.org/discussion_boards/showthread.php/149004-glDrawBuffer(-GL_NONE-)-expected-behaviour
Using apitrace (https://github.com/apitrace/apitrace), I can now show the OpenGL code that I believe is responsible for the white screen:
Offending OpenGL code http://imageshack.us/a/img855/7660/jmkq.png
In this simple example, I am attaching only the depth attachment to FBO 2. I then pass 0 to both glFramebufferTexture2D() and glFramebufferRenderbuffer() to detach any previous attachments at GL_COLOR_ATTACHMENT0, and pass GL_NONE to both glDrawBuffer() and glReadBuffer(). I then immediately bind the default FBO (0), and restore the viewport back to 1280x720. I also pass GL_BACK to both glDrawBuffer() and glReadBuffer().
After this, both the working and non-working versions of my code proceed to bind another FBO for offscreen rendering. I then attach to both GL_COLOR_ATTACHMENT0 and GL_DEPTH_ATTACHMENT using glFramebufferTexture2D() and glFramebufferRenderbuffer(), respectively. For glDrawBuffer() and glReadBuffer(), I provide GL_COLOR_ATTACHMENT0.
Does this make things clearer? I'm not quite sure how the above code snippet is causing my screen to go white.
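For reference, the sequence described above boils down to something like the following sketch (placeholder names; this mirrors what the screenshot shows):

```cpp
// Depth-only pass: no color attachment, no color reads/writes.
glBindFramebuffer(GL_FRAMEBUFFER, depthFbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D, depthTex, 0);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, 0, 0); // detach any color attachment
glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);
// ... render the depth pass ...

// Restore state for the default framebuffer.
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glViewport(0, 0, 1280, 720);
glDrawBuffer(GL_BACK);
glReadBuffer(GL_BACK);
```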
I currently have a rendering engine using multiple passes in which various parts of the image are rendered on textures, and then combined using shaders. It works, and now I would like to activate multi-sampling.
I read here ( http://www.opengl.org/wiki/Framebuffer_Object_Examples#MSAA ) that, with OpenGL, you can't attach a GL_TEXTURE2D_MULTISAMPLE to a framebuffer object.
It seems one way to use multi-sampling and still have access to the result as texture is to use a multi-sampled render buffer, and then copy the result into a multisample texture.
My question is: what would be the best way to go forward?
Is it possible to render in a render buffer and use the output in my shader, without copying into a texture?
Should I indeed copy the content of the buffer into a texture, and then use it?
Is there another, better, solution?
Thanks.
I read here ( http://www.opengl.org/wiki/Framebuffer_Object_Examples#MSAA ) that, with OpenGL, you can't attach a GL_TEXTURE2D_MULTISAMPLE to a framebuffer object.
Read it again. It says nothing about GL_TEXTURE_2D_MULTISAMPLE textures. Actually, I take that back: don't read that page again. If you want good FBO info, read the page on Framebuffer Objects that explains 3.x behavior. The page you linked to is old.
Back in the EXT days, all you had were multisampled renderbuffers, because multisample textures didn't exist. You could create multisampled buffers, but you couldn't texture with them. You could only blit them.
In OpenGL 3.2 and later, you can create multisampled textures. And you can attach them just like any other texture to an FBO.
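A sketch of creating and attaching a multisampled texture (assuming GL 3.2+ or ARB_texture_multisample; w, h and fbo are placeholders):

```cpp
GLuint msTex;
glGenTextures(1, &msTex);
glBindTexture(GL_TEXTURE_2D_MULTISAMPLE, msTex);
// 4 samples, fixed sample locations.
glTexImage2DMultisample(GL_TEXTURE_2D_MULTISAMPLE, 4, GL_RGBA8, w, h, GL_TRUE);

glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D_MULTISAMPLE, msTex, 0);

// In GLSL, sample it through a sampler2DMS with texelFetch().
```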