Disable clamping in glCopyTexSubImage2D - c++

I am attempting to do the following using OpenGL:
Render an image to the floating point color attachment of an off-screen framebuffer
Allocate an empty texture object with the same format and dimensions as the color attachment
Copy the pixels in the color attachment to the texture, as fast as possible
I am using the OpenGL function glCopyTexSubImage2D to do the copying. However, I found that the copied values are clamped to the range [0, 1] in the destination texture.
Right now I am using OpenGL 3.3, but I have to port it to OpenGL ES 2.0 later, so I cannot make use of pixel buffer objects.
I am using the following initialization code, before any copying is done:
glClampColor(GL_CLAMP_READ_COLOR, GL_FALSE);
glClampColor(GL_CLAMP_VERTEX_COLOR, GL_FALSE);
glClampColor(GL_CLAMP_FRAGMENT_COLOR, GL_FALSE);
This disables clamping for glReadPixels but seems to have no effect on glCopyTexSubImage2D.
Is there a way to disable this clamping in glCopyTexSubImage2D?
Edit:
This is for an image processing application with some iterative parts, not for 3D graphics.
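For reference, here is a minimal sketch of the setup described above, assuming a complete FBO named fbo with a GL_RGBA32F color attachment of size w x h (all names here are illustrative, not from the original code):
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
// Allocate the destination with the same floating-point internal format as
// the attachment; a fixed-point destination format is normalized and clamped.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, w, h, 0, GL_RGBA, GL_FLOAT, nullptr);
// glCopyTexSubImage2D reads from the current read framebuffer's read buffer.
glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
glReadBuffer(GL_COLOR_ATTACHMENT0);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, w, h);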

Related

Highlighter blend func OpenGL

What is the standard built-in blend function used to implement a highlighter in OpenGL? Using glBlendEquation(GL_MIN); produces the desired effect but does not also allow for opacity-based blending. I am modeling transparency using glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);. Is it possible to account for opacity using min blending? If not, is it best to use a custom shader with a framebuffer object?
For reference, here is an example of the desired result (not rendered using OpenGL):
I ended up rendering highlight-able elements to a floating-point texture and blending using glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);. This intermediate framebuffer can then be blended with the default framebuffer 0 using the same blend function. Subtractive blending can be emulated by multiplying the RGB components of the final value in the fragment shader by -1. Floating-point textures support negative values and are typically writable as framebuffer color attachments. See Blend negative value into framebuffer 0 opengl for more information.
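As a rough sketch of that two-pass setup (the FBO, texture, and draw-call names are hypothetical, not from the original code):
// Pass 1: draw highlightable elements into a floating-point color attachment.
glBindFramebuffer(GL_FRAMEBUFFER, highlightFbo);   // hypothetical FBO
glEnable(GL_BLEND);
glBlendEquation(GL_FUNC_ADD);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
drawHighlightElements();  // fragment shader may output rgb * -1 for subtractive blending
// Pass 2: composite the intermediate texture onto the default framebuffer
// using the same blend function.
glBindFramebuffer(GL_FRAMEBUFFER, 0);
drawFullscreenQuad(highlightTexture);  // hypothetical helper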

Use default framebuffer depth buffer in an FBO

I want to achieve a post-process outline effect on some actors in my game, and I want the outline to be occluded by other actors closer to the camera. To do so, I planned to render the elements that should be outlined into an FBO and apply an outline shader. To discard the pixels that are occluded, I want to use the depth buffer of the default framebuffer.
I read and searched, but I didn't find how to properly use the default framebuffer's depth buffer in another framebuffer, or how to copy it or otherwise use the default depth buffer's information in an FBO.
How can I achieve it?
I read and searched, but I didn't find how to properly use the default framebuffer's depth buffer in another framebuffer
Unfortunately, there is no way in OpenGL to do that.
or how to copy it or otherwise use the default depth buffer's information in an FBO.
That is quite easy. Just create an FBO with an appropriate depth attachment (it does not matter whether it is a texture or a renderbuffer) and blit the default depth buffer to it:
glBindFramebuffer(GL_READ_FRAMEBUFFER, 0);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, your_target_fbo);
glBlitFramebuffer(0, 0, width, height, 0, 0, width, height, GL_DEPTH_BUFFER_BIT, GL_NEAREST);
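For completeness, here is a sketch of how your_target_fbo might be created with a depth renderbuffer (GL_DEPTH_COMPONENT24 is an assumption; the depth format should match the default framebuffer's depth format, or the depth blit may fail):
GLuint your_target_fbo, depth_rbo;
glGenRenderbuffers(1, &depth_rbo);
glBindRenderbuffer(GL_RENDERBUFFER, depth_rbo);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);
glGenFramebuffers(1, &your_target_fbo);
glBindFramebuffer(GL_FRAMEBUFFER, your_target_fbo);
// Attach the depth renderbuffer as the blit destination.
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, depth_rbo);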
However, since you do some post-processing anyway, it might be a good option to render the initial scene into an FBO and reuse the same depth buffer across different FBOs.
Unfortunately, it is not possible to attach the default depth buffer to an FBO or to read it from a shader.
You can copy the default framebuffer into a renderbuffer (or a framebuffer-attached texture; see glBlitFramebuffer), but this can be relatively slow.
Another option is to render the complete scene first to an FBO (color + depth), then use the outline shader to read from this FBO, combine the result with the outline calculation, and write the final result to the default framebuffer.

glReadPixels() doesn't return an anti-aliased picture

When I render my scene to the screen, it uses 8x multisample anti-aliasing (MSAA), and it looks fine.
But when I read the pixels back through glReadPixels(), I get an aliased image with aliased lines; the 8x MSAA is not applied.
Code:
glReadPixels(0,0, w, h, GL_BGRA_EXT, GL_UNSIGNED_BYTE, (void*)pixels);
How do I read back the pixel buffer with the MSAA resolve (filter) applied? glReadPixels doesn't have a special parameter for this.
Side note: I would like to use OpenGL 1.
Important note: I am using Qt 5.9's QOpenGLWidget class.
I take it that you're rendering to an off-screen renderbuffer or texture via an FBO. The solution is to create a renderbuffer that uses an MSAA pixel format. Anti-aliasing is not a post-processing filter! (At least not in the form MSAA implements it.)
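As a sketch of that idea (note this needs framebuffer objects, i.e. OpenGL 3.0+ or ARB_framebuffer_object, not plain OpenGL 1; names here are illustrative): render into a multisampled renderbuffer, resolve it into a single-sample FBO with glBlitFramebuffer, then glReadPixels from the resolved FBO.
// Multisampled render target (8x, matching the on-screen setting).
GLuint msaa_fbo, msaa_rbo;
glGenRenderbuffers(1, &msaa_rbo);
glBindRenderbuffer(GL_RENDERBUFFER, msaa_rbo);
glRenderbufferStorageMultisample(GL_RENDERBUFFER, 8, GL_RGBA8, w, h);
glGenFramebuffers(1, &msaa_fbo);
glBindFramebuffer(GL_FRAMEBUFFER, msaa_fbo);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, msaa_rbo);
// ... render the scene into msaa_fbo ...
// Resolve into a single-sample FBO (resolve_fbo, created the same way but
// with glRenderbufferStorage), then read back the resolved pixels.
glBindFramebuffer(GL_READ_FRAMEBUFFER, msaa_fbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, resolve_fbo);
glBlitFramebuffer(0, 0, w, h, 0, 0, w, h, GL_COLOR_BUFFER_BIT, GL_NEAREST);
glBindFramebuffer(GL_READ_FRAMEBUFFER, resolve_fbo);
glReadPixels(0, 0, w, h, GL_BGRA_EXT, GL_UNSIGNED_BYTE, pixels);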

Using layered rendering with default framebuffer

I know that we can attach a layered texture which is:
A mipmap level of a 1D/2D texture array
A mipmap level of a 3D texture
A mipmap level of a cube texture / cube texture array
to an FBO and do layered rendering.
The OpenGL wiki also says "Layered rendering is the process of having the GS send specific primitives to different layers of a layered framebuffer."
Can the default framebuffer be a layered framebuffer? I.e., can I bind a 3D texture to the default FB and use a geometry shader to render to different layers of this texture?
I tried writing such a program, but the screen is blank, and I am not sure whether this is even valid.
If it is not, what is actually happening when I bind the default FB for layered rendering?
If you use the default framebuffer for layered rendering, then everything you draw goes straight to the default framebuffer and behaves as if it were all in the same layer.
OpenGL 4.4 Core Specification - 9.8 Layered Framebuffers - p. 296
A framebuffer is considered to be layered if it is complete and all of its populated attachments are layered. When rendering to a layered framebuffer, each fragment generated by the GL is assigned a layer number.
[...]
A layer number written by a geometry shader has no effect if the framebuffer is not layered.
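For contrast, here is a sketch of layered rendering done the supported way: attach an entire 3D texture level to a user FBO with glFramebufferTexture, so the geometry shader can route primitives via gl_Layer (names and the GL_RGBA8 format are illustrative assumptions):
GLuint fbo, tex3d;
glGenTextures(1, &tex3d);
glBindTexture(GL_TEXTURE_3D, tex3d);
glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA8, w, h, layers, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
// Attaching the whole level (no specific layer selected) makes the
// attachment layered; the geometry shader then writes gl_Layer to pick
// the destination layer for each primitive.
glFramebufferTexture(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, tex3d, 0);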

Normal back buffer + render to depth texture in OpenGL (FBOs)

Here's the situation: I have a texture containing some depth values. I want to render some geometry using that texture as the depth buffer, but I want the color to be written to the normal framebuffer (i.e., the window's framebuffer).
How do I do this? I tried creating an FBO and only binding a depth texture to it (GL_DEPTH_COMPONENT) but that didn't work; none of the colors showed up.
No, you can't. The FBO you are rendering to may be either the main framebuffer or an off-screen one; you can't mix them in any way.
Instead, I would suggest rendering to a color renderbuffer and then doing a simple blit into the main framebuffer.
Edit-1.
Alternatively, if you already have a depth buffer in the main FB, you can first blit your depth into it and then render directly to the main FB, thus saving the video memory of the additional color renderbuffer.
P.S. Blitting is done via glBlitFramebuffer. In order to make it work, you should set up GL_READ_FRAMEBUFFER, GL_DRAW_FRAMEBUFFER, and glDrawBuffer() for each of them.
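A sketch of that blit, assuming a hypothetical scene_fbo holding the rendered color:
glBindFramebuffer(GL_READ_FRAMEBUFFER, scene_fbo);  // hypothetical off-screen FBO
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);          // default (window) framebuffer
glReadBuffer(GL_COLOR_ATTACHMENT0);
glDrawBuffer(GL_BACK);
glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                  GL_COLOR_BUFFER_BIT, GL_NEAREST);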