Get OpenGL histogram of masked texture - c++

I can get the histogram of an OpenGL texture using the glGetHistogram() function.
Similar to the OpenCV histogram function, where a second matrix can be passed as a mask: I have an OpenGL texture and a binary mask (either as an alpha channel or as a separate texture), and I would like to compute a histogram of only the pixels in the image that are not masked.
Is this possible somehow?

glGetHistogram is part of the ARB_imaging subset and was removed from core profile OpenGL in 3.1 anyway.
Using compute shaders or occlusion queries would be a better idea.

Related

Detect single channel texture in pixel shader

Is it possible to detect when a format has a single channel in HLSL or GLSL? Or just as good, is it possible to extract a greyscale color from such a texture without knowing if it has a single channel or 4?
When sampling from texture formats such as DXGI_FORMAT_R8_*/GL_R8 or DXGI_FORMAT_BC4_UNORM, I am getting pure red RGBA values (g,0,0,1). This would not be a problem if I knew (within the shader) that the texture only had the single channel, as I could then flood the other channels with that red value. But doing anything of this nature would break the logic for color textures, requiring a separate compiled version for the grey sampling (for every texture slot).
Is it not possible to make efficient use of grey textures in modern shaders without specializing the shader for them?
The only solution I can come up with at the moment would be to detect the grey texture on the CPU side and define a macro on the GPU side that selects a different compiled version of the shader for every texture slot. With 8 texture slots, each independently grey or color, that adds up to 2^8 = 256 compiled versions of every shader that wants to support grey inputs. That's not counting the other macro-like switches that actually make sense being there.
Just to be clear, I do know that I can expand these textures to 4-channel greyscale when loading them into GPU memory, and go from there. But doing that uses 4x the memory, and I would rather spend that memory on three more textures.
In OpenGL there are two ways to achieve what you're looking for:
Legacy: the GL_INTENSITY and GL_LUMINANCE texture formats will, when sampled, result in vec4(I,I,I,I) or vec4(L,L,L,1) respectively.
Modern: use a swizzle mask to apply user-defined channel swizzling per texture:
GLint swizzle[] = { GL_RED, GL_RED, GL_RED, GL_ONE };
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzle);
In DirectX 12 you can use component mapping during the creation of a ShaderResourceView.
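As a sanity check, the effect of that OpenGL swizzle mask on a sampled texel can be modeled on the CPU: a red-only texel (r, 0, 0, 1) comes back as (r, r, r, 1), which is exactly the flood-the-channels behavior the question asks for. The function name here is illustrative:

```cpp
#include <array>

// Models GL_TEXTURE_SWIZZLE_RGBA = {GL_RED, GL_RED, GL_RED, GL_ONE}:
// every color channel reads from red, and alpha is forced to one.
std::array<float, 4> swizzleRRR1(const std::array<float, 4>& texel)
{
    return { texel[0], texel[0], texel[0], 1.0f };
}
```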

With Metal (vs OpenGL), How does color/alpha blending work on single-channel render targets?

I'm porting some OpenGL code from a technical paper to use with Metal. In it, they use a render target with only one channel - a 16-bit float buffer. But then they set blending operations on it like this:
glBlendFunci(1, GL_ZERO, GL_ONE_MINUS_SRC_ALPHA);
With only one channel, does that mean that with OpenGL, the target defaults to being an alpha channel?
Does anyone know if it is the same with Metal? I am not seeing the results I expect and I am wondering if Metal differs, or if there is a setting that controls how single-channel targets are treated with regards to blending.
In OpenGL, image formats are labeled explicitly with their channels, and there is only one one-channel color format: GL_R* (with various bit depths and other properties). That is, red-only. And while texture swizzling can make the red channel appear in other channels of a texture fetch, that doesn't apply to framebuffer writes.
Furthermore, that blend function doesn't actually use the destination alpha. It only uses the source alpha, which has whatever value the fragment shader gave it. So the fact that the framebuffer doesn't store an alpha channel is essentially irrelevant.
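A CPU model of that blend makes the point concrete. With GL_ZERO as the source factor and GL_ONE_MINUS_SRC_ALPHA as the destination factor, the stored single-channel value is only attenuated by the source alpha; no destination alpha is ever read:

```cpp
// Models glBlendFunci(1, GL_ZERO, GL_ONE_MINUS_SRC_ALPHA) on a
// one-channel render target:
//   result = src * 0 + dst * (1 - srcAlpha)
float blendZeroOneMinusSrcAlpha(float src, float srcAlpha, float dst)
{
    (void)src; // GL_ZERO: the source color contributes nothing
    return dst * (1.0f - srcAlpha);
}
```

In Metal the equivalent would be a color attachment blend with sourceRGBBlendFactor = .zero and destinationRGBBlendFactor = .oneMinusSourceAlpha, which reads the source alpha the same way.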

OpenGL: How to add alpha channel to RGB texture

There are 2 textures in memory, an RGB texture and a single-channel texture that is its alpha mask. Is there any way to combine the two in OpenGL?
I'm currently combining the two in OpenCV and then passing the result to OpenGL for rendering, but the channel combination in OpenCV is too slow and I'm looking for alternatives.

SDL grid color smoothing

Is it possible to smooth a grid of values displayed as color using methods such as bicubic or linear interpolation with SDL? Currently I use SDL_FillRect to fast fill rectangles. Here is a reference image of interpolation in MATLAB to get an idea of what I want to achieve:
There is no built-in system in SDL to do what you want. You could easily do this with a shader and OpenGL, but since you mentioned SDL_FillRect I assume you are using an SDL_Surface, which suggests you're not using OpenGL. In that case you can plot the pixels yourself and calculate the colours however you like.
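Calculating the colours yourself with bilinear interpolation can be sketched like this; the grid is assumed row-major with one value per cell, and the names are illustrative (bicubic works the same way with a larger neighborhood):

```cpp
#include <cstddef>
#include <vector>

// Bilinearly interpolate a row-major w-by-h grid of values at the
// fractional coordinate (x, y), where integer coordinates land exactly
// on grid cells. Clamps at the right/bottom edges.
float bilinear(const std::vector<float>& grid, size_t w, size_t h,
               float x, float y)
{
    size_t x0 = static_cast<size_t>(x), y0 = static_cast<size_t>(y);
    size_t x1 = (x0 + 1 < w) ? x0 + 1 : x0;
    size_t y1 = (y0 + 1 < h) ? y0 + 1 : y0;
    float fx = x - x0, fy = y - y0;       // fractional offsets in the cell
    float top    = grid[y0 * w + x0] * (1 - fx) + grid[y0 * w + x1] * fx;
    float bottom = grid[y1 * w + x0] * (1 - fx) + grid[y1 * w + x1] * fx;
    return top * (1 - fy) + bottom * fy;
}
```

For each output pixel, map its position back into grid coordinates, call this, and map the resulting value through your color ramp before writing it into the surface.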

How to render grayscale texture without using fragment shaders? (OpenGL)

Is it possible to draw an RGB texture as grayscale without using fragment shaders, using only the fixed-function OpenGL pipeline?
Otherwise I'd have to create two versions of the texture, one in color and one in black and white.
I don't know how to do this with an RGB texture and the fixed function pipeline.
If you create the texture from RGB source data but specify the internal format as GL_LUMINANCE, OpenGL will convert the color data to greyscale for you. Use the standard white material and the GL_MODULATE texture environment mode.
Hope this helps.
No. Texture environment combiners are not capable of performing a dot product without doing the scale/bias operation. That is, they always pretend that [0, 1] values are encoded as [-1, 1] values. Since you can't turn that off, you can't do a proper dot product.