How to render grayscale texture without using fragment shaders? (OpenGL)

Is it possible to draw an RGB texture as grayscale without using fragment shaders, using only fixed-function OpenGL?
Otherwise I'd have to create two versions of the texture, one in color and one in black and white.

I don't know how to do this with an RGB texture and the fixed function pipeline.
If you create the texture from RGB source data but specify the internal format as GL_LUMINANCE, OpenGL will reduce the color data to a single channel for you during upload (note that per the GL spec this conversion takes the red component as the luminance value, so pre-mix R, G and B yourself if you want a true grey average). Use the standard white material and GL_MODULATE mode.
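A minimal sketch of that upload path, assuming width, height and rgbPixels come from your image loader:

// Upload RGB data but request a luminance internal format; the GL
// performs the reduction during the pixel transfer (legacy path,
// removed from core profiles).
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0,
             GL_RGB, GL_UNSIGNED_BYTE, rgbPixels);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glColor3f(1.0f, 1.0f, 1.0f); // white base color so the texture appears unmodified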
Hope this helps.

No. Texture environment combiners are not capable of performing a dot product without doing the scale/bias operation. That is, it always pretends that [0, 1] values are encoded as [-1, 1] values. Since you can't turn that off, you can't do a proper dot product.
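For reference, the GL_DOT3_RGB combiner defined by ARB_texture_env_dot3 always computes

4 * ((Arg0_r - 0.5) * (Arg1_r - 0.5) + (Arg0_g - 0.5) * (Arg1_g - 0.5) + (Arg0_b - 0.5) * (Arg1_b - 0.5))

so the 0.5 scale/bias is baked into the operation, and a plain luminance dot product such as dot(color, vec3(0.299, 0.587, 0.114)) cannot be expressed with it.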

Related

Detect single channel texture in pixel shader

Is it possible to detect when a format has a single channel in HLSL or GLSL? Or just as good, is it possible to extract a greyscale color from such a texture without knowing if it has a single channel or 4?
When sampling from texture formats such as DXGI_FORMAT_R8_*/GL_R8 or DXGI_FORMAT_BC4_UNORM, I am getting pure red RGBA values (g,0,0,1). This would not be a problem if I knew (within the shader) that the texture only had the single channel, as I could then flood the other channels with that red value. But doing anything of this nature would break the logic for color textures, requiring a separate compiled version for the grey sampling (for every texture slot).
Is it not possible to make efficient use of grey textures in modern shaders without specializing the shader for them?
The only solution I can come up with at the moment would be to detect the grey texture on the CPU side and generate a macro on the GPU side that selects a different compiled version of the shader for every texture slot. Doing this with 8 texture slots would add up to 2^8 = 256 compiled versions of every shader that wants to support grey inputs. That's not counting the other macro-like switches that actually make sense being there.
Just to be clear, I do know that I can expand these textures to 4-channel greyscale on load, and go from there. But doing that uses 4x the memory, and I would rather spend it loading 3 more textures.
In OpenGL there are two ways to achieve what you're looking for:
Legacy: The INTENSITY and LUMINANCE texture formats will, when sampled, yield vec4(I,I,I,I) or vec4(L,L,L,1) respectively.
Modern: Use a swizzle mask to apply user-defined channel swizzling per texture via GL_TEXTURE_SWIZZLE_RGBA (OpenGL 3.3 / ARB_texture_swizzle), as in the snippet below.
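A minimal sketch: replicate the red channel into RGB and force alpha to 1, so a single-channel texture samples as grey without any shader changes:

// glTexParameteriv takes a pointer, so the mask goes in an array.
GLint swizzle[4] = { GL_RED, GL_RED, GL_RED, GL_ONE };
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzle);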
In DirectX 12 you can achieve the same with the component mapping specified during the creation of a ShaderResourceView, as sketched below.
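A sketch of the D3D12 equivalent; device, texture and descriptorHandle are stand-ins for your own objects:

// Map the single red channel into R, G and B, and force alpha to 1.
D3D12_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
srvDesc.Format = DXGI_FORMAT_R8_UNORM;
srvDesc.ViewDimension = D3D12_SRV_DIMENSION_TEXTURE2D;
srvDesc.Texture2D.MipLevels = UINT(-1);
srvDesc.Shader4ComponentMapping = D3D12_ENCODE_SHADER_4_COMPONENT_MAPPING(
    D3D12_SHADER_COMPONENT_MAPPING_FROM_MEMORY_COMPONENT_0,  // R <- red
    D3D12_SHADER_COMPONENT_MAPPING_FROM_MEMORY_COMPONENT_0,  // G <- red
    D3D12_SHADER_COMPONENT_MAPPING_FROM_MEMORY_COMPONENT_0,  // B <- red
    D3D12_SHADER_COMPONENT_MAPPING_FORCE_VALUE_1);           // A = 1
device->CreateShaderResourceView(texture, &srvDesc, descriptorHandle);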

Direct3D11 Pixel Shader Interpolation

I'm trying to create a simple transition between two colors: a render of a quad whose top edge color is #2c2b35 and bottom color is #1a191f. When I put it through a simple pixel shader that just interpolates the vertex colors, I get visible color banding, and I'm wondering if there is a way to eliminate that effect without some advanced pixel-shader dithering technique.
I'm attaching a picture of a render where the left part is rendered by my app and the right part is rendered in Photoshop with the "Dithered" option checked.
Thanks in advance.
Banding is a problem that every rendering engine has to deal with at some point.
Before you start implementing dithering, you should first confirm you are following a proper sRGB pipeline (https://en.wikipedia.org/wiki/SRGB).
To be sRGB correct:
The swap chain and render surfaces in RGBX8 have to contain sRGB values. The simplest way is to create a render target view with a *_UNORM_SRGB format (e.g. DXGI_FORMAT_R8G8B8A8_UNORM_SRGB).
Textures have to use a UNORM_SRGB format view when appropriate (albedo maps) and plain UNORM when not (normal maps).
Shader computation such as lighting has to happen in linear space.
Textures and render targets with the right format perform the conversion for you at read and write, but for constants you have to do it manually. For example, in #2c2b35 the red byte 0x2c is 44 in sRGB (0.1725 normalized), but its linear value is about 0.025.
To get a clean gradient, compute it in linear space in the shader and let the render target convert it back to sRGB on write.
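For reference, the standard sRGB transfer functions as a small C++ sketch; the 0x2c example above falls out of srgbToLinear(44.0f / 255.0f):

#include <cmath>

// sRGB <-> linear for one normalized channel value, per the sRGB spec.
float srgbToLinear(float s)
{
    return (s <= 0.04045f) ? s / 12.92f
                           : std::pow((s + 0.055f) / 1.055f, 2.4f);
}

float linearToSrgb(float l)
{
    return (l <= 0.0031308f) ? l * 12.92f
                             : 1.055f * std::pow(l, 1.0f / 2.4f) - 0.055f;
}

// srgbToLinear(44.0f / 255.0f) is roughly 0.025, matching the example above.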

Blending sprite with pre-existing texture

I am just learning the intricacies of OpenGL. What I would like to do is render a sprite onto a pre-existing texture. The texture consists of terrain, with some texels having alpha = 1 and some having alpha = 0. I would like the sprite to appear on a pixel of the texture if and only if the corresponding terrain pixel's alpha is 0. That is, for each pixel of the sprite, the output color is:
Color of the sprite, if terrain alpha = 0.
Color of the terrain, if terrain alpha = 1.
Is this possible to do with a blending function? If not, how should I do it?
This is the exact opposite of the traditional blending function. The usual blend function is a linear interpolation between the source and destination colors, based on the source alpha.
What you want is a linear interpolation between the source and destination colors, based on the destination alpha. But you also want to invert the usual meaning; a destination alpha of 1 means that the destination color should be taken, not the source color.
That's pretty easy.
glBlendFunc(GL_ONE_MINUS_DST_ALPHA, GL_DST_ALPHA);
However, the above assumes that your sprites do not themselves have some form of inherent transparency. And most sprites do. That is, if the sprite alpha is 0 at some pixel, you don't want to overwrite the terrain color, no matter what the terrain's alpha is.
That makes this whole process excessively difficult. Pre-multiplying the alpha will not save you either, since black will just as easily overwrite the color in the terrain if there is no terrain color there.
In effect, you would need to do a linear interpolation based on neither the source nor the destination, but on a combination of them. I think multiplication of the two (src-alpha * (1 - dst-alpha)) would do a good job.
This is not possible with OpenGL's standard blending system. You would need to employ some form of programmatic blending technique. This typically involves read/modify/write operations using NV/ARB_texture_barrier or otherwise ping-ponging between bound textures.
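For the simple case (sprite texels treated as fully opaque), a minimal sketch of the setup; drawTerrain and drawSprite are hypothetical stand-ins for your own draw calls:

// Terrain goes in first so its alpha ends up in the framebuffer.
// Note: the framebuffer needs an alpha channel for DST_ALPHA to work.
drawTerrain();
glEnable(GL_BLEND);
// result = src * (1 - dst_alpha) + dst * dst_alpha
glBlendFunc(GL_ONE_MINUS_DST_ALPHA, GL_DST_ALPHA);
drawSprite();
glDisable(GL_BLEND);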

Good way to deal with alpha channels in 8-bit bitmap? - OpenGL - C++

I am loading bitmaps with OpenGL to texture a 3d mesh. Some of these bitmaps have alpha channels (transparency) for some of the pixels and I need to figure out the best way to
obtain the values of transparency for each pixel
and
render them with the transparency applied
Does anyone have a good example of this? Does OpenGL support this?
First of all, it's generally best to convert your bitmap data to 32-bit so that each channel (R,G,B,A) gets 8 bits. When you upload your texture, specify a 32-bit format such as GL_RGBA8.
Then when rendering, you'll need to glEnable(GL_BLEND); and set the blend function, eg: glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);. This tells OpenGL to mix the RGB of the texture with that of the background, using the alpha of your texture.
If you're doing this to 3D objects, you might also want to turn off back-face culling (so that you see the back of the object through the front) and sort your triangles back-to-front (so that the blends happen in the correct order).
If your source bitmap is 8-bit (i.e. using a palette with one colour specified as the transparency mask), then it's probably easiest to convert it to RGBA, setting the alpha value to 0 wherever the colour matches your transparency mask.
Some hints to make things (maybe) look better:
Your alpha channel is going to be an all-or-nothing affair (either 0x00 or 0xff), so apply some blur algorithm to get softer edges, if that's what you're after.
For texels (texture pixels) with an alpha of zero (fully transparent), replace the RGB colour with that of the closest non-transparent texel. That way, when texture coordinates are interpolated, samples won't be blended towards the original transparency colour from your BMP.
If your pixmaps are 8-bit single channel, they are either grayscale or use a palette. What you first need to do is convert the pixmap data into RGBA format. For this, allocate a buffer large enough to hold a 4-channel pixmap of the dimensions of the original file. Then, for each pixel of the pixmap, use that pixel's value as an index into the palette (lookup table) and put that color value into the corresponding pixel of the RGBA buffer. Once finished, upload to OpenGL using glTexImage2D, as sketched below.
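A minimal sketch of that expansion, assuming a 256-entry RGBA palette (the names and buffer layout are illustrative, not from the question):

#include <cstdint>
#include <vector>

// Expand an 8-bit paletted image to an RGBA8 buffer ready for upload.
std::vector<uint8_t> expandPalette(const uint8_t* indices, int w, int h,
                                   const uint8_t palette[256][4])
{
    std::vector<uint8_t> rgba(size_t(w) * h * 4);
    for (size_t i = 0; i < size_t(w) * h; ++i) {
        const uint8_t* entry = palette[indices[i]];
        rgba[i * 4 + 0] = entry[0];  // R
        rgba[i * 4 + 1] = entry[1];  // G
        rgba[i * 4 + 2] = entry[2];  // B
        rgba[i * 4 + 3] = entry[3];  // A (0 for the transparency key)
    }
    return rgba;
}

// Upload: glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
//                      GL_RGBA, GL_UNSIGNED_BYTE, rgba.data());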
If your GPU supports fragment shaders (very likely), you can do that LUT transformation in the shader instead: upload the 8-bit pixmap as a GL_RED or GL_LUMINANCE 2D texture, and upload the palette as a 1D GL_RGBA texture. Then in the fragment shader:
uniform sampler2D texture;      // 8-bit palette indices in the red channel
uniform sampler1D palette_lut;  // 256-entry RGBA palette

void main()
{
    // Index in [0,1]; with nearest filtering, i/255 addresses palette entry i.
    float palette_index = texture2D(texture, gl_TexCoord[0].st).r;
    // Look up the actual color in the palette.
    vec4 color = texture1D(palette_lut, palette_index);
    gl_FragColor = color;
}
Blended rendering conflicts with the Z-buffer algorithm, so you must sort your geometry back-to-front for things to look right. As long as this happens at the level of whole objects it is rather simple, but it becomes tedious if you need to sort the faces of a mesh for each and every frame. A method to avoid this is breaking meshes down into convex submeshes (of course a mesh that's already convex cannot be broken down further). Then use the following method:
glEnable(GL_CULL_FACE);
// sortFarToNear and Mesh are placeholders for your own scene code.
for (Mesh* submesh : sortFarToNear(meshes)) {
    glCullFace(GL_FRONT);  // cull front faces, i.e. the back side gets rendered
    submesh->render();
    glCullFace(GL_BACK);   // cull back faces, i.e. the front side gets rendered
    submesh->render();     // render the submesh again
}

OpenGL - How Transparency works?

Do I need alpha channels for transparency to work in OpenGL? Can I use glBlendFunc or anything else to somehow make the black or white color transparent/not visible? If yes, how do I do it?
No, you don't need an alpha channel in your textures. Call discard in your fragment shader for all fragments that match your transparency rule.
Yes, you need alpha channels to use transparency. You can emulate the behaviour of color keying using shaders, or by preprocessing the image and replacing the key color with pixels whose alpha = 0.0, as sketched below.
Notice that GPUs generally allocate RGBA storage even if you ask for an RGB texture, so the alpha channel is typically present in hardware anyway.
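A minimal sketch of that preprocessing step, assuming an RGBA8 buffer (applyColorKey and its parameters are illustrative, not a library API):

#include <cstddef>
#include <cstdint>

// Set alpha to 0 for every pixel matching the key color (e.g. pure black).
void applyColorKey(uint8_t* rgba, size_t pixelCount,
                   uint8_t keyR, uint8_t keyG, uint8_t keyB)
{
    for (size_t i = 0; i < pixelCount; ++i) {
        uint8_t* p = rgba + i * 4;
        if (p[0] == keyR && p[1] == keyG && p[2] == keyB)
            p[3] = 0;  // now standard alpha blending or discard can hide it
    }
}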