I have an OpenGL application which blends colors using the following settings:
glBlendEquationSeparate( GL_FUNC_ADD, GL_FUNC_ADD );
glBlendFuncSeparate( GL_ONE, GL_ONE, GL_ONE, GL_ONE );
I would like to blend two textures in the fragment shader in the same way. Do the above blending settings translate to this simple addition:
vec4 colorA = texture( samplerA, texCoords );
vec4 colorB = texture( samplerB, texCoords );
vec4 colorC = colorA + colorB;
?
The resulting formula of
glBlendEquationSeparate(GL_FUNC_ADD, GL_FUNC_ADD);
glBlendFuncSeparate(GL_ONE, GL_ONE, GL_ONE, GL_ONE);
is
dst.rgb = 1 * src.rgb + 1 * dst.rgb
dst.a = 1 * src.a + 1 * dst.a
Blending is very different from computing colors in the fragment shader: it combines the fragment shader output with the color already stored in the framebuffer. With these settings, the incoming fragment color (and alpha) is simply added to the color (and alpha) in the framebuffer. So yes, for two textures you can think of it as colorA + colorB.
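If you wanted to literally reproduce the blend stage inside the shader, the destination color would have to be available as a texture (for example a copy of the framebuffer). A minimal sketch, assuming such a copy is bound to a hypothetical dstSampler uniform:
#version 330 core

in vec2 texCoords;
out vec4 fragColor;

uniform sampler2D samplerA;   // the "source" texture
uniform sampler2D dstSampler; // hypothetical copy of the current framebuffer contents

void main()
{
    vec4 src = texture(samplerA, texCoords);
    vec4 dst = texture(dstSampler, texCoords);

    // GL_FUNC_ADD with GL_ONE/GL_ONE factors: result = 1*src + 1*dst
    fragColor = src + dst;
}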
I tried to draw two textures, a background and decals, but the alpha part of the decals just becomes white.
I simply tried the following.
Draw 2 textures (background & decals)
Add glBlendFunc to apply decals alpha value
#version 330 core
in vec2 UV;
out vec3 color;
uniform sampler2D background;
in vec3 decalCoord;
uniform sampler2D decal;
void main(){
    vec3 BGTex = texture( background, UV ).rgb;
    vec3 DecalTex = texture(decal, decalCoord.xy).rgba;
    color =
        vec4(BGTex,1.0) +   // Background texture is DDS DXT1 (I wonder if DXT1 is the cause?)
        vec4(DecalTex,0.0); // Decal texture is DDS DXT3 for alpha
}
// Set below...
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBlendEquation(GL_FUNC_ADD);
I was able to draw normally, but there is the following problem:
- The alpha part of the decals cannot be made transparent.
Is this a problem that can be solved within the fragment shader? If so, how should I rewrite the code?
The textures are DDS: the background is DXT1 (it doesn't need alpha) and the decal is DXT3 (for the alpha). Could this also be the cause? (Do both need to be DXT3?)
Also, should I look for another way to apply the decals?
DecalTex should be a vec4, not a vec3; otherwise the alpha value will not be stored. You will also have to change the line at the end to:
color = vec4(BGTex, 1.0) + DecalTex * DecalTex.a;
as it currently sets the alpha component to 0.
DecalTex has an alpha channel. The alpha channel is a "weight" which indicates the intensity of the decal: if the alpha channel is 1, the color of DecalTex has to be used; if it is 0, the color of BGTex has to be used.
Use mix to blend the colors of BGTex and DecalTex depending on the alpha channel of DecalTex. Of course, the type of DecalTex has to be vec4 (and the shader output has to be a vec4 as well):
vec3 BGTex = texture( background, UV ).rgb;
vec4 DecalTex = texture(decal, decalCoord.xy).rgba;
color = vec4(mix(BGTex.rgb, DecalTex.rgb, DecalTex.a), 1.0);
Note that mix linearly interpolates between the two values:
mix(x, y, a) = x * (1−a) + y * a
This is similar to the operation performed by the blending function and equation:
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBlendEquation(GL_FUNC_ADD);
But blending is applied to the fragment shader output and the current value in the framebuffer.
You don't need any blending at all, because the textures are "blended" in the fragment shader and the result is put in the framebuffer. Since the alpha channel of the fragment shader output is >= 1.0, the output is completely "opaque".
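Putting the pieces together, a sketch of how the complete corrected shader could look (note the output is now a vec4; names and inputs are kept as in the question):
#version 330 core

in vec2 UV;
in vec3 decalCoord;
out vec4 color;

uniform sampler2D background;
uniform sampler2D decal;

void main(){
    vec3 BGTex    = texture( background, UV ).rgb;
    vec4 DecalTex = texture( decal, decalCoord.xy ).rgba;

    // interpolate between the background and the decal by the decal's alpha
    color = vec4(mix(BGTex, DecalTex.rgb, DecalTex.a), 1.0);
}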
These are my blending functions.
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBlendEquation(GL_FUNC_ADD);
I mix two textures in the fragment shader using the mix function.
FragColor =
    mix(texture(texture1, TexCoord), texture(texturecubeMap, ransformedR), MixValueVariables.CubeMapValue) *
    vec4(result, material.opacity / 100.0); // multiplying texture with material value
Despite changing the blend equation or blend func types, I don't see any difference in the output.
Does the mix function work with the blend types?
Does the mix function work with the blend types?
No. Blending and the glsl function mix are completely different things.
The function mix(x, y, a) always calculates x * (1−a) + y * a.
Blending takes the fragment color output of the fragment shader and combines it with the color in the color buffer of the framebuffer.
If you want to use blending, then you have to (see the sketch after the list):
draw the geometry with the first texture
activate blending
draw the geometry with the second texture
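A rough sketch of that two-pass approach on the CPU side (drawGeometryWithTexture() is a hypothetical helper standing in for your actual draw call; the texture names are placeholders):
// pass 1: draw with the first texture, no blending
glDisable(GL_BLEND);
glBindTexture(GL_TEXTURE_2D, texture1);
drawGeometryWithTexture();

// pass 2: draw with the second texture, blended over the first result in the framebuffer
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBlendEquation(GL_FUNC_ADD);
glBindTexture(GL_TEXTURE_2D, texture2);
drawGeometryWithTexture();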
If you just want to blend the textures in the fragment shader, then you have to use GLSL arithmetic. See GLSL Programming/Vector and Matrix Operations.
e.g.
vec4 MyBlend(vec4 source, vec4 dest, float alpha)
{
    return source * alpha + dest * (1.0 - alpha);
}

void main()
{
    vec4 c1 = texture(texture1, TexCoord);
    vec4 c2 = texture(texturecubeMap, ransformedR);
    float f = MixValueVariables.CubeMapValue;

    FragColor = MyBlend(c1, c2, f) * vec4(result, material.opacity / 100.0);
}
Adapt the function MyBlend to your needs.
I have a full screen quad with two textures.
I want to blend two textures in arbitrary shape according to user selection.
For example, the quad at first is 100% texture0 while texture1 is transparent.
If the user selects a region, for example a circle, by dragging the mouse on the quad, then
circle region should display both texture0 and texture1 as translucent.
The region not enclosed by the circle should still be texture0.
Please see example image, textures are simplified as colors.
For now, I have achieved blending two textures on the quad, but the blending region can only be vertical slices because I use the step() function.
My frag shader:
uniform sampler2D Texture0;
uniform sampler2D Texture1;
uniform float alpha;
uniform float leftBlend;
uniform float rightBlend;
varying vec4 oColor;
varying vec2 oTexCoord;
void main()
{
    vec4 first_sample = texture2D(Texture0, oTexCoord);
    vec4 second_sample = texture2D(Texture1, oTexCoord);

    float stepLeft = step(leftBlend, oTexCoord.x);
    float stepRight = step(rightBlend, 1.0 - oTexCoord.x);

    if (stepLeft == 1.0 && stepRight == 1.0)
        gl_FragColor = oColor * first_sample;
    else
        gl_FragColor = oColor * (first_sample * alpha + second_sample * (1.0 - alpha));

    if (gl_FragColor.a < 0.4)
        discard;
}
To achieve an arbitrary shape, I assume I need to create an alpha mask texture which is the same size as texture0 and texture1?
Then I pass that texture to the fragment shader and check its values: if the value is 0, use texture0; if the value is 1, blend texture0 and texture1.
Is my approach correct? Can you point me to any samples?
I want an effect such as OpenGL - mask with multiple textures,
but I want to create the mask texture dynamically in my program, and I want to implement the blending in GLSL.
I have got blending working with a black-and-white mask texture:
uniform sampler2D TextureMask;
vec4 mask_sample = texture2D(TextureMask, oTexCoord);
if (mask_sample.r == 0.0)
    gl_FragColor = first_sample;
else
    gl_FragColor = first_sample * alpha + second_sample * (1.0 - alpha);
Right now the mask texture is loaded statically from an image on disk; I just need to create the mask texture dynamically in OpenGL.
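One way to create the mask dynamically is to keep a CPU-side byte array that you update while the user drags the mouse and upload it into a single-channel texture. A rough sketch (maskData, width and height are assumed to be your own variables; on a pre-3.0 context you would use GL_LUMINANCE instead of GL_RED):
// one-time setup: a single-channel mask texture
GLuint maskTex;
glGenTextures(1, &maskTex);
glBindTexture(GL_TEXTURE_2D, maskTex);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RED, width, height, 0,
             GL_RED, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// whenever the user-drawn region changes: re-upload the CPU-side mask
glBindTexture(GL_TEXTURE_2D, maskTex);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                GL_RED, GL_UNSIGNED_BYTE, maskData);
Alternatively, the mask could be rendered on the GPU into a texture attached to a framebuffer object; the upload approach above is just the simplest way to get going.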
Here's one approach and a sample.
Create a boolean test for whether you want to blend.
In my sample, I use the equation of a circle centered on the screen.
Then blend (I blended by a weighted addition of the two colors).
(Note: I didn't have texture coords to work with in this sample, so I used the screen resolution to determine the circle position.)
uniform vec2 resolution;
void main( void ) {
    vec2 position = gl_FragCoord.xy / resolution;

    // test if we're "in" or "out" of the blended region
    // let's use a circle of radius 0.5, but you can make this more complex and/or pass this value in from the user
    bool isBlended = (position.x - 0.5) * (position.x - 0.5) +
                     (position.y - 0.5) * (position.y - 0.5) > 0.25;

    vec4 color1 = vec4(1,0,0,1); // this could come from texture 1
    vec4 color2 = vec4(0,1,0,1); // this could come from texture 2

    vec4 finalColor;
    if (isBlended)
    {
        // blend
        finalColor = color1 * 0.5 + color2 * 0.5;
    }
    else
    {
        // don't blend
        finalColor = color1;
    }
    gl_FragColor = finalColor;
}
See the sample running here: http://glsl.heroku.com/e#18231.0
(I tried to post my sample image but I don't have enough rep, sorry.)
Update:
Here's another sample using mouse position to determine the position of the blended area.
To run, paste the code in this sandbox site: https://www.shadertoy.com/new
This one should work for regions of any shape, as long as you have the mouse data set up correctly.
void main(void)
{
    vec2 position = gl_FragCoord.xy;

    // test if we're "in" or "out" of the blended region
    // let's use a circle of radius 10px, but you can make this more complex and/or pass this value in from the user
    float diffX = position.x - iMouse.x;
    float diffY = position.y - iMouse.y;
    bool isBlended = (diffX * diffX) + (diffY * diffY) < 100.0;

    vec4 color1 = vec4(1,0,0,1); // this could come from texture 1
    vec4 color2 = vec4(0,1,0,1); // this could come from texture 2

    vec4 finalColor;
    if (isBlended)
    {
        // blend
        finalColor = color1 * 0.5 + color2 * 0.5;
    }
    else
    {
        // don't blend
        finalColor = color1;
    }
    gl_FragColor = finalColor;
}
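To plug this into your own shader, the hard-coded colors would be replaced by the texture samples. A sketch using the uniforms from your original shader (circleCenter and circleRadius are hypothetical uniforms you would set from the mouse position, in the same coordinate space as oTexCoord):
uniform sampler2D Texture0;
uniform sampler2D Texture1;
uniform float alpha;
uniform vec2 circleCenter;  // hypothetical: center of the user-selected region
uniform float circleRadius; // hypothetical: radius of the user-selected region
varying vec2 oTexCoord;

void main()
{
    vec4 first_sample = texture2D(Texture0, oTexCoord);
    vec4 second_sample = texture2D(Texture1, oTexCoord);

    // blend only inside the user-selected circle
    if (distance(oTexCoord, circleCenter) < circleRadius)
        gl_FragColor = first_sample * alpha + second_sample * (1.0 - alpha);
    else
        gl_FragColor = first_sample;
}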
Is it possible in OpenGL to sample from, and then write to the same texture in one draw call using GLSL?
I think it actually IS possible. I did it like this for GLSL alpha blending:
vec4 prevColor = texture(dfFrameTexture, pointCoord);
vec4 result = vec4(color.a) * color + vec4(1.0 - color.a) * prevColor;
gl_FragColor = result;
dfFrameTexture is the texture that is being written to with gl_FragColor.
I tested this and it worked. Drawing a white quad onto dfFrameTexture after it has been cleared to black results in a grey quad when I set the color to (0.5, 0.5, 0.5, 0.5).
I have a terrain in OpenGL and two textures which I am combining using the GLSL mix() function.
Here are the textures I am using.
Now I am able to combine and mix these two textures, but for some reason, when I render the textures on the terrain, the terrain becomes transparent.
I render the LHS texture first, and then I render the RHS texture (which has an alpha channel); I don't understand why the result is transparent.
Here is an interesting fact: in the screenshot you can see the terrain rendered on an Nvidia GPU; when I render the same thing on an Intel HD 3000, I get a different result.
That result is how it is supposed to be; nothing is transparent in that screenshot.
Here is my fragment shader code:
void main()
{
    vec4 dryTex = texture( u_dryTex, vs_texCoord *1 );
    vec4 grassTex = texture( u_grassTex, vs_texCoord *1 );
    vec4 texColor1 = mix(dryTex, grassTex, grassTex.a);
    out_fragColor = texColor1;
}
Looks like you're interpolating the alpha channel as well. Component-wise, the mix does:
texColor1.r = dryTex.r*(1-grassTex.a) + grassTex.r*grassTex.a
texColor1.g = dryTex.g*(1-grassTex.a) + grassTex.g*grassTex.a
texColor1.b = dryTex.b*(1-grassTex.a) + grassTex.b*grassTex.a
texColor1.a = dryTex.a*(1-grassTex.a) + grassTex.a*grassTex.a
The resulting alpha channel is therefore generally less than 1 (for example, it reduces to grassTex.a^2 wherever dryTex.a is 0), so the output ends up partially transparent most of the time.
Edit:
The fix would be:
void main()
{
    vec4 dryTex = texture( u_dryTex, vs_texCoord *1 );
    vec4 grassTex = texture( u_grassTex, vs_texCoord *1 );
    vec4 texColor1 = vec4(mix(dryTex.rgb, grassTex.rgb, grassTex.a), 1);
    out_fragColor = texColor1;
}