Cg Shaders Alpha Blending - opengl

I want to use Alpha Blending (SRC_ALPHA, ONE_MINUS_SRC_ALPHA), which basically is this:
frag_out.aaaa * frag_out + ((1.0, 1.0, 1.0, 1.0) - frag_out.aaaa) * pixel_color
Let's say the pixel color already on screen is (1.0, 1.0, 1.0, 1.0)
The color I am currently rendering is (1.0, 1.0, 1.0, 0.4)
When I render this, the resulting color on screen has an alpha of 0.76 (even though it was fully opaque BEFORE).
Indeed:
(0.4, 0.4, 0.4, 0.4) * (1.0, 1.0, 1.0, 0.4) + (0.6, 0.6, 0.6, 0.6) * (1.0, 1.0, 1.0, 1.0) = (1.0, 1.0, 1.0, 0.76)
So basically the color on the screen was opaque (white), and after I put a transparent sprite on top, the screen became transparent (0.76 alpha) and I can see through to the background. It would be perfect if I could blend it like this:
(0.4, 0.4, 0.4, 1.0) * (1.0, 1.0, 1.0, 0.4) + (0.6, 0.6, 0.6, 0.6) * (1.0, 1.0, 1.0, 1.0) = (1.0, 1.0, 1.0, 1.0)
Is it possible to achieve this? If yes, how?

This is perfectly possible, since the blend factors can be set separately for RGB and alpha using glBlendFuncSeparate (core since OpenGL 1.4). For your case this would be:
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_ONE, GL_ONE);
The last parameter, which specifies the blend factor for destination alpha, might have to be adapted, since you didn't specify the desired behavior for a destination alpha of less than 1. For more details, have a look at the function's documentation.
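For completeness, the whole blend setup is just two calls. WebGL exposes the same separate factors through gl.blendFuncSeparate, so a minimal sketch (written here in WebGL-style JavaScript for consistency with the other snippets in this thread, assuming gl is your rendering context) looks like this; the desktop call above behaves identically:

// Enable blending and set the factors separately for RGB and alpha.
gl.enable(gl.BLEND);
// RGB:   out.rgb = src.rgb * src.a + dst.rgb * (1.0 - src.a)   -> normal alpha blending
// Alpha: out.a   = src.a * 1.0     + dst.a * 1.0               -> an opaque destination stays opaque (clamped to 1.0)
gl.blendFuncSeparate(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA, gl.ONE, gl.ONE);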

Related

Reordering OpenGL Texture vertices to flip image rows

I am a complete OpenGL beginner and I inherited a codebase. I want OpenGL to flip a texture vertically, meaning the top row goes to the bottom and so on. I am only doing 2D processing, if that is relevant.
My texture vertices are currently this:
const float texture_vertices[] = {
0.0, 1.0,
0.0, 0.0,
1.0, 0.0,
0.0, 1.0,
1.0, 0.0,
1.0, 1.0,
};
I tried changing the direction of the triangles, reordering them from clockwise to counter-clockwise. I know this is caused by my lack of awareness of how the very basics of OpenGL work, but I would appreciate any help (especially a short explanation of why a particular reordering is the right one).
Maybe I am going about this all wrong and the texture coordinates are not what I am interested in?
You need to flip the second component (v) of the texture coordinates (swap 0 and 1). The texture coordinates decide which part of the image each vertex samples from, so mirroring v makes every vertex sample the vertically mirrored row, which flips the image. Reordering the winding (clockwise vs. counter-clockwise) only affects face culling, not how the texture is mapped:
const float texture_vertices[] = {
0.0, 0.0,
0.0, 1.0,
1.0, 1.0,
0.0, 0.0,
1.0, 1.0,
1.0, 0.0,
};

How can I map a single texture to all sides of a cube using the UV attributes of vertices?

Currently I am creating a mesh of a cube like this:
//bottom vertices of cube
positions.push([ 0.0, 0.0, 0.0 ]);
uvs.push([ 0.0, 0.0 ]);
positions.push([ 1.0, 0.0, 0.0 ]);
uvs.push([ 0.0, 1.0 ]);
positions.push([ 1.0, 0.0, 1.0 ]);
uvs.push([ 1.0, 1.0 ]);
positions.push([ 0.0, 0.0, 1.0 ]);
uvs.push([ 1.0, 0.0 ]);
//upper vertices of cube
positions.push([ 0.0, 1.0, 0.0 ]);
uvs.push([ 0.0, 1.0 ]);
positions.push([ 1.0, 1.0, 0.0 ]);
uvs.push([ 1.0, 1.0 ]);
positions.push([ 1.0, 1.0, 1.0 ]);
uvs.push([ 1.0, 0.0 ]);
positions.push([ 0.0, 1.0, 1.0 ]);
uvs.push([ 1.0, 1.0 ]);
(I use a texture that is 16x16 pixels.)
But I struggle to map those UV positions, because the most I can get to work correctly is 4 sides.
I would know what to do if there were 4 vertices for each side (24 in total), but I don't want to do it like this for performance reasons.
Is there any way of doing this with only 8 vertices per cube?
The minimal cube has six sides.
Each side has TWO TRIANGLES.
You will have TWELVE tris for a minimal cube.
DON'T, repeat, DO NOT bother trying to share the verts. If for some reason you want to, do that later.
Simply ensure that each pair of tris (one square) has the correct UVs.
https://stackoverflow.com/a/36845398/294884
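To make that concrete, here is a rough sketch in the same push-style as the question. It assumes positions, uvs and an added indices array are plain JavaScript arrays, emits 4 vertices per face (24 total) with the full 0..1 UV square on every face, and the winding may need flipping depending on your culling setup:

// Each face: four corner positions, ordered around the face.
const faces = [
  [[0,0,0], [1,0,0], [1,1,0], [0,1,0]], // front  (z = 0)
  [[1,0,1], [0,0,1], [0,1,1], [1,1,1]], // back   (z = 1)
  [[0,0,1], [0,0,0], [0,1,0], [0,1,1]], // left   (x = 0)
  [[1,0,0], [1,0,1], [1,1,1], [1,1,0]], // right  (x = 1)
  [[0,0,1], [1,0,1], [1,0,0], [0,0,0]], // bottom (y = 0)
  [[0,1,0], [1,1,0], [1,1,1], [0,1,1]], // top    (y = 1)
];
const cornerUVs = [[0,0], [1,0], [1,1], [0,1]]; // whole texture on every face

for (const corners of faces) {
  const base = positions.length;          // index of this face's first vertex
  corners.forEach((p, i) => {
    positions.push(p);
    uvs.push(cornerUVs[i]);
  });
  // two triangles per face
  indices.push(base, base + 1, base + 2,  base, base + 2, base + 3);
}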

Is it possible to pass a GLSL variable to JavaScript?

I see that you can pass variables from JavaScript to GLSL, but is it possible to go the other way around? Basically, I have a shader that converts a texture to 3 colors (red, green, and blue) based on the alpha channel:
if (texture.a == 1.0) {
gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);
}
if (texture.a == 0.0) {
gl_FragColor = vec4(0.0, 0.0, 1.0, 1.0);
call0nce = 1;
}
if (texture.a > 0.0 && texture.a < 1.0) {
"gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);",
}
I have been using the colors to help visualize things better. What I'm really trying to do is pick a random point with an alpha of 1.0 and a random point with an alpha of 0.0, and output the texture coordinates of both in a way that I can access them from JavaScript. How would I go about this?
You cannot directly pass values from GLSL back to JavaScript. GLSL, at least in WebGL 1.0, can only output pixels to textures, renderbuffers, and the backbuffer.
You can, however, read those pixels back with gl.readPixels, so you can indirectly get data from GLSL into JavaScript by rendering the values you want and then reading them back.
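For the specific goal here, one workable pattern is to let the shader above write its classification colors to the canvas (or an FBO), read the pixels back, and search them in JavaScript. A rough sketch for WebGL 1.0, assuming gl is the context and width/height match the framebuffer; this finds the first matching texel of each kind (for a random one, collect all candidates first and pick one):

// Read the rendered pixels back (gl.readPixels returns rows bottom-to-top).
const pixels = new Uint8Array(width * height * 4);
gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);

let opaquePoint = null;      // a texel the shader colored green (alpha == 1.0)
let transparentPoint = null; // a texel the shader colored blue  (alpha == 0.0)
for (let y = 0; y < height && !(opaquePoint && transparentPoint); y++) {
  for (let x = 0; x < width; x++) {
    const i = (y * width + x) * 4;
    const r = pixels[i], g = pixels[i + 1], b = pixels[i + 2];
    if (!opaquePoint && r === 0 && g === 255 && b === 0) {
      opaquePoint = [x / width, y / height];       // as texture coordinates
    } else if (!transparentPoint && r === 0 && g === 0 && b === 255) {
      transparentPoint = [x / width, y / height];
    }
  }
}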

How to optimize a color gradient shader?

I have created this simple fragment shader to achieve a vertical color gradient effect.
But I find it taxing on my mobile device in full screen.
Is there any way to optimize this?
Here is the link to the code:
http://glsl.heroku.com/e#13541.0
You could do something like this instead:
vec2 position = (gl_FragCoord.xy / resolution.xy);
vec4 top = vec4(1.0, 0.0, 1.0, 1.0);
vec4 bottom = vec4(1.0, 1.0, 0.0, 1.0);
gl_FragColor = vec4(mix(bottom, top, position.y));
You can change the colors yourself; I just used arbitrary ones.
You can go even further and eliminate calculating the x component, but that's kinda overkill:
vec4 top = vec4(1.0, 0.0, 1.0, 1.0);
vec4 bottom = vec4(1.0, 1.0, 0.0, 1.0);
gl_FragColor = vec4(mix(bottom, top, (gl_FragCoord.y / resolution.y)));

default uniform (array) values

Instead of explicitly setting uniform data for a GL program, I set 'defaults' in a simple test (fragment) shader with:
uniform vec3 face_rgb[] = vec3[]
(
vec3(0.0, 0.0, 1.0), vec3(0.0, 1.0, 0.0), vec3(1.0, 0.0, 0.0),
vec3(1.0, 0.0, 1.0), vec3(0.0, 1.0, 1.0), vec3(1.0, 1.0, 0.0),
vec3(0.2, 0.2, 0.2), vec3(0.0, 0.0, 0.0)
);
Depending on the fragment's texture coordinates, an index value is computed to look up an RGB value. (The actual RGB values are immaterial.)
This works perfectly well on OS X (GL 3.2 core profile); in fact, far better than using an index with a const array. My question is: is this valid GLSL syntax, and not an implementation-dependent hack? (I have no GL 4+ access at the moment, but I assume the answer still applies.) Also, any ideas as to why a uniform might outperform a constant array?
Yes, uniform arrays are allowed to have default values in GL 3.2. So your syntax is valid.
That doesn't mean it will always work, only that it's valid. Driver bugs can still get you.
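If you do hit one of those driver bugs (or need to target a GLSL dialect without uniform initializers, such as GLSL ES), the defensive fallback is to upload the same table explicitly from the application. A sketch in WebGL-style JavaScript with hypothetical names (gl is the context, program the linked shader program); the desktop glGetUniformLocation/glUniform3fv calls are analogous:

// Same eight RGB triples as the initializer, flattened into one array.
const faceRgb = new Float32Array([
  0.0, 0.0, 1.0,   0.0, 1.0, 0.0,   1.0, 0.0, 0.0,
  1.0, 0.0, 1.0,   0.0, 1.0, 1.0,   1.0, 1.0, 0.0,
  0.2, 0.2, 0.2,   0.0, 0.0, 0.0,
]);
gl.useProgram(program);
// "face_rgb[0]" addresses the first element of the uniform array;
// the whole array is uploaded in one call.
gl.uniform3fv(gl.getUniformLocation(program, "face_rgb[0]"), faceRgb);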