How do I use a GLSL shader to apply a radial blur to an entire scene?

I have a radial blur shader in GLSL which takes a texture, applies a radial blur to it, and renders the result to the screen. This works very well so far.
The problem is that this applies the radial blur only to the first texture in the scene. What I actually want to do is apply the blur to the whole scene.
What is the best way to achieve this functionality? Can I do this with only shaders, or do I have to render the scene to a texture first (in OpenGL) and then pass this texture to the shader for further processing?
// Vertex shader
varying vec2 uv;
void main(void)
{
gl_Position = vec4( gl_Vertex.xy, 0.0, 1.0 );
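// sign() snaps each component to -1, 0 or +1, so any input quad becomes a full-screen quad in clip space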
gl_Position = sign( gl_Position );
uv = (vec2( gl_Position.x, - gl_Position.y ) + vec2(1.0) ) / vec2(2.0);
}
// Fragment shader
uniform sampler2D tex;
varying vec2 uv;
const float sampleDist = 1.0;
const float sampleStrength = 2.2;
void main(void)
{
float samples[10];
samples[0] = -0.08;
samples[1] = -0.05;
samples[2] = -0.03;
samples[3] = -0.02;
samples[4] = -0.01;
samples[5] = 0.01;
samples[6] = 0.02;
samples[7] = 0.03;
samples[8] = 0.05;
samples[9] = 0.08;
vec2 dir = 0.5 - uv;
float dist = sqrt(dir.x*dir.x + dir.y*dir.y);
dir = dir/dist;
vec4 color = texture2D(tex,uv);
vec4 sum = color;
for (int i = 0; i < 10; i++)
sum += texture2D( tex, uv + dir * samples[i] * sampleDist );
sum *= 1.0/11.0;
float t = dist * sampleStrength;
t = clamp( t ,0.0,1.0);
gl_FragColor = mix( color, sum, t );
}

This is called "post-processing", because you're applying an effect (here: radial blur) to the whole scene after it has been rendered.
So yes, you're right: the usual way to do post-processing is the following (a rough setup sketch follows the steps):
create a screen-sized texture (an NPOT GL_TEXTURE_2D works on GL 2.0+ hardware; GL_TEXTURE_RECTANGLE is the older alternative),
create an FBO and attach the texture to it,
bind this FBO and render the scene into it,
unbind the FBO and draw a full-screen quad sampling the FBO's texture.
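For reference, here is a rough sketch of that setup (GL 3.0-style FBO calls; drawScene, drawFullScreenQuad, radialBlurProgram and the size variables are placeholders, and error checking is omitted):
GLuint sceneTex, fbo;
glGenTextures(1, &sceneTex);
glBindTexture(GL_TEXTURE_2D, sceneTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, screenWidth, screenHeight, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, sceneTex, 0);
// attach a depth renderbuffer here as well if the scene needs depth testing

// pass 1: render the scene into the texture
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glViewport(0, 0, screenWidth, screenHeight);
drawScene();                       // placeholder: your normal scene rendering

// pass 2: back to the default framebuffer, blur the texture onto a full-screen quad
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glUseProgram(radialBlurProgram);   // the radial blur shader from the question
glBindTexture(GL_TEXTURE_2D, sceneTex);
drawFullScreenQuad();              // placeholder: the full-screen quad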
As for the "why", the reason is simple: the scene is rendered in parallel (the fragment shader runs independently for many pixels at once). To compute the radial blur for pixel (x, y) you first need the pre-blur values of the surrounding pixels, and those are not available during the first pass because they are still being rendered.
Therefore you can apply the radial blur only after the whole scene has been rendered, when the fragment shader for fragment (x, y) is able to read any pixel of the scene. That is why you need two rendering passes.

Related

GLSL: Fade 2D grid based on distance from camera

I am currently trying to draw a 2D grid on a single quad using only shaders. I am using SFML as the graphics library and sf::View to control the camera. So far I have been able to draw an anti-aliased, multi-level grid. The first level (blue) outlines a chunk and the second level (grey) outlines the tiles within a chunk.
I would now like to fade grid levels based on the distance from the camera. For example, the chunk grid should fade in as the camera zooms in. The same should be done for the tile grid after the chunk grid has been completely faded in.
I am not sure how this could be implemented as I am still new to OpenGL and GLSL. If anybody has any pointers on how this functionality can be implemented, please let me know.
Vertex Shader
#version 130
out vec2 texCoords;
void main() {
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
texCoords = (gl_TextureMatrix[0] * gl_MultiTexCoord0).xy;
}
Fragment Shader
#version 130
uniform vec2 chunkSize = vec2(64.0, 64.0);
uniform vec2 tileSize = vec2(16.0, 16.0);
uniform vec3 chunkBorderColor = vec3(0.0, 0.0, 1.0);
uniform vec3 tileBorderColor = vec3(0.5, 0.5, 0.5);
uniform bool drawGrid = true;
in vec2 texCoords;
void main() {
vec2 uv = texCoords.xy * chunkSize;
vec3 color = vec3(1.0, 1.0, 1.0);
if(drawGrid) {
float aa = length(fwidth(uv));
vec2 halfChunkSize = chunkSize / 2.0;
vec2 halfTileSize = tileSize / 2.0;
vec2 a = abs(mod(uv - halfChunkSize, chunkSize) - halfChunkSize);
vec2 b = abs(mod(uv - halfTileSize, tileSize) - halfTileSize);
color = mix(
color,
tileBorderColor,
smoothstep(aa, .0, min(b.x, b.y))
);
color = mix(
color,
chunkBorderColor,
smoothstep(aa, .0, min(a.x, a.y))
);
}
gl_FragColor.rgb = color;
gl_FragColor.a = 1.0;
}
You need to split the multiplication in the vertex shader into two parts:
// have a variable to be interpolated per fragment
out vec2 vertex_coordinate;
...
{
// this stores the coordinates of the vertex
// before it is projected (i.e. its "world" coordinates)
vec4 world_position = gl_ModelViewMatrix * gl_Vertex;
vertex_coordinate = world_position.xy;
// get your projected vertex position as before
gl_Position = gl_ProjectionMatrix * world_position;
...
}
Then in the fragment shader you change the color based on the world vertex coordinate and the camera position:
in vec2 vertex_coordinate;
// must be updated every time your camera changes position
uniform vec2 camera_world_position = vec2(64.0, 64.0);
...
{
...
// calculate the distance from the fragment in world coordinates to the camera
float fade_factor = length(camera_world_position - vertex_coordinate);
// make it 1 near the camera and 0 once it's more than 100 units away
fade_factor = clamp(1.0 - fade_factor / 100.0, 0.0, 1.0);
// update your final color with this factor
gl_FragColor.rgb = color * fade_factor;
...
}
A second way to do this is to use the projected coordinate's w component, but I personally prefer to calculate the distance in world units. I did not test this code, so it may contain trivial syntax errors, but if you understand the idea you can apply it in any other way.
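To tie this back to the grid shader from the question, here is a rough sketch of how the fade factors could modulate each grid level inside the if (drawGrid) block, assuming the vertex_coordinate / camera_world_position additions above; the 50/100/200-unit thresholds are arbitrary placeholders:
float camDist = length(camera_world_position - vertex_coordinate);
// chunk grid fades in between 200 and 100 units, tile grid between 100 and 50
float chunkFade = 1.0 - smoothstep(100.0, 200.0, camDist);
float tileFade = 1.0 - smoothstep(50.0, 100.0, camDist);
color = mix(color, tileBorderColor,
            tileFade * smoothstep(aa, 0.0, min(b.x, b.y)));
color = mix(color, chunkBorderColor,
            chunkFade * smoothstep(aa, 0.0, min(a.x, a.y)));
This way the tile grid only starts appearing once the chunk grid is already fully visible.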

OpenGL shapes look darker when camera is below them

I have a problem with rendering my quads in OpenGL. They look darker when translucency is applied if the camera is below a certain point. How can I fix this? The objects are lots of quads with tiny Z differences between them. I have implemented the rendering of translucent objects following this webpage: http://www.alecjacobson.com/weblog/?p=2750
Render code:
double alpha_factor = 0.75;
double alpha_frac = (r_alpha - alpha_factor * r_alpha) / (1.0 - alpha_factor * r_alpha);
double prev_alpha = r_alpha;
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_BLEND);
// quintuple pass to get the rendering of translucent objects, somewhat correct
// reverse render order for getting alpha going!
// 1st pass: only depth checks
glDisable(GL_CULL_FACE);
glDepthFunc(GL_LESS);
r_alpha = 0;
// send alpha for each pass
// reverse order
drawobjects(RENDER_REVERSE);
// 2nd pass: guaranteed back face display with normal alpha
glEnable(GL_CULL_FACE);
glCullFace(GL_FRONT);
glDepthFunc(GL_ALWAYS);
r_alpha = alpha_factor * (prev_alpha + 0.025);
// reverse order
drawobjects(RENDER_REVERSE);
// 3rd pass: depth checked version of fraction of calculated alpha. (minus 1)
glEnable(GL_CULL_FACE);
glCullFace(GL_FRONT);
glDepthFunc(GL_LEQUAL);
r_alpha = alpha_frac + 0.025;
// normal order
drawobjects(RENDER_NORMAL);
// 4th pass: same for back face
glEnable(GL_CULL_FACE);
glCullFace(GL_BACK);
glDepthFunc(GL_ALWAYS);
r_alpha = alpha_factor * (prev_alpha + 0.025);
// reverse order
drawobjects(RENDER_REVERSE);
// 5th pass: just put out the entire thing now
glDisable(GL_CULL_FACE);
glDepthFunc(GL_LEQUAL);
r_alpha = alpha_frac + 0.025;
// normal order
drawobjects(RENDER_NORMAL);
glDisable(GL_BLEND);
r_alpha = prev_alpha;
GLSL shaders:
Vertex shader:
#version 330 core
layout(location = 0) in vec3 vPos_ModelSpace;
layout(location = 1) in vec2 vertexUV;
layout(location = 2) in mat4 model_instance;
out vec2 UV;
out float alpha;
flat out uint alpha_mode;
// model + view + proj matrix
uniform mat4 proj;
uniform mat4 view;
uniform float v_alpha;
uniform uint v_alpha_mode;
void main() {
gl_Position = proj * view * model_instance * vec4(vPos_ModelSpace, 1.0);
// send to frag shader
UV = vertexUV;
alpha = v_alpha;
alpha_mode = v_alpha_mode;
}
Fragment shader:
#version 330 core
// texture UV coordinate
in vec2 UV;
in float alpha;
flat in uint alpha_mode;
out vec4 color;
// Values that stay constant for the whole mesh.
uniform sampler2D texSampler;
void main() {
int amode = int(alpha_mode);
color.rgb = texture(texSampler, UV).rgb;
color.a = alpha;
if(amode == 1)
color.rgb *= alpha;
}
Image when problem happens:
Image comparison for how it should look regardless of my position:
The reason it fades away in the center is that there you are looking at the infinitely thin edges of the planes, so they disappear. As for the brightness change between top and bottom, it comes from how your passes treat surface normals: the dark planes are the ones whose normals face away from the camera, with no camera-facing planes in front of them to lighten them up.
It looks like you are rendering many translucent planes in a cube to approximate a volume. Here is a simple example of volume rendering: https://www.shadertoy.com/view/lsG3D3
http://developer.download.nvidia.com/books/HTML/gpugems/gpugems_ch39.html is a fantastic resource; it explains several different ways to render volumes. For reference, that last example uses a sphere as proxy geometry to raymarch a volume fractal.
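To give a flavour of what those resources describe, here is a rough front-to-back raymarch sketch; this is not the question's code, and the 3D density texture and the ray varyings are assumptions:
uniform sampler3D u_volume;   // assumed: 3D density texture with values in [0,1]
varying vec3 v_rayOrigin;     // assumed: ray entry point on the proxy geometry, in volume space
varying vec3 v_rayDir;        // assumed: normalized view ray in volume space

void main() {
    const int   STEPS    = 64;
    const float stepSize = 1.732 / 64.0;               // longest path through a unit cube / steps
    vec3 pos   = v_rayOrigin;
    vec4 accum = vec4(0.0);
    for (int i = 0; i < STEPS; i++) {
        float density = texture3D(u_volume, pos).r;
        vec4  src     = vec4(vec3(density), density) * stepSize;
        accum.rgb += (1.0 - accum.a) * src.rgb;         // front-to-back compositing
        accum.a   += (1.0 - accum.a) * src.a;
        if (accum.a > 0.99) break;                      // early out once nearly opaque
        pos += v_rayDir * stepSize;
    }
    gl_FragColor = accum;
}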
Happy coding!

OpenGL GLSL blend two textures by arbitrary shape

I have a full-screen quad with two textures.
I want to blend the two textures in an arbitrary shape according to user selection.
For example, at first the quad is 100% texture0 while texture1 is transparent.
If the user selects a region, for example a circle, by dragging the mouse on the quad, then
the circle region should display both texture0 and texture1 blended translucently.
The region not enclosed by the circle should still show only texture0.
Please see the example image; the textures are simplified as solid colors.
For now, I have achieved blending two textures on the quad, but the blending region can only be vertical slices because I use the step() function.
My frag shader:
uniform sampler2D Texture0;
uniform sampler2D Texture1;
uniform float alpha;
uniform float leftBlend;
uniform float rightBlend;
varying vec4 oColor;
varying vec2 oTexCoord;
void main()
{
vec4 first_sample = texture2D(Texture0, oTexCoord);
vec4 second_sample = texture2D(Texture1, oTexCoord);
float stepLeft = step(leftBlend, oTexCoord.x);
float stepRight = step(rightBlend, 1.0 - oTexCoord.x);
if(stepLeft == 1.0 && stepRight == 1.0)
gl_FragColor = oColor * first_sample;
else
gl_FragColor = oColor * (first_sample * alpha + second_sample * (1.0-alpha));
if (gl_FragColor.a < 0.4)
discard;
}
To achieve an arbitrary shape, I assume I need to create an alpha mask texture that is the same size as texture0 and texture1?
Then I pass that texture to the fragment shader and check its values: if the value is 0, show texture0; if the value is 1, blend texture0 and texture1.
Is my approach correct? Can you point me to any samples?
I want effect such as OpenGL - mask with multiple textures
but I want to create mask texture in my program dynamically, and I want to implement blending in GLSL
I have got blending working with a black-and-white mask texture:
uniform sampler2D TextureMask;
vec4 mask_sample = texture2D(TextureMask, oTexCoord);
if (mask_sample.r < 0.5) // threshold instead of an exact float comparison
gl_FragColor = first_sample;
else
gl_FragColor = (first_sample * alpha + second_sample * (1.0-alpha));
Right now the mask texture is loaded statically from an image on disk; I just need to create the mask texture dynamically in OpenGL.
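One way to create the mask at runtime is to keep a CPU-side byte buffer that you paint into while the user drags, and upload it whenever it changes. A sketch; maskWidth/maskHeight are placeholders, and GL_R8/GL_RED needs a GL 3.0+ context (use GL_LUMINANCE on older contexts):
std::vector<unsigned char> maskPixels(maskWidth * maskHeight, 0); // 0 = show only texture0; needs <vector>

GLuint maskTex;
glGenTextures(1, &maskTex);
glBindTexture(GL_TEXTURE_2D, maskTex);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   // rows are tightly packed single bytes
glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, maskWidth, maskHeight, 0,
             GL_RED, GL_UNSIGNED_BYTE, maskPixels.data());
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// while the user drags: set the covered maskPixels to 255 (e.g. a filled circle
// around the mouse position), then push the change to the GPU
glBindTexture(GL_TEXTURE_2D, maskTex);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, maskWidth, maskHeight,
                GL_RED, GL_UNSIGNED_BYTE, maskPixels.data());
An alternative that avoids the per-frame upload is to attach the mask texture to an FBO and draw the brush shapes into it directly on the GPU.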
Here's one approach and a sample.
Create a boolean test for whether you want to blend.
In my sample, I use the equation of a circle centered on the screen.
Then blend (I blended by a weighted addition of the two colors).
(Note: I didn't have texture coordinates to work with in this sample, so I used the screen resolution to determine the circle's position.)
uniform vec2 resolution;
void main( void ) {
vec2 position = gl_FragCoord.xy / resolution;
// test if we're "in" or "out" of the blended region
// let's use a circle of radius 0.5, but you can make this more complex and/or pass the values in from the user
bool isBlended = (position.x - 0.5) * (position.x - 0.5) +
(position.y - 0.5) * (position.y - 0.5) < 0.25; // inside the circle -> blend
vec4 color1 = vec4(1,0,0,1); // this could come from texture 1
vec4 color2 = vec4(0,1,0,1); // this could come from texture 2
vec4 finalColor;
if (isBlended)
{
// blend
finalColor = color1 * 0.5 + color2 * 0.5;
}
else
{
// don't blend
finalColor = color1;
}
gl_FragColor = finalColor;
}
See the sample running here: http://glsl.heroku.com/e#18231.0
Update:
Here's another sample using mouse position to determine the position of the blended area.
To run it, paste the code into this sandbox site: https://www.shadertoy.com/new
This one should work for selections of any shape, as long as the mouse data is set up correctly.
void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
vec2 position = fragCoord.xy;
// test if we're "in" or "out" of the blended region
// let's use a circle of radius 10px around the mouse, but you can make this more complex and/or pass the value in from the user
float diffX = position.x - iMouse.x;
float diffY = position.y - iMouse.y;
bool isBlended = (diffX * diffX) + (diffY * diffY) < 100.0;
vec4 color1 = vec4(1,0,0,1); // this could come from texture 1
vec4 color2 = vec4(0,1,0,1); // this could come from texture 2
vec4 finalColor;
if (isBlended)
{
// blend
finalColor = color1 * 0.5 + color2 * 0.5;
}
else
{
// don't blend
finalColor = color1;
}
fragColor = finalColor;
}

OpenGL texture interpolation in shader

How are texture coordinates interpolated in a GLSL shader?
I'm trying to downsample an image from screen size to 1/4 its original size. This is a pretty simple procedure but it doesn't seem to be correct. I have an FBO, with a single colour attachment to use as a target that is 1/4 screen size.
In order to downsample the texture, I draw a simple full-screen quad whose coordinates are as follows:
v[0].v.position.x = -1.f; v[0].v.position.y = -1.f; v[0].v.position.z = 0.f;
v[1].v.position.x = -1.f; v[1].v.position.y = 1.f; v[1].v.position.z = 0.f;
v[2].v.position.x = 1.f; v[2].v.position.y = -1.f; v[2].v.position.z = 0.f;
v[3].v.position.x = 1.f; v[3].v.position.y = 1.f; v[3].v.position.z = 0.f;
Now, my vertex shader simply scales and biases the input positions to generate texture coordinates, passing the vertices through without transforming them and without my having to send texture coordinates in with the geometry:
#version 420
layout(location = 0) in vec3 attrib_Position;
out vec2 attrib_Fragment_Texture;
void main()
{
attrib_Fragment_Texture = attrib_Position.xy * 0.5 + 0.5;
gl_Position = vec4(attrib_Position.xy, 0.0, 1.0);
}
So the texture coordinates output as attrib_Fragment_Texture should go from 0.0 to 1.0 as the vertex coordinates go from -1 to 1. A little test of this with the following shader, however, seems to show that uv.x and uv.y only go from 0.0 to 0.25. I expected them to interpolate from 0.0 to 1.0!
#version 420
in vec2 attrib_Fragment_Texture;
out vec4 Out_Colour;
void main(void)
{
vec2 uv = attrib_Fragment_Texture.xy;
Out_Colour = vec4(uv.x, uv.y, 1.0, 1.0);
}
Can anyone spot what my obviously simple mistake/misunderstanding might be?
I suspect you're rendering with the original viewport, which is 4x the size of your FBO, so only the quarter of the quad that falls inside the render target actually produces stored fragments, and that quarter covers texture coordinates 0.0 to 0.25. Call glViewport with the FBO's dimensions before drawing into it.
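If that's the case, the fix is simply to make the viewport match whatever you are currently rendering into, something like the following sketch (downsampleFbo and the size variables are placeholders):
// render the downsample pass into the 1/4-size FBO
glBindFramebuffer(GL_FRAMEBUFFER, downsampleFbo);
glViewport(0, 0, screenWidth / 4, screenHeight / 4);
// ... draw the full-screen quad with the downsample shader ...

// restore when rendering to the default framebuffer again
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glViewport(0, 0, screenWidth, screenHeight);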

OpenGL two pass shader effect with FBO render to texture gives noise in result on Windows only

What is the correct way of doing the following:
Render a scene into a texture using an FBO (fbo-a)
Then apply an effect using the texture (tex-a) and render this into another texture (tex-b) using the same fbo (fbo-a)
Then render this second texture, with the applied effect (tex-b) as a full screen quad.
My approach is the one below, but it gives me a texture filled with "noise" on Windows, plus the applied effect (all pixels are randomly colored red, green, blue, white, or black).
I'm using one FBO with two textures attached as GL_COLOR_ATTACHMENT0 (tex-a) and GL_COLOR_ATTACHMENT1 (tex-b).
I bind my FBO and make sure rendering goes into tex-a using glDrawBuffer(GL_COLOR_ATTACHMENT0).
Then I apply the effect in a shader with tex-a bound as a sampler2D on texture unit 1, switch to the second color attachment (glDrawBuffer(GL_COLOR_ATTACHMENT1)), and render a full-screen quad. Everything is now rendered into tex-b.
Then I switch back to the default framebuffer (0) and use tex-b on a full-screen quad to render the result.
Example of the result when applying my shader
This is the shader I'm using. I'm not sure whether the shader itself could be the cause, but maybe the noise is caused by an overflow?
Vertex shader
attribute vec4 a_pos;
attribute vec2 a_tex;
varying vec2 v_tex;
void main() {
mat4 ident = mat4(1.0);
v_tex = a_tex;
gl_Position = ident * a_pos;
}
Fragment shader
uniform int u_mode;
uniform sampler2D u_texture;
uniform float u_exposure;
uniform float u_decay;
uniform float u_density;
uniform float u_weight;
uniform float u_light_x;
uniform float u_light_y;
const int NUM_SAMPLES = 100;
varying vec2 v_tex;
void main() {
if (u_mode == 0) {
vec2 pos_on_screen = vec2(u_light_x, u_light_y);
vec2 delta_texc = vec2(v_tex.st - pos_on_screen.xy);
vec2 texc = v_tex;
delta_texc *= 1.0 / float(NUM_SAMPLES) * u_density;
float illum_decay = 1.0;
for(int i = 0; i < NUM_SAMPLES; i++) {
texc -= delta_texc;
vec4 sample = texture2D(u_texture, texc);
sample *= illum_decay * u_weight;
gl_FragColor += sample;
illum_decay *= u_decay;
}
gl_FragColor *= u_exposure;
}
else if(u_mode == 1) {
gl_FragColor = texture2D(u_texture, v_tex);
gl_FragColor.a = 1.0;
}
}
I've read this FBO article on opengl.org, which describes a feedback loop at the bottom of the article. The description is not completely clear to me, and I'm wondering whether that is exactly what I'm doing here.
Update 1:
Link to source code
Update 2:
When I first set gl_FragColor.rgb = vec3(0.0, 0.0, 0.0); before starting the sampling loop (over NUM_SAMPLES), it works fine. No idea why, though.
The problem is that you're not initializing gl_FragColor, and you're modifying it with the lines
gl_FragColor += sample;
and
gl_FragColor *= u_exposure;
both of which depend on the previous value of gl_FragColor. So you're getting some random junk (whatever happened to be in the register that the shader compiler decided to use for the gl_FragColor computation) added in. This has a strong possibility of working fine on some driver/hardware combinations (because the compiler decided to use a register that was always 0 for some reason) and not on others.
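A minimal fix, sketched here for the u_mode == 0 branch only, is to accumulate into a local variable that is explicitly initialized and write gl_FragColor exactly once at the end:
vec2 pos_on_screen = vec2(u_light_x, u_light_y);
vec2 delta_texc = (v_tex.st - pos_on_screen) / float(NUM_SAMPLES) * u_density;
vec2 texc = v_tex;
float illum_decay = 1.0;
vec4 result = vec4(0.0);                 // explicit initialization
for (int i = 0; i < NUM_SAMPLES; i++) {
    texc -= delta_texc;
    vec4 s = texture2D(u_texture, texc); // renamed from 'sample', which is reserved in newer GLSL versions
    result += s * illum_decay * u_weight;
    illum_decay *= u_decay;
}
gl_FragColor = result * u_exposure;
This is also why initializing gl_FragColor to zero, as in Update 2, makes the noise go away.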