Glow effect on a simple rectangle in OpenGL ES (GLSL)

I would like to create a glow effect on a rectangle:
I don't really know where to start with the fragment shader.
Actually, I would like to achieve this effect on shapes in general (circles, polygons, rectangles). There is no real border color; the edges are just blurry.

One of the ways to do it:
If you have a rectangle defined by 4 lines (4 points) and a model matrix, multiply the 4 points by the model matrix and send them as uniforms to the fragment shader. In the vertex shader, create another varying for the position, which is the input position multiplied by the model matrix only. A radius must also be sent as a uniform.
Now, in the fragment shader, write code for each pair of points representing a line and compute the distance to it. If the distance is smaller than the radius, add a contribution to a color scale for the border. The sum of all 4 contributions is then used as the border color factor:
scale += 1.0-(clamp(currentDistanceToLeftBorder/radius, .0, 1.0));
scale += 1.0-(clamp(currentDistanceToTopBorder/radius, .0, 1.0));
scale += 1.0-(clamp(currentDistanceToRightBorder/radius, .0, 1.0));
scale += 1.0-(clamp(currentDistanceToBottomBorder/radius, .0, 1.0));
Then mix the colors:
color = mix(defaultColor, borderColor, clamp(scale, .0, 1.0));
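A minimal fragment-shader sketch of this approach, assuming the rectangle is axis-aligned in the model-transformed space and described by its two corners; the uniform names (u_rectMin, u_rectMax, u_radius, u_defaultColor, u_borderColor) and the varying v_worldPos are illustrative, not from the original answer:
precision mediump float;

uniform vec2 u_rectMin;        // lower-left corner in the model-transformed space (assumed)
uniform vec2 u_rectMax;        // upper-right corner in the model-transformed space (assumed)
uniform float u_radius;        // width of the glow band
uniform vec4 u_defaultColor;   // fill color
uniform vec4 u_borderColor;    // glow color

varying vec2 v_worldPos;       // input position multiplied by the model matrix in the vertex shader

void main()
{
    // Distance from this fragment to each of the four borders
    float dLeft   = v_worldPos.x - u_rectMin.x;
    float dRight  = u_rectMax.x - v_worldPos.x;
    float dBottom = v_worldPos.y - u_rectMin.y;
    float dTop    = u_rectMax.y - v_worldPos.y;

    // Each border contributes while the fragment is closer to it than u_radius
    float scale = 0.0;
    scale += 1.0 - clamp(dLeft   / u_radius, 0.0, 1.0);
    scale += 1.0 - clamp(dRight  / u_radius, 0.0, 1.0);
    scale += 1.0 - clamp(dBottom / u_radius, 0.0, 1.0);
    scale += 1.0 - clamp(dTop    / u_radius, 0.0, 1.0);

    gl_FragColor = mix(u_defaultColor, u_borderColor, clamp(scale, 0.0, 1.0));
}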

Related

How does the coordinate system work for 3D textures in OpenGL?

I am attempting to write to and read from a 3D texture, but it seems my mapping is wrong. I have used RenderDoc to check the textures and they look OK.
A random layer of this volumetric texture looks like:
So, just some blue to denote absence and some green values to denote presence.
The coordinates I use when writing to each layer are calculated in the vertex shader as:
pos.x = (2.f*pos.x-width+2)/(width-2);
pos.y = (2.f*pos.y-depth+2)/(depth-2);
pos.z -= level;
pos.z *= 1.f/voxel_size;
gl_Position = pos;
Since the texture itself looks OK, these coordinates seem good enough to achieve my goal.
It's important to note that right now voxel_size is 1 and the scale of the texture is supposed to be 1:1 with the scene dimensions. In essence, each texel in the texture represents a 1x1x1 voxel in the scene.
Next I attempt to fetch the texture values as follows:
vec3 pos = vertexPos;
pos.x = (2.f*pos.x-width+2)/(width-2);
pos.y = (2.f*pos.y-depth+2)/(depth-2);
pos.z *= 1.f/(4*16);
outColor = texture(voxel_map, pos);
Where vertexPos is the global vertex position in the scene. The z coordinate may be completely wrong (I am not sure whether I am supposed to normalize the depth component or not), but that is not the only issue. If you look at the final result:
There is a horizontal scale problem. Since each texel represents a voxel, the color of a cube should always be a single fixed color. But as you can see, I am getting multiple colors for a single cube on the top faces, so my horizontal scale is wrong.
What am I doing wrong when fetching the texels from the texture?
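For reference, texture() on a sampler3D expects normalized coordinates in [0, 1] on all three axes, with the center of texel i sampled at (i + 0.5) / size. A minimal sketch of that conventional mapping, assuming hypothetical uniforms u_sceneMin and u_sceneSize that describe the world-space box the texture covers (these names are not from the question):
#version 330 core

uniform sampler3D voxel_map;
uniform vec3 u_sceneMin;   // world-space origin of the voxelized region (assumed)
uniform vec3 u_sceneSize;  // world-space extent of the voxelized region (assumed)

in vec3 vertexPos;         // global vertex position in the scene, as in the question
out vec4 outColor;

void main()
{
    // Map the world position into [0, 1]^3; texture() then maps that range
    // onto the texture, with texel centers at (i + 0.5) / size.
    vec3 uvw = (vertexPos - u_sceneMin) / u_sceneSize;
    outColor = texture(voxel_map, uvw);
}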

How can I add different color overlays to each iteration of the texture in this GLSL shader?

So I'm working on this shader right now, and the goal is to have a series of camera-based tiles, each overlaid with a different color (think Andy Warhol). I've got the tiles working (side issue: the tiles on the ends are currently being cut off by ~50%), but I want to add a color filter to each iteration. I'm looking for the cleanest possible way of doing this. Any ideas?
frag shader:
#define N 3.0 // number of columns
#define M 3. // number of rows
#import "GPUImageWarholFilter.h"
NSString *const kGPUImageWarholFragmentShaderString = SHADER_STRING
(
precision highp float;
varying highp vec2 textureCoordinate;
uniform sampler2D inputImageTexture;
void main()
{
vec4 color = texture2D(inputImageTexture, vec2(fract(textureCoordinate.x * N)/(M/ N), fract(textureCoordinate.y * M) / (M/N)));
gl_FragColor = color;
}
);
Let's say your texture is being applied to a quad and your vertices are:
float quad[] = {
    0.0, 0.0,
    1.0, 0.0,
    0.0, 1.0, // First triangle
    1.0, 0.0,
    1.0, 1.0,
    0.0, 1.0  // Second triangle
};
You can specify texture coordinates which corresponds to each of these vertices and describe the relationship in your vertex array object.
Now, if you quadruple the number of vertices, you'll have access to the midpoints between the extents of the quad (creating four sub-quads), and you can associate (0,0) -> (1,1) texture coordinates with each of these sub-quads. The effect would be a rendering of the full texture in each sub-quad.
You could then do math on your vertices in your vertex shader to calculate a color component to assign to each subquad. The color component would be passed to your fragment shader. Use uniforms to specify number of rows and columns then define the color component based on which cell you've calculated the current vertex to be in.
You could also try to compute everything in the fragment shader, but I think it could get expensive pretty fast. Just a hunch though.
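For what it's worth, here is a minimal sketch of that fragment-shader-only approach; the per-cell tint is derived procedurally from the cell index (a cosine palette chosen purely for illustration), so no extra uniforms are needed:
precision highp float;

varying highp vec2 textureCoordinate;
uniform sampler2D inputImageTexture;

const float N = 3.0;   // number of columns
const float M = 3.0;   // number of rows

void main()
{
    // Which grid cell this fragment falls in, and where inside that cell
    vec2 grid  = textureCoordinate * vec2(N, M);
    vec2 cell  = floor(grid);   // integer cell index: (0..N-1, 0..M-1)
    vec2 local = fract(grid);   // 0..1 coordinate inside the cell

    // Render the full texture once per cell
    vec4 color = texture2D(inputImageTexture, local);

    // Derive an arbitrary tint per cell from its index (cosine palette)
    vec3 tint = 0.5 + 0.5 * cos(6.2831853 * (cell.x / N + cell.y / M) + vec3(0.0, 2.0, 4.0));

    gl_FragColor = vec4(color.rgb * tint, color.a);
}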
You could also try the GL_REPEAT texture parameter, but you'll have less control of the outcome: https://open.gl/textures

How to make radial gradient on each face using shader in OpenGL

Using simple shaders, I've found a way to create gradients.
Here's the result of my work:
http://goo.gl/A7pY01 (A little updated after OpenGL ES 2.0 Shader - 2D Radial Gradient in Polygon question)
It's nice, but I still need to display this gradient pattern on each face of my meshes, or on a billboard face, just as if it were a texture.
The GLSL built-in variable gl_FragCoord holds window-relative coordinates. Could someone explain how to translate this into face-related coordinates and then draw my pattern?
Okay. A little surfing of Stack Overflow gave me this topic: OpenGL: How to render perfect rectangular gradient?
Here is the key line: gl_FragColor = mix(color0, color1, uv.u + uv.v - 2 * uv.u * uv.v);
Of course we cannot translate window-space coordinates into something "face-related", but we can use the UV coordinates of a face. So I decided: what if we have a square face with UV coordinates corresponding to a full-size texture (0,0; 0,1; 1,0; 1,1)? Then the center of the face is (0.5, 0.5), which could be the center of my radial gradient.
So my fragment shader code is:
vec2 u_c = vec2(0.5,0.5);
float distanceFromLight = length(uv - u_c);
gl_FragColor = mix(vec4(1.,0.5,1.,1.), vec4(0.,0.,0.,1.), distanceFromLight*2.0);
Vertex shader:
gl_Position = _mvProj * vec4(vertex, 1.0);
uv = uv1;
Of course, we need to give correct UV coordinates, but the point is understood.
Here's example:
http://goo.gl/A7pY01
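For completeness, a self-contained sketch of the two shaders above; the attribute names vertex and uv1 and the uniform _mvProj are taken from the snippets, while the surrounding declarations are filled in as assumptions:
// Vertex shader
attribute vec3 vertex;
attribute vec2 uv1;
uniform mat4 _mvProj;
varying vec2 uv;

void main()
{
    gl_Position = _mvProj * vec4(vertex, 1.0);
    uv = uv1;   // pass the face's UV coordinates to the fragment shader
}

// Fragment shader
precision mediump float;
varying vec2 uv;

void main()
{
    vec2 u_c = vec2(0.5, 0.5);                    // gradient center in UV space
    float distanceFromLight = length(uv - u_c);   // 0 at the center, ~0.707 in the corners
    gl_FragColor = mix(vec4(1.0, 0.5, 1.0, 1.0),  // inner color
                       vec4(0.0, 0.0, 0.0, 1.0),  // outer color
                       distanceFromLight * 2.0);
}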

GLSL Fragment Position

In my .cpp code I create a list of quads, and a few of them have a flag. In the pixel shader I check whether this flag is set or not. If the flag is not set, the quad gets colored red, for example. If the flag is set, I want to decide the color of every single pixel, so if I need to color half of the flagged quad red and the other half blue I can simply do something like:
if (coordinate in quad < something) color = red;
else color = blue;
This way I can get half of the quad colored blue and the other half colored red, or I can decide where to put the red color and where to put the blue one.
Imagine I've got a quad 50x50 pixels
[frag]
if (quad.flag == 1)
{
    if (Pixel_coordinate.x < 25.0) gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
    else gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);
}
else
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
In this case I would expect that a quad with the flag set will get two colors per face.
I hope I have been more specific now.
Thanks.
Just to add something I can't use any texture.
OK, here is what I do now:
Every quad has 4 texture coordinates: (0,0), (0,1), (1,1), (1,0).
I enable the texture coordinates using:
glTexCoordPointer(2, GL_SHORT, sizeof(Vertex), BUFFER_OFFSET(sizeof(float) * 7));
[vert]
varying vec2 texCoord;
void main()
{
texCoord = gl_MultiTexCoord0.xy;
}
[frag]
varying vec2 texCoord;
void main()
{
float x1 = texCoord.s;
float x2 = texCoord.t;
gl_FragColor = vec4(x1, x2, 0.0, 1.0);
}
I always get the yellow color, so x1 = 1 and x2 = 1 almost everywhere, and some quads are yellow/green.
I would expect the texture coordinates to vary across the quad in the fragment shader, so I should get a gradient. Am I wrong?
If you want to know the coordinate within the quad, you need to calculate it yourself. In order to do that, you'll need to create a new interpolant (call it something like vec2 quadCoord) and set it appropriately for each vertex, which means you'll likely also need to add it as an attribute and pass it through your vertex shader, e.g.:
// in the vertex shader
attribute vec2 quadCoordIn;
varying vec2 quadCoord;
void main() {
quadCoord = quadCoordIn;
:
You'll need to feed this attribute in your drawing code when drawing your quads. For each quad, the vertices will likely have quadCoordIn values of (0,0), (0,1), (1,1), and (1,0) -- you could use some other coordinate system if you prefer, but this is the easiest.
Then, in your fragment program, you can access quadCoord.xy to determine where in the quad you are.
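A minimal sketch of the complete pair, filling in the parts the snippet elides; the position attribute a_position and the matrix uniform u_mvp are assumptions, not from the answer:
// Vertex shader
attribute vec4 a_position;     // assumed position attribute
attribute vec2 quadCoordIn;    // (0,0), (0,1), (1,1), (1,0) per quad corner
uniform mat4 u_mvp;            // assumed model-view-projection matrix
varying vec2 quadCoord;

void main()
{
    quadCoord = quadCoordIn;
    gl_Position = u_mvp * a_position;
}

// Fragment shader
precision mediump float;
varying vec2 quadCoord;

void main()
{
    // Left half of the quad red, right half green, independent of screen position
    gl_FragColor = (quadCoord.x < 0.5) ? vec4(1.0, 0.0, 0.0, 1.0)
                                       : vec4(0.0, 1.0, 0.0, 1.0);
}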
In addition to Chris Dodd's answer, you can also access the screen-space coordinate of the currently processed fragment (in pixels, though actually at pixel centers, so values ending in .5) through the special fragment shader variable gl_FragCoord:
gl_FragColor = (gl_FragCoord.x<25.0) ? vec4(1.0, 0.0, 0.0, 1.0) : vec4(0.0, 1.0, 0.0, 1.0);
But this gives you the position of the fragment in screen space, and thus relative to the lower-left corner of your viewport. If you actually need to know the position inside the individual quad (which makes more sense if you want to color each quad half-by-half, since the "half-cut" would otherwise vary with the quad's position), then Chris Dodd's answer is the correct approach.

What's wrong with this shader for a centered zooming effect in Orthographic projection?

I've created a basic orthographic shader that displays sprites from textures. It works great.
I've added a "zoom" factor to it to allow the sprite to scale larger or smaller. Assuming the texture is anchored with its origin in the lower left, what it does is shrink towards that origin point, or expand from it towards the upper right. What I actually want is to shrink or expand "in place", staying centered.
So, one way of achieving that would be to figure out how many pixels I'll shrink or expand by, and compensate. I'm not quite sure how I'd do that, and I also know it's not the best way. I fiddled with the order of my translates and scales, thinking I could scale first and then place, but I just get various bad results. I can't wrap my head around a way to solve the issue.
Here's my shader:
// Set up orthographic projection (960 x 640)
mat4 projectionMatrix = mat4( 2.0/960.0, 0.0, 0.0, -1.0,
0.0, 2.0/640.0, 0.0, -1.0,
0.0, 0.0, -1.0, 0.0,
0.0, 0.0, 0.0, 1.0);
void main()
{
// Set position
gl_Position = a_position;
// Translate by the uniforms for offsetting
gl_Position.x += translateX;
gl_Position.y += translateY;
// Apply our (pre-computed) zoom factor to the X and Y of our matrix
projectionMatrix[0][0] *= zoomFactorX;
projectionMatrix[1][1] *= zoomFactorY;
// Translate
gl_Position *= projectionMatrix;
// Pass right along to the frag shader
v_texCoord = a_texCoord;
}
mat4 projectionMatrix =
Matrices in GLSL are constructed column-wise. For a mat4, the first 4 values are the first column, then the next 4 values are the second column and so on.
You transposed your matrix.
Also, what are those -1's for?
For the rest of your question, scaling is not something the projection matrix should be dealing with. Not the kind of scaling you're talking about. Scales should be applied to the positions before you multiply them with the projection matrix. Just like for 3D objects.
You didn't post what your sprite's vertex data is, so there's no way to know for sure. But the way it ought to work is that the vertex positions for the sprite should be centered at the sprite's center (which is wherever you define it to be).
So if you have a 16x24 sprite, and you want the center of the sprite to be offset 8 pixels right and 8 pixels up, then your sprite rectangle should be (-8, -8) to (8, 16) (from a bottom-left coordinate system).
Then, if you scale it, it will scale around the center of the sprite's coordinate system.
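A sketch of that order of operations in the question's vertex shader, keeping the uniform names from the question (translateX, translateY, zoomFactorX, zoomFactorY) and assuming the sprite's vertex positions are already centered on its own origin; the column-major projection matrix below is my reconstruction based on the answer, not the original code:
// Orthographic projection for a 960 x 640 viewport, written column-major
// (each group of four values is one column)
const mat4 projectionMatrix = mat4( 2.0/960.0, 0.0,       0.0,  0.0,
                                    0.0,       2.0/640.0, 0.0,  0.0,
                                    0.0,       0.0,      -1.0,  0.0,
                                   -1.0,      -1.0,       0.0,  1.0);

attribute vec4 a_position;   // sprite-local position, centered on the sprite's origin
attribute vec2 a_texCoord;
uniform float translateX;
uniform float translateY;
uniform float zoomFactorX;
uniform float zoomFactorY;
varying vec2 v_texCoord;

void main()
{
    vec4 pos = a_position;

    // 1. Scale around the sprite's own center (positions are centered, so this zooms in place)
    pos.x *= zoomFactorX;
    pos.y *= zoomFactorY;

    // 2. Then translate the sprite to where it belongs on screen
    pos.x += translateX;
    pos.y += translateY;

    // 3. Finally apply the projection (matrix on the left of the vector)
    gl_Position = projectionMatrix * pos;

    v_texCoord = a_texCoord;
}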