Drawing a circular arc in GLSL

I already know how to draw a circle. But I'm really stuck on how to actually make it an arc. In the end I would like to be able to specify the start and end angle. Then the arc should start from the bottom, offset by the start angle, and then go until the end angle is reached.
This is how I drew the circle:
vec2 uv = textureCoords * 2.0 - 1.0;
float distance = sqrt(dot(uv,uv));
float OD = 0.7;
float ID = 0.5;
float ODC = smoothstep(OD, OD - 0.01, distance);
float IDC = smoothstep(ID, ID + 0.01, distance);
float alpha = ODC * IDC;
if (alpha < 0.001)
    discard;
FragColor = vec4(0.5, 0.5, 0.5, alpha);
I then tried to fiddle around with:
float g = uv.y / distance;
float sector = 0.5 - g / 2.0;
But I can't seem to get the mapping quite right...

float angle = (atan(uv.x, uv.y) + pi) / (pi * 2.0);
if (alpha < 0.001 || angle < 0.25 || angle > 0.5)
    discard;
Got it, but maybe someone has a better solution.
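For reference, the mapping can be sanity-checked on the CPU. A minimal Python sketch of the same math (GLSL's two-argument atan(uv.x, uv.y) corresponds to math.atan2(x, y); the helper names and the start/end-as-fractions-of-a-turn convention are mine, not from the shader):

```python
import math

def angle01(x, y):
    # GLSL atan(uv.x, uv.y) == math.atan2(x, y);
    # shift the result from [-pi, pi] into [0, 1]
    return (math.atan2(x, y) + math.pi) / (2.0 * math.pi)

def in_arc(x, y, start, end):
    # keep a fragment only if its angle lies in [start, end] (in turns)
    a = angle01(x, y)
    return start <= a <= end
```

With this convention the bottom of the circle maps to 1.0 (wrapping to 0.0 just past it), the left to 0.25, the top to 0.5 and the right to 0.75, so the 0.25..0.5 test above keeps the upper-left quadrant.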

Related

What is the best OpenGL self-shadowing technique?

I chose the quite old, but sufficient method of shadow mapping, which is OK overall, but I quickly discovered some self-shadowing problems:
It seems this problem appears because of the bias offset, which is necessary to eliminate shadow-acne artifacts.
After some googling, it seems that there is no easy solution to this, so I tried some shader tricks which worked, but not very well.
My first idea was to compute the dot product of the light direction vector and the normal vector. If the result is below 0, the angle between the vectors is greater than 90 degrees, so the surface is facing away from the light source and hence is not illuminated. This works well, except the shadows may appear too sharp and hard:
After I was not satisfied with the results, I tried another trick: multiplying the shadow value by the absolute value of the dot product of the light direction and the normal vector (taken from the normal map). It did work (the hard shadows from the previous image got a smooth transition from shadow to the regular diffuse color), except it created another artifact when the normal-map normal points somewhat toward the sun but the face normal does not. It also made self-shadows much brighter (but that is fixable):
Can I do something about it, or should I just choose the lesser evil?
Shader shadows code for example 1:
vec4 fragPosViewSpace = view * vec4(FragPos, 1.0);
float depthValue = abs(fragPosViewSpace.z);
vec4 fragPosLightSpace = lightSpaceMatrix * vec4(FragPos, 1.0);
vec3 projCoords = fragPosLightSpace.xyz / fragPosLightSpace.w;
// transform to [0,1] range
projCoords = projCoords * 0.5 + 0.5;
// get depth of current fragment from light's perspective
float currentDepth = projCoords.z;
// keep the shadow at 0.0 when outside the far_plane region of the light's frustum.
if (currentDepth > 1.0)
{
    return 0.0;
}
// calculate bias (based on depth map resolution and slope)
float bias = max(0.005 * (1.0 - dot(normal, lightDir)), 0.0005);
vec2 texelSize = 1.0 / vec2(textureSize(material.texture_shadow, 0));
const int sampleRadius = 2;
const float sampleRadiusCount = float((sampleRadius * 2 + 1) * (sampleRadius * 2 + 1));
for (int x = -sampleRadius; x <= sampleRadius; ++x)
{
    for (int y = -sampleRadius; y <= sampleRadius; ++y)
    {
        float pcfDepth = texture(material.texture_shadow, vec3(projCoords.xy + vec2(x, y) * texelSize, layer)).r;
        shadow += (currentDepth - bias) > pcfDepth ? ambientShadow : 0.0;
    }
}
shadow /= sampleRadiusCount;
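The slope-scaled bias line is easy to verify in isolation. A small Python sketch of just that formula (only the arithmetic, not the shader): on surfaces facing the light (n·l near 1) the bias drops to the 0.0005 floor, and on grazing surfaces (n·l near 0) it grows toward 0.005.

```python
def shadow_bias(n_dot_l):
    # larger bias on steep slopes (n.l near 0) to avoid acne there,
    # small constant floor on surfaces facing the light
    return max(0.005 * (1.0 - n_dot_l), 0.0005)
```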
Hard self shadows trick code:
float shadow = 0.0f;
float ambientShadow = 0.9f;
// "Normal" is a face normal vector, "normal" is calculated based on normal map. I know there is a naming problem with that))
float faceNormalDot = dot(Normal, lightDir);
float vectorNormalDot = dot(normal, lightDir);
if (faceNormalDot <= 0.0 || vectorNormalDot <= 0.0)
{
    shadow = max(abs(vectorNormalDot), ambientShadow);
}
else
{
    vec4 fragPosViewSpace = view * vec4(FragPos, 1.0);
    float depthValue = abs(fragPosViewSpace.z);
    ...
}
Dot product multiplication trick code:
float shadow = 0.0f;
float ambientShadow = 0.9f;
float faceNormalDot = dot(Normal, lightDir);
float vectorNormalDot = dot(normal, lightDir);
if (faceNormalDot <= 0.0 || vectorNormalDot <= 0.0)
{
    shadow = ambientShadow * abs(vectorNormalDot);
}
else
{
    vec4 fragPosViewSpace = view * vec4(FragPos, 1.0);
    float depthValue = abs(fragPosViewSpace.z);
    ...
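The difference between the two tricks is only the shadow value assigned on back-facing surfaces; a quick Python comparison of just those two assignments (function names are mine):

```python
def hard_trick(vector_normal_dot, ambient_shadow=0.9):
    # example 1: never below ambient_shadow -> abrupt, hard shadow edge
    return max(abs(vector_normal_dot), ambient_shadow)

def smooth_trick(vector_normal_dot, ambient_shadow=0.9):
    # example 2: scales with the dot product -> soft transition,
    # but much brighter self-shadows near vector_normal_dot == 0
    return ambient_shadow * abs(vector_normal_dot)
```

Near the terminator (dot product close to 0) the first trick still returns 0.9 while the second fades toward 0, which is exactly the hard-versus-soft trade-off described above.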

How to render a smooth ellipse?

I'm trying to render an ellipse where I can decide how hard the edge is.
(It should also be tileable, i.e. I must be able to split the rendering into multiple textures with a given offset)
I came up with this:
float inEllipseSmooth(vec2 pos, float width, float height, float smoothness, float tiling, vec2 offset)
{
    pos = pos / iResolution.xy;
    float smoothnessSqr = smoothness * tiling * smoothness * tiling;
    vec2 center = -offset + tiling / 2.0;
    pos -= center;
    float x = (pos.x * pos.x + smoothnessSqr) / (width * width);
    float y = (pos.y * pos.y + smoothnessSqr) / (height * height);
    float result = (x + y);
    return (tiling * tiling) - result;
}
See here (was updated after comment -> now it's how I needed it):
https://www.shadertoy.com/view/ssGBDK
But at the moment it is not possible to get a completely hard edge. It's also smooth if "smoothness" is set to 0.
One idea was "calculating the distance of the position to the center and comparing that to the corresponding radius", but I think there is probably a better solution.
I was not able to find anything online, maybe I'm just searching for the wrong keywords.
Any help would be appreciated.
I don't yet understand what you are trying to accomplish.
Anyway, I have been playing with shadertoy and I have created something that could help you.
I think the smoothstep GLSL function is what you need, plus inner and outer ratios to set the limits of the solid inner region and the soft border.
It is not optimized...
void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    int tiling = 4;
    float width = 0.5;
    float height = 0.2;
    float smoothness = 0.9;
    float outerRatio = 1.0;
    float innerRatio = 0.75;
    vec2 offset = vec2(0.25, 0.75);
    //offset = iMouse.xy / iResolution.xy;
    vec2 center = vec2(0.5, 0.5);
    vec2 axis = vec2(width, height);
    vec2 pos = float(tiling) * (fragCoord.xy);
    pos = mod(pos / iResolution.xy, 1.0);
    pos = mod(pos - offset, 1.0);
    pos = pos - center;
    pos = (pos * pos) / (axis * axis);
    float distance = pos.x + pos.y;
    float alpha;
    if ( distance > outerRatio ) { alpha = 0.0; }
    else if ( distance < innerRatio ) { alpha = 1.0; }
    else { alpha = smoothstep(outerRatio, innerRatio, distance); }
    fragColor = vec4(vec3(alpha), 1.0);
}
Shadertoy multiple ellipses with soft edge and solid inner
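The alpha logic above can be checked on the CPU. A Python sketch of the same branches (note the GLSL spec leaves smoothstep with edge0 >= edge1 formally undefined, so the reversed-edge behavior is written out explicitly here; defaults match the shader's innerRatio/outerRatio):

```python
def smoothstep(edge0, edge1, x):
    # Hermite interpolation, written so reversed edges also work
    t = (x - edge0) / (edge1 - edge0)
    t = min(max(t, 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def ellipse_alpha(distance, inner_ratio=0.75, outer_ratio=1.0):
    # distance = pos.x + pos.y after the squared-axis normalization
    if distance > outer_ratio:
        return 0.0
    if distance < inner_ratio:
        return 1.0
    return smoothstep(outer_ratio, inner_ratio, distance)
```

Pushing innerRatio toward outerRatio shrinks the transition band, which gives the (near) completely hard edge the question asks for.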

GLSL: Why are my calculated normals not working properly?

I am trying to follow the Ray Tracing in One Weekend tutorial and my normals do not look like I expect them to.
float hit_sphere(Sphere sphere, Ray r){
    vec3 oc = r.origin - sphere.center;
    float a = dot(r.direction, r.direction);
    float b = 2.0 * dot(oc, r.direction);
    float c = dot(oc, oc) - sphere.radius * sphere.radius;
    float discriminant = b * b - 4 * a * c;
    if(discriminant > 0){
        return -1;
    }
    else{
        return (-b - sqrt(discriminant)) / (2.0 * a);
    }
}
vec3 at(Ray ray, float p){
    return ray.origin + p * ray.direction;
}
void main()
{
    vec3 camera_origin = vec3(0, 0, 2);
    vec2 st = gl_FragCoord.xy / vec2(x, y);
    Ray r = Ray(camera_origin, normalize(vec3(st.x - 0.5, st.y - 0.5, 1.0)));
    Sphere sphere = {vec3(0, 0, 0), 0.5};
    float p = hit_sphere(sphere, r);
    if(p < 0.0){
        vec3 N = normalize(at(r, p) - sphere.center);
        FragColor = vec4(N.x + 1, N.y + 1, N.z + 1, 1);
    }
    else{
        // FragColor = vec4(st.xy, 1.0, 1.0);
        FragColor = vec4(1 - st.y + 0.7, 1 - st.y + 0.7, 1 - st.y + 0.9, 1.0);
    }
}
Note that the + 1 in each normal color channel is there to make the color difference easier to spot.
This is how my normals look.
However, this is not how I expect these normals to be.
They should be something like this (not exactly like this, but close):
What mistake or overlooked problem is causing this?
Note: moving back and forth doesn't change the situation.
The expression
FragColor = vec4(N.x + 1, N.y + 1, N.z + 1, 1);
creates colors in range [0, 2]. However, the color channels have to be in range [0, 1]:
FragColor = vec4(N.xyz * 0.5 + 0.5, 1);
Note that you can also use abs to represent the normals:
FragColor = vec4(abs(N.xyz), 1);
or even scale with the reciprocal maximum color channel:
vec3 nv_color = abs(N.xyz);
nv_color /= max(nv_color.x, max(nv_color.y, nv_color.z));
FragColor = vec4(nv_color, 1.0);
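The standard N * 0.5 + 0.5 remap is easy to verify numerically: it sends each component from [-1, 1] into the displayable [0, 1] range. A one-line Python check (helper name is mine):

```python
def encode_normal(n):
    # remap each component of a unit normal from [-1, 1] to [0, 1]
    return tuple(c * 0.5 + 0.5 for c in n)
```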
You draw the normal vector when discriminant > 0, in which case p is -1. So you always calculate normalize(at(r, -1) - sphere.center). This is wrong, because p needs to be the distance from the origin to the point on the sphere where the ray hits it.
When the ray hits the sphere, p is >= 0. In this case you want to draw the normal vector:
Change if (p < 0.0) to:
if (p >= 0.0) {
    vec3 N = normalize(at(r, p) - sphere.center);
    FragColor = vec4(N.xyz * 0.5 + 0.5, 1);
}
Likewise, the test discriminant > 0 must be discriminant < 0:
if (discriminant < 0){
    return -1;
}
else {
    return (-b - sqrt(discriminant)) / (2.0 * a);
}
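With both fixes applied (the discriminant test inverted and the hit drawn when p >= 0.0), the logic can be verified on the CPU. A Python port of the corrected hit_sphere and the normal calculation (struct fields flattened into tuples; helper names are mine):

```python
import math

def hit_sphere(center, radius, origin, direction):
    # returns the ray parameter of the nearest hit, or -1.0 on a miss
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    discriminant = b * b - 4.0 * a * c
    if discriminant < 0.0:  # the corrected test: miss when negative
        return -1.0
    return (-b - math.sqrt(discriminant)) / (2.0 * a)

def normal_at(center, origin, direction, p):
    # unit normal at the hit point: (at(r, p) - center) normalized
    point = [o + p * d for o, d in zip(origin, direction)]
    n = [pt - c for pt, c in zip(point, center)]
    length = math.sqrt(sum(x * x for x in n))
    return [x / length for x in n]
```

A ray from (0, 0, 2) straight down the -z axis hits the radius-0.5 sphere at the origin at p = 1.5, with normal (0, 0, 1), matching the question's camera setup.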

Fish-eye warping about mouse position - fragment shader

I'm trying to create a fish-eye effect, but only in a small radius around the mouse position. I've been able to modify this code to center the effect on the mouse position (demo), but I can't figure out where the zooming is coming from. I'm expecting the output to warp the image similarly to this (ignore the color inversion for the sake of this question):
Relevant code:
// Check if within given radius of the mouse
vec2 diff = myUV - u_mouse - 0.5;
float distance = dot(diff, diff); // square of distance, saves a square-root
// Add fish-eye
if (distance <= u_radius_squared) {
    vec2 xy = 2.0 * (myUV - u_mouse) - 1.0;
    float d = length(xy * maxFactor);
    float z = sqrt(1.0 - d * d);
    float r = atan(d, z) / PI;
    float phi = atan(xy.y, xy.x);
    myUV.x = d * r * cos(phi) + 0.5 + u_mouse.x;
    myUV.y = d * r * sin(phi) + 0.5 + u_mouse.y;
}
vec3 tex = texture2D(tMap, myUV).rgb;
gl_FragColor.rgb = tex;
This is my first shader, so other improvements besides fixing this issue are also welcome.
Compute the vector from the current fragment to the mouse and the length of the vector:
vec2 diff = myUV - u_mouse;
float distance = length(diff);
The new texture coordinate is the sum of the mouse position and the scaled direction vector:
myUV = u_mouse + normalize(diff) * u_radius * f(distance/u_radius);
For instance:
uniform float u_radius;
uniform vec2 u_mouse;
void main()
{
    vec2 diff = myUV - u_mouse;
    float distance = length(diff);
    if (distance <= u_radius)
    {
        float scale = (1.0 - cos(distance / u_radius * PI * 0.5));
        myUV = u_mouse + normalize(diff) * u_radius * scale;
    }
    vec3 tex = texture2D(tMap, myUV).rgb;
    gl_FragColor = vec4(tex, 1.0);
}
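The key property of this remap is that the scale function is 0 at the mouse and 1 at the radius, so the effect blends seamlessly into the unwarped image at the rim. A Python sketch of the same math (my own helper; the distance == 0 guard stands in for the undefined normalize of a zero vector):

```python
import math

def warp_uv(uv, mouse, radius):
    # fish-eye remap: pull UVs toward the mouse inside the radius
    diff = (uv[0] - mouse[0], uv[1] - mouse[1])
    distance = math.hypot(diff[0], diff[1])
    if distance == 0.0 or distance > radius:
        return uv  # outside the effect, or exactly at the centre
    scale = 1.0 - math.cos(distance / radius * math.pi * 0.5)
    d = (diff[0] / distance, diff[1] / distance)  # normalize(diff)
    return (mouse[0] + d[0] * radius * scale,
            mouse[1] + d[1] * radius * scale)
```

At distance == radius the scale is exactly 1, so rim points map to themselves; points inside are pulled toward the mouse, which is the magnification.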

XNA or OpenGL sphere texture mapping

I'm trying to map a completely ordinary (unwrapped) texture onto a sphere.
I can't change my texture to a wrapped one, so I need to find some mapping function.
This is my vertex shader code:
vec3 north = vec3(0.0, 0.0, 1.0);
vec3 equator = vec3(0.0, 1.0, 0.0);
vec3 northEquatorCross = cross(equator, north);
vec3 vertexRay = normalize(gl_Vertex.xyz);
float phi = acos(vertexRay.z);
float tv = (phi / (PI*tiling));
float tu = 0.0;
if (vertexRay.z == 1.0 || vertexRay.z == -1.0) {
    tu = 0.5;
} else {
    float ang_hor = acos(max(min(vertexRay.y / sin(phi), 1.0), -1.0));
    float temp = ang_hor / ((2.0 * tiling) * PI);
    tu = (vertexRay.x >= 0.0) ? temp : 1.0 - temp;
}
texPosition = vec2(tu, tv);
It's straight from here:
http://blogs.msdn.com/coding4fun/archive/2006/10/31/912562.aspx
This is my fragment shader:
color = texture2D(debugTex, texPosition);
As you can see in this screenshot: http://img189.imageshack.us/img189/4695/sphereproblem.png,
it shows a crack in the sphere... and this is what I'm trying to fix.
(the texture used: http://img197.imageshack.us/img197/56/debug.jpg)
The first comment on the XNA website actually fixes the problem, using:
device.RenderState.Wrap0 = WrapCoordinates.Zero;
But because I don't understand enough about XNA internals, I can't understand what this solves in this particular problem.
Around the web, some people have experienced the same thing and reported it to be caused by interpolation errors; but since I'm implementing this directly in a fragment shader (per-pixel/per-fragment), I shouldn't have this problem (no interpolation of the texture UVs).
Any info/solution on this?
It's funny! I've had the same problem, but it got fixed; I suspect there are some floating-point issues... here is my shader that works!
uniform float r;
void main(void) {
float deg1, deg2, rr, sdeg1, sdeg2, cdeg2;
deg1 = (gl_Vertex.y / " "32" ".0) * 2.0 * 3.1415926;
deg2 = (gl_Vertex.x / " "32" ".0) * 2.0 * 3.1415926;
sdeg1 = sin(deg1);
sdeg2 = sin(deg2);
cdeg2 = cos(deg2);
gl_Vertex.y = r*sdeg1;
rr = r*cos(deg1);
if(rr < 0.0001) rr = 0.0001;
gl_Vertex.x = rr*sdeg2;
gl_Vertex.z = rr*cdeg2;
vec3 vertexRay = normalize(gl_Vertex.xyz);
float phi = acos(vertexRay.y);
float tv = (phi / (3.1415926));
float sphi = sin(phi);
float theta = 0.5;
float temp = vertexRay.z / sphi;
if(temp > 1.0) temp = 1.0;
if(temp < -1.0) temp = -1.0;
theta = acos(temp) / (2.0*3.1415926);
float tu = 0.0;
if(deg2 > 3.1415926) tu = theta;
else tu = 1.0 - theta;
gl_TexCoord[0].x = tu;
gl_TexCoord[0].y = tv;
gl_Position = ftransform();
}
WrapCoordinates.Zero effectively signifies that the texture is wrapped horizontally, not vertically, from the perspective of the texture.
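The crack itself comes from the horizontal texture coordinate wrapping from near 1.0 back to near 0.0 across one strip of triangles: the interpolator then sweeps through almost the entire texture within that thin strip. A Python sketch that reproduces the jump (simplified to longitude only; the helper is mine, not from either shader):

```python
import math

TWO_PI = 2.0 * math.pi

def tu(longitude):
    # longitude in [0, 2*pi) -> horizontal texture coordinate in [0, 1)
    return (longitude % TWO_PI) / TWO_PI

# two neighbouring vertices that straddle the seam
a = tu(TWO_PI - 0.01)  # just under 1.0
b = tu(0.01)           # just over 0.0
# linear interpolation between a and b sweeps ~0.997 of the texture
# across one thin strip of triangles -> the visible "crack"
span = abs(a - b)
```

Duplicating the seam vertices (one column with tu = 0.0, one with tu = 1.0), or doing the longitude-to-tu mapping per fragment instead of per vertex, avoids the sweep; this is consistent with the wrap-state explanation above.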