I am trying to make a grid with a fragment shader, and I am having problems with the UV coordinates. This screenshot shows the first result:
float roundRect(vec2 p, vec2 size, float radius) {
    vec2 d = abs(p) - size;
    return min(max(d.x, d.y), 0.0) + length(max(d, 0.0)) - radius;
}

void main() {
    vec2 uv = gl_FragCoord.xy / u_resolution.xy;
    vec2 f_uv = fract(uv * 22.680);
    float rect = smoothstep(0.040, 0.0, roundRect(f_uv - vec2(0.5), vec2(0.44), 0.040));
    gl_FragColor = vec4(vec3(rect), 1.0);
}
Second:
float roundRect(vec2 p, vec2 size, float radius) {
    vec2 d = abs(p) - size;
    return min(max(d.x, d.y), 0.0) + length(max(d, 0.0)) - radius;
}

void main() {
    vec2 uv = gl_FragCoord.xy / u_resolution.xy;
    vec2 f_uv = fract(uv * 20.680);
    float rect = smoothstep(0.040, 0.0, roundRect(f_uv - vec2(0.5), vec2(0.44), 0.040));
    gl_FragColor = vec4(vec3(rect), 1.0);
}
The only difference between the two screenshots is the line

vec2 f_uv = fract(uv * x);

How can I fix it?
What you see is aliasing, caused by gridlines that are thinner than the pixels themselves and spaced at non-integer pixel intervals.
To fix that you need to band-limit your function. One way of doing this is as follows:
void main() {
    float scale = 22.680;
    vec2 uv = gl_FragCoord.xy / u_resolution.xy * scale;
    float fw = max(fwidth(uv.x), fwidth(uv.y));
    float rect = smoothstep(fw, -fw, roundRect(fract(uv) - vec2(0.5), vec2(0.44), 0.040));
    gl_FragColor = vec4(vec3(rect), 1.0);
}
The results look as follows:
Note that some lines are still blurrier than others, but the only way around that is to ensure that your scale factor is an integer number of pixels.
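For illustration, the same signed-distance and band-limiting math can be checked outside GLSL. The Python sketch below is an illustrative translation, not shader code; `fw` stands in for the value that `fwidth()` would produce on the GPU, and the `grid_mask` helper name is made up for the example.

```python
import math

def round_rect(px, py, sx, sy, radius):
    # Signed distance from point (px, py) to a rounded rectangle
    # centered at the origin with half-size (sx, sy).
    dx, dy = abs(px) - sx, abs(py) - sy
    outside = math.hypot(max(dx, 0.0), max(dy, 0.0))
    return min(max(dx, dy), 0.0) + outside - radius

def smoothstep(edge0, edge1, x):
    # GLSL-style smoothstep: clamped cubic Hermite between the two edges.
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def grid_mask(u, v, fw):
    # Band-limited cell mask: fade the edge over one pixel width `fw`
    # in cell space (in the shader, fw comes from fwidth()).
    d = round_rect(u % 1.0 - 0.5, v % 1.0 - 0.5, 0.44, 0.44, 0.04)
    return smoothstep(fw, -fw, d)
```

A point in the middle of a cell yields a mask of 1.0, and a point in the gap between cells yields 0.0, with a one-pixel-wide transition in between.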
I have tried many times to change the triangle into a circle in vertexShader.glsl, without success. Could you rewrite the code so that the program draws a circle instead of a triangle?
Here is the vertex shader code:
uniform float offsetX;
uniform float offsetY;

void main(void)
{
    if (gl_VertexID == 0)      gl_Position = vec4( 0.25 + offsetX, -0.25 + offsetY, 0.0, 1.0);
    else if (gl_VertexID == 1) gl_Position = vec4(-0.25 + offsetX, -0.25 + offsetY, 0.0, 1.0);
    else                       gl_Position = vec4( 0.0  + offsetX,  0.25 + offsetY, 0.0, 1.0);
}
One option is to draw the inscribed circle of the triangle in the fragment shader.
Compute the coordinates and the radius of the incenter of the triangle (see Incircle and excircles of a triangle) and pass them to the fragment shader:
#version 150

uniform float offsetX;
uniform float offsetY;

out vec2  v_pos;
out vec2  center;
out float radius;

void main(void)
{
    vec2 pts[3] = vec2[3](
        vec2( 0.25, -0.25),
        vec2(-0.25, -0.25),
        vec2( 0.0,   0.25));

    v_pos = pts[gl_VertexID] + vec2(offsetX, offsetY);

    float a = distance(pts[1], pts[2]);
    float b = distance(pts[2], pts[0]);
    float c = distance(pts[0], pts[1]);
    center = (pts[0] * a + pts[1] * b + pts[2] * c) / (a + b + c) + vec2(offsetX, offsetY);

    float s = (a + b + c) / 2.0;
    radius = sqrt((s - a) * (s - b) * (s - c) / s);

    gl_Position = vec4(v_pos, 0.0, 1.0);
}
Then discard the fragments outside the incircle in the fragment shader:
#version 150

out vec4 out_color;

in vec2  v_pos;
in vec2  center;
in float radius;

void main(void)
{
    if (distance(v_pos, center) > radius)
    {
        discard;

        // debug
        //out_color = vec4(0.5, 0.5, 0.5, 1.0);
        //return;
    }
    out_color = vec4(1.0, 0.0, 0.0, 1.0);
}
I'm creating a 2D top-down game in SFML where I'd like the player to only be able to see things within their 45-degree FOV. Currently my fragment shader looks as follows:
uniform sampler2D texture;
uniform vec2 pos;
uniform vec2 screenSize;
uniform float in_angle;

void main()
{
    vec2 fc = gl_FragCoord.xy / screenSize;
    float fov = radians(45.0);

    vec2 ndcCoords = (pos + (screenSize / 2.0)) / screenSize;
    ndcCoords.y = abs(ndcCoords.y - 1.0);

    float angle = radians(-in_angle + 90.0 + 45.0);
    float coT = cos(angle);
    float siT = sin(angle);

    vec2 adj = vec2(0.0);
    adj.x = coT * ndcCoords.x - siT * ndcCoords.y;
    adj.y = siT * ndcCoords.x + coT * ndcCoords.y;

    vec2 diff = normalize(ndcCoords - fc);
    float dist = acos(dot(diff, normalize(adj)));

    vec3 color = vec3(0.0);
    if (dist < fov / 2.0)
    {
        color = vec3(1.0);
    }
    gl_FragColor = vec4(color, 1.0) * texture2D(texture, gl_TexCoord[0].xy);
}
What this does is adjust and rotate the player-position vec2 so I can determine which fragcoords are within the player's FOV. However, when I move down without moving the mouse from directly above the player, the FOV shifts to the left or right without the player rotating at all. I've tried every solution I can think of, but I can't seem to stop it, nor can I find a solution online. Any suggestions would be appreciated.
A solution has arisen. Instead of trying to rotate the object to get its normalized direction vector, a simpler method is to calculate the angle between a fragment and the player position, then subtract the player's rotation in radians to get the difference, as shown below. This can then be adjusted for the coordinate space and compared against the player's FOV.
void main()
{
    float fov = radians(45.0);
    float pi = radians(180.0);
    vec3 color = vec3(0.2);

    vec2 st = gl_FragCoord.xy / screenSize.xy;
    vec2 p = (pos + (screenSize / 2.0)) / screenSize;  // uniforms cannot be reassigned

    // GLSL's two-argument atan plays the role of atan2
    float angleToObj = atan(st.y - p.y, st.x - p.x);
    float angleDiff = angleToObj - radians(-in_angle);

    // wrap the difference into [-pi, pi]
    if (angleDiff > pi)
        angleDiff -= 2.0 * pi;
    if (angleDiff < -pi)
        angleDiff += 2.0 * pi;

    if (abs(angleDiff) < fov / 2.0)
        color = vec3(1.0);

    gl_FragColor = vec4(color, 1.0) * texture2D(texture, gl_TexCoord[0].xy);
}
I'm attempting to implement normal mapping into my glsl shaders for the first time. I've written an ObjLoader that calculates the Tangents and Bitangents - I then pass the relevant information to my shaders (I'll show code in a bit). However, when I run the program, my models end up looking like this:
Looks great, I know, but not quite what I am trying to achieve!
I understand that I should be simply calculating direction vectors and not moving the vertices - but it seems somewhere down the line I end up making that mistake.
I am unsure if I am making the mistake when reading my .obj file and calculating the tangent/bitangent vectors, or if the mistake is happening within my Vertex/Fragment Shader.
Now for my code:
In my ObjLoader - when I come across a face, I calculate the deltaPositions and deltaUv vectors for all three vertices of the face - and then calculate the tangent and bitangent vectors:
I then organize the vertex data collected to construct my list of indices - and in that process I restructure the tangent and bitangent vectors to respect the newly constructed indice list.
Lastly, I perform orthogonalization and calculate the final bitangent vector.
After binding the VAO, VBO, IBO, and passing all the information respectively - my shader calculations are as follows:
Vertex Shader:
void main()
{
    // Output position of the vertex, in clip space
    gl_Position = MVP * vec4(pos, 1.0);

    // Position of the vertex, in world space
    v_Position = (M * vec4(pos, 0.0)).xyz;

    vec4 bitan = V * M * vec4(bitangent, 0.0);
    vec4 tang  = V * M * vec4(tangent, 0.0);
    vec4 norm  = vec4(normal, 0.0);

    mat3 TBN = transpose(mat3(tang.xyz, bitan.xyz, norm.xyz));

    // Vector that goes from the vertex to the camera, in camera space
    vec3 vPos_cameraspace = (V * M * vec4(pos, 1.0)).xyz;
    camdir_cameraspace = normalize(-vPos_cameraspace);

    // Vector that goes from the vertex to the light, in camera space
    vec3 lighPos_cameraspace = (V * vec4(lightPos_worldspace, 0.0)).xyz;
    lightdir_cameraspace = normalize(lighPos_cameraspace - vPos_cameraspace);

    v_TexCoord = texcoord;

    lightdir_tangentspace = TBN * lightdir_cameraspace;
    camdir_tangentspace   = TBN * camdir_cameraspace;
}
Fragment Shader:
void main()
{
    // Light Emission Properties
    vec3 LightColor = (CalcDirectionalLight()).xyz;
    float LightPower = 20.0;

    // Cutting out 'black' areas of the texture
    vec4 tempcolor = texture(AlbedoTexture, v_TexCoord);
    if (tempcolor.a < 0.5)
        discard;

    // Material Properties
    vec3 MaterialDiffuseColor = tempcolor.rgb;
    vec3 MaterialAmbientColor = material.ambient * MaterialDiffuseColor;
    vec3 MaterialSpecularColor = vec3(0, 0, 0);

    // Local normal, in tangent space
    vec3 TextureNormal_tangentspace = normalize(texture(NormalTexture, v_TexCoord)).rgb;
    TextureNormal_tangentspace = (TextureNormal_tangentspace * 2.0) - 1.0;

    // Distance to the light
    float distance = length(lightPos_worldspace - v_Position);

    // Normal of the computed fragment, in camera space
    vec3 n = TextureNormal_tangentspace;
    // Direction of the light (from the fragment)
    vec3 l = normalize(TextureNormal_tangentspace);
    // Angle between normal and light
    float cosTheta = clamp(dot(n, l), 0, 1);

    // Eye vector (towards the camera)
    vec3 E = normalize(camdir_tangentspace);
    // Direction in which the triangle reflects the light
    vec3 R = reflect(-l, n);
    // Angle between eye vector and reflect vector
    float cosAlpha = clamp(dot(E, R), 0, 1);

    color =
        MaterialAmbientColor +
        MaterialDiffuseColor * LightColor * LightPower * cosTheta / (distance * distance) +
        MaterialSpecularColor * LightColor * LightPower * pow(cosAlpha, 5) / (distance * distance);
}
I can spot one obvious mistake in your code. The TBN matrix is built from the bitangent, tangent and normal. While the bitangent and tangent are transformed from model space to view space, the normal is not transformed. That makes no sense: all three vectors have to be relative to the same coordinate system:
vec4 bitan = V * M * vec4(bitangent, 0.0);
vec4 tang = V * M * vec4(tangent, 0.0);
vec4 norm = V * M * vec4(normal, 0.0);
mat3 TBN = transpose(mat3(tang.xyz, bitan.xyz, norm.xyz));
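To see why mixing spaces breaks the basis, here is a small Python sketch (illustrative, with a Y-axis rotation standing in for the view*model rotation): rotating only the tangent and bitangent destroys their orthogonality with the untransformed normal, while transforming all three vectors with the same matrix preserves it.

```python
import math

def rotate_y(v, theta):
    # Rotate a 3-vector about the Y axis (a stand-in for the rotational
    # part of the V * M transform in the vertex shader).
    c, s = math.cos(theta), math.sin(theta)
    x, y, z = v
    return (c * x + s * z, y, -s * x + c * z)

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

tangent, bitangent, normal = (1, 0, 0), (0, 1, 0), (0, 0, 1)
theta = math.radians(30)

# Inconsistent: tangent/bitangent moved to view space, normal left behind.
t_v, b_v = rotate_y(tangent, theta), rotate_y(bitangent, theta)
mixed_dot = dot(t_v, normal)   # no longer 0: the TBN basis is broken

# Consistent: all three transformed by the same matrix.
n_v = rotate_y(normal, theta)
fixed_dot = dot(t_v, n_v)      # stays 0: the basis remains orthonormal
```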
I'm trying to implement Oren-Nayar lighting in the fragment shader as shown here.
However, I'm getting some strange lighting effects on the terrain as shown below.
I am currently sending the shader the 'view direction' uniform as the camera's 'front' vector. I am not sure if this is correct, as moving the camera around changes the artifacts.
Multiplying the 'front' vector by the MVP matrix gives a better result, but the artifacts are still very noticeable when viewing the terrain from some angles, particularly in dark areas and around the edges of the screen.
What could be causing this effect?
Artifact example
How the scene should look
Vertex Shader
#version 450

layout(location = 0) in vec3 position;
layout(location = 1) in vec3 normal;

out VS_OUT {
    vec3 normal;
} vert_out;

void main() {
    vert_out.normal = normal;
    gl_Position = vec4(position, 1.0);
}
Tessellation Control Shader

#version 450

layout(vertices = 3) out;

in VS_OUT {
    vec3 normal;
} tesc_in[];

out TESC_OUT {
    vec3 normal;
} tesc_out[];

void main() {
    if (gl_InvocationID == 0) {
        gl_TessLevelInner[0] = 1.0;
        gl_TessLevelInner[1] = 1.0;
        gl_TessLevelOuter[0] = 1.0;
        gl_TessLevelOuter[1] = 1.0;
        gl_TessLevelOuter[2] = 1.0;
        gl_TessLevelOuter[3] = 1.0;
    }
    tesc_out[gl_InvocationID].normal = tesc_in[gl_InvocationID].normal;
    gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;
}
Tessellation Evaluation Shader

#version 450

layout(triangles, equal_spacing) in;

in TESC_OUT {
    vec3 normal;
} tesc_in[];

out TESE_OUT {
    vec3 normal;
    float height;
    vec4 shadow_position;
} tesc_out;

uniform mat4 model_view;
uniform mat4 model_view_perspective;
uniform mat3 normal_matrix;
uniform mat4 depth_matrix;

vec3 lerp(vec3 v0, vec3 v1, vec3 v2) {
    return (
        (vec3(gl_TessCoord.x) * v0) +
        (vec3(gl_TessCoord.y) * v1) +
        (vec3(gl_TessCoord.z) * v2)
    );
}

vec4 lerp(vec4 v0, vec4 v1, vec4 v2) {
    return (
        (vec4(gl_TessCoord.x) * v0) +
        (vec4(gl_TessCoord.y) * v1) +
        (vec4(gl_TessCoord.z) * v2)
    );
}

void main() {
    gl_Position = lerp(
        gl_in[0].gl_Position,
        gl_in[1].gl_Position,
        gl_in[2].gl_Position
    );
    tesc_out.normal = normal_matrix * lerp(
        tesc_in[0].normal,
        tesc_in[1].normal,
        tesc_in[2].normal
    );
    tesc_out.height = gl_Position.y;
    tesc_out.shadow_position = depth_matrix * gl_Position;
    gl_Position = model_view_perspective * gl_Position;
}
Fragment Shader
#version 450

in TESE_OUT {
    vec3 normal;
    float height;
    vec4 shadow_position;
} frag_in;

out vec4 colour;

uniform vec3 view_direction;
uniform vec3 light_position;

#define PI 3.141592653589793

void main() {
    const vec3 ambient = vec3(0.1, 0.1, 0.1);
    const float roughness = 0.8;

    const vec4 water  = vec4(0.0,  0.0,  0.8,  1.0);
    const vec4 sand   = vec4(0.93, 0.87, 0.51, 1.0);
    const vec4 grass  = vec4(0.0,  0.8,  0.0,  1.0);
    const vec4 ground = vec4(0.49, 0.27, 0.08, 1.0);
    const vec4 snow   = vec4(0.9,  0.9,  0.9,  1.0);

    if (frag_in.height == 0.0) {
        colour = water;
    } else if (frag_in.height < 0.2) {
        colour = sand;
    } else if (frag_in.height < 0.575) {
        colour = grass;
    } else if (frag_in.height < 0.8) {
        colour = ground;
    } else {
        colour = snow;
    }

    vec3 normal = normalize(frag_in.normal);
    vec3 view_dir = normalize(view_direction);
    vec3 light_dir = normalize(light_position);

    float NdotL = dot(normal, light_dir);
    float NdotV = dot(normal, view_dir);

    float angleVN = acos(NdotV);
    float angleLN = acos(NdotL);

    float alpha = max(angleVN, angleLN);
    float beta = min(angleVN, angleLN);
    float gamma = dot(view_dir - normal * dot(view_dir, normal), light_dir - normal * dot(light_dir, normal));

    float roughnessSquared = roughness * roughness;
    float roughnessSquared9 = (roughnessSquared / (roughnessSquared + 0.09));

    // calculate C1, C2 and C3
    float C1 = 1.0 - 0.5 * (roughnessSquared / (roughnessSquared + 0.33));
    float C2 = 0.45 * roughnessSquared9;

    if (gamma >= 0.0) {
        C2 *= sin(alpha);
    } else {
        C2 *= (sin(alpha) - pow((2.0 * beta) / PI, 3.0));
    }

    float powValue = (4.0 * alpha * beta) / (PI * PI);
    float C3 = 0.125 * roughnessSquared9 * powValue * powValue;

    // now calculate both main parts of the formula
    float A = gamma * C2 * tan(beta);
    float B = (1.0 - abs(gamma)) * C3 * tan((alpha + beta) / 2.0);

    // put it all together
    float L1 = max(0.0, NdotL) * (C1 + A + B);

    // also calculate interreflection
    float twoBetaPi = 2.0 * beta / PI;
    float L2 = 0.17 * max(0.0, NdotL) * (roughnessSquared / (roughnessSquared + 0.13)) * (1.0 - gamma * twoBetaPi * twoBetaPi);

    colour = vec4(colour.xyz * (L1 + L2), 1.0);
}
First, I plugged your fragment shader into my renderer with my own view/normal/light vectors, and it works perfectly. So the problem has to be in the way you calculate those vectors.
Next, you say that you set view_dir to your camera's front vector. I assume you mean the camera's front vector in world space, which would be incorrect. Since you calculate the dot products with vectors in camera space, view_dir must be in camera space too. An easy way to check is to use vec3(0, 0, 1) as the view direction: if that works, we have found your problem.
However, using (0,0,1) for the view direction is not strictly correct when you do perspective projection, because the direction from the fragment to the camera then depends on the location of the fragment on the screen. The correct formula then would be view_dir = normalize(-pos) where pos is the fragment's position in camera space (that is with model-view matrix applied without the projection). Further, this quantity now depends only on the fragment location on the screen, so you can calculate it as:
view_dir = normalize(vec3(-(gl_FragCoord.xy - frame_size/2) / (frame_width/2), flen))
flen is the focal length of your camera, which you can calculate as flen = cot(fovx/2).
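A Python sketch of that per-fragment formula (illustrative only; the `view_dir` helper and its parameters are made up for the example, and the focal length is derived from an assumed horizontal FOV):

```python
import math

def view_dir(frag_x, frag_y, frame_w, frame_h, fov_x):
    # Per-fragment view direction in camera space, reconstructed from
    # the fragment's screen position (mirrors the gl_FragCoord formula).
    flen = 1.0 / math.tan(fov_x / 2.0)  # focal length, cot(fovx/2)
    x = -(frag_x - frame_w / 2.0) / (frame_w / 2.0)
    y = -(frag_y - frame_h / 2.0) / (frame_w / 2.0)
    v = (x, y, flen)
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)
```

At the center of the screen this reduces to (0, 0, 1), the constant approximation; toward the edges the direction tilts, which is exactly the dependence the constant vector misses.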
I know this is a long dead thread, but I've been having the same problem (for several years), and finally found the solution...
It can be partially solved by fixing the orientation of the surface normals to match the polygon winding direction, but you can also get rid of the artifacts in the shader by changing the following two lines...
float angleVN = acos(cos_nv);
float angleLN = acos(cos_nl);
to this...
float angleVN = acos(clamp(cos_nv, -1.0, 1.0));
float angleLN = acos(clamp(cos_nl, -1.0, 1.0));
Tada!
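The reason the clamp helps: for nearly parallel unit vectors, floating-point rounding can push the dot product slightly outside [-1, 1], where acos is undefined (NaN in GLSL). A quick Python illustration:

```python
import math

# A dot product of two normalized vectors can land just outside [-1, 1]
# due to floating-point rounding.
cos_nv = 1.0 + 1e-15  # stand-in for an out-of-range dot(normal, view_dir)

try:
    math.acos(cos_nv)       # outside acos's domain
    unclamped_ok = True
except ValueError:
    unclamped_ok = False

# Clamping to [-1, 1] first (as in the fixed shader lines) is always safe.
angle = math.acos(max(-1.0, min(1.0, cos_nv)))
```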
I'm trying to make an effect in a fragment shader. This is what I get without any effects:
This is what I get by multiplying the color by a 'gradient':

float fragPosition = gl_FragCoord.y / screenSize.y;
outgoingLight *= fragPosition;

Then I tried dividing instead, but the color gets burned out by the light:

float fragPosition = gl_FragCoord.y / screenSize.y;
outgoingLight /= fragPosition;
And here is the kind of colors/gradient I want (per face, if possible):
EDIT
Here is the vertex shader (I use three JS chunks)
precision highp float;
precision highp int;

#define PHONG

uniform float time;
attribute vec4 data;

varying vec3 vViewPosition;

#ifndef FLAT_SHADED
varying vec3 vNormal;
#endif

$common
$map_pars_vertex
$lightmap_pars_vertex
$envmap_pars_vertex
$lights_phong_pars_vertex
$color_pars_vertex
$morphtarget_pars_vertex
$skinning_pars_vertex
$shadowmap_pars_vertex
$logdepthbuf_pars_vertex

void main() {
    float displacementAmount = data.x;
    int x = int(data.y);
    int y = int(data.z);
    bool edge = bool(data.w);

    $map_vertex
    $lightmap_vertex
    $color_vertex
    $morphnormal_vertex
    $skinbase_vertex
    $skinnormal_vertex
    $defaultnormal_vertex

#ifndef FLAT_SHADED
    vNormal = normalize( transformedNormal );
#endif

    $morphtarget_vertex
    $skinning_vertex
    $default_vertex

    if (edge == false) {
        vec3 displacement = vec3(sin(time * 0.001 * displacementAmount) * 0.2);
        mvPosition = mvPosition + vec4(displacement, 1.0);
        gl_Position = projectionMatrix * mvPosition;
    }

    $logdepthbuf_vertex

    vViewPosition = -mvPosition.xyz;

    $worldpos_vertex
    $envmap_vertex
    $lights_phong_vertex
    $shadowmap_vertex

    vec3 newPosition = position + vec3(mvPosition.xyz);
    gl_Position = projectionMatrix * modelViewMatrix * vec4(newPosition, 1.0);
}
EDIT 2:
After #gamedevelopmentgerm's suggestion to use mix(), here is what I get:
It's much better than before, but is it possible to avoid the black-to-white gradient in the background? I only want blue to white.
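For reference, the suggested mix()-based gradient can be sketched in Python (illustrative only; GLSL's mix() is just linear interpolation, and clamping the blend factor is one way, under these assumptions, to keep the background from overshooting past white into other shades):

```python
def mix(a, b, t):
    # GLSL mix(): component-wise linear interpolation between a and b.
    return tuple(x * (1.0 - t) + y * t for x, y in zip(a, b))

blue, white = (0.0, 0.0, 1.0), (1.0, 1.0, 1.0)

def gradient_color(frag_y, screen_h):
    # Blend blue -> white with screen height; clamping t to [0, 1]
    # keeps the result strictly between the two endpoint colors.
    t = max(0.0, min(1.0, frag_y / screen_h))
    return mix(blue, white, t)
```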