I'm new to this shader world and just want to try something out. I want to do the following:
In a specific radius around the mouse, the texture in the background should rotate by 10°. The mouse coordinates are absolute values, so to work with them I have to normalize them so I can address the right spot in the texture. But somehow this doesn't work right: the rotation works, but I get the color information with a little offset.
I think the problem is the normalize(mouse), but I don't know how to do this right. Here is the shader:
uniform sampler2D tex0;
uniform vec2 mouse;
void main() {
    vec2 pos = vec2(gl_FragCoord.x, gl_FragCoord.y);
    if (distance(pos, mouse) < 90.0) {
        vec2 p = gl_TexCoord[0].st;
        vec2 m = normalize(mouse);
        p.x = m.x + (cos(radians(10.0)) * (p.x - m.x) - sin(radians(10.0)) * (p.y - m.y));
        p.y = m.y + (sin(radians(10.0)) * (p.x - m.x) + cos(radians(10.0)) * (p.y - m.y));
        gl_FragColor = vec4(0.0, m.y, 0.0, 0.0);
        gl_FragColor = texture2D( tex0, p );
    }
    else {
        gl_FragColor.rgb = texture2D( tex0, gl_TexCoord[0].st ).rgb;
        gl_FragColor.a = 1.0;
    }
}
I'm using Cinder to do this:
mShader.uniform( "mouse", Vec2f( m.x, 682 - m.y ) );
Thank you.
To transform absolute mouse coordinates into the [0,1] range (required for a texture sample), you don't need the normalize function. You need a simple scale:
vec2 mouse_norm = vec2( m.x/screen.width, 1.0 - m.y/screen.height );
I wrote it in GLSL, but it will be easier for you to do this on the CPU side before passing the uniform, because it's the CPU side that knows the screen resolution in the first place.
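Putting it together, here is a minimal sketch of the fixed shader, assuming a hypothetical resolution uniform holding the window size in pixels. Note one more bug beyond the normalization: the original code writes the rotated p.x back before computing p.y, so the second equation mixes rotated and un-rotated values; reading the offset into a temporary first fixes that.
uniform sampler2D tex0;
uniform vec2 mouse;      // absolute pixel coordinates, y already flipped on the CPU
uniform vec2 resolution; // hypothetical uniform: window size in pixels
void main() {
    vec2 p = gl_TexCoord[0].st;
    if (distance(gl_FragCoord.xy, mouse) < 90.0) {
        vec2 m = mouse / resolution;  // a scale, not normalize()
        float s = sin(radians(10.0));
        float c = cos(radians(10.0));
        vec2 d = p - m;               // offset from the pivot, read once
        p = m + vec2(c * d.x - s * d.y,
                     s * d.x + c * d.y);
    }
    gl_FragColor = vec4(texture2D(tex0, p).rgb, 1.0);
}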
I have a very simple shader program that takes in a bunch of position data as GL_POINTS, which generate screen-aligned squares of fragments as usual, with a size depending on depth. In the fragment shader, I wanted to draw a very simple ray-traced sphere for each one, with just the shadow that is on the side of the sphere opposite the light. I went to this shadertoy to try to figure it out on my own. I used the sphIntersect function for ray-sphere intersection, and sphNormal to get the normal vectors on the sphere for lighting. The problem is that the spheres do not align with the squares of fragments, causing them to be cut off. This is because I am not sure how to match the projections of the spheres and the vertex positions so that they line up. Can I have an explanation of how to do this?
Here is a picture for reference.
Here are my vertex and fragment shaders for reference:
//vertex shader:
#version 460
layout(location = 0) in vec4 position; // position of each point in space
layout(location = 1) in vec4 color; //color of each point in space
layout(location = 2) uniform mat4 view_matrix; // projection * camera matrix
layout(location = 6) uniform mat4 cam_matrix; //just the camera matrix
out vec4 col; // color of vertex
out vec4 posi; // position of vertex
void main() {
    vec4 p = view_matrix * vec4(position.xyz, 1.0);
    gl_PointSize = clamp(1024.0 * position.w / p.z, 0.0, 4000.0);
    gl_Position = p;
    col = color;
    posi = cam_matrix * position;
}
//fragment shader:
#version 460
in vec4 col; // color of vertex associated with this fragment
in vec4 posi; // position of the vertex associated with this fragment relative to camera
out vec4 f_color;
layout (depth_less) out float gl_FragDepth;
float sphIntersect( in vec3 ro, in vec3 rd, in vec4 sph )
{
    vec3 oc = ro - sph.xyz;
    float b = dot( oc, rd );
    float c = dot( oc, oc ) - sph.w*sph.w;
    float h = b*b - c;
    if( h < 0.0 ) return -1.0;
    return -b - sqrt( h );
}
vec3 sphNormal( in vec3 pos, in vec4 sph )
{
    return normalize(pos - sph.xyz);
}
void main() {
    vec4 c = clamp(col, 0.0, 1.0);
    vec2 p = ((2.0 * gl_FragCoord.xy) - vec2(1920.0, 1080.0)) / 2.0;
    vec3 ro = vec3(0.0, 0.0, -960.0);
    vec3 rd = normalize(vec3(p.x, p.y, 960.0));
    vec3 lig = normalize(vec3(0.6, 0.3, 0.1));
    vec4 k = vec4(posi.x, posi.y, -posi.z, 2.0 * posi.w);
    float t = sphIntersect(ro, rd, k);
    vec3 ps = ro + (t * rd);
    vec3 nor = sphNormal(ps, k);
    if (t < 0.0) c = vec4(1.0);
    else c.xyz *= clamp(dot(nor, lig), 0.0, 1.0);
    f_color = c;
    gl_FragDepth = t * 0.0001;
}
Looks like you have many spheres, so I would do this:
Input data
I would have a VBO containing x,y,z,r describing your spheres. You will also need your view transform (uniform) that can create the ray direction and start position for each fragment. Something like my vertex shader in here:
Reflection and refraction impossible without recursive ray tracing?
Create a BBOX in the geometry shader and convert your POINT to a QUAD or POLYGON
Note that you have to account for perspective. If you are not familiar with geometry shaders, see:
rendering cubics in GLSL
where I emit a sequence of OBBs from input lines...
In the fragment shader, ray-trace the sphere
You have to compute the intersection between the sphere and the ray, choose the closer intersection, and compute its depth and normal (for lighting). In case of no intersection you have to discard the fragment!
From what I can see in your images, your QUADs do not correspond to your spheres, hence the clipping. You also do not discard fragments with no intersection, so you overwrite already-rendered content around the last rendered sphere with the background color; that leaves only a single sphere visible per QUAD, regardless of how many spheres are really there...
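As a minimal sketch of that last point, with variable names matching the question's fragment shader:
float t = sphIntersect(ro, rd, k);
if (t < 0.0) discard; // no hit: keep whatever was already rendered behind this fragment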
To create a ray direction that matches a perspective matrix from screen space, the following ray direction formula can be used:
vec3 rd = normalize(vec3(((2.0 / screenWidth) * gl_FragCoord.xy) - vec2(aspectRatio, 1.0), -proj_matrix[1][1]));
The value of 2.0 / screenWidth can be precomputed, or the OpenGL built-in uniform structs can be used.
To get a bounding box or other shape for your spheres, it is very important to use camera-facing shapes, not camera-plane-facing shapes. Use the following process, where position is the incoming VBO position data and the w-component of position is the radius:
vec4 p = vec4((cam_matrix * vec4(position.xyz, 1.0)).xyz, position.w);
o.vpos = p;
float l2 = dot(p.xyz, p.xyz);
float r2 = p.w * p.w;
float k = 1.0 - (r2/l2);
float radius = p.w * sqrt(k);
if (l2 < r2) {
    p = vec4(0.0, 0.0, -p.w * 0.49, p.w);
    radius = p.w;
    k = 0.0;
}
vec3 hx = radius * normalize(vec3(-p.z, 0.0, p.x));
vec3 hy = radius * normalize(vec3(-p.x * p.y, p.z * p.z + p.x * p.x, -p.z * p.y));
p.xyz *= k;
Then use hx and hy as basis vectors for any 2D shape that you want the billboard to be shaped like for the vertices. Don't forget later to multiply each vertex by a perspective matrix to get the final position of each vertex. Here is a visualization of the billboarding on desmos using a hexagon shape: https://www.desmos.com/calculator/yeeew6tqwx
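For illustration, a vertex of such a billboard could then be produced like this, where corner is a hypothetical per-vertex attribute in [-1,1]² and proj_matrix is assumed to be the perspective matrix:
vec3 vert = p.xyz + corner.x * hx + corner.y * hy; // p, hx, hy from the snippet above
gl_Position = proj_matrix * vec4(vert, 1.0);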
I'm drawing textured quads to the screen in a 2D environment. The quads are used as a tile map. In order to "blend" some of the tiles together, I had the following idea:
A single "grass" tile drawn on top of dirt would render it as a faded circle of grass; faded from probably the quarter point.
If there was a larger area of grass tiles, then the edges would gradually fade from the quarter point that is on the edge of the grass.
So if the entire left-edge of the quad was to be faded, it would have 0 opacity at the left-edge, and then full opacity at one quarter of the width of the quad. Right edge fade would have full opacity at the three-quarters width, and fade down to 0 opacity at the right-most edge.
I figured that flagging each of the 4 corners as "on" or "off" would give the fragment shader enough to work with. However, I can't figure it out.
If corner0 were 0 the result should be something like this for the quad:
If both corner0 and corner1 were 0 then it would look like this:
This is what I have so far:
#version 330
layout(location=0) in vec3 inVertexPosition;
layout(location=1) in vec2 inTexelCoords;
layout(location=2) in vec2 inElementPosition;
layout(location=3) in vec2 inElementSize;
layout(location=4) in uint inCorner0;
layout(location=5) in uint inCorner1;
layout(location=6) in uint inCorner2;
layout(location=7) in uint inCorner3;
smooth out vec2 texelCoords;
flat out vec2 elementPosition;
flat out vec2 elementSize;
flat out uint corner0;
flat out uint corner1;
flat out uint corner2;
flat out uint corner3;
void main()
{
    gl_Position = vec4(inVertexPosition.x,
                       -inVertexPosition.y,
                       inVertexPosition.z, 1.0);
    texelCoords = vec2(inTexelCoords.x, 1 - inTexelCoords.y);
    elementPosition.x = (inElementPosition.x + 1.0) / 2.0;
    elementPosition.y = -((inElementPosition.y + 1.0) / 2.0);
    elementSize.x = inElementSize.x / 2.0;
    elementSize.y = -(inElementSize.y / 2.0);
    corner0 = inCorner0;
    corner1 = inCorner1;
    corner2 = inCorner2;
    corner3 = inCorner3;
}
The element position is provided in the range [-1,1]; the corner variables are all either 0 or 1. These are provided on an instance basis, whereas the vertex position and texel coords are provided per vertex. The vertex y-coord is inverted because I work in reverse and just flip it here for ease. ElementSize is on the scale of [0,2], so I'm just converting it to the [0,1] range.
The UV coords could be any values, not necessarily [0,1].
Here's the fragment shader:
#version 330
precision highp float;
layout(location=0) out vec4 frag_colour;
smooth in vec2 texelCoords;
flat in vec2 elementPosition;
flat in vec2 elementSize;
flat in uint corner0;
flat in uint corner1;
flat in uint corner2;
flat in uint corner3;
uniform sampler2D uTexture;
const vec2 uScreenDimensions = vec2(600,600);
void main()
{
    vec2 uv = texelCoords;
    vec4 c = texture(uTexture, uv);
    frag_colour = c;
    vec2 fragPos = gl_FragCoord.xy / uScreenDimensions;
    // What can I do using the fragPos, elementPos??
}
Basically, I'm not sure what I can do using fragPos and elementPosition to fade pixels toward a corner if that corner is 0 instead of 1. I understand that it should be based on the distance of the fragment from the corner position... but I can't work it out. I added elementSize because I think it's needed to determine how far from the corner a given fragment is...
To achieve a fading effect, you have to use Blending. You have to set the alpha channel of the fragment color dependent on a scale factor:
frag_colour = vec4(c.rgb, c.a * scale);
scale has to be computed dependent on the texture coordinates (uv). If a coordinate is in the range [0.0, 0.25] or [0.75, 1.0], then the texture has to be faded dependent on the corresponding cornerX variable. In the following, the variable uv is assumed to be a 2-dimensional vector in the range [0, 1].
Compute linear gradients for the left, right, bottom and top sides, dependent on uv:
float gradL = min(1.0, uv.x * 4.0);
float gradR = min(1.0, (1.0 - uv.x) * 4.0);
float gradT = min(1.0, uv.y * 4.0);
float gradB = min(1.0, (1.0 - uv.y) * 4.0);
Or compute Hermite gradients by using smoothstep:
float gradL = smoothstep(0.0, 0.25, uv.x);
float gradR = 1.0 - smoothstep(0.75, 1.0, uv.x);
float gradT = smoothstep(0.0, 0.25, uv.y);
float gradB = 1.0 - smoothstep(0.75, 1.0, uv.y);
Compute the fade factors for the 4 corners and the 4 sides, dependent on gradL, gradR, gradT, gradB and the corresponding cornerX variables. Finally, take the maximum fade factor:
float fade0 = float(corner0) * max(0.0, 1.0 - dot(vec2(0.707), vec2(gradL, gradT)));
float fade1 = float(corner1) * max(0.0, 1.0 - dot(vec2(0.707), vec2(gradL, gradB)));
float fade2 = float(corner2) * max(0.0, 1.0 - dot(vec2(0.707), vec2(gradR, gradB)));
float fade3 = float(corner3) * max(0.0, 1.0 - dot(vec2(0.707), vec2(gradR, gradT)));
float fadeL = float(corner0) * float(corner1) * (1.0 - gradL);
float fadeB = float(corner1) * float(corner2) * (1.0 - gradB);
float fadeR = float(corner2) * float(corner3) * (1.0 - gradR);
float fadeT = float(corner3) * float(corner0) * (1.0 - gradT);
float fade = max(
    max(max(fade0, fade1), max(fade2, fade3)),
    max(max(fadeL, fadeR), max(fadeB, fadeT)));
At the end, compute the scale and set the fragment color:
float scale = 1.0 - fade;
frag_colour = vec4(c.rgb, c.a * scale);
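For reference, here is a sketch of the whole thing assembled into the fragment shader's main(), using the smoothstep variant; it assumes texelCoords is in [0, 1] per tile, which as the question notes may not hold for arbitrary UVs:
void main()
{
    vec2 uv = texelCoords;
    vec4 c = texture(uTexture, uv);
    // side gradients: 0 at the edge, 1 from the quarter point inward
    float gradL = smoothstep(0.0, 0.25, uv.x);
    float gradR = 1.0 - smoothstep(0.75, 1.0, uv.x);
    float gradT = smoothstep(0.0, 0.25, uv.y);
    float gradB = 1.0 - smoothstep(0.75, 1.0, uv.y);
    // per-corner and per-side fade factors, gated by the corner flags
    float fade0 = float(corner0) * max(0.0, 1.0 - dot(vec2(0.707), vec2(gradL, gradT)));
    float fade1 = float(corner1) * max(0.0, 1.0 - dot(vec2(0.707), vec2(gradL, gradB)));
    float fade2 = float(corner2) * max(0.0, 1.0 - dot(vec2(0.707), vec2(gradR, gradB)));
    float fade3 = float(corner3) * max(0.0, 1.0 - dot(vec2(0.707), vec2(gradR, gradT)));
    float fadeL = float(corner0) * float(corner1) * (1.0 - gradL);
    float fadeB = float(corner1) * float(corner2) * (1.0 - gradB);
    float fadeR = float(corner2) * float(corner3) * (1.0 - gradR);
    float fadeT = float(corner3) * float(corner0) * (1.0 - gradT);
    float fade = max(
        max(max(fade0, fade1), max(fade2, fade3)),
        max(max(fadeL, fadeR), max(fadeB, fadeT)));
    frag_colour = vec4(c.rgb, c.a * (1.0 - fade));
}
Note that the alpha channel only has a visible effect if blending is enabled when the quads are drawn.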
I'm just getting started with shaders and am having trouble drawing a circle based on the cursor position without it appearing as an ellipse.
I'm using the following fragment shader (via Shadertoy):
void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 st = fragCoord.xy / iResolution.xy;
    float m_x = iMouse.x / iResolution.x;
    float m_y = iMouse.y / iResolution.y;
    vec3 m_color = vec3(1.0);
    float mouse_pct = distance(vec2(m_x, m_y), st);
    mouse_pct = step(0.01, mouse_pct);
    m_color = vec3(mouse_pct);
    fragColor = vec4(m_color, 1.0);
}
I can make the ellipse a circle by adding:
st.x *= iResolution.x/iResolution.y;
However, this results in the circle being drawn in the wrong place on the X axis (and this doesn't feel like the right way to do it anyway). I think I'm generally confused about how one would draw a shape that isn't based on the whole canvas, and I'm unsure what I should be searching for to fill that gap in my understanding.
Shadertoy link - you need to click and drag to change the mouse position.
You have to respect the aspect ratio on the vector from the fragment to the center point of the circle:
vec2 dist = vec2(m_x, m_y) - st.xy;
dist.x *= iResolution.x/iResolution.y;
Change your code to something like this:
void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 st = fragCoord.xy / iResolution.xy;
    vec2 dist = iMouse.xy / iResolution.xy - st.xy;
    dist.x *= iResolution.x / iResolution.y;
    float mouse_pct = length(dist);
    mouse_pct = step(0.3, mouse_pct);
    vec3 m_color = vec3(mouse_pct);
    fragColor = vec4(m_color, 1.0);
}
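An alternative that sidesteps aspect correction entirely is to do the comparison in pixel space on both sides, for example (the 100.0 is an arbitrary pixel radius):
float mouse_pct = step(100.0, distance(fragCoord.xy, iMouse.xy));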
I am trying to create a procedural water puddle in WebGL with "water ripples" by vertex displacement.
The problem I'm having is that I get noise I can't explain.
Below is the first-pass vertex shader, where I calculate the vertex positions that I later render to a texture, which I then use in the second pass.
void main() {
    float damping = 0.5;
    vNormal = normal;
    // wave radius
    float timemod = 0.55;
    float ttime = mod(time, timemod);
    float frequency = 2.0 * PI / waveWidth;
    float phase = frequency * 0.21;
    vec4 v = vec4(position, 1.0);
    // Loop through array of start positions
    for (int i = 0; i < 200; i++) {
        float cCenterX = ripplePos[i].x;
        float cCenterY = ripplePos[i].y;
        vec2 center = vec2(cCenterX, cCenterY);
        if (center.x == 0.0 && center.y == 0.0)
            center = normalize(center);
        // wave width
        float tolerance = 0.005;
        radius = sqrt(pow(uv.x - center.x, 2.0) + pow(uv.y - center.y, 2.0));
        // Creating a ripple
        float w_height = (tolerance - (min(tolerance, pow(ripplePos[i].z - radius*10.0, 2.0)))) * (1.0 - ripplePos[i].z/timemod) * 5.82;
        // -2.07 in the end to keep plane at right height. Trial and error solution
        v.z += waveHeight * (1.0 + w_height/tolerance) / 2.0 - 2.07;
        vNormal = normal + v.z;
    }
    vPosition = v.xyz;
    gl_Position = projectionMatrix * modelViewMatrix * v;
}
And here is the first-pass fragment shader that writes to the texture:
void main()
{
    vec3 p = normalize(vPosition);
    p.x = (p.x + 1.0) * 0.5;
    p.y = (p.y + 1.0) * 0.5;
    gl_FragColor = vec4(normalize(p), 1.0);
}
The second vertex shader is a standard passthrough.
The second-pass fragment shader is where I try to calculate the normals to be used for lighting calculations.
void main() {
    float w = 1.0 / 200.0;
    float h = 1.0 / 200.0;
    // Nearest neighbours
    vec3 p0 = texture2D(rttTexture, vUV).xyz;
    vec3 p1 = texture2D(rttTexture, vUV + vec2(-w, 0)).xyz;
    vec3 p2 = texture2D(rttTexture, vUV + vec2( w, 0)).xyz;
    vec3 p3 = texture2D(rttTexture, vUV + vec2( 0, h)).xyz;
    vec3 p4 = texture2D(rttTexture, vUV + vec2( 0, -h)).xyz;
    vec3 nVec1 = p2 - p0;
    vec3 nVec2 = p3 - p0;
    vec3 vNormal = cross(nVec1, nVec2);
    vec3 N = normalize(vNormal);
    float theZ = texture2D(rttTexture, vUV).r;
    //gl_FragColor = vec4(1.,.0,1.,1.);
    //gl_FragColor = texture2D(tDiffuse, vUV);
    gl_FragColor = vec4(vec3(N), 1.0);
}
The result is this:
The image displays the normal map, and the noise I'm referring to is the inconsistency of the blue.
Here is a live demonstration:
http://oskarhavsvik.se/jonasgerling_water_ripple/waterRTT-clean.html
I appreciate any tips and pointers, not only fixes for this problem but for the code in general; I'm here to learn.
After a brief look, it seems like your problem is in storing the x/y positions.
gl_FragColor = vec4(vec3(p0*0.5+0.5), 1.0);
You don't need to store them anyway, because the texel position implicitly gives the x/y values. Just change your normal points to something like this...
vec3 p2 = vec3(1, 0, texture2D(rttTexture, vUV + vec2(w, 0)).z);
Rather than 1 and 0, you will want to use a scale appropriate to the size of your displayed quad relative to the wave height. Anyway, the result now looks like this.
The height/z seems to be scaled by distance from the centre, so I went looking for a normalize() and removed it...
vec3 p = vPosition;
gl_FragColor = vec4(p*0.5+0.5, 1.0);
The normals now look like this...
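Putting both fixes together, the second-pass neighbour sampling could look something like this (a sketch; the ±1 scale is a placeholder that should reflect the quad size relative to the wave height, and the constant 0.5 offset in the stored z cancels out in the differences):
vec3 p1 = vec3(-1.0,  0.0, texture2D(rttTexture, vUV + vec2(-w, 0)).z);
vec3 p2 = vec3( 1.0,  0.0, texture2D(rttTexture, vUV + vec2( w, 0)).z);
vec3 p3 = vec3( 0.0,  1.0, texture2D(rttTexture, vUV + vec2( 0, h)).z);
vec3 p4 = vec3( 0.0, -1.0, texture2D(rttTexture, vUV + vec2( 0, -h)).z);
vec3 N = normalize(cross(p2 - p1, p3 - p4)); // central differences, z points up
gl_FragColor = vec4(N * 0.5 + 0.5, 1.0);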
I'm trying to add a fog effect to my scene in OpenGL 3.3. I tried following this tutorial. However, I can't seem to get the same effect on my screen. All that seems to happen is that my objects get darker, but there's no gray foggy mist on the screen. What could be the problem?
Here's my result.
When it should look like:
Here's my Fragment Shader with multiple light sources. It works fine without any fog. All GLSL variables are set and working correctly.
for (int i = 0; i < NUM_LIGHTS; i++)
{
    float distance = length(lightVector[i]);
    vec3 l;
    // point light
    attenuation = 1.0 / (gLight[i].attenuation.x + gLight[i].attenuation.y * distance + gLight[i].attenuation.z * distance * distance);
    l = normalize(vec3(lightVector[i]));
    float cosTheta = clamp(dot(n, l), 0, 1);
    vec3 E = normalize(eyeVector);
    vec3 R = reflect(-l, n);
    float cosAlpha = clamp(dot(E, R), 0, 1);
    vec3 MaterialDiffuseColor = v_color * materialCoefficients.diffuse;
    vec3 MaterialAmbientColor = v_color * materialCoefficients.ambient;
    lighting += vec3(
        MaterialAmbientColor
        + (MaterialDiffuseColor * gLight[i].color * cosTheta * attenuation)
        + (materialCoefficients.specular * gLight[i].color * pow(cosAlpha, materialCoefficients.shininess))
    );
}
float fDiffuseIntensity = max(0.0, dot(normalize(normal), -gLight[0].position.xyz));
color = vec4(lighting, 1.0f) * vec4(gLight[0].color*(materialCoefficients.ambient+fDiffuseIntensity), 1.0f);
float fFogCoord = abs(eyeVector.z/1.0f);
color = mix(color, fogParams.vFogColor, getFogFactor(fogParams, fFogCoord));
Two things.
First, you should verify that your fogParams.vFogColor value is getting set correctly. The simplest way to do this is to short-circuit the shader: set color to fogParams.vFogColor and return immediately. If the scene is black, then you know your fog color isn't being sent to the shader correctly.
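For example, as a throwaway debug line at the top of main():
color = fogParams.vFogColor; // the whole screen should turn the fog color
return;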
Second, you need to eliminate your skybox. You can simply set glClearColor() to the fog color and not use a skybox at all, since everywhere the skybox would be visible you should be seeing fog instead, right? More advanced usage could modify the skybox shader to blend from fog to the skybox texture depending on the angle of the view direction off of horizontal, so that looking up the sky is (somewhat) visible, while looking horizontally shows only the fog, with a smooth transition between the two.
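A sketch of that skybox fragment shader idea, where skybox is assumed to be the cube-map sampler, dir the interpolated view direction, and the 0.3 blend range an arbitrary choice:
vec3 d = normalize(dir);
float up = smoothstep(0.0, 0.3, d.y); // 0 at the horizon, 1 looking upward
fragColor = mix(fogParams.vFogColor, texture(skybox, d), up);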