How to create noise inside a WebGL shader? - glsl

I'm looking for a way to create some noise in WebGL shaders: noise for vertex displacement or color randomization inside the shader itself. I want it inside the shader so that I don't have to care about what I'm rendering, just to randomize the current fragment's color or move the vertex position a bit.
You can see my shaders code here:
https://github.com/rantiev/webgl-learning/blob/master/stars/index.html
They are plain, simple shaders:
<script id="shader-fs" type="x-shader/x-fragment">
    precision mediump float;

    varying vec2 vTextureCoord;

    uniform sampler2D uSampler;
    uniform vec3 uColor;

    void main(void) {
        vec4 textureColor = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));
        gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
    }
</script>

Here is one way to do it:
First, we calculate a screen-space UV by dividing the clip-space position's xy by its w and remapping it from (-1, +1) to (0, 1).
vec4 projCoords = projection * view * model * attr_pos;
vec2 screenUV = projCoords.xy/projCoords.ww; // (from -1 to 1)
screenUV = screenUV * 0.5 + 0.5; // remap to (0 to 1)
Then, with this UV, we can apply screen-space noise via either a random texture or some math in the shader. Assuming you have generated a random texture, the noise is simply:
vec4 noise = texture2D(u_randomTexture, screenUV);
If you want to compute it in the shader instead, look into implementing something like 2D Perlin noise, or search Shadertoy for examples.
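If you want to avoid a texture altogether, a very common cheap trick is a hash based on sin and fract. This is only a sketch (it is not real Perlin noise, and the magic constants are just the values commonly passed around), but it is enough to randomize a fragment color or jitter a vertex a bit:
float rand(vec2 co) {
    // Pseudo-random value in [0, 1) derived from a 2D coordinate.
    return fract(sin(dot(co, vec2(12.9898, 78.233))) * 43758.5453);
}

// Example usage: slightly darken each fragment by a random amount.
// gl_FragColor = vec4(vec3(1.0 - 0.1 * rand(screenUV)), 1.0);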

Related

GLSL - per-fragment lighting

I have started implementing lighting with several light sources. All the tutorials I have seen ignore the distance between the light source and the object (for example https://learnopengl.com/Lighting/Basic-Lighting), so I wrote my own shader, but I'm not sure it is correct. Please analyze this shader and tell me what is wrong or incorrect in it. I will be very grateful for any help! Below is the shader itself and the results of its work for different values of n and k.
Fragment shader:
#version 130

precision mediump float; // Set the default precision to medium. We don't need as high of a
                         // precision in the fragment shader.

#define MAX_LAMPS_COUNT 8 // Max lamps count.

uniform vec3 u_LampsPos[MAX_LAMPS_COUNT]; // The position of lamps in eye space.
uniform vec3 u_LampsColors[MAX_LAMPS_COUNT];
uniform vec3 u_AmbientColor = vec3(1, 1, 1);
uniform sampler2D u_TextureUnit;
uniform float u_DiffuseIntensivity = 12;
uniform float ambientStrength = 0.1;
uniform int u_LampsCount;

varying vec3 v_Position; // Interpolated position for this fragment.
varying vec3 v_Normal;   // Interpolated normal for this fragment.
varying vec2 v_Texture;  // Texture coordinates.

// The entry point for our fragment shader.
void main() {
    float n = 2;
    float k = 2;

    float finalDiffuse = 0;
    vec3 finalColor = vec3(0, 0, 0);

    for (int i = 0; i < u_LampsCount; i++) {
        // Will be used for attenuation.
        float distance = length(u_LampsPos[i] - v_Position);

        // Get a lighting direction vector from the light to the vertex.
        vec3 lightVector = normalize(u_LampsPos[i] - v_Position);

        // Calculate the dot product of the light vector and vertex normal. If the normal and light vector are
        // pointing in the same direction then it will get max illumination.
        float diffuse = max(dot(v_Normal, lightVector), 0.1);

        // Add attenuation.
        diffuse = diffuse / (1 + pow(distance, n));

        // Calculate final diffuse for fragment
        finalDiffuse += diffuse;

        // Calculate final light color
        finalColor += u_LampsColors[i] / (1 + pow(distance, k));
    }

    finalColor /= u_LampsCount;

    vec3 ambient = ambientStrength * u_AmbientColor;
    vec3 diffuse = finalDiffuse * finalColor * u_DiffuseIntensivity;

    gl_FragColor = vec4(ambient + diffuse, 1) * texture2D(u_TextureUnit, v_Texture);
}
Vertex shader:
#version 130

uniform mat4 u_MVPMatrix; // A constant representing the combined model/view/projection matrix.
uniform mat4 u_MVMatrix;  // A constant representing the combined model/view matrix.

attribute vec4 a_Position; // Per-vertex position information we will pass in.
attribute vec3 a_Normal;   // Per-vertex normal information we will pass in.
attribute vec2 a_Texture;  // Per-vertex texture information we will pass in.

varying vec3 v_Position; // This will be passed into the fragment shader.
varying vec3 v_Normal;   // This will be passed into the fragment shader.
varying vec2 v_Texture;  // This will be passed into the fragment shader.

void main() {
    // Transform the vertex into eye space.
    v_Position = vec3(u_MVMatrix * a_Position);

    // Pass through the texture.
    v_Texture = a_Texture;

    // Transform the normal's orientation into eye space.
    v_Normal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));

    // gl_Position is a special variable used to store the final position.
    // Multiply the vertex by the matrix to get the final point in normalized screen coordinates.
    gl_Position = u_MVPMatrix * a_Position;
}
(Screenshots show the results for n=2 k=2, n=1 k=3, n=3 k=1, and n=3 k=3.)
And if my shader is correct, then what should these parameters (n, k) be called?
By "correct" I assume you mean is the code working as well as it should. These lighting calculations are not by any means physically accurate. Unless you are going for full compatibility with old devices, I would recommend you use a higher glsl version which allows you to use in and out and some other useful glsl features. The current is version 450 and you are still using 130. The vertex shader looks ok, as it is only passing through values to the fragment shader.
As for the fragment shader there are is one optimisation you could make.
The calculation u_LampsPos[i] - v_Position doesn't have to be repeated twice. Do it once and do the length and normalize on the same result from one calculation.
The code is quite small so there is not much to go wrong glsl wise however I was wondering why you did: finalColor /= u_LampsCount;?
This didn't make sense to me.
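For illustration, the loop body with that optimisation applied could look like this (just a sketch of the same code; toLamp is a name I introduced for the shared vector):
for (int i = 0; i < u_LampsCount; i++) {
    // Compute the fragment-to-lamp vector once and reuse it.
    vec3 toLamp = u_LampsPos[i] - v_Position;
    float distance = length(toLamp);
    vec3 lightVector = normalize(toLamp);

    float diffuse = max(dot(v_Normal, lightVector), 0.1);
    diffuse = diffuse / (1.0 + pow(distance, n));

    finalDiffuse += diffuse;
    finalColor += u_LampsColors[i] / (1.0 + pow(distance, k));
}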

Gradient with fixed number of levels

I'm drawing a set of quads, and for each quad I have a color defined at each of its vertices.
E.g. my set of quads currently looks like this:
I achieve this result in a rather primitive way, simply passing the color of each vertex of a quad into the vertex shader as an attribute.
My shaders are pretty simple:
Vertex shader
#version 150 core

in vec3 in_Position;
in vec3 in_Color;
out vec3 pass_Color;

uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;

void main(void) {
    gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(in_Position, 1.0);
    pass_Color = in_Color;
}
Fragment Shader
#version 150 core

in vec3 pass_Color;
out vec4 out_Color;

void main(void) {
    out_Color = vec4(pass_Color, 1.0);
}
Now my goal is to get a non-continuous distribution of colors from vertex to vertex; it could also be called a "level distribution".
My set of quads should look like this:
How can I achieve such result?
EDIT:
With vesan's and Nico Schertler's suggestions the plate looks like this (not an acceptable variant):
My guess is there will be issues with the hue colours you're using and vertex interpolation (e.g. skipping some bands). Instead, maybe pass in a single-channel value and calculate the hue and discrete levels (as vesan does) within the fragment shader. I use these functions myself...
vec3 hueToRGB(float h)
{
    h = fract(h) * 6.0;
    vec3 rgb;
    rgb.r = clamp(abs(3.0 - h) - 1.0, 0.0, 1.0);
    rgb.g = clamp(2.0 - abs(2.0 - h), 0.0, 1.0);
    rgb.b = clamp(2.0 - abs(4.0 - h), 0.0, 1.0);
    return rgb;
}

vec3 heat(float x)
{
    return hueToRGB(2.0/3.0 - (2.0/3.0) * clamp(x, 0.0, 1.0));
}
and then
float discrete = floor(pass_Value * steps + 0.5) / steps; //0.5 to round
out_Color = vec4(heat(discrete), 1.0);
where the input in float in_Value (forwarded to the fragment shader as pass_Value) ranges from 0 to 1.
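For completeness, the vertex-shader side of that single value might look like this (a sketch: in_Value is a hypothetical per-vertex attribute carrying the 0-to-1 value in place of in_Color, and the fragment shader then declares in float pass_Value; and uniform float steps; alongside the functions above):
#version 150 core

in vec3 in_Position;
in float in_Value;      // per-vertex scalar value, 0 to 1
out float pass_Value;

uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;

void main(void) {
    gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(in_Position, 1.0);
    pass_Value = in_Value;
}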
Just expanding on Nico Schertler's comment: you can modify your fragment shader to:
uniform float steps; // number of color levels

void main(void) {
    vec4 color = vec4(pass_Color, 1.0);
    out_Color = floor(color * steps) / steps;
}
where steps is the number of color steps you want. The floor function will indeed work on a vector; however, the steps will be calculated separately for every color channel, so the result might not be exactly what you want (the bands might not be as nice as in your example).
Alternatively, you can use some form of "toon shading" (see for example here). That means that you only pass a single number (think a color in grayscale) to your shader, then use your shader to select a color from a color table. The table can either be hardcoded in the shader or selected from a 1-dimensional texture.
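A sketch of the 1D-texture variant, assuming the application binds a small colour-table texture (one texel per level, NEAREST filtering) to a sampler I am calling u_ColorTable:
#version 150 core

in float pass_Value;              // grayscale value, 0 to 1
out vec4 out_Color;

uniform sampler1D u_ColorTable;   // e.g. an 8-texel 1D texture, one texel per level

void main(void)
{
    // With NEAREST filtering the lookup itself snaps the value to a band.
    out_Color = texture(u_ColorTable, pass_Value);
}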

How to achieve flat shading with light calculated at centroids?

I'd like to write a GLSL shader program for a per-face shading. My first attempt uses the flat interpolation qualifier with provoking vertices. I use the flat interpolation for both normal and position vertex attributes which gives me the desired old-school effect of solid-painted surfaces.
Although the rendering looks correct, the shader program doesn't actually do the right job:
The light calculation is still performed on a per-fragment basis (in the fragment shader),
The position vector is taken from the provoking vertex, not the triangle's centroid (right?).
Is it possible to apply the illumination equation once, to the triangle's centroid, and then use the calculated color value for the whole primitive? How to do that?
Use a geometry shader whose input is a triangle and whose output is a triangle. Pass normals and positions to it from the vertex shader, calculate the centroid yourself (by averaging the positions), and do the lighting, passing the output color as an output variable to the fragment shader, which just reads it in and writes it out.
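A minimal sketch of such a geometry shader, under my own naming: the vertex shader is assumed to forward eye-space positions as vPositionEc, and the light comes in through u_LightPosEc and u_LightColor uniforms:
// geometry shader
#version 150 core
layout(triangles) in;
layout(triangle_strip, max_vertices = 3) out;

in vec3 vPositionEc[];       // eye-space positions from the vertex shader
flat out vec3 gColor;        // one colour for the whole triangle

uniform vec3 u_LightPosEc;
uniform vec3 u_LightColor;

void main() {
    // Centroid and face normal of the triangle, both in eye space.
    vec3 c = (vPositionEc[0] + vPositionEc[1] + vPositionEc[2]) / 3.0;
    vec3 n = normalize(cross(vPositionEc[1] - vPositionEc[0],
                             vPositionEc[2] - vPositionEc[0]));
    vec3 l = normalize(u_LightPosEc - c);

    // Diffuse lighting evaluated once per triangle, at the centroid.
    gColor = u_LightColor * max(dot(n, l), 0.0);

    for (int i = 0; i < 3; ++i) {
        gl_Position = gl_in[i].gl_Position;
        EmitVertex();
    }
    EndPrimitive();
}

// fragment shader
#version 150 core
flat in vec3 gColor;
out vec4 fragColor;

void main() {
    fragColor = vec4(gColor, 1.0);
}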
Another simple approach is to compute the face normal in the fragment shader using screen-space derivatives of the interpolated (eye-space) position. It is very easy to implement and performs well.
I have written an example of it here (requires a WebGL capable browser):
Vertex:
attribute vec3 vertex;

uniform mat4 _mvProj;
uniform mat4 _mv;

varying vec3 fragVertexEc;

void main(void) {
    gl_Position = _mvProj * vec4(vertex, 1.0);
    fragVertexEc = (_mv * vec4(vertex, 1.0)).xyz;
}
Fragment:
#ifdef GL_ES
precision highp float;
#endif
#extension GL_OES_standard_derivatives : enable

varying vec3 fragVertexEc;

const vec3 lightPosEc = vec3(0, 0, 10);
const vec3 lightColor = vec3(1.0, 1.0, 1.0);

void main()
{
    // Reconstruct the face normal from screen-space derivatives of the eye-space position.
    vec3 X = dFdx(fragVertexEc);
    vec3 Y = dFdy(fragVertexEc);
    vec3 normal = normalize(cross(X, Y));

    vec3 lightDirection = normalize(lightPosEc - fragVertexEc);
    float light = max(0.0, dot(lightDirection, normal));

    gl_FragColor = vec4(normal, 1.0); // debug output: visualize the normal (overwritten below)
    gl_FragColor = vec4(lightColor * light, 1.0);
}

GLSL: mixing a base texture with a decal texture at a desired place

Let's say we're texturing a quad (two triangles). I think this question is similar to texture splatting, like in the next example:
precision lowp float;

uniform sampler2D Terrain;
uniform sampler2D Grass;
uniform sampler2D Stone;
uniform sampler2D Rock;

varying vec2 tex_coord;

void main(void)
{
    vec4 terrain = texture2D(Terrain, tex_coord);
    vec4 tex0 = texture2D(Grass, tex_coord * 4.0); // Tile
    vec4 tex1 = texture2D(Rock, tex_coord * 4.0);  // Tile
    vec4 tex2 = texture2D(Stone, tex_coord * 4.0); // Tile

    tex0 *= terrain.r;                          // Red channel - puts grass
    tex1 = mix(tex0, tex1, terrain.g);          // Green channel - puts rock and mix with grass
    vec4 outColor = mix(tex1, tex2, terrain.b); // Blue channel - puts stone and mix with others

    gl_FragColor = outColor; // final color
}
But I want to place just one decal on the base quad texture at a desired place.
The algorithm is much the same, but I think we don't need an extra texture with one filled channel to hold the decal's position (e.g. where the red channel != 0); somehow we must generate our own "terrain.r" (is this a float?) value and mix the base and decal textures with it.
precision lowp float;

uniform sampler2D base;
uniform sampler2D decal;
uniform vec2 decal_location; // where we want to place the decal (e.g. 0.5, 0.5 is the center of the quad)

varying vec2 base_tex_coord;
varying vec2 decal_tex_coord;

void main(void)
{
    vec4 v_base = texture2D(base, base_tex_coord);
    vec4 v_decal = texture2D(decal, decal_tex_coord);
    float decal_layer = /*somehow get our decal_layer based on decal_position*/
    gl_FragColor = mix(v_base, v_decal, decal_layer);
}
How can I achieve such a thing?
Or should I just generate the splat texture on the OpenGL side and pass it to the first shader? This would give me up to 4 different decals on a quad, but it would be slow for frequent updates (e.g. machine-gun hits on a wall).
float decal_layer = /*somehow get our decal_layer based on decal_position*/
Well, it's up to you how you interpret decal_location. I think a simple distance metric would suffice, but this also requires a size for the decal. Let's assume you provide this through an additional uniform, decal_radius. Then we can use:
decal_layer = 1.0 - clamp(length(base_tex_coord - decal_location) / decal_radius, 0.0, 1.0);
Yes, decal_layer is a float, as you've described; its range is 0 to 1. But you don't have quite enough info here: you've specified decal_location but no size for the decal. You also don't know where this fragment falls within the quad; you'd need a varying vec2 quad_coord; or similar input from the vertex shader if you want to know where this fragment is relative to the quad being rendered.
But let's try a different approach. Edit the top of your 2nd example to include these uniforms:
uniform vec2 decal_location; // Location of decal relative to base_tex_coord
uniform float decal_size; // Size of decal relative to base_tex_coord
Now, in main(), you should be able to compute decal_layer with something like this:
float decal_layer = 1.0 - smoothstep(decal_size - 0.01, decal_size,
    max(abs(decal_location.x - base_tex_coord.x),
        abs(decal_location.y - base_tex_coord.y)));
Basically you're trying to get decal_layer to be 1.0 within the decal, and 0.0 outside the decal. I've added a 0.01 fuzzy edge at the boundary that you can play with. Good luck!
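Putting that back into the second shader from the question, the whole fragment shader might look roughly like this (a sketch; names follow the question, and the 0.01 fuzzy edge is the one mentioned above):
precision lowp float;

uniform sampler2D base;
uniform sampler2D decal;
uniform vec2 decal_location;  // decal centre in base_tex_coord space
uniform float decal_size;     // decal half-size in the same space

varying vec2 base_tex_coord;
varying vec2 decal_tex_coord;

void main(void)
{
    vec4 v_base = texture2D(base, base_tex_coord);
    vec4 v_decal = texture2D(decal, decal_tex_coord);

    // 1.0 inside the decal square, 0.0 outside, with a small fuzzy border.
    float decal_layer = 1.0 - smoothstep(decal_size - 0.01, decal_size,
        max(abs(decal_location.x - base_tex_coord.x),
            abs(decal_location.y - base_tex_coord.y)));

    gl_FragColor = mix(v_base, v_decal, decal_layer);
}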

Using both Phong shading and textures in GLSL

I have a problem with my shaders. I am trying to combine textures and Phong shading in my game using GLSL, but I can't get any good result.
I've been searching Google for a long time and can't find any info on how to combine lighting and a texture, so I've decided to ask here.
This is my game without texture:
and this is with texture:
What I want to achieve is to make this pinkish texture more visible, with the spot in the center just like in the version without the texture, and also to fix the per-vertex shading on the gutters and make it per-pixel shading; I don't know what is wrong now.
I have checked about 10 Phong shaders and I still get per-vertex rather than per-pixel shading.
This is my fragment shader code; maybe someone can spot something?
varying vec3 N;
varying vec3 v;
varying vec2 vTexCoord;

uniform sampler2D myTexture;

void main (void)
{
    vec4 finalColor = vec4(0.0, 0.0, 0.0, 0.0);

    vec3 L = normalize(gl_LightSource[0].position.xyz - v);
    vec3 E = normalize(-v); // we are in Eye Coordinates, so EyePos is (0,0,0)
    vec3 R = normalize(-reflect(L, N));

    // calculate Ambient Term:
    vec4 Iamb = gl_FrontLightProduct[0].ambient;

    // calculate Diffuse Term:
    vec4 Idiff = gl_FrontLightProduct[0].diffuse * max(dot(N, L), 0.0);

    // calculate Specular Term:
    vec4 Ispec = gl_FrontLightProduct[0].specular
               * pow(max(dot(R, E), 0.0), 0.3 * gl_FrontMaterial.shininess);

    finalColor += Iamb + Idiff + Ispec;

    // write Total Color:
    gl_FragColor = gl_FrontLightModelProduct.sceneColor + (texture2D(myTexture, vTexCoord)) + finalColor;
}
and my vertex shader
varying vec3 N;
varying vec3 v;
varying vec2 vTexCoord;

void main(void)
{
    v = vec3(gl_ModelViewMatrix * gl_Vertex);
    N = normalize(gl_NormalMatrix * gl_Normal);
    vTexCoord = vec2(gl_MultiTexCoord0);
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
I'd be glad for any help.
EDIT:
This is how I pass my texture to the shader. It works fine (I think), because I can manipulate the texture values in the shader to some extent.
int my_sampler_uniform_location = glGetUniformLocation(brickProg, "myTexture");
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, texture);
glUniform1i(my_sampler_uniform_location,0);
CUTIL::drawBox();
glBindTexture(GL_TEXTURE_2D,0);
EDIT 2:
After Nicol Bolas's suggestion about adding the colors, I have edited my shader and changed it like this:
gl_FragColor = (texture2D(myTexture, vTexCoord)) + finalColor;
Now it is only too bright; I just have to tweak the lighting on the stage a little and it will be great. But I still don't get per-pixel shading; instead I get per-vertex shading. This is my current screen, and I have marked on the example what I mean about the shading:
This equation:
gl_FragColor = gl_FrontLightModelProduct.sceneColor + (texture2D(myTexture, vTexCoord)) + finalColor;
Does not make any sense for most textures. You are taking the color produced from the lighting equation and adding it to the color sampled from the texture. This would only make sense if the values stored in the texture represented light emitting properties of the surface.
Generally, the color values of a texture represent the diffuse reflectance of a surface, which means you need to incorporate them into the lighting equation directly. The texture's color should either fully replace the diffuse color from the material, or be combined with the material's diffuse color in some way.
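For example, the lighting portion of the fragment shader above could be reorganised along these lines (just a sketch using the same fixed-function gl_* built-ins; here the texture sample stands in for the material's ambient/diffuse colour):
vec4 texColor = texture2D(myTexture, vTexCoord);

// Modulate the light's ambient and diffuse terms by the texture colour
// instead of adding the texture on top afterwards.
vec4 Iamb  = gl_LightSource[0].ambient * texColor;
vec4 Idiff = gl_LightSource[0].diffuse * texColor * max(dot(N, L), 0.0);
vec4 Ispec = gl_FrontLightProduct[0].specular
           * pow(max(dot(R, E), 0.0), 0.3 * gl_FrontMaterial.shininess);

gl_FragColor = Iamb + Idiff + Ispec;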