How to implement a Color Matrix Filter in a GLSL shader - glsl

I would like to implement a Color Matrix Filter in a GLSL shader but couldn't find any documentation regarding this matter. I'm totally new to the world of shaders (never coded one myself) so please forgive me if my explanation/vocabulary doesn't make much sense.
Information I could gather so far:
A color matrix is composed of 5 columns (RGBA + offset) and 4 rows
The values in the first four columns are multiplied with the source red, green, blue, and alpha values respectively. The fifth column value is added (offset)
I believe the largest matrices in GLSL are 4×4 mat4 matrices (excluding the 'offset' column)
The only mat4 I've seen implemented in a shader looks like this:
colorMatrix = (GPUMatrix4x4){{0.3588, 0.7044, 0.1368, 0.0},
{0.2990, 0.5870, 0.1140, 0.0},
{0.2392, 0.4696, 0.0912 ,0.0},
{0,0,0,1.0}
};
Question:
How can I implement one? As stated above, I've never coded a GLSL shader before and unfortunately I'm unable to provide an MCVE. I would love to see an example so I can learn from it.
Thank you
EDIT:
I'm working with Processing and this is the only example I've found of vertex and fragment shaders for color rendering:
colorvert.glsl:
uniform mat4 transform;
attribute vec4 position;
attribute vec4 color;
varying vec4 vertColor;
void main() {
gl_Position = transform * position;
vertColor = color;
}
colorfrag.glsl:
#ifdef GL_ES
precision mediump float;
precision mediump int;
#endif
varying vec4 vertColor;
void main() {
gl_FragColor = vertColor;
}

For starters I would try:
Vertex:
#version 410 core
layout(location = 0) in vec3 in_vertex;
layout(location = 3) in vec4 in_color;
out vec4 color;
void main()
{
const mat4x4 m=mat4x4 // RGBA matrix (note: GLSL constructors fill matrices in column-major order, so transpose this if your source matrix is row-major)
(
0.3588, 0.7044, 0.1368, 0.0,
0.2990, 0.5870, 0.1140, 0.0,
0.2392, 0.4696, 0.0912, 0.0,
0.0   , 0.0   , 0.0   , 1.0
);
const vec4 o=vec4(0.0,0.0,0.0,0.0); // offset
color = (m * in_color) + o; // transformation
gl_Position = vec4(in_vertex,1.0);
}
Fragment:
#version 410 core
in vec4 color;
out vec4 out_color;
void main()
{
out_color=color;
}
Just change the #version, layout and input attributes/uniforms to meet your needs (currently it uses the default nVidia attribute locations for the fixed pipeline).
Now, to convert an image for example, just render a textured quad over the <-1,+1> x,y vertex coordinate range.
If your matrices or colors change per fragment (for example as a result of some procedurally generated stuff), then just move the transformation to the fragment shader instead.
You can also change the const to a uniform (and move it above main) so you can pass custom parameters at run time ...
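For the textured-quad case, a fragment-shader sketch along those lines might look like this (my example, not part of the answer above; it assumes the vertex shader forwards a texCoord varying, and the uniform names tex, colorMatrix and colorOffset are made up):
#version 410 core
in vec2 texCoord;
out vec4 out_color;
uniform sampler2D tex;        // source image
uniform mat4 colorMatrix;     // 4x4 part of the color matrix
uniform vec4 colorOffset;     // fifth (offset) column
void main()
{
    vec4 c = texture(tex, texCoord);              // fetch source color
    out_color = (colorMatrix * c) + colorOffset;  // apply color matrix + offset
}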
In case you need a GLSL starter example, see:
complete GL+GLSL+VAO/VBO C++ example

Related

Drawing smooth circle

I'm using OpenTK (C#) but OpenGL suggestions are welcome too.
I have a point list generated by iterating one degree per point around the center point, which means there are 361 points including the center point. The point list can be different with different approaches, that's OK. I can draw the circle with the simple vertex and fragment shaders below. How can I change the fragment and/or vertex shaders to get a smooth circle?
Vertex shader:
#version 330
in vec3 vPosition;
in vec4 vColor;
out vec4 color;
out vec4 fPosition;
uniform mat4 modelview;
void main()
{
fPosition = modelview * vec4(vPosition, 1.0);
gl_Position = fPosition;
color = vColor;
}
Fragment shader:
#version 330
in vec4 color;
in vec4 fPosition;
out vec4 outputColor;
void main()
{
outputColor = color;
}
C# code:
GL.DrawArrays(PrimitiveType.TriangleFan, 0, points.Length);
Hello, what do you actually see? Post a screenshot. Anyway, for smooth edges we have what's called anti-aliasing.
Use this line for your glControl to enable it
glControl = new GLControl(new OpenTK.Graphics.GraphicsMode(32, 24, 0, 8));
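Another option (my own hedged sketch, not part of the answer above): if you draw a quad that covers the circle instead of a triangle fan, and enable blending, you can anti-alias the rim in the fragment shader by fading alpha with smoothstep. It assumes the vertex shader also passes the position relative to the circle center as localPos, and that a radius uniform is provided:
#version 330
in vec4 color;
in vec2 localPos;        // assumed: fragment position relative to the circle center
out vec4 outputColor;
uniform float radius;    // assumed: circle radius in the same units as localPos
void main()
{
    float dist = length(localPos);
    float aa = fwidth(dist);                                    // roughly a one-pixel-wide band
    float alpha = 1.0 - smoothstep(radius - aa, radius + aa, dist);
    outputColor = vec4(color.rgb, color.a * alpha);             // fade out at the rim
}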

opengl glsl bug in which model goes invisible if i use glsl texture function with different parameters

I want to replicate a game. The goal of the game is to create a path between any 2 squares that have the same color.
Here is the game: www.mypuzzle.org/3d-logic-2
The cube has 6 faces. Each face has 3x3 squares.
The cube has different square types: empty squares (they reflect the environment), wall squares (you can't color them), and start/finish squares (which have a black square in the middle but are otherwise colored).
I'm close to finishing my project but I'm stuck on a bug. I used C++, SFML, OpenGL, GLM.
The problem is in the shaders.
Vertex shader:
#version 330 core
layout (location = 0) in vec3 vPosition;
layout (location = 1) in vec3 vColor;
layout (location = 2) in vec2 vTexCoord;
layout (location = 3) in float vType;
layout (location = 4) in vec3 vNormal;
out vec3 Position;
out vec3 Color;
out vec2 TexCoord;
out float Type;
out vec3 Normal;
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;
void main()
{
gl_Position = projection * view * model * vec4(vPosition, 1.0f);
Position = vec3(model * vec4(vPosition, 1.0f));
Color=vColor;
TexCoord=vTexCoord;
Type=vType;
Normal = mat3(transpose(inverse(model))) * vNormal;
}
Fragment shader:
#version 330 core
in vec3 Color;
in vec3 Normal;
in vec3 Position;
in vec2 TexCoord;
in float Type;
out vec4 color;
uniform samplerCube skyboxTexture;
uniform sampler2D faceTexture;
uniform sampler2D centerTexture;
void main()
{
color=vec4(0.0,0.0,0.0,1.0);
if(Type==0.0)
{
vec3 I = normalize(Position);
vec3 R = reflect(I, normalize(Normal));
if(texture(faceTexture, TexCoord)==vec4(1.0,1.0,1.0,1.0))
color=mix(texture(skyboxTexture, R),vec4(1.0,1.0,1.0,1.0),0.3);
}
else if(Type==1.0)
{
if(texture(centerTexture, TexCoord)==vec4(1.0,1.0,1.0,1.0))
color=vec4(Color,1.0);
}
else if(Type==-1.0)
{
color=vec4(0.0,0.0,0.0,1.0);
}
else if(Type==2.0)
{
if(texture(faceTexture, TexCoord)==vec4(1.0,1.0,1.0,1.0))
color=mix(vec4(Color,1.0),vec4(1.0,1.0,1.0,1.0),0.5);
}
}
/*
Type== 0 ---> blank square(reflects light)
Type== 1 ---> start/finish square
Type==-1 ---> wall square
Type== 2 ---> colored square that was once a black square
*/
In the fragment shader I draw the pixels of a square that has a certain type, so the shader only enters one of the four ifs for each square. The program works fine if I only use the GLSL texture function with the same texture. If I use this function twice with different textures, in two different ifs, my model goes invisible. Why is that happening?
https://postimg.org/image/lximpl0bz/
https://postimg.org/image/5dzvqz2r7/
The red square is of type 1. I've modified code in the type==0 if and then my model went invisible.
Texture samplers in OpenGL should only be accessed in (at least) dynamically uniform control flow. This basically means that all invocations of a shader execute the same code path. If this is not the case, no automatic gradients are available and mipmapping or anisotropic filtering will fail.
In your program this problem happens exactly when you try to use multiple textures. One solution might be not to use anything that requires gradients. There are also a number of other options, for example packing all the textures together into a texture atlas and just selecting the appropriate uv coordinates in the shader, or drawing each quad separately and providing the type through a uniform variable.
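Another workaround along the same lines (a hedged sketch, not the answerer's code): sample every texture up front, where control flow is still dynamically uniform, and branch only on the already-fetched results. The shader below just rearranges the question's fragment shader that way:
#version 330 core
in vec3 Color;
in vec3 Normal;
in vec3 Position;
in vec2 TexCoord;
in float Type;
out vec4 color;
uniform samplerCube skyboxTexture;
uniform sampler2D faceTexture;
uniform sampler2D centerTexture;
void main()
{
    // all texture fetches happen here, in dynamically uniform control flow
    vec3 R = reflect(normalize(Position), normalize(Normal));
    vec4 faceTexel   = texture(faceTexture, TexCoord);
    vec4 centerTexel = texture(centerTexture, TexCoord);
    vec4 skyTexel    = texture(skyboxTexture, R);
    color = vec4(0.0, 0.0, 0.0, 1.0);   // wall squares (Type == -1.0) stay black
    if (Type == 0.0)
    {
        if (faceTexel == vec4(1.0))
            color = mix(skyTexel, vec4(1.0), 0.3);
    }
    else if (Type == 1.0)
    {
        if (centerTexel == vec4(1.0))
            color = vec4(Color, 1.0);
    }
    else if (Type == 2.0)
    {
        if (faceTexel == vec4(1.0))
            color = mix(vec4(Color, 1.0), vec4(1.0), 0.5);
    }
}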

Why does my WebGL shader not let me use varyings?

When I try to link my vertex and fragment shaders into a program, WebGL throws Varyings with the same name but different type, or statically used varyings in fragment shader are not declared in vertex shader: textureCoordinates
I have varying vec2 test in both my vertex and fragment shaders, and can't see any reason why the compiler wouldn't be able to find the same varying in both.
Vertex Shader:
varying vec2 test;
void main(void) {
gl_Position = vec4(0.0, 0.0, 0.0, 0.0);
test = vec2(1.0, 0.0);
}
Fragment Shader:
precision highp float;
varying vec2 test;
void main(void) {
gl_FragColor = vec4(test.xy, 0.0, 1.0);
}
Test code:
const canvas = document.createElement('canvas');
gl = canvas.getContext('webgl')
let vert = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vert, "varying vec2 test;\nvoid main(void) {\n gl_Position = vec4(0.0, 0.0, 0.0, 0.0);\n test = vec2(1.0, 0.0);\n}");
gl.compileShader(vert);
let frag = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(frag, "precision highp float;\nvarying vec2 test;\nvoid main() {\n\tgl_FragColor = vec4(test.xy, 0.0, 1.0);\n}");
gl.compileShader(frag);
let program = gl.createProgram();
gl.attachShader(program, vert);
gl.attachShader(program, frag);
gl.linkProgram(program);
gl.useProgram(program);
Just a guess, but I wonder if it's because you're not using the textureCoordinates varying in your fragment shader. The names & types match just fine, so I don't think that's the issue. I've done the same thing here:
Frag:
// The fragment shader is the rasterization process of webgl
// use float precision for this shader
precision mediump float;
// the input texture coordinate values from the vertex shader
varying vec2 vTextureCoord;
// the texture data, this is bound via gl.bindTexture()
uniform sampler2D texture;
// the colour uniform
uniform vec3 color;
void main(void) {
// gl_FragColor is the output colour for a particular pixel.
// use the texture data, specifying the texture coordinate, and modify it by the colour value.
gl_FragColor = texture2D(texture, vec2(vTextureCoord.s, vTextureCoord.t)) * vec4(color, 1.0);
}
Vert:
// setup passable attributes for the vertex position & texture coordinates
attribute vec3 aVertexPosition;
attribute vec2 aTextureCoord;
// setup a uniform for our perspective * lookat * model view matrix
uniform mat4 uMatrix;
// setup an output variable for our texture coordinates
varying vec2 vTextureCoord;
void main() {
// take our final matrix to modify the vertex position to display the data on screen in a perspective way
// With shader code here, you can modify the look of an image in all sorts of ways
// the 4th value here is the w coordinate, and it is called Homogeneous coordinates, (x,y,z,w).
// It effectively allows the perspective math to work. With 3d graphics, it should be set to 1. Less than 1 will appear too big
// Greater than 1 will appear too small
gl_Position = uMatrix * vec4(aVertexPosition, 1);
vTextureCoord = aTextureCoord;
}
The issue was resolved by updating Chrome for OS X from v51.something to 52.0.2743.82 (64-bit). Weird.

Strange and annoying GLSL error

My vertex shader looks as follows:
#version 120
uniform float m_thresh;
varying vec2 texCoord;
void main(void)
{
gl_Position = ftransform();
texCoord = gl_TexCoord[0].xy;
}
and my fragment shader:
#version 120
uniform float m_thresh;
uniform sampler2D grabTexture;
varying vec2 texCoord;
void main(void)
{
vec4 grab = vec4(texture2D(grabTexture, texCoord.xy));
vec3 colour = vec3(grab.xyz * m_thresh);
gl_FragColor = vec4( colour, 0.5 );
}
Basically I am getting the error message "Error in shader -842150451 - 0<9> : error C7565: assignment to varying 'texCoord'".
But I have another shader which does the exact same thing and I get no error when I compile that and it works!!!
Any ideas what could be happening?
For starters, there is no sensible reason to construct a vec4 from texture2D (...). Texture functions in GLSL always return a vec4. Likewise, grab.xyz * m_thresh is always a vec3, because a scalar multiplied by a vector does not change the dimensions of the vector.
Now, here is where things get interesting... the gl_TexCoord [n] GLSL built-in you are using is actually a pre-declared varying. You should not be reading from this in a vertex shader, because it defines a vertex shader output / fragment shader input.
The appropriate vertex shader built-in variable in GLSL 1.2 for getting the texture coordinates for texture unit N is actually gl_MultiTexCoord<N>
Thus, your vertex and fragment shaders should look like this:
Vertex Shader:
#version 120
//varying vec2 texCoord; // You actually do not need this
void main(void)
{
gl_Position = ftransform();
//texCoord = gl_MultiTexCoord0.st; // Same as comment above
gl_TexCoord [0] = gl_MultiTexCoord0;
}
Fragment Shader:
#version 120
uniform float m_thresh;
uniform sampler2D grabTexture;
//varying vec2 texCoord;
void main(void)
{
//vec4 grab = texture2D (grabTexture, texCoord.st);
vec4 grab = texture2D (grabTexture, gl_TexCoord [0].st);
vec3 colour = grab.xyz * m_thresh;
gl_FragColor = vec4( colour, 0.5 );
}
Remember how I said gl_TexCoord [n] is a built-in varying? You can read/write to this instead of creating your own custom varying vec2 texCoord; in GLSL 1.2. I commented out the lines that used a custom varying to show you what I meant.
The OpenGL® Shading Language (1.2) - 7.6 Varying Variables - pp. 53
The following built-in varying variables are available to write to in a vertex shader. A particular one should be written to if any functionality in a corresponding fragment shader or fixed pipeline uses it or state derived from it.
[...]
varying vec4 gl_TexCoord[]; // at most will be gl_MaxTextureCoords
The OpenGL® Shading Language (1.2) - 7.3 Vertex Shader Built-In Attributes - pp. 49
The following attribute names are built into the OpenGL vertex language and can be used from within a vertex shader to access the current values of attributes declared by OpenGL.
[...]
attribute vec4 gl_MultiTexCoord0;
The bottom line is that gl_MultiTexCoord<N> defines vertex attributes (vertex shader input), gl_TexCoord [n] defines a varying (vertex shader output, fragment shader input). It is also worth mentioning that these are not available in newer (core) versions of GLSL.
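For reference, here is a hedged sketch (not part of the original answer) of the same pair of shaders written against a core profile, where you declare your own texture-coordinate input and varying instead of the removed built-ins; the names texCoordIn and mvp are my own assumptions:
Vertex Shader:
#version 330 core
in vec4 position;
in vec2 texCoordIn;    // replaces gl_MultiTexCoord0
out vec2 texCoord;     // replaces gl_TexCoord[0]
uniform mat4 mvp;      // replaces ftransform() / the built-in matrices
void main(void)
{
    gl_Position = mvp * position;
    texCoord = texCoordIn;
}
Fragment Shader:
#version 330 core
uniform float m_thresh;
uniform sampler2D grabTexture;
in vec2 texCoord;
out vec4 fragColor;
void main(void)
{
    vec4 grab = texture(grabTexture, texCoord);
    fragColor = vec4(grab.xyz * m_thresh, 0.5);
}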

DirectX FVF-like GLSL Shaders

Could someone assist me or point me in the right direction to implement the basic FVFs from DirectX in GLSL code? I completely understand how to create a program, apply VBOs and all that, but I'm having great difficulty with the actual creation of the shaders. Namely:
transformed+lit (x,y,color,specular,tu,tv)
lit (x,y,z,color,specular,tu,tv)
unlit (x,y,z,nx,ny,nz,tu,tv) [material/lights]
With this, I'd be given enough to implement far more interesting shaders.
So, I'm not asking for a mechanism to deal with FVFs. I'm simply asking for the shader code, given the proper streams. I understand that the unlit and lit versions rely on passing in matrices, and I completely understand the concept. I am just having trouble finding shader examples showing these concepts.
Okay. If you have trouble finding working shaders, here is an example (honestly, you can find one in any OpenGL book).
This shader program will use your object's world matrix and the camera's matrices to transform vertices, then map one texture onto the pixels and light them with one directional light (according to the material properties and light direction).
Vertex shader:
#version 330
// Vertex input layout
in vec3 inPosition;
in vec3 inNormal;
in vec4 inVertexCol;
in vec2 inTexcoord;
in vec3 inTangent;
in vec3 inBitangent;
// Output to the fragment shader
out vec3 position;      // world-space position, used for lighting
out vec3 normal;
out vec4 vertexColor;
out vec2 texcoord;
out vec3 tangent;
out vec3 bitangent;
// Uniform buffers
layout(std140)
uniform CameraBuffer
{
mat4 mtxView;
mat4 mtxProj;
vec3 cameraPosition;
};
layout(std140)
uniform ObjectBuffer
{
mat4 mtxWorld;
};
void main()
{
// transform position
vec4 pos = vec4(inPosition, 1.0f);
pos = mtxWorld * pos;
position = pos.xyz; // keep the world-space position for the lighting calculation
pos = mtxView * pos;
pos = mtxProj * pos;
gl_Position = pos;
// just pass through everything else
normal = inNormal;
tangent = inTangent;
bitangent = inBitangent;
texcoord = inTexcoord;
vertexColor = inVertexCol;
}
And fragment shader:
#version 330
// Input
in vec3 position;
in vec3 normal;
in vec4 vertexColor;
in vec2 texcoord;
in vec3 tangent;
in vec3 bitangent;
// Output
out vec4 fragColor;
// Uniforms
uniform sampler2D sampler0;
layout(std140)
uniform CameraBuffer
{
mat4 mtxView;
mat4 mtxProj;
vec3 cameraPosition;
};
layout(std140)
uniform ObjectBuffer
{
mat4 mtxWorld;
};
layout(std140)
uniform LightBuffer
{
vec3 lightDirection;
};
struct Material
{
float Ka; // ambient quotient
float Kd; // diffuse quotient
float Ks; // specular quotient
float A; // shininess
};
layout(std140)
uniform MaterialBuffer
{
Material material;
};
// function to calculate pixel lighting
float Lit( Material material, vec3 pos, vec3 nor, vec3 lit, vec3 eye )
{
vec3 V = normalize( eye - pos );
vec3 R = reflect( lit, nor);
float Ia = material.Ka;
float Id = material.Kd * clamp( dot(nor, -lit), 0.0f, 1.0f );
float Is = material.Ks * pow( clamp(dot(R,V), 0.0f, 1.0f), material.A );
return Ia + Id + Is;
}
void main()
{
vec3 nnormal = normalize(normal);
vec3 ntangent = normalize(tangent);
vec3 nbitangent = normalize(bitangent);
vec4 outColor = texture(sampler0, texcoord); // texture mapping
outColor *= Lit( material, position, nnormal, lightDirection, cameraPosition ); // lighting
outColor.w = 1.0f;
fragColor = outColor;
}
If you don't want texturing, just don't sample the texture; set outColor to vertexColor instead.
If you don't need lighting, just leave out the Lit() multiplication.
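For instance, under those two simplifications the fragment shader's main() would reduce to something like this sketch:
void main()
{
    // no texture sampling and no lighting: just pass the interpolated vertex color through
    vec4 outColor = vertexColor;
    outColor.w = 1.0;
    fragColor = outColor;
}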
Edit:
For 2D objects you can still use the same program, but much of the functionality will be redundant. You can strip out (see the sketch after this list):
camera
light
material
all of the vertex attributes except inPosition and inTexcoord (maybe also inVertexCol, if you need vertices to have color), and all of the code related to the unneeded attributes
inPosition can be a vec2
you will need to pass an orthographic projection matrix instead of a perspective one
you can even strip out the matrices and pass a vertex buffer with positions in pixels. See my answer here about how to transform those pixel positions to screen-space positions. You can do it either in C/C++ code or in GLSL/HLSL.
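A minimal sketch of such a stripped-down 2D vertex shader (my own reduction of the shaders above; folding the orthographic projection and the 2D transform into a single assumed matrix):
#version 330
in vec2 inPosition;   // 2D position, as suggested above
in vec2 inTexcoord;
out vec2 texcoord;
layout(std140)
uniform ObjectBuffer
{
    mat4 mtxWorldProj;  // assumed: orthographic projection * 2D world transform
};
void main()
{
    gl_Position = mtxWorldProj * vec4(inPosition, 0.0, 1.0);
    texcoord = inTexcoord;
}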
Hope it helps somehow.
Intro
You've not specified the OpenGL/GLSL version you're targeting, so I'll assume it is at least OpenGL 3.
One of the main advantages of the programmable pipeline, compared with the fixed-function pipeline, is fully customizable vertex input. I'm not quite sure it is a good idea to introduce such constraints as a fixed vertex format. For what?.. (You will find the modern approach in the paragraph "Another way" of my post.)
But, if you really want to emulate fixed-function...
I think you'll need to have a vertex shader for each vertex format you have, or somehow generate vertex shaders on the fly. Or even do so for all of the shader stages.
For example, for an x, y, color, tu, tv input you will have a vertex shader such as:
attribute vec2 inPosition;
attribute vec4 inCol;
attribute vec2 inTexcoord;
void main()
{
...
}
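One hedged guess at how that body might be filled in for the pre-transformed case (treating x, y as already being in clip space; this is my sketch, not the answerer's code):
attribute vec2 inPosition;
attribute vec4 inCol;
attribute vec2 inTexcoord;
varying vec4 color;
varying vec2 texcoord;
void main()
{
    // positions are assumed to be pre-transformed, so no matrices are applied
    gl_Position = vec4(inPosition, 0.0, 1.0);
    color = inCol;
    texcoord = inTexcoord;
}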
As you don't have the fixed-function transform, lighting, and material functionality in OpenGL 3, you must implement it yourself:
You must pass matrices for transformations
For the lit shader you must pass additional variables, such as the light direction
For the material shader you must take the material as input
Typically, in a shader, you do this with uniforms or uniform blocks:
layout(std140)
uniform CameraBuffer
{
mat4 mtxView;
mat4 mtxProj;
vec3 cameraPosition;
};
layout(std140)
uniform ObjectBuffer
{
mat4 mtxWorld;
};
layout(std140)
uniform LightBuffer
{
vec3 lightDirection;
};
struct Material
{
float Ka;
float Kd;
float Ks;
float A;
};
layout(std140)
uniform MaterialBuffer
{
Material material;
};
Probably, you can somehow combine all of the shaders with different formats, uniforms, etc. into one big ubershader with branching.
Another way
You can stick to the modern approach and just allow the user to declare the vertex format he wants (the format he uses in his shader). Just implement a concept similar to IDirect3DDevice9::CreateVertexDeclaration or ID3D11Device::CreateInputLayout: you will make use of glVertexAttribPointer() and, probably, VAOs. This way you can also abstract out the vertex layout in an API-independent way.
The main ideas are:
the user passes an array of structures to your function that describes the format in an API-independent way (this struct can be similar to D3DVERTEXELEMENT9 or D3D11_INPUT_ELEMENT_DESC)
that function interprets the array's elements one by one and builds some kind of internal info that describes the format in an API-specific way (such as IDirect3DVertexDeclaration9 for D3D9, ID3D11InputLayout for D3D11, or a custom struct or VAO for OpenGL)
when it's time to set the vertex format, you just use this info
P.S. If you need ideas on how to properly implement lights and materials in GLSL (I mean algorithms here), you'd be better off picking up a book or some online tutorials than asking here. Or just Google "GLSL lighting".
You may find these links interesting:
Good resources for learning modern OpenGL (3.0 or later)?
OpenGL documentation
Select Books on OpenGL and 3D Graphics Coding
Happy coding!