I have a big function with complex mathematics in it, and I want to call this same function in several fragment shaders. Do I have to copy-paste the code of the function to every shader? Or is there any way to avoid this, to share code between shaders? Can I have any kind of a "library" for common shader functions?
The way to share code between shaders in WebGL is via string manipulation. Example:
const hsv2rgb = `
vec3 hsv2rgb(vec3 c) {
c = vec3(c.x, clamp(c.yz, 0.0, 1.0));
vec4 K = vec4(1.0, 2.0 / 3.0, 1.0 / 3.0, 3.0);
vec3 p = abs(fract(c.xxx + K.xyz) * 6.0 - K.www);
return c.z * mix(K.xxx, clamp(p - K.xxx, 0.0, 1.0), c.y);
}
`;
const fragmentShader1 = `#version 300 es
${hsv2rgb}
in float hue;
out vec4 color;
void main() {
color = vec4(hsv2rgb(vec3(hue, 1.0, 0.8)), 1);
}
`;
const fragmentShader2 = `#version 300 es
${hsv2rgb}
in vec3 hsv;
out vec4 color;
void main() {
color = vec4(hsv2rgb(hsv), 1);
}
`;
There is no need for a library, as collecting snippets yourself is trivial. Example:
const snippets = {
hsv2rgb: `...code-from-above--...`,
rgb2hsv: `...some code ...`,
};
Now just use the snippets:
const fragmentShader2 = `#version 300 es
${snippets.hsv2rgb}
${snippets.rgb2hsv}
in vec3 v_color;
out vec4 color;
void main() {
vec3 hsv = rgb2hsv(v_color);
color = vec4(hsv2rgb(hsv + vec3(0.5, 0, 0)), 1);
}
`;
Though I'd recommend against using an object to collect strings, as whatever builder you use may not be able to discard unused snippets.
To organize your snippets you can use ES6 imports:
/* hsv2rgb.glsl.js */
export default `
vec3 hsv2rgb(vec3 c) {
c = vec3(c.x, clamp(c.yz, 0.0, 1.0));
vec4 K = vec4(1.0, 2.0 / 3.0, 1.0 / 3.0, 3.0);
vec3 p = abs(fract(c.xxx + K.xyz) * 6.0 - K.www);
return c.z * mix(K.xxx, clamp(p - K.xxx, 0.0, 1.0), c.y);
}
`;
And then you can import it:
/* somefragshader.glsl.js */
import hsv2rgb from './hsv2rgb.glsl.js';
export default `#version 300 es
${hsv2rgb}
in vec3 hsv;
out vec4 color;
void main() {
color = vec4(hsv2rgb(hsv), 1);
}
`;
And then use it in some program:
import someFragmentShaderSource from './somefragshader.glsl.js';
...
...compile shader using someFragmentShaderSource ...
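For completeness, a minimal sketch of that compile step in WebGL2 might look like this (continuing from the import above; the gl context is assumed to already exist, and compileShader is a hypothetical helper name):

// Hypothetical helper: compiles one shader object and logs any errors.
function compileShader(gl, type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    console.error(gl.getShaderInfoLog(shader));
  }
  return shader;
}

const fs = compileShader(gl, gl.FRAGMENT_SHADER, someFragmentShaderSource);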
If you don't like using template-string substitution, it's trivial to make your own:
const subs = {
hsv2rgb: `...code-from-above--...`,
rgb2hsv: `...some code ...`,
};
// replace `#include <name>` with named sub
function replaceSubs(str, subs) {
return str.replace(/#include\s+<(\w+)>/g, (m, key) => {
return subs[key];
});
}
And then:
const fragmentShader2 = replaceSubs(`#version 300 es
#include <hsv2rgb>
in vec3 hsv;
out vec4 color;
void main() {
color = vec4(hsv2rgb(hsv), 1);
}
`, subs);
In OpenGL / WebGL, GLSL code is passed around just as text. So if you have a function that should be reusable across multiple GLSL programs, you may write a shader manager that concatenates the various shader blocks.
There are several common approaches:
A Mega Shader shared by all your GLSL programs, with #ifdefs in the code to activate/deactivate specific blocks (a small sketch follows the sample below). This may become very messy.
A Shader Manager dynamically constructing GLSL programs from string constants (essentially code generation).
A Shader Manager substituting sub-strings from a predefined list of standard functions. Core GLSL syntax does not support #include directives, but a Shader Manager might implement them, or use another syntax for identifying the sub-strings to substitute, like %ColorLighting% (or just ${theVariable} in the case of JavaScript).
A sample of the code-generation approach in JavaScript might look like this:
// reusable GLSL functions
var getColor_Red = "vec4 getColor() { return vec4(1.0, 0.0, 0.0, 1.0); }\n";
// fragment shader generator
function getFragShaderRed() {
return "precision highp float;\n"
+ getColor_Red
+ "void main() { gl_FragColor = getColor(); }";
}
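For the first (Mega Shader) approach, a minimal sketch might look like this; USE_RED is a hypothetical flag, and a real shared source would normally contain many such optional blocks:

// one shared "mega" fragment shader with optional blocks
var megaFragSource =
    "precision highp float;\n"
  + "vec4 getColor() {\n"
  + "#ifdef USE_RED\n"
  + "  return vec4(1.0, 0.0, 0.0, 1.0);\n"
  + "#else\n"
  + "  return vec4(1.0, 1.0, 1.0, 1.0);\n"
  + "#endif\n"
  + "}\n"
  + "void main() { gl_FragColor = getColor(); }";

// activate a specific block by prepending a #define before compiling
var redVariant = "#define USE_RED\n" + megaFragSource;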
A longer answer, also covering the non-WebGL case, is below.
Desktop OpenGL gives more flexibility in this context - it allows multiple shader objects for the same stage to be attached to a single GLSL program. This means that a dedicated function may be moved into its own shader object, reused in other shaders with the help of a forward declaration (a function prototype without a body), and linked into multiple GLSL programs - similar to how C++ programs are normally compiled and linked.
const GLchar* aShader1Text =
"vec4 getColor() { return vec4(1.0, 0.0, 0.0, 1.0); }";
GLuint aShader1Id = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(aShader1Id, 1, &aShader1Text, NULL);
glCompileShader(aShader1Id);
const GLchar* aShader2Text =
"vec4 getColor();" // forward declaration
"void main() { gl_FragColor = getColor(); }"
GLuint aShader2Id = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(aShader2Id, 1, &aShader2Text, NULL);
glCompileShader(aShader2Id);
GLuint aProgramID = glCreateProgram();
glAttachShader (aProgramID, aShader0Id); // some vertex shader
glAttachShader (aProgramID, aShader1Id); // fragment shader block 1
glAttachShader (aProgramID, aShader2Id); // fragment shader block 2
glLinkProgram (aProgramID);
There are two problems with this functionality:
Unlike C++ compilers, OpenGL drivers normally do not really "compile" individual shader objects, but rather only validate their syntax; the real compilation is done at the "linkage" stage of the entire GLSL program. This practically eliminates any benefit (e.g. in performance) of compiling individual GLSL blocks compared with concatenating strings and re-compiling the entire GLSL program source code.
OpenGL ES and WebGL simply removed this functionality from their specifications, so a portable program cannot rely on this feature, even though it has been available in desktop OpenGL from the very beginning of GLSL. The API itself is the same, but the OpenGL driver will fail to compile a GLSL shader without a main() function.
Desktop OpenGL 4.0 introduced another feature, shader subroutines, which gives more flexibility to GLSL program definition by making it configurable at runtime. This is a rather complex feature that is unlikely to be worthwhile for static GLSL programs, and it is also unavailable in OpenGL ES / WebGL.
Using OpenGL, I've displayed a simple square with some color in it. Today I tried to set the color (actually just the green value) of it using a uniform, which contains some sort of sine of the current time. It looks like the value of the uniform is just 0.0, as it shows no green (black when setting the other colours to 0.0), unless I add a print statement to the loop (I can place it anywhere). If I do so, it displays a square that nicely changes in colour.
What's going on?!
This is the main source:
// MAIN LOOP
while !window.should_close() {
// UPDATE STUFF
let time_value = glfwGetTime();
let green_value = ((time_value.sin() / 2.0) + 0.5) as GLfloat;
program.set_uniform1f("uGreenValue", green_value);
println!("yoo"); // it only works when this is somewhere in the loop
// RENDER STUFF
gl::Clear(gl::COLOR_BUFFER_BIT);
gl::DrawElements(gl::TRIANGLES, 6, gl::UNSIGNED_INT, 0 as *const GLvoid);
window.swap_buffers();
}
This is the vertex shader:
#version 330 core
layout (location = 0) in vec2 aPosition;
layout (location = 1) in float aRedValue;
uniform float uGreenValue;
out float redValue;
out float greenValue;
void main()
{
gl_Position = vec4(aPosition, 0.0, 1.0);
redValue = aRedValue;
greenValue = uGreenValue;
}
and this the fragment shader:
#version 330 core
out vec4 Color;
in float redValue;
in float greenValue;
void main()
{
Color = vec4(redValue, 0.0f, greenValue, 1.0f);
}
I think I found the problem! In the set_uniform functions I provided a non-null-terminated &str to the glGetUniformLocation function, which did not work on every occasion. Using a &CStr solved it. I still have no idea what the print statements (and other function calls, even empty functions) had to do with it, though...
I'm working on a simple particle system in OpenGL; so far I've written two fragment shaders to update velocities and positions in response to my mouse, and they seem to work! I've looked at those two textures and they both seem to respond properly (going from random noise to an orderly structure in response to my mouse).
However, I'm having issues with how to draw the particles. I'm rather new to vertex shaders (having previously only used fragment shaders); it's my understanding that the usual way is a vertex shader like this:
uniform sampler2DRect tex;
varying vec4 cur;
void main() {
gl_FrontColor = gl_Color;
cur = texture2DRect(tex, gl_Vertex.xy);
vec2 pos = cur.xy;
gl_Position = gl_ModelViewProjectionMatrix * vec4(pos, 0., 1.);
}
This would transform the coordinates to the proper place according to the values in the position buffer. However, I'm getting GL errors when I run it saying that it can't be compiled - after some research, it seems that gl_ModelViewProjectionMatrix is deprecated.
What would be the proper way to do this now that the model view matrix is deprecated? I'm not trying to do anything fancy with perspective, I just need a plain orthogonal view of the texture.
thanks!
What version of GLSL are you using (I don't see any #version directive)? Yes, I think gl_ModelViewProjectionMatrix really is deprecated. However, if you want to use it, maybe this could help. By the way, the varying qualifier is quite old too; I would rather use the in and out qualifiers, as they make your shader code more readable.
The 'proper' way of doing that is to create your own matrices - model and view (using the glm library, for example) - multiply them, and then pass the result as a uniform to your shader. A tutorial with an example can be found here.
Here is the vertex shader I used for displaying a texture (fullscreen quad):
#version 430
layout(location = 0) in vec2 vPosition;
layout(location = 1) in vec2 vUV;
out vec2 uv;
void main()
{
gl_Position = vec4(vPosition,1.0,1.0);
uv = vUV;
}
fragment shader:
#version 430
in vec2 uv;
out vec4 final_color;
uniform sampler2D tex;
void main()
{
final_color = texture(tex, uv).rgba;
}
And here are my coordinates (mine are static, but you can change them and update the buffer - the shader can stay the same):
//Quad vertices - z coord omitted, because it will always be 1
float pos[] = {
-1.0, 1.0,
1.0, 1.0,
-1.0, -1.0,
1.0, -1.0
};
float uv[] = {
0.0, 1.0,
1.0, 1.0,
0.0, 0.0,
1.0, 0.0
};
Maybe you could also try turning off the depth test before executing this shader: glDisable(GL_DEPTH_TEST);
I am currently trying to add tessellation shaders to my program and I feel as though I'm at my wit's end. I have gone through several tutorials and delved into a lot of the questions here as well. Unfortunately I still do not understand what I am doing wrong after all of this. I will take any tips gladly, and do admit I'm a total novice here. My vertex and fragment shaders work, but no matter which tutorial I base my code off of, I can not get anything to display once I add tessellation shaders. I get no errors while loading, linking, or using the shaders/program either.
The four shaders in question:
Vertex:
layout (location = 0) in vec4 vertex;
out gl_PerVertex{
vec4 gl_Position;
};
void main()
{
gl_Position = gl_Vertex;
//note I have to multiply this by the MVP matrix if there is no tessellation
}
Tessellation Control Shader:
layout (vertices = 3) out;
out gl_PerVertex {
vec4 gl_Position;
} gl_out[];
void main()
{
if (gl_InvocationID == 0)
{
gl_TessLevelInner[0] = 3.0;
gl_TessLevelOuter[0] = 2.0;
gl_TessLevelOuter[1] = 2.0;
gl_TessLevelOuter[2] = 2.0;
}
gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;
}
Tessellation Evaluation Shader:
layout(triangles, equal_spacing, ccw) in;
uniform mat4 ModelViewProjectionMatrix;
out gl_PerVertex{
vec4 gl_Position;
};
void main()
{
vec4 position = gl_TessCoord.x * gl_in[0].gl_Position +
gl_TessCoord.y * gl_in[1].gl_Position +
gl_TessCoord.z * gl_in[2].gl_Position;
gl_Position = ModelViewProjectionMatrix * position;
}
Fragment Shader:
void main()
{
gl_FragColor = vec4(0.1, 0.4, 0.0, 1.0);
}
I'm sure I'm overlooking something really simple, but I'm at an absolute loss here. Thanks for taking the time to read this.
When you draw using a Tessellation Control Shader (this is an optional stage), you must use GL_PATCHES as the primitive type.
Additionally, GL_PATCHES has no default number of vertices; you must set that yourself:
glPatchParameteri (GL_PATCH_VERTICES, 3);
glDrawElements (GL_PATCHES, 3, ..., ...);
The code listed above will draw a single triangle patch. Now, since geometry shaders and tessellation evaluation shaders do not understand GL_PATCHES, if you remove the tessellation control shader, the draw code will do nothing. Only the tessellation control shader can make sense of the GL_PATCHES primitive, and conversely it cannot make sense of any other kind of primitive.
When I switch to OpenGL ES 3 with GLSL 300, I get the following error in my fragment shader:
undeclared identifier gl_FragColor
When using GLSL 100, everything is fine.
Modern versions of GLSL handle fragment shader outputs simply by declaring them as out values; gl_FragColor is no longer supported, hence your error. Try this:
out vec4 fragColor;
void main()
{
fragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
Note that gl_FragDepth hasn't changed and is still available.
For more information see https://www.opengl.org/wiki/Fragment_Shader
The predefined variable gl_FragColor does not exist anymore in GLSL ES 3.00. You need to define your own out variable for the output of the fragment shader. You can use any name you want, for example:
out vec4 FragColor;
void main() {
...
FragColor = ...;
}
This follows the Core Profile of full OpenGL. The reason for not having a pre-defined fragment shader output is that it does not scale well for multiple render targets, and for render targets that need types other than float vectors.
I'm working on a beginner-level GLSL shader program. I'm following this tutorial. But my sphere always appears in greyscale and is not colored red as I expected.
Vertex Shader:
varying vec3 normal, lightDir;
void main() {
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
normal = gl_NormalMatrix * gl_Normal;
vec4 vertex_in_modelview_space = gl_ModelViewMatrx * gl_Vertex;
lightDir = vec3(gl_LightSource[0].position – vertex_in_modelview_space);
}
Frag Shader:
varying vec3 normal, lightDir;
void main()
{
const vec4 AmbientColor = vec4(0.1, 0.0, 0.0, 1.0);
const vec4 DiffuseColor = vec4(1.0, 0.0, 0.0, 1.0);
vec3 normalized_normal = normalize(normal);
vec3 normalized_lightDir = normalize(lightDir);
float DiffuseTerm = clamp(dot(normal, lightDir), 0.0, 1.0);
gl_FragColor = AmbientColor + DiffuseColor * DiffuseTerm;
}
The code is just copy and paste off the tutorial.
From the frag shader, the diffuse color is red, but my sphere is greyscale. I know that the shaders are loaded correctly though because if I take out the code in the frag shader and use the following:
gl_FragColor = vec4(0.0,1.0,0.0,1.0);
then my sphere is solid green as expected. I do not know if it's something in the OpenGL code (like Renderer.cpp) that's causing a conflict, or if there's something else wrong.
This is my first time coding in GLSL, and I'm quite confused about which glEnable settings I need to turn on/off for the shader to work properly.
Thanks for any feedback!
EDIT:
Ok, if I call glColor3f before rendering, I can get the right color. But doesn't the light's color directly result in a change of color in the sphere? I'm worried that I'm not actually calling the functions in the shader...
EDIT2:
So it turns out that whenever I put any code in the vertex shader or frag shader (other than gl_Color = ...), the solid color I get disappears... I guess this means that there's something horribly wrong with my shaders?
EDIT3:
Here's the code for setting up my shader (supplied by my TA):
char *vs = NULL,*fs = NULL;
v = glCreateShader(GL_VERTEX_SHADER);
f = glCreateShader(GL_FRAGMENT_SHADER);
vs = textFileRead(vert);
fs = textFileRead(frag);
const char * ff = fs;
const char * vv = vs;
glShaderSource(v, 1, &vv,NULL);
glShaderSource(f, 1, &ff,NULL);
free(vs);
free(fs);
glCompileShader(v);
glCompileShader(f);
p = glCreateProgram();
glAttachShader(p,f);
glAttachShader(p,v);
glLinkProgram(p);
int infologLength = 0;
int charsWritten = 0;
char *infoLog;
glGetProgramiv(p, GL_INFO_LOG_LENGTH,&infologLength);
if (infologLength > 0)
{
infoLog = (char *)malloc(infologLength);
glGetProgramInfoLog(p, infologLength, &charsWritten, infoLog);
printf("%s\n",infoLog);
free(infoLog);
}
EDIT4:
Using shader logs as suggested by kvark, I managed to fix the bugs in the shaders (turns out there were a couple of mistakes). If you would like to see the final code, please leave a comment or message me (this question is getting long).
It's a good idea to check not just the link log, but also the compile log for each shader, as well as the compile/link status:
glGetShaderInfoLog(...)
glGetShaderiv(...,GL_COMPILE_STATUS,...)
glGetProgramiv(...,GL_LINK_STATUS,...)
Make sure the results are positive and the logs are empty (or good).
The diffuse term is calculated incorrectly in your example. It should have the following value:
float DiffuseTerm = max(0.0, dot(normalized_normal,normalized_lightDir) );
You don't need clamp() as the dot() result of normalized vectors can't exceed 1.
If you have made sure the shader program linked correctly and was activated for the draw, and the result is still weird, try selecting different components of your final color equation to find the wrong one:
gl_FragColor = DiffuseColor; //can't be grayscale
gl_FragColor = vec4(DiffuseTerm); //should be diffuse grayscale
BTW, glColor3f should have nothing to do with your shader, as you don't use gl_Color inside it. If the result changes when you call it, that would mean shader activation failed (it didn't link or wasn't used at all).
Good Luck!
Maybe it's due to unwanted behaviour in your alpha channel result.
You're actually computing lighting on your alpha channel, ending up with something like:
gl_FragColor.a = 1.0 + 1.0 * DiffuseTerm
which will give you >= 1.0 values.
You should be careful not to include your alpha channel in your output (or even in your calculations).
Try making sure blending is disabled, or fix your shader to something like:
varying vec3 normal, lightDir;
void main()
{
const vec3 AmbientColor = vec3(0.1, 0.0, 0.0);
const vec3 DiffuseColor = vec3(1.0, 0.0, 0.0);
vec3 normalized_normal = normalize(normal);
vec3 normalized_lightDir = normalize(lightDir);
float DiffuseTerm = clamp(dot(normal, lightDir), 0.0, 1.0);
gl_FragColor = vec4(AmbientColor + DiffuseColor * DiffuseTerm, 1.0);
}