RenderMonkey - GLSL light - glsl

I am making a shader in which I am using a spot light. Before writing my own, I am trying out some shaders that I've found on the Internet.
I found this GLSL code:
vec4 final_color =
(gl_FrontLightModelProduct.sceneColor * gl_FrontMaterial.ambient) +
(gl_LightSource[0].ambient * gl_FrontMaterial.ambient);
Does anyone know how I can do this in RenderMonkey? I know that I cannot use gl_LightSource[0], so what should I use instead?

In RenderMonkey you need to define variables for the light properties your shader uses, such as a vec4 for each of the light's ambient, diffuse, and specular colors, and a vec3 for the light's position or the direction to the light.
You can then mark these as artist variables and edit them 'live' in the Artist Editor on the right.
It's a bit awkward: either you adjust your shader so that it doesn't rely on the built-in gl_ constructs (so the same source runs both in your program and in RenderMonkey), or you edit the shaders every time you move between the two. I prefer the former.
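As a minimal sketch, the snippet from the question could be rewritten with plain uniforms that you bind to artist variables (the uniform names here are made up for illustration, not RenderMonkey built-ins):

```glsl
// Hypothetical uniform names -- create these as variables in RenderMonkey
uniform vec4 sceneAmbient;    // replaces gl_FrontLightModelProduct.sceneColor
uniform vec4 lightAmbient;    // replaces gl_LightSource[0].ambient
uniform vec4 materialAmbient; // replaces gl_FrontMaterial.ambient

void main()
{
    vec4 final_color = (sceneAmbient * materialAmbient) +
                       (lightAmbient * materialAmbient);
    gl_FragColor = final_color;
}
```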

Related

How to allow users' custom shaders in OpenGL software

In software like Unity or Unreal, for example, how do they allow users to add their own custom shaders to an object?
Is this custom shader just a normal fragment shader or is it another kind of shader? And if it is just a fragment shader, how do they deal with the lights?
I'm not gonna post the code here because it's big and would pollute the page, but I'm starting to learn through here: https://github.com/opentk/LearnOpenTK/blob/master/Chapter2/6-MultipleLights/Shaders/lighting.frag (it's a series of tutorials, this is the shader from the last one), and they say we should put the light types in functions, inside the fragment shader, to calculate the colors of each fragment.
For example, this function to calculate a directional light, extracted from the code I sent above:
vec3 CalcDirLight(DirLight light, vec3 normal, vec3 viewDir)
{
    vec3 lightDir = normalize(-light.direction);
    // diffuse shading
    float diff = max(dot(normal, lightDir), 0.0);
    // specular shading
    vec3 reflectDir = reflect(-lightDir, normal);
    float spec = pow(max(dot(viewDir, reflectDir), 0.0), material.shininess);
    // combine results
    vec3 ambient  = light.ambient  * vec3(texture(material.diffuse, TexCoords));
    vec3 diffuse  = light.diffuse  * diff * vec3(texture(material.diffuse, TexCoords));
    vec3 specular = light.specular * spec * vec3(texture(material.specular, TexCoords));
    return (ambient + diffuse + specular);
}
But I've never seen people adding lights in their shaders in Unity for example, people just add textures and mess with colors, unless they really want to specifically mess with the lights.
Is there a way of making just one fragment shader that will compute all the types of light, and the user could then apply another shader, just for the object material, on top of that?
If you don't know how to answer but have some good reading material, or place where I could learn more about openGL and GLSL it would be of great value as well.
There are a couple of different ways to structure shader files, each with different pros and cons.
As individual programs. You make each file its own shader program. This makes it simple to add new programs, and would allow your users to just write a program in GLSL, HLSL, or an engine's custom shader language. You will have to provide some way for the user to express what kind of data the program expects, unless you query it from the engine, but it might get complicated to make something that's generic enough.
Über shader! Put all desired functionality in one shader and let the behavior be controlled by control flow or preprocessor macros, such as #ifdef. The user then only has to write the main function (which the application splices into the rest of the shader). This lets the user use all the predefined variables and functions. The obvious downside is that the shader can get big and hard to maintain, and small changes might break many shaders.
Micro shaders. Each program contains a small piece of common functionality, and the application concatenates them into a working shader. The user just writes the main function and tells the program which functionality to add. The problem is that it's easy to get name conflicts unless you're careful, and it's harder to implement than the über shader.
Effect files. Provided by Microsoft’s effect framework or NVIDIA’s CgFX framework (deprecated).
Abstract shade trees. I don't actually know what this is, but it's supposed to be a solution.
You can also combine some of these techniques or try to invent your own solution based on your needs. Here's where these solutions are discussed (in section 2.3.3, Existing Solutions).
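As a rough sketch of the über-shader idea (the macro and uniform names here are invented for illustration), the application prepends #define lines to the shader source before compiling, and the preprocessor strips out unused features:

```glsl
// The application prepends e.g. "#define USE_DIR_LIGHT" before compiling.
#ifdef USE_DIR_LIGHT
uniform vec3 dirLightDirection;
uniform vec3 dirLightColor;
#endif

vec3 applyLighting(vec3 baseColor, vec3 normal)
{
    vec3 result = baseColor * 0.1; // ambient term
#ifdef USE_DIR_LIGHT
    float diff = max(dot(normal, -normalize(dirLightDirection)), 0.0);
    result += baseColor * dirLightColor * diff;
#endif
    return result;
}
```

The user's main function then calls applyLighting without needing to know which light paths were compiled in.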

Alternative to gl_TexCoord.xy to get texture coordinate

I have always written my shaders in GLSL 3 (with the #version 330 line), but it's starting to be pretty old, so I recently tried to make a shader in GLSL 4 and use it with the SFML library for rendering, instead of pure OpenGL.
For now, my goal is a basic shader for a 2D game, which takes the color of each pixel of a texture and modifies it. I always did that with gl_TexCoord[0].xy, but it seems to be deprecated now, so I searched and heard that I must use in and out variables with a vertex shader, so I tried.
 
Fragment shader
#version 400

in vec2 fragCoord;
out vec4 fragColor;

uniform sampler2D image;

void main(){
    // Get the color
    vec4 color = texture( image, fragCoord );
    /*
     * Do things with the color
     */
    // Return the color
    fragColor = color;
}
 
Vertex shader
#version 400

in vec3 position;
in vec2 textureCoord;
out vec2 fragCoord;

void main(){
    // Set the position of the pixel and vertex (I guess)
    fragCoord = textureCoord;
    gl_Position = vec4( position, 1.0 );
}
I've also seen that we could add the projection, model, and view matrices, but I don't know how to do that with SFML (I don't even think we can), and I don't want to learn complex OpenGL or SFML internals just to change some colors in a 2D game. So here is my question:
Is there an easy way to just get the coordinates of the pixel we're working on? Maybe get rid of the vertex shader, or use it without using matrices?
Unless you really want to learn a lot of low-level OpenGL, writing your own shaders just for textures is a little overkill. SFML can handle textures and shaders for you behind the scenes (here is a good article on how to use them), so you don't need to worry about shaders at all. Also note that you can change the color of SFML sprites (which is, I believe, what you are trying to do) with sprite.setColor(sf::Color(*whatever*));. Plus, there's no problem in using version 330; that's what I usually use, albeit with the in and out stuff as well.
If you really want to use your own shaders for fancy effects like pixelation, blurring, etc., I can't help you much, since I've only ever worked with pure OpenGL and don't know how SFML handles the vertex information. But this is some interesting example code you can check out, here is a tutorial, and here is a reference.
To answer your question more directly: gl_FragCoord is a built-in GLSL variable that holds the fragment's window-space position, but you still have to set gl_Position in the vertex shader. You can't get rid of the vertex shader if you are doing anything OpenGL related. You'd have to do the matrix math yourself (this is a wonderful library) and probably set up buffers (like this) to tell OpenGL where everything is.
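As a sketch of the gl_FragCoord route (the resolution uniform is an assumption; you'd have to set it from your application), a fragment shader can derive a texture coordinate without receiving anything from the vertex shader:

```glsl
#version 400

out vec4 fragColor;

uniform sampler2D image;
uniform vec2 resolution; // viewport size in pixels; must be set by the application

void main(){
    // gl_FragCoord is in window space, so divide by the viewport size
    // to get a 0..1 coordinate (flip y if the texture appears upside down)
    vec2 uv = gl_FragCoord.xy / resolution;
    fragColor = texture( image, uv );
}
```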

How to calculate directional light in GLSL shader?

The various examples of directional lights are too varied for me to get a coherent picture of what's supposed to be happening; some examples use matrices with unexplained contents, and others just use the vertex normal and the light direction.
I've been attempting to write a shader based on what made the most sense to me, but currently it leaves the scene either fully lit or fully unlit, depending on the light direction.
In my fragment shader:
float diffuseFactor = dot(vertexNormal, -lightDirection);
vec4 diffuseColor = lightColor * diffuseFactor;
fragColor = color * diffuseColor;
So am I way off? Do I need to pass in more information (e.g: modelViewMatrix?) to achieved the desired result?
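You're close. A common cause of the fully lit / fully unlit behavior is forgetting to normalize the vectors (interpolation denormalizes them) and to clamp the dot product, which goes negative on back-facing surfaces. A sketch of the usual form, assuming both vectors are expressed in the same space (e.g. world or view space); the variable names match the snippet above:

```glsl
// Both vectors must be in the same coordinate space.
vec3 N = normalize(vertexNormal);          // renormalize after interpolation
vec3 L = normalize(-lightDirection);       // direction *toward* the light
float diffuseFactor = max(dot(N, L), 0.0); // clamp so back faces get no light
vec4 diffuseColor = lightColor * diffuseFactor;
fragColor = color * diffuseColor;
```

If your normals are transformed by a model matrix, the light direction must be brought into that same space, which is where the modelViewMatrix you mention usually comes in.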

Shadow cube map checkered patterns - GLSL texture() on samplerCubeShadow ignoring 4th component of vector

I'm trying to implement point-light shadowing. I got spotlight shadowing working first, but when I switched to point-light shadowing (using a cube map, rendering depth from 6 POVs, etc.) I now get a checkered pattern of light (with NO shadows). Does anyone have any intuition about why this might be?
Here's a screenshot:
(Note that if you look closely, you can clearly see that a cubemap is being rendered with the front face of the cube just to the right of the triangle)
And here's my render-pass fragment shader GLSL code (if you want to see the rest, I can post that as well, but I figured this is the important bit).
Note: I'm using deferred lighting, so the vertex shader of this pass just draws a quad, and pos_tex, norm_tex, and col_tex come from the geometry buffer generated in a previous pass.
#version 330 core

in vec2 UV;
out vec3 color;

uniform vec3 lightPosVec; // Non rotation affine
uniform sampler2D pos_tex;
uniform sampler2D col_tex;
uniform sampler2D norm_tex;
uniform samplerCubeShadow shadow_tex;
uniform sampler2D accum_tex; // the tex being drawn to

void main()
{
    vec3 lightToFragVec = texture(pos_tex, UV).xyz - lightPosVec;
    vec3 l2fabs = abs(lightToFragVec);
    float greatestMagnitudeOfLightToFragVec = max(l2fabs.x, max(l2fabs.y, l2fabs.z));
    // Convert the largest axis distance to the depth value the cube map stores
    // (standard perspective depth with near = 0.1, far = 100.0)
    float lightToFragDepth = ((100.0 + 0.1) / (100.0 - 0.1)
        - (2.0 * 100.0 * 0.1) / (100.0 - 0.1) / greatestMagnitudeOfLightToFragVec
        + 1.0) * 0.5;
    float shadowed = texture(shadow_tex, vec4(normalize(lightToFragVec), lightToFragDepth));
    color =
        texture(accum_tex, UV).xyz +                                             // current value
        texture(col_tex, UV).xyz                                                 // frag color
        * dot(lightPosVec - texture(pos_tex, UV).xyz, texture(norm_tex, UV).xyz) // angle of incidence of light to normal
        * (4.0 + 15.0 * shadowed)                                                // shadow amplification
        / max(0.000001, dot(lightToFragVec, lightToFragVec));                    // distance cutoff
}
Thank you!
EDIT:
Ok, so I've been toying around with it, and now the texture seems to be random...
So this makes me think that the depth cube is just full of noise? But that seems unlikely, because for the texture() function with a samplerCubeShadow to return 1.0, the value at that point must be EXACTLY the value of the depth at that fragment... right? Also, I have it set up to control the position of the light with wasd+up+down, and the pattern moves with the movement of the light (when I back up, it gets bigger/dimmer). Which means that the values MUST be dynamically changing to match the actual distance? Oh man I'm confused...
EDIT 2
Ok, sorry, this question is out of control, but I figured I'd continue to document my progress for the sake of anyone in the future with similar problems.
Anyways, I've now figured out that I get the same exact result no matter what I put as the 4th component of the 4d vector passed to texture() with the shadowcubemap. Like, I can even put a constant, and I get the exact same result. How is that possible?
EDIT 3
Darn. Turns out the error had nothing to do with anything I've just said. See answer below for details :(
Darn. Turns out I'm just a fool. I was rendering things in the wrong order; nothing was wrong with the shader code after all. So the checkerboard was just noise in the textures. And the reason the noise still rendered is that I was using GL_TEXTURE_COMPARE_FUNC with GL_LEQUAL (as I should be), and the noise was either well below 1 or well above 1, so it didn't matter where I moved the light.

Basic OpenGL lighting question

I think this is an extremely stupid newbie question, but then I am a newbie in graphics and OpenGL. Having drawn a sphere, put a light source nearby, and also specified ambient light, I started experimenting with light and material values and came to a surprising conclusion: the colors we specify with glColor* do not matter at all when lighting is enabled. Instead, the equivalent is the material's ambient component. Is this conclusion correct? Thanks
If lighting is enabled, then instead of the vertex color, the material colors are used (there are several of them, for the different types of response to light). Material colors are specified with the glMaterial* functions.
If you want to reuse your code, you can call glEnable(GL_COLOR_MATERIAL) and glColorMaterial(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE) to have your old glColor* calls mapped to the material color automatically.
(And please switch to shaders as fast as possible - the shader approach is both easier and more powerful)
I suppose you don't use a fragment shader yet. From glprogramming.com:
vertex color =
    the material emission at that vertex +
    the global ambient light scaled by the material's ambient property at that vertex +
    the ambient, diffuse, and specular contributions from all the light sources, properly attenuated
So yes, vertex color is not used.
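The fixed-function equation above maps roughly to the following per-vertex GLSL sketch. This is a simplification (one directional light, no attenuation, no spotlight or specular terms), using the compatibility-profile gl_* built-ins since the question is about the fixed pipeline:

```glsl
// Rough per-vertex equivalent of the fixed-function lighting equation
// (one directional light; attenuation, spotlight, and specular omitted).
vec3 N = normalize(gl_NormalMatrix * gl_Normal);
vec3 L = normalize(gl_LightSource[0].position.xyz); // directional light

vec4 color = gl_FrontMaterial.emission                     // emission
    + gl_LightModel.ambient * gl_FrontMaterial.ambient     // global ambient
    + gl_LightSource[0].ambient * gl_FrontMaterial.ambient // per-light ambient
    + gl_LightSource[0].diffuse * gl_FrontMaterial.diffuse
        * max(dot(N, L), 0.0);                             // diffuse
```

Note that glColor* appears nowhere in it, which is exactly what you observed.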
Edit: You can also look up the GL lighting equation in the GL specification (you have one nearby, don't you? ^^)