I'm trying to implement specular reflection using values sampled from a grayscale, 1D texture as the multiplicative term.
I've implemented a toggle so I can see the difference between the two, but for some reason, when the sampled color is enabled, the areas of the scene with no light display as a light gray, whereas without the sampling those same areas display as black.
Why is this? Here's the area where I'm setting the fragment color.
if (u_specularRamp == 1)
{
    // Use the clamped specular term as the horizontal coordinate into the ramp.
    specularAmount = clamp(specularAmount, 0.0, 1.0);
    vec2 texCoords = vec2(specularAmount, 0.5);
    vec4 sampledColor = texture(u_ramp_tex, texCoords);
    specularReflection = vec3(0.3 * sampledColor.x);
}
else
{
    specularReflection = vec3(0.3 * specularAmount);
}
FragColor = vec4(specularReflection, 1.0);
u_specularRamp is an integer uniform I'm passing in to toggle the sampled color on and off.
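I set it from the application like this (a minimal sketch; shaderProgram stands in for my actual program handle):
// Toggle the ramp lookup on (1) or off (0) from the host side.
GLint rampLoc = glGetUniformLocation(shaderProgram, "u_specularRamp");
glUniform1i(rampLoc, 1);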
I've fixed the issue. In case anyone runs into this problem: the behaviour was caused by the ramp texture's wrap parameter being set to GL_REPEAT. With GL_REPEAT and linear filtering, a sample at coordinate 0.0 sits on the wrap boundary and gets averaged with the texel at the opposite end of the ramp, which is where the light gray came from. If you want black where there isn't any light, set that parameter to GL_CLAMP_TO_EDGE.
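That is, something like the following (a sketch; rampTexture is an illustrative name):
// Clamp the ramp so samples at 0.0 and 1.0 don't filter against the opposite edge.
glBindTexture(GL_TEXTURE_2D, rampTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);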
Hey everyone, I'm working with lighting in a 2D tile-based game and have run into a problem with my lighting calculations. In my game I take greyscale images and then color them in a shader with whatever color I like, whether that's green (rgb = (0,1,0)), red (rgb = (1,0,0)), or any other color. I then apply my lighting calculations to that textured and colored pixel. The lighting works fine when the light is white (rgb = (1,1,1)), but when it is, say, red or green, it won't show the way I want it to. I know why this is happening, of course: realistically, a pure red light in a pure green room would reflect no red light, so the room would remain dark. What I really want is to see a red light appear over a green surface. So my question is: how can I show a red light clearly on a green surface (or really any color on any surface)?
This is the code for my fragment shader, where attenuation is simply the attenuation for the light, lightColor is the light's RGB value, distance is the distance from the given vertex to that light (calculated in the vertex shader), and finally color is the RGB value that is applied to the texture.
Thanks in advance for your help!
vec3 totalDiffuse = vec3(0.0);
for (int i = 0; i < 4; i++)
{
    // Constant, linear and quadratic attenuation terms in x, y and z.
    float attFactor = attenuation[i].x + (attenuation[i].y * distance[i]) + (attenuation[i].z * distance[i] * distance[i]);
    totalDiffuse = totalDiffuse + (lightColor[i] / attFactor);
}
// Ambient floor so fully unlit areas stay slightly visible.
totalDiffuse = max(totalDiffuse, 0.2);
out_Color = texture(textureSampler, pass_textureCoords) * vec4(color, alpha) * vec4(totalDiffuse, 1.0);
And here is an image of what a pure red light currently looks like on a surface. It should be inside the white circle; you may be able to see it is affecting the water a little, because I give the water a small red component:
Light Demo Image
One possibility would be to change the light calculation.
Calculate the gray scale (luminance) of the light color and of the surface color. Multiply the surface color by the gray scale of the light color, multiply the light color by the gray scale of the surface color, and finally sum them up:
vec4 texCol = texture(textureSampler, pass_textureCoords);

// Rec. 709 luminance weights.
float grayTex = dot(texCol.rgb, vec3(0.2126, 0.7152, 0.0722));
float grayCol = dot(color.rgb, vec3(0.2126, 0.7152, 0.0722));

vec3 mixCol = texCol.rgb * grayCol + color.rgb * grayTex;
out_Color = vec4(mixCol * totalDiffuse, texCol.a * alpha);
Note that this algorithm emphasizes the color of the light at the expense of the color of the surface, but that is exactly what you asked for by dipping a green area in red light. Of course, it contradicts the desire to illuminate an area in its own color: if the light is white, then the surface will also shine white.
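To see it with numbers: a pure red light color = (1, 0, 0) on a pure green texel texCol.rgb = (0, 1, 0) gives grayCol = 0.2126 and grayTex = 0.7152, so mixCol = (0, 1, 0) * 0.2126 + (1, 0, 0) * 0.7152 = (0.7152, 0.2126, 0.0), a clearly visible red-orange instead of black.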
If you want some light sources with the effect described above, but other sources with the original effect from the question, then I recommend introducing a parameter that mixes the two effects:
uniform float u_lightTint;

void main()
{
    .....

    vec3 mixCol = texCol.rgb * grayCol + color.rgb * grayTex;
    mixCol = mix(texCol.rgb * color.rgb, mixCol.rgb, u_lightTint);
    out_Color = vec4(mixCol * totalDiffuse, texCol.a * alpha);
}
If u_lightTint is set to 1.0, the "new" light calculation is used; if it is set to 0.0, the original light calculation is used. The two algorithms can be blended linearly through u_lightTint.
Alternatively, the u_lightTint parameter can be encoded in the alpha channel of the light color:
mixCol = mix(texCol.rgb * color.rgb, mixCol.rgb, color.a);
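If you go with the separate uniform instead, setting it from the host side is just (a sketch; program is an illustrative name):
// 1.0 = mixed lighting from above, 0.0 = original calculation.
GLint tintLoc = glGetUniformLocation(program, "u_lightTint");
glUniform1f(tintLoc, 1.0f);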
I am working on a test project where one big 3D quad is intersected by several other 3D quads (rotated differently). All quads have transparency (ranging from fully opaque to fully transparent). The small quads never overlap; well, they might, but the camera placement and the Z-buffer make sure that only those parts that may overlap actually do.
To render this, I first render the big quad to a different rgba framebuffer for later lookup.
Now I start rendering to the screen. A skybox is rendered first, then the small quads are rendered with alpha blending enabled: GL_SRC_ALPHA and GL_ONE_MINUS_SRC_ALPHA. After that I render the big quad again, this time also with alpha blending enabled. Of course, only the pixels in front of the quads will be rendered, because of the Z-buffer.
That's why, in the fragment shader of the small quads, I do a raytrace to find the intersection with the big quad. If the intersection point is BEHIND the small quad, I fetch the texel from the originally created framebuffer and blend this texel manually with the calculated small quad texel color.
But the result is not the same: the colors in front are rendered correctly (the GPU handles the blending there), while the colors behind them are "weird". They are lighter or darker, but I never get the same result. After consideration, I think this must be because I am not emitting the correct alpha value from my small-quad shader, so the GPU-performed blending changes the colors even more.
So, how exactly is the alpha value calculated when blending on the GPU with the above-mentioned blending method? In the OpenGL manual I find: A * Srgba + (1 - A) * Drgba. When I emit that result, the blending is not the same. I am quite sure that this is because my result then passes through the GPU blending a second time.
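Spelling out what I believe the fixed-function blend does with glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), per channel, alpha included:
// src = fragment shader output, dst = value already in the framebuffer.
vec4 result;
result.rgb = src.a * src.rgb + (1.0 - src.a) * dst.rgb;
result.a   = src.a * src.a   + (1.0 - src.a) * dst.a;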
The ray tracing is correct, I'm certain of that.
So, what should I do with my manual blending?
Or, should I use another method to get the right effect?
Not really necessary, I believe, but here is some code (optimization is for later):
vec4 vRayOrg = vec4(0.0, 0.0, 0.0, 1.0);
vec4 vRayDir = vec4(normalize(vBBPosition.xyz), 0.0);
vec4 vPlaneOrg = mView * vec4(0.0, 0.0, 0.0, 1.0);
vec4 vPlaneNormal = mView * vec4(0.0, 1.0, 0.0, 0.0);

// Ray/plane intersection in view space.
float div = dot(vRayDir.xyz, vPlaneNormal.xyz);
float t = dot(vPlaneOrg.xyz - vRayOrg.xyz, vPlaneNormal.xyz) / div;
vec4 pIntersection = vRayOrg + t * vRayDir;

vec3 normal = normalize(vec4(vCoord, 0.0, 0.0)).xyz;
vec3 vLight = normalize(vPosition.xyz / vPosition.w);
float distance = length(vCoord) - 1.0;
float distf = clamp(0.225 - distance, 0.0, 1.0);
float diffuse = clamp(0.25 + dot(-normal, vLight), 0.0, 1.0);
vec4 cOut = diffuse * 9.0 * distf * cGlow;

if (distance > 0.0 && t >= 0.0 && pIntersection.z <= vBBPosition.z)
{
    // The big quad lies behind this fragment: blend its stored color manually.
    vec2 vTexcoord = (vProjectedPosition.xy / vProjectedPosition.w) * 0.5 + 0.5;
    vec4 cLookup = texture(tLookup, vTexcoord);
    cOut = cLookup.a * cLookup + (1.0 - cLookup.a) * cOut;
}

vFragColor = cOut;
Is it possible in OpenGL to sample from, and then write to the same texture in one draw call using GLSL?
I think it actually IS possible. I did it like this for GLSL alpha blending:
vec4 prevColor = texture(dfFrameTexture, pointCoord);
vec4 result = vec4(color.a) * color + vec4(1.0 - color.a) * prevColor;
gl_FragColor = result;
dfFrameTexture is the texture that is being written to with gl_FragColor.
I tested this and it worked: drawing a white quad onto dfFrameTexture after it's been cleared to black results in a grey quad when I set the color to (0.5, 0.5, 0.5, 0.5).
I'm using GLSL to draw sprites from a sprite-sheet. I'm using jME 3, but there are only small differences, and only with regard to deprecated functions.
The most important part of drawing a sprite from a sprite sheet is to draw only a subset/range of pixels, for example the range from (100, 0) to (200, 100). In the following test case sprite-sheet, and using the previous bounds, only the green part of the sprite-sheet would be drawn.
Test-case sprite-sheet image
This is what I have so far:
Definition:
MaterialDef Solid Color {
    // This is the list of user-defined variables to be used in the shader
    MaterialParameters {
        Vector4 Color
        Texture2D ColorMap
    }
    Technique {
        VertexShader GLSL100: Shaders/tc_s1.vert
        FragmentShader GLSL100: Shaders/tc_s1.frag
        WorldParameters {
            WorldViewProjectionMatrix
        }
    }
}
.vert file:
uniform mat4 g_WorldViewProjectionMatrix;
attribute vec3 inPosition;
attribute vec4 inTexCoord;
varying vec4 texture_coordinate;

void main(){
    gl_Position = g_WorldViewProjectionMatrix * vec4(inPosition, 1.0);
    texture_coordinate = vec4(inTexCoord);
}
.frag:
uniform vec4 m_Color;
uniform sampler2D m_ColorMap;
varying vec4 texture_coordinate;

void main(){
    vec4 color = vec4(m_Color);
    // texture2D expects a vec2 coordinate, so sample with the st components.
    vec4 tex = texture2D(m_ColorMap, texture_coordinate.st);
    color *= tex;
    gl_FragColor = color;
}
In jME 3, inTexCoord refers to gl_MultiTexCoord0, and inPosition refers to gl_Vertex.
As you can see, I tried to give the texture_coordinate a vec4 type, rather than a vec2, so as to be able to reference its p and q values (texture_coordinate.p and texture_coordinate.q). Modifying them only resulted in different hues.
m_Color refers to the color, inputted by the user, and serves the purpose of altering the hue. In this case, it should be disregarded.
So far, the shader works as expected and the texture displays correctly.
I've been using resources and tutorials from NeHe (http://nehe.gamedev.net/article/glsl_an_introduction/25007/) and Lighthouse3D (http://www.lighthouse3d.com/tutorials/glsl-tutorial/simple-texture/).
Which functions/values should I alter to get the desired effect of displaying only part of the texture?
Generally, if you want to only display part of a texture, then you change the texture coordinates associated with each vertex. Since you don't show your code for how you're telling OpenGL about your vertices, I'm not sure what to suggest. But in general, if you're using older deprecated functions, instead of doing this:
// Lower Left of triangle
glTexCoord2f(0,0);
glVertex3f(x0,y0,z0);
// Lower Right of triangle
glTexCoord2f(1,0);
glVertex3f(x1,y1,z1);
// Upper Right of triangle
glTexCoord2f(1,1);
glVertex3f(x2,y2,z2);
You could do this:
// Lower Left of triangle
glTexCoord2f(1.0 / 3.0, 0.0);
glVertex3f(x0,y0,z0);
// Lower Right of triangle
glTexCoord2f(2.0 / 3.0, 0.0);
glVertex3f(x1,y1,z1);
// Upper Right of triangle
glTexCoord2f(2.0 / 3.0, 1.0);
glVertex3f(x2,y2,z2);
If you're using VBOs, then you need to modify your array of texture coordinates to access the appropriate section of your texture in a similar manner.
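For example, a texture-coordinate buffer for a single quad showing the middle third of the sheet might look like this (a sketch; texCoordVbo is an illustrative name):
// u runs from 1/3 to 2/3 of the sheet; one (u, v) pair per vertex.
GLfloat texCoords[] = {
    1.0f / 3.0f, 0.0f,   // lower left
    2.0f / 3.0f, 0.0f,   // lower right
    2.0f / 3.0f, 1.0f,   // upper right
    1.0f / 3.0f, 1.0f,   // upper left
};
glBindBuffer(GL_ARRAY_BUFFER, texCoordVbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(texCoords), texCoords, GL_STATIC_DRAW);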
For the sampler2D the texture coordinates are normalized so that the leftmost and bottom-most coordinates are 0, and the rightmost and topmost are 1. So for your example of a 300-pixel-wide texture, the green section would be between 1/3rd and 2/3rds the width of the texture.
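In general u = x_pixel / texture_width, so here 100 / 300 = 1/3 and 200 / 300 = 2/3.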
In my .cpp code I create a list of quads, and a few of them have a flag. In the pixel shader I check whether this flag is set. If it is not set, the quad gets colored in red, for example. If the flag is set, I want to decide the color of every single pixel, so if I need to color half of the flagged quad in red and the other half in blue, I can simply do something like:
if (coordinate in quad < something) color = red;
else color = blue;
In this way I can get half of the quad colored in blue and the other half colored in red, or I can decide where to put the red color and where to put the blue one.
Imagine I've got a quad of 50x50 pixels:
[frag]
if (quad.flag == 1)
{
    if (Pixel_coordinate.x < 25.0) gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
    else gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);
}
else
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
In this case I would expect that a quad with the flag set will get two colors per face.
I hope I have been more specific now.
Thanks.
Just to add something: I can't use any textures.
OK, I do this now:
Every quad has 4 texture coordinates: (0,0), (0,1), (1,1), (1,0).
I enable the texture coordinates using:
glTexCoordPointer(2, GL_SHORT, sizeof(Vertex), BUFFER_OFFSET(sizeof(float) * 7));
[vert]
varying vec2 texCoord;

void main()
{
    // Position transform assumed here; it is omitted from my snippet.
    gl_Position = ftransform();
    texCoord = gl_MultiTexCoord0.xy;
}
[frag]
varying vec2 texCoord;

void main()
{
    float x1 = texCoord.s;
    float x2 = texCoord.t;
    gl_FragColor = vec4(x1, x2, 0.0, 1.0);
}
I always get the yellow color, so x1 = 1 and x2 = 1 almost everywhere, and some quads are yellow/green.
I would expect the texture coordinates to vary across the quad in the fragment shader, so I should get a gradient. Am I wrong?
If you want to know the coordinate within the quad, you need to calculate it yourself. To do that, you'll need to create a new interpolant (call it something like vec2 quadCoord) and set it appropriately for each vertex, which means you'll likely also need to add it as an attribute and pass it through your vertex shader, e.g.:
// in the vertex shader
attribute vec2 quadCoordIn;
varying vec2 quadCoord;

void main() {
    quadCoord = quadCoordIn;
    :
You'll need to feed in this attribute from your drawing code when drawing your quads, as sketched below. For each quad, the vertexes will likely have quadCoordIn values of (0,0), (0,1), (1,1) and (1,0); you could use some other coordinate system if you prefer, but this is the easiest.
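A minimal sketch of that setup with generic attributes (program and quadCoordVbo are illustrative names):
// One quad-local (x, y) pair per vertex of the quad.
GLfloat quadCoords[] = { 0.0f, 0.0f,  0.0f, 1.0f,  1.0f, 1.0f,  1.0f, 0.0f };
GLint quadCoordLoc = glGetAttribLocation(program, "quadCoordIn");
glBindBuffer(GL_ARRAY_BUFFER, quadCoordVbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(quadCoords), quadCoords, GL_STATIC_DRAW);
glEnableVertexAttribArray(quadCoordLoc);
glVertexAttribPointer(quadCoordLoc, 2, GL_FLOAT, GL_FALSE, 0, 0);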
Then, in your fragment program, you can access quadCoord.xy to determine where in the quad you are.
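For example, a fragment shader along these lines (a minimal sketch) would split every quad at its own midpoint, independent of where the quad sits on screen:
varying vec2 quadCoord;

void main() {
    // Left half of the quad red, right half blue.
    gl_FragColor = (quadCoord.x < 0.5) ? vec4(1.0, 0.0, 0.0, 1.0)
                                       : vec4(0.0, 0.0, 1.0, 1.0);
}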
In addition to Chris Dodd's answer, you can also access the screen-space coordinate (in pixels, though actually at pixel centers, so values ending in .5) of the currently processed fragment through the special fragment shader variable gl_FragCoord:
gl_FragColor = (gl_FragCoord.x<25.0) ? vec4(1.0, 0.0, 0.0, 1.0) : vec4(0.0, 1.0, 0.0, 1.0);
But this gives you the position of the fragment in screen space, and thus relative to the lower left corner of your viewport. If you actually need to know the position inside the individual quad (which makes more sense if you want to color each quad half-by-half, since the "half-cut" would otherwise vary with the quad's position), then Chris Dodd's answer is the correct approach.