How to apply a shader to only a part of a UV map? - glsl

I have been looking for a solution to my problem for a long time. I am learning to write shaders, and I am having trouble coloring only a part of the UV map of the 3D model I load. Could someone tell me how to do it? I tried texture2D, but unfortunately to no avail; it is possible that I used that function incorrectly. Below is the fragment of my shader that is responsible for the color change.
shader.fragmentShader = shader.fragmentShader.replace(
    `#include <dithering_fragment>`,
    `#include <dithering_fragment>
    vec3 base = shaderColor;
    vec3 c = mix(gl_FragColor.rgb, base, 1.0);
    gl_FragColor = vec4(c, diffuseColor.a);
    `
);
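For reference, a minimal sketch of one way to restrict the effect, assuming the material exposes the standard three.js `vUv` varying and a hypothetical `uniform sampler2D maskTexture` marking the region to recolor (both names are assumptions, not from the original code):

```glsl
// Hypothetical sketch: recolor only where a mask texture is opaque.
// Assumes `varying vec2 vUv` and `uniform sampler2D maskTexture`
// have been declared and wired up by the host (three.js) code.
vec4 mask = texture2D(maskTexture, vUv); // sample the mask at this fragment's UV
vec3 base = shaderColor;                 // the replacement color
// Blend toward `base` only where the mask alpha is high;
// elsewhere the original shaded color is kept.
vec3 c = mix(gl_FragColor.rgb, base, mask.a);
gl_FragColor = vec4(c, diffuseColor.a);
```

Note that with a constant blend factor of 1.0, mix() returns `base` everywhere, which is why the whole model changes color; driving the factor from a mask (or from a test on `vUv` against a UV range) limits the effect to part of the map.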

Related

OpenGL(2.1) shader program not running, using color from glColor instead

I am working on an OpenGL 2.1 application and trying to shift code from the fixed to the programmable pipeline piece by piece (i.e. replacing deprecated calls such as glVertex*, glColor*, etc. with buffers/shaders).
The problem is that the color I set for each fragment inside the fragment shader is not being applied; instead, the fragment is automatically painted with the color specified in the last glColor3f call (which is somewhere else in the codebase, completely unrelated to the piece of code at hand).
My vert/frag shaders are pretty straightforward:
#version 120
attribute vec3 position;
attribute vec3 color;
varying vec3 Color;

void main()
{
    mat4 scaleMat = mat4(0.1, 0,   0,   0,
                         0,   0.1, 0,   0,
                         0,   0,   0.1, 0,
                         0,   0,   0,   1);
    Color = color;
    //gl_Position = scaleMat * vec4(position, 1.0); //scaling doesn't have effect!
    gl_Position = vec4(position, 1.0);
}
and
#version 120
varying vec3 Color;

void main()
{
    //gl_FragColor = vec4(0.3, 0.3, 1.0, 1.0); //Const color does not work.
    gl_FragColor = vec4(Color, 1.0); //taking interpolated color doesn't work either!
}
One more thing: you may notice I applied a scaling matrix in the vertex shader to make the geometry 10x smaller, but it has no effect! This makes me wonder whether the shader is even 'doing its thing', because when I write the shader incorrectly I do get compile errors.
I'm using Qt, below is the code used for setting up buffers and shaderprogram.
void myObject::init()
{
    //Shader related code. 'shader' is a QGLShaderProgram object
    shader.addShaderFromSourceFile(QGLShader::Vertex,
                                   ":/shaders/mainshader.vert");
    shader.addShaderFromSourceFile(QGLShader::Fragment,
                                   ":/shaders/mainshader.frag");
    shader.link();
    shader.bind();

    vertexLocation = shader.attributeLocation("position");
    colorLocation = shader.attributeLocation("color");
    .......
    //bathBuffer and colorBuffer are QGLBuffer objects,
    //triList is a vector<vec3> containing triangle data
    bathBuffer.create();
    bathBuffer.bind();
    bathBuffer.setUsagePattern(QGLBuffer::StaticDraw);
    bathBuffer.allocate(sizeof(triList));
    bathBuffer.write(0, triList.data(), sizeof(triList));

    for (int i = 0; i < triList.size(); ++i)
    {
        colorList.push_back(Vec(1, 1, 0));
    }
    colorBuffer.create();
    colorBuffer.bind();
    colorBuffer.setUsagePattern(QGLBuffer::StaticDraw);
    colorBuffer.allocate(sizeof(colorList));
    colorBuffer.write(0, colorList.data(), sizeof(colorList));
    //copies both color and vert data
}
and here's the draw() function:
void myObject::draw()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    shader.bind();

    bathBuffer.bind();
    shader.enableAttributeArray(vertexLocation);
    shader.setAttributeArray(vertexLocation, GL_DOUBLE, triList.data(), 3, 0);

    colorBuffer.bind();
    shader.enableAttributeArray(colorLocation);
    shader.setAttributeArray(colorLocation, GL_DOUBLE, colorList.data(), 3, 0);

    glDrawArrays(GL_TRIANGLES, 0, triList.size());

    shader.disableAttributeArray(vertexLocation);
    shader.disableAttributeArray(colorLocation);
    //shader.release();
}
I even tried using a GL profiler to figure out what might be going wrong beneath all the abstraction:
As you can see, glCreateProgram() returns 1, but calling glUseProgram on that handle returns an error saying the program handle does not refer to an object generated by OpenGL.
Then I thought Qt might be creating a buggy context from the start, so I added this code to my main.cpp as well, just to make sure we get a correct context:
QGLFormat f;
f.setOption(QGL::DeprecatedFunctions | QGL::DepthBuffer | QGL::DoubleBuffer | QGL::Rgba | QGL::AlphaChannel);
f.setVersion(2, 1);

QGLContext ogl_context(f);
ogl_context.create();
ogl_context.makeCurrent();
....to no avail.
tldr;
I'm not sure whether the shaders are working at all.
Is there a way to ignore/'switch off' glColor calls so that I can color the geometry from within the shaders themselves?
I have run out of options to think of. I can provide more info if needed. Any help on this would be appreciated.
Thanks.
EDIT: Added the whole class relevant to this question: bathymetry.h, bathymetry.cpp. The other code is a custom widget class (not doing any OpenGL stuff) that handles the Qt part, and a main.cpp.

OpenGL 3.2 : cast right shadows by transparent textures

I can't seem to find any information on the Web about fixing shadow casting by objects whose textures have alpha != 1.
Is there any way to implement something like a "per-fragment depth test", not a "per-vertex" one, so I could keep a fragment out of the shadow map if the corresponding texel is transparent? In theory, this could also make shadow mapping more accurate.
EDIT
Well, maybe the idea I gave above was terrible; all I want is to tell the shaders that if a texel has alpha < 1, there is no need to shadow things behind it. I guess the depth texture requires only vertex information; that's why every shadow-mapping tutorial has a minimal vertex shader and an empty fragment shader, and nothing happens when I try to do something in the fragment shader.
Anyway, what is the main idea behind fixing shadow casting by partly transparent objects?
EDIT2
I've modified my shaders, and now every fragment gets discarded if even one has transparency o_O. So those objects now don't cast any shadows at all (but opaque ones do)... Please have a look at the shaders:
// Vertex Shader
uniform mat4 orthoView;

in vec4 in_Position;
in vec2 in_TextureCoord;
out vec2 TC;

void main(void) {
    TC = in_TextureCoord;
    gl_Position = orthoView * in_Position;
}
//Fragment Shader
uniform sampler2D texture;

in vec2 TC;

void main(void) {
    vec4 texel = texture2D(texture, TC);
    if (texel.a < 0.4)
        discard;
}
And it's strange because I use the same trick with the same textures in my other shaders and it works... any ideas?
If you use discard in the fragment shader, then no depth information will be recorded for that fragment. So in your fragment shader, simply add a test to see whether the texture is transparent, and if so discard that fragment.

Pass-through geometry shader for points

I'm having some problems writing a simple pass-through geometry shader for points. I figured it should be something like this:
#version 330
precision highp float;

layout (points) in;
layout (points) out;

void main(void)
{
    gl_Position = gl_in[0].gl_Position;
    EmitVertex();
    EndPrimitive();
}
I have a bunch of points displayed on screen when I don't specify a geometry shader, but when I try to link this shader to my shader program, no points show up and no error is reported.
I'm using C# and OpenTK, but I don't think that is the problem.
Edit: People requested the other shaders; I did test them without the geometry shader, and they worked fine on their own.
Vertex shader:
void main()
{
    gl_FrontColor = gl_Color;
    gl_Position = ftransform();
}
Fragment shader:
void main()
{
    gl_FragColor = gl_Color;
}
I'm not quite sure (I have no real experience with geometry shaders), but don't you have to specify the maximum number of output vertices? In your case it's just one, so try
layout (points, max_vertices=1) out;
Perhaps the shader compiles successfully because you could still specify the number of vertices through the API (at least in the compatibility profile, I think).
EDIT: You use the builtin varying gl_FrontColor (and read gl_Color in the fragment shader), but then in the geometry shader you don't propagate it to the fragment shader (it doesn't get propagated automatically).
This brings us to another problem. You mix new syntax (like gl_in) with old deprecated syntax (like ftransform and the builtin color varyings). Perhaps that's not a good idea and in this case you got a problem, as gl_in has no gl_Color or gl_FrontColor member if I remember correctly. So the best thing would be to use your own color variable as out variable of the vertex and geometry shaders and as in variable of the geometry and fragment shaders (but remember that the in has to be an array in the geometry shader).
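Putting those two points together, here is a sketch of a pass-through geometry shader that declares its vertex limit and forwards a user-defined color varying (the names `vColor`/`gColor` are illustrative, not from the question's code, and the vertex and fragment shaders would have to be rewritten to use them instead of the builtin color varyings):

```glsl
#version 330
layout (points) in;
layout (points, max_vertices = 1) out;

in vec4 vColor[];   // written by the vertex shader (one entry per input vertex)
out vec4 gColor;    // read by the fragment shader

void main(void)
{
    gColor = vColor[0];                  // propagate the color explicitly
    gl_Position = gl_in[0].gl_Position;  // pass the position through unchanged
    EmitVertex();
    EndPrimitive();
}
```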

Why does GLSL lighting code shift the light spot with the camera?

I am trying to make a custom light shader and was trying a lot of different things over time.
Some of the solutions I found work better, others worse. For this question I'm using the solution which worked best so far.
My problem is that if I move the "camera" around, the light position seems to move around, too. This solution has very slight but noticeable movement in it, and the light position seems to be above where it should be.
Default OpenGL lighting (without any shaders) works fine (steady light positions), but I need the shader for multitexturing, and I'm planning on using portions of it for lighting effects once it's working.
Vertex Source:
varying vec3 vlp, vn;

void main(void)
{
    gl_Position = ftransform();
    vn = normalize(gl_NormalMatrix * -gl_Normal);
    vlp = normalize(vec3(gl_LightSource[0].position.xyz) - vec3(gl_ModelViewMatrix * -gl_Vertex));
    gl_TexCoord[0] = gl_MultiTexCoord0;
}
Fragment Source:
uniform sampler2D baseTexture;
uniform sampler2D teamTexture;
uniform vec4 teamColor;
varying vec3 vlp, vn;

void main(void)
{
    vec4 newColor = texture2D(teamTexture, vec2(gl_TexCoord[0]));
    newColor = newColor * teamColor;
    float teamBlend = newColor.a;
    // mixing the textures and colorizing them. this works, I tested it w/o lighting!
    vec4 outColor = mix(texture2D(baseTexture, vec2(gl_TexCoord[0])), newColor, teamBlend);
    // apply lighting
    outColor *= max(dot(vn, vlp), 0.0);
    outColor.a = texture2D(baseTexture, vec2(gl_TexCoord[0])).a;
    gl_FragColor = outColor;
}
What am I doing wrong?
I can't be certain any of these are the problem, but they could cause one.
First, you need to normalize your per-vertex vn and vlp (BTW, try to use more descriptive variable names. viewLightPosition is a lot easier to understand than vlp). I know you normalized them in the vertex shader, but the fragment shader interpolation will denormalize them.
Second, this isn't particularly wrong so much as redundant: vec3(gl_LightSource[0].position.xyz). position.xyz is already a vec3, since the swizzle mask (".xyz") has only 3 components, so you don't need to cast it to a vec3 again.
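As an illustration of the first point, the renormalization would go in the fragment shader, something like this (a sketch based on the question's shader, not a verified fix for the moving-light symptom):

```glsl
varying vec3 vlp, vn;

void main(void)
{
    // Re-normalize after interpolation: the rasterizer interpolates
    // the unit vectors linearly, which generally yields non-unit vectors.
    vec3 n = normalize(vn);
    vec3 l = normalize(vlp);
    float diffuse = max(dot(n, l), 0.0);
    // ... then multiply the mixed texture color by `diffuse` as before ...
}
```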

Using gl_FragCoord to create a hole in a quad

I am learning WebGL, and would like to do the following:
Create a 3D quad with a square hole in it using a fragment shader.
It looks like I need to set gl_FragColor based on gl_FragCoord appropriately.
So should I:
a) Convert gl_FragCoord from window coordinates to model coordinates, do the appropriate geometry check, and set color.
OR
b) Somehow pass the hole information from the vertex shader to the fragment shader. Maybe use a texture coordinate. I am not clear on this part.
I am fuzzy about implementing either of the above, so I'd appreciate some coding hints on either.
My background is that of an OpenGL old timer who has not kept up with the new shading language paradigm, and is now trying to catch up...
Edit (27/03/2011):
I have been able to successfully implement the above based on the tex coord hint. I've written up this example at the link below:
Quads with holes - example
The easiest way would be with texture coords. Simply supply the coords as an extra attribute array, then pass them through to the fragment shader using a varying. The shaders should contain something like:
vertex:
attribute vec3 aPosition;
attribute vec2 aTexCoord;
varying vec2 vTexCoord;

void main(){
    vTexCoord = aTexCoord;
    .......
}
fragment:
varying vec2 vTexCoord;

void main(){
    if (vTexCoord.x > {lower x limit} && vTexCoord.x < {upper x limit} &&
        vTexCoord.y > {lower y limit} && vTexCoord.y < {upper y limit}) {
        discard; //this tells the GPU to discard this fragment entirely
    }
    .....
}
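For concreteness, here is the same fragment shader with the placeholders filled in with example values (0.4 and 0.6 are arbitrary, chosen to cut a hole out of the middle fifth of the quad in each axis; the `uTexture` uniform is likewise illustrative):

```glsl
varying vec2 vTexCoord;
uniform sampler2D uTexture; // illustrative: whatever the quad is textured with

void main() {
    // Discard fragments whose texture coordinates fall inside the
    // square 0.4..0.6 in both axes, leaving a hole in the quad's center.
    if (vTexCoord.x > 0.4 && vTexCoord.x < 0.6 &&
        vTexCoord.y > 0.4 && vTexCoord.y < 0.6) {
        discard;
    }
    gl_FragColor = texture2D(uTexture, vTexCoord);
}
```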