I am currently trying to achieve a Tron-like look for a game model in the BGE (Blender Game Engine). I believe the only way to do this is with a GLSL shader, and I am extremely interested in learning how to use this powerful tool. I have only found one tutorial website, http://en.wikibooks.org/wiki/GLSL_Programming/Blender . No matter how much I look through it, I can't seem to find out how to do what I need for the Tron look. The way I would like to do it is with a secondary texture that is black where the material should render normally and white where it should be drawn as a "light", meaning shadeless with a bit of bloom.
This is the style:
http://imgur.com/a/6vOwN
The questions I cannot seem to find answers to are:
How would I read the matching pixel from the secondary texture, so I can tell whether that part of the material should be white (light) or black (opaque, i.e. rendered normally)?
How do I get the color from the material so that I can output it as either shadeless or normally shaded?
How do I add bloom?
My current code (which does nothing yet) looks like this:
(This is just to show the structure that Blender uses.)
import bge

cont = bge.logic.getCurrentController()

VertexShader = """
varying vec4 color;

void main()
{
    color = gl_MultiTexCoord0; // pass the first set of texture coordinates on to the fragment shader
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
"""

FragmentShader = """
varying vec4 color;

void main()
{
    gl_FragColor = color;
}
"""

# attach the custom shader to every material of the object's first mesh
mesh = cont.owner.meshes[0]
for mat in mesh.materials:
    shader = mat.getShader()
    if shader is not None:
        if not shader.isValid():
            shader.setSource(VertexShader, FragmentShader, 1)
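To make the mask idea concrete, here is a minimal sketch of the kind of fragment shader it implies. It reuses the color varying above as the texture coordinate and assumes the vertex shader also writes an eye-space normal into a varying (called normalEye here); the sampler names colorMap and maskMap are placeholders, and the samplers would still have to be bound from Python (e.g. with shader.setSampler). Bloom itself is a screen-space effect, so it would normally be done as a separate 2D filter pass rather than inside the material shader.

uniform sampler2D colorMap;   // regular color texture
uniform sampler2D maskMap;    // black = shade normally, white = draw as a "light"

varying vec4 color;           // texture coordinates passed from the vertex shader
varying vec3 normalEye;       // eye-space normal (the vertex shader would need to write this)

void main()
{
    vec4 base = texture2D(colorMap, color.st);
    float mask = texture2D(maskMap, color.st).r;

    // very rough diffuse term, just for illustration
    vec3 lightDir = normalize(vec3(gl_LightSource[0].position));
    float diffuse = max(dot(normalize(normalEye), lightDir), 0.0);

    vec4 shaded = vec4(base.rgb * diffuse, base.a);
    vec4 shadeless = base;    // full brightness where the mask is white

    // mask = 0 -> normal shading, mask = 1 -> shadeless "light"
    gl_FragColor = mix(shaded, shadeless, mask);
}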
I have been looking for a solution to my problem for a long time. I am learning to write shaders, and I want to color only a part of the UV map of the 3D model I load. Could someone tell me how to do it? I tried texture2D, but unfortunately to no avail; it is possible that I used the function incorrectly. Below is the fragment of my shader that is responsible for the color change.
shader.fragmentShader = shader.fragmentShader.replace(
`#include <dithering_fragment>`,
`#include <dithering_fragment>
vec3 base = shaderColor;
vec3 c = mix(gl_FragColor.rgb, base, 1.0);
gl_FragColor = vec4(c, diffuseColor.a);
`
);
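For what it's worth, here is a sketch of how texture2D is typically used for this kind of masking, assuming a three.js version where the vUv varying is available (i.e. the material has UVs and USE_UV is defined). uMask is a made-up mask texture uniform, white where the color should be replaced, and it would also have to be registered on shader.uniforms in the same callback.

// declared at the top of the fragment shader (injected alongside the other uniforms):
uniform sampler2D uMask;   // hypothetical mask texture

// injected after <dithering_fragment>, in place of the lines above:
vec3 base = shaderColor;
float maskValue = texture2D(uMask, vUv).r;        // 1.0 where the mask is white
vec3 c = mix(gl_FragColor.rgb, base, maskValue);  // 0 = keep original color, 1 = use shaderColor
gl_FragColor = vec4(c, diffuseColor.a);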
I am working on an OpenGL 2.1 application and am trying to shift code from the fixed to the programmable pipeline piece by piece (i.e., replacing deprecated function calls such as glVertex*, glColor*, etc. with buffers/shaders).
The problem is that the color I set for each fragment inside the fragment shader is not being used; instead, every fragment is painted with the color specified in the last glColor3f call (which is somewhere else in the codebase, completely unrelated to the piece of code at hand).
My vert/frag shaders are pretty straightforward:
#version 120

attribute vec3 position;
attribute vec3 color;

varying vec3 Color;

void main()
{
    mat4 scaleMat = mat4(0.1,0,0,0, 0,0.1,0,0, 0,0,0.1,0, 0,0,0,1);
    Color = color;
    //gl_Position = scaleMat * vec4(position, 1.0); // scaling has no effect!
    gl_Position = vec4(position, 1.0);
}
and
#version 120

varying vec3 Color;

void main()
{
    //gl_FragColor = vec4(0.3,0.3,1.0,1.0); //Const color does not work.
    gl_FragColor = vec4(Color, 1.0); //taking interpolated color doesn't work either!
}
One more thing: you may notice I applied a scaling matrix in the vertex shader to make the geometry 10x smaller, but it does not have any effect! That makes me wonder whether the shader is even 'doing its thing' or not, because when I don't write the shader correctly I do get compile errors (duh).
I'm using Qt; below is the code used for setting up the buffers and the shader program.
void myObject::init()
{
    // Shader-related code. 'shader' is a QGLShaderProgram object
    shader.addShaderFromSourceFile(QGLShader::Vertex,
                                   ":/shaders/mainshader.vert");
    shader.addShaderFromSourceFile(QGLShader::Fragment,
                                   ":/shaders/mainshader.frag");
    shader.link();
    shader.bind();

    vertexLocation = shader.attributeLocation("position");
    colorLocation = shader.attributeLocation("color");
    .......

    // bathBuffer and colorBuffer are QGLBuffer objects,
    // triList is a vector<vec3> containing triangle data
    bathBuffer.create();
    bathBuffer.bind();
    bathBuffer.setUsagePattern(QGLBuffer::StaticDraw);
    bathBuffer.allocate(sizeof(triList));
    bathBuffer.write(0, triList.data(), sizeof(triList));

    for(int i = 0; i < triList.size(); ++i)
    {
        colorList.push_back(Vec(1,1,0));
    }

    colorBuffer.create();
    colorBuffer.bind();
    colorBuffer.setUsagePattern(QGLBuffer::StaticDraw);
    colorBuffer.allocate(sizeof(colorList));
    colorBuffer.write(0, colorList.data(), sizeof(colorList));
    // copies both color and vert data
}
and here's the draw() function:
void myObject::draw()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    shader.bind();

    bathBuffer.bind();
    shader.enableAttributeArray(vertexLocation);
    shader.setAttributeArray(vertexLocation, GL_DOUBLE, triList.data(), 3, 0);

    colorBuffer.bind();
    shader.enableAttributeArray(colorLocation);
    shader.setAttributeArray(colorLocation, GL_DOUBLE, colorList.data(), 3, 0);

    glDrawArrays(GL_TRIANGLES, 0, triList.size());

    shader.disableAttributeArray(vertexLocation);
    shader.disableAttributeArray(colorLocation);
    //shader.release();
}
I even tried using a GL profiler to figure out what might be going wrong beneath all the abstraction. In the trace, glCreateProgram() returns 1, but the subsequent glUseProgram call fails with an error saying the program handle does not refer to an object generated by OpenGL, which makes no sense to me.
Then I thought it might be some Qt quirk where a buggy context is created from the start, so I added this code to my main.cpp as well, just to make sure I get a correct context:
QGLFormat f;
f.setOption(QGL::DeprecatedFunctions| QGL::DepthBuffer | QGL::DoubleBuffer | QGL::Rgba | QGL::AlphaChannel);
f.setVersion(2,1);
QGLContext ogl_context(f);
ogl_context.create();
ogl_context.makeCurrent();
....to no avail.
tldr;
I'm not sure whether the shaders are working or not.
Is there a way to ignore/'switch off' glColor calls so that I can color the geometry from within the shaders themselves?
I have run out of ideas. I can provide more info if needed. Any help on this would be appreciated.
Thanks.
EDIT: I have added the whole class relevant to this question: bathymetry.h, bathymetry.cpp. The rest of the code is a custom widget class (not doing any OpenGL stuff) that handles the Qt part, plus a main.cpp.
I am using GLSL to render a basic cube (made from GL_QUADS surfaces). I would like to pass the gl_Vertex content from the vertex shader into the fragment shader. Everything works if I use gl_FrontColor (vertex shader) and gl_Color (fragment shader) for this, but it doesn't work when I use a plain varying (see code below). It appears the varying is not interpolated across the surface for some reason. Any idea what could cause this in OpenGL?
glShadeModel is set to GL_SMOOTH - I can't think of anything else that could cause this effect right now.
Vertex Shader:
#version 120

varying vec4 frontSideValue;

void main() {
    frontSideValue = gl_Vertex;
    gl_Position = transformPos;
}
Fragment Shader:
#version 120

varying vec4 frontSideValue;

void main() {
    gl_FragColor = frontSideValue;
}
The result looks exactly as if you were not using values in the range [0,1] for the color vector. You are basically using the untransformed vertex position, which may well lie outside this range. Your cube seems to be centered around the origin, so the small transition region where the values actually are in [0,1] shows up as that unsharp band.
With the built-in gl_FrontColor, the value seems to get clamped before the interpolation.
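If the goal is just to visualize the position, one way around this is to remap (or clamp) the value into [0,1] in the vertex shader before writing the varying. A sketch, assuming the cube spans roughly [-1,1] in object space; gl_ModelViewProjectionMatrix * gl_Vertex stands in for whatever transform the original transformPos holds:

#version 120
varying vec4 frontSideValue;

void main() {
    // remap the object-space position from [-1,1] to [0,1] so the
    // interpolated varying is already a valid color
    frontSideValue = gl_Vertex * 0.5 + vec4(0.5);
    // alternatively, mimic gl_FrontColor: frontSideValue = clamp(gl_Vertex, 0.0, 1.0);
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}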
VC++ 2010, OpenGL, GLSL, SDL
I am moving over to shaders and have run into a problem that originally occurred while working with the fixed-function OpenGL pipeline: the position of the light seems to point in whatever direction my camera faces. In the fixed-function pipeline it was just the specular highlight, which was fixable with:
glLightModelf(GL_LIGHT_MODEL_LOCAL_VIEWER, 1.0f);
Here are the two shaders:
Vertex
varying vec3 lightDir, normal;

void main()
{
    normal = normalize(gl_NormalMatrix * gl_Normal);
    lightDir = normalize(vec3(gl_LightSource[0].position));

    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_Position = ftransform();
}
Fragment
varying vec3 lightDir, normal;
uniform sampler2D tex;

void main()
{
    vec3 ct, cf;
    vec4 texel;
    float intensity, at, af;

    intensity = max(dot(lightDir, normalize(normal)), 0.0);

    cf = intensity * gl_FrontMaterial.diffuse.rgb +
         gl_FrontMaterial.ambient.rgb;
    af = gl_FrontMaterial.diffuse.a;

    texel = texture2D(tex, gl_TexCoord[0].st);
    ct = texel.rgb;
    at = texel.a;

    gl_FragColor = vec4(ct * cf, at * af);
}
Any help would be much appreciated!
The question is: What coordinate system (reference frame) do you want the lights to be in? Probably "the world".
OpenGL's fixed-function pipeline, however, has no notion of world coordinates, because it uses a modelview matrix, which transforms directly from model coordinates to eye (camera) coordinates. In order to have “fixed” lights, you could do one of these:
The classic OpenGL approach is, every frame, to first set the modelview matrix to the view transform only (that is, to the coordinate system you want to specify your light positions in) and then use glLight to set the position (glLight is specified to apply the current modelview matrix to the input).
Since you are using shaders, you could also have separate model and view matrices and have your shader apply both to the vertices (rather than using ftransform), but only the view matrix to the lights. However, this means more per-vertex matrix operations and is probably not an especially good idea unless you are after clarity rather than performance (a sketch follows below).
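Here is such a sketch, keeping the lightDir and normal varyings so the fragment shader above can stay unchanged. The uModel, uView and uLightPosWorld uniforms are hypothetical names the application would have to supply, with the light position given in world coordinates:

#version 120
uniform mat4 uModel;          // model -> world (set by the application)
uniform mat4 uView;           // world -> eye   (set by the application)
uniform vec3 uLightPosWorld;  // light position in world coordinates

varying vec3 lightDir, normal;

void main()
{
    vec4 eyePos = uView * uModel * gl_Vertex;
    gl_Position = gl_ProjectionMatrix * eyePos;

    // the upper 3x3 of view * model is good enough as a normal matrix
    // as long as there is no non-uniform scaling
    normal = normalize(mat3(uView * uModel) * gl_Normal);

    // the light position is transformed by the view matrix only,
    // so the light stays fixed in the world wherever the camera looks
    vec3 lightEye = (uView * vec4(uLightPosWorld, 1.0)).xyz;
    lightDir = normalize(lightEye - eyePos.xyz);

    gl_TexCoord[0] = gl_MultiTexCoord0;
}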
I am learning WebGL, and would like to do the following:
Create a 3D quad with a square hole in it using a fragment shader.
It looks like I need to set gl_FragColor based on gl_FragCoord appropriately.
So should I:
a) Convert gl_FragCoord from window coordinates to model coordinates, do the appropriate geometry check, and set color.
OR
b) Somehow pass the hole information from the vertex shader to the fragment shader. Maybe use a texture coordinate. I am not clear on this part.
I am fuzzy about implementing either of the above, so I'd appreciate some coding hints on either.
My background is that of an OpenGL old-timer who has not kept up with the new shading-language paradigm and is now trying to catch up...
Edit (27/03/2011):
I have been able to successfully implement the above based on the tex coord hint. I've written up this example at the link below:
Quads with holes - example
The easiest way would be with texture coords. Simply supply the coords as an extra attribute array and then pass them through to the fragment shader using a varying. The shaders should contain something like:
vertex:
attribute vec3 aPosition;
attribute vec2 aTexCoord;

varying vec2 vTexCoord;

void main(){
    vTexCoord = aTexCoord;
    .......
}
fragment:
varying vec2 vTexCoord;

void main(){
    if(vTexCoord.x > {lower x limit} && vTexCoord.x < {upper x limit} &&
       vTexCoord.y > {lower y limit} && vTexCoord.y < {upper y limit}){
        discard; // this tells the GPU to discard the fragment entirely
    }
    .....
}
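For concreteness, here is a filled-in version of that fragment shader, assuming the hole should cover the middle half of the quad (texture coordinates 0.25 to 0.75 on both axes) and the solid part is drawn in an arbitrary flat color:

precision mediump float;   // WebGL fragment shaders need a default precision

varying vec2 vTexCoord;

void main(){
    // punch a square hole into the middle half of the quad
    if(vTexCoord.x > 0.25 && vTexCoord.x < 0.75 &&
       vTexCoord.y > 0.25 && vTexCoord.y < 0.75){
        discard;
    }
    gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0);   // flat color for the rest of the quad
}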