Hey, I currently have a system in OpenGL that uses glBlendFunc for blending different shaders, but I would like to do something like this:
fragColor = currentColor * lightAmount;
I tried to use gl_Color, but it's deprecated and my engine will not let me use it.
According to this document, there is no built-in way to access the current framebuffer color from the fragment shader.
What you could do is render your previous passes into other textures, bind those textures as uniforms in your last pass, and do the blending there.
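As a rough sketch, the last pass's fragment shader could look something like this (uPreviousPass and uLightAmount are placeholder names, not anything your engine defines):
#version 330 core
in vec2 vTexCoord;                 // interpolated texture coordinate from the vertex shader
out vec4 fragColor;
uniform sampler2D uPreviousPass;   // color from the earlier pass, rendered into a texture
uniform float uLightAmount;        // whatever "lightAmount" means in your lighting setup
void main()
{
    vec4 currentColor = texture(uPreviousPass, vTexCoord); // read back the previous result
    fragColor = currentColor * uLightAmount;               // the blend you wanted to write
}
On the application side you would render the earlier passes into a framebuffer object with a texture color attachment, bind that texture, and draw a full-screen quad with this shader.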
Related
I have a C++/OpenGL render engine that uses a library to do the character animation / rendering. Each character uses a mix of textures, no textures, vertex shaders, no vertex shaders etc. I want to be able to add a tint to some of the characters, but not all of them. I think the easiest way to do this is using a fragment shader to apply the tint color to the color of the fragment. I am using Cg, as this is a requirement of the project.
The main body of my rendering engine would be something like:
Enable my tint fragment shader
Call library code to do character rendering
Disable my tint fragment shader
Within the shader the tint is applied by multiplying the fragment color, fragment texture and tint color. This all works fine except when no texture is enabled/bound to GL_TEXTURE_2D. I just get black. I've been able to work around this by using textureSize and checking for texture width greater than 1, but this feels fairly cheesy. Is there a better way to do this?
Also, as I have implemented it, textures are applied as though GL_MODULATE were the texture environment setting. It would be nice to know what the current OpenGL setting is and apply that instead.
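For what it's worth, the workaround described above might look roughly like this in GLSL (the asker's shader is actually Cg, and the names here are made up, but the idea carries over):
#version 330 core
in vec2 vTexCoord;
in vec4 vColor;               // color coming from the vertex stage
out vec4 fragColor;
uniform sampler2D uTexture;   // may or may not have a real texture bound
uniform vec4 uTint;           // the per-character tint color
void main()
{
    vec4 texColor = vec4(1.0);              // fall back to white when no texture is in use
    if (textureSize(uTexture, 0).x > 1)     // the "cheesy" check: tiny size taken to mean nothing useful is bound
        texColor = texture(uTexture, vTexCoord);
    fragColor = vColor * texColor * uTint;  // fragment color * texture * tint
}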
I'm new to OpenGL, and I'm trying to understand vertex and fragment shaders. It seems you can use a vertex shader to make a gradient if you define the color you want each of the vertices to be, but it seems you can also make gradients using a fragment shader if you use the gl_FragCoord variable, for example.
My question is, since you seem to be able to make color gradients using both kinds of shaders, which one is better to use? I'm guessing vertex shaders are faster or something since everyone seems to use them, but I just want to make sure.
... since everyone seems to use them
Using vertex and fragment shaders is mandatory in modern OpenGL for rendering absolutely everything.† So everyone uses both. It is the vertex shader's responsibility to compute the color at the vertices, OpenGL's to interpolate it between them, and the fragment shader's to write the interpolated value to the output color attachment.
† OK, you can also use a compute shader with imageStore, but I'm talking about the rasterization pipeline here.
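As a minimal sketch of that division of labour (the attribute names, the uMVP uniform, and the per-vertex colors here are arbitrary, not anything from the question):
// vertex shader: decide the color at each vertex
#version 330 core
layout(location = 0) in vec3 aPosition;
layout(location = 1) in vec3 aColor;
out vec3 vColor;                      // OpenGL interpolates this across the triangle
uniform mat4 uMVP;
void main()
{
    vColor = aColor;
    gl_Position = uMVP * vec4(aPosition, 1.0);
}
// fragment shader: just write the interpolated value out
#version 330 core
in vec3 vColor;
out vec4 fragColor;
void main()
{
    fragColor = vec4(vColor, 1.0);
}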
In OpenGL (not ES), is there a universal way to blend based on one texture or variable's value while drawing based on another texture? On OpenGL ES, I know that I can do custom blending on some platforms via extensions like GL_EXT_shader_framebuffer_fetch. The reason I ask is that I have a special texture whose fourth channel is not alpha, and I need to be able to blend it using a separate alpha that is available in a different map.
You want dual-source blending, which is available in core as of OpenGL 3.3. This allows you to provide a fragment shader with two outputs and use both of them in the blend function.
You would declare outputs in the fragment shader like this:
layout(location = 0, index = 0) out vec4 outColor;
layout(location = 0, index = 1) out vec4 outAlpha;
You could then set the blending function like this, for premultiplied alpha:
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC1_COLOR);
Or non-premultiplied alpha:
glBlendFunc(GL_SRC1_COLOR, GL_ONE_MINUS_SRC1_COLOR);
Note that SRC1 here refers to the second output of the fragment shader. If I remember correctly, dual-source blending only works for a single draw buffer (location 0).
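For the "fourth channel is not alpha" case in the question, the fragment shader body might look roughly like this (the sampler names are placeholders):
#version 330 core
in vec2 vTexCoord;
layout(location = 0, index = 0) out vec4 outColor;
layout(location = 0, index = 1) out vec4 outAlpha;
uniform sampler2D uColorMap;   // texture whose fourth channel is NOT alpha
uniform sampler2D uAlphaMap;   // separate map that actually holds the alpha/coverage
void main()
{
    outColor = texture(uColorMap, vTexCoord);          // feeds the SRC0 terms of the blend
    outAlpha = vec4(texture(uAlphaMap, vTexCoord).r);  // feeds the SRC1 terms of the blend
}
On the application side the indices can also be assigned with glBindFragDataLocationIndexed before linking, and blending still has to be enabled with glEnable(GL_BLEND) as usual.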
I imported a model from Blender to a C array and displayed it with OpenGL (GLEW) under Qt.
I have an embedded-resource vertex and fragment shader too. I managed to make the vertex shader work and display the model correctly, but now I'd like to give it a gradient-like effect (it's a simple box, but I'm planning to write on it somehow, so I need to make it look decent).
How can I accomplish this? A texture made in Blender? Is there any better way?
For a simple linear gradient you could just add a line in your vertex shader that sets gl_FrontColor to some (vec4) value, e.g. depending on the vertex's Y-coordinate. And in the fragment shader you set gl_FragColor to gl_Color (or multiply it by gl_Color if you're also texturing the object).
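A rough sketch of that suggestion, assuming the compatibility profile (gl_FrontColor, gl_Color and gl_FragColor only exist there); the particular gradient colors and the Y mapping are arbitrary:
// vertex shader: color depends on the vertex's Y-coordinate
void main()
{
    float t = gl_Vertex.y * 0.5 + 0.5;   // assumes Y roughly in [-1, 1]; adjust for your model
    gl_FrontColor = mix(vec4(0.1, 0.1, 0.3, 1.0), vec4(0.9, 0.9, 1.0, 1.0), t);
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
// fragment shader: write out the interpolated color
void main()
{
    gl_FragColor = gl_Color;   // or gl_Color * texture2D(uTexture, gl_TexCoord[0].st) if texturing
}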
In some other versions of GLSL, gl_BackColor seems to provide access to the color behind the fragment currently being rendered. This would be useful for some custom alpha blending, but the GLSL used by WebGL does not seem to support it. On the other hand, reading from gl_FragColor before assigning any value to it seems to give the correct background color, but that only works on my Ubuntu machine. On my MacBook Pro it fails and I seem to get only useless random colors.
So my question is: is there any direct way to gain access to the color behind the fragment currently being rendered? If not, how can I do it?
In some other versions of GLSL, gl_BackColor seems to provide access to the color behind the fragment currently being rendered.
No, this has never been the case. gl_BackColor was the backface color, for doing two-sided lighting. And it was never accessible from the fragment shader; it was a vertex shader variable.
For two-sided lighting, you wrote to both gl_FrontColor and gl_BackColor in the vertex shader. The fragment shader's gl_Color variable is filled in with whichever side's color is facing the camera. So if the back face of the triangle is facing forward, it gets the interpolated gl_BackColor.
What you are asking for has never been available in GLSL.
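For context, a bare-bones two-sided-color setup with those built-ins (compatibility-profile GLSL) looked roughly like this:
// vertex shader: one color per side
void main()
{
    gl_FrontColor = vec4(1.0, 0.0, 0.0, 1.0);   // used when the front face is visible
    gl_BackColor  = vec4(0.0, 0.0, 1.0, 1.0);   // used when the back face is visible
    gl_Position   = gl_ModelViewProjectionMatrix * gl_Vertex;
}
// fragment shader: gl_Color already holds whichever side's color won
void main()
{
    gl_FragColor = gl_Color;
}
For the back color to actually be selected, GL_VERTEX_PROGRAM_TWO_SIDE had to be enabled on the application side.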
There is no direct way, as Nicol Bolas writes.
However, you can use an indirect way, with a render-to-texture approach:
First render the opaque objects (if any) to an offscreen texture instead of the screen.
Render the offscreen texture to the screen.
Render the transparent "custom blending" object to the screen using a shader that does the custom blending (since you are doing the blending manually, GL_BLEND should not be enabled). You should add the offscreen texture as a uniform to the fragment shader, which lets you sample the background color and compute your custom blend; see the sketch after this list.
If you need to render multiple transparent objects, you can use two offscreen textures, ping-pong between them, and finally render the result to the screen once all objects have been rendered.
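A minimal sketch of the fragment shader for step 3, written as WebGL-style GLSL since the question is about WebGL; the uniform names and the particular blend formula are just placeholders:
precision mediump float;
uniform sampler2D uBackground;   // the offscreen texture from the previous pass
uniform vec2 uResolution;        // framebuffer size, to turn gl_FragCoord into a UV
uniform vec4 uObjectColor;       // whatever the transparent object would normally output
void main()
{
    vec2 uv = gl_FragCoord.xy / uResolution;
    vec4 dst = texture2D(uBackground, uv);   // the color "behind" this fragment
    // any custom blend goes here; ordinary "over" blending shown as an example
    gl_FragColor = uObjectColor * uObjectColor.a + dst * (1.0 - uObjectColor.a);
}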