I'm texturing a 2D object in OpenGL, and while I have the texture loading fine, I'm just not quite sure how to invert the colours. How do I access the colour information of the bitmap and invert it? Would this best be done in a shader or in the main program?
I'm leaving this kind of open as I'm not looking for a "fix my code" type of answer, but rather "this is how you access the colour information of the bitmap".
It's as simple as
gl_FragColor = vec4(1.0 - textureColor.r, 1.0 - textureColor.g, 1.0 - textureColor.b, 1.0);
It is best to do this kind of thing in the shader. If you want to do it once and then reuse the result for future draws, just use render to texture. This is the fastest way to do color inversion.
EDIT: You use
vec4 textureColor = texture2D(uSampler, vTexCoords);
before you assign gl_FragColor. The .r, .g, .b and .a components access the red, green, blue and alpha values respectively.
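Putting the pieces together, a minimal complete fragment shader might look like this (a sketch assuming GLSL ES 1.00 and the uSampler/vTexCoords names used above):
precision mediump float;
uniform sampler2D uSampler;
varying vec2 vTexCoords;
void main()
{
    // Sample the texture, then invert each colour channel
    vec4 textureColor = texture2D(uSampler, vTexCoords);
    gl_FragColor = vec4(1.0 - textureColor.r, 1.0 - textureColor.g, 1.0 - textureColor.b, 1.0);
}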
I'm writing a game for vision-impaired players, and for this I wanted to invert the colors (so white becomes black, etc.), which sounds like what you're doing. I eventually hit on an approach that works and does not involve shaders or modifying the textures. However, it does kill blending, so I render to a texture buffer and then do:
glLogicOp (GL_COPY_INVERTED);
glEnable (GL_COLOR_LOGIC_OP);
// bind the texture buffer and blit it
glLogicOp (GL_COPY);
glDisable (GL_COLOR_LOGIC_OP);
If what you want is to invert the image itself (mirror it), rather than its colours, invert the texture coordinates before sampling the texture.
So, in the fragment shader:
vec2 vert_uv_new = vec2(1.0 - vert_uv.x, 1.0 - vert_uv.y);
Now sample with it:
gl_FragColor = texture2D(sampler, vert_uv_new);
This will do the job.
precision mediump float;
uniform sampler2D u_tex;
varying vec2 v_texCoord;
void main()
{
    gl_FragColor = vec4(1,1,1,1) - texture2D(u_tex, v_texCoord);
}
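Note that this subtracts from the alpha channel as well. If you only want to invert the colour and keep the sampled alpha, a small variation (same names as above):
precision mediump float;
uniform sampler2D u_tex;
varying vec2 v_texCoord;
void main()
{
    vec4 color = texture2D(u_tex, v_texCoord);
    // Invert RGB but keep the original alpha
    gl_FragColor = vec4(1.0 - color.rgb, color.a);
}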
I've always written my shaders in GLSL 3 (with the #version 330 line), but that's starting to get pretty old, so I recently tried to write a shader in GLSL 4 and use it with the SFML library for rendering, instead of pure OpenGL.
For now, my goal is a basic shader for a 2D game, which takes the color of each pixel of a texture and modifies it. I always did that with gl_TexCoord[0].xy, but that seems to be deprecated now, so I searched and read that I must use in and out variables with a vertex shader, so I tried.
Fragment shader
#version 400
in vec2 fragCoord;
out vec4 fragColor;
uniform sampler2D image;
void main(){
    // Get the color
    vec4 color = texture( image, fragCoord );
    /*
     * Do things with the color
     */
    // Return the color
    fragColor = color;
}
Vertex shader
#version 400
in vec3 position;
in vec2 textureCoord;
out vec2 fragCoord;
void main(){
    // Set the position of the pixel and vertex (I guess)
    fragCoord = textureCoord;
    gl_Position = vec4( position, 1.0 );
}
I've also seen that we could add the projection, model, and view matrices, but I don't know how to do that with SFML (I don't even think we can), and I don't want to learn complex OpenGL or SFML internals just to change some colors in a 2D game, so here is my question:
Is there an easy way to just get the coordinates of the pixel we're working on? Maybe get rid of the vertex shader, or use it without using matrices?
Unless you really want to learn a lot of nasty OpenGL, writing your own shaders just for textures is a little overkill. SFML can handle textures and shaders for you behind the scenes (here is a good article on how to use them), so you don't need to worry about shaders at all. Also note that you can change the color of SFML sprites (which is, I believe, what you are trying to do) with sprite.setColor(sf::Color(...));. Plus, there's no problem in using version 330. That's what I usually use, albeit with the in and out stuff as well.
If you really want to use your own shaders for fancy effects like pixellation, blurring, etc., I can't help you much, since I've only ever worked with pure OpenGL, so I don't know how the vertex information is handled by SFML, but this is some interesting example code you can check out, here is a tutorial, and here is a reference.
To answer your question more directly: gl_FragCoord is a built-in GLSL variable that holds the fragment's window position, but you still have to set gl_Position in the vertex shader. You can't get rid of the vertex shader if you are doing anything OpenGL related. You'd have to do fancy matrix stuff (this is a wonderful library) and probably buffer stuff (like this) to tell GLSL yourself where everything is.
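For instance, here is a minimal fragment-shader sketch that derives the texture coordinate from gl_FragCoord alone, with no texture-coordinate input from the vertex shader (the resolution uniform is an assumption; you would set it from the host program to the render-target size in pixels):
#version 330
uniform sampler2D image;
uniform vec2 resolution; // assumed uniform: render-target size in pixels
out vec4 fragColor;
void main(){
    // gl_FragCoord is in window pixels; normalise it to [0,1] before sampling
    vec2 uv = gl_FragCoord.xy / resolution;
    fragColor = texture( image, uv );
}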
I've seen shader code using these two, but I don't understand the difference between them, between texture coordinates and fragment coordinates.
As far as I know, a fragment is a pixel, so what is a texture coordinate?
Some use these code:
vec2 uv = gl_FragCoord.xy / rectSize.xy;
vec4 bkg_color = texture2D(CC_Texture0, uv);
some use:
vec4 bkg_color = texture2D(CC_Texture0, v_texCoord);
with v_texCoord = a_texCoord;
Both work, except that the first way displays an inverted (upside-down) image.
In your second example, v_texCoord is a pre-calculated texture coordinate that is passed into the fragment shader as an interpolated varying (originating from the vertex attribute a_texCoord), versus the uv coordinate calculated within the fragment shader in the first example.
You can base texture coordinates on whatever you like, so long as you give the texture2D sampler normalised coordinates; it's all about your use case and what you want to display from a texture.
Perhaps there is such a use-case difference here, which is why they give different visual outputs.
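As for the upside-down image: gl_FragCoord has its origin at the bottom-left of the window, which is often flipped relative to how the texture data was uploaded. A minimal sketch of the usual fix, using the same names as the question:
vec2 uv = gl_FragCoord.xy / rectSize.xy;
uv.y = 1.0 - uv.y; // flip vertically so the sampled image appears upright
vec4 bkg_color = texture2D(CC_Texture0, uv);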
For more information about how texture coordinates work I recommend this question's answer: How do opengl texture coordinates work?
I'm trying to layer one texture over another, but I'm having alpha blending issues around the edges. I've tried many blending combinations with no luck. Where am I going wrong?
Current state of framebuffer (opaque):
Transparent texture rendered in off-screen framebuffer:
Result when I try to blend the two. Notice the edges on the circle:
Here's the blendFunc:
_gl.blendFuncSeparate( _gl.SRC_ALPHA, _gl.ONE_MINUS_SRC_ALPHA, _gl.ONE, _gl.ONE_MINUS_SRC_ALPHA );
Here's the shader. Just basic rendering of a texture:
uniform sampler2D texture;
varying vec2 vUv;
void main() {
    vec4 tColor = texture2D(texture, vUv);
    gl_FragColor = tColor;
}
Most likely your textures are using premultiplied alpha and so your blend function should be
_gl.blendFunc(_gl.ONE, _gl.ONE_MINUS_SRC_ALPHA);
If your textures are not premultiplied you probably want to premultiply them either in your shader
gl_FragColor.rgb *= gl_FragColor.a;
or when you load them (before you call gl.texImage2D) you can tell the browser to premultiply them
_gl.pixelStorei(_gl.UNPACK_PREMULTIPLY_ALPHA_WEBGL, true);
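For the shader route, the fragment shader from the question would become something like this sketch (same uniform and varying names):
uniform sampler2D texture;
varying vec2 vUv;
void main() {
    vec4 tColor = texture2D(texture, vUv);
    tColor.rgb *= tColor.a; // premultiply so blendFunc(ONE, ONE_MINUS_SRC_ALPHA) composites correctly
    gl_FragColor = tColor;
}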
This document probably explains the issues better
and you might find this relevant as well
WebGL: How to correctly blend alpha channel png
What I'm trying to accomplish: Drawing the depth map of my scene on top of my scene (so that objects closer are darker, and further away are lighter)
Problem: I don't seem to understand how to pass the right texture coordinates from my vertex shader to my fragment shader.
So I created my FBO, and the texture that the depth map gets drawn to... not that I'm entirely sure what I was doing, but whatever, it works. I tested drawing the texture using the fixed functionality pipeline, and it looks just like it's supposed to (the depth map that is).
But trying to use it in my shaders just isn't working...
Here's the part from my render method that binds the texture:
glActiveTexture(GL_TEXTURE7);
glBindTexture(GL_TEXTURE_2D, depthTextureId);
glUniform1i(depthMapUniform, 7);
glUseProgram(shaderProgram);
look(); //updates my viewing matrix
box.render(); //renders box VBO
So... I think that's sort of right? Maybe? No clue why texture 7, that was just something that was in a tutorial I was checking...
And here's the important stuff from my vertex shader:
out vec4 ShadowCoord;
void main() {
    gl_Position = PMatrix * (VMatrix * MMatrix) * gl_Vertex; // projection, view and model matrices
    ShadowCoord = gl_MultiTexCoord0; // something I kept seeing in examples, was hoping it would work
}
Aaand, fragment shader:
in vec4 ShadowCoord;
in vec3 Color; //passed from vertex shader, didn't include the code for it though. Just the vertex color.
out vec4 FragColor;
uniform sampler2D ShadowMap;
void main() {
    FragColor = vec4(texture2D(ShadowMap, ShadowCoord.st).x * Color, 1.0);
}
Now the problem is that the coordinate the fragment shader receives for the texture is always (0, 0), the bottom-left corner. I tried changing it to ShadowCoord = gl_MultiTexCoord7, because I figured maybe it had something to do with me putting the texture in slot number 7... but alas, the problem persisted. When the color at (0, 0) changes, so does the color of the entire scene, rather than only the appropriate pixel/fragment changing.
And that's what I'm hoping to get some insight on... how to pass the correct coordinates (I'd like for the corners of the texture to be the same coordinates as the corners of my screen). And yes, this is a beginners question... but I have been looking in the Orange Book, and the problem with it is that it's great on the GLSL side of things, but the OpenGL side of things is severely lacking in the examples that I could really use...
The input variable gl_MultiTexCoord0 (or 7) is the built-in per-vertex texture coordinate for the 0th (or 7th) texture unit, set by gl(Multi)TexCoord (when using immediate mode) or by glTexCoordPointer (when using arrays/VBOs).
But as your depth buffer is already in screen space, what you want is not a usual texture laid onto the object, but just the value in the texture for a specific pixel/fragment. So the vertex shader isn't involved in any way. Instead you just use the current fragment's screen space position as texture coordinate, that can be read in the fragment shader using gl_FragCoord. But keep in mind that this coordinate is in [0,w]x[0,h] and textures are accessed by normalized texture coordinates in [0,1]. So you have to divide the fragment's coordinate by the screen size:
uniform vec2 screenSize;
...
... texture2D(ShadowMap, gl_FragCoord.st/screenSize) ...
But you actually don't need two passes for this effect anyway, as you can just use the fragment's depth directly, without writing it into a texture. Instead of
texture2D(ShadowMap, gl_FragCoord.st/screenSize).x
you can just use
gl_FragCoord.z
which is nothing other than the fragment's depth value that would have been written into the texture in the first pass. This way you completely avoid the first depth-writing pass and the texture access in the second pass.
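Putting the single-pass idea together, a fragment-shader sketch (Color is the vertex-colour varying from the question; note that gl_FragCoord.z is non-linear in eye-space distance):
in vec3 Color;
out vec4 FragColor;
void main() {
    // Scale the vertex colour by depth: near fragments (z near 0) come out dark, far ones light
    FragColor = vec4(gl_FragCoord.z * Color, 1.0);
}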
I would like to increase the brightness on a texture used in OpenGL rendering. Such as making it bright red or white. This is a 2D rendering environment, where every sprite is mapped as a texture to an OpenGL polygon.
I know little to nothing about manipulating pixel data, and my engine works with a texture cache, so altering the whole surface would affect everything using that texture.
I can simulate the effect by overlaying a "mask", allowing me to render the sprite in solid colors, but that costs extra memory.
Is there any other solution to this?
If your requirements allow it, you can always write a very simple GLSL fragment shader which does this. It's literally a one-liner.
Something like:
uniform sampler2D tex;
void main()
{
    // Add the current colour (gl_Color) to the texel to brighten it
    gl_FragColor = texture2D(tex, gl_TexCoord[0].st) + gl_Color;
}
Perhaps GL_ADD instead of GL_MODULATE?
Use GL_MODULATE to multiply the texture color by the current color.
See the texture tutorial on this page.