Masking sprite from texture atlas - cocos2d-iphone

I need to mask a sprite. I followed this tutorial: http://www.raywenderlich.com/4428/how-to-mask-a-sprite-with-cocos2d-2-0 , but the problem is that I create the sprite not from a single PNG file, but from a sprite sheet, with the "initWithSpriteFrameName" method.
The mask is being applied to the big sprite sheet's texture instead of the small sprite's texture.
Any clues how I can fix this?
Cheers,
Marcin

The problem here is that the same texture coordinates are being used for your sprite and your mask. You need to send through two more UV coordinates per vertex, ones that fit your mask to the sprite's region of the atlas. Create another varying, v_maskTexCoord, for these mask coordinates, and then where you do this:
vec4 texColor = texture2D(u_texture, v_texCoord);
vec4 maskColor = texture2D(u_mask, v_texCoord);
Change it to
vec4 texColor = texture2D(u_texture, v_texCoord);
vec4 maskColor = texture2D(u_mask, v_maskTexCoord);
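For reference, here is a minimal sketch of the vertex-shader side, assuming you add a second texture-coordinate attribute. The attribute name a_maskTexCoord is made up; a_position, a_texCoord and u_MVPMatrix follow cocos2d 2.0's stock shader names. The mask coordinates have to be filled in on the CPU from the sprite frame's rect within the atlas:
// Vertex shader sketch (a_maskTexCoord is a hypothetical second attribute).
attribute vec4 a_position;
attribute vec2 a_texCoord;      // atlas coords of the sprite frame
attribute vec2 a_maskTexCoord;  // 0..1 coords across the standalone mask
uniform mat4 u_MVPMatrix;
varying vec2 v_texCoord;
varying vec2 v_maskTexCoord;
void main()
{
    gl_Position = u_MVPMatrix * a_position;
    v_texCoord = a_texCoord;         // unchanged: samples the atlas
    v_maskTexCoord = a_maskTexCoord; // new: samples the full mask
}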

Related

Render a texture (decal) with an alpha value?

I tried to draw two textures, a decal and a background, but the alpha part of the decal just becomes white.
I simply tried the following:
Draw the 2 textures (background & decal)
Add glBlendFunc to apply the decal's alpha value
#version 330 core
in vec2 UV;
out vec3 color;
uniform sampler2D background;
in vec3 decalCoord;
uniform sampler2D decal;
void main(){
vec3 BGTex = texture( background, UV ).rgb;
vec3 DecalTex = texture(decal, decalCoord.xy).rgba;
color =
vec4(BGTex,1.0) + // Background texture is DDS DXT1 (I wonder if DXT1 is the cause?)
vec4(DecalTex,0.0); // Decal texture is DDS DXT3 for alpha
}
// Set below...
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBlendEquation(GL_FUNC_ADD);
I was able to draw, but there is the following problem:
- The alpha part of the decal cannot be made transparent.
Is this a problem that can be solved within the fragment shader? If so, how should I rewrite the code?
The textures are DDS: one is DXT1 (the background, because that texture doesn't need alpha) and the other is DXT3 (the decal, for the alpha). Is this also a cause? (Do both need to be DXT3?)
Also, should I look for another way to apply the decals?
DecalTex should be a vec4, not a vec3; otherwise the alpha value will not be stored. You will also have to change the line at the end to
color = vec4(BGTex, 1.0) + DecalTex * DecalTex.a;
(with color declared as a vec4), as the current code sets the decal's alpha contribution to 0.
The DecalTex has an alpha channel, which acts as a "weight" indicating the intensity of the DecalTex: if the alpha channel is 1 the color of DecalTex has to be used, and if the alpha channel is 0 the color of BGTex has to be used.
Use mix to blend the color of BGTex and DecalTex, dependent on the alpha channel of DecalTex. Of course the type of DecalTex has to be vec4:
vec3 BGTex = texture( background, UV ).rgb;
vec4 DecalTex = texture(decal, decalCoord.xy).rgba;
color = vec4(mix(BGTex.rgb, DecalTex.rgb, DecalTex.a), 1.0);
Note, mix linearly interpolates between the 2 values:
mix(x, y, a) = x * (1 − a) + y * a
This is similar to the operation performed by the blending function and equation:
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBlendEquation(GL_FUNC_ADD);
But Blending is applied to the fragment shader output and the current value in the framebuffer.
You don't need any blending at all, because the textures are "blended" in the fragment shader and the result is put in the framebuffer. Since the alpha channel of the fragment shader output is >= 1.0, the output is completely "opaque".
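Putting both answers together, a corrected version of the question's fragment shader could look like this (a sketch; note the out variable is widened to vec4 so the assignment type-checks and alpha reaches the framebuffer):
#version 330 core
in vec2 UV;
in vec3 decalCoord;
out vec4 color; // was vec3; vec4 so the assignment below type-checks
uniform sampler2D background;
uniform sampler2D decal;
void main(){
    vec3 BGTex = texture(background, UV).rgb;
    vec4 DecalTex = texture(decal, decalCoord.xy); // keep the alpha channel
    // Blend decal over background by the decal's own alpha; output is opaque.
    color = vec4(mix(BGTex, DecalTex.rgb, DecalTex.a), 1.0);
}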

How to stop a Shader from distorting a texture

I am trying to learn how to use shaders and GLSL. One of the shaders works but distorts the texture of the sprite it's applied to. I'm doing this all in SFML.
Distorted texture on left, actual texture on right:
The problem comes from one line. When I started, the texture was being rendered upside down, but subtracting the y component of the coordinates from 1 fixed that issue. The line that is causing the distortion is
vec2 texCoord = (gl_FragCoord.xy / sourceSize.xy);
where sourceSize is a uniform passing in the resolution of something as a vec2. I've been passing various values into this and getting different distorted versions of the texture. I was wondering if there is a ratio or something I could pass in to avoid this distortion.
Texture Size in Pixels: 512x512
Passed in values for the above image: 512x512
Shader
uniform sampler2D source;
uniform vec2 sourceSize;
uniform float time;
void main( void )
{
vec2 texCoord = (gl_FragCoord.xy / sourceSize.xy); //Gets the pixel position in a range of 0.0 to 1
texCoord = vec2(texCoord.x, 1.0 - texCoord.y); //Inverts the y coordinate
vec4 Color = texture2D(source, texCoord); //Gets the current pixel colour
gl_FragColor = Color; //Output
}
Found a solution; posting it here in case others need the help.
Changing
vec4 Color = texture2D(source, texCoord); //Gets the current pixel colour
to
vec4 Color = texture2D(source, gl_TexCoord[0].xy); //Gets the current pixel colour
will fix the distortion effect.
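This works because SFML supplies the sprite's interpolated texture coordinates through the fixed-function varying gl_TexCoord[0], so there is no need to reconstruct them from gl_FragCoord and sourceSize. For reference, a pass-through vertex shader in the style of SFML's built-in default looks roughly like this (a sketch, compatibility-profile GLSL):
void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    // This is what gl_TexCoord[0].xy reads back in the fragment shader.
    gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;
    gl_FrontColor = gl_Color;
}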

Add radial gradient texture to each white part of another texture in shader

Recently I read an article about a sun shader (XNA Sun Shader) and decided to implement it using OpenGL ES 2.0, but I ran into a problem with the shader:
I have two textures, one of them is fire gradient texture:
And another one is a texture, each white part of which must be colored with the first texture:
So I'm going to have a result like the one below (don't mind that the result texture is rendered on a sphere mesh):
I really hope that somebody knows how to implement this shader.
You can first sample the original texture; if the color is white, then sample the gradient texture.
uniform sampler2D Texture0; // original texture
uniform sampler2D Texture1; // gradient texture
varying vec2 texCoord;
void main(void)
{
gl_FragColor = texture2D( Texture0, texCoord );
// If the color in original texture is white
// use the color in gradient texture.
if (gl_FragColor == vec4(1.0, 1.0, 1.0,1.0)) {
gl_FragColor = texture2D( Texture1, texCoord );
}
}
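One caveat worth adding (not from the original answer): an exact equality test against pure white is fragile once the texture is filtered, mipmapped, or compressed, because sampled values come back slightly below 1.0. A more tolerant variant thresholds the color instead (0.95 is an arbitrary cutoff):
uniform sampler2D Texture0; // original texture
uniform sampler2D Texture1; // gradient texture
varying vec2 texCoord;
void main(void)
{
    vec4 base = texture2D(Texture0, texCoord);
    // Treat anything close to white as "white".
    if (all(greaterThan(base.rgb, vec3(0.95)))) {
        base = texture2D(Texture1, texCoord);
    }
    gl_FragColor = base;
}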

deriving screen-space coordinates in glsl shader

I'm trying to write a simple application for baking a texture from a paint buffer. Right now I have a mesh, a mesh texture, and a paint texture. When I render the mesh, the mesh shader looks up the mesh texture and then, based on the screen position of the fragment, looks up the paint texture value. I then composite the paint lookup with the mesh lookup.
Here's a screenshot with nothing in the paint buffer and just the mesh texture.
Here's a screenshot with something in the paint buffer composited over the mesh texture.
So that all works great, but I'd like to bake the paint texture into my mesh texture. Right now I send the mesh's UVs down as the position, with an ortho projection set to (0,1)x(0,1), so I'm actually doing everything in texture space; the mesh texture lookup also uses that position. The problem I'm having, though, is computing the screen-space position of the fragment under the original projection, to figure out where to sample the paint texture. I'm passing the bake shader my original camera projection matrices and the object transform, and sending the fragment shader the device-normalized position of the fragment (again, from my original camera projection) to do the lookup, but it's coming out funny.
Here's what the bake texture generates if I render half the output using the paint texture and the screen position I've derived.
I would expect that block line to be right down the middle.
Am I calculating the screen position incorrectly in my vertex shader? Or am I going about this in a fundamentally wrong way?
// vertex shader
uniform mat4 orthoPV;
uniform mat4 cameraPV;
uniform mat4 objToWorld;
varying vec2 uv;
varying vec2 screenPos;
void main() {
uv = gl_Vertex.xy;
screenPos = 0.5 * (vec2(1,1) + (cameraPV * objToWorld * vec4(gl_MultiTexCoord0.xyz,1)).xy); // clip-space xy biased to 0..1 (note: no divide by w)
screenPos = gl_MultiTexCoord0.xy; // overwrites the line above (debugging leftover?)
gl_Position = orthoPV * gl_Vertex;
gl_FrontColor = vec4(1,0,0,1);
}
// fragment shader
uniform sampler2D meshTexture;
uniform sampler2D paintTexture;
varying vec2 uv;
varying vec2 screenPos;
void main() {
gl_FragColor = texture2D(meshTexture, uv);
if (screenPos.x > .5)
gl_FragColor = texture2D(paintTexture, uv);
}
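No answer is recorded for this one, but two details in the vertex shader stand out: the clip-space result is never divided by w, and the second assignment to screenPos overwrites the first. A minimal sketch of the usual derivation, with the perspective divide done per fragment and the paint buffer sampled at the derived screen position (which seems to be the intent):
// vertex shader (sketch)
uniform mat4 orthoPV;
uniform mat4 cameraPV;
uniform mat4 objToWorld;
varying vec2 uv;
varying vec4 clipPos; // full clip-space position; divided per fragment
void main() {
    uv = gl_Vertex.xy;
    clipPos = cameraPV * objToWorld * vec4(gl_MultiTexCoord0.xyz, 1.0);
    gl_Position = orthoPV * gl_Vertex;
}
// fragment shader (sketch)
uniform sampler2D meshTexture;
uniform sampler2D paintTexture;
varying vec2 uv;
varying vec4 clipPos;
void main() {
    // Perspective divide, then bias [-1,1] NDC into [0,1] texture space.
    vec2 screenPos = 0.5 * (clipPos.xy / clipPos.w + vec2(1.0, 1.0));
    gl_FragColor = texture2D(meshTexture, uv);
    if (screenPos.x > 0.5)
        gl_FragColor = texture2D(paintTexture, screenPos);
}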

OpenGL shadow map issue

I implemented a fairly simple shadow map. I have a simple obj-imported plane as the ground and a bunch of trees.
I have a weird shadow on the plane, which I think is the plane's self-shadow. I am not sure what code to post; if it would help, please tell me and I'll do so.
First image, camera view of the scene. The weird textured lowpoly sphere is just for reference of the light position.
Second image, the depth texture stored in the framebuffer; I calculated shadow coords from the light's perspective with it. Since I can't post more than 2 links, I'll leave this one out (see the later edit below).
Third image, depth texture with a better view of the plane projecting the shadow from a different light position above the whole scene.
Later edit: the second picture: http://i41.tinypic.com/23h3wqf.jpg (depth texture of the first picture)
Tried some fixes: adding glCullFace(GL_BACK) before drawing the ground in the first pass removes it from the depth texture, but the shadow still appears in the final render (like in the first picture, on the back part of the ground). I tried adding cull face in the second pass as well, and all combinations of front and back facing; the shadow still shows on the ground. Could it be because of the values in the orthographic projection?
Shadow fragment shader:
#version 330 core
layout(location = 0) out vec3 color;
in vec2 texcoord;
in vec4 ShadowCoord;
uniform sampler2D textura1;
uniform sampler2D textura2;
uniform sampler2D textura_depth;
uniform int has_alpha;
void main(){
    vec3 tex1 = texture(textura1, texcoord).xyz;
    vec3 tex2 = texture(textura2, texcoord).xyz;
    // Alpha test driven by the second texture: discard near-black texels.
    if (has_alpha > 0.5)
        if ((tex2.r < 0.1) && (tex2.g < 0.1) && (tex2.b < 0.1))
            discard;
    // Z value of the depth texture from pass 1
    float hartaDepth = texture(textura_depth, ShadowCoord.xy / ShadowCoord.w).z;
    // In shadow if the occluder stored in the map is closer to the light
    // than this fragment (0.005 is a constant depth bias).
    float shadowValue = 1.0;
    if (hartaDepth < ShadowCoord.z - 0.005)
        shadowValue = 0.5;
    color = shadowValue * tex1;
}
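No answer is recorded here either, but a common way to attack this kind of depth-pass self-shadowing, assuming a two-pass setup like the one described, is to cull front faces and/or apply polygon offset while rendering the depth map only. A sketch of the host-side state (depthFBO and drawScene are hypothetical names):
// Pass 1: render the depth map from the light's point of view.
// Culling front faces keeps the surfaces nearest the light out of the
// depth map, which reduces "shadow acne" from depth quantization.
glBindFramebuffer(GL_FRAMEBUFFER, depthFBO); // hypothetical FBO handle
glEnable(GL_CULL_FACE);
glCullFace(GL_FRONT);
glEnable(GL_POLYGON_OFFSET_FILL); // optional extra depth bias
glPolygonOffset(1.1f, 4.0f);      // factor/units are scene-dependent
drawScene();                      // hypothetical draw call
glDisable(GL_POLYGON_OFFSET_FILL);
// Pass 2: normal render with the shadow test in the fragment shader.
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glCullFace(GL_BACK);
drawScene();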