Generate a sprite according to box2d path - cocos2d-iphone

I am making a path based game using cocos2d 2.0 which will randomly generate a path (curvature) for the player. This path has been created using box2d's b2EdgeShape. I want to place a sprite below that path, the route I have taken is this:
1. The sprite shown below the path repeats continuously, and its height is always greater than the height of the path (parallax background).
2. A white sprite fits perfectly below the path, rendered via CCRenderTexture (my real question).
3. A fragment shader compares the alpha values of the white sprite with the sprite I actually want to show and multiplies the two. For each pixel where the sprite should be shown, the alpha value after multiplication is 1, and hopefully I can achieve the effect I am looking for.
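The multiply step above is just per-pixel arithmetic; as a rough sanity check (a plain-Python stand-in for what the GLSL fragment shader would do per fragment, with made-up pixel values):

```python
# Illustrative simulation of the mask-multiply idea: the final alpha of each
# pixel is the sprite's alpha times the white mask's alpha.
def masked_alpha(sprite_alpha, mask_alpha):
    """Both inputs are floats in [0, 1]; returns the combined alpha."""
    return sprite_alpha * mask_alpha

# A row of pixels: sprite fully opaque, mask opaque only below the path.
sprite_row = [1.0, 1.0, 1.0, 1.0]
mask_row   = [1.0, 1.0, 0.0, 0.0]   # white under the path, transparent above
result = [masked_alpha(s, m) for s, m in zip(sprite_row, mask_row)]
print(result)  # the sprite stays visible only where the mask is opaque
```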
How do I make that white sprite fit perfectly below the path using CCRenderTexture? I tried to follow the Tiny Wings tutorial on RayWenderlich, but that tutorial is based on OpenGL ES 1.1 and uses fixed-pipeline functions that I cannot port to OpenGL ES 2.0 (I have very little knowledge of OpenGL to begin with). I would rather use cocos2d 2.0, which targets OpenGL ES 2.0. I have the fragment shader working with a static white sprite, and this approach can work if I can keep getting that white color underneath the path.
Can somebody please point me in the right direction? Any help would be appreciated :)

If the white sprite is just an alpha mask, you don't need to do that. Follow Ray's example; I think all you have to change is:
In your -init:
// Built-in to Cocos2D v2.0
[self setShaderProgram: [[CCShaderCache sharedShaderCache] programForKey:kCCShader_PositionTexture]];
... Generate _hillVertices and _hillTexCoords as Ray does ...
In your -draw:
// Built-in to Cocos2D v2.0
[[self shaderProgram] use];
[[self shaderProgram] setUniformForModelViewProjectionMatrix];
// vertex data (glVertexPointer is deprecated)
glEnableVertexAttribArray ( kCCVertexAttrib_Position );
glVertexAttribPointer ( kCCVertexAttrib_Position, 2, GL_FLOAT, GL_FALSE, 0, _hillVertices );
// texture data (glTexCoordPointer is deprecated)
glEnableVertexAttribArray ( kCCVertexAttrib_TexCoords );
glVertexAttribPointer ( kCCVertexAttrib_TexCoords, 2, GL_FLOAT, GL_FALSE, 0, _hillTexCoords );
glDrawArrays ( GL_TRIANGLE_STRIP, 0, (GLsizei)_nHillVertices );
I have a working implementation that is similar (I used vertices and color coordinates, no texture), so I haven't actually compiled the code above. Cocos2d v2.0 has built-in shaders for vertices with textures, vertices with colors, vertices with textures and colors, etc. Just pick the appropriate kCCShader_* key when you call setShaderProgram and provide the matching data.

Related

Drawing the grid over the texture

Before diving into details: I have added the opengl tag because JoGL is a Java OpenGL binding, and the question should be answerable by experts in either.
Basically, I am trying to render a grid over a texture in JoGL using GLSL. My idea was to render the texture first and then draw the grid on top. So what I am doing is:
gl2.glBindTexture(GL2.GL_TEXTURE_2D, textureId);
// skipped the code where I setup the model-view matrix and where I do fill the buffers
gl2.glVertexAttribPointer(positionAttrId, 3, GL2.GL_FLOAT, false, 0, vertexBuffer.rewind());
gl2.glVertexAttribPointer(textureAttrId, 2, GL2.GL_FLOAT, false, 0, textureBuffer.rewind());
gl2.glDrawElements(GL2.GL_TRIANGLES, indices.length, GL2.GL_UNSIGNED_INT, indexBuffer.rewind());
And after I draw the grid, using:
gl2.glBindTexture(GL2.GL_TEXTURE_2D, 0);
gl2.glDrawElements(GL2.GL_LINE_STRIP, indices.length, GL2.GL_UNSIGNED_INT, indexBuffer.rewind());
Without the depth test enabled, the result looks pretty good.
But when I start updating the coordinates of the vertices (namely the axis that corresponds to height), the rendering comes out wrong: some things that should be in front appear behind, which makes sense with the depth test disabled. So I enabled the depth test:
gl.glEnable(GL2.GL_DEPTH_TEST);
gl.glDepthMask(true);
And the result of the rendering is the following:
You can clearly see that the lines of the grid are blurred and some of them are displayed thinner than others. What I tried in order to fix the problem is some line smoothing:
gl2.glHint(GL2.GL_LINE_SMOOTH_HINT, GL2.GL_NICEST);
gl2.glEnable(GL2.GL_LINE_SMOOTH);
The result is better, but I am still not satisfied.
QUESTION: How can I improve the solution further, so that the lines are solid and display nicely while I update the vertex coordinates?
If required, I can provide the shader code (it is really simple: the vertex shader only calculates the position from the projection matrix, the model-view matrix, and the vertex coordinates, and the fragment shader fetches the color from a texture sampler).
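For what it's worth, the artifacts described above look like classic z-fighting: the grid lines are coplanar with the textured surface, so the two passes produce nearly identical depth values and the winner varies per pixel. A common remedy (not from this thread) is to push the filled surface slightly back in the fill pass, e.g. with glPolygonOffset. A toy model of the depth comparison, with made-up numbers:

```python
# Toy model of the depth test: a fragment passes if its depth is strictly
# less than what is already in the depth buffer (the GL_LESS comparison).
def depth_test_passes(fragment_depth, buffer_depth):
    return fragment_depth < buffer_depth

surface_depth = 0.5   # depth written by the textured-surface pass
line_depth = 0.5      # grid line is coplanar with the surface

# Coplanar line vs. surface: the line fragment fails GL_LESS, so whether a
# line pixel survives depends on floating-point noise -> broken-looking lines.
assert not depth_test_passes(line_depth, surface_depth)

# With a small depth offset applied to the surface (pushing it away),
# the line reliably wins the depth test.
offset = 1e-4
assert depth_test_passes(line_depth, surface_depth + offset)
```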

Mapping a texture to vertices in OpenGL

I'm trying to render textured meshes with OpenGL. Currently, my main class holds a state consisting of :
std::vector<vec3d> vertices
std::vector<face> mesh
std::vector<vec3d> colors
vec3d is an implementation of 3D vectors - nothing particular - and face is a class holding 3 integers, each the index of a vertex in vertices.
So far, I rendered my meshes without a texture using the following code (working fine):
glShadeModel(params.smooth ? GL_SMOOTH : GL_FLAT);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);
/* This is my attempt to add the texture:
 *
 * if (_colors.size() != 0) {
 *     cout << "Hello" << endl;
 *     glClientActiveTexture(GL_TEXTURE0);
 *     glEnableClientState(GL_TEXTURE_COORD_ARRAY);
 *     glTexCoordPointer(3, GL_FLOAT, 0, &_colors[0].x);
 * }
 */
glNormalPointer(GL_FLOAT,0,&normals[0].x);
glVertexPointer(3,GL_FLOAT,0,&vertices[0].x);
glDrawElements(GL_TRIANGLES,mesh.size()*3,GL_UNSIGNED_INT,&mesh[0].v1);
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_NORMAL_ARRAY);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
My texture is stored in colors as a list of triples of floats between 0 and 1. However, the colors are not applied. I have read many examples of texture mapping and tried to do the same, with no luck. Any idea what I'm doing wrong?
As seen from your comments, you are using the wrong OpenGL feature for what you want. Texturing means sticking a 2D image onto a mesh using, e.g., UV coordinates.
What you are doing is specifying a color at each vertex, so you need to enable GL_COLOR_ARRAY instead of GL_TEXTURE_COORD_ARRAY and use the corresponding functions (glColorPointer instead of glTexCoordPointer).
One additional hint: If you are learning OpenGL from scratch you should consider using only modern OpenGL (3.2+)
To answer the last comment:
Well, I read those colors from a texture file, that's what I meant. Is there a way to use such an array to display my mesh in color ?
Yes and no: you will most probably not get the result you expect. In general, multiple pixels of a texture map onto a single face, but with vertex colors you can only apply one color value per vertex, which gets interpolated across the triangle. Have a look at how to apply textures to a mesh; you should be able to find plenty of resources online.
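That interpolation can be sketched in a few lines (illustrative only; this is a plain-Python model of what GL_COLOR_ARRAY with smooth shading does, not real OpenGL code):

```python
# One color per vertex gets blended across the triangle by the rasterizer,
# so per-pixel texture detail is lost: every interior pixel is a mixture
# of just the three corner colors.
def interpolate_color(c0, c1, c2, w0, w1, w2):
    """Barycentric blend of three per-vertex RGB colors; w0 + w1 + w2 == 1."""
    return tuple(w0 * a + w1 * b + w2 * c for a, b, c in zip(c0, c1, c2))

red, green, blue = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)

# At a corner you get exactly that vertex's color...
print(interpolate_color(red, green, blue, 1.0, 0.0, 0.0))  # (1.0, 0.0, 0.0)
# ...while at the centroid every vertex contributes equally: a uniform
# blend, not any particular texel of an image.
center = interpolate_color(red, green, blue, 1/3, 1/3, 1/3)
print(center)
```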

LibGDX texture blending with OpenGL blending function

In libGDX, I'm trying to create a shaped texture: take a fully visible rectangle texture and mask it to obtain a shaped texture, as shown here:
Here I test it on a rectangle, but I will want to use it on any shape. I have looked into this tutorial and came up with the idea to first draw the texture and then the mask with this blending function:
batch.setBlendFunction(GL20.GL_ZERO, GL20.GL_SRC_ALPHA);
GL20.GL_ZERO - because I really don't want to paint any pixels from the mask itself
GL20.GL_SRC_ALPHA - from the original texture I want to paint only those pixels where the mask was visible (= white).
Crucial part of the test code:
batch0.enableBlending();
batch0.begin();
batch0.draw(original, 0, 0); //to see the original
batch0.draw(mask, width1, 0); //and the mask
batch0.draw(original, 0, height1); //base for the result
batch0.setBlendFunction(GL20.GL_ZERO, GL20.GL_SRC_ALPHA);
batch0.draw(mask, 0, height1); //draw mask on result
batch0.setBlendFunction(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);
batch0.end();
The center of the texture gets selected well, but instead of a transparent surrounding I see black:
Why is the result black and not transparent?
(Full code - Warning: very messy)
What you're trying to do looks like a pretty clever use of blending. But I believe the exact way you apply it is "broken by design". Let's walk through the steps:
You render your background with red and green squares.
You render an opaque texture on top of your background.
You erase parts of the texture you rendered in step 2 by applying a mask.
The problem is that for the parts you erase in step 3, the previous background is not coming back. It really can't, because you wiped it out in step 2. The background of the whole texture area was replaced in step 2, and once it's gone there's no way to bring it back.
Now the question is of course how you can fix this. There are two conventional approaches I can think of:
You can combine the texture and mask by rendering them into an off-screen framebuffer object (FBO). You perform steps 1 and 2 as you do now, but render into an FBO with a texture attachment. The texture you rendered into then carries alpha values that reflect your mask, and you can use it to render into your default framebuffer with standard blending.
You can use a stencil buffer. Masking out parts of rendering is a primary application of stencil buffers, and using stencil would definitely be a very good solution for your use case. I won't elaborate on the details of how exactly to apply stencil buffers to your case in this answer. You should be able to find plenty of examples both online and in books, including in other answers on this site, if you search for "OpenGL stencil". For example this recent question deals with doing something similar using a stencil buffer: OpenGL stencil (Clip Entity).
So those would be the standard solutions. But inspired by the idea in your attempt, I think it's actually possible to get this to work with just blending. The approach that I came up with uses a slightly different sequence and different blend functions. I haven't tried this out, but I think it should work:
You render the background as before.
Render the mask. To prevent it from wiping out the background, disable writing to the color components of the framebuffer, and only write to the alpha component. This leaves the mask in the alpha component of the framebuffer.
Render the texture, using the alpha component from the framebuffer (DST_ALPHA) for blending.
You will need a framebuffer with an alpha component for this to work. Make sure that you request alpha bits for your framebuffer when setting up your context/surface.
The code sequence would look like this:
// Draw background.
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE);
glDisable(GL_BLEND);
// Draw mask.
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glEnable(GL_BLEND);
glBlendFunc(GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA);
// Draw texture.
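The three steps can be checked with plain per-channel arithmetic. The sketch below (illustrative values, not real GL calls) models the glBlendFunc(GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA) blend from the sequence above:

```python
# Toy model of blending with GL_DST_ALPHA / GL_ONE_MINUS_DST_ALPHA:
# out = src * dstA + dst * (1 - dstA), computed per color channel.
def blend_dst_alpha(src_rgb, dst_rgb, dst_alpha):
    return tuple(s * dst_alpha + d * (1 - dst_alpha)
                 for s, d in zip(src_rgb, dst_rgb))

background = (1.0, 0.0, 0.0)   # step 1: a red background pixel
texture    = (0.0, 0.0, 1.0)   # step 3: a blue texture pixel

# Step 2 left the mask in the framebuffer's alpha channel, so dst_alpha
# is 1 inside the mask and 0 outside it.
print(blend_dst_alpha(texture, background, 1.0))  # inside mask -> texture
print(blend_dst_alpha(texture, background, 0.0))  # outside mask -> background
```

This is exactly why the background survives: outside the mask the destination factor is 1, so the framebuffer pixel passes through untouched.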
A very late answer, but with the current version this is very easy. You simply draw the mask, set a blend function that multiplies the destination by the source color, and draw the original. You'll only see the original image where the mask is white.
//create batch with blending
SpriteBatch maskBatch = new SpriteBatch();
maskBatch.enableBlending();
maskBatch.begin();
//draw the mask
maskBatch.draw(mask);
//store original blending and set correct blending
int src = maskBatch.getBlendSrcFunc();
int dst = maskBatch.getBlendDstFunc();
maskBatch.setBlendFunction(GL20.GL_ZERO, GL20.GL_SRC_COLOR);
//draw original
maskBatch.draw(original);
//reset blending
maskBatch.setBlendFunction(src, dst);
//end batch
maskBatch.end();
If you want more info on the blending options, check How to do blending in LibGDX

opengl - how to put texture on 3d irregular object

I have to create an animation where a Gatling gun shoots (it doesn't have to be complex, it's just practice). I drew a basic version of my gun, which looks like this:
Don't mind the colors; I made them like that to be able to see where the edges of the particular parts of the gun are. Now I would like to make it look better by using some texture, camo or something like a metallic color, see example 1 or example 2. I know how to load a texture and how to use it for 2D objects, but I have no idea whether I can use one texture for my whole drawing or whether I have to texture every part separately. This is the code that loads a texture from a BMP file and makes it usable:
void initTexture(string fileName)
{
loadBmp(fileName.c_str());
textureId = 13;
glBindTexture(GL_TEXTURE_2D, textureId); //Tell OpenGL which texture to edit
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
//Map the image to the texture
glTexImage2D(GL_TEXTURE_2D, //Always GL_TEXTURE_2D
0, //0 for now
GL_RGB, //Format OpenGL uses for image
tex.info.biWidth, tex.info.biHeight, //Width and height
0, //The border of the image
GL_RGB, //GL_RGB, because pixels are stored in RGB format
GL_UNSIGNED_BYTE, //GL_UNSIGNED_BYTE, because pixels are stored
//as unsigned numbers
tex.px); //The actual pixel data
}
loadBmp() is a function that loads a bitmap file. I tried to search the Internet and Stack Overflow, but all the examples were about cubes or spheres, which doesn't help me. How can I put a texture on this drawing?
Texture mapping requires (manually) assigning texture coordinates to each vertex. There are some approaches to automatic texture-coordinate generation, and ways of getting by without texture coordinates by giving each face its own pixels (Disney Animation pioneered the latter method for their computer-animated films).
Since you didn't specify which program you used for modeling, I'll refer you to a tutorial on texture UV mapping for Blender:
http://wiki.blender.org/index.php/Doc:2.6/Manual/Textures/Mapping/UV/Unwrapping
Please don't tell me you "coded" your gun by hand, because this is wrong, wrong, wrong!
Once you got the texture coordinates you pass them to OpenGL as just another vertex attribute.
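To make the idea of "a (u, v) pair per vertex" concrete, here is a rough illustration (a made-up 2x2 image and a nearest-neighbour lookup in plain Python; real texturing is done by the GPU with filtering):

```python
# Nearest-neighbour texture lookup: a texture coordinate (u, v) in [0, 1]
# selects a texel of the image.
def sample(texture, u, v):
    h, w = len(texture), len(texture[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y][x]

# A 2x2 "image": each entry is an RGB triple.
tex = [[(255, 0, 0), (0, 255, 0)],
       [(0, 0, 255), (255, 255, 255)]]

# Each vertex carries such a UV pair as a vertex attribute; the rasterizer
# interpolates the UVs across the face and samples per fragment.
print(sample(tex, 0.0, 0.0))  # (255, 0, 0)
print(sample(tex, 0.9, 0.9))  # (255, 255, 255)
```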

How to give a CCSprite a specified alpha channel?

I want to create a paint pen in Cocos2D.
I have a PNG file which specifies the alpha channel of an image (only one channel).
I want to apply that alpha to a CCSprite's texture, which is a pure color, and use the sprite as a pen to draw on the screen.
How can I do this in code?
Thanks very much!
I had to do this for my project too, and I came up with this idea (in cocos2d-x):
Render your "alpha sprite" with RGB disabled but alpha enabled (through glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE)) into a CCRenderTexture.
Render your actual sprite with alpha disabled but RGB enabled (through glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_FALSE)).
Now you have RGB from the actual sprite and alpha from the "alpha sprite". Then set the blend function of the render texture's sprite to {GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA}.
Now you can add the render texture and draw it in the scene.
I have tried this and it worked for my project; I hope it will work for yours too.
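The two glColorMask passes amount to assembling one pixel from two sources; a plain-Python sketch of that channel masking (illustrative values only):

```python
# Per-channel write mask, like glColorMask: masked-off channels keep their
# previous framebuffer value, enabled channels take the incoming value.
def write_masked(dst, src, write_r, write_g, write_b, write_a):
    masks = (write_r, write_g, write_b, write_a)
    return tuple(s if m else d for s, d, m in zip(src, dst, masks))

fb = (0.0, 0.0, 0.0, 0.0)              # cleared render texture (RGBA)
alpha_sprite = (1.0, 1.0, 1.0, 0.25)   # only its alpha matters
color_sprite = (0.2, 0.6, 0.9, 1.0)    # only its RGB matters

fb = write_masked(fb, alpha_sprite, False, False, False, True)   # pass 1
fb = write_masked(fb, color_sprite, True, True, True, False)     # pass 2
print(fb)  # (0.2, 0.6, 0.9, 0.25): RGB from one sprite, alpha from the other
```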
sprite.opacity = 100;
The opacity range is 0-255. Note that you can't change the alpha of a texture (and thereby all sprites using that texture) at once unless you write a custom shader.