I am drawing a background sprite:
1) Using CCRenderTexture to draw the texture
2) Then taking a sprite from it
I need to add some images or sprites (made from images) onto that texture.
Is there any API that lets me add images to the texture and then get a sprite back (texture + images), or do I have to draw onto the texture using OpenGL commands?
Thanks in advance
CCRenderTexture is a CCNode. As with every CCNode, you can add sprites as children and position them wherever you want. To animate them you can use CCAnimation/CCAnimate, just as with any other sprite.
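As a minimal sketch of the render-to-texture API being asked about (cocos2d-x v3.x naming; cocos2d-iphone has equivalent Objective-C calls, and the file names are placeholders):

#include "cocos2d.h"
using namespace cocos2d;

// Compose a background plus overlay images into one texture and return it
// as a sprite. "background.png" and "overlay.png" are placeholder assets.
Sprite* composeBackground()
{
    auto size = Director::getInstance()->getWinSize();
    auto rt = RenderTexture::create(size.width, size.height);

    auto background = Sprite::create("background.png");
    background->setPosition(size.width / 2, size.height / 2);

    auto overlay = Sprite::create("overlay.png");
    overlay->setPosition(100, 100);

    // Everything visited between begin() and end() is drawn into the texture.
    rt->begin();
    background->visit();
    overlay->visit();
    rt->end();

    // The composed result (texture + images) is available as a sprite.
    return rt->getSprite();
}

The returned sprite can then be added to the scene like any other node.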
I am rendering a flower on screen using textures from a spritesheet, and it looks fine. When I use the same textures, but not from a spritesheet, the flower is rendered differently.
The first image (below) shows the flower rendered from a spritesheet, and it looks correct. It is composed of two texture layers: petals and center.
The second image (below) shows the flower composed from two sprites, one holding the petals texture and the other holding the center texture, not from a spritesheet. As you can see, there is transparency around the center, which is caused (I presume) by blending the center texture onto the petals texture.
The petals and center textures were composed from the original images, read from files, using CCRenderTexture. The original images are PMA (premultiplied alpha), while the resulting texture from CCRenderTexture is NPMA (non-premultiplied alpha).
Changing blending modes to PMA or NPMA does not help.
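For reference, "changing blending modes" here means calls along these lines (a sketch in cocos2d-x v3.4 terms; centerSprite stands for the sprite being adjusted):

// Premultiplied-alpha blending: {GL_ONE, GL_ONE_MINUS_SRC_ALPHA}
centerSprite->setBlendFunc(BlendFunc::ALPHA_PREMULTIPLIED);
// Straight-alpha blending: {GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA}
centerSprite->setBlendFunc(BlendFunc::ALPHA_NON_PREMULTIPLIED);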
The sprite node hierarchy is simple:
ROOT-SPRITE (empty image, i.e. a 1x1 px, fully transparent image)
PETALS-SPRITE (petals texture), z=1
CENTER-SPRITE (center texture), z=2
I have the following questions:
What am I doing wrong?
How can I resolve this?
Using Cocos2D-X v3.4 in the iOS simulator (a device gives the same results).
I am using OpenGL and C++ for image processing. The idea is simple: I load an image, draw a polygon by clicking, and then apply an effect (desaturation, for instance) only to the pixels in the interior of the polygon shape just created.
Can anyone give me any direction on how to limit the effect to the pixels within the interior of the polygon? Loading the image and drawing the polygon is not a problem.
Suppose the following situation:
The picture on which you want to apply the effect takes up the whole screen.
The picture is rendered using OpenGL, probably through a simple shader, with the picture passed as a texture.
You can take the following approach:
consider the screen as being one big texture;
draw a polygon, which will be rendered on top of the rendered texture;
for the polygon's vertices, use UVs corresponding to their 2D screen coordinates (i.e. mapped from screen space to UV space, [0, 1]);
draw the picture normally;
on top of the picture, draw your polygon using the same picture as its texture, but with a different shader.
So instead of trying to desaturate a specific region of your picture, create a polygon on top of that region, texture it with the same picture, and desaturate that new polygon. This approach lets you avoid the stencil buffer.
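A minimal sketch of the desaturation fragment shader the overlay polygon could use (plain GLSL stored in a C++ string; the uniform and varying names are placeholders, not from the original post):

// Fragment shader for the overlay polygon: sample the same picture at the
// screen-space UVs and output a desaturated color.
static const char* kDesaturateFrag = R"GLSL(
    uniform sampler2D u_picture;   // the same picture used for the background
    uniform float u_amount;        // 1.0 = fully desaturated
    varying vec2 v_uv;             // screen-space UVs computed per vertex

    void main()
    {
        vec4 color = texture2D(u_picture, v_uv);
        float gray = dot(color.rgb, vec3(0.2126, 0.7152, 0.0722)); // Rec. 709 luminance
        gl_FragColor = vec4(mix(color.rgb, vec3(gray), u_amount), color.a);
    }
)GLSL";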
Another approach would be to create the polygon but draw it only into the stencil buffer, before the picture is drawn, and then use the stencil test to restrict the desaturation pass to the polygon's interior.
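A sketch of that stencil-buffer variant in desktop OpenGL (it assumes the context was created with stencil bits; drawPolygon and drawPicture are hypothetical helpers):

// 1. Render the polygon only into the stencil buffer.
glClear(GL_STENCIL_BUFFER_BIT);
glEnable(GL_STENCIL_TEST);
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);  // do not touch the color buffer
glStencilFunc(GL_ALWAYS, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);            // write 1 where the polygon covers
drawPolygon();
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

// 2. Draw the picture normally where the stencil is not 1 (outside the polygon).
glStencilFunc(GL_NOTEQUAL, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
drawPicture(/*desaturated=*/false);

// 3. Draw the desaturated picture where the stencil is 1 (inside the polygon).
glStencilFunc(GL_EQUAL, 1, 0xFF);
drawPicture(/*desaturated=*/true);

glDisable(GL_STENCIL_TEST);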
The idea is: I have a sprite with a rectangle image:
CCSprite *sprite = [CCSprite spriteWithFile:@"Rectangle.png"];
When I touch the sprite, 8 red points (handles) appear.
Holding a point and dragging it should scale (resize) the image, like this.
Can anyone show me how to do this or give me sample code?
I am not sure about this, but you could redraw the texture on those 8 points.
It is similar to drawing the texture of a soft body. I implemented that for my game; there I used 12 points and drew the texture on those 12 points.
So what you can do is redraw the texture on the handle points whenever one of them is dragged. The following tutorial is about soft bodies, but you can use it for reference:
http://www.uchidacoonga.com/2012/04/soft-body-physics-with-box2d-and-cocos2d-part-44/
It is not exactly what you are looking for, but you can implement your case with this approach.
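If all you need is axis-aligned resizing rather than arbitrary deformation, a simpler alternative is to rescale the sprite from the dragged handle. A minimal sketch in cocos2d-x C++ terms (the original question uses cocos2d-iphone, so names differ slightly); it assumes the sprite's anchor point is its center, that it is not rotated, and that resizeFromHandle is called from your touch-moved handler once a corner handle has been grabbed:

#include <cmath>
#include "cocos2d.h"
using namespace cocos2d;

// Rescale the sprite so that the grabbed corner follows the finger.
static void resizeFromHandle(Sprite* sprite, Touch* touch)
{
    // Touch position in the sprite's parent space, relative to the sprite's center.
    Vec2 local = sprite->getParent()->convertToNodeSpace(touch->getLocation())
               - sprite->getPosition();
    Size content = sprite->getContentSize();

    sprite->setScaleX(std::abs(local.x) / (content.width  * 0.5f));
    sprite->setScaleY(std::abs(local.y) / (content.height * 0.5f));

    // The 8 red handle markers would be repositioned around the resized
    // sprite after this call.
}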
I want to build a LevelEditor and would need the sprites (their images) stored in a cocos2d texture atlas displayed in an NSTableView. Any idea how one could approach that?
You can achieve this by creating a CCSprite for each image in your texture atlas and then rendering them into a CCRenderTexture. CCRenderTexture has the ability to save its texture as an NSImage.
More info on how to do this: http://www.cocos2d-iphone.org/forum/topic/27769
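A rough equivalent sketch in cocos2d-x terms (the original answer targets cocos2d-iphone on OS X, where the render texture can be read back as an NSImage; here saveToFile writes PNG files instead, and atlas.plist and the frame names are placeholders):

#include <string>
#include <vector>
#include "cocos2d.h"
using namespace cocos2d;

// Render each atlas frame into its own RenderTexture and save it as an image
// that the editor's table view can display.
void exportFrames(const std::vector<std::string>& frameNames)
{
    SpriteFrameCache::getInstance()->addSpriteFramesWithFile("atlas.plist");

    for (const auto& name : frameNames)
    {
        auto sprite = Sprite::createWithSpriteFrameName(name);
        auto size = sprite->getContentSize();
        sprite->setPosition(size.width / 2, size.height / 2);

        auto rt = RenderTexture::create(size.width, size.height);
        rt->begin();
        sprite->visit();
        rt->end();

        // Written asynchronously to the app's writable path once the frame
        // has actually been rendered.
        rt->saveToFile(name + ".png", Image::Format::PNG);
    }
}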
I am creating a scene, and I would like to display a static background image that does not change regardless of how and what I am doing with the scene.
Usually this means you can skip clearing your color buffer at the beginning of the frame; instead, set a default orthographic projection and render a quad with vertices (-1,-1,0), (-1,+1,0), (+1,+1,0), (+1,-1,0), applying a texture to that quad.
Then you can set the necessary perspective projection and render whatever you want in your scene. The quad will serve as your background.
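A minimal fixed-function OpenGL sketch of this idea (backgroundTex is a placeholder texture id, and the scene-drawing call at the end is left as a stub):

#include <GL/gl.h>

void drawFrame(GLuint backgroundTex)
{
    // Clear depth only; the background quad overwrites the color buffer anyway.
    glClear(GL_DEPTH_BUFFER_BIT);

    // Identity (default orthographic) projection: the quad's vertices are
    // already in normalized device coordinates.
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glDepthMask(GL_FALSE);               // keep the background behind everything
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, backgroundTex);
    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
        glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f, +1.0f, 0.0f);
        glTexCoord2f(1.0f, 1.0f); glVertex3f(+1.0f, +1.0f, 0.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex3f(+1.0f, -1.0f, 0.0f);
    glEnd();
    glDepthMask(GL_TRUE);

    // Now set the perspective projection and draw the rest of the scene.
    // drawScene();  // placeholder
}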