I can capture scene change events with the onEnter and onExit methods. But when a scene change takes time, like a fade in or fade out, onEnter is called too early (right before the fading) and onExit is called too late (after the fading has completed).
I want another onEnter called right after the fading has completed, and another onExit called right before the fading starts. Is that possible?
There's a second onEnter callback just for transitions: onEnterTransitionDidFinish. But as has been mentioned, it will only fire if CCScheduler is being used in conjunction with CCSceneTransition.
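For instance, in a CCScene or CCLayer subclass you can override it next to onEnter; a minimal sketch, where the logging is only illustrative:
-(void) onEnter {
    [super onEnter];
    // Still fires before the incoming fade has finished
    CCLOG(@"onEnter");
}
-(void) onEnterTransitionDidFinish {
    [super onEnterTransitionDidFinish];
    // Fires once the incoming transition has completely finished
    CCLOG(@"onEnterTransitionDidFinish - safe to start things here");
}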
Use a CCSequence with your CCFadeIn and then add a CCCallFunc after it.
onEnter and onExit have to do with CCNode object allocation and removal, not the physical views.
Sample code:
[scene runAction:[CCSequence actions:
    [CCFadeIn actionWithDuration:0.45f],
    [CCCallFunc actionWithTarget:scene selector:@selector(fakeOnEnter)],
    nil]];
Inside your scene object you will need a method like this:
-(void) fakeOnEnter {
    // your code to run after the fade-in
}
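For the exit side, the same idea works in reverse: call your own method yourself, right before you start the fade-out or the scene transition. A rough sketch, where the fakeOnExit method, the NextScene class, and the transition choice are assumptions rather than part of the original answer:
-(void) goToNextScene {
    // Run the pre-fade logic before the transition starts
    [self fakeOnExit];
    [[CCDirector sharedDirector] replaceScene:
        [CCTransitionFade transitionWithDuration:0.45f scene:[NextScene scene]]];
}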
Always the same problem.
I have a scene and I am adding a CCLayer to it from another class; it is a background with a CCMenu on it.
When I touch it, the touches pass through to whatever is under this layer, and I don't want that.
otherClass *layer = [[otherClass alloc] init]; // otherClass is a CCLayer subclass
[self addChild: layer];
The layer is fine and sits above my scene, but the touches still go through.
Is there a way in cocos2d to enable touches ONLY on the top layer?
Do I have to change the touch priority?
You need to change the touch priority and set swallowsTouches.
To do this, register your layer with the touch dispatcher using these parameters (you can also look at CCLayer's registerWithTouchDispatcher method as an example):
[[[CCDirector sharedDirector] touchDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
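For example, a top layer that should swallow everything it receives might look roughly like this (a sketch only; in older cocos2d versions the dispatcher is reached via [CCTouchDispatcher sharedDispatcher] instead):
-(id) init {
    if ((self = [super init])) {
        // Makes cocos2d call registerWithTouchDispatcher when the layer enters
        self.isTouchEnabled = YES;
    }
    return self;
}
-(void) registerWithTouchDispatcher {
    // Lower priority values are served first; swallowsTouches:YES stops a
    // claimed touch from being passed on to delegates behind this layer
    [[[CCDirector sharedDirector] touchDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
}
-(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    // Returning YES claims the touch, so it never reaches the scene underneath
    return YES;
}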
I have a CCSprite subclass, and initially I had set it up as a targeted touch delegate that only handled ccTouchBegan.
So I had the following code:
-(void)onEnter {
    [super onEnter];
    [[[CCDirector sharedDirector] touchDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
}
-(void)onExit {
    [super onExit];
    [[[CCDirector sharedDirector] touchDispatcher] removeDelegate:self];
}
-(BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    if ([self containsTouch:touch]) {
        // do stuff
        return YES;
    }
    return NO;
}
But then I realized that I actually didn't want to use touchBegan, because I want to detect whether a sprite has been dragged downward. So I wanted to use touchMoved and touchEnded instead of touchBegan.
However, when I implement those methods, they are never called...
How can I tell when the sprite's touch ended, and whether it was "swiped"?
Enabling multiple touches: in the applicationDidFinishLaunching: method of your app delegate, enable multiple touches with [glView setMultipleTouchEnabled:YES];
Then in your CCLayer subclass (the class you are working in for detecting touches), add self.isTouchEnabled = YES; in the init method.
Now your multi-touch methods should get called.
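Put together, and assuming the standard cocos2d 1.x template where the app delegate owns an EAGLView named glView, that looks roughly like this:
// AppDelegate.m
- (void) applicationDidFinishLaunching:(UIApplication *)application {
    // ... existing template setup ...
    [glView setMultipleTouchEnabled:YES];   // allow more than one simultaneous touch
}
// YourLayer.m
-(id) init {
    if ((self = [super init])) {
        self.isTouchEnabled = YES;          // opt this layer in to touch events
    }
    return self;
}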
Swiping: cocos2d does not support gestures out of the box, so you will likely have to implement this yourself. You can start with Apple's event handling guide on gestures. The "How To Drag and Drop Sprites with Cocos2D" tutorial at raywenderlich.com helped me.
In order for the dispatcher to call your other methods (moved, ended, cancelled), you first have to claim the touch, i.e. declare that you will process its events. You do that in ccTouchBegan by returning YES; after that you will receive the other events for that touch.
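For the dragged-downward case, a minimal sketch along those lines could look like this (the startTouchPoint ivar and the 20-point threshold are assumptions; containsTouch: is the helper from the question):
-(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    if (![self containsTouch:touch]) return NO;      // ignore touches outside the sprite
    startTouchPoint = [self convertTouchToNodeSpace:touch];
    return YES;   // claiming the touch is what makes ccTouchMoved/ccTouchEnded fire
}
-(void) ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint endPoint = [self convertTouchToNodeSpace:touch];
    if (startTouchPoint.y - endPoint.y > 20.0f) {
        // the sprite was dragged downward past the threshold
    }
}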
CCTouchableSprite - my touchable subclass of CCSprite with Objective-C blocks; you can use touchMoved to detect what you want.
I have a CCMenu with CCMenuItems that I add on a CCLayer. When I tap the CCMenuItems, my ccTouchesBegan doesn't fire.
How can I get this method called even when I touch my menu items?
CCMenu registers itself as a targeted touch delegate and swallows touches on menu items. You can try creating your own subclass of CCMenu and overriding its registerWithTouchDispatcher method like this:
-(void) registerWithTouchDispatcher
{
    [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:kCCMenuTouchPriority swallowsTouches:NO];
}
This should work the way you want, but it may cause other problems with the menu's behavior.
I am using CCPanZoomController to make my 'map' (one image) zoomable and pannable.
On this map I would like to have clickable/touchable sprites which, when tapped, change the image in the sprite.
The problem is that when the user pinches the screen (to zoom in/out), they may touch the sprite, which changes its image, and that is something I don't want.
I had an idea to solve this, but as I'm new to Cocos2d I don't know how to implement it:
I thought that I could detect when the user touches the screen/the sprite and doesn't move their touch (as they would to pinch or pan): take the coordinate where the touch first lands, take the coordinate where the touch ends, compare the two, and if there is no change (or very little change) then change the image of the sprite.
How would I go about doing this? Big thanks to anyone who can help!
I've been working with CCPanZoomController in my own game and ran into similar issues in several forms: when the player touched a sprite I didn't want the background to move with it, and I didn't want the sprite to move while the background was zooming. So I wrote methods to "turn off" touches for the layer that should not react, and to re-enable them once the action in the other layer was done.
I created the following method inside each layer to enable or disable touches for it, and I call it from the various touch events.
// Public Method: Allows for disabling touch for this layer and re-enabling it
-(void)enableTouches:(BOOL)enable
{
    // Check whether the flag asks to enable or disable touches
    if (enable) {
        // Remove all stored touch locations from the CCLayerPanZoom instance
        [_panZoomLayer removeTouchesFromArray];
        // Add the CCLayerPanZoom back to the touch dispatcher as a standard delegate
        [[CCTouchDispatcher sharedDispatcher] addStandardDelegate:_panZoomLayer priority:0];
        CCLOG(@"PanZoomWrapperLayer:enableTouches - LayerPanZoom touches enabled");
    } else {
        // Remove the CCLayerPanZoom from the touch dispatcher to disable its touches
        [[CCTouchDispatcher sharedDispatcher] removeDelegate:_panZoomLayer];
        CCLOG(@"PanZoomWrapperLayer:enableTouches - LayerPanZoom touches disabled");
    }
}
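A hypothetical way to call it from the sprite's touch handlers (the wrapperLayer reference is an assumption, not part of the original code):
-(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    // Keep the pan/zoom layer from reacting while this sprite handles the touch
    [wrapperLayer enableTouches:NO];
    return YES;
}
-(void) ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    // Hand control back to the pan/zoom layer once the sprite is done
    [wrapperLayer enableTouches:YES];
}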
I've found a simple solution to this problem. However it may not suit your needs!
I've subclassed the CCMenu class and overridden the -(void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event as follows:
-(void) ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event
{
    [_selectedItem unselected];
    _selectedItem = nil;
}
I've set the touchSwallow property of the instance of that new menu to NO.
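Another route, closer to the idea in the original question, is to record where the touch began and only treat it as a tap if it ended in nearly the same place. A rough sketch (the touchStart ivar and the 10-point threshold are assumptions):
-(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    touchStart = [[CCDirector sharedDirector] convertToGL:[touch locationInView:touch.view]];
    return YES;
}
-(void) ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    CGPoint touchEnd = [[CCDirector sharedDirector] convertToGL:[touch locationInView:touch.view]];
    // A pinch or pan moves the finger; a tap barely moves at all
    if (ccpDistance(touchStart, touchEnd) < 10.0f) {
        // change the sprite's image here
    }
}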
I am using pushScene and popScene in my cocos2d-iphone game (1.0.1).
So I have this:
Scene1, Scene2 (Scene2 being the pushed scene).
Now I use popScene, so I have
Scene1.
Is there a way to run a method in Scene1 when it is recovered by popScene? I mean, I want Scene1 to realize that it is back in action. I tried putting something in the onEnter method, but it really didn't work (either the screen was black or the touches didn't work).
Alright, fixed.
You can indeed use the onEnter method, but you have to call [super onEnter] first.
I was doing this:
-(void)onEnter {
    [self doSomething];
}
But it should have been like this:
-(void)onEnter {
    [super onEnter];
    [self doSomething];
}