I just started working with the Graph API and have login working, but not Graph API calls. I have set the delegate (I think), but the delegate methods I implemented are never called back. Here is my code:
#import <UIKit/UIKit.h>
#import "Profile.h"
#import "AppDelegate.h"
#import "FBRequest.h"
@interface HomeViewController : UITableViewController <FBRequestDelegate>
{
Profile *profile;
AppDelegate *appDelegate;
}
- (void)request:(FBRequest *)request didLoad:(id)result;
@end
implementation:
- (void)viewDidLoad
{
[super viewDidLoad];
profile = [[Profile alloc] initWithUserId:1];
[profile refreshMatchesWithCallback:^
{
[self.tableView reloadData];
}];
[appDelegate.facebook requestWithGraphPath:[NSString stringWithFormat:@"me", appDelegate.facebook.accessToken] andDelegate:self];
[[appDelegate facebook] requestWithGraphPath:@"me" andDelegate:self];
// Uncomment the following line to preserve selection between presentations.
// self.clearsSelectionOnViewWillAppear = NO;
// Uncomment the following line to display an Edit button in the navigation bar for this view controller.
// self.navigationItem.rightBarButtonItem = self.editButtonItem;
}
- (void)request:(FBRequest *)request didLoad:(id)result {
NSLog(#"%#", result);
}
- (void)request:(FBRequest *)request didFailWithError:(NSError *)error {
NSLog(#"erre");
}
From what I can see, it looks like you haven't initialized your app delegate.
Something along the lines of:
appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];
Your appDelegate variable has never been initialized, so it is nil, and Objective-C silently ignores messages sent to nil; that is why the requests never go out and the delegate methods are never called.
Note: I'm basing this on the assumption that you haven't initialized the app delegate anywhere else.
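For example, you could assign it at the top of viewDidLoad before the Facebook calls; this is a minimal sketch based on the code you posted, with the rest of viewDidLoad unchanged:
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Grab the shared app delegate first, so appDelegate is no longer nil
    appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];

    // Now the Graph API request has a real receiver and the delegate callbacks can fire
    [[appDelegate facebook] requestWithGraphPath:@"me" andDelegate:self];
}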
I have a Qt application in which I call UIImagePickerController to get the file path of a movie the user selected. I have static functions in an Objective-C++ wrapper that call into a .m file, and when the user selects a movie I can read the selected path in another .m function. The question is: how do I get this filePath into an Objective-C++ class where I can call my Qt classes (such as my singleton) and pass the result to the proper class?
CL_image_call.mm (Objective-C++):
#include "cl_image_call.h"
#include "cl_image_func.h"
myCLImageClass* CL_image_obj=NULL;
int CL_ImageCCall::CL_objectiveC_Call() {
// Calling into Objective-C code
if( CL_image_obj==NULL ) {
// Allocate the Objective-C object the first time this is called
CL_image_obj=[[myCLImageClass alloc]init];
}
return 1;
}
void CL_ImageCCall::CL_openMedia() {
[CL_image_obj openMedia];
}
CL_image_call.h
#include <QImage>
#include <QString>
#include <QDebug>
#include <stdio.h>
#include "my_singleton.h"
class CL_ImageCCall
{
public:
static int CL_objectiveC_Call();
static void CL_openMedia();
};
CL_image_func.h
#import <Foundation/Foundation.h>
#import <CoreLocation/CoreLocation.h>
#import <CoreGraphics/CoreGraphics.h>
#import <UIKit/UIKit.h>
#import <MobileCoreServices/MobileCoreServices.h>
@interface myCLImageClass : NSObject <UIImagePickerControllerDelegate, UINavigationControllerDelegate>
-(void)openMedia;
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info;
@end
CL_image_func.m
#import "cl_image_func.h"
@interface QIOSViewController : UIViewController
@end
@implementation myCLImageClass
UIImagePickerController *picker=NULL;
UIViewController *viewController=NULL;
NSString *f_path;
-(void)openMedia {
UIWindow *keyWindow = [[UIApplication sharedApplication] keyWindow];
viewController = (UIViewController *)([keyWindow rootViewController]);
if (!viewController)
return;
if([UIImagePickerController isSourceTypeAvailable:
UIImagePickerControllerSourceTypePhotoLibrary]) {
picker= [[UIImagePickerController alloc] init];
picker.delegate = self;
picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
picker.mediaTypes = [NSArray arrayWithObjects:(NSString *)kUTTypeMovie, nil];
[viewController presentViewController:picker animated:YES completion:NULL];
}
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
NSString *docDirPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *filePath = [docDirPath stringByAppendingPathComponent:@"movie.mov"];
NSLog (@"File Path = %@", filePath);
//filePath is what I need to pass to the rest of my app
[viewController dismissViewControllerAnimated:YES completion:nil];
}
So in CL_image_call.h I include "my_singleton", through which I can reach my whole app, but how do I get filePath into that class?
Best Regards
Marek
Here is the code to export media:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
NSString *docDirPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *filePath = [docDirPath stringByAppendingPathComponent:@"movie.mov"];
NSLog (@"File Path = %@", filePath);
//filePath is what I need to pass to the rest of my app
NSURL *inputURL = [info objectForKey:UIImagePickerControllerMediaURL];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
// Configure the export session; adjust the preset and output file type to whatever you need
AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:asset
presetName:AVAssetExportPresetMediumQuality];
session.outputURL = [NSURL fileURLWithPath:filePath];
session.outputFileType = AVFileTypeMPEG4;
session.shouldOptimizeForNetworkUse = YES;
[session exportAsynchronouslyWithCompletionHandler:^(void)
{
// do a completion handler
}];
[viewController dismissViewControllerAnimated:YES completion:nil];
}
Then you can access the exported file at filePath and use it wherever you need it.
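Since the export runs asynchronously, it is also worth checking the session status inside the completion handler before touching the file. A minimal sketch using the standard AVAssetExportSession status API (not part of the original snippet):
[session exportAsynchronouslyWithCompletionHandler:^(void)
{
    if (session.status == AVAssetExportSessionStatusCompleted) {
        // The file at filePath is complete and safe to hand off
        NSLog(@"export finished: %@", filePath);
    } else {
        NSLog(@"export failed: %@", session.error);
    }
}];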
I have modified the code with your suggestion and it does what it should. I set moviePath inside my Qt classes and then call the Objective-C code with this path:
NSString *f_path;
// openMedia now receives the movie path chosen on the Qt side
-(void)openMedia:(NSString*)moviePath {
UIWindow *keyWindow = [[UIApplication sharedApplication] keyWindow];
f_path=moviePath.copy;
viewController = (UIViewController *)([keyWindow rootViewController]);
if (!viewController)
return;
if([UIImagePickerController isSourceTypeAvailable:
UIImagePickerControllerSourceTypePhotoLibrary]) {
picker= [[UIImagePickerController alloc] init];
picker.delegate = self;
picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
picker.mediaTypes = [NSArray arrayWithObjects:(NSString *)kUTTypeMovie, nil];
[viewController presentViewController:picker animated:YES completion:NULL];
}
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
NSURL *inputURL = [info objectForKey:UIImagePickerControllerMediaURL];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
// Configure the export session; adjust the preset and output file type to whatever you need
AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:asset
presetName:AVAssetExportPresetMediumQuality];
session.outputURL = [NSURL fileURLWithPath:f_path];
session.outputFileType = AVFileTypeMPEG4;
session.shouldOptimizeForNetworkUse = YES;
[session exportAsynchronouslyWithCompletionHandler:^(void)
{
// do a completion handler
NSLog (#"export complete = %#", f_path);
}];
[viewController dismissViewControllerAnimated:YES completion:nil];
}
With a small trick I can view this movie inside the Qt app. One thing is still missing:
I need to know when the export session has completed, so I can refresh the file path for the MediaPlayer. How can I notify my base class CL_ImageCCall about that, or perhaps another Objective-C++ class where I can mix Qt and Objective-C?
Best Regards and thanks for your help
Marek
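One way to do this is to let the Qt side register a C++ callback on CL_ImageCCall and invoke it from the export completion handler. This is only a sketch under assumptions about your project: the std::function member and the onExportFinished name are hypothetical additions, and CL_image_func.m would need to be renamed to CL_image_func.mm so it compiles as Objective-C++ and can see the C++ class.
// CL_image_call.h (hypothetical additions)
#include <functional>
#include <string>
class CL_ImageCCall
{
public:
    static int CL_objectiveC_Call();
    static void CL_openMedia();
    // The Qt side registers a callback that receives the exported file path
    static std::function<void(std::string)> onExportFinished;
};

// CL_image_call.mm (definition of the static member)
std::function<void(std::string)> CL_ImageCCall::onExportFinished;

// CL_image_func.mm, inside exportAsynchronouslyWithCompletionHandler:
[session exportAsynchronouslyWithCompletionHandler:^(void)
{
    if (session.status == AVAssetExportSessionStatusCompleted && CL_ImageCCall::onExportFinished) {
        dispatch_async(dispatch_get_main_queue(), ^{
            // Hand the finished path back to the Qt side on the main thread
            CL_ImageCCall::onExportFinished(std::string([f_path UTF8String]));
        });
    }
}];
On the Qt side you would assign a lambda to CL_ImageCCall::onExportFinished before opening the picker, and refresh the MediaPlayer's file path inside it.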
I've got a ViewController that contains a tableView and an array of items to be displayed in that table.
To change the table contents, I can initialise the array and call reloadData from within the ViewController's button callback (see the buttonTap method below).
However, I also need to change the table contents from external code outside the scope of the ViewController, and from there it's inaccessible.
In the minimal example below I use an external C/C++ based thread that attempts to access a ViewController method, but compilation fails with: No known class method for selector 'changeArr'.
Any idea how to make the viewController publicly accessible?
viewController.m
#import "ViewController.h"
@interface ViewController() <NSTableViewDelegate, NSTableViewDataSource>
@property (weak) IBOutlet NSTableView *tableView;
@property NSMutableArray<NSString*> * arr;
@end
@implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
self.tableView.delegate = self;
self.tableView.dataSource = self;
[self.tableView reloadData];
}
- (void)viewWillAppear {
[super viewWillAppear];
[self.tableView reloadData];
}
- (NSInteger)numberOfRowsInTableView:(NSTableView *)tableView {
return self.arr.count;
}
- (NSView *)tableView:(NSTableView *)tableView viewForTableColumn:(NSTableColumn *)tableColumn row:(NSInteger)row {
NSTableCellView* cellView =(NSTableCellView*)[tableView makeViewWithIdentifier:@"cell1" owner:nil];
cellView.textField.stringValue = self.arr[row];
return cellView;
}
-(void)changeArr {
self.arr = [NSMutableArray arrayWithObjects:#"ccc", #"ddd", nil];
[self.tableView reloadData];
NSLog(#"fwefwe");
}
- (IBAction)buttonTap:(id)sender {
self.arr = [NSMutableArray arrayWithObjects:#"AAAA", #"BBB", nil];
[self.tableView reloadData];
}
main.m
#import <Cocoa/Cocoa.h>
#include <pthread.h>
#import "ViewController.h"
void updateTable() {
sleep(10);
dispatch_sync(dispatch_get_main_queue(), ^{ [ViewController changeArr]; }); // ERROR : No known class method for selector 'changeArr'
}
int main(int argc, const char * argv[]) {
@autoreleasepool {
// Setup code that might create autoreleased objects goes here.
}
pthread_t threadHandler;
pthread_create(&threadHandler, NULL, &updateTable, NULL);
return NSApplicationMain(argc, argv);
}
changeArr is an instance method (-), so you can't call [ViewController changeArr] on the class itself.
If you change changeArr to a class method (+), it can no longer access the instance's internal array.
So I think the easiest way to do this is to expose a single instance of ViewController through a singleton.
...
@implementation ViewController
+ (id) sharedViewController {
static ViewController *sharedInstance = nil;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
// This is a Cocoa (AppKit) app, so use NSStoryboard rather than UIStoryboard
NSStoryboard *storyboard = [NSStoryboard storyboardWithName:@"MainStoryboard" bundle:nil];
sharedInstance = [storyboard instantiateControllerWithIdentifier:@"MyViewControllerIdentifier"];
});
return sharedInstance ;
}
- (void)viewDidLoad {
[super viewDidLoad];
...
and in updateTable:
dispatch_sync(dispatch_get_main_queue(), ^{ [[ViewController sharedViewController] changeArr]; });
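One caveat with instantiating from the storyboard inside sharedViewController is that it creates a second ViewController rather than returning the one that is already on screen. An alternative sketch (a suggestion, not part of the answer above) is to have the storyboard-loaded instance register itself instead of creating a new one:
static ViewController *sharedInstance = nil;

+ (instancetype)sharedViewController {
    return sharedInstance;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    sharedInstance = self;   // remember the instance the storyboard actually loaded
    self.tableView.delegate = self;
    self.tableView.dataSource = self;
    [self.tableView reloadData];
}
With that in place, [[ViewController sharedViewController] changeArr] dispatched to the main queue updates the table that is actually visible; just guard against sharedViewController still being nil if the background thread fires before the view has loaded.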
A simple example of CvVideoCamera usage doesn't work as expected: the image modified in the delegate's processImage is displayed with no changes at all.
ViewController.h:
#import <UIKit/UIKit.h>
#import <opencv2/videoio/cap_ios.h>
@interface ViewController : UIViewController
@property (weak, nonatomic) IBOutlet UIButton *btnStart;
@property (weak, nonatomic) IBOutlet UIImageView *myimg;
@end
ViewController.mm:
#import "ViewController.h"
#import <opencv2/videoio/cap_ios.h>
@interface ViewController ()<CvVideoCameraDelegate>
@property (strong, nonatomic) CvVideoCamera *videoCamera;
@end
@implementation ViewController
- (void)viewDidLoad {
[super viewDidLoad];
self.videoCamera = [[CvVideoCamera alloc] initWithParentView:self.myimg];
[self.videoCamera setDelegate:self];
self.videoCamera.defaultAVCaptureDevicePosition = AVCaptureDevicePositionBack;
self.videoCamera.defaultAVCaptureSessionPreset = AVCaptureSessionPresetHigh;
self.videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortrait;
self.videoCamera.defaultFPS = 30;
self.videoCamera.useAVCaptureVideoPreviewLayer = YES;
}
- (IBAction)actionStart:(id)sender {
[self.videoCamera start];
}
- (void)processImage:(cv::Mat &)image;
{
cv::Mat image_copy;
cvtColor(image, image_copy, CV_BGRA2BGR);
bitwise_not(image_copy, image_copy);
cvtColor(image_copy, image, CV_BGR2BGRA);
}
Once I press the button, the normal camera stream is rendered with no trace of any intervention. I also tried drawing shapes and even calling image.release() inside processImage, with no effect.
I assume I'm missing something obvious here.
This line was the root of the issue:
self.videoCamera.useAVCaptureVideoPreviewLayer = YES;
With that flag set, the camera shows the raw AVCaptureVideoPreviewLayer directly, bypassing the frames modified in processImage. Once the line is removed, everything works as expected.
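For clarity, the working setup is the viewDidLoad from the question with the preview-layer line dropped:
- (void)viewDidLoad {
    [super viewDidLoad];
    self.videoCamera = [[CvVideoCamera alloc] initWithParentView:self.myimg];
    [self.videoCamera setDelegate:self];
    self.videoCamera.defaultAVCaptureDevicePosition = AVCaptureDevicePositionBack;
    self.videoCamera.defaultAVCaptureSessionPreset = AVCaptureSessionPresetHigh;
    self.videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortrait;
    self.videoCamera.defaultFPS = 30;
    // useAVCaptureVideoPreviewLayer is left at its default, so the frames modified in processImage are what gets drawn
}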
I have already implemented AdMob banners in my app, and now I would like to implement interstitial ads. The code is written in Objective-C++. Here is the code I have for the banners:
#import "MyGameBridge.h"
#import "AppController.h"
#import "GameConfig.h"
#include "GADInterstitial.h"
void MyGameBridge::showBanner()
{
AppController* delegate = (AppController*)[UIApplication sharedApplication].delegate;
[delegate openAdmobBannerAds];
}
void MyGameBridge::showAds()
{
AppController* delegate = (AppController*)[UIApplication sharedApplication].delegate;
[delegate initiAdBanner];
}
void MyGameBridge::hideAds()
{
AppController* delegate = (AppController*)[UIApplication sharedApplication].delegate;
[delegate hideBanner];
}
What do I need to code to implement Interstitial ads?
Define this in your .h file:
GADInterstitial *mInterstitial_;
In the .m file:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
...
mInterstitial_ = [[GADInterstitial alloc] init];
mInterstitial_.adUnitID = ADMOB_FULL_SCREEM_ID;
[mInterstitial_ loadRequest:[GADRequest request]];
}
// Call this method to show the AdMob interstitial and preload the next one
-(void)showAdmobAdsFullScreen
{
[mInterstitial_ presentFromRootViewController:self.viewController];
mInterstitial_ = nil;
mInterstitial_ = [[GADInterstitial alloc] init];
mInterstitial_.adUnitID = ADMOB_FULL_SCREEM_ID;
[mInterstitial_ loadRequest:[GADRequest request]];
}
I know my question might be stupid, but I've searched and can't find out why this isn't working.
I created a CCLayer subclass BackgroundLayer with the implementation below:
#import "BackgroundLayer.h"
@implementation BackgroundLayer
- (id)init {
if (self != nil) {
CCSprite *background = [CCSprite spriteWithFile:@"menu.png"];
background.anchorPoint = ccp(0, 0);
[self addChild:background z:-1];
NSLog(#"test");
}
return self;
}
@end
and I want to add it to the main menu scene, so I have:
#import "MainMenuScene.h"
#import "BackgroundLayer.h"
@implementation MainMenuScene
+ (id)scene {
CCScene *scene = [CCScene node];
BackgroundLayer *backgroundLayer = [BackgroundLayer node];
[scene addChild:backgroundLayer];
return scene;
}
- (id)init {
self = [super init];
if (self != nil) {
}
return self;
}
@end
My problem is that the NSLog test appears but the background doesn't load. If I add the background in the init method of MainMenuScene instead, it works... Shouldn't the layer work the way I've written it?
Not sure if it's related, but you forgot self = [super init]; in BackgroundLayer, so CCLayer never gets a chance to set itself up (the if (self != nil) check still passes because self is the freshly allocated object, but the superclass initialization never runs).
Also try commenting out the anchor point line to see if the image shows up then.
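For reference, here is the init from the question with the missing super call added (everything else unchanged):
- (id)init {
    self = [super init];   // let CCLayer perform its own setup first
    if (self != nil) {
        CCSprite *background = [CCSprite spriteWithFile:@"menu.png"];
        background.anchorPoint = ccp(0, 0);
        [self addChild:background z:-1];
    }
    return self;
}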