AVAudioUnit (OS X) render block only called for certain sample rates

I'm having trouble getting AVAudioEngine (OS X) to play nice with all sample rates.
Here's my code for building the connections:
- (void)makeAudioConnections {
    auto hardwareFormat = [self.audioEngine.outputNode outputFormatForBus:0];
    auto format = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:hardwareFormat.sampleRate channels:2];
    NSLog(@"format: %@", format);
    @try {
        [self.audioEngine connect:self.avNode to:self.audioEngine.mainMixerNode format:format];
        [self.audioEngine connect:self.audioEngine.inputNode to:self.avNode format:format];
    } @catch (NSException *e) {
        NSLog(@"exception: %@", e);
    }
}
On my audio interface, the render callback is called at 44.1, 48, and 176.4 kHz, but not at 96 or 192 kHz. On the built-in audio, the callback is called at 44.1, 48, and 88 kHz, but not at 96 kHz.
My AU's allocateRenderResourcesAndReturnError is being called for 96kHz. No errors are returned.
- (BOOL)allocateRenderResourcesAndReturnError:(NSError * _Nullable *)outError {
    if (![super allocateRenderResourcesAndReturnError:outError]) {
        return NO;
    }
    _inputBus.allocateRenderResources(self.maximumFramesToRender);
    _sampleRate = _inputBus.bus.format.sampleRate;
    return YES;
}
Here's my AU's init method, which is mostly just cut & paste from Apple's AUv3 demo:
- (instancetype)initWithComponentDescription:(AudioComponentDescription)componentDescription options:(AudioComponentInstantiationOptions)options error:(NSError **)outError {
    self = [super initWithComponentDescription:componentDescription options:options error:outError];
    if (self == nil) {
        return nil;
    }

    // Initialize a default format for the busses.
    AVAudioFormat *defaultFormat = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100. channels:2];

    // Create the input and output busses.
    _inputBus.init(defaultFormat, 8);
    _outputBus = [[AUAudioUnitBus alloc] initWithFormat:defaultFormat error:nil];

    // Create the input and output bus arrays.
    _inputBusArray = [[AUAudioUnitBusArray alloc] initWithAudioUnit:self busType:AUAudioUnitBusTypeInput busses:@[_inputBus.bus]];
    _outputBusArray = [[AUAudioUnitBusArray alloc] initWithAudioUnit:self busType:AUAudioUnitBusTypeOutput busses:@[_outputBus]];

    self.maximumFramesToRender = 256;

    return self;
}
To keep things simple, I'm setting the sample rate before starting the app.
I'm not sure where to begin tracking this down.
Update
Here's a small project which reproduces the issue I'm having:
Xcode project to reproduce issue
You'll get errors pulling from the input at certain sample rates.
On my built-in audio running at 96kHz the render block is called with alternating 511 and 513 frame counts and errors -10863 (kAudioUnitErr_CannotDoInCurrentContext) and -10874 (kAudioUnitErr_TooManyFramesToProcess) respectively. Increasing maximumFramesToRender doesn't seem to help.
Update 2
I simplified my test down to just connecting the input to the main mixer:
[self.audioEngine connect:self.audioEngine.inputNode to:self.audioEngine.mainMixerNode format:nil];
I also tried explicitly setting the format argument; it made no difference. This still will not play through at 96 kHz, so I'm thinking this may be a bug in AVAudioEngine.

For play-through with AVAudioEngine, the input and output hardware formats and all the connection formats must be at the same sample rate. So the following should work.
AVAudioFormat *outputHWFormat = [self.audioEngine.outputNode outputFormatForBus:0];
AVAudioFormat *inputHWFormat = [self.audioEngine.inputNode inputFormatForBus:0];

if (inputHWFormat.sampleRate == outputHWFormat.sampleRate) {
    [self.audioEngine connect:self.audioEngine.inputNode to:self.audioEngine.mainMixerNode format:inputHWFormat];
    [self.audioEngine connect:self.audioEngine.mainMixerNode to:self.audioEngine.outputNode format:inputHWFormat];
}
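If the two hardware formats disagree (for example, the built-in input at 44.1 kHz while an interface runs the output at 96 kHz), one workaround, not covered by the snippet above and only a sketch that assumes the engine's input node is backed by a HAL device and that the device accepts the requested rate, is to set the input device's nominal sample rate through the Core Audio HAL before building the graph:
// Sketch only: force the input device's nominal sample rate to match the
// output before wiring up AVAudioEngine. Minimal error handling.
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>
#import <CoreAudio/CoreAudio.h>

static OSStatus SetInputDeviceSampleRate(AVAudioEngine *engine, Float64 rate) {
    AudioUnit ioUnit = engine.inputNode.audioUnit;   // the HAL-backed I/O unit
    AudioDeviceID deviceID = kAudioObjectUnknown;
    UInt32 size = sizeof(deviceID);

    // Ask the I/O unit which device it is attached to.
    OSStatus err = AudioUnitGetProperty(ioUnit,
                                        kAudioOutputUnitProperty_CurrentDevice,
                                        kAudioUnitScope_Global, 0,
                                        &deviceID, &size);
    if (err != noErr) return err;

    // Change the device's nominal sample rate (only works if the device supports that rate).
    AudioObjectPropertyAddress addr = {
        kAudioDevicePropertyNominalSampleRate,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    return AudioObjectSetPropertyData(deviceID, &addr, 0, NULL, sizeof(rate), &rate);
}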

Related

iOS playing audio in iOS 13 throws C++ exception, freezes app

TL;DR:
The code below (all ten lines of it, apart from debugging)
fails (UI freezes) on iOS 13.x (Simulator)
succeeds (audio plays) on 14.x (Simulator and devices)
I don't have any devices with iOS 13.x. But...analytics from live apps suggest it is failing in the field on both iOS 13 and 14 devices. False positives? (See line of code commented with $$$.)
Steps To Reproduce
Create a new SwiftUI project that can run in iOS 13. Replace the text in ContentView.swift with the code below. Add an audio resource named clip.mp3. Build and run.
I am using Xcode 12.4, macOS 11.1, Swift 5.
See Also
Apple Dev Forum 1   // Unresolved
Apple Dev Forum 2   // Attributed to beta iOS/Xcode
Stackoverflow 1   // Unresolved
Stackoverflow 2   // Refers to next link
Apple Dev Forum 3   // Claims fixed in Xcode 12b5
[...and many more...]
Code
import SwiftUI
import AVKit

struct ContentView: View {
    var body: some View {
        Text("Boo!").onAppear { playClip() }
    }
}

var clipDelegate: AudioTimerDelegate!  // Hold onto it to forestall GC.
var player: AVAudioPlayer!             // Ditto.

func playClip() {
    let u = Bundle.main.url(forResource: "clip", withExtension: "mp3")!
    player = try! AVAudioPlayer(contentsOf: u)
    clipDelegate = AudioTimerDelegate()  // Wait till now to instantiate, for correct timing.
    player.delegate = clipDelegate
    player.prepareToPlay()
    NSLog("*** Starting clip play")  // NSLog so we get timestamp.
    player.play()

    // Wait 5 seconds and see if audioPlayerDidFinishPlaying.
    DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
        if let d = clipDelegate.clipDuration {
            NSLog("*** Caller clip duration = \(d)")
        } else {
            NSLog("!!! Caller found nil clip duration")
            // $$$ In live app, post audio-freeze event to analytics.
        }
    }
}

class AudioTimerDelegate: NSObject, AVAudioPlayerDelegate {
    private var startTime: Double
    var clipDuration: Double?

    override init() {
        self.startTime = CFAbsoluteTimeGetCurrent()
        super.init()
    }

    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        clipDuration = CFAbsoluteTimeGetCurrent() - startTime
        NSLog("*** Delegate clip duration = \(clipDuration!)")
    }
}
Console Output
Simulator iOS 14.4
The audio plays and the Console (edited for brevity) reads:
14:33:17 [plugin] AddInstanceForFactory: No factory registered for ... F8BB1C28-...
14:33:17 *** Starting clip play
14:33:19 *** Delegate clip duration = 1.692...
14:33:22 *** Caller clip duration = 1.692...
I gather that the first line is innocuous and related to the Simulator's sound drivers (see: Is anyone else getting this console message with AVAudioPlayer in Xcode 11 (and 11.1)?).
Device 14.4
Results are the same, without the AddInstanceForFactory complaint.
Simulator 13.6
Audio never sounds, the delegate callback never runs, and in the Console I get:
14:30:10 [plugin] AddInstanceForFactory: No factory registered for ... F8BB1C28-...
14:30:11 HALB_IOBufferManager_Client::GetIOBuffer: the stream index is out of range
14:30:11 HALB_IOBufferManager_Client::GetIOBuffer: the stream index is out of range
14:30:11 [aqme] AQME.h:254:IOProcFailure: AQDefaultDevice (1): output stream 0: null buffer
14:30:11 [aqme] AQMEIO_HAL.cpp:1774:IOProc: EXCEPTION thrown (-50): error != 0
14:30:26 [aqme] AQMEIO.cpp:179:AwaitIOCycle: timed out after 15.000s (0 1); suspension count=0 (IOSuspensions: )
14:30:26 CA_UISoundClient.cpp:241:StartPlaying_block_invoke: CA_UISoundClientBase::StartPlaying: AddRunningClient failed (status = -66681).
14:30:26 *** Starting clip play
14:30:26 HALB_IOBufferManager_Client::GetIOBuffer: the stream index is out of range
14:30:26 HALB_IOBufferManager_Client::GetIOBuffer: the stream index is out of range
14:30:26 [aqme] AQME.h:254:IOProcFailure: AQDefaultDevice (1): output stream 0: null buffer
14:30:26 [aqme] AQMEIO_HAL.cpp:1774:IOProc: EXCEPTION thrown (-50): error != 0
14:30:41 [aqme] AQMEIO.cpp:179:AwaitIOCycle: timed out after 15.000s (1 2); suspension count=0 (IOSuspensions: )
14:30:46 !!! Caller found nil clip duration
Remarks
It seems that there are two fifteen-second delays going on in the failure case (the two AwaitIOCycle timeouts: from 14:30:11 to 14:30:26, and from 14:30:26 to 14:30:41).

Calling a Web Service (containing multiple pages) does not load all the pages (without an added sleep delay)

My question is about a strange behaviour I notice both on my iPhone device and in the Codename One simulator (NetBeans).
I invoke the code below, which calls a Google web service to provide a list of food places around a GPS coordinate.
The web service that is called is as follows (KEY OBSCURED):
https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=40.714353,-74.00597299999998&radius=200&types=food&key=XXXXXXXXXXXXXXXXXXXXXXX
Each result contains the next page token and thus, the second call (for the subsequent page) is as follows:
https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=40.714353,-74.00597299999998&radius=200&types=food&key=XXXXXXXXXXXXXXXXXXXXXXX&pagetoken=YYYYYYYYYYYYYYYYYY
public static byte[] getWSResponseData(String urlString, boolean usePost)
{
    ConnectionRequest r = new ConnectionRequest();
    r.setUrl(urlString);
    r.setPost(usePost);

    InfiniteProgress prog = new InfiniteProgress();
    Dialog dlg = prog.showInifiniteBlocking();
    r.setDisposeOnCompletion(dlg);

    NetworkManager.getInstance().addToQueueAndWait(r);

    try
    {
        Thread.sleep(2000);
    }
    catch (InterruptedException ex)
    {
    }

    byte[] responseData = r.getResponseData();
    return responseData;
}
public static void getLocationsList(double lat, double lng)
{
    boolean done = false;

    while (!done)
    {
        byte[] responseData = getWSResponseData(finalURL, false);
        result = Result.fromContent(parser.parseJSON(new InputStreamReader(new ByteArrayInputStream(responseData))));
        String venueNames[] = result.getAsStringArray("/results/name");
        nextToken = result.getAsString("/next_page_token");

        if (nextToken == null || nextToken.equals(""))
            done = true;
        else
            finalURL = completeURL + "&pagetoken=" + nextToken;
    }
    .....
}
This code works fine with the sleep in place, but when I remove the Thread.sleep, only the first page is retrieved.
Any help would be appreciated.
Using the debugger does not help, as this is a timing issue and the problem does not occur when running under the debugger.
Also, when I put some print statements into the code:
while (!done)
{
    String nextToken = null;
    System.out.println(finalURL);
    ...
}
System.out.println("Total Number of entries returned: " + itemCount);
I get the following output:
First Run (WITHOUT SLEEP):
https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=40.714353,-74.00597299999998&radius=200&types=food&key=XXXXXXXX
https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=40.714353,-74.00597299999998&radius=200&types=food&key=XXXXXXXX&pagetoken=CqQCF...
Total Number of entries returned: 20
Using the network monitor I see that the response to the second WS call returns:
{
    "html_attributions" : [],
    "results" : [],
    "status" : "INVALID_REQUEST"
}
This is strange, as when I cut and paste the WS URL into my browser, it works fine...
Second Run (WITH SLEEP):
https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=40.714353,-74.00597299999998&radius=200&types=food&key=XXXXXXXXX
https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=40.714353,-74.00597299999998&radius=200&types=food&key=XXXXXXXXX&pagetoken=CqQCFQEAA...
https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=40.714353,-74.00597299999998&radius=200&types=food&key=XXXXXXXXX&pagetoken=CsQDtQEAA...
Total Number of entries returned: 60
Well, it seems to be a Google API issue, as indicated here:
Paging on Google Places API returns status INVALID_REQUEST
I still could not get it to work by adding a random parameter to the WS URL as suggested there, but I will keep trying and will post here if I get it working. For now I will just keep a 2-second delay between the calls, which seems to work.
In the end I gave up on the Google WS for this and switched to Yelp, which works very well:
https://api.yelp.com/v3/businesses/search?.....

Upgrade to Yosemite has negative impact on objectForKey mp3 image

Just updated to Yosemite and Xcode 6.0.1, and this code no longer works:
- (NSImage *)songImage {
    if (!_songImage) {
        AVAsset *asset = [AVAsset assetWithURL:self.fileURL];
        for (AVMetadataItem *metadataItem in asset.commonMetadata) {
            if ([metadataItem.commonKey isEqualToString:@"artwork"]) {
                NSDictionary *imageDataDictionary = (NSDictionary *)metadataItem.value;
                NSData *imageData = [imageDataDictionary objectForKey:@"data"];
                _songImage = [[NSImage alloc] initWithData:imageData];
            }
        }
    }
    if (!_songImage) {
        return nil;
    }
    return _songImage;
}
I'm not sure if this behaviour was replaced or removed, but I now get this message:
2014-10-17 14:36:23.756 FSC Music[3317:122917] -[__NSCFData objectForKey:]: unrecognized selector sent to instance 0x600000241cb0
2014-10-17 14:36:23.764 FSC Music[3317:122917] -[__NSCFData objectForKey:]: unrecognized selector sent to instance 0x600000241cb0
I need to research a solution, but wanted to ask if anyone else has come across this.
I changed to the following code to get it working again (under Yosemite the artwork item's value is the image NSData itself, rather than a dictionary containing it):
- (NSImage *)songImage {
    if (!_songImage) {
        AVAsset *asset = [AVAsset assetWithURL:self.fileURL];
        NSArray *metadata = [asset commonMetadata];
        for (AVMetadataItem *item in metadata) {
            if ([item.commonKey isEqualToString:@"artwork"]) {
                NSData *thePix = (NSData *)item.value;
                _songImage = [[NSImage alloc] initWithData:thePix];
            }
        }
    }
    if (!_songImage) {
        return nil;
    }
    return _songImage;
}
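If the same property has to work on systems both before and after this change, a defensive variant (just a sketch of my own; the @"data" key is the pre-Yosemite dictionary key from the original code) can check the type of the metadata value:
// Sketch: accept both the old shape (an NSDictionary wrapping the bytes under
// @"data") and the new shape (a plain NSData) of the artwork metadata value.
- (NSImage *)imageFromArtworkItem:(AVMetadataItem *)item {
    id value = item.value;
    NSData *imageData = nil;

    if ([value isKindOfClass:[NSData class]]) {
        imageData = (NSData *)value;                   // Yosemite and later
    } else if ([value isKindOfClass:[NSDictionary class]]) {
        imageData = ((NSDictionary *)value)[@"data"];  // pre-Yosemite
    }

    return imageData ? [[NSImage alloc] initWithData:imageData] : nil;
}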

Synchronize iOS Core Data with web service?

Here is my problem:
I want to use Core Data (because of speed and connectivity issues) to build my iOS app. The data stored in Core Data comes from a SQL Server database, which I can access through a yet-to-be-defined web service.
Any changes to the data stored in Core Data need to be synchronized with the SQL Server via a web service. In addition, I need to buffer changes that don't get synchronized because of connectivity issues.
I also need to update Core Data with any changes that have occurred on the server. This could happen on a schedule set in user preferences.
Solutions I've Explored:
Using the NSIncrementalStore class (new in iOS 5). I'm very confused about what this does exactly, but it sounds promising. From what I can tell, you subclass NSIncrementalStore, which allows you to intercept the regular Core Data API calls (see my rough sketch below). I could then pass the information on to Core Data as well as sync it with the external database via a web service. I could be completely wrong. But assuming I'm right, how would I sync deltas if the connection to the internet is down?
AFIncrementalStore - This is a subclass of NSIncrementalStore that uses AFNetworking to do the web services piece.
RestKit - I'm a little concerned about how active this API is, and it seems to be going through a transition to block-based functionality. Has anyone used this extensively?
I'm leaning towards AFIncrementalStore since this is using (what seems to be) a more standard approach. The problem is, I could be completely off on what NSIncrementalStore really is.
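For what it's worth, the skeleton below is my rough mental model of such a subclass. The override names are the real NSIncrementalStore entry points from the class reference; the store name and the bodies are only placeholders, not working code.
// Rough skeleton of an NSIncrementalStore subclass -- override points only,
// placeholder bodies. This is just my mental model, not working code.
#import <CoreData/CoreData.h>

@interface SyncedWebServiceStore : NSIncrementalStore
@end

@implementation SyncedWebServiceStore

+ (void)initialize {
    // Register the store type so it can be added to a persistent store coordinator.
    [NSPersistentStoreCoordinator registerStoreClass:self forStoreType:@"SyncedWebServiceStore"];
}

- (BOOL)loadMetadata:(NSError **)error {
    // A store must report its type and a UUID before it can be used.
    self.metadata = @{ NSStoreTypeKey : @"SyncedWebServiceStore",
                       NSStoreUUIDKey : [[NSProcessInfo processInfo] globallyUniqueString] };
    return YES;
}

- (id)executeRequest:(NSPersistentStoreRequest *)request
         withContext:(NSManagedObjectContext *)context
               error:(NSError **)error {
    // Fetch and save requests arrive here -- presumably this is where the web
    // service call (or the offline change buffer) would live.
    return @[];
}

- (NSIncrementalStoreNode *)newValuesForObjectWithID:(NSManagedObjectID *)objectID
                                         withContext:(NSManagedObjectContext *)context
                                               error:(NSError **)error {
    // Called when a fault fires and Core Data needs the attribute values.
    return nil;
}

@end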
A link to some sample code or tutorial would be great!
My solution to this was to store two copies of the data set in a CoreData database. One represents the last-known server state and is immutable. The other is edited by the user.
When it is time to sync changes, the app creates a diff between the edited and immutable copies of the data. The app sends the diff to a web service which applies the diff to its own copy of the data. It replies with a full copy of the data set, which the app overwrites onto both of its copies of the data.
The advantages are:
If there is no network connectivity, no changes are lost: the diff is calculated each time the data set needs to be sent, and the immutable copy is only changed on a successful sync.
Only the minimum amount of information that needs to be sent is transmitted.
Multiple people can edit the same data at the same time without locking strategies, with minimal opportunity for data loss via overwrites.
The disadvantages are:
Writing the diffing code is complex.
Writing the merging service is complex.
Unless you are a metaprogramming guru, you'll find that your diff/merge code is brittle and has to change whenever you change your object model.
Here are some of the considerations I had when coming up with the strategy:
If you allow changes to be made offline, checkin/checkout locking won't work (how can you establish a lock with no connection?).
What happens if two people edit the same data at the same time?
What happens if one person edits data on one iOS device when connectionless, switches it off, edits on another device and then turns the original device back on?
Multithreading with CoreData is an entire problem class in itself.
The closest thing I've heard of to out-of-the-box support to do anything remotely like this is the new iCloud/CoreData syncing system in iOS6, which automatically transmits entities from a CoreData database to iCloud when they change. However, that means you have to use iCloud.
EDIT: This is very late, I know, but here's a class that is capable of producing a diff between two NSManagedObject instances.
// SZManagedObjectDiff.h
@interface SZManagedObjectDiff : NSObject
- (NSDictionary *)diffNewObject:(NSManagedObject *)newObject withOldObject:(NSManagedObject *)oldObject;
@end

// SZManagedObjectDiff.m
#import "SZManagedObjectDiff.h"

@implementation SZManagedObjectDiff

- (NSDictionary *)diffNewObject:(NSManagedObject *)newObject withOldObject:(NSManagedObject *)oldObject {
    NSDictionary *attributeDiff = [self diffAttributesOfNewObject:newObject withOldObject:oldObject];
    NSDictionary *relationshipsDiff = [self diffRelationshipsOfNewObject:newObject withOldObject:oldObject];

    NSMutableDictionary *diff = [NSMutableDictionary dictionary];

    if (attributeDiff.count > 0) {
        diff[@"attributes"] = attributeDiff;
    }

    if (relationshipsDiff.count > 0) {
        diff[@"relationships"] = relationshipsDiff;
    }

    if (diff.count > 0) {
        diff[@"entityName"] = newObject ? newObject.entity.name : oldObject.entity.name;

        NSString *idAttributeName = newObject ? newObject.entity.userInfo[@"id"] : oldObject.entity.userInfo[@"id"];

        if (idAttributeName) {
            id itemId = newObject ? [newObject valueForKey:idAttributeName] : [oldObject valueForKey:idAttributeName];

            if (itemId) {
                diff[idAttributeName] = itemId;
            }
        }
    }

    return diff;
}

- (NSDictionary *)diffRelationshipsOfNewObject:(NSManagedObject *)newObject withOldObject:(NSManagedObject *)oldObject {
    NSMutableDictionary *diff = [NSMutableDictionary dictionary];

    NSDictionary *relationships = newObject == nil ? [[oldObject entity] relationshipsByName] : [[newObject entity] relationshipsByName];

    for (NSString *name in relationships) {
        NSRelationshipDescription *relationship = relationships[name];

        if (relationship.deleteRule != NSCascadeDeleteRule) continue;

        SEL selector = NSSelectorFromString(name);

        id newValue = nil;
        id oldValue = nil;

        if (newObject != nil && [newObject respondsToSelector:selector]) newValue = [newObject performSelector:selector];
        if (oldObject != nil && [oldObject respondsToSelector:selector]) oldValue = [oldObject performSelector:selector];

        if (relationship.isToMany) {
            NSArray *changes = [self diffNewSet:newValue withOldSet:oldValue];

            if (changes.count > 0) {
                diff[name] = changes;
            }
        } else {
            NSDictionary *relationshipDiff = [self diffNewObject:newValue withOldObject:oldValue];

            if (relationshipDiff.count > 0) {
                diff[name] = relationshipDiff;
            }
        }
    }

    return diff;
}

- (NSDictionary *)diffAttributesOfNewObject:(NSManagedObject *)newObject withOldObject:(NSManagedObject *)oldObject {
    NSMutableDictionary *diff = [NSMutableDictionary dictionary];

    NSArray *attributeNames = newObject == nil ? [[[oldObject entity] attributesByName] allKeys] : [[[newObject entity] attributesByName] allKeys];

    for (NSString *name in attributeNames) {
        SEL selector = NSSelectorFromString(name);

        id newValue = nil;
        id oldValue = nil;

        if (newObject != nil && [newObject respondsToSelector:selector]) newValue = [newObject performSelector:selector];
        if (oldObject != nil && [oldObject respondsToSelector:selector]) oldValue = [oldObject performSelector:selector];

        newValue = newValue ? newValue : [NSNull null];
        oldValue = oldValue ? oldValue : [NSNull null];

        if (![newValue isEqual:oldValue]) {
            diff[name] = @{ @"new": newValue, @"old": oldValue };
        }
    }

    return diff;
}

- (NSArray *)diffNewSet:(NSSet *)newSet withOldSet:(NSSet *)oldSet {
    NSMutableArray *changes = [NSMutableArray array];

    // Find all items that have been newly created or updated.
    for (NSManagedObject *newItem in newSet) {
        NSString *idAttributeName = newItem.entity.userInfo[@"id"];

        NSAssert(idAttributeName, @"Entities must have an id property set in their user info.");

        id newItemId = [newItem valueForKey:idAttributeName];

        NSManagedObject *oldItem = nil;

        for (NSManagedObject *setItem in oldSet) {
            id setItemId = [setItem valueForKey:idAttributeName];
            if ([setItemId isEqual:newItemId]) {
                oldItem = setItem;
                break;
            }
        }

        NSDictionary *diff = [self diffNewObject:newItem withOldObject:oldItem];

        if (diff.count > 0) {
            [changes addObject:diff];
        }
    }

    // Find all items that have been deleted.
    for (NSManagedObject *oldItem in oldSet) {
        NSString *idAttributeName = oldItem.entity.userInfo[@"id"];

        NSAssert(idAttributeName, @"Entities must have an id property set in their user info.");

        id oldItemId = [oldItem valueForKey:idAttributeName];

        NSManagedObject *newItem = nil;

        for (NSManagedObject *setItem in newSet) {
            id setItemId = [setItem valueForKey:idAttributeName];
            if ([setItemId isEqual:oldItemId]) {
                newItem = setItem;
                break;
            }
        }

        if (!newItem) {
            NSDictionary *diff = [self diffNewObject:newItem withOldObject:oldItem];

            if (diff.count > 0) {
                [changes addObject:diff];
            }
        }
    }

    return changes;
}

@end
There's more information about what it does, how it does it and its limitations/assumptions here:
http://simianzombie.com/?p=2379
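For completeness, here is a minimal usage sketch (my own illustration, not from the post above): it diffs a hypothetical user-edited object against its last-known server copy and serializes the result for the web service, assuming the diff values are JSON-friendly.
// Sketch: build the diff between the user-edited copy of an object and the
// immutable "last known server state" copy, then serialize it for the sync
// layer. Both NSManagedObject parameters are hypothetical objects fetched
// from the two copies of the data set described above.
#import <CoreData/CoreData.h>
#import "SZManagedObjectDiff.h"

static NSData *PayloadForSync(NSManagedObject *editedObject, NSManagedObject *serverObject) {
    SZManagedObjectDiff *differ = [[SZManagedObjectDiff alloc] init];
    NSDictionary *diff = [differ diffNewObject:editedObject withOldObject:serverObject];

    if (diff.count == 0) {
        return nil; // nothing has changed since the last successful sync
    }

    // Assumes the values in the diff are JSON-serializable.
    return [NSJSONSerialization dataWithJSONObject:diff options:0 error:NULL];
}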
Use the Parse platform and its iOS SDK to structure and store your data. It can cache data locally, so you can retrieve it quickly and when there is no connectivity.

MediaEngine audio playback on WinRT

I'm trying to add music to my game that runs on WinRT. The music should be in an encoded format (mp3, ogg, etc.) and should be streamable and be decoded by the hardware (for performance reasons).
I've looked through the samples, and found out that MediaEngine can do something like this (I hope).
However, I'm having problems making it work. I keep getting ComExceptions every time I try to create an IMFByteStream from an IRandomAccessStream via MFCreateMFByteStreamOnStreamEx().
It might be that I'm not handling tasks correctly, since they are a new paradigm for me.
Here's some code (pretty similar to the sample I mentioned before):
void MyMedia::PlayMusic()
{
    try
    {
        StorageFolder^ installedLocation = Windows::ApplicationModel::Package::Current->InstalledLocation;
        Concurrency::task<StorageFile^> m_pickFileTask = Concurrency::task<StorageFile^>(installedLocation->GetFileAsync("music.mp3"), m_tcs.get_token());

        SetURL(StringHelper::toString("music.mp3"));

        auto player = this;

        m_pickFileTask.then([&player](StorageFile^ fileHandle)
        {
            Concurrency::task<IRandomAccessStream^> fOpenStreamTask = Concurrency::task<IRandomAccessStream^>(fileHandle->OpenAsync(Windows::Storage::FileAccessMode::Read));

            fOpenStreamTask.then([&player](IRandomAccessStream^ streamHandle)
            {
                try
                {
                    player->SetBytestream(streamHandle);
                    if (player->m_spMediaEngine)
                    {
                        MEDIA::ThrowIfFailed(
                            player->m_spMediaEngine->Play()
                        );
                    }
                }
                catch (Platform::Exception^)
                {
                    MEDIA::ThrowIfFailed(E_UNEXPECTED);
                }
            });
        });
    }
    catch (Platform::Exception^ ex)
    {
        Printf("error: %s", ex->Message);
    }
}

void MyMedia::SetBytestream(IRandomAccessStream^ streamHandle)
{
    HRESULT hr = S_OK;
    ComPtr<IMFByteStream> spMFByteStream = nullptr;

    // The following line always throws a ComException
    MEDIA::ThrowIfFailed(
        MFCreateMFByteStreamOnStreamEx((IUnknown*)streamHandle, &spMFByteStream)
    );

    MEDIA::ThrowIfFailed(
        m_spEngineEx->SetSourceFromByteStream(spMFByteStream.Get(), m_bstrURL)
    );

    return;
}
Bonus: If you know a better solution to my audio needs, please leave a comment.
I managed to fix this. There were two problems I found.
Media Foundation was not initialized
MFStartup(MF_VERSION); needs to be called before Media Foundation can be used. I added this code just before creating the media engine.
Capturing a pointer by reference.
The line m_pickFileTask.then([&player](StorageFile^ fileHandle) should be m_pickFileTask.then([player](StorageFile^ fileHandle). player is already a pointer to the current class, and & captures the variable by reference, so I was effectively passing the address of the pointer rather than the pointer itself.