Playing audio in iOS 13 throws C++ exception, freezes app - SwiftUI

TL;DR:
The code below (all ten lines of it, apart from debugging):
- fails (UI freezes) on iOS 13.x (Simulator)
- succeeds (audio plays) on iOS 14.x (Simulator and devices)
I don't have any devices running iOS 13.x. But analytics from live apps suggest it is failing in the field on both iOS 13 and 14 devices. False positives? (See the line of code commented with $$$.)
Steps To Reproduce
Create a new SwiftUI project that can run on iOS 13. Replace the text in ContentView.swift with the code below. Add an audio resource named clip.mp3. Build and run.
I am using Xcode 12.4, macOS 11.1, Swift 5.
See Also
Apple Dev Forum 1   // Unresolved
Apple Dev Forum 2   // Attributed to beta iOS/Xcode
Stackoverflow 1   // Unresolved
Stackoverflow 2   // Refers to next link
Apple Dev Forum 3   // Claims fixed in Xcode 12b5
[...and many more...]
Code
import SwiftUI
import AVKit

struct ContentView: View {
    var body: some View {
        Text("Boo!").onAppear { playClip() }
    }
}

var clipDelegate: AudioTimerDelegate!  // Hold a strong reference so ARC doesn't deallocate it.
var player: AVAudioPlayer!             // Ditto.

func playClip() {
    let u = Bundle.main.url(forResource: "clip", withExtension: "mp3")!
    player = try! AVAudioPlayer(contentsOf: u)
    clipDelegate = AudioTimerDelegate()  // Wait till now to instantiate, for correct timing.
    player.delegate = clipDelegate
    player.prepareToPlay()
    NSLog("*** Starting clip play")  // NSLog so we get a timestamp.
    player.play()

    // Wait 5 seconds and see if audioPlayerDidFinishPlaying ran.
    DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
        if let d = clipDelegate.clipDuration {
            NSLog("*** Caller clip duration = \(d)")
        } else {
            NSLog("!!! Caller found nil clip duration")
            // $$$ In the live app, post an audio-freeze event to analytics.
        }
    }
}
class AudioTimerDelegate: NSObject, AVAudioPlayerDelegate {
    private var startTime: Double
    var clipDuration: Double?

    override init() {
        self.startTime = CFAbsoluteTimeGetCurrent()
        super.init()
    }

    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        clipDuration = CFAbsoluteTimeGetCurrent() - startTime
        NSLog("*** Delegate clip duration = \(clipDuration!)")
    }
}
Console Output
Simulator iOS 14.4
The audio plays and the Console (edited for brevity) reads:
14:33:17 [plugin] AddInstanceForFactory: No factory registered for ... F8BB1C28-...
14:33:17 *** Starting clip play
14:33:19 *** Delegate clip duration = 1.692...
14:33:22 *** Caller clip duration = 1.692...
I gather from the question "Is anyone else getting this console message with AVAudioPlayer in Xcode 11 (and 11.1)?" that the first line is innocuous and related to the Simulator's sound drivers.
Device iOS 14.4
Results are the same, without the AddInstanceForFactory complaint.
Simulator iOS 13.6
Audio never sounds, the delegate callback never runs, and in the Console I get:
14:30:10 [plugin] AddInstanceForFactory: No factory registered for ... F8BB1C28-...
14:30:11 HALB_IOBufferManager_Client::GetIOBuffer: the stream index is out of range
14:30:11 HALB_IOBufferManager_Client::GetIOBuffer: the stream index is out of range
14:30:11 [aqme] AQME.h:254:IOProcFailure: AQDefaultDevice (1): output stream 0: null buffer
14:30:11 [aqme] AQMEIO_HAL.cpp:1774:IOProc: EXCEPTION thrown (-50): error != 0
14:30:26 [aqme] AQMEIO.cpp:179:AwaitIOCycle: timed out after 15.000s (0 1); suspension count=0 (IOSuspensions: )
14:30:26 CA_UISoundClient.cpp:241:StartPlaying_block_invoke: CA_UISoundClientBase::StartPlaying: AddRunningClient failed (status = -66681).
14:30:26 *** Starting clip play
14:30:26 HALB_IOBufferManager_Client::GetIOBuffer: the stream index is out of range
14:30:26 HALB_IOBufferManager_Client::GetIOBuffer: the stream index is out of range
14:30:26 [aqme] AQME.h:254:IOProcFailure: AQDefaultDevice (1): output stream 0: null buffer
14:30:26 [aqme] AQMEIO_HAL.cpp:1774:IOProc: EXCEPTION thrown (-50): error != 0
14:30:41 [aqme] AQMEIO.cpp:179:AwaitIOCycle: timed out after 15.000s (1 2); suspension count=0 (IOSuspensions: )
14:30:46 !!! Caller found nil clip duration
Remarks
It seems that there are two fifteen-second timeouts (the AwaitIOCycle lines) going on in the failure case.
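One thing worth noting: the repro code never touches AVAudioSession and relies entirely on AVAudioPlayer's implicit session handling. Below is a minimal sketch of explicitly configuring and activating the session before playback; this is my speculation about a possible mitigation, not a fix confirmed by the linked threads, and the category/options are illustrative:

import AVFoundation

func playClipWithSession() {
    // Assumption: explicit activation; the question's code never does this.
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .default)
        try session.setActive(true)
    } catch {
        NSLog("!!! AVAudioSession setup failed: \(error)")
        return  // In the live app, this could post the same analytics event as the $$$ line.
    }
    playClip()  // The original function from the question.
}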

Related

XCUIElementQuery in SwiftUI testing returns inconsistent count

I'm doing UI testing of a SwiftUI view, and I have the following code to fetch a List element to check the count of staticTexts in it:
let pred = NSPredicate(format: "identifier == 'detailViewTable'")
let detailViewTable = app.descendants(matching: .any).matching(pred).firstMatch
let arrOfTexts = detailViewTable.staticTexts
let ct = arrOfTexts.count
print("Count is: \(ct)") // This prints 0
print("Count is now: \(detailViewTable.staticTexts.count)") // Also prints 0
The count always prints as 0.
The SwiftUI view basically looks like this:
List {
    Section(header: Text("Item list")) {
        HStack {
            Text("Number")
            Spacer()
            Text(rec.recNum)
        }
    }
    // Lots more Sections/HStacks/Texts/etc. here
    // ...
    // ...
}
.accessibility(identifier: "detailViewTable")
.listStyle(.grouped)
When I put a breakpoint after these lines
and use po in the Debug Console, I get 0 for ct, but 21 when calling count directly:
(lldb) po ct
0
(lldb) po detailViewTable.staticTexts.count
21
Why is the var ct set to 0, while calling count directly gives me the correct number, 21?
It makes me wonder whether the XCUIElementQuery takes a moment to run and returns its answer in some kind of implicit callback, so that the data isn't ready at the moment the variable ct is set, while the debug console gets the correct answer because it waits for the response. However, I don't see any discussion of a callback in the Apple documentation:
https://developer.apple.com/documentation/xctest/xcuielementquery
This answer
https://stackoverflow.com/a/37320452
pointed me in the right direction. The fix is to put a pause in the form of sleep(1), to give the query time to do its fetch, I assume.
let pred = NSPredicate(format: "identifier == 'detailViewTable'")
let detailViewTable = app.descendants(matching: .any).matching(pred).firstMatch
let arrOfTexts = detailViewTable.staticTexts
sleep(1) // <-- Add this
let ct = arrOfTexts.count
print("Count is: \(ct)") // Now prints 21
print("Count is now: \(detailViewTable.staticTexts.count)") // Also prints 21

Error with an 8 kHz sample rate when reading real-time audio buffers

With the following code I get the error below:
var engine = AVAudioEngine()
let input = engine.inputNode!
let bus = 0
let mixer = AVAudioMixerNode()

engine.attach(mixer)
engine.connect(input, to: mixer, format: input.outputFormat(forBus: 0))

// pcmFormatFloat64 -- pcmFormatFloat32
let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 8000, channels: 1, interleaved: true)

mixer.installTap(onBus: bus, bufferSize: 512, format: fmt) { (buffer, time) -> Void in
    // 8 kHz buffers!
    print(buffer.format)
}

try! engine.start()
ERROR : kAudioUnitErr_TooManyFramesToProcess : inFramesToProcess=1024, mMaxFramesPerSlice=768
With a sample rate of 44.1 kHz everything is fine, but not with 8 kHz.
What is wrong with this code?
I encountered the same error message at 8 kHz and fixed it by setting the preferred sample rate and IO buffer duration on my audio session instance:
let audioSession = AVAudioSession.sharedInstance()
try audioSession.setPreferredSampleRate(sampleRate)                               // e.g. 8000
try audioSession.setPreferredIOBufferDuration(TimeInterval(processingInterval))   // e.g. 512 / 8000 = 0.064 s
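Putting the question's engine code and this fix together, a minimal sketch (my composition; the 8000 Hz / 512-frame values are illustrative and error handling is elided):

import AVFoundation

func startLowRateTap() throws -> AVAudioEngine {
    // Ask the hardware for a rate/buffer size compatible with the tap
    // before wiring up the engine.
    let session = AVAudioSession.sharedInstance()
    try session.setPreferredSampleRate(8000)
    try session.setPreferredIOBufferDuration(512.0 / 8000.0) // 64 ms
    try session.setActive(true)

    let engine = AVAudioEngine()
    let mixer = AVAudioMixerNode()
    engine.attach(mixer)
    engine.connect(engine.inputNode, to: mixer,
                   format: engine.inputNode.outputFormat(forBus: 0))

    let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                            sampleRate: 8000, channels: 1, interleaved: true)
    mixer.installTap(onBus: 0, bufferSize: 512, format: fmt) { buffer, _ in
        print(buffer.format) // expect 8 kHz buffers here
    }

    try engine.start()
    return engine // the caller must keep a strong reference while the tap runs
}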

Swift 3 - CFHostScheduleWithRunLoop crash

I am doing a reverse DNS lookup in Swift. My previous code on Swift 2.2 was working fine, and I have also implemented it in Objective-C, where it works too. However, I am not able to make it work in Swift 3.0.
Swift 2.2
//: Let's set up the `sockaddr_in` C structure using the initializer.
var sin = sockaddr_in(
    sin_len: UInt8(sizeof(sockaddr_in)),
    sin_family: sa_family_t(AF_INET),
    sin_port: in_port_t(0),
    sin_addr: in_addr(s_addr: inet_addr(ip)),
    sin_zero: (0,0,0,0,0,0,0,0)
)
//: Now convert the structure into a `CFData` object.
let data = withUnsafePointer(&sin) { ptr in
    CFDataCreate(kCFAllocatorDefault, UnsafePointer(ptr), sizeof(sockaddr_in))
}
//: Create the `CFHostRef` with the `CFData` object and store the retained value for later use.
let host = CFHostCreateWithAddress(kCFAllocatorDefault, data).takeRetainedValue()
//: Now schedule the runloop for the host.
CFHostScheduleWithRunLoop(host!, CFRunLoopGetCurrent(), kCFRunLoopDefaultMode)
var error = CFStreamError()
//: Start the info resolution.
CFHostStartInfoResolution(host!, .Names, &error)
Swift 3.0
//: Let's set up the `sockaddr_in` C structure using the initializer.
var sin = sockaddr_in(
    sin_len: UInt8(MemoryLayout<sockaddr_in>.size),
    sin_family: sa_family_t(AF_INET),
    sin_port: in_port_t(0),
    sin_addr: in_addr(s_addr: inet_addr(ip)),
    sin_zero: (0,0,0,0,0,0,0,0)
)
//: Now convert the structure into a `CFData` object.
let data = NSData(bytes: &sin, length: MemoryLayout<sockaddr_in>.size) as CFData
//: Create the `CFHostRef` with the `CFData` object and store the retained value for later use.
let host = CFHostCreateWithAddress(kCFAllocatorDefault, data).takeRetainedValue()
//: Now schedule the runloop for the host.
CFHostScheduleWithRunLoop(host!, CFRunLoopGetCurrent(), CFRunLoopMode.defaultMode as! CFString)
var error = CFStreamError()
//: Start the info resolution.
CFHostStartInfoResolution(host!, .names, &error)
When I run this code, it crashes at
CFHostScheduleWithRunLoop
Any idea?
Try replacing:
CFRunLoopMode.defaultMode as! CFString
with:
CFRunLoopMode.defaultMode!.rawValue
In Swift 3, CFRunLoopMode is imported as a struct whose rawValue property holds the underlying CFString, so force-casting the struct itself to CFString fails at runtime; passing the rawValue gives CFHostScheduleWithRunLoop the CFString it expects.
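For reference, a minimal end-to-end sketch of the Swift 3 version with that fix applied (my untested assembly of the question's code plus the fix; extracting the result via CFHostGetNames is an assumption about the intended usage):

import Foundation

// Hypothetical helper: reverse-resolve an IPv4 address to host names.
func reverseDNS(ip: String) -> [String]? {
    var sin = sockaddr_in(
        sin_len: UInt8(MemoryLayout<sockaddr_in>.size),
        sin_family: sa_family_t(AF_INET),
        sin_port: in_port_t(0),
        sin_addr: in_addr(s_addr: inet_addr(ip)),
        sin_zero: (0, 0, 0, 0, 0, 0, 0, 0)
    )
    let data = NSData(bytes: &sin, length: MemoryLayout<sockaddr_in>.size) as CFData
    let host = CFHostCreateWithAddress(kCFAllocatorDefault, data).takeRetainedValue()

    // Pass the underlying CFString, not a forced cast of the CFRunLoopMode struct.
    CFHostScheduleWithRunLoop(host, CFRunLoopGetCurrent(), CFRunLoopMode.defaultMode!.rawValue)

    var error = CFStreamError()
    // With no client callback installed, this resolves synchronously.
    CFHostStartInfoResolution(host, .names, &error)

    var resolved: DarwinBoolean = false
    guard let cfNames = CFHostGetNames(host, &resolved)?.takeUnretainedValue() else {
        return nil
    }
    return cfNames as NSArray as? [String]
}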

AVAudioUnit (OS X) render block only called for certain sample rates

I'm having trouble getting AVAudioEngine (OS X) to play nice with all sample rates.
Here's my code for building the connections:
- (void)makeAudioConnections {
    auto hardwareFormat = [self.audioEngine.outputNode outputFormatForBus:0];
    auto format = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:hardwareFormat.sampleRate channels:2];
    NSLog(@"format: %@", format);
    @try {
        [self.audioEngine connect:self.avNode to:self.audioEngine.mainMixerNode format:format];
        [self.audioEngine connect:self.audioEngine.inputNode to:self.avNode format:format];
    } @catch(NSException* e) {
        NSLog(@"exception: %@", e);
    }
}
On my audio interface, the render callback is called for 44.1, 48, and 176.4kHz. It is not called for 96 and 192 kHz. On the built-in audio, the callback is called for 44.1, 48, 88 but not 96.
My AU's allocateRenderResourcesAndReturnError is being called for 96kHz. No errors are returned.
- (BOOL)allocateRenderResourcesAndReturnError:(NSError * _Nullable *)outError {
    if (![super allocateRenderResourcesAndReturnError:outError]) {
        return NO;
    }
    _inputBus.allocateRenderResources(self.maximumFramesToRender);
    _sampleRate = _inputBus.bus.format.sampleRate;
    return YES;
}
Here's my AU's init method, which is mostly just cut & paste from Apple's AUv3 demo:
- (instancetype)initWithComponentDescription:(AudioComponentDescription)componentDescription options:(AudioComponentInstantiationOptions)options error:(NSError **)outError {
    self = [super initWithComponentDescription:componentDescription options:options error:outError];
    if (self == nil) {
        return nil;
    }

    // Initialize a default format for the busses.
    AVAudioFormat *defaultFormat = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100. channels:2];

    // Create the input and output busses.
    _inputBus.init(defaultFormat, 8);
    _outputBus = [[AUAudioUnitBus alloc] initWithFormat:defaultFormat error:nil];

    // Create the input and output bus arrays.
    _inputBusArray = [[AUAudioUnitBusArray alloc] initWithAudioUnit:self busType:AUAudioUnitBusTypeInput busses:@[_inputBus.bus]];
    _outputBusArray = [[AUAudioUnitBusArray alloc] initWithAudioUnit:self busType:AUAudioUnitBusTypeOutput busses:@[_outputBus]];

    self.maximumFramesToRender = 256;

    return self;
}
To keep things simple, I'm setting the sample rate before starting the app.
I'm not sure where to begin tracking this down.
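One generic starting point (my suggestion, not from the post): log the hardware-side and connection-side formats before starting the engine, to see at which node the 96 kHz mismatch first appears. In Swift for brevity, assuming an engine wired as above:

import AVFoundation

// Hypothetical diagnostic: dump the formats AVAudioEngine is actually using.
func dumpFormats(_ engine: AVAudioEngine) {
    print("output HW :", engine.outputNode.outputFormat(forBus: 0))
    print("input  HW :", engine.inputNode.inputFormat(forBus: 0))
    print("mixer  out:", engine.mainMixerNode.outputFormat(forBus: 0))
}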
Update
Here's a small project which reproduces the issue I'm having:
Xcode project to reproduce issue
You'll get errors pulling from the input at certain sample rates.
On my built-in audio running at 96kHz the render block is called with alternating 511 and 513 frame counts and errors -10863 (kAudioUnitErr_CannotDoInCurrentContext) and -10874 (kAudioUnitErr_TooManyFramesToProcess) respectively. Increasing maximumFramesToRender doesn't seem to help.
Update 2
I simplified my test down to just connecting the input to the main mixer:
[self.audioEngine connect:self.audioEngine.inputNode to:self.audioEngine.mainMixerNode format:nil];
I tried explicitly setting the format argument.
This still will not play through at 96kHz. So I'm thinking this may be a bug in AVAudioEngine.
For play-through with AVAudioEngine, the input and output hardware formats and all the connection formats must be at the same sample rate. So the following should work.
AVAudioFormat *outputHWFormat = [self.audioEngine.outputNode outputFormatForBus:0];
AVAudioFormat *inputHWFormat = [self.audioEngine.inputNode inputFormatForBus:0];

if (inputHWFormat.sampleRate == outputHWFormat.sampleRate) {
    [self.audioEngine connect:self.audioEngine.inputNode to:self.audioEngine.mainMixerNode format:inputHWFormat];
    [self.audioEngine connect:self.audioEngine.mainMixerNode to:self.audioEngine.outputNode format:inputHWFormat];
}

QCRenderer.renderAtTime fails with 'invalid framebuffer operation'

I initialize my QCRenderer like this:
let glPFAttributes: [NSOpenGLPixelFormatAttribute] = [
    UInt32(NSOpenGLPFABackingStore),
    UInt32(0)
]
let glPixelFormat = NSOpenGLPixelFormat(attributes: glPFAttributes)
if glPixelFormat == nil {
    println("Pixel Format is nil")
    return
}
let openGLView = NSOpenGLView(frame: glSize, pixelFormat: glPixelFormat)
let openGLContext = NSOpenGLContext(format: glPixelFormat, shareContext: nil)
let qcRenderer = QCRenderer(openGLContext: openGLContext, pixelFormat: glPixelFormat, file: compositionPath)
Further down in the code, I call renderAtTime like this:
if !qcRenderer.renderAtTime(frameTime, arguments: nil) {
    println("Rendering failed at \(frameTime)s.")
    return
}
which always produces this error message:
2014-10-30 15:30:50.976 HQuartzRenderer[3996:692255] *** Message from <QCClear = 0x100530590 "Clear_1">:
OpenGL error 0x0506 (invalid framebuffer operation)
2014-10-30 15:30:50.976 HQuartzRenderer[3996:692255] *** Message from <QCClear = 0x100530590 "Clear_1">:
Execution failed at time 0.000
2014-10-30 15:30:50.976 HQuartzRenderer[3996:692255] *** Message from <QCPatch = 0x100547860 "(null)">:
Execution failed at time 0.000
Rendering failed at 0.0s.
The Quartz Composition is just a simple GLSL shader which runs just fine in Quartz Composer.
There's not much about this on the internet that I could find. I hope someone here knows something that might help.
By the way, I know that I could just initialize the QCRenderer like
let qcRenderer = QCRenderer(offScreenWithSize: size, colorSpace: CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB), composition: qcComposition)
but I want to take advantage of my GPU's multisampling capabilities to get an antialiased image. That's gotta be more performant than rendering at 4x size, then manually downsizing the image.
Edit: Changed pixel format to
let glPFAttributes: [NSOpenGLPixelFormatAttribute] = [
    UInt32(NSOpenGLPFAAccelerated),
    UInt32(NSOpenGLPFADoubleBuffer),
    UInt32(NSOpenGLPFANoRecovery),
    UInt32(NSOpenGLPFABackingStore),
    UInt32(NSOpenGLPFAColorSize), UInt32(128),
    UInt32(NSOpenGLPFADepthSize), UInt32(24),
    UInt32(NSOpenGLPFAOpenGLProfile),
    UInt32(NSOpenGLProfileVersion3_2Core),
    UInt32(0)
]