Constant beeping when playing 1st order ambisonic audio in Unity app deployed on Samsung S7 - resonance-audio

I'm developing an app using Unity (2017.2.0f3) and Google Resonance (v1.1.1) for mobile phones. When testing the application on the Samsung S7, there is a loud, constant beep in the scenes where first-order ambisonic audio is playing. This issue did not occur when testing the app on the two other devices I have at hand, a Pixel XL and a Nexus 5X.
Also, in some scenes I've used a lowpass filter on the ResonanceAudioMixer, which seems to reduce the level of the beep.
Has anyone had similar issues with the Samsung S7, or with any other phones?
Does anyone have an idea what could be causing this?
Any input would be highly appreciated!

This turned out to be an issue with the audio recording, not with the Resonance Audio plugin. Please see the original poster's comment above for details.

Related

What is the best way to cast Hololens to a PC without delay?

I am working in a showroom and need to stream the HoloLens experience over WiFi to a PC that is connected to a large screen. On the PC I am using the HoloLens Companion App, and we have paired two HoloLens 2 devices with it. But the livestream sometimes lags behind by several seconds, or it hangs completely on the PC.
Is there a more performant way to cast from the HoloLens to the PC, perhaps using a different app or different hardware? This would help us a lot. Thank you in advance!
Best regards
Karsten
To share the first-person perspective as a video stream for the local user, you can also leverage the built-in Miracast support to stream video to display receivers (such as your PC). For more information, please refer to the official Mixed Reality documentation: Shared experiences in mixed reality.

Performance issue using a Bluetooth loudspeaker in an Android NDK app

In my Android app, I use Android NDK to play music by doing the following:
extract audio samples from an OGG file using the Vorbis library
process the audio samples
redirect the processed samples to the audio output using the Oboe library
In order to avoid underruns, I do the first two steps in a separate thread, so the sound is extracted and processed a little bit in advance (see the sketch below; this adds a bit of latency, but that is not a problem for my app). That solution works great on every device I've tested so far.
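In rough outline, the decode-ahead thread looks like this (RingBuffer and decodeAndProcessBlock are placeholders for my actual Vorbis extraction and processing code):

```cpp
// Decode-ahead thread: keeps a FIFO topped up so the audio callback never
// has to wait on the decoder. RingBuffer and decodeAndProcessBlock() are
// hypothetical stand-ins for the real Vorbis/processing code.
#include <atomic>
#include <cstddef>

struct RingBuffer {                            // hypothetical lock-free FIFO
    void   write(const float *src, size_t n);  // blocks while the FIFO is full
    size_t read(float *dst, size_t n);         // returns samples actually copied
};

// Hypothetical: decode the next OGG block with libvorbis and apply the
// processing step; returns the number of samples produced, 0 at end of file.
int decodeAndProcessBlock(float *out, int maxSamples);

void decodeLoop(RingBuffer &ring, std::atomic<bool> &running) {
    float block[1024];
    while (running.load()) {
        int n = decodeAndProcessBlock(block, 1024);
        if (n <= 0) break;
        ring.write(block, n);  // blocking here is what keeps us slightly ahead
    }
}
```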
But for some reason, when I pair my device with a Bluetooth speaker and play music, I get what seem to be underruns on some devices, such as the Samsung S7 or the Nokia 1 (but not on every device).
This bug looks so random to me that I don't know where to start. It acts as if the Bluetooth connection uses quite a lot of CPU, so my app doesn't have enough resources left to run properly.
Has anyone experienced something similar? Should I do anything in my code to handle the Bluetooth connection so that it doesn't eat CPU (for example, to avoid audio resampling)?
Thanks for your help.
Android + Bluetooth audio is a world of pain. The major thing to appreciate about Bluetooth is that the audio sink runs at a rate independent of other audio devices, which is why the native media player will do things like display video at whatever rate the attached audio device consumes samples, essentially slaving itself to the clock of the BT audio device. If you want to drive the speed from Android (i.e. the SystemClock timebase), you'll need to use a timestretching AudioTrack. (This can be done, but driver support is erratic and overall system stability tanks.)
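As a rough way to see that clock drift with Oboe (a sketch only, assuming an open and started stream; the function name is illustrative), you can sample the stream's timestamps twice and compare the sink's effective consumption rate against the nominal sample rate:

```cpp
// Estimate how fast the sink (e.g. a Bluetooth speaker) really consumes
// frames by sampling the stream's hardware timestamps twice.
#include <oboe/Oboe.h>
#include <chrono>
#include <thread>
#include <time.h>

double measureSinkRate(oboe::AudioStream *stream) {
    auto t0 = stream->getTimestamp(CLOCK_MONOTONIC);
    std::this_thread::sleep_for(std::chrono::seconds(10));
    auto t1 = stream->getTimestamp(CLOCK_MONOTONIC);
    if (t0.error() != oboe::Result::OK || t1.error() != oboe::Result::OK)
        return stream->getSampleRate();  // timestamps unsupported; assume nominal

    double frames  = double(t1.value().position  - t0.value().position);
    double seconds = double(t1.value().timestamp - t0.value().timestamp) / 1e9;
    // A result noticeably different from stream->getSampleRate() means the
    // sink's clock is drifting relative to the system clock.
    return frames / seconds;
}
```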
First, you want to rule out the devices themselves as the problem. Can you play the OGG files in a media player to a Bluetooth speaker from the S7 or Nokia 1 without problems? If so, it's your code!
It sounds to me like the speaker is consuming samples faster than the device is producing them, for whatever reason. Basically, check your callbacks to make sure that whenever the audio subsystem requests more data, you actually provide it. Be sure to drive your decoding pipeline according to the callbacks being made, not the system clock or any other assumptions about timing.
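A minimal sketch of that rule with Oboe's data callback (using the hypothetical FIFO from the question's sketch): fill every requested frame, padding with silence rather than blocking when the decoder has fallen behind.

```cpp
// The callback always satisfies the request: if the FIFO runs short, pad
// with silence instead of blocking, and let the decoder catch up.
// (RingBuffer as declared in the question's sketch above.)
#include <oboe/Oboe.h>
#include <algorithm>

class PlaybackCallback : public oboe::AudioStreamDataCallback {
public:
    oboe::DataCallbackResult onAudioReady(oboe::AudioStream *stream,
                                          void *audioData,
                                          int32_t numFrames) override {
        auto  *out    = static_cast<float *>(audioData);
        size_t wanted = size_t(numFrames) * stream->getChannelCount();
        size_t got    = mRing.read(out, wanted);
        if (got < wanted) {
            std::fill(out + got, out + wanted, 0.0f);  // silence, never starve
        }
        return oboe::DataCallbackResult::Continue;
    }
private:
    RingBuffer mRing;  // hypothetical FIFO filled by the decode thread
};
```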
Finally, Bluetooth audio, at least over A2DP (as opposed to directly streaming MP3), is going to require some processing to recompress the audio as it is sent out, but those devices should have plenty of headroom for that, maybe even dedicated DSPs. I've done it with 1080p video playback at the same time before, and it only starts to fall apart with two videos at once!

Ducking: when I accept a call in my VoIP softphone, the volume level in the softphone drops to 10% if I use the default audio device (Windows 8)

I am working on a VoIP softphone that uses WebRTC, and I have run into the following issue:
When I accept a call in my softphone, the volume level in the softphone drops to 10%.
It reproduces when several audio devices are connected to the PC and the softphone uses the default audio device (not the headset).
I tried to disable "ducking" using the WASAPI function SetDuckingPreference, but it did not help.
Could you please help me?
Please let me know why the volume drops, and how I can fix it.
Thanks
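For reference, this is roughly the opt-out pattern I tried, based on the documented IAudioSessionControl2 flow (error handling trimmed, and the eConsole role is my assumption; a communications stream may belong to a different endpoint):

```cpp
// Opt the process's audio session out of Windows ducking. Must run after
// CoInitializeEx, against the render endpoint the softphone actually uses.
#include <mmdeviceapi.h>
#include <audiopolicy.h>

HRESULT OptOutOfDucking() {
    IMMDeviceEnumerator   *enumerator = nullptr;
    IMMDevice             *device     = nullptr;
    IAudioSessionManager2 *manager    = nullptr;
    IAudioSessionControl  *control    = nullptr;
    IAudioSessionControl2 *control2   = nullptr;

    HRESULT hr = CoCreateInstance(__uuidof(MMDeviceEnumerator), nullptr,
                                  CLSCTX_ALL, IID_PPV_ARGS(&enumerator));
    if (SUCCEEDED(hr))
        hr = enumerator->GetDefaultAudioEndpoint(eRender, eConsole, &device);
    if (SUCCEEDED(hr))
        hr = device->Activate(__uuidof(IAudioSessionManager2), CLSCTX_ALL,
                              nullptr, reinterpret_cast<void **>(&manager));
    if (SUCCEEDED(hr))
        hr = manager->GetAudioSessionControl(nullptr, 0, &control);
    if (SUCCEEDED(hr))
        hr = control->QueryInterface(IID_PPV_ARGS(&control2));
    if (SUCCEEDED(hr))
        hr = control2->SetDuckingPreference(TRUE);  // TRUE = opt out of ducking

    if (control2)   control2->Release();
    if (control)    control->Release();
    if (manager)    manager->Release();
    if (device)     device->Release();
    if (enumerator) enumerator->Release();
    return hr;
}
```

Note that the preference applies per session on a specific endpoint, so it may need to be re-applied if the default device changes.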

Map Overlays Native Android Maps Not Animating on Samsung Devices

I have Google Maps as my background map, and on top of it I have weather radar TileOverlays. The animation of these tiles is not working on any of the Samsung devices I've tested (S3, S4, S5) or on the Sony Xperia, but it works on every other device I've tested (Nexus, HTC, Motorola, and many more). Any ideas offhand, before I gather the code logic from multiple classes?
It appears that the getTileUrl method I override is no longer being called after whatever updates have been pushed to phones lately. Has anyone heard anything of the sort? This is the overridden method for the abstract method here: http://developer.android.com/reference/com/google/android/gms/maps/model/UrlTileProvider.html#getTileUrl(int, int, int)
This is a quote I received directly from Google:
"The workaround is to uninstall Google Play Services updates by the user, and I'm not aware of a workaround on the app developer side.
This bug was introduced with the last version of GMSCore released to the devices. On the bright side, developers are already working for a fix and we're expecting it to be released today or tomorrow."

How to display video in/with Adobe Alchemy?

I want to display video from an unsupported USB camera in Adobe AIR (or Flash).
The camera comes with an SDK for displaying the video stream.
My question:
How should the C/C++ routine be built so that it compiles with Adobe Alchemy?
I only want to display the video stream in Adobe AIR (or Flash).
No audio or anything special is needed, only video.
I am working on Linux.
Any ideas?
If you cannot already use the camera from Flash, Alchemy is not going to help you.
Alchemy can only do things that ActionScript can do; it does not help you "get around" the Flash sandbox. The reason people use Alchemy is so that they can compile large legacy codebases and/or open-source libraries.