Playback not working in offline mode when trying to play offline playlist (libspotify 9.1.32 / armv7 build)

I have an issue with libspotify and playback of offline-synchronized playlists. When there is no active internet connection, the login succeeds (and sp_session_connectionstate reports that the user is logged in offline); however, playback does not work for any track on an offline playlist. Playback does work if I log in with an active internet connection and then switch the connection off (i.e., it seems to be necessary to be online initially for offline playback to work). Is this an issue with the library, and if so, can a fix be expected?

I wouldn't count on a fix from Spotify. They barely listen to the people who use their API.

Related

Native WebRTC dropping frames

Summary: How do I stream high quality video using WebRTC native?
I have an h264 stream that's 1920x1080 at about 30fps. I can currently stream this from a server on localhost to a native client on localhost just fine.
I wrote a WebRTC server using Google's WebRTC native library. I've written a VideoEncoder and VideoEncoderFactory that take frames consisting of already-encoded data and broadcast them over a video track. Using this, I can send my h264 stream to the WebRTC server over a pipe and see the video stream in a browser.
However, any time something moves the video gets corrupted. It continues to play but is full of artifacts. Eventually I discovered that WebRTC is dropping some of my frames. When I attach a sequentially increasing ID to each frame before I pass it to rtc::AdaptedVideoTrackSource::OnFrame and I log this same ID in webrtc::VideoEncoder::Encode I can see that some of my frames simply disappear.
This kind of makes sense: I'm trying to stream high-quality video over something meant for video chat, and lowering my framerate fixes the corruption. However, I'm not asking the WebRTC library to do much; it's just forwarding already-encoded data to a client on localhost. I have a native app that does this fine, and I've seen one browser WebRTC client that can do it. Is there a field in the SDP or some configuration change that will allow me to stream my video?
The solution was in "How to control bandwidth in WebRTC video call?".
I had heard about changing the offer SDP but dismissed it, because I was told that the browser will accept unlimited bandwidth by default and that you'd only need to do this if you want to limit bandwidth. However, adding "b=AS:<high number>" has fixed all of my problems.
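The munging step above can be sketched as plain string manipulation on the offer SDP before it is applied. This is a minimal sketch, not part of any WebRTC API: the class and method names are mine, and the 10000 kbps value is just an example of a "high number".

```java
// Sketch: insert a media-level "b=AS:<kbps>" bandwidth line into the video
// section of an SDP offer before passing it to setLocalDescription.
// RFC 4566 orders media-section lines m=, i=, c=, b=, so the b= line is
// emitted after the video section's m= line and any i=/c= lines.
public class SdpBandwidth {
    public static String withVideoBandwidth(String sdp, int kbps) {
        StringBuilder out = new StringBuilder();
        boolean pending = false; // true while waiting to emit b= in the video section
        for (String line : sdp.split("\r\n")) {
            // Past the optional i=/c= lines of the video section: emit b= now.
            if (pending && !line.startsWith("i=") && !line.startsWith("c=")) {
                out.append("b=AS:").append(kbps).append("\r\n");
                pending = false;
            }
            out.append(line).append("\r\n");
            if (line.startsWith("m=video")) {
                pending = true;          // entered the video media section
            } else if (line.startsWith("m=")) {
                pending = false;         // entered some other media section
            }
        }
        return out.toString();
    }
}
```

The b=AS value is interpreted as kilobits per second; picking it well above the stream's real bitrate (here a 1080p30 h264 stream) tells the receiver not to throttle.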

Which AWS EC2 instance type is most optimal for audio streaming?

I'm in testing stage of launching an online radio. I'm using AWS CloudFormation stack with Adobe Media Server.
My existing instance type is m1.large, and my Flash Media Live Encoder is streaming mp3 at 128 kbps, which I think is pretty normal, but the resulting stream isn't smooth or stable at all and seems to have a lot of breaks.
Should I pick an instance type with higher specs?
I'm running my test directly off the LiveHLSManifest link, which opens in my iPhone's Safari and plays in the browser's built-in player, which doesn't set any buffering on the client side. Could this be the issue?
Testing HLS/HDS links directly in iPhone's Safari was a bad idea. I relied on the built-in player already having some sort of buffering configured by default, but it doesn't. I received a stable and smooth stream when I used players like Strobe Media Playback, FlowPlayer, etc. Hopefully this answer will save someone some time.

Available event for connection timeout for streams publishing over RTSP?

I use Wowza GoCoder to publish video to a custom Wowza live application. In my application I attach an IRTSPActionNotify event listener within the onRTPSessionCreate callback. In the onRecord callback of my IRTSPActionNotify I perform various tasks - start recording the live stream, among other things. In my onTeardown callback I then stop the recording and do some additional processing of the recorded video, like moving the video file to a different storage location.
What I just noticed is that if the encoder times out, due to a lost connection, power failure, or some other sudden event, I won't receive an onTeardown event, not even when the RTSP session times out. This is a big issue for me, since I need to do this additional processing before I make the published stream available for on-demand viewing through another application.
I've been browsing through the documentation looking for an event or a utility-class that could help me out, but so far to no avail.
Is there some reliable event, or another reliable way to know that a connection has timed out, so that I can trigger this processing also for streams that don't fire a teardown event?
I first discovered this issue when I lost connection on my mobile device while encoding video using the Wowza GoCoder app for iOS, but I guess the issue would be the same for any encoder.
In my Wowza modules, I have the following pattern, which proved to be quite reliable so far:
I have a custom worker thread that iterates over all client types. This allows me to keep track of clients, and I have found that eventually all kinds of disasters lead to clients being removed from those lists, after somewhat unclear timeouts.
Try tracking clients (adding / removing them) in your own Set and see if that is more accurate.
You could also try and see if anything gets called in that case in IMediaStreamActionNotify2.
I have seen onStreamCreate(IMediaStream stream) and onStreamDestroy(IMediaStream stream) being called in ModuleBase in the case of GoCoder on iOS, and I attach an instance of IMediaStreamActionNotify2 to the stream by calling stream.addClientListener(actionNotify).
On the GoCoder platform: I am not sure that it's the same on Android. The GoCoder Android and iOS versions have a fundamental difference, namely the streaming protocol itself, which leads to different API calls and behaviour on the backend side. Don't go live without testing on both platforms. :-)
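The "track clients in your own Set" idea above can be sketched in a Wowza-agnostic way: a worker thread periodically collects the IDs of all currently connected clients and diffs them against the previous poll, so any client that silently vanished (power failure, dropped connection) still triggers the post-processing that onTeardown would normally run. The class below is my illustration, not Wowza API; in a real module the current IDs would come from the application instance's RTMP/RTSP/HTTP session lists.

```java
import java.util.HashSet;
import java.util.Set;
import java.util.function.Consumer;

// Sketch of the tracking pattern: remember the client IDs seen on the
// previous poll, and treat any ID missing from the current poll as a
// (possibly timed-out) teardown.
public class ClientTracker {
    private final Set<String> lastSeen = new HashSet<>();
    private final Consumer<String> onClientGone;

    public ClientTracker(Consumer<String> onClientGone) {
        this.onClientGone = onClientGone;
    }

    // Call periodically from a worker thread with all currently
    // connected client IDs.
    public void poll(Set<String> currentClients) {
        for (String id : lastSeen) {
            if (!currentClients.contains(id)) {
                onClientGone.accept(id); // run recording post-processing here
            }
        }
        lastSeen.clear();
        lastSeen.addAll(currentClients);
    }
}
```

Scheduling poll() every few seconds on a ScheduledExecutorService is enough; since a clean teardown also makes the client disappear, the gone-callback should be idempotent with the normal onTeardown path so the post-processing runs exactly once.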

How to transfer big-sized video files from Google Glass, to be viewed few minutes after the capture?

I have to transfer a large video file (~400 MB) from Glass to a server or a local computer just after capture, to be viewed as quickly as possible on the target device.
I tried to use the Mirror API to get the video, but the upload time to the Google platform is really high (more than one hour for a 10-minute video).
I also tried to make an app that captures video and sends it over Wi-Fi, but it took about 3 minutes for 18 MB (so more than one hour for 400 MB).
We also looked at the Android Wi-Fi P2P solution, but it seems not to be available on Glass at this time.
And I tried the Droid NAS and AirDroid applications to access the Glass file system over the air. The installation succeeds, but I can't run them.
Does anyone have a solution or recommendation?
Thanks in advance,
Julian
Since you're willing to transfer to a local computer, you can just plug Glass into the local machine and copy the file. Glass presents itself as a camera device, and most PCs and Macs have software that will let you transfer the videos directly.

How can we do video recording using VNC

How can we record video of a VNC session? I want to record whole sessions. We have multiple clients and servers, so efficiency is important too.
Does anyone know of an open-source project that can handle this? The only one I can think of is vncrec, but I haven't used it. Has anyone used this project?
Here is some info (Debian-centric) about recording and transcoding VNC sessions into movie files, including how to use vncrec and other options.