Icecast - Moving listeners from autodj to live stream without resetting the stream

I have an Icecast server with two mountpoints: /autodj.mp3 and /stream.mp3.
When I broadcast live, I broadcast to /stream.mp3.
But when I am already listening and then start broadcasting live, the player does not automatically switch to the live stream; only after I reload the page with the stream does it work.
Any ideas?
icecast.xml:
<mount>
    <mount-name>/autodj.mp3</mount-name>
    <stream-name>Niall FM AutoDJ</stream-name>
</mount>
<mount>
    <mount-name>/stream.mp3</mount-name>
    <fallback-mount>/autodj.mp3</fallback-mount>
    <fallback-override>1</fallback-override>
    <stream-name>Niall FM</stream-name>
    <public>1</public>
</mount>

Out-of-band discussion with the user on IRC revealed that they were streaming Ogg/Vorbis content to /autodj.mp3, so Icecast refused to move those listeners to the MP3 stream coming in at /stream.mp3.
Icecast enforces this strictly, and for good reason: to ensure the largest possible listener-software compatibility, it is highly recommended to closely match all stream and codec parameters between streams that fall back to each other.
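One practical way to keep the parameters matched is to feed the AutoDJ mount with the same codec, bitrate, and sample rate that the live encoder uses. A minimal sketch using FFmpeg's Icecast output (the host, password, bitrate, and file name are placeholder assumptions, not taken from the question):

ffmpeg -re -i some_track.mp3 -c:a libmp3lame -b:a 128k -ar 44100 \
    -content_type audio/mpeg -f mp3 \
    icecast://source:hackme@localhost:8000/autodj.mp3

If both /autodj.mp3 and /stream.mp3 carry MP3 at the same sample rate and bitrate, Icecast can move listeners between them without the client noticing.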

Related

Web LiveStreaming WebRTC and Sockets (Flask Backend)

I want to build a live streaming app.
My thought process:
1) Get the video/audio data from navigator.mediaDevices.getUserMedia(constraints); [client-streamer]
2) Create rooms using sockets (Socket.IO or WebSockets from Flask). [backend]
3) Send the data from step 1 to the room members using sockets.
4) Display the media on the client side.
Is that correct? How should I do it?
How do I broadcast data to specific room members and not to everyone? (Flask)
How do I consistently send data from the streamer -> server -> room members? The stream obtained in step 1 is an object; where is the data?
Any other, better ideas would be great! Thanks.
I need to implement the server side myself, without libraries that do the work for me.
Implementing a streaming platform is not trivial. Unfortunately, it is not as simple as emitting the chunks received from MediaRecorder's ondataavailable handler and forwarding them to users through a WebSocket server - that approach is neither scalable, efficient, nor reliable.
Below are some strategies you can try for different types of scenarios:
P2P: If you want simple peer-to-peer streaming, you can use WebRTC, with a small socket.io server for signaling purposes (see the sketch after this list).
Conference: Here things start to get more complicated. You will need a media server if you want to be at all scalable. One approach is to route your stream to the users through an SFU or MCU, which takes care of forwarding/processing media to the different peers efficiently.
Broadcast: Here things are also non-trivial. Common WebRTC-based architectures either ingest the WebRTC stream and forward it to an HLS server, which makes the stream chunks available to clients through a CDN, or perform RTP forwarding of the WebRTC stream, convert it to RTMP with something like FFmpeg, and deliver it through YouTube Live or Twitch to leverage their infrastructure.
Be aware that the last two options are resource-intensive and will certainly not be cheap to maintain.
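For the P2P case, here is a minimal sketch of a socket.io signaling relay (Node.js/TypeScript; the "join" and "signal" event names and the roomId field are my assumptions, not a fixed protocol - Flask-SocketIO offers equivalent room semantics if you stay on Python):

import { Server } from "socket.io";

// Relay-only signaling server: the media itself flows peer-to-peer over WebRTC.
const io = new Server(3000, { cors: { origin: "*" } });

io.on("connection", (socket) => {
  // A client asks to join a named room.
  socket.on("join", (roomId: string) => socket.join(roomId));
  // Forward SDP offers/answers and ICE candidates only to the other
  // members of that room, never to everyone.
  socket.on("signal", ({ roomId, data }: { roomId: string; data: unknown }) => {
    socket.to(roomId).emit("signal", { from: socket.id, data });
  });
});

Note that socket.to(roomId).emit(...) targets the room and excludes the sender, which is exactly the "specific room members, not everyone" behaviour asked about.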
Below are some open source projects that could help you along the way:
Janus
MediaSoup
AntMedia
Jitsi
Good luck!
Explaining all this is far beyond the scope of a Stack Overflow answer.
Here are a few hints:
You need to use the MediaRecorder API to capture compressed data from your gUM (getUserMedia) stream. MediaRecorder support is inconsistent across makes and models of browser, though.
Every so often it hands a Blob to its ondataavailable handler.
Those Blobs are pieces of a compressed webm data stream.
You can push those Blobs to a server with socket.io, and the server can turn around and push them to whatever clients you want (a sketch follows below).
Playing the webm on the clients is tricky. On some makes and models of browser you may be able to feed the webm stream to the Media Source API using appendBuffer(). But some browsers cannot consume these webm streams.
These webm streams are useless to a player without all their Blob data in order. You can't just start sending a new client the Blobs of the stream when they sign in; you have to restart the MediaRecorder.
(You may be able to make it work without a MediaRecorder restart if you send the first few kilobytes of the stream to each new client before sending the current Blob. Extracting those bytes is an intricate programming job involving the ebml package to parse the webm stream and extract the prologue. I have not proven this concept.)
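Here is a minimal sketch of the capture-and-relay path just described (browser side; it assumes a connected socket.io client named socket and a video element named videoElement, the "chunk" event name and codec string are assumptions, and production code must queue appendBuffer() calls while the SourceBuffer is updating):

// Streamer: capture compressed webm chunks and push them to the server.
const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
const recorder = new MediaRecorder(stream, { mimeType: 'video/webm; codecs="vp8,opus"' });
recorder.ondataavailable = (e) => socket.emit("chunk", e.data); // one Blob per timeslice
recorder.start(250); // ask for a Blob roughly every 250 ms

// Viewer: feed the relayed chunks to the Media Source API.
const mediaSource = new MediaSource();
videoElement.src = URL.createObjectURL(mediaSource);
mediaSource.addEventListener("sourceopen", () => {
  const sb = mediaSource.addSourceBuffer('video/webm; codecs="vp8,opus"');
  socket.on("chunk", async (data) => {
    // socket.io may deliver a Blob or an ArrayBuffer depending on configuration.
    const buf = data instanceof Blob ? await data.arrayBuffer() : data;
    sb.appendBuffer(buf); // NOTE: throws if sb.updating is true; queue in real code.
  });
});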
Because getting all this to work - originator to server to viewer - is such a pain in the neck, you may want to investigate using something like mediasoup instead. It uses WebRTC transport rather than socket.io, and works cross-platform.

stream audio from browser to WebRTC native C++ application

I managed to run the WebRTC peerconnection example, but it does not run in the browser.
I'm trying to find a way to stream both video and audio from the browser to my native program.
Is there any way?
It can be done. WebRTC is designed to work in a peer-to-peer manner between two WebRTC agents (typically a Web Browser). Your native program needs to become the second peer.
If you need to rely on open source components, a good starting point is:
OpenSSL for the DTLS key exchange.
libsrtp to encrypt the RTP packets.
ffmpeg to decode the audio from the browser to PCM (libvpx if you need to do video).
You'll also need to handle the ICE negotiation, which requires processing STUN messages, and extract the media payloads from the RTP packets. All of these steps come after you've determined a signalling method to exchange the SDP offer and answer between your app and the browser.
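To give a flavour of the RTP step: the fixed RTP header is 12 bytes, and the payload starts right after it plus any CSRC entries. A minimal parsing sketch (TypeScript here for brevity; the byte layout is identical whatever language you use, and this ignores header extensions and padding):

// Parse the fixed RTP header (RFC 3550) and slice out the payload.
function parseRtp(packet: Uint8Array) {
  const view = new DataView(packet.buffer, packet.byteOffset, packet.byteLength);
  const version = packet[0] >> 6;        // must be 2 for RTP
  const csrcCount = packet[0] & 0x0f;    // number of 32-bit CSRC entries
  const payloadType = packet[1] & 0x7f;  // e.g. browsers often use 111 for Opus
  const sequence = view.getUint16(2);    // for reordering / loss detection
  const timestamp = view.getUint32(4);
  const ssrc = view.getUint32(8);
  const headerLength = 12 + 4 * csrcCount;
  return { version, payloadType, sequence, timestamp, ssrc,
           payload: packet.subarray(headerLength) };
}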
As you've probably realised, starting from scratch is a major task. There are probably some commercial libraries that will do the job and save you a lot of pain.
If that doesn't scare you off and you still want to attempt it with open source components, this example "may" help. The sample does the reverse of what you've asked: it sends a video stream to Chrome rather than receiving an audio stream. The useful aspect is the connection negotiation; the sample program is able to get RTP packets flowing, which is often the main problem.
The example also uses Windows Media Foundation, which is Windows-specific, and it takes a lot of shortcuts, particularly in the RTP and STUN packet processing.

Wowza Streaming Live Server data - two questions

Does Wowza Streaming Server (used as a live streaming service) care about the direction of the data stream?
Can Wowza Server act in a server-side mode and in a client-side mode?
Since I am sending the data stream over LTE, the cost of sending it is very high, so I am wondering whether it is possible to send live stream data only when a request is present.
Thank you
It's not 100% clear to me what you mean by "Wowza acting as a client". Wowza can be configured as an edge instance that pulls a stream from an origin Wowza server, for when you want multiple, possibly geographically distributed, stream sources.
In the other case, Wowza accepts a stream from encoders, and the data flow direction is again towards Wowza. This is what happens when you install, for example, Flash Media Live Encoder or VLC and push your encoded stream "into" Wowza, which then distributes it to all the players.
But obviously Wowza won't act as a video player for you. :-)
Could you clarify your first question? Maybe I can give a better answer then.

How to send SDP over RTP

I've developed an app which sends RTP packets to a local IP client, so the client has to listen on the specified port (rtp://:#portnumber, in VLC) to play the streamed data. Right now I'm writing the code needed to create the SDP file required to start streaming.
My doubt is: how do I send this file to the client? At the beginning of the RTP stream?
I'm really a n00b at this point. Any help will be useful.
Thanks
VLC specifically supports the RTSP, HTTP and SAP protocols for establishing a session and communication, and of course the local file system (file://).
So basically you can invoke vlc in some manner like this (I cannot test it right now, but it should be like this):
vlc file://path/to/sdp-file
or
vlc rtsp://server-path:port/sdpfile.sdp
and so on
Aside from storing the SDP file on the local file system, HTTP is probably the easiest option, provided you have an HTTP server up and running on your server machine.
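For reference, a minimal SDP description for an RTP audio session could look like the following (the addresses, port, and the static MPEG-audio payload type 14 are illustrative assumptions; adjust them to your actual stream):

v=0
o=- 0 0 IN IP4 192.0.2.1
s=Example RTP stream
c=IN IP4 192.0.2.2
t=0 0
m=audio 5004 RTP/AVP 14

A client that opens this file knows to listen on UDP port 5004 for MPEG audio over RTP.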

RTSP client in Android

I am writing an RTSP client in Android. I am able to receive the responses for all the requests,
i.e.,
DESCRIBE: the server sends back 200 OK
SETUP with Transport: RTP/AVP;unicast;client_port=4568-4569: got the 200 OK message back
Sent PLAY, and got the OK message
After that, how do I get the audio and video frames?
I have searched blogs, but they all say to listen on the client_port; however, I am not receiving any packets.
Please let me know whether I am doing this correctly.
You may or may not know this, but Android has built-in support for RTSP via the VideoView.
http://developer.android.com/reference/android/widget/VideoView.html
This may cut down on your development time...or it may be totally useless if you're trying to roll your own RTSP stack.
RTSP is only used to start the streaming. It gives you an SDP description of the actual streams. You then have to manage an RTCP connection and an RTP connection per channel (audio/video). The ports to use are the client_port ones.
It is pretty complex to code an RTSP/RTCP/RTP stack from scratch. You can have a look at the live555 library, which implements such a stack in C++.
Put a sniffer on the network; you should see UDP packets with destination port 4568 targeted at your IP address.
With a decent sniffer you will also be able to see the RTSP dialog. Maybe you are missing something in the answers.
You should also check the content of the SETUP response, to see if the ports you requested were accepted.
Things to check:
Listening in UDP (see the quick check below).
Firewall rules.
The range of the PLAY request: don't specify any range, to be sure the server will play something.
If you are behind a router or firewall, you probably won't receive anything, because the router/firewall doesn't know what to do with the incoming UDP packets.
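As a quick check of the first two points, a throwaway UDP listener bound to the negotiated client_port will show whether anything reaches the machine at all (a Node.js sketch, for running the RTSP exchange from a machine where Node is available; port 4568 is taken from the SETUP above):

import dgram from "node:dgram";

// Bind the RTP client_port negotiated in SETUP and dump whatever arrives.
const sock = dgram.createSocket("udp4");
sock.on("message", (msg, rinfo) => {
  // An RTP packet starts with version 2, so its first byte is 0x80-0xbf.
  console.log(`${msg.length} bytes from ${rinfo.address}:${rinfo.port}, first byte 0x${msg[0].toString(16)}`);
});
sock.bind(4568, () => console.log("listening on UDP 4568"));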
Try first with a local Darwin Streaming Server installed within your LAN; that way the firewall won't matter and streaming will work.
If you want to try an external server, then:
1) Check the client_ports mentioned in the server's response; some servers suggest ports different from the ones requested, and you have to use the ports suggested by the server.
2) If the ports are correct, you can send 64-byte empty packets from each of the UDP ports to the server (so-called "door openers"; a sketch follows below).
3) If the above two don't fix it, check the server-side logs; the server might be closing the UDP ports.
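A sketch of the "door opener" in step 2, again in Node.js for brevity (the server host and port are placeholders; send from the same local port you offered as client_port):

import dgram from "node:dgram";

const SERVER_HOST = "198.51.100.7"; // placeholder: your streaming server
const SERVER_RTP_PORT = 6970;       // placeholder: server_port from the SETUP response

// Send a small empty datagram from the local RTP port towards the server
// so the NAT/firewall creates a mapping for the return traffic.
const sock = dgram.createSocket("udp4");
sock.bind(4568, () => {
  sock.send(Buffer.alloc(64), SERVER_RTP_PORT, SERVER_HOST, (err) => {
    if (err) console.error(err);
    // Keep the socket open: the RTP stream should arrive on this same port.
  });
});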