Live RTMPS stream to Wowza

I set up a Wowza server and succeeded in streaming RTMP to it from an Android device using a third-party library.
The library can also stream RTMPS.
I checked the Wowza documentation and I didn't see that it can ingest RTMPS or any other encrypted stream (although it can stream RTMPS to players).
Is it possible to ingest an RTMPS live stream with Wowza?

If the Android library you are using is attempting to push the stream into Wowza, that would not be supported. If instead you are somehow having Wowza pull the stream from the device, that should likely work.

Using StreamLock you can deliver RTMPS to the player.
Option 1: a self-signed certificate, or one requested from a certificate authority: https://www.wowza.com/docs/how-to-request-an-ssl-certificate-from-a-certificate-authority
Option 2: StreamLock, where you will get two StreamLock .JKS files from Wowza if you are using a subscription-based Streaming Engine.
After configuring StreamLock you can deliver RTMPS from Wowza.
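As a rough illustration, the SSL listener is added as an extra HostPort entry in conf/VHost.xml; this is only a sketch, the certificate file name and password below are placeholders, and a real config carries additional elements:

<HostPort>
    <ProcessorCount>${com.wowza.wms.TuningAuto}</ProcessorCount>
    <IpAddress>*</IpAddress>
    <Port>443</Port>
    <SSLConfig>
        <KeyStorePath>${com.wowza.wms.context.VHostConfigHome}/conf/mydomain.streamlock.net.jks</KeyStorePath>
        <KeyStorePassword>changeme</KeyStorePassword>
        <KeyStoreType>JKS</KeyStoreType>
    </SSLConfig>
</HostPort>

With that in place, players connect over rtmps:// on port 443 instead of rtmp:// on 1935.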

Related

Stream Amazon Connect audio in real time with KVS

I have a contact flow in AWS Connect with customer audio streaming enabled. I get the customer audio stream in KVS and can read bytes from the stream and convert them to an audio file in Java after the call is completed, using the examples provided by AWS.
But I want to stream the audio to a web page for real-time monitoring, exactly like the real-time monitoring AWS provides in the built-in CCP.
I have the stream ARN and other contact data. How can I use that stream for real-time monitoring/streaming?
Any heads up will be appreciated.
You're going to want to use a WebRTC client in the browser/page where you want to monitor and control the stream. AWS provides a WebRTC SDK for Kinesis Video Streams that can be used for this. The SDK documentation can be found here, and it includes a link to samples and configuration details on GitHub.

How to send an RTSP stream with the newest FFmpeg (not via the command line)

Can anybody give me an example, or point me to one, of using the FFmpeg library to capture USB video (or capture an internet RTSP stream) and send it to our own server as an RTSP stream? Thanks very much for your answer.
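A minimal sketch of one way to do this with the FFmpeg libraries (libavformat): open the input, copy the stream parameters into an output context that uses the "rtsp" muxer, and forward packets without re-encoding. The URLs are placeholders, error handling is reduced to bare returns, and capturing directly from a USB device would additionally need libavdevice.

// Minimal remux sketch: pull an RTSP (or other) input and republish it to an
// RTSP server via stream copy, i.e. no decoding or encoding.
// URLs are placeholders; error handling is reduced to bare returns.
extern "C" {
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
}

int main() {
    const char *in_url  = "rtsp://camera.example.com/stream";       // placeholder input
    const char *out_url = "rtsp://myserver.example.com/live/test";  // placeholder destination

    avformat_network_init();

    AVFormatContext *in_ctx = nullptr;
    if (avformat_open_input(&in_ctx, in_url, nullptr, nullptr) < 0) return 1;
    if (avformat_find_stream_info(in_ctx, nullptr) < 0) return 1;

    // The "rtsp" muxer publishes to the destination via RTSP ANNOUNCE/RECORD
    AVFormatContext *out_ctx = nullptr;
    avformat_alloc_output_context2(&out_ctx, nullptr, "rtsp", out_url);
    if (!out_ctx) return 1;

    // Mirror every input stream on the output and copy its codec parameters
    for (unsigned i = 0; i < in_ctx->nb_streams; i++) {
        AVStream *out_st = avformat_new_stream(out_ctx, nullptr);
        if (!out_st) return 1;
        avcodec_parameters_copy(out_st->codecpar, in_ctx->streams[i]->codecpar);
        out_st->codecpar->codec_tag = 0;
    }

    if (avformat_write_header(out_ctx, nullptr) < 0) return 1;

    AVPacket *pkt = av_packet_alloc();
    while (av_read_frame(in_ctx, pkt) >= 0) {
        // Rescale timestamps from the input time base to the one chosen by the muxer
        av_packet_rescale_ts(pkt,
                             in_ctx->streams[pkt->stream_index]->time_base,
                             out_ctx->streams[pkt->stream_index]->time_base);
        if (av_interleaved_write_frame(out_ctx, pkt) < 0) break;
        av_packet_unref(pkt);
    }

    av_write_trailer(out_ctx);
    av_packet_free(&pkt);
    avformat_close_input(&in_ctx);
    avformat_free_context(out_ctx);
    avformat_network_deinit();
    return 0;
}

Link against libavformat, libavcodec and libavutil. For a USB camera, the input would instead come from a libavdevice input format (v4l2 on Linux, dshow on Windows) after calling avdevice_register_all().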

Receive an RTSP stream using GStreamer

I want to receive an RTSP stream using GStreamer. I know rtspsrc can be used for this purpose, but the problem is that it only receives the stream as a client. In my case I have an FFmpeg application which streams the video as a client and waits for a server to connect to it before streaming. So I want GStreamer to act as the server and receive the stream from FFmpeg.
I haven't used it myself, but I believe there is a separate package for RTSP server functionality. On Debian-based systems it should be under something like:
libgstrtspserver-0.10-0
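For the "GStreamer acting as the server and receiving a pushed stream" part, newer gst-rtsp-server releases (the 1.x API, rather than the 0.10 package named above) support RECORD mode, where a client such as ffmpeg can ANNOUNCE and push a stream to a mount point. A rough sketch, assuming the pushed stream is something decodebin can handle; the port and mount point are arbitrary:

#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>

int main(int argc, char *argv[]) {
    gst_init(&argc, &argv);
    GMainLoop *loop = g_main_loop_new(NULL, FALSE);

    GstRTSPServer *server = gst_rtsp_server_new();
    gst_rtsp_server_set_service(server, "8554");   // listen on port 8554

    GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new();
    // Accept a stream pushed by a client (RTSP ANNOUNCE/RECORD) instead of serving one
    gst_rtsp_media_factory_set_transport_mode(factory, GST_RTSP_TRANSPORT_MODE_RECORD);
    // Pipeline that consumes the pushed stream; here it is simply decoded and displayed
    gst_rtsp_media_factory_set_launch(factory,
        "( decodebin name=depay0 ! videoconvert ! autovideosink )");

    GstRTSPMountPoints *mounts = gst_rtsp_server_get_mount_points(server);
    gst_rtsp_mount_points_add_factory(mounts, "/publish", factory);
    g_object_unref(mounts);

    gst_rtsp_server_attach(server, NULL);
    g_print("Waiting for a pushed stream at rtsp://127.0.0.1:8554/publish\n");
    g_main_loop_run(loop);
    return 0;
}

The FFmpeg side can then push with something like ffmpeg -re -i input.mp4 -c copy -f rtsp rtsp://127.0.0.1:8554/publish. Build against the gstreamer-rtsp-server-1.0 pkg-config package.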

Wowza Streaming live server data: two questions

Does Wowza Streaming Server (used as a live streaming service) care about the direction of the data stream?
Is it possible for Wowza Server to work in a server-side mode and in a client-side mode?
Since I am trying to send the data stream via LTE, the cost of sending it is very high, so I am wondering if it is possible to have live stream data flow only when a request is present.
Thank you
It's not 100% clear to me what you mean by "Wowza acting as a client". Wowza can be configured as an edge instance that pulls the stream from an origin Wowza server, which is useful when you want multiple, possibly geographically distributed, stream sources.
Or, in the other case, Wowza accepts the stream from encoders, in which case the data also flows towards Wowza. This is what happens when you install, for example, Flash Media Live Encoder or VLC and push your encoded stream "into" Wowza, which then distributes it to all the players.
But obviously Wowza won't act as a video player for you. :-)
Can you clarify your first question, maybe I can give a better answer then.
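If the underlying goal of the LTE question is to move data only while someone is actually watching, the edge/origin (live repeater) setup mentioned above behaves that way: the edge pulls the stream from the origin only while players are connected. A rough, illustrative sketch of the relevant parts of the edge application's Application.xml, with a placeholder origin address (the origin application would use liverepeater-origin; other elements are omitted here):

<!-- Edge application: pulls from the origin only while players are connected -->
<Streams>
    <StreamType>liverepeater-edge</StreamType>
</Streams>
<Repeater>
    <OriginURL>rtmp://origin.example.com:1935/liveorigin</OriginURL>
    <QueryString><![CDATA[]]></QueryString>
</Repeater>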

OpenCV and RTMP

I want to be able to create an application that can read and publish an RTMP stream.
Using OpenCV I could read RTP due to its FFmpeg backend.
Stream video from ffmpeg and capture with OpenCV
The C++ RTMP Server is another possibility, but it is an RTMP server, so it mainly requests and sends files. Although it is open source, I am unsure how to build or integrate it into a Visual Studio application in such a way as to make its function calls available to my project.
Other sources indicate that OpenCV's RTSP support isn't great.
http://workingwithcomputervision.blogspot.co.nz/2012/06/issues-with-opencv-and-rtsp.html
How can you run a streaming server, such as the C++ RTMP Server, and get the raw data out? OpenCV can encode and decode image data for streaming, but how can you link the two?
Could a C++ application pipe a stream together? How could I interface with that stream to send it more images? And to receive images?
Regards,
crtmpserver and librtmp work well.
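For the reading side, a minimal sketch assuming an OpenCV build with the FFmpeg backend (OpenCV 3+ API); the URL is a placeholder:

#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
    // Works for rtmp:// and rtsp:// URLs when OpenCV is built with FFmpeg support
    cv::VideoCapture cap("rtmp://server.example.com/live/stream", cv::CAP_FFMPEG);
    if (!cap.isOpened()) {
        std::cerr << "Failed to open the stream" << std::endl;
        return 1;
    }

    cv::Mat frame;
    while (cap.read(frame)) {
        // frame is a decoded BGR image; process it or hand it to a publisher here
        cv::imshow("stream", frame);
        if (cv::waitKey(1) == 27) break;   // stop on Esc
    }
    return 0;
}

For the publishing side, cv::VideoWriter is not, as far as I know, a reliable way to push to an rtmp:// URL, so frames are usually handed to librtmp or piped into an ffmpeg process that muxes them to FLV and pushes them to the RTMP server, as the answer above suggests.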