Why doesn't my AWS IVS stream play?

I have a healthy stream being sent to AWS IVS.
When I use the exact JavaScript code AWS provides to play the stream, it doesn't work.
I got the code from here:
<script src="https://player.live-video.net/1.7.0/amazon-ivs-player.min.js"></script>
<video id="video-player" playsinline></video>
<script>
  if (IVSPlayer.isPlayerSupported) {
    const player = IVSPlayer.create();
    player.attachHTMLVideoElement(document.getElementById('video-player'));
    player.load("https://b9ce18423ed5.someregion.playback.live-video.net/api/video/v1/someregion.242198732211.channel.geCBmnQ6exSM.m3u8");
    player.play();
  }
</script>
The playback URL comes from the IVS channel.
When I run this code, nothing happens, and the video tag's source is set to:
<video id="video-player" playsinline="" src="blob:null/b678db19-6b9a-42fc-979e-1e0eda4a3b46"></video>
There is no code of my own here; it's only AWS code. Is this a bug, or am I doing something wrong?

You can confirm that the stream is healthy using the AWS Management Console. If it loads in the Live stream tab, it should also play in the AWS IVS player you've integrated.
I used the code below, and once streaming started successfully, it loaded fine.
<!DOCTYPE html>
<html>
<body>
  <!-- Set reference to the AWS IVS Player -->
  <script src="https://player.live-video.net/1.8.0/amazon-ivs-player.min.js"></script>
  <!-- Create the video tag -->
  <video id="video-player" playsinline controls muted height="500" width="700"></video>
  <script>
    // Once the AWS IVS Player script is loaded, it exposes a global
    // object, "IVSPlayer". Use it to create a player instance that
    // loads the playback URL and plays it in the attached video element.
    if (IVSPlayer.isPlayerSupported) {
      const player = IVSPlayer.create();
      player.attachHTMLVideoElement(document.getElementById('video-player'));
      player.load("*PLACE_YOUR_PLAYBACK_URL_HERE{.m3u8 extension}*");
      player.play();
    } else {
      console.warn("Error: Browser not supported!");
    }
  </script>
</body>
</html>
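If the player still shows nothing, its state and error events usually say why. A diagnostic sketch, assuming the standard IVS Player event API (`IVSPlayer.PlayerState` / `IVSPlayer.PlayerEventType`); call it from the page after `IVSPlayer.create()`:

```javascript
// Attach state and error listeners so the browser console shows why
// playback never starts (e.g. a playback-URL or codec problem).
function attachDiagnostics(player, IVSPlayer) {
  const { PlayerState, PlayerEventType } = IVSPlayer;
  const states = [PlayerState.IDLE, PlayerState.READY, PlayerState.BUFFERING,
                  PlayerState.PLAYING, PlayerState.ENDED];
  for (const state of states) {
    player.addEventListener(state, () => console.log('player state:', state));
  }
  player.addEventListener(PlayerEventType.ERROR,
    (err) => console.error('player error:', err));
}

// In the page above, right after IVSPlayer.create():
//   attachDiagnostics(player, IVSPlayer);
```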

Here is why, in my case, the IVS stream didn't play. Maybe it will help someone else.
The video being streamed was completely black, so the player treated the stream as "empty". Once the video contained actual content, it played properly.

Related

How to stream microphone audio from the browser to S3

I want to stream microphone audio from the web browser to AWS S3.
I got it working with:
this.recorder = new window.MediaRecorder(...);
this.recorder.addEventListener('dataavailable', (e) => {
  this.chunks.push(e.data);
});
and then, when the user clicks stop, I upload the chunks (new Blob(this.chunks, { type: 'audio/wav' })) as a multipart upload to AWS S3.
The problem is that if the recording is 2-3 hours long, the upload can take exceptionally long, and the user might close the browser before it completes.
Is there a way we can stream the web audio directly to S3 while it's going on?
Things I tried but couldn't get a working example for:
Kinesis Video Streams: it looks like it's only for real-time streaming between multiple clients, and I would have to write my own client to save the stream to S3.
I thought of using Kinesis Data Firehose, but couldn't find any data-producer client for the browser.
I even looked at AWS Lex and AWS IVS, but I think they are over-engineered for my use case.
Any help will be appreciated.
You can set the timeslice parameter when calling start() on the MediaRecorder. The MediaRecorder will then emit chunks which roughly match the length of the timeslice parameter.
You could upload those chunks using S3's multipart upload feature as you already mentioned.
Please note that you need a library like extendable-media-recorder if you want to record a WAV file since no browser supports that out of the box.
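Putting the two suggestions together, here is a sketch that buffers timesliced chunks until they reach S3's 5 MiB minimum size for non-final multipart parts, then hands each batch to an uploader. `uploadPart` is a placeholder for your S3 UploadPart call, and the `startRecording` wiring at the bottom is browser-only:

```javascript
const MIN_PART_SIZE = 5 * 1024 * 1024; // S3 minimum for non-final parts

// Accumulates recorder chunks and flushes them as numbered parts once
// enough bytes have been collected.
class PartBuffer {
  constructor(uploadPart) {
    this.uploadPart = uploadPart; // (partNumber, chunks, isFinal) => ...
    this.chunks = [];
    this.size = 0;
    this.partNumber = 1;
  }
  add(chunk, byteLength) {
    this.chunks.push(chunk);
    this.size += byteLength;
    if (this.size >= MIN_PART_SIZE) this.flush(false);
  }
  flush(isFinal) {
    if (this.chunks.length === 0) return;
    this.uploadPart(this.partNumber++, this.chunks, isFinal);
    this.chunks = [];
    this.size = 0;
  }
}

// Browser wiring (not exercised here): emit a chunk roughly every 10 s.
function startRecording(stream, buffer) {
  const recorder = new MediaRecorder(stream);
  recorder.addEventListener('dataavailable', (e) => buffer.add(e.data, e.data.size));
  recorder.addEventListener('stop', () => buffer.flush(true));
  recorder.start(10000); // timeslice in milliseconds
  return recorder;
}
```

On stop, the final flush may be smaller than 5 MiB, which S3 allows for the last part of a multipart upload.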

Cannot achieve 1000 concurrent function requests with Google Cloud Functions

I am trying to run an RNA-Seq application with Google Cloud Functions. To run this application I need to be able to have over 800 functions running concurrently. This has been achieved using AWS Lambda, but I have not been able to do this on Google Cloud Functions.
When I attempt to run hundreds of basic HTTP requests with the default HTTP trigger, I start getting tons of 500 errors:
<html><head>
<meta http-equiv="content-type" content="text/html;charset=utf-8">
<title>500 Server Error</title>
</head>
<body text=#000000 bgcolor=#ffffff>
<h1>Error: Server Error</h1>
<h2>The server encountered an error and could not complete your request.<p>Please try again in 30 seconds.</h2>
<h2></h2>
</body></html>
If I check the logs for my function, I see no error messages! The cloud console makes it seem like everything is perfect even though my requests are failing.
How should I go about diagnosing this problem? It looks like something is wrong on Google's end, since my code works fine when requests do go through. Does Google limit the number of HTTP requests you can make?
Any help would be really appreciated.
The scaling limits for HTTP functions are different from those for background functions. Please read the documentation about scalability to be clear on the limits.
Background functions scale gradually, up to 1000 concurrent invocations. Since you're writing an HTTP function, that limit does not apply.
For HTTP functions, note that rates are limited by the amount of outbound network bandwidth generated by the function (among other things). You will have to take a close look at what your function is actually doing to figure out if it's exceeding the documented rate limits.
If you can limit what the function is doing internally to meet the scalability limits, one thing you can try is to shard your functions. Instead of one HTTP function, create two, and split traffic between them. The stated limits are per-function (not per-project), so you should be able to handle more load that way.
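The sharding idea above can be sketched client-side: deploy the same code as two functions and pick one deterministically per request key, so retries of the same work hit the same function. The URLs below are hypothetical placeholders:

```javascript
// Hypothetical URLs of two identical Cloud Functions deployments.
const SHARD_URLS = [
  'https://us-central1-my-project.cloudfunctions.net/process-a',
  'https://us-central1-my-project.cloudfunctions.net/process-b',
];

// Deterministically map a request key to one of the shard URLs using a
// simple string hash, so the per-function rate limits are split evenly.
function pickShard(key, urls) {
  let hash = 0;
  for (const ch of key) hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  return urls[hash % urls.length];
}
```

Each request would then be sent with `fetch(pickShard(sampleId, SHARD_URLS), ...)` instead of a single fixed URL.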

Gstreamer to html with busybox

I'm trying to stream video with GStreamer to an HTML page served by BusyBox under PetaLinux, on a ZynqMP-based platform.
The main idea is to stream a video pipeline to an embedded web server and open it from an external PC's web browser (I'm using Chrome).
The index.html file looks like this:
<!DOCTYPE html>
<html>
<body>
  <p>video test!</p>
  <video autoplay controls width="1280" height="1024">
    <source src="http://192.168.1.12:5000" type="video/mp4">
    Your browser does not support the video tag.
  </video>
</body>
</html>
And the testing pipeline is:
gst-launch-1.0 -v videotestsrc pattern=snow is-live=true ! video/x-raw,width=1280,height=1024 ! theoraenc ! oggmux ! tcpserversink host=192.168.1.12 port=5000
I also tried to stream live camera video from a C application, like so:
appsrc -> queue -> omxh264 -> h264parser -> mp4mux (streamable=true fragment-duration = 100) -> queue -> tcpserversink (host=192.168.1.12 port=5000)
(Please note that the above line is a pseudo code to illustrate what the application is doing)
I expect to see the video in the PC web browser, but unfortunately I get a grey box with the video controls instead.
Does anyone know whether this is possible with BusyBox, or is there something fundamentally wrong with this approach?

Janus-Gateway RTP-Forward to send stream to AWS Elemental MediaLive

I'm using rtp_forward from the videoroom plugin in Janus-Gateway to stream WebRTC.
My target pipeline looks like this:
WebRTC --> Janus-Gateway --> (RTP_Forward) MediaLive RTP_Push Input
I've achieved this:
WebRTC --> Janus-Gateway --> (RTP-Forward) Janus-Gateway [Streaming Plugin]
I've tried multiple rtp_forward requests, like:
register = {"request": "rtp_forward", "publisher_id": 8097546391494614, "room": 1234, "video_port": 5000, "video_ptype": 100, "host": "medialive_rtp_input", "secret": "adminpwd"}
But MediaLive just doesn't receive any stream. Is there anything I'm missing?
I'm not familiar with AWS MediaLive: I initially thought that, since most media servers like this expect RTMP rather than RTP, that was the cause of the issue, but it looks like MediaLive does indeed support a plain RTP input mode. At this point it is very likely a codec issue: MediaLive probably doesn't support the codecs your browser is sending (Opus and VP8?). Looking at the supported codecs, this seems to be the problem: https://docs.aws.amazon.com/medialive/latest/ug/inputs-supported-containers-and-codecs.html
You can probably get video working if you use H.264 in the browser, but audio is always Opus and definitely not AAC, so you'll need an intermediate node to do transcoding.
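One way to get the browser to send H.264, assuming Chrome's `RTCRtpTransceiver.setCodecPreferences` API, is to reorder the codec list so H.264 comes first before creating the offer. A sketch:

```javascript
// Reorder a codec capability list so H.264 entries come first; the
// browser then prefers H.264 when negotiating.
function preferH264(codecs) {
  const isH264 = (c) => c.mimeType.toLowerCase() === 'video/h264';
  return [...codecs.filter(isH264), ...codecs.filter((c) => !isH264(c))];
}

// Browser wiring (not exercised here): apply to the peer connection's
// video transceivers before createOffer().
function applyToConnection(pc) {
  const caps = RTCRtpSender.getCapabilities('video');
  for (const t of pc.getTransceivers()) {
    if (t.sender.track && t.sender.track.kind === 'video') {
      t.setCodecPreferences(preferH264(caps.codecs));
    }
  }
}
```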
Since you're using RTP push, are you pushing the stream to the correct RTP endpoint provided by AWS? If so, and MediaLive received the stream but failed to read it or found it corrupted, you will see alerts in the health check. You should see an error in whichever of the input pipelines you're pushing to; if you see nothing at all, that suggests a network problem. In that case, try RTMP instead, since it runs over TCP, and you should at least see something in a packet capture.
https://docs.aws.amazon.com/medialive/latest/ug/monitoring-console.html

Streaming private videos using jwplayer and amazon cloudfront

I use JWPlayer 6.7 as a client to show an .mp4 video. The video is hosted in an S3 bucket on Amazon and is only accessible through a private RTMP CloudFront distribution. This works fine for devices that support Flash (which can use RTMP), but it does not work for iOS devices, which can only use HTML5 video (and, as I have learned, HTML5 video does not support RTMP).
I use the code listed below. The fallback (the second file item in the sources list) needs to be HTTP instead of RTMP because of the HTML5 player. The distribution in the example below is the same in both cases, but I guess it cannot handle the HTTP call because it is an RTMP distribution (right?).
So the question is: how do I set up the Amazon CloudFront distributions to get this working? I would prefer to use the same mp4 file in the S3 bucket, and for the file to be streamed in the HTML5 player instead of downloaded (is that possible?). The video needs to be private (using a private distribution and requiring a key to view it) in both cases (RTMP and HTTP).
jwplayer('video').setup({
  playlist: [{
    image: '//d12q7hepqvd422.cloudfront.net/image.png',
    sources: [
      {file: 'rtmp://s3e5mnr1tue3qm.cloudfront.net/cfx/st/name&Key-Pair-Id=APKAIAS7DDQFOAHAHOTQ'},
      {file: 'http://s3e5mnr1tue3qm.cloudfront.net/cfx/st/name&Key-Pair-Id=APKAIAS7DDQFOAHAHOTQ'}
    ]
  }],
  primary: 'flash',
  flashplayer: '//d12q7hepqvd422.cloudfront.net/global/js/jwplayer6.7.4071/jwplayer.flash.swf?v=2',
  html5player: '//d12q7hepqvd422.cloudfront.net/global/js/jwplayer6.7.4071/jwplayer.htm5.js?v=2',
  width: '940',
  height: '403'
});