I'm trying to stream video using GStreamer to HTML using BusyBox in PetaLinux, on a ZynqMP-based platform.
The main idea is to stream a video pipeline through an embedded web server and open it from an external PC web browser (I'm using Chrome).
The index.html file looks like this:
<!DOCTYPE html>
<html>
<body>
<p>video test!</p>
<video autoplay controls width="1280" height="1024">
<source src="http://192.168.1.12:5000" type="video/mp4">
Your browser does not support the video tag.
</video>
</body>
</html>
And the testing pipeline is:
gst-launch-1.0 -v videotestsrc pattern=snow is-live=true ! video/x-raw,width=1280,height=1024 ! theoraenc ! oggmux ! tcpserversink host=192.168.1.12 port=5000
I also tried to stream live video from a camera in a C application, like so:
appsrc -> queue -> omxh264 -> h264parser -> mp4mux (streamable=true fragment-duration = 100) -> queue -> tcpserversink (host=192.168.1.12 port=5000)
(Please note that the above line is a pseudo code to illustrate what the application is doing)
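Spelled out as a gst-launch line, the pseudo-pipeline above would look roughly like this (a sketch only: omxh264enc is assumed to be the ZynqMP hardware encoder element, and videotestsrc stands in for the appsrc that feeds camera frames):

```shell
# Hedged sketch of the pseudo-pipeline above; element names are assumptions
gst-launch-1.0 -v videotestsrc is-live=true \
    ! video/x-raw,width=1280,height=1024 ! queue \
    ! omxh264enc ! h264parse \
    ! mp4mux streamable=true fragment-duration=100 \
    ! queue ! tcpserversink host=192.168.1.12 port=5000
```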
I'm expecting to see the video in the PC web browser, but unfortunately I get a grey box with the video controls instead.
Does anyone know if this is possible using BusyBox? Or is there something fundamentally wrong with this approach?
I have a working pipeline that begins with:
gst-launch-1.0 rtspsrc location=rtsp://Test_Map:T3$t1ng_2022_Map#x.x.x.x:zzzz/ISAPI/Streaming/Channels/101
which gives an Unauthorized error, but VLC can open the same stream.
Any ideas?
gst-launch should be able to open the RTSP stream.
You should pass the user-id and user-pw properties with proper quoting;
then it will work.
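For example (a sketch, with the address and path elided as in the question; single quotes stop the shell from expanding the $ in the password):

```shell
# Pass credentials as rtspsrc properties instead of embedding them in the
# URL -- the '#' in the password would otherwise be parsed as a URI fragment
gst-launch-1.0 rtspsrc \
    location='rtsp://x.x.x.x:zzzz/ISAPI/Streaming/Channels/101' \
    user-id='Test_Map' user-pw='T3$t1ng_2022_Map' \
    ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
```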
I have a healthy stream being sent to AWS IVS.
When using the very same JavaScript code given by AWS to play the stream, it's not working.
I got the code from here:
<script src="https://player.live-video.net/1.7.0/amazon-ivs-player.min.js"></script>
<video id="video-player" playsinline></video>
<script>
if (IVSPlayer.isPlayerSupported) {
const player = IVSPlayer.create();
player.attachHTMLVideoElement(document.getElementById('video-player'));
player.load("https://b9ce18423ed5.someregion.playback.live-video.net/api/video/v1/someregion.242198732211.channel.geCBmnQ6exSM.m3u8");
player.play();
}
</script>
The playback URL is coming from the IVS channel.
When running this code, nothing happens, and the video tag's source is set to:
<video id="video-player" playsinline="" src="blob:null/b678db19-6b9a-42fc-979e-1e0eda4a3b46"></video>
There is no code from my side; it's only AWS code. Is this a bug, or am I doing something wrong?
Thanks.
You can confirm if the stream is healthy using the AWS Management console. If it's loading in the Live stream tab, then it should play in the AWS IVS player that you've integrated.
I used the code below, and upon successful streaming it loaded fine.
<!DOCTYPE html>
<html>
<head>
<!-- Set reference to the AWS IVS Player -->
<script src="https://player.live-video.net/1.8.0/amazon-ivs-player.min.js">
</script>
</head>
<body>
<!-- Create the video tag -->
<video id="video-player" playsinline controls muted height="500" width="700"></video>
<script>
// Once the AWS IVS Player is loaded, it creates a global object, "IVSPlayer".
// We use it to create a player instance that loads the playback URL and
// plays it in the connected video element.
if (IVSPlayer.isPlayerSupported) {
    const player = IVSPlayer.create();
    player.attachHTMLVideoElement(document.getElementById('video-player'));
    player.load("*PLACE_YOUR_PLAYBACK_URL_HERE{.m3u8 extension}*");
    player.play();
} else {
    console.warn("Error: Browser not supported!");
}
</script>
</body>
</html>
Here is the reason why, in my case, the IVS stream didn't play; maybe it will help someone else.
It was not playing because the streamed video was completely black, so the player treated the stream as "empty". Once there was something in the video, it played properly.
I'm using rtp_forward from the videoroom plugin in Janus-Gateway to stream WebRTC.
My target pipeline looks like this:
WebRTC --> Janus-Gateway --> (RTP_Forward) MediaLive RTP_Push Input
I've achieved this:
WebRTC --> Janus-Gateway --> (RTP-Forward) Janus-Gateway [Streaming Plugin]
I've tried multiple rtp_forward requests, like:
register = {"request": "rtp_forward", "publisher_id": 8097546391494614, "room": 1234, "video_port": 5000, "video_ptype": 100, "host": "medialive_rtp_input", "secret": "adminpwd"}
But MediaLive just doesn't receive any stream. Is there anything I'm missing?
I'm not familiar with AWS MediaLive: initially I thought that, since most media servers like this expect RTMP and not RTP, that was the cause of the issue, but it looks like it does indeed support a plain RTP input mode. At this point this is very likely a codec issue: probably MediaLive doesn't support the codecs your browser is sending (opus and vp8?). Looking at the supported codecs, this seems to be the issue: https://docs.aws.amazon.com/medialive/latest/ug/inputs-supported-containers-and-codecs.html
You can probably get video working if you use H.264 in the browser, but audio is always Opus and definitely not AAC, so you'll need an intermediate node to do transcoding.
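Such an intermediate node could be sketched with GStreamer (all ports, payload types and the endpoint address below are placeholders, not verified): receive the forwarded VP8/Opus RTP, transcode to H.264/AAC, and push MPEG-TS over RTP to the MediaLive input:

```shell
# Sketch only: placeholder ports/payload types; MEDIALIVE_HOST stands for
# the RTP push endpoint configured on the MediaLive input
gst-launch-1.0 -v \
    udpsrc port=5000 caps="application/x-rtp,media=video,encoding-name=VP8,payload=100" \
        ! rtpvp8depay ! vp8dec ! videoconvert ! x264enc tune=zerolatency ! mux. \
    udpsrc port=5002 caps="application/x-rtp,media=audio,encoding-name=OPUS,payload=111" \
        ! rtpopusdepay ! opusdec ! audioconvert ! avenc_aac ! mux. \
    mpegtsmux name=mux ! rtpmp2tpay ! udpsink host=MEDIALIVE_HOST port=5000
```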
Since you're using RTP push: are you pushing the stream to the correct RTP endpoint provided by AWS? If so, you will see alerts in the health check if MediaLive received the stream but failed to read it or found it corrupted. You'll see errors on whichever of the pipelines you're pushing the stream to; if you don't see anything at all, that suggests a network problem. Try RTMP instead, since it runs over TCP, and you should at least see something in a packet capture.
https://docs.aws.amazon.com/medialive/latest/ug/monitoring-console.html
I have found a way to stream the videotestsrc element over the internet by calling these functions in the test-video.c file of the gst-rtsp-server examples directory:
gst_rtsp_server_set_address(server,"10.xxx.xx.xxx");
gst_rtsp_server_set_service(server,"8555");
and unblocking the ports in the AWS console.
But now, replacing videotestsrc in test-video.c with the following:
gst_rtsp_media_factory_set_launch (factory, "( "
"filesrc location=/home/ubuntu/DELTA.mpg ! mpeg2dec ! x264enc ! rtph264pay name=pay0 pt=96 "
")");
I am not able to get the stream to show up in the VLC player.
Running the command below produces the following output:
ubuntu@ip-xx-xxx-xx-xxx:~/gst-rtsp-server-1.2.3/examples$ /usr/local/bin/gst-launch-1.0 playbin uri=rtsp://10.xxx.xx.xxx:8555/test
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://10.185.10.118:8555/test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
(lt-test-video1:9027): GLib-CRITICAL **: unblock_source: assertion `!SOURCE_DESTROYED (source)' failed
WARNING: from element /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source: Could not read from resource.
Additional debug info:
gstrtspsrc.c(4367): gst_rtspsrc_reconnect (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source:
Could not receive any UDP packets for 5.0000 seconds, maybe your firewall is blocking it. Retrying using a TCP connection.
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:22.473442001
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
ubuntu@ip-10-xxx-xx-xxx:~/gst-rtsp-server-1.2.3/examples$
What am I doing wrong here?
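One thing worth checking: the launch line above has no demuxer, and x264enc needs raw video in a format it accepts. A hedged way to sanity-check the encode path outside the RTSP server (decodebin and videoconvert added as assumptions; file path taken from the question):

```shell
# Local sanity check of the media factory launch line: an MPEG program
# stream needs demuxing before decoding, which decodebin handles
gst-launch-1.0 -v filesrc location=/home/ubuntu/DELTA.mpg ! decodebin \
    ! videoconvert ! x264enc tune=zerolatency ! rtph264pay pt=96 ! fakesink
```

The log itself ("Could not receive any UDP packets... maybe your firewall is blocking it") also points at the RTP/UDP ports, which may need opening in the AWS console alongside 8555.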
I have the following problem: jackaudiosrc automatically connects to the first JACK ports, my capture_1 and capture_2.
I set the option connect=0, but that is not quite what I want. What I want is that, when I start the script, jackaudiosrc automatically connects to a different port. The original script is here:
gst-launch v4l2src device=/dev/video0 ! video/x-raw-yuv,width=320,height=240 ! queue ! videorate ! ffmpegcolorspace ! tee name=tscreen ! queue ! autovideosink tscreen. ! queue ! theoraenc quality=16 ! queue ! oggmux name=mux jackaudiosrc connect=0 ! audio/x-raw-float,channels=2 ! queue ! audioconvert ! vorbisenc quality=0.2 ! queue ! mux. mux. ! queue ! shout2send ip=xxx port=xxx mount=test.ogg password=xxxxx name= description= genre= url=
I have a program, aj-snapshot, that writes an XML file containing the connections I use:
<jack>
  <client name="idjc_default">
    <port name="str_out_l">
      <connection port="idjc_default:output_in_l" />
      <connection port="camstream1.py:in_jackaudiosrc0_1" />
    </port>
    <port name="str_out_r">
      <connection port="idjc_default:output_in_r" />
      <connection port="camstream1.py:in_jackaudiosrc0_2" />
    </port>
  </client>
</jack>
My question is: how can I make jackaudiosrc in the gst-launch command connect to these ports automatically when I start my script?
I don't think it is possible to do this from gst-launch. You could write a small application that uses gst_parse_launch and talks to JACK to set up the connections.
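As a rough shell-level workaround sketch (assuming jackaudiosrc exposes a client-name property and the jack_connect example client is installed), the pipeline could be started unconnected and the ports wired by hand, using the names from the aj-snapshot dump above:

```shell
# Sketch: fixed client name, no auto-connect, then patch ports manually
# (client-name support and port names are assumptions from the question)
gst-launch ... jackaudiosrc connect=0 client-name=camstream ... &
sleep 2   # give the JACK client time to register its ports
jack_connect idjc_default:str_out_l camstream:in_jackaudiosrc0_1
jack_connect idjc_default:str_out_r camstream:in_jackaudiosrc0_2
```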