Why isn't this gstreamer/hls.js configuration working on all browsers?

I am trying to get a live RTSP feed from a webcam to display on a website. I have a Linux server I am running gstreamer on and I am using hls.js to serve the feed up. I have followed a number of examples out there, but nothing I try can get this working across all browsers/devices. Here's what I have right now, and the results I am seeing.
Gstreamer config
This is my gstreamer script - I suspect the issue might be here with encoding settings, but I'm not sure what to try:
#!/bin/bash
gst-launch-1.0 -v -e rtspsrc protocols=tcp location=rtsp://XXX.XXX.XXX.XXX:XXXX/user=USER_password=PASSWORD_channel=1_stream=0.sdp?real_stream ! queue ! rtph264depay ! h264parse config-interval=-1 ! mpegtsmux ! hlssink location="%06d.ts" target-duration=5
index.html
Here is the webpage serving the feed up:
<!DOCTYPE html>
<head>
<title>Live Cam Test</title>
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
</head>
<body>
<video id="video" controls="controls" muted autoplay></video>
<script>
if (Hls.isSupported()) {
var video = document.getElementById('video');
var hls = new Hls();
// bind them together
hls.attachMedia(video);
hls.on(Hls.Events.MEDIA_ATTACHED, function () {
console.log("video and hls.js are now bound together !");
hls.loadSource("http://ServerName/live/playlist.m3u8");
hls.on(Hls.Events.MANIFEST_PARSED, function () { video.play(); });
});
}
</script>
</body>
</html>
Currently, this setup works the best in Chrome on Windows. The video is loaded, it autoplays, and it loads new segments as it plays, although it does seem to pause for a few seconds here and there and eventually gets a bit behind the live video.
On iOS devices, browsing to the index.html page doesn't work; I need to navigate directly to the playlist.m3u8 file instead. Once I do that, it appears to work pretty well.
On macOS, it doesn't appear to work in any browser I've tried (Chrome, Safari, Brave). I get inconsistent results: sometimes it loads a single frame of the video and stops, sometimes it doesn't load anything.
I have tried the tutorials and code examples from hls.js's documentation and still no dice, so I think I must be doing something wrong in my gstreamer setup. Any help is much appreciated!
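For reference, the hls.js README also shows a fallback to the browser's native HLS support when Hls.isSupported() is false, which I haven't added yet; it would look roughly like this with my playlist URL (untested):
var video = document.getElementById('video');
if (Hls.isSupported()) {
    // ... hls.js setup as above ...
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari/iOS: no MediaSource support, but the m3u8 URL can be played natively
    video.src = "http://ServerName/live/playlist.m3u8";
    video.addEventListener('loadedmetadata', function () {
        video.play();
    });
}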

Related

adding audio to appsrc video pipeline

I'm using appsrc to generate an HLS stream, this is my successful pipeline:
appsrc->videoconvert->openh264enc->h264parse->mpegtsmux->hlssink
However, I'd like to generate some audio via audiotestsrc before mpegtsmux which would look like the following:
audiotestsrc->lamemp3enc->mpegtsmux
audiotestsrc and lamemp3enc have 'always' pads, so I link the two just like my other video elements.
When it comes to linking lamemp3enc's 'always' src pad to mpegtsmux's request pad "sink_%d", the return values say there's no issue:
//Returns 0
gst_pad_link(h264ParsePad, mpegtsmuxSinkPad);
//Returns 0
gst_pad_link(audioEncPad, mpegtsmuxSinkPad);
//Returns 0
gst_pad_link(mpegtsmuxSrcPad, hlssinkPad);
But running the app results in pipeline failure with
"Internal data stream error."
Removing the audioEncPad linking just makes the stream work like normal but of course without audio. How should I go about doing this?
A few things needed to be done:
1. Use aacparse
2. Clean the solution
3. Link voaacenc with aacparse
#2 caused me a lot of torment, since everything theoretically should've worked. D'oh.
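For what it's worth, a minimal sketch of the fixed audio branch in C, assuming the elements (audioSrc, audioEnc, aacParse, mpegtsmux) and the pipeline already exist, with error checking omitted:
/* audiotestsrc -> voaacenc -> aacparse, then aacparse's always src pad
 * into a request sink pad on mpegtsmux */
gst_bin_add_many (GST_BIN (pipeline), audioSrc, audioEnc, aacParse, NULL);
gst_element_link_many (audioSrc, audioEnc, aacParse, NULL);

GstPad *aacParseSrcPad = gst_element_get_static_pad (aacParse, "src");
GstPad *muxAudioPad    = gst_element_get_request_pad (mpegtsmux, "sink_%d");
gst_pad_link (aacParseSrcPad, muxAudioPad);   /* 0 == GST_PAD_LINK_OK */
gst_object_unref (aacParseSrcPad);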

Get gstreamer pipeline object pointer in source code

I am running the GStreamer command below for live streaming:
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 fps-n=30 imx-capture-mode=0 ! textoverlay name=overlay text="Overlay text here" valignment=top halignment=left font-desc="Sans, 22" ! gdkpixbufoverlay name=imageoverlay location=/home/user/LZ_50/CamOverlay.png ! imxg2dvideotransform ! imxg2dvideosink framebuffer=/dev/fb1 use-vsync=true sync=false
I want to change the text overlay dynamically in the running GStreamer pipeline.
How can I get the pipeline object pointer so I can change the text overlay dynamically?
Thanks!
I had written an application, but it was crashing when using the GStreamer pipeline with the image and text overlays.
I finally found the cause of the crash; more detail can be found at: imx Gstreamer plugins.
This also helps to reduce CPU usage (by almost 20% of one core).
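For the original question of getting a pipeline object pointer in source code, a common approach (a sketch, not part of the answer above) is to build the same pipeline with gst_parse_launch() and look the named textoverlay element up with gst_bin_get_by_name(); the pipeline string below is abbreviated from the command above:
GError *error = NULL;
GstElement *pipeline = gst_parse_launch (
    "imxv4l2videosrc device=/dev/video0 ! "
    "textoverlay name=overlay text=\"Overlay text here\" ! "
    "imxg2dvideotransform ! imxg2dvideosink framebuffer=/dev/fb1", &error);
gst_element_set_state (pipeline, GST_STATE_PLAYING);

/* Later, change the overlay text at runtime */
GstElement *overlay = gst_bin_get_by_name (GST_BIN (pipeline), "overlay");
g_object_set (overlay, "text", "New overlay text", NULL);
gst_object_unref (overlay);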

GST (gstreamer) command in QMediaPlayer command

I am using Qt Creator 4.5.2 (Qt 5.9.5, GCC 7.3.0 64-bit) running on Ubuntu 18.04. I am just trying to get a live video stream from an IP camera. I used the 'QGraphicsView', 'QGraphicsScene', 'QGraphicsVideoItem' and 'QMediaPlayer' classes.
Right now, the video streaming source is an IP camera and I am using 'QMediaPlayer' with RTSP to get the live video, and it works. However, for performance and other reasons, I need to change to a GStreamer-style pipeline command, like 'gst-launch-1.0', to get the live video. I am having trouble getting the correct 'gst-pipeline' string. Need help.
In the documentation for 'QMediaPlayer', it states: "Since Qt 5.12.2, the url scheme gst-pipeline provides custom pipelines for the GStreamer backend."
My version is 5.9.5 so I think the GStreamer type command should work.
Related Code and comments:
// Setup GraphicsScene
mpView = ui->gvCam;
mpView->setVisible(true);
mpScene = new QGraphicsScene;
mpView->setScene(mpScene);
mpScene->setSceneRect(0, 0, mpView->width(), mpView->height());
mpView->setSceneRect(QRectF());
// Setup IP camera
mpPlayer1 = new QMediaPlayer;
mpVideoItem1 = new QGraphicsVideoItem;
mpPlayer1->setVideoOutput(mpVideoItem1);
//The following line works and I got the live stream.
mpPlayer1->setMedia(QUrl("rtsp://20.0.2.118:8554/0"));
//However, I need to use a GST-style command, like:
//  gst-launch-1.0 rtspsrc location=rtsp://20.0.2.118:8554/0 ! decodebin ! videoscale \
//    ! 'video/x-raw, width=480, height=270, format=I420' \
//    ! xvimagesink sync=false force-aspect-ratio=false
//The above GST command worked when issued from the terminal and I got the live stream.
//But I don't know how to pass it as a 'gst-pipeline' string to the 'setMedia' call.
mpScene->addItem(mpVideoItem1);
QSizeF qf1(mpView->width(), mpView->height());
mpVideoItem1->setSize(qf1);
mpVideoItem1->setAspectRatioMode(Qt::IgnoreAspectRatio);
mpPlayer1->play();
If your Qt version is prior to 5.12.2 then a custom pipeline won't work with QMediaPlayer, because playbin is used instead.
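For reference, on Qt 5.12.2 or later the pipeline would be passed through the gst-pipeline URL scheme, roughly like the sketch below; qtvideosink is the sink the Qt documentation suggests when rendering into a Qt video surface such as QGraphicsVideoItem (untested here):
// Requires Qt >= 5.12.2 with the GStreamer media backend; on 5.9.5 playbin is used instead.
mpPlayer1->setMedia(QUrl("gst-pipeline: rtspsrc location=rtsp://20.0.2.118:8554/0 "
                         "! decodebin ! videoscale "
                         "! video/x-raw,width=480,height=270 "
                         "! qtvideosink"));
mpPlayer1->play();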

Streaming from dvblast to HLS using gstreamer

I have dvblast that is successfully multicasting an MPEG2 stream originating from DVB-T onto a network, and I am trying to pick up this multicast MPEG2 stream and convert it to HLS on a Raspberry Pi 2 using gstreamer v1.0 as follows:
gst-launch-1.0 udpsrc port=5004 multicast-group=239.255.1.30 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000" ! rtpbin ! rtpmp2tdepay ! tsdemux ! mpegvideoparse ! omxmpeg2videodec ! queue ! videoconvert ! omxh264enc ! mpegtsmux ! hlssink max-files=5 location=/var/www/stream/segment%05d.ts playlist-location=/var/www/stream/output.m3u8 playlist-root=http://192.168.225.2/stream/
The HLS files are successfully created, and are served via httpd successfully to mediastreamvalidator which is happy with the results:
Processed 7 out of 7 segments: OK
Segment bitrate: Average: 430.90 kbits/sec, Max: 741.38 kbits/sec
The MPEG2 license is in place and works.
Neither Safari nor an iPhone can view this stream; in both cases the play button appears but no video or audio plays. Eventually Safari will claim "Missing plugin". I am struggling to see where I have gone wrong and to find any documentation or examples for this specific scenario. Can anyone point out where in the pipeline this has gone wrong?
Discovered that the current gstreamer gst-omx code doesn't insert AU delimiters, and the following patch is required to make omxh264enc generate a stream that Safari and/or iOS will play:
https://bugzilla.gnome.org/show_bug.cgi?id=736211
Using the June 9 2015 version of mediastreamvalidator shows the following issues, but the stream does now play on Safari and iOS:
WARNING: Video segment does not contain an IDR frame
--> Track ID 1
ERROR: (-12642) Playlist vs segment duration mismatch
--> Segment duration 4.7600, Playlist duration: 2.4000

Gstreamer: Pausing/resuming video in RTP streams

I'm constructing a gstreamer pipeline that receives two RTP streams from a networked source:
ILBC Audio stream + corresponding RTCP stream
H263 Video stream + corresponding RTCP stream
Everything is put into one gstreamer pipeline so it will use the RTCP from both streams to synchronize audio/video. So far I've come up with this (using gst-launch for prototyping):
gst-launch -vvv gstrtpbin name=rtpbin
udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263-2000" port=40000 ! rtpbin.recv_rtp_sink_0
rtpbin. ! rtph263pdepay ! ffdec_h263 ! xvimagesink
udpsrc port=40001 ! rtpbin.recv_rtcp_sink_0
rtpbin.send_rtcp_src_0 ! udpsink port=40002 sync=false async=false
udpsrc caps="application/x-rtp,media=(string)audio,clock-rate=(int)8000,encoding-name=(string)PCMU,encoding-params=(string)1,octet-align=(string)1" port=60000 ! rtpbin.recv_rtp_sink_1
rtpbin. ! rtppcmudepay ! autoaudiosink
udpsrc port=60001 ! rtpbin.recv_rtcp_sink_1
rtpbin.send_rtcp_src_1 ! udpsink port=60002 sync=false async=false
This pipeline works well if the networked source starts out with sending both video and audio. If the videostream is paused later on, gstreamer will still playback audio and even will start playing back the video when the networked source resumes the video stream.
My problem, however, is that if the networked source starts out with only an audio stream (video might be added later on), the pipeline seems to pause/freeze until the video stream starts as well.
Since video is optional in my application (and can be added/removed at will by the user), is there any way I can hook up, for instance, a 'videotestsrc' that provides some kind of fallback video data to keep the pipeline running when there is no networked video data?
I've tried experimenting with 'videotestsrc' and a thing called 'videomixer' but I think that mixer still requires both streams to be alive. Any feedback is greatly appreciated!
I present a simple function for pause/resume by swapping bins. The following example provides the logic to change the destination bin on the fly, dynamically; this does not completely stop the pipeline, which I believe is what you're after. Similar logic could be used for source bins: you could remove your network source bin and the related decoder/demux bins and add videotestsrc bins instead.
private static void dynamic_bin_replacement(Pipeline pipe, Element src_bin, Element dst_bin_new, Element dst_bin_old) {
    pipe.pause();                       // drop out of PLAYING before relinking
    src_bin.unlink(dst_bin_old);        // detach and discard the old destination bin
    pipe.remove(dst_bin_old);
    pipe.add(dst_bin_new);              // insert the replacement bin
    dst_bin_new.syncStateWithParent();  // match the new bin's state to the pipeline's
    src_bin.link(dst_bin_new);
    pipe.ready();
    pipe.play();                        // resume playback
}
The other approach you may want to try is pad blocking ("pad-locking"). Please take a look at the following posts:
http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-block.txt
and
http://web.archiveorange.com/archive/v/8yxpz7FmOlGqxVYtkPb4
and
Adding and removing audio sources to/from GStreamer pipeline on-the-go
UPDATE
Try the output-selector and input-selector elements, as they seem to be a better alternative. I found them the most reliable and have had immense luck with them. I use fakesink or fakesrc, respectively, as the other end of the selector.
The valve element is another alternative, and I found it doesn't even need fakesink or fakesrc. It is also extremely reliable.
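A rough sketch of the input-selector idea in C (untested; pad-template name as in GStreamer 1.0): videotestsrc feeds one selector pad as a permanent fallback, the network/decode branch feeds the other, and you flip the active-pad property at runtime:
GstElement *sel      = gst_element_factory_make ("input-selector", "videosel");
GstElement *fallback = gst_element_factory_make ("videotestsrc", "fallback");
gst_bin_add_many (GST_BIN (pipeline), sel, fallback, NULL);

GstPad *fallback_pad = gst_element_get_request_pad (sel, "sink_%u");
GstPad *network_pad  = gst_element_get_request_pad (sel, "sink_%u");
gst_pad_link (gst_element_get_static_pad (fallback, "src"), fallback_pad);
/* ...link the rtpbin/depayloader/decoder branch to network_pad and sel's src pad onwards... */

/* Start on the fallback, then switch when RTP video shows up: */
g_object_set (sel, "active-pad", network_pad, NULL);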
Also, the correct state-transition order for a media file source is:
NULL -> READY -> PAUSED -> PLAYING (Upwards)
PLAYING -> PAUSED -> READY -> NULL (Downwards)
The order in my example above should be corrected so that ready() comes before pause(). I would also tend to think the un-linking should be performed after the null() state and not after pause(). I haven't tried these changes, but theoretically they should work.
See the following link for detailed info
http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-states.txt?h=BRANCH-RELEASE-0_10_19