How to detect that one source of a funnel has ended in a GStreamer pipeline?

I am using GStreamer to build a pipeline with two sources. One is a file source (filesrc), the other is an appsrc. When the filesrc reaches EOS, the pipeline does not quit: the appsrc keeps receiving need-data signals and never stops by itself. It seems the funnel element waits for all of its sources to end before sending EOS downstream. Is there a way to get notified when the filesrc reaches end of file?
The gst-launch command looks like this:
gst-launch-1.0 funnel name=f \
appsrc name=appsrc-h264-sei do-timestamp=true block=true is-live=true ! video/x-h264, stream-format=byte-stream, alignment=au ! queue ! f. \
filesrc location=input.h264 ! queue ! f. \
f. ! queue ! h264parse ! video/x-h264, stream-format=byte-stream, alignment=au ! mp4mux ! filesink location=file.mp4

You can install a GstPadProbe on the funnel input pad and check for the EOS event in the callback.
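A minimal C sketch of that idea (untested; it assumes the funnel is named f as in the pipeline above and that the filesrc branch is linked to the funnel pad sink_1, which depends on link order):

#include <gst/gst.h>

/* Called for every downstream event on the probed funnel sink pad. */
static GstPadProbeReturn
on_funnel_sink_event (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstEvent *event = GST_PAD_PROBE_INFO_EVENT (info);

  if (GST_EVENT_TYPE (event) == GST_EVENT_EOS) {
    g_print ("EOS reached funnel pad %s: the file branch has ended\n",
        GST_PAD_NAME (pad));
    /* Notify your application here, e.g. so it can call
     * gst_app_src_end_of_stream() on the appsrc to finish the pipeline. */
  }
  return GST_PAD_PROBE_OK;
}

/* Attach the probe; "sink_1" is an assumption, use whichever funnel
 * sink pad the filesrc branch was actually linked to. */
static void
watch_file_branch_eos (GstElement *pipeline)
{
  GstElement *funnel = gst_bin_get_by_name (GST_BIN (pipeline), "f");
  GstPad *sinkpad = gst_element_get_static_pad (funnel, "sink_1");

  gst_pad_add_probe (sinkpad, GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM,
      on_funnel_sink_event, NULL, NULL);

  gst_object_unref (sinkpad);
  gst_object_unref (funnel);
}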

Related

Demux video and KLV data from MPEG-TS stream, in sync

I need to demux the video frames and KLV data from an MPEG-TS stream in sync, frame-by-frame.
The following command demuxes the KLV data and outputs a text file containing it:
gst-launch-1.0 filesrc location="some_file.ts" ! tsdemux name=demux \
demux. ! queue ! meta/x-klv ! filesink location="some_file-KLV.txt"
The following command demuxes the video and outputs a video file:
gst-launch-1.0 filesrc location="some_file.ts" ! tsdemux name=demux \
demux. ! queue ! decodebin ! videorate ! videoscale ! x264enc ! mp4mux ! filesink location="some_file-video.mp4"
Combining the above two:
gst-launch-1.0 filesrc location="some_file.ts" ! tsdemux name=demux \
demux. ! queue ! decodebin ! videorate ! videoscale ! x264enc ! mp4mux ! filesink location="some_file-video.mp4" \
demux. ! queue ! meta/x-klv ! filesink location="some_file.txt"
The command doesn't work. It just gets stuck after the following message on the terminal:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
and the text and video files are 0 bytes in size.
An example .ts file can be found here (this file was not created or uploaded by me; it is part of the data for some code on GitHub: https://gist.github.com/All4Gis/509fbe06ce53a0885744d16595811e6f): https://drive.google.com/drive/folders/1AIbCGTqjk8NgA4R818pGSvU1UCcm-lib?usp=sharing
Thank you for helping! Cheers. :)
Edit:
I realised that there can be some confusion. The files in the link above were just used to create the .ts file.
The .ts file I am using, is available directly in either of the links below:
https://drive.google.com/drive/folders/1t-u8rnEE2MftWQkS1q3UB-J3ogXBr3p9?usp=sharing
https://easyupload.io/xufeny
It seems that if we use GStreamer's multiqueue element instead of queue, the files are created.
I tried this based on a suggestion from a commenter on another website where I had posted the question.
But the KLV data and frames are still not in sync; that is what I am trying to solve now.
Regards.

gstreamer: Demux & Remux MKV, preserving video

I am trying to re-encode the audio part of an MKV file that contains some video/x-h264 and some audio/x-raw. I can't manage to just demux the MKV and remux it. Even this simple pipeline:
gst-launch-1.0 filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_00 ! mux.video_00 \
demux.audio_00 ! mux.audio_00
fails miserably with:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
WARNING: from element /GstPipeline:pipeline0/GstMatroskaDemux:demux: Delayed linking failed.
Additional debug info:
../gstreamer/gst/parse/grammar.y(506): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstMatroskaDemux:demux:
failed delayed linking pad video_00 of GstMatroskaDemux named demux to pad video_00 of GstMatroskaMux named mux
WARNING: from element /GstPipeline:pipeline0/GstMatroskaDemux:demux: Delayed linking failed.
Additional debug info:
../gstreamer/gst/parse/grammar.y(506): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstMatroskaDemux:demux:
failed delayed linking pad audio_00 of GstMatroskaDemux named demux to pad audio_00 of GstMatroskaMux named mux
ERROR: from element /GstPipeline:pipeline0/GstMatroskaDemux:demux: Internal data stream error.
Additional debug info:
../gst-plugins-good/gst/matroska/matroska-demux.c(5715): gst_matroska_demux_loop (): /GstPipeline:pipeline0/GstMatroskaDemux:demux:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
My best attempt at the transcoding mentioned above goes:
gst-launch-1.0 -v filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_00 ! queue ! 'video/x-h264' ! h264parse ! mux. \
demux.audio_00 ! queue ! rawaudioparse ! audioconvert ! audioresample ! avenc_aac ! mux.
with the same result. Removing the pad name audio_00 leaves gst-launch stuck at PREROLLING.
I have seen a few people facing similar problems:
http://gstreamer-devel.966125.n4.nabble.com/Putting-h264-file-inside-a-container-td4668158.html
http://gstreamer-devel.966125.n4.nabble.com/Changing-the-container-format-td3576914.html
As therein, keeping only video or only audio works.
I think the rawaudioparse should not be there. I tried your pipeline and had trouble with it too. I just put together something the way I would have done it, and it seemed to work:
filesrc location=test.mkv ! matroskademux \
matroskademux0. ! queue ! audioconvert ! avenc_aac ! matroskamux ! filesink location=test2.mkv \
matroskademux0. ! queue ! h264parse ! matroskamux0.
Audio in my case was:
Stream #0:0(eng): Audio: pcm_f32le, 44100 Hz, 2 channels, flt, 2822 kb/s (default)
Another format may require additional transformations.
The problem is that the pads video_00 and audio_00 have been renamed video_0 and audio_0. This can be seen with gst-inspect-1.0 matroskademux, which shows that the pad template now reads video_%u. Note that some of GStreamer's documentation pages have not been updated to reflect this.
The first command, MKV to MKV should read:
gst-launch-1.0 filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_0 ! queue ! mux.video_0 \
demux.audio_0 ! queue ! mux.audio_0
(Note the added queues)
The second command, MKV to MKV reencoding audio should read:
gst-launch-1.0 -v filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_0 ! queue ! 'video/x-h264' ! h264parse ! mux. \
demux.audio_0 ! queue ! rawaudioparse ! audioconvert ! audioresample ! avenc_aac ! mux.
The same result could have been achieved by not specifying the pad names and using caps filters where needed.
Thanks go to user Florian Zwoch for providing a working pipeline.
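For completeness: if you build the pipeline in code instead of with gst-launch-1.0, the usual way to avoid hard-coding pad names is to link in a pad-added callback. A rough sketch (untested; it omits the per-stream queues you would still want in a real remux, and uses gst_element_request_pad_simple, which is gst_element_get_request_pad on GStreamer older than 1.20):

#include <gst/gst.h>

/* Link each demuxer pad to a matching matroskamux request pad as soon as
 * matroskademux exposes it. */
static void
on_pad_added (GstElement *demux, GstPad *pad, gpointer user_data)
{
  GstElement *mux = GST_ELEMENT (user_data);
  GstCaps *caps = gst_pad_get_current_caps (pad);
  const gchar *media;
  const gchar *tmpl = NULL;

  if (caps == NULL)
    caps = gst_pad_query_caps (pad, NULL);
  media = gst_structure_get_name (gst_caps_get_structure (caps, 0));

  if (g_str_has_prefix (media, "video/"))
    tmpl = "video_%u";
  else if (g_str_has_prefix (media, "audio/"))
    tmpl = "audio_%u";

  if (tmpl != NULL) {
    GstPad *sinkpad = gst_element_request_pad_simple (mux, tmpl);
    if (gst_pad_link (pad, sinkpad) != GST_PAD_LINK_OK)
      g_printerr ("could not link %s pad\n", media);
    gst_object_unref (sinkpad);
  }
  gst_caps_unref (caps);
}

/* After creating the elements:
 *   g_signal_connect (demux, "pad-added", G_CALLBACK (on_pad_added), mux);
 */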

Save H264 encoded stream without re-encoding

I have a GStreamer pipeline that streams using:
v4l2src ! x264enc ! rtph264pay pt=96 ! udpsink host=ip port=8554
And this is the pipeline that receives the stream:
/ queue ! avdec_h264 ! appsink
udpsrc ! capsfilter ! rtpjitterbuffer ! rtph264depay ! tee !
\ queue ! h264parse ! mp4mux ! filesink
The simplified receiver pipeline without the tee is:
gst-launch-1.0 udpsrc port=8080 caps="lots-of-caps" ! rtpjitterbuffer ! rtph264depay ! h264parse ! mp4mux ! filesink location=/home/rish/Desktop/recorded.264 -e
Question:
Is there a way to save the H264 encoded stream received from udpsrc without having to re-encode it? How do I correctly close the filesink?
What I've tried so far: the discussion in this thread suggests the pipeline I've tried above, but the file is still corrupt (not correctly closed).
This question asks something similar. However, I do not want to decode and re-encode. Another answer in the thread suggests using the matroskamux element instead of mp4mux. This works, but I'd prefer to use mp4mux (no particular reason, but I'd like to know why matroskamux works and mp4mux doesn't).
Your pipeline is already muxing without re-encoding; there is no encoder in it. h264parse is just a parser.
You've already got an answer on how to close the stream here: Sending EoS to filesink while removing branch from tee
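As for why matroskamux works while mp4mux doesn't: mp4mux only writes its moov header when it receives EOS, while Matroska files remain readable even when cut off. With gst-launch-1.0 the -e flag takes care of delivering that EOS on Ctrl-C; if you drive the receiver from code, a minimal shutdown sketch (pipeline handle assumed) could look like this:

#include <gst/gst.h>

/* Stop recording cleanly so mp4mux gets a chance to write its headers:
 * push EOS in at the source, wait until it has reached the sinks, then
 * tear the pipeline down. */
static void
stop_pipeline (GstElement *pipeline)
{
  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg;

  /* Sends EOS to the source elements (udpsrc here); it then travels
   * through rtph264depay, h264parse, mp4mux and filesink. */
  gst_element_send_event (pipeline, gst_event_new_eos ());

  /* Block until EOS (or an error) is posted on the bus. */
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_EOS | GST_MESSAGE_ERROR);
  if (msg != NULL)
    gst_message_unref (msg);
  gst_object_unref (bus);

  gst_element_set_state (pipeline, GST_STATE_NULL);
}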

record camera stream from gstreamer

I have a gstreamer pipeline which works perfectly and takes a camera stream, encodes it as H.264 video, saves it to a file AND displays it on the screen as follows:
gst-launch-1.0 -v autovideosrc ! tee name = t ! queue ! omxh264enc !
'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! qtmux !
filesink location=test.mp4 t. ! queue ! videoscale ! video/x-raw,
width=480,height=270 ! xvimagesink -e sync=false
Now, I am trying to do something even simpler and just record the stream to a file (without displaying it on screen), and this does not seem to work! It writes a file, but I cannot play it back. What I have tried so far is:
gst-launch-1.0 -v autovideosrc ! queue ! omxh264enc ! 'video/x-h264,
stream-format=(string)byte-stream' ! h264parse ! qtmux ! filesink
location=test.mp4 sync=false
I also tried removing the queue element, with the same result:
gst-launch-1.0 -v autovideosrc ! omxh264enc ! 'video/x-h264,
stream-format=(string)byte-stream' ! h264parse ! qtmux ! filesink
location=test.mp4 sync=false
It does not give any errors, but it seems it just does not write a valid stream to my filesink.
How do you stop the stream? Will the camera correctly inject an EOS signal? If not, and you just press Ctrl-C to stop the operation, the .mp4 file will be missing important headers which are required for proper playback.
Add -e to your command line. In that case, when you press Ctrl-C the pipeline is not simply stopped but properly shut down by sending an EOS event through the pipeline.

How to remove a branch of tee in an active GStreamer pipeline?

Hi everyone,
The version of GStreamer I use is 1.x. I've spent a lot of time searching for a way to delete a tee branch.
In an active pipeline, a recording bin is created as below and inserted into this pipeline by branching the tee element.
"queue ! video/x-h264, width=800, height=600, framerate=10/1, stream-format=(string)byte-stream ! h264parse ! mp4mux ! filesink location=/xxxx"
It works perfectly, except that I want to dynamically remove the recording bin and get a playable mp4 file. According to some discussions and tutorials, to get a correct mp4 file we need to handle EOS properly. After trying some methods, I always end up with broken mp4 files.
Does anyone have sample code written in C to show me? I'd appreciate your help.
Your best bet for cases like this may be to create two processes. The first process would run the video, and one branch of its tee would deliver the H.264 data to the second process through whatever means.
Here are two pipelines demonstrating the concept using UDP sockets.
gst-launch-1.0 videotestsrc ! x264enc ! tee name=t ! h264parse ! avdec_h264 ! videoconvert ! ximagesink t. ! queue ! h264parse ! rtph264pay ! udpsink host=localhost port=8888
gst-launch-1.0 udpsrc port=8888 num-buffers=300 ! application/x-rtp,media=video,encoding-name=H264 ! rtph264depay ! h264parse ! mp4mux ! filesink location=/tmp/264.mp4
The trick to getting that clean mp4 is to make sure an EOS event is delivered reliably.
Instead of dynamically adding the recording branch, you can have it in the pipeline by default and add a probe callback on the src pad of its queue. In the probe callback you decide whether to pass each buffer or not (GST_PAD_PROBE_DROP drops the buffer, GST_PAD_PROBE_OK passes it on to the next element), so when you get an event to start/stop recording you just return the appropriate value. Instead of filesink you can use multifilesink, so that a new file is written every time you start/stop.
Note that the queue which drops the buffers needs to be before the mux element, otherwise the file would be corrupt.
Hope that helps!
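A rough sketch of the probe described above (untested; rec_queue stands for the queue in front of the muxer, whatever you named it):

#include <gst/gst.h>

static volatile gint recording = 0;  /* toggled by your start/stop logic */

/* Installed on the src pad of the queue that feeds the muxer: buffers are
 * passed on while recording and dropped otherwise. */
static GstPadProbeReturn
recording_valve (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  return g_atomic_int_get (&recording) ? GST_PAD_PROBE_OK
                                       : GST_PAD_PROBE_DROP;
}

/* After building the pipeline:
 *   GstPad *srcpad = gst_element_get_static_pad (rec_queue, "src");
 *   gst_pad_add_probe (srcpad, GST_PAD_PROBE_TYPE_BUFFER,
 *       recording_valve, NULL, NULL);
 *   gst_object_unref (srcpad);
 */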
Finally, I came up with a solution.
Let's say that there is an active pipeline including a recording bin.
"udpsrc port=4444 caps=\"application/x-rtp, media=(string)video,
clock-rate=(int)90000, encoding-name=(string)H264 ! rtph264depay !
tee name=tp tp. ! queue ! video/x-h264, width=800, height=600,
framerate=10/1 ! decodebin ! videoconvert ! video/x-raw, format=RGBA !
autovideosink"
recording bin:
"queue ! video/x-h264, width=800, height=600, framerate=10/1,
stream-format=(string)byte-stream ! h264parse ! mp4mux ! filesink
location=/xxxx"
After a period of time, we want to stop recording and save the result as an mp4 file, while the video keeps streaming.
First, I use a blocking probe to block the src pad of the tee. In the blocking probe callback, I install an event probe to catch EOS on the sink pad of the filesink, and then busy-wait.
* if EOS is caught in the event probe callback:
self->isGotEOS = YES;
* busy waiting in the blocking probe callback:
while (self->isGotEOS == NO) {
    usleep(100000);
}
Before entering the busy-waiting while loop, an EOS event is created and sent to the sink pad of the recording bin.
After the busy waiting is done:
usleep(200000);
[self destory_record_elements];
I think the usleep(200000) is a trick. Without it, the result is usually a non-playable mp4 file. It would seem that 200 ms is long enough for the EOS to be handled.
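In plain C, the approach described above could be sketched roughly like this (untested; "filesink0" is the auto-assigned name of the filesink in the recording bin and "recq" stands for its first queue, both assumptions):

#include <gst/gst.h>

static volatile gint got_eos = 0;

/* Fires when the EOS we push below reaches the filesink of the recording
 * bin, i.e. mp4mux has finished writing the file. */
static GstPadProbeReturn
on_filesink_event (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  if (GST_EVENT_TYPE (GST_PAD_PROBE_INFO_EVENT (info)) == GST_EVENT_EOS)
    g_atomic_int_set (&got_eos, 1);
  return GST_PAD_PROBE_OK;
}

/* Blocking probe installed on the tee src pad that feeds the recording
 * bin, e.g. with:
 *   gst_pad_add_probe (tee_rec_pad, GST_PAD_PROBE_TYPE_BLOCK_DOWNSTREAM,
 *       on_tee_blocked, pipeline, NULL);
 */
static GstPadProbeReturn
on_tee_blocked (GstPad *teepad, GstPadProbeInfo *info, gpointer user_data)
{
  GstElement *pipeline = GST_ELEMENT (user_data);
  GstElement *sink = gst_bin_get_by_name (GST_BIN (pipeline), "filesink0");
  GstElement *recq = gst_bin_get_by_name (GST_BIN (pipeline), "recq");
  GstPad *sinkpad = gst_element_get_static_pad (sink, "sink");
  GstPad *binpad = gst_element_get_static_pad (recq, "sink");

  /* Watch for EOS arriving at the filesink, then drain the bin. */
  gst_pad_add_probe (sinkpad, GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM,
      on_filesink_event, NULL, NULL);
  gst_pad_send_event (binpad, gst_event_new_eos ());

  /* The busy wait from the answer above. */
  while (!g_atomic_int_get (&got_eos))
    g_usleep (100000);

  /* As in the answer: wait a little longer, then unlink the recording
   * branch from the tee, set it to GST_STATE_NULL and remove it. */
  g_usleep (200000);

  gst_object_unref (binpad);
  gst_object_unref (sinkpad);
  gst_object_unref (recq);
  gst_object_unref (sink);
  return GST_PAD_PROBE_REMOVE;
}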
I had a similar problem previously. My pipeline:
videotestsrc do-timestamp="TRUE" ! videoflip method=0 ! tee name=t
t. ! queue ! videoconvert ! glupload ! glshader ! autovideosink async="FALSE"
t. ! queue ! identity drop-probability=1 ! videoconvert name=conv2 ! openh264enc ! h264parse ! avimux ! multifilesink async="FALSE" post-messages=true next-file=4
Then I just change the drop-probability property on the identity element:
To stop recording: set drop-probability = 1 and call gst_pad_send_event(conv2_sinkpad, gst_event_new_eos())
To resume recording: set drop-probability = 0
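Rendered literally in C, that trick might look like this (untested; "identity0" is the auto-assigned name of the unnamed identity element in the pipeline above, and "conv2" is the named videoconvert):

#include <gst/gst.h>

/* Toggle recording for the pipeline above: drop buffers at the identity
 * element and push an EOS into the encoder branch so multifilesink closes
 * the current file; resuming simply lets buffers through again. */
static void
set_recording (GstElement *pipeline, gboolean record)
{
  GstElement *identity = gst_bin_get_by_name (GST_BIN (pipeline), "identity0");

  if (record) {
    g_object_set (identity, "drop-probability", 0.0, NULL);
  } else {
    GstElement *conv2 = gst_bin_get_by_name (GST_BIN (pipeline), "conv2");
    GstPad *conv2_sinkpad = gst_element_get_static_pad (conv2, "sink");

    g_object_set (identity, "drop-probability", 1.0, NULL);
    gst_pad_send_event (conv2_sinkpad, gst_event_new_eos ());

    gst_object_unref (conv2_sinkpad);
    gst_object_unref (conv2);
  }
  gst_object_unref (identity);
}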