Demux video and KLV data from MPEG-TS stream, in sync - gstreamer

I need to demux the video frames and KLV data from an MPEG-TS stream in sync, frame-by-frame.
The following command demuxes the KLV data and outputs it to a text file:
gst-launch-1.0 filesrc location="some_file.ts" ! tsdemux name=demux \
demux. ! queue ! meta/x-klv ! filesink location="some_file-KLV.txt"
The following command demuxes the video and outputs it to a video file:
gst-launch-1.0 filesrc location="some_file.ts" ! tsdemux name=demux \
demux. ! queue ! decodebin ! videorate ! videoscale ! x264enc ! mp4mux ! filesink location="some_file-video.mp4"
Combining the above two:
gst-launch-1.0 filesrc location="some_file.ts" ! tsdemux name=demux \
demux. ! queue ! decodebin ! videorate ! videoscale ! x264enc ! mp4mux ! filesink location="some_file-video.mp4"
demux. ! queue ! meta/x-klv ! filesink location="some_file.txt"
The combined command doesn't work. It just gets stuck after the following messages on the terminal:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
and the text and video files are both 0 bytes in size.
An example .ts file can be found here (this file was not created or uploaded by me; it is part of the data for some code on GitHub: https://gist.github.com/All4Gis/509fbe06ce53a0885744d16595811e6f): https://drive.google.com/drive/folders/1AIbCGTqjk8NgA4R818pGSvU1UCcm-lib?usp=sharing
Thank you for helping! Cheers. :)
Edit:
I realised that there may be some confusion. The files in the link above were only used to create the .ts file.
The .ts file I am using is available directly from either of the links below:
https://drive.google.com/drive/folders/1t-u8rnEE2MftWQkS1q3UB-J3ogXBr3p9?usp=sharing
https://easyupload.io/xufeny

It seems that if we use GStreamer's multiqueue element instead of queue, the files are created; a sketch of the combined pipeline is below.
I tried this based on a suggestion from a commenter on another website where I had posted the question.
But the KLV data and frames are still not in sync. That is what I am trying to solve now.
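For reference, a minimal sketch of the combined pipeline with multiqueue. The mq.sink_N/mq.src_N request-pad wiring is my assumption, and the assignment of demuxer pads to branches may need explicit tsdemux pad names for your particular stream:
gst-launch-1.0 filesrc location="some_file.ts" ! tsdemux name=demux multiqueue name=mq \
demux. ! mq.sink_0 mq.src_0 ! decodebin ! videorate ! videoscale ! x264enc ! mp4mux ! filesink location="some_file-video.mp4" \
demux. ! mq.sink_1 mq.src_1 ! meta/x-klv ! filesink location="some_file-KLV.txt"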
Regards.

Related

gst-launch: How do I edit this pipeline so it also plays the audio?

As the title says, how can I change this so it also plays the file's audio?
gst-launch-1.0 filesrc location='/usr/share/myfile.mp4' ! qtdemux ! h264parse ! imxvpudec ! imxipuvideosink framebuffer=/dev/fb2 &
I can get the file to play with audio using:
gst-launch-1.0 -v playbin uri=file:///path/to/somefile.mp4
But I need the output to go to device fb2 as in the first example.
Many thanks
I posted a link to this question on the GStreamer reddit and a hero called Omerzet saved the day.
The following is the solution:
gst-launch-1.0 filesrc location='/usr/share/myfile.mp4' ! qtdemux name=demux \
demux.video_0 ! queue ! h264parse ! imxvpudec ! imxipuvideosink framebuffer=/dev/fb2 \
demux.audio_0 ! queue ! decodebin ! audioconvert ! audioresample ! alsasink device="sysdefault:CARD=imxhdmisoc"
Here the framebuffer property diverts the video to device /dev/fb2, and
alsasink device="sysdefault:CARD=imxhdmisoc"
diverts the audio to the sound card I specified.
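If you are unsure of the ALSA device name, the standard ALSA tooling can list the available PCM names (this assumes alsa-utils is installed; the output includes names in the sysdefault:CARD=... form used above):
aplay -L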

How can I fix the missing duration of a video file from an RTSP stream in GStreamer?

I'm using the following pipeline to listen to an RTSP stream and save a video file:
gst-launch-1.0 -q rtspsrc location=rtsp://<ip>:<port>/video ! decodebin ! autovideoconvert ! x264enc pass=5 quantizer=25 speed-preset=6 ! h264parse ! matroskamux ! filesink location=<filename>
But even though I can see the files being generated, they lack the video duration when played in VLC.
I can fix this by passing the files through ffmpeg afterwards, but I want GStreamer to generate a completely valid video in the first place. How can I fix this pipeline?
gst-launch-1.0 -e rtspsrc location=rtsp://<ip>:<port>/video ! decodebin ! videoconvert ! x264enc ! mp4mux ! filesink location="filename.mp4"
This will create a video with the duration shown correctly. The important part is the -e (--eos-on-shutdown) flag: when the pipeline is interrupted, an EOS event is sent downstream, which lets mp4mux finalize the file and write the duration metadata.
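If the recording might be interrupted without a clean shutdown, a streamable container such as Matroska keeps the file playable even when truncated (the duration header still requires a clean EOS). A sketch under that assumption, with the filename as a placeholder:
gst-launch-1.0 -e rtspsrc location=rtsp://<ip>:<port>/video ! decodebin ! videoconvert ! x264enc ! matroskamux ! filesink location="filename.mkv"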

How to send data from a file to webrtcbin element in gstreamer?

I am a beginner with GStreamer, so bear with me.
I have a working pipeline where audio and video from test sources are sent to the webrtcbin element used to send out the offer. The pipeline is as follows:
PIPELINE_DESC = '''
webrtcbin name=sendrecv stun-server=stun://stun.l.google.com:19302
audiotestsrc is-live=true wave=red-noise ! audioconvert ! audioresample ! queue ! opusenc ! rtpopuspay !
queue ! application/x-rtp,media=audio,encoding-name=OPUS,payload=96 ! sendrecv.
videotestsrc is-live=true pattern=ball ! video/x-raw,width=320,height=240 ! videoconvert ! queue ! x264enc ! rtph264pay !
queue ! application/x-rtp,media=video,encoding-name=H264,payload=97 ! sendrecv.
'''
However, doing this consumes a lot of CPU/memory as GStreamer has to encode the audio/video. Hence I want to use a pre-recorded file to lower the resource usage.
I want to use a sample file (sample.mp4) to send audio and video to the webRTCbin element. The mp4 file has H264 video and AAC audio. I have tried a lot of combinations of elements but it is not working. Could you please help me correct my pipeline?
PIPELINE_DESC = '''
webrtcbin name=sendrecv stun-server=stun://stun.l.google.com:19302
filesrc location=sample.mp4 ! decodebin ! audioconvert ! sendrecv.
filesrc location=sample.mp4 ! decodebin ! videoconvert ! sendrecv.
'''
Many thanks in advance.
An MP4 file is a container format, so it needs to be demultiplexed to get the video and audio. For that purpose, you can use GStreamer's qtdemux element.
Considering the above, an example pipeline could look like this:
PIPELINE_DESC = '''
filesrc location=test.mp4 ! qtdemux name=demux
webrtcbin name=sendrecv stun-server=stun://stun.l.google.com:19302
demux.audio_0 ! aacparse ! rtpmp4apay !
queue ! application/x-rtp,media=audio,encoding-name=MP4A-LATM,payload=96 ! sendrecv.
demux.video_0 ! h264parse ! rtph264pay !
queue ! application/x-rtp,media=video,encoding-name=H264,payload=97 ! sendrecv.
'''

GStreamer taginject pipeline doesn't work

I'm trying to construct a pipeline that reads any file (mp3, ogg, flac, etc.) and updates its tags using the taginject element, but it is not working.
Here are my attempts:
gst-launch-1.0 filesrc location=file.mp3 ! decodebin ! taginject tags="title=bla,artist=blub" ! filesink location=output_file.mp3
Result: The pipeline runs, but it creates a 50 MB file from a 4 MB file, and that large file is not playable (and probably does not contain the tags either).
gst-launch-1.0 filesrc location=file.mp3 ! taginject tags="title=test,artist=blub" ! filesink location=output_file.mp3
Result: The pipeline runs and creates a playable output file, but it contains no tags.
gst-launch-1.0 filesrc location=file.mp3 ! decodebin ! taginject tags="title=test,artist=blub" ! encodebin ! filesink location=output_file.mp3
Result: The pipeline does not run. It says taginject cannot be linked with encodebin.
I would appreciate any help on this; I just don't know what I am doing wrong (probably using the wrong elements, but I can't figure out which are the right ones).
Your first pipeline decodes the mp3 to raw audio and writes the raw samples straight to the file, which is why the output grows so large and is not playable. You need to add a muxer after taginject, e.g. something like:
gst-launch-1.0 filesrc location=file.mp3 ! parsebin ! \
taginject tags="title=bla,artist=blub" ! id3v2mux ! \
filesink location=output_file.mp3
Also, using parsebin instead of decodebin avoids decoding and re-encoding.
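To verify that the tags were actually written, you can inspect the output file with GStreamer's discovery tool, which prints the stream info including tags:
gst-discoverer-1.0 output_file.mp3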

Save H264 encoded stream without re-encoding

I have a GStreamer pipeline that streams using:
v4l2src ! x264enc ! rtph264pay pt=96 ! udpsink host=ip port=8554
And this pipeline that receives the stream:
                                                            / queue ! avdec_h264 ! appsink
udpsrc ! capsfilter ! rtpjitterbuffer ! rtph264depay ! tee !
                                                            \ queue ! h264parse ! mp4mux ! filesink
The simplified receiver pipeline without the tee is:
gst-launch-1.0 udpsrc port=8080 caps="lots-of-caps" ! rtpjitterbuffer ! rtph264depay ! h264parse ! mp4mux ! filesink location=/home/rish/Desktop/recorded.264 -e
Question:
Is there a way to save the H264 encoded stream received from udpsrc without having to re-encode it? How do I correctly close the filesink?
What I've tried so far: the discussion in this thread suggests the pipeline I've tried above, but the file is still corrupt (not closed correctly).
This question asks something similar. However, I do not want to decode and re-encode. Another answer in that thread suggests using the matroskamux element instead of mp4mux. This works, but I'd prefer to use mp4mux (no particular reason, but I'd like to know why matroskamux works and mp4mux doesn't).
Your pipeline is already muxing without re-encoding; there is no encoder in your pipeline, h264parse is just a parser.
As for why matroskamux works while mp4mux doesn't: mp4mux writes its index (the moov atom) only when it receives EOS, so a recording that is cut off without EOS is unplayable, whereas Matroska is a streamable format that remains readable even when truncated.
You've already got an answer on how to close the stream here: Sending EoS to filesink while removing branch from tee
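A possible workaround, assuming the mp4mux version in use exposes the fragment-duration property (in milliseconds): write fragmented MP4, which flushes complete fragments to disk as it goes, so the file stays playable even without a clean EOS. A sketch based on the simplified receiver pipeline above:
gst-launch-1.0 -e udpsrc port=8080 caps="lots-of-caps" ! rtpjitterbuffer ! rtph264depay ! h264parse ! mp4mux fragment-duration=1000 ! filesink location=/home/rish/Desktop/recorded.mp4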