How to extract subtitles from a .mkv file using GStreamer?

If gst-discoverer-1.0 has verified that a .mkv file contains subtitles, how do I extract those subtitles with gst-launch-1.0? Thanks.

What kind of subtitles? You'll have to get the caps from gst-discoverer-1.0 and then do something like
gst-launch-1.0 filesrc location=/path/to/mkv ! matroskademux ! "text/x-raw" ! filesink location=subtitles
where "text/x-raw" is replaced by the caps of the subtitle stream. Alternatively you can also specify the link by pad name
gst-launch-1.0 filesrc location=/path/to/mkv ! matroskademux name=demux demux.subtitle_%u ! filesink location=subtitles
where %u should be the track number of the subtitle stream.
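For example, if gst-discoverer-1.0 reports a subtitle track, a concrete run could look like this (the file path and the subtitle_0 track number below are only placeholders; adjust them to what the discoverer reports):
# inspect the file to see the subtitle tracks and their caps
gst-discoverer-1.0 -v /path/to/mkv
# dump the first subtitle track to a file (subtitle_0 is an assumption)
gst-launch-1.0 filesrc location=/path/to/mkv ! matroskademux name=demux demux.subtitle_0 ! filesink location=subtitles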

Related

Demux video and KLV data from MPEG-TS stream, in sync

I need to demux the video frames and KLV data from an MPEG-TS stream in sync, frame-by-frame.
The following command demuxes the KLV data and outputs a text file with the KLV data.
gst-launch-1.0 filesrc location="some_file.ts" ! tsdemux name=demux \
demux. ! queue ! meta/x-klv ! filesink location="some_file-KLV.txt"
The following command demuxes the video and outputs a video file.
gst-launch-1.0 filesrc location="some_file.ts" ! tsdemux name=demux \
demux. ! queue ! decodebin ! videorate ! videoscale ! x264enc ! mp4mux ! filesink location="some_file-video.mp4"
On combining the above two:
gst-launch-1.0 filesrc location="some_file.ts" ! tsdemux name=demux \
demux. ! queue ! decodebin ! videorate ! videoscale ! x264enc ! mp4mux ! filesink location="some_file-video.mp4" \
demux. ! queue ! meta/x-klv ! filesink location="some_file.txt"
The command doesn't work. It just gets stuck after the following message on the terminal:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
and the text and video files are 0 bytes in size.
An example .ts file can be found here (I did not create or upload this file; it is part of the data for some code on GitHub: https://gist.github.com/All4Gis/509fbe06ce53a0885744d16595811e6f): https://drive.google.com/drive/folders/1AIbCGTqjk8NgA4R818pGSvU1UCcm-lib?usp=sharing
Thank you for helping! Cheers. :)
Edit:
I realised that there can be some confusion. The files in the link above were just used to create the .ts file.
The .ts file I am using, is available directly in either of the links below:
https://drive.google.com/drive/folders/1t-u8rnEE2MftWQkS1q3UB-J3ogXBr3p9?usp=sharing
https://easyupload.io/xufeny
It seems that if we use GStreamer's multiqueue element instead of queue, the files are created (a rough sketch of that variant is shown below).
I tried the above based on a suggestion from a commenter on another website I had posted the question on.
But, the KLV data and frames are still not in sync. That is what I am trying to do now.
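A rough sketch of the multiqueue variant (the pad numbering and wiring here are guesses and may need adjusting):
# -e sends EOS on Ctrl-C so that mp4mux can finalize the output file;
# the sink_0/src_0 and sink_1/src_1 pad numbers are assumptions
gst-launch-1.0 -e filesrc location="some_file.ts" ! tsdemux name=demux \
multiqueue name=mq \
demux. ! mq.sink_0 mq.src_0 ! decodebin ! videorate ! videoscale ! x264enc ! mp4mux ! filesink location="some_file-video.mp4" \
demux. ! mq.sink_1 mq.src_1 ! meta/x-klv ! filesink location="some_file-KLV.txt"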
Regards.

How can I fix the missing duration of a video file from an RTSP stream in GStreamer?

I'm using the following pipeline to listen to an RTSP stream and save a video file:
gst-launch-1.0 -q rtspsrc location=rtsp://<ip>:<port>/video ! decodebin ! autovideoconvert ! x264enc pass=5 quantizer=25 speed-preset=6 ! h264parse ! matroskamux ! filesink location=<filename>
But even though the files are generated, they show no duration when played in VLC.
I can fix it by passing the file through ffmpeg afterwards, but I want GStreamer to produce a completely valid video in the first place. How can I fix this pipeline?
gst-launch-1.0 -e rtspsrc location=rtsp://<ip>:<port>/video ! decodebin ! videoconvert ! x264enc ! mp4mux ! filesink location="filename.mp4"
This will create a video whose duration is shown correctly. The -e flag makes gst-launch-1.0 send an EOS event through the pipeline when you interrupt it, which lets mp4mux finalize the file and write the duration information.
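If you'd rather keep the Matroska container from your original pipeline, it may be enough to add -e there as well, so that matroskamux receives EOS and can write the duration when you stop the capture; a sketch (untested):
# sketch: same idea with matroskamux, relying on -e/EOS to finalize the file
gst-launch-1.0 -e rtspsrc location=rtsp://<ip>:<port>/video ! decodebin ! autovideoconvert ! x264enc pass=5 quantizer=25 speed-preset=6 ! h264parse ! matroskamux ! filesink location=<filename>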

Why is encoding required for video transfer? (gstreamer)

The video I am transferring is already encoded. Why do I need to encode it again when transferring?
example: gst-launch-1.0 -v filesrc location=123.mp4 ! decodebin ! x264enc ! rtph264pay ! udpsink host=192.168.10.186 port=9001
Can I just send the video without re-encoding and view it on the other side?
for example:
server: gst-launch-1.0 -v filesrc location=123.mp4 ! udpsink host=192.168.10.186 port=9001
123.mp4 is encoded as H.265
client: gst-launch-1.0 udpsrc port=9001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H265, payload=(int)96" ! rtph265depay ! h265parse ! nvh265dec ! autovideosink
best regards
Okay, with the clarification that your input is assumed to be an MP4 file which contains an H.265 stream: yes, then it's possible (if this assumption doesn't hold, this will not work).
The following should do the trick:
gst-launch-1.0 filesrc location=123.mp4 ! qtdemux ! h265parse config-interval=-1 ! rtph265pay ! udpsink host=192.168.10.186 port=9001
Explanation:
the qtdemux will demux the MP4 container into the contained video/audio/subtitle streams (if there's more than one stream inside that container, you'll need to link to it multiple times, or GStreamer will error out)
the h265parse config-interval=-1 will make sure that your stream contains the correct SPS/PPS parameter sets. If the stream inside your original input file is not an H.265 video stream, this will fail to link.
the rtph265pay will translate this into RTP packets.
... which the udpsink can then send over the specified socket
P.S.: you might also be interested in the rtpsink (which used to live out-of-tree, but is now included in the latest master of GStreamer)
P.P.S.: you should use an even port to send an RTP stream
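If you are not sure whether 123.mp4 really contains an H.265 stream, you can check it first:
# lists the container format and the codec of every stream in the file
gst-discoverer-1.0 -v 123.mp4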

How do I take a screenshot from a video with GStreamer

I'd like to take a screenshot of an existing video - either an mp4 or an flv - using GStreamer 1.0.
I see multifilesink, but it looks like that generates a bunch of images. I'd only like one.
You can use:
gst-launch-1.0 filesrc location=/media/datos/videos/video.mp4 ! decodebin ! videoconvert ! pngenc snapshot=true ! filesink location=frame.png
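The snapshot=true property on pngenc makes the encoder output a single frame and then post EOS, so the pipeline stops after one image. If you'd rather have a JPEG, recent GStreamer versions (1.20+, if I remember correctly) have a similar snapshot property on jpegenc, so something like this should work (untested sketch):
# same idea with JPEG output; requires a jpegenc that has the snapshot property
gst-launch-1.0 filesrc location=/media/datos/videos/video.mp4 ! decodebin ! videoconvert ! jpegenc snapshot=true ! filesink location=frame.jpg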

Adding subtitle while doing H.264 encoding into a Matroska container

I have a requirement where I need to encode a v4l2src source in H.264 while using a Matroska container. If I have a .mkv file with embedded subtitles, it is easy to extract the subtitles with
gst-launch-1.0 filesrc location=test.mkv ! matroskademux ! "text/x-raw" ! filesink location=subtitles
From what I understand (assuming I understand correctly), during the encoding process the "subtitle_%u" pad needs to be linked to a text/x-raw source using textoverlay.
gst-launch-1.0 textoverlay text="Video 1" valignment=top halignment=left font-desc="Sans, 60" ! mux. \
imxv4l2src device=/dev/video0 ! timeoverlay ! videoconvert ! queue ! vpuenc_h264 ! \
capsfilter caps="video/x-h264" ! matroskamux name=mux ! filesink location=sub.mkv
I use the above pipeline but I do not get the overlay in the .mkv video. What is the correct way to encode a subtitle/text overlay while encoding a source in H.264 in a Matroska container, and then later be able to extract it using the first pipeline?
Sanchayan.
You may try this:
gst-launch-1.0 \
filesrc location=subtitles.srt ! subparse ! kateenc category=SUB ! mux.subtitle_0 \
imxv4l2src device=/dev/video0 ! timeoverlay ! videoconvert ! queue ! vpuenc_h264 ! \
capsfilter caps="video/x-h264" ! matroskamux name=mux ! filesink location=sub.mkv
And the subtitles.srt file may look like this:
1
00:00:00,500 --> 00:00:05,000
CAM 1
2
00:00:05,500 --> 00:00:10,000
That's all folks !
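To later pull that subtitle track back out of sub.mkv, the pad-name approach from the question at the top of this page should work; a sketch (the subtitle_0 pad number is an assumption, and note the dumped data will be Kate packets rather than plain text, since kateenc was used for muxing):
# dump the first subtitle track from the muxed file
gst-launch-1.0 filesrc location=sub.mkv ! matroskademux name=demux demux.subtitle_0 ! filesink location=subtitles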