Create a gst-launch description from a coded GStreamer pipeline

Consider that I have coded a GStreamer pipeline in C. How can I generate the text description of that pipeline so that I can use it on the command line with gst-launch?

GStreamer can create dot files of pipelines. While that is not the same format as gst-launch, it gives me the information I'm looking for:
https://gstreamer.freedesktop.org/documentation/tutorials/basic/debugging-tools.html

You just need to dump the string that you pass as input to the gst_parse_launch() function. You can use exactly the same string with the gst-launch command.
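For example, if the pipeline is built from a description string, that same string can be reused with gst-launch-1.0 unchanged. A minimal sketch (the videotestsrc pipeline here is only a placeholder):

#include <gst/gst.h>

int main (int argc, char *argv[])
{
  /* The same description string works both here and with
   * gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink */
  const gchar *desc = "videotestsrc ! videoconvert ! autovideosink";
  GError *error = NULL;

  gst_init (&argc, &argv);

  GstElement *pipeline = gst_parse_launch (desc, &error);
  if (!pipeline) {
    g_printerr ("Parse error: %s\n", error->message);
    g_clear_error (&error);
    return -1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  /* ... run a main loop or wait for EOS here ... */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}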

Related

GStreamer pipeline string from pipeline

Is there a way to get a pipeline string from the pipeline? I am looking for something like the opposite of the gst_parse_launch function, which would take the pipeline as a parameter and return the string.
Thanks.
To the best of my knowledge, no. The closest I can think of is to generate a pipeline diagram in the dot format. To do so, you can place gst_debug_bin_to_dot_file at the point in your code where you want to render the pipeline graph (likely when it's already playing). Then execute your app with the GST_DEBUG_DUMP_DOT_DIR environment variable set. Something like:
GST_DEBUG_DUMP_DOT_DIR=. ./app
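For reference, a minimal sketch of the call itself (the output name "pipeline" is just a placeholder):

#include <gst/gst.h>

/* Call this once the pipeline is PLAYING; with GST_DEBUG_DUMP_DOT_DIR
 * set, it writes <dir>/pipeline.dot, which can be rendered with e.g.
 * dot -Tpng pipeline.dot -o pipeline.png */
static void
dump_pipeline_graph (GstElement *pipeline)
{
  gst_debug_bin_to_dot_file (GST_BIN (pipeline),
      GST_DEBUG_GRAPH_SHOW_ALL, "pipeline");
}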

How to extract encoded information from the VVC compression log (VTM)?

I am trying to extract encoded information like this from the VVC compression log. Is there any setting or config in the EncoderApp source for writing output values to a file? I am using VTM Encoder Version 8.2. Thanks
Your question is pretty unrelated to the topic of video compression.
But anyway... What you can do is read the console output and parse it. To do so, you can first write it into a text file by adding > log.txt at the end of your command line. Then parse the text file and write it to a CSV file.
You could save the console output as *.txt (for instance: EncoderApp.exe -c xxx.cfg > Enc.txt) and then convert Enc.txt to Enc.csv (implemented in Python).
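As a rough illustration of that approach, here is a small C sketch. The assumption that per-picture lines start with "POC", as well as the Enc.txt/Enc.csv file names, are placeholders to adapt to the actual layout of your VTM log:

#include <stdio.h>
#include <string.h>

/* Copy the per-picture lines of a VTM log (assumed here to start with
 * "POC") into a CSV file by re-joining whitespace-separated fields
 * with commas. Adapt the matching and field handling to your log. */
int main (void)
{
  FILE *in  = fopen ("Enc.txt", "r");
  FILE *out = fopen ("Enc.csv", "w");
  char line[1024];

  if (!in || !out)
    return 1;

  while (fgets (line, sizeof line, in)) {
    if (strncmp (line, "POC", 3) == 0) {
      char *tok = strtok (line, " \t\r\n");
      int first = 1;
      while (tok) {
        fprintf (out, first ? "%s" : ",%s", tok);
        first = 0;
        tok = strtok (NULL, " \t\r\n");
      }
      fputc ('\n', out);
    }
  }
  fclose (in);
  fclose (out);
  return 0;
}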

GStreamer subtitles while recording

I am new to GStreamer and am trying to encode a video stream (for now v4l2src) together with a subtitle stream, muxed into an MPEG-TS container. I am able to use 'textoverlay' to set the data, but I don't want to burn the data into the image. Instead, I want to use the subtitle stream to carry 'metadata' that is generated while the video is being recorded.
Is there a way that I can add subtitles into the MPEG TS as time passes? The content of the subtitle text is not known beforehand, for example the GPS coordinates of a moving camera.
There is the 'subtitleoverlay' plugin, but I do not fully understand it. Does it burn the text into the image like 'textoverlay', or does it add a separate stream?
I think that subtitleoverlay renders and burns the text into the video frames. Check the example pipeline: there is no magic. After the subtitle overlay there is a videoconvert, which works with video frames.
I guess you can just attach a subtitle stream to the mpegtsmux element. I hope this is possible now; there is this bug/feature request which I hope makes it possible.
I checked the capabilities of mpegtsmux and it supports:
subpicture/x-dvb
application/x-teletext
If you can somehow manage to feed subtitles in the form of subpicture/x-dvb, then later on the receiver you can use the dvbsuboverlay element to display them.
I didn't find a way to actually create such a stream from a text file (I found this question but no answer; maybe ask on IRC).
I have a feeling that teletext was capable of showing subtitles, but this may not be what you want (I don't know).
In both cases, I think that if you had a stream containing only the rendered subtitles, say subtitles.mpg, you could use that. I guess there are tools out there in the wild that you can use for that.
Hope you can use that somehow.
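If you want to verify which subtitle formats your build of mpegtsmux accepts, one option is to list its sink pad template caps from code (a rough sketch; gst-inspect-1.0 mpegtsmux prints the same information):

#include <gst/gst.h>

/* Print the sink pad template caps of mpegtsmux, which include the
 * accepted subtitle formats (e.g. subpicture/x-dvb on some builds). */
int main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GstElementFactory *factory = gst_element_factory_find ("mpegtsmux");
  if (!factory)
    return 1;

  const GList *tmpl = gst_element_factory_get_static_pad_templates (factory);
  for (; tmpl != NULL; tmpl = tmpl->next) {
    GstStaticPadTemplate *t = (GstStaticPadTemplate *) tmpl->data;
    if (t->direction == GST_PAD_SINK) {
      GstCaps *caps = gst_static_pad_template_get_caps (t);
      gchar *s = gst_caps_to_string (caps);
      g_print ("%s: %s\n", t->name_template, s);
      g_free (s);
      gst_caps_unref (caps);
    }
  }
  gst_object_unref (factory);
  return 0;
}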

How to hook a custom file parser to a GStreamer decoder?

The HTTP file and its contents are already downloaded and present in memory. I just have to pass the content to a decoder in GStreamer and play it. However, I am not able to find the connecting link between the two.
After reading the documentation, I understood that GStreamer uses souphttpsrc for downloading and parsing HTTP files. In my case, however, I have my own parser as well as my own file downloader. It takes the URL and returns the data in parts, to be used by the decoder. I am not sure how to bypass souphttpsrc and use my parser instead, nor how to link it to the decoder.
Please let me know if anyone knows how things can be done.
You can use appsrc. You can pass chunks of your data to the appsrc element as needed.
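A minimal sketch of that idea, assuming your own downloader hands you chunks of bytes (the data/size parameters and the pipeline string are placeholders):

#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

/* Push one chunk of already-downloaded data into an appsrc.
 * data/size stand in for whatever your own downloader returns. */
static void
push_chunk (GstElement *appsrc, const guint8 *data, gsize size)
{
  GstBuffer *buf = gst_buffer_new_allocate (NULL, size, NULL);
  gst_buffer_fill (buf, 0, data, size);

  /* gst_app_src_push_buffer() takes ownership of the buffer. */
  gst_app_src_push_buffer (GST_APP_SRC (appsrc), buf);
}

/* Typical wiring: appsrc ! decodebin ! ..., for example built with
 * gst_parse_launch ("appsrc name=src ! decodebin ! autovideosink", NULL)
 * and gst_bin_get_by_name() to fetch "src". Call
 * gst_app_src_end_of_stream() once the download is complete. */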

Finding the bitrate of a video file

How can we find the bitrate of a video file in C++? Can we do this with file handling?
Thanks
Install FFmpeg; it will give you all the information related to the video, e.g.
ffmpeg -i filename.flv
If you want to implement this yourself, you need to be able to read the video container format (QuickTime, ASF, AVI, Matroska, etc.) and find the bitrate in the metadata.
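If writing a container parser is more than you need, another option is to approximate the average bitrate as file size divided by duration and let a library report the duration. Below is a rough sketch using GStreamer's GstDiscoverer rather than a hand-written parser; the file path and URI are placeholders and error handling is minimal:

#include <gst/gst.h>
#include <gst/pbutils/pbutils.h>
#include <stdio.h>
#include <sys/stat.h>

/* Average bitrate = total size in bits / duration in seconds.
 * GstDiscoverer reads the container metadata to get the duration. */
int main (int argc, char *argv[])
{
  const char *path = "video.mp4";                  /* placeholder */
  const char *uri  = "file:///path/to/video.mp4";  /* placeholder */

  gst_init (&argc, &argv);

  struct stat st;
  if (stat (path, &st) != 0)
    return 1;

  GError *err = NULL;
  GstDiscoverer *disc = gst_discoverer_new (5 * GST_SECOND, &err);
  GstDiscovererInfo *info = gst_discoverer_discover_uri (disc, uri, &err);
  if (!info)
    return 1;

  GstClockTime duration = gst_discoverer_info_get_duration (info);
  if (GST_CLOCK_TIME_IS_VALID (duration) && duration > 0) {
    double seconds = (double) duration / GST_SECOND;
    printf ("average bitrate: %.0f bit/s\n", (st.st_size * 8.0) / seconds);
  }

  g_object_unref (info);
  g_object_unref (disc);
  return 0;
}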
You can use ffprobe from the FFmpeg project to get information about your video files as nice JSON output.
Check this answer for an example.