GStreamer - Convert command line gst-launch to C code - C++

I have been making a few experiments with GStreamer by using the gst-launch utility. However, ultimately, the aim is to implement this same functionality on my own application using GStreamer libraries.
The problem is that it's quite difficult (at least for someone who is not used to the GStreamer API) to "port" what I test on the command line to C/C++ code.
An example of a command that I may need to port is:
gst-launch filesrc location="CLIP8.mp4" ! decodebin2 ! jpegenc ! multifilesink location="test%d.jpg"
What's the most straightforward way to take such a command and write it in C in my own app?
Also, as a side question: how could I replace the multifilesink and do this work in memory instead? (I'm using OpenCV to perform a few calculations on a given image that should be extracted from the video.) Is it possible to decode directly to memory and use it right away, without first saving to the filesystem? It could (and should) be sequential: I would only move on to the next frame after I'm done processing the current one, so that I wouldn't have to keep thousands of frames in memory.
What do you say?

I found the solution. There's a function built in on GStreamer that parses gst-launch arguments and returns a pipeline. The function is called gst_parse_launch and is documented here: http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/gstreamer-GstParse.html
I haven't tested it yet, but it's possibly the fastest solution for converting what I have been testing on the command line to C/C++ code.
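For reference, a minimal sketch of that approach, reusing the exact launch line from the question (which assumes GStreamer 0.10 element names such as decodebin2):

#include <gst/gst.h>

int main (int argc, char *argv[])
{
  GError *error = NULL;
  GstElement *pipeline;
  GstBus *bus;
  GstMessage *msg;

  gst_init (&argc, &argv);

  /* Same description as on the command line, minus "gst-launch" */
  pipeline = gst_parse_launch (
      "filesrc location=CLIP8.mp4 ! decodebin2 ! jpegenc ! "
      "multifilesink location=test%d.jpg", &error);
  if (pipeline == NULL) {
    g_printerr ("Parse error: %s\n", error->message);
    g_error_free (error);
    return -1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Block until EOS or an error is posted on the bus */
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg != NULL)
    gst_message_unref (msg);
  gst_object_unref (bus);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}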

You could always pop open the source of gst-launch and grab the bits that parse out the command-line and turn it into a GStreamer pipeline.
That way you can just pass in the "command line" as a string, and the function will return a complete pipeline for you.

By the way, there is an interesting GStreamer element that provides a good way to integrate a processing pipeline into your (C/C++) application: appsink
http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-libs/html/gst-plugins-base-libs-appsink.html
With this one you can basically retrieve the frames from the pipeline in a big C array and do whatever you want with them. You set up a callback function, which will be invoked every time a new frame is available from the pipeline thread...
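A rough sketch of that callback approach using the GStreamer 1.0 appsink API (the helper name setup_appsink is purely illustrative). Limiting appsink to one queued buffer also gives the sequential, one-frame-at-a-time behaviour asked about above, since the pipeline waits until the current frame has been pulled:

#include <gst/gst.h>
#include <gst/app/gstappsink.h>

/* Called from the streaming thread each time a decoded frame reaches appsink */
static GstFlowReturn
on_new_sample (GstAppSink *sink, gpointer user_data)
{
  GstSample *sample = gst_app_sink_pull_sample (sink);
  if (sample == NULL)
    return GST_FLOW_EOS;

  GstBuffer *buffer = gst_sample_get_buffer (sample);
  GstMapInfo map;
  if (gst_buffer_map (buffer, &map, GST_MAP_READ)) {
    /* map.data / map.size is the raw frame; hand it to OpenCV here */
    gst_buffer_unmap (buffer, &map);
  }
  gst_sample_unref (sample);
  return GST_FLOW_OK;
}

static void
setup_appsink (GstElement *appsink)
{
  GstAppSinkCallbacks callbacks = { NULL, NULL, on_new_sample };

  gst_app_sink_set_callbacks (GST_APP_SINK (appsink), &callbacks, NULL, NULL);

  /* Keep at most one frame queued so the pipeline waits for processing */
  g_object_set (appsink, "max-buffers", 1, "drop", FALSE, NULL);
}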

Related

Dynamically change GStreamer plugin (equalizer) parameters

Context: I have an audio device running Mopidy, which outputs to a GStreamer pipeline. My device has an interface for an equalizer - for this I've set up my ALSA config to go through the ALSA equalizer, and the GStreamer pipeline targets this. The code that handles the interface uses Python alsamixer to apply the values.
This works, but the ALSA equalizer is a bit janky and has a very narrow range before it distorts the audio. GStreamer has an equalizer plugin which I think is better; I can implement this as per the example launch line:
gst-launch-1.0 filesrc location=song.ogg ! oggdemux ! vorbisdec ! audioconvert ! equalizer-10bands band2=3.0 ! alsasink
However, I want to be able to dynamically change band0-band9 parameters while the stream is playing - either via python or from the command line. I'm not sure what direction to look - is this possible?
Properties of a plugin can be set via the g_object_set() function. Whether they can be changed on the fly or only when the pipeline is stopped depends on the plugin's implementation.
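For example, in C (assuming the pipeline was built with the equalizer given a name, e.g. "... ! equalizer-10bands name=eq ! ..."; the name "eq" is just for illustration):

GstElement *eq = gst_bin_get_by_name (GST_BIN (pipeline), "eq");

/* Band gains are doubles in dB (roughly -24.0 .. +12.0); with
 * equalizer-10bands this generally works while the pipeline is playing */
g_object_set (eq, "band2", 3.0, "band5", -6.0, NULL);

gst_object_unref (eq);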

emit (action) signal from gst-launch-1.0 pipeline

I've got an RTSP cam with backchannel support and I'm trying to get it to work with the command-line tool gst-launch-1.0. The incoming streams are not an issue, but the backchannel, when enabled, doesn't produce a sink. However, I've dug through the sources and got this little hint from the developer of the rtspsrc element:
Set backchannel=onvif to enable, and use the 'push-backchannel-sample'
action signal with the correct stream id.
I can't seem to find any info about (action) signals on the command line for gst-launch-1.0.
Does anyone know if it is even possible to send signals from gst-launch-1.0?
Thanks,
Bram
I think this is meant to be called from code and not usable from gst-launch-1.0.
Just for reference, the signal is called push-backchannel-buffer (not -sample).
Also, the gst-launch-1.0 manual page says:
Please note that gst-launch-1.0 is primarily a debugging tool. You should not build applications on top of it. For applications, use the gst_parse_launch() function of the GStreamer API as an easy way to construct pipelines from pipeline descriptions.
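If you do move this into application code, the action signal is emitted with g_signal_emit_by_name(); a very rough sketch (the exact parameter types and order should be checked against the rtspsrc documentation for your GStreamer version):

GstFlowReturn ret;

/* stream_id is the backchannel stream's id and sample a GstSample wrapping
 * the audio to send back to the camera; both are placeholders here */
g_signal_emit_by_name (rtspsrc, "push-backchannel-buffer",
    stream_id, sample, &ret);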

running gstreamer app without v4l2 driver

I would like to implement a GStreamer pipeline for video streaming without using a v4l2 driver in Linux. The thing is that I already have the video frames in RAM (the VDMA core, which is configured by a different OS on a different core, takes care of that). I also had difficulties debugging some DMA slave errors, which always appeared after a DMA completion callback.
Therefore I would be happy if I would not have to use v4l2 driver in order to have gstreamer on top.
I have found this plugin from Bosch that fits my case:
https://github.com/igel-oss/v4l-gst
My question would be whether somebody has experience with this approach and whether it is a feasible one.
Another question would be how to configure the source in the GStreamer pipeline, as it is not a device like /dev/videoxxx but rather a memory location or even a BMP file.
Thanks, Mihaita
You could use appsrc and repeatedly call gst_app_src_push_buffer(). Your application will have full freedom to read the video data from anywhere it likes - memory, files, etc. See also the relevant section of the GStreamer Application Development Manual.
If you want more flexibility, like using the video source in several applications, you should consider implementing your own custom GStreamer element.
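A minimal sketch of the appsrc approach (GStreamer 1.0 API; the push_frame helper is illustrative only):

#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

/* "appsrc" is the element obtained from the pipeline, e.g. via
 * gst_bin_get_by_name() after gst_parse_launch ("appsrc name=src ! ...");
 * its "caps" property must be set to describe the raw frames
 * (video/x-raw with format, width, height, framerate). */
static gboolean
push_frame (GstElement *appsrc, const guint8 *frame, gsize size)
{
  GstBuffer *buffer = gst_buffer_new_allocate (NULL, size, NULL);
  gst_buffer_fill (buffer, 0, frame, size);

  /* gst_app_src_push_buffer() takes ownership of the buffer */
  return gst_app_src_push_buffer (GST_APP_SRC (appsrc), buffer) == GST_FLOW_OK;
}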

How to check for bad timestamping with gstreamer

I am a beginner with GStreamer. When I create a pipeline to play a video file, I get the following message: "There may be a timestamping problem, or this computer is too slow". After some searching I found that this problem might occur if there is bad timestamping. Is there a way to figure out whether the video file has bad timestamps?
Here is the pipeline that I'm using,
gst-launch-0.10 filesrc location=.mp4 ! qtdemux ! ffdec_mpeg4 ! dri2videosink.
You can insert an identity element between e.g. ffdec_mpeg4 and dri2videosink, use its check-imperfect-timestamp and check-imperfect-offset properties, and watch the debug log, for example as shown below.
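Based on the pipeline above, that would look something like this (then run with a raised debug level for identity, e.g. GST_DEBUG=identity:5, so the imperfect-timestamp messages show up in the log):
gst-launch-0.10 filesrc location=.mp4 ! qtdemux ! ffdec_mpeg4 ! identity check-imperfect-timestamp=true check-imperfect-offset=true ! dri2videosink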
If you work on an embedded device, also watch the CPU load to see whether the pipeline indeed runs too slowly.

How to change encoding bitrate dynamically while streaming in Gstreamer?

I am developing a C program to perform adaptive streaming, but I couldn't change the "bitrate" property of the x264enc element with the g_object_set() function. How can I change this?
Thanks.
Install the git version of the GStreamer ugly plugins (gst-plugins-ugly). Then the g_object_set() function works fine.
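For example (assuming "enc" is the x264enc element obtained from your pipeline, e.g. via gst_bin_get_by_name()):

/* x264enc's bitrate property is in kbit/s; whether a new value is
 * picked up mid-stream depends on the x264enc / gst-plugins-ugly version */
g_object_set (enc, "bitrate", 1500, NULL);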