How to set a GstPlayer pipeline? - c++

I have constructed a custom GStreamer pipeline that I will use to play RTSP streams. At the same time, I'd like to create a new GstPlayer that uses this pipeline. The problem is that there isn't a way that I can see to set a GstPlayer's pipeline (the only related method is gst_player_get_pipeline()). I don't understand how there is no way to customize a pipeline for a GstPlayer. This seems like basic functionality, so I must be missing something.
My pipeline:
GstElement *pipeline, *source, *filter, *sink;

// Create pipeline elements
pipeline = gst_pipeline_new ("vdi-pipeline");
source = gst_element_factory_make ("rtspsrc", "vdi-source");
filter = gst_element_factory_make ("decodebin", "vdi-filter");
sink = gst_element_factory_make ("appsink", "vdi-sink");
if (!source || !filter || !sink)
{
    __android_log_print (ANDROID_LOG_ERROR, "Error", "A GstElement could not be created. Exiting.");
    return;
}

// Add elements to pipeline
gst_bin_add_many (GST_BIN (pipeline), source, filter, sink, NULL);

// Link elements together
if (!gst_element_link_many (source, filter, sink, NULL)) {
    __android_log_print (ANDROID_LOG_ERROR, "Warning", "Failed to link elements!");
}

But you can play RTSP via GstPlayer out of the box.. why do you want a custom pipeline?
The player uses playbin, which accepts any kind of URI.. it will create the pipeline dynamically according to what is being played..
What about patching the player itself, if you really cannot use playbin? I don't think it is intended for custom pipelines.. but you can hack it here.
You would then have to hook the pad-added and other callbacks on the rtspsrc instead of playbin.. and other stuff - I guess you do not want this.
The other way is - when playbin constructs the pipeline it uses rtspsrc inside - you can get this element from the pipeline object and change some of its parameters.. but be careful, as changing parameters during playback is very tricky..
UPDATE:
Hm, I think I overlooked the appsink somehow.. well, I think you can set the playbin property audio-sink or video-sink to override it to use appsink.
But you will still have to somehow get the playbin element out of GstPlayer, or set the playbin parameter upon initialization (I don't know how) - in this case I would ask on IRC (freenode, #gstreamer) whether you are going in the right direction.
Maybe a better way would be to create your own application using decodebin or even playbin and pass it the appsink element.. why do you want to use GstPlayer if you are not playing but processing buffers?
HTH

Related

How to play avi video using GStreamer

I am trying to play my first video in GStreamer, using GstElement, without pre-configured things like gst_parse_launch etc.
I don't understand why my pipeline can't be linked and why I get the error "unable to set the pipeline to playing state".
How can I fix it? What is missing?
#include <iostream>
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    GstElement *pipeline;
    GstElement *source, *sink;

    gst_init(&argc, &argv); //! Initialize GStreamer

    pipeline = gst_pipeline_new("my-pipeline"); //! Creating pipeline

    source = gst_element_factory_make("filesrc", "file-source"); //! Creating source
    g_object_set(G_OBJECT(source), "location", "file:///D:/workspace/rocket.mp4", NULL);

    sink = gst_element_factory_make("autovideosink", "sink"); //! Creating sink
    if (sink == NULL)
    {
        g_error("Could not create neither 'autovideosink' element");
    }

    gst_bin_add_many(GST_BIN(pipeline), source, sink, NULL); //! Adding elements to pipeline container

    if (!gst_element_link_many(source, sink, NULL)) //! Linking all elements together
    {
        g_warning("Unable to link elements!");
    }

    auto ret = gst_element_set_state(pipeline, GST_STATE_PLAYING); //! Turning pipeline in PLAYING STATE
    if (ret == GST_STATE_CHANGE_FAILURE)
    {
        g_printerr("unable to set the pipeline to playing state");
        gst_object_unref(pipeline);
        return -1;
    }
    return 0;
}
Thanks in advance!
Try to change "file:///D:/workspace/rocket.mp4" to "D:/workspace/rocket.mp4".
And also maybe you need to change gst_element_link_many(source, sink, NULL) to gst_element_link(source, sink).
Here is an answer
First of all, here is a GitHub link to my solution, with comments.
Explanation:
If you want to play a video file without predefined pipelines like playbin, gst_parse_launch etc., you have two options:
Dynamically link uridecodebin with two pipeline branches (one for audio and a second for video).
Dynamically link using avidemux and build everything at the 'atomic' level. This is very complicated, and there is no working help on the net. Maybe I will update this branch later with a working solution.
In my GitHub link, you will find the first solution, with two branches and uridecodebin.

Is it possible to set initial index of splitmuxsink?

I have set up GStreamer with a few pipes (with the help of RidgeRun GStd & gst-interpipe).
The first pipe takes snapshots with multifilesink with max-files, and its starting index can be set with index=${start_index}.
The second pipe records with splitmuxsink with max-files & max-size-time.
GStreamer 1.10.4
gstd v.0.7.0
multifilesink name=snapshot_sink index=${start_index} max-files=20 location=pic_%04d.jpg
splitmuxsink name=rec_file_sink location=rec_%03d.mpg max-size-time=60000000000 send-keyframe-requests=true max-files=5 muxer=mpegtsmux
The problem is that if I restart GStreamer (respectively gstd), the indexes are reset.
If I start recording in the second pipe, the index begins from 000.
I can set the starting index in the multifilesink pipe, but I couldn't find the same for splitmuxsink.
Any ideas?
How about the start-index property
https://gstreamer.freedesktop.org/documentation/multifile/splitmuxsink.html?gi-language=c#splitmuxsink:start-index
I just ran into this issue myself and I am afraid there is no way to do that using command line parameters only.
However, for those who are not afraid of diving into the API and creating a GStreamer application, it is achievable using the 'format-location' signal (see the splitmuxsink documentation).
In C/C++, you may define the signal handler as follows:
static gchar* cb_FormatLocation(GstElement* splitmux, guint fragment_id, const int* offset)
{
    char* location;
    g_object_get(splitmux, "location", &location, nullptr);
    gchar* fileName = g_strdup_printf(location, fragment_id + *offset);
    g_free(location);
    return fileName;
}
and, in the pipeline definition, all you need to do is to compute an offset and pass it to g_signal_connect:
#include <filesystem>
...
GstElement* sink = gst_element_factory_make("splitmuxsink", "sink");
...
std::filesystem::path fileTemplate = "/path/to/folder/%04d.mp4";
int offset = 0;
for (;;) {
    gchar* candidate = g_strdup_printf(fileTemplate.c_str(), offset);
    bool exists = std::filesystem::exists(candidate);
    g_free(candidate); // g_strdup_printf allocates; free each candidate to avoid leaking one string per iteration
    if (!exists) break;
    ++offset;
}
g_object_set(sink, "location", fileTemplate.c_str(), nullptr);
g_signal_connect (sink, "format-location", G_CALLBACK(cb_FormatLocation), &offset);
Side note: make sure the offset variable is not destroyed before the application terminates.
It should be possible to achieve the same behaviour with the Python API.

Data Transfer through RTSP in Gstreamer

UPDATE:
I want to stream video data (H264) through RTSP in Gstreamer.
gst_rtsp_media_factory_set_launch (factory, "videotestsrc ! x264enc ! rtph264pay name=pay0 pt=96 ");
I want the "videotestsrc ! x264enc ! rtph264pay name=pay0 pt=96" pipeline to be built in C code instead of passed as a launch string.
Actually, I have a custom pipeline, and I want to pass this pipeline to GstRTSPMediaFactory.
With set_launch I am not able to pass my pipeline:
source = gst_element_factory_make("videotestsrc", "test-source");
parse = gst_element_factory_make("x264enc", "parse");
sink = gst_element_factory_make("rtph264pay", "sink");
gst_bin_add_many(GST_BIN(pipeline), source, parse, sink, NULL);
gst_element_link_many(source, parse, sink, NULL);
Now I want to stream this pipeline using RTSP. I can stream with gst_rtsp_media_factory_set_launch, but I want to pass only the pipeline variable and have it stream the video.
Is that possible, and if so, how?
I modified rtsp-media-factory.c as follows:
Added GstElement *pipeline to struct _GstRTSPMediaFactoryPrivate.
Then added two more functions, get_pipeline & set_pipeline:
void
gst_rtsp_media_factory_set_launch_pipeline (GstRTSPMediaFactory * factory, GstElement * pipeline)
{
    GstRTSPMediaFactoryPrivate *priv;

    g_print ("PRASANTH :: SET LAUNCH PIPELINE\n");
    g_return_if_fail (GST_IS_RTSP_MEDIA_FACTORY (factory));
    g_return_if_fail (pipeline != NULL);

    priv = factory->priv;

    GST_RTSP_MEDIA_FACTORY_LOCK (factory);
    // g_free (priv->launch);
    priv->pipeline = pipeline;
    Bin = priv->pipeline;
    GST_RTSP_MEDIA_FACTORY_UNLOCK (factory);
}
The getter works the same way.
Finally, in place of gst_parse_launch in the function default_create_element, I added this line:
element = priv->pipeline; // priv is of type GstRTSPMediaFactoryPrivate
return element;
But I am not able to receive the data.
When I use pay0 as the name for rtpmp2tpay it works, but only once. If the client stops and starts again, it no longer works; to make it work again I have to restart the server.
What is the problem?
** (rtsp_server:4292): CRITICAL **: gst_rtsp_media_new: assertion 'GST_IS_ELEMENT (element)' failed
To have some answer here: this solves the main problem according to the comments discussion, but there is still a problem with requesting another stream (when the client stops and starts again).
The solution was to add proper name for payloader element as stated in docs:
The pipeline description should contain elements named payN, one for each
stream (ex. pay0, pay1, ...). Also, for increased compatibility each stream
should have a different payload type which can be configured on the payloader.
So this has to be changed to:
sink = gst_element_factory_make("rtph264pay", "pay0");
Notice the change in the name of the element from sink -> pay0.
For the stopping-client issue, I would check whether this works with the parse (set_launch) version.
If yes, then check whether the parse pipeline string (in the original source code of the RTSP server) is saved anywhere and reused after restart.. you need to debug this.

How to check type of new added pad?

My pipeline scheme(dynamic link):
videotestsrc OR audiotestsrc ! decodebin ! queue ! autovideosink OR autoaudiosink
I am trying to use this advice to check which type of data I got (video/audio), but if I use decodebin as a demuxer, I just get "src_0" instead of "audio" or "video". How can I check the pad type in order to link the right element for playback? Maybe I can use one universal element for both audio and video playback, like playsink (but it does not work for video)?
You can get the caps of the newly added pad and check if it contains audio or video caps (or something else).
Try with:
gst_pad_get_current_caps (pad);
or:
gst_pad_get_allowed_caps (pad);
If you are using GStreamer 0.10 (which is 3+ years obsolete and unmaintained), you have:
gst_pad_get_caps_reffed (pad);
Then just check the returned caps to see whether they are audio or video, by getting the structure from the caps and checking whether its name starts with video or audio:
GstStructure *structure;
const gchar *name;

/* There might be multiple structures depending on how you do it,
 * but usually checking the first one is enough */
structure = gst_caps_get_structure (caps, 0);
name = gst_structure_get_name (structure);
if (g_str_has_prefix (name, "video/")) {
...
} else if (g_str_has_prefix (name, "audio/")) {
...
}

dynamically replacing elements in a playing gstreamer pipeline

I'm looking for the correct technique, if one exists, for dynamically replacing an element in a running gstreamer pipeline. I have a gstreamer based c++ app and the pipeline it creates looks like this (using gst-launch syntax) :
souphttpsrc location="http://localhost/local.ts" ! mpegtsdemux name=d ! queue ! mpeg2dec ! xvimagesink d. ! queue ! a52dec ! pulsesink
During the middle of playback (i.e. GST_STATE_PLAYING is the pipeline state and the user is happily watching video), I need to remove souphttpsrc from the pipeline and create a new souphttpsrc, or even a new neonhttpsource, and then immediately add that back into the pipeline and continue playback of the same uri source stream at the same time position where playback was before we performed this operation. The user might see a small delay and that is fine.
We've barely figured out how to remove and replace the source, and we need more understanding. Here's our best attempt thus far:
gst_element_unlink(source, demuxer);
gst_element_set_state(source, GST_STATE_NULL);
gst_bin_remove(GST_BIN(pipeline), source);
source = gst_element_factory_make("souphttpsrc", "src");
g_object_set(G_OBJECT(source), "location", url, NULL);
gst_bin_add(GST_BIN(pipeline), source);
gst_element_link(source, demuxer);
gst_element_sync_state_with_parent(source);
This doesn't work perfectly because the source is playing back from the beginning and the rest of the pipeline is waiting for the correct timestamped buffers (I assume) because after several seconds, playback picks back up. I tried seeking the source in multiple ways but nothing has worked.
I need to know the correct way to do this. It would be nice to know a general technique, if one exists, as well, in case we wanted to dynamically replace the decoder or some other element.
thanks
I think this may be what you are looking for:
http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-block.txt
(starting at line 115)