I am trying to link a tee to a queue. Here is what I do in the program code.
Program snippets
/// create the tee pad template
tee_src_pad_template = gst_element_class_get_pad_template (GST_ELEMENT_GET_CLASS (tee),
"src_%u");
if(!tee_src_pad_template)
g_print("thread Live: no tee_src_pad_template \n");
/// request a src pad from the tee
tee_pad = gst_element_request_pad (tee,
tee_src_pad_template,
NULL,
NULL);
queue_pad = gst_element_get_static_pad (queue, "sink");
/// verify the object is created
if(!tee_pad)
g_print(" no tee_pad \n");
if(!queue_pad)
g_print("no queue_pad \n");
/// link the pads together
GstPadLinkReturn ret = gst_pad_link (tee_pad, queue_pad);
g_print(" Link return %d \n", ret);
The program compiles, but the link fails at the pad-linking stage; the value returned is -4:
Link return -4
I checked the meaning of the GstPadLinkReturn values and just wonder what causes the following:
GST_PAD_LINK_NOFORMAT (-4) – pads do not have common format
And what does it mean that they do not have a common format? Aren't tee and queue format-neutral elements?
Regards
Managed to figure out the problem: I had linked the wrong element.
GST_PAD_LINK_NOFORMAT (-4) – pads do not have common format
means exactly what it says: the pads' caps are not compatible.
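For anyone who hits the same return value: a quick way to see why two pads refuse to link is to compare their caps before calling gst_pad_link. A minimal sketch, reusing the pad variables from the snippet above:
/// query the caps each pad can currently handle
GstCaps *src_caps = gst_pad_query_caps (tee_pad, NULL);
GstCaps *sink_caps = gst_pad_query_caps (queue_pad, NULL);
gchar *s1 = gst_caps_to_string (src_caps);
gchar *s2 = gst_caps_to_string (sink_caps);
g_print ("tee src caps: %s\nqueue sink caps: %s\n", s1, s2);
/// an empty intersection is what makes gst_pad_link return GST_PAD_LINK_NOFORMAT
if (!gst_caps_can_intersect (src_caps, sink_caps))
    g_print ("pads have no common format\n");
g_free (s1);
g_free (s2);
gst_caps_unref (src_caps);
gst_caps_unref (sink_caps);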
Related
I'm a beginner at using GStreamer to handle some input videos. I have already built a pipeline with GStreamer to transcode the videos, but the last part I cannot do is getting those batches of frames out of the pipeline so that I can apply some custom image processing to them.
Input Videos -----> Gstreamer Pipeline -----> Task: Apply some Image Processing Techniques
I've been searching for a solution on the Internet but cannot find one, and the more I search, the more confused I get.
appsink is the right element for this. You can enable its "emit-signals" property and listen for the "new-sample" signal; then you get access to the buffer.
Here is the full tutorial:
https://gstreamer.freedesktop.org/documentation/tutorials/basic/short-cutting-the-pipeline.html?gi-language=c
You have to create the appsink element, enable "emit-signals", and then register a "new-sample" callback like this:
g_signal_connect (data.app_sink, "new-sample", G_CALLBACK (new_sample), &data);
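The "emit-signals" part is an ordinary GObject property set, for example:
g_object_set (data.app_sink, "emit-signals", TRUE, NULL);
The callback itself then looks like this: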
static GstFlowReturn new_sample (GstElement *sink, CustomData *data) {
GstSample *sample;
/* Retrieve the buffer */
g_signal_emit_by_name (sink, "pull-sample", &sample);
if (sample) {
/* The only thing we do in this example is print a * to indicate a received buffer */
g_print ("*");
gst_sample_unref (sample);
return GST_FLOW_OK;
}
return GST_FLOW_ERROR;
}
Now, instead of the g_print, you can retrieve the buffer from the sample with gst_sample_get_buffer():
https://gstreamer.freedesktop.org/documentation/gstreamer/gstsample.html?gi-language=c
Then read the data inside the buffer:
GstMapInfo info;
gst_buffer_map (buf, &info, GST_MAP_READ);
/* info.data points to the buffer content, info.size is its length */
gst_buffer_unmap (buf, &info);
Note that the buffer returned by gst_sample_get_buffer() is owned by the sample, so do not unref it separately; unreffing the sample releases it.
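Putting the pieces together, a possible new_sample callback could look roughly like this (only a sketch; CustomData and what you do with the mapped data are up to you):
static GstFlowReturn new_sample (GstElement *sink, CustomData *data) {
  GstSample *sample = NULL;
  /* Pull the sample that triggered the signal */
  g_signal_emit_by_name (sink, "pull-sample", &sample);
  if (!sample)
    return GST_FLOW_ERROR;
  GstBuffer *buf = gst_sample_get_buffer (sample);
  GstMapInfo info;
  if (buf && gst_buffer_map (buf, &info, GST_MAP_READ)) {
    /* info.data / info.size hold the raw frame: run your image processing here */
    gst_buffer_unmap (buf, &info);
  }
  gst_sample_unref (sample);
  return GST_FLOW_OK;
}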
Best regards.
Is it possible to pass two streams through a single element? I have two streams:
Stream 1: a data stream I need to extract data from; it can be destroyed in the element or passed through to a sink.
Stream 2: a video stream that will be edited based on the data extracted from stream 1 and passed through to autovideosink.
GStreamer Core Library version 1.16.2
Written in C
chain functions:
static GstFlowReturn
gst_test2_chain (GstPad * pad, GstObject * parent, GstBuffer * buf)
{
Gsttest2 *filter;
filter = GST_TEST2 (parent);
/* just push out the incoming buffer without touching it */
return gst_pad_push (filter->srcpad, buf);
}
//second pads chain function
static GstFlowReturn
gst_test2_chain2 (GstPad * pad, GstObject * parent, GstBuffer * buf)
{
g_print("\ninside chain2\n");
Gsttest2 *filter;
filter = GST_TEST2 (parent);
return gst_pad_push (filter->srcpad2, buf);
}
//Pad templates:
static GstStaticPadTemplate src_factory = GST_STATIC_PAD_TEMPLATE ("src",
GST_PAD_SRC,
GST_PAD_ALWAYS,
GST_STATIC_CAPS ("video/x-raw")
);
Extracting the data from one stream and editing the other works fine. I am currently using two video/x-raw src and sink pads for testing, but the stream used for extracting the data would eventually be meta/x-klv. Using a single sink and src pad works fine with videotestsrc, but trying to use both sources and sinks results in pipeline errors (unable to link, or syntax errors). Does GStreamer support sending two streams through a single element? Would it be simpler to destroy the buffer of the no-longer-needed stream inside the element?
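For reference, the second pair of pads needs its own templates and registration; a rough sketch of what that could look like (the names sink2/src2 and the struct fields are assumptions, mirroring the srcpad2 already used in the chain function above):
//Additional templates for the second stream (assumed names)
static GstStaticPadTemplate sink2_factory = GST_STATIC_PAD_TEMPLATE ("sink2",
    GST_PAD_SINK,
    GST_PAD_ALWAYS,
    GST_STATIC_CAPS ("video/x-raw")
    );
static GstStaticPadTemplate src2_factory = GST_STATIC_PAD_TEMPLATE ("src2",
    GST_PAD_SRC,
    GST_PAD_ALWAYS,
    GST_STATIC_CAPS ("video/x-raw")
    );
//In class_init, both templates must be added to the element class:
gst_element_class_add_static_pad_template (gstelement_class, &sink2_factory);
gst_element_class_add_static_pad_template (gstelement_class, &src2_factory);
//In the instance init, create the pads, attach the second chain function and add them:
filter->sinkpad2 = gst_pad_new_from_static_template (&sink2_factory, "sink2");
gst_pad_set_chain_function (filter->sinkpad2, GST_DEBUG_FUNCPTR (gst_test2_chain2));
gst_element_add_pad (GST_ELEMENT (filter), filter->sinkpad2);
filter->srcpad2 = gst_pad_new_from_static_template (&src2_factory, "src2");
gst_element_add_pad (GST_ELEMENT (filter), filter->srcpad2);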
I have a char* buffer that I read from a video.mp4 file. The buffer has size 4096.
I tried to create a GstBuffer from the char* buffer:
GstBuffer* buffer = gst_buffer_new_wrapped(data, size);
dataBuffer = gst_buffer_copy(buffer);
Then I push this buffer to the appsrc
GstElement* source = gst_bin_get_by_name (GST_BIN (consumer), "source");
gst_app_src_push_buffer (GST_APP_SRC (source), dataBuffer);
gst_object_unref (source);
The consumer pipeline was created in the following way:
gchar* videoConsumerString = g_strdup_printf ("appsrc max-buffers=5 drop=false name=source ! decodebin ! xvimagesink");
consumer = gst_parse_launch (videoConsumerString, NULL);
gst_element_set_state (consumer, GST_STATE_NULL);
g_free (videoConsumerString);
After creating the pipeline I set its state to GST_STATE_NULL.
When I start playing I set its state to GST_STATE_PLAYING.
But in the output I get an error:
ERROR from element mpegvparse0: No valid frames found before end of stream
I tried changing the size of the char* buffer and using different elements in the pipeline (e.g. ffmpegcolorspace, videoconvert and others), but that did not resolve the issue.
If I run with GST_DEBUG=3, I get a lot of warnings:
0:00:00.064480642 4059 0x12c66d0 WARN codecparsers_mpegvideo gstmpegvideoparser.c:887:gst_mpeg_video_packet_parse_picture_header: Unsupported picture type : 0
I use GStreamer 1.0.
Has anybody faced such a problem?
P.S. I cannot read the data from the file with GStreamer directly. I can only read buffers from the file with fread and then try to play them.
Maybe I have to use some specific fixed size for the buffer I read?
I solved this problem.
Unexpectedly for me, the problem was in the creation of the GstBuffer.
The correct way to create such a buffer from data (char*) with a known size is:
GstBuffer * buffer = gst_buffer_new_allocate(NULL, size, NULL);
gst_buffer_fill(buffer, 0, data, size);
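For completeness, the read-and-push step could then look roughly like this (a sketch only; the element name "source" and the 4096-byte chunk follow the question, error handling is omitted):
char data[4096];
size_t size = fread (data, 1, sizeof (data), file);
if (size > 0) {
    GstBuffer *buffer = gst_buffer_new_allocate (NULL, size, NULL);
    gst_buffer_fill (buffer, 0, data, size);
    GstElement *source = gst_bin_get_by_name (GST_BIN (consumer), "source");
    gst_app_src_push_buffer (GST_APP_SRC (source), buffer); /* takes ownership of the buffer */
    gst_object_unref (source);
}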
Thank you for your help!
I'm implementing a GStreamer media player with my own source of data, using appsrc. Everything works fine except one thing:
When the stream reaches its end, my callback emits the "end-of-stream" signal. The emitting call g_signal_emit_by_name(appsrc, "end-of-stream", &ret) returns the GstFlowReturn value GST_FLOW_OK. But then appsrc calls my need-data callback again, so it emits the "end-of-stream" signal again, and this time the GstFlowReturn value is (-3), which is GST_FLOW_UNEXPECTED. I assume appsrc does not expect an "end-of-stream" signal when it has already received one, but why does it request more data then? Maybe it is because I didn't set the size of the stream?
GStreamer version is 0.10.
Callback function code (appsrc type is seekable btw):
static void cb_need_data (GstElement *appsrc, guint size, gpointer user_data)
{
GstBuffer *buffer;
GstFlowReturn ret;
AppsrcData* data = static_cast<AppsrcData*>(user_data);
buffer = gst_buffer_new_and_alloc(size);
int read = fread(GST_BUFFER_DATA(buffer), 1, size, data->file);
GST_BUFFER_SIZE(buffer) = read;
g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
if (ret != GST_FLOW_OK) {
/* something wrong, stop pushing */
g_printerr("GST_FLOW != OK, return value is %d\n", ret);
g_main_loop_quit (data->loop);
}
if(feof(data->file) || read == 0)
{
g_signal_emit_by_name(appsrc, "end-of-stream", &ret);
if (ret != GST_FLOW_OK) {
g_printerr("EOF reached, GST_FLOW != OK, return value is %d\nAborting...", ret);
g_main_loop_quit (data->loop);
}
}
}
You should make some corrections to your code (if they are not there already); they should alleviate your issue and help the overall application:
1. Never try to send a buffer without first checking whether it actually has data: check that the buffer data is not NULL and that the length is > 0.
2. You can flag in your user_data that the stream has ended. When you send your EOS, set a field in your user data to indicate that it has been sent; if appsrc requests more data after that, check the flag and do not push anything else (see the sketch after this list).
3. Listen for the EOS message on your pipeline bus so that you destroy the stream and quit the loop only when the EOS message is handled; that way you can be sure your media sink has received the EOS and you can safely dispose of the pipeline and loop without losing any data.
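A rough sketch of point 2, using the 0.10 API from the question (the eos_sent field is an assumed addition to AppsrcData):
static void cb_need_data (GstElement *appsrc, guint size, gpointer user_data)
{
    AppsrcData* data = static_cast<AppsrcData*>(user_data);
    GstFlowReturn ret;
    /* EOS has already been sent: ignore any further need-data requests */
    if (data->eos_sent)
        return;
    GstBuffer *buffer = gst_buffer_new_and_alloc (size);
    int read = fread (GST_BUFFER_DATA (buffer), 1, size, data->file);
    if (read > 0) {
        GST_BUFFER_SIZE (buffer) = read;
        g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
        gst_buffer_unref (buffer); /* the push-buffer signal does not take ownership */
        if (ret != GST_FLOW_OK) {
            g_main_loop_quit (data->loop);
            return;
        }
    } else {
        gst_buffer_unref (buffer); /* nothing was read, never push an empty buffer */
    }
    if (feof (data->file) || read == 0) {
        g_signal_emit_by_name (appsrc, "end-of-stream", &ret);
        data->eos_sent = TRUE; /* remember EOS was emitted so we stop pushing */
    }
}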
Have you tried the method gst_app_src_end_of_stream()? I'm not sure what return code you should use after invoking it, but it should be either GST_FLOW_OK or GST_FLOW_UNEXPECTED.
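For reference, the direct call (from the appsrc library, header gst/app/gstappsrc.h) would simply be:
#include <gst/app/gstappsrc.h>
GstFlowReturn ret = gst_app_src_end_of_stream (GST_APP_SRC (appsrc));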
In GStreamer 1.x you return GST_FLOW_EOS.
I want to manually link tee and autovideosink, but
I can't get the pad template with gst_element_class_get_pad_template:
data->video_pipeline = gst_pipeline_new ("videopipeline");
gst_bin_add_many(GST_BIN(data->video_pipeline),udpsrc,rtph264depay,avdec_h264,/*videorate,*/clockoverlay,tee,/*queue,*/autovideosink,NULL);
if (!gst_element_link_filtered (udpsrc,rtph264depay,udpsrc_caps)){
GST_ERROR ("Can't link udpsrc and rtph264depay with caps");
}
if (!gst_element_link_many (rtph264depay,avdec_h264,/*videorate,*/clockoverlay,tee,NULL)){
GST_ERROR ("Can't link many to tee");
}
gst_object_unref (G_OBJECT(videorate_src_pad));//This might cause a memory error
gst_caps_unref(videorate_caps);///Free the caps
tee_src_pad_template = gst_element_class_get_pad_template(GST_ELEMENT_GET_CLASS(tee),"src_1");
pad_surface_src = gst_element_request_pad(tee,tee_src_pad_template,"src_%d",NULL);
// pad_surface_src = gst_element_get_request_pad(tee,"src_%d");
if(!pad_surface_src){
g_printerr ("Can't obtain request pad src for tee.\n");
}
pad_surface_sink = gst_element_get_static_pad(autovideosink,"sink");
if(!pad_surface_sink){
g_printerr ("Can't obtain request pad sink for autovideosink.\n");
}
if (gst_pad_link (pad_surface_src,pad_surface_sink)!=GST_PAD_LINK_OK){
g_printerr ("Tee could not be linked.\n");
gst_object_unref (data->video_pipeline);
return -1;
}
gst_object_unref(pad_surface_sink);
Why does this happen?
The pad template on tee is called "src_%u" (or "src_%d" in 0.10). You'll have to use that as a name instead of "src_1".
For requesting a pad you can use a concrete name such as "src_1", but only do that if you want your pads to have those names instead of names chosen automatically by tee. Letting tee choose the names is more efficient.
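To illustrate, requesting and linking a tee src pad by its template name could look roughly like this for 1.x (variable names follow the question):
/* let tee pick the actual pad name (src_0, src_1, ...) from the "src_%u" template */
pad_surface_src = gst_element_get_request_pad (tee, "src_%u");
pad_surface_sink = gst_element_get_static_pad (autovideosink, "sink");
if (gst_pad_link (pad_surface_src, pad_surface_sink) != GST_PAD_LINK_OK)
    g_printerr ("Tee could not be linked.\n");
gst_object_unref (pad_surface_sink);
/* on teardown: gst_element_release_request_pad (tee, pad_surface_src); then unref the pad */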