pad not being added to uridecodebin - gstreamer

I'm currently working on a GStreamer pipeline that begins with a uridecodebin that opens a png file, which I hope to eventually link to an imagefreeze element (although in the future I may want to link it to an arbitrary element). I've connected to the "pad-added" signal, but it appears that uridecodebin never actually creates the pad.
After looking closely at the logs, it appears that it successfully opens the png file and links the filesrc to the decodebin, but no pad ever seems to be created (the callback is never called, and when I iterate over the src pads of the uridecodebin after the file is opened, there are none). Does anyone know why this might be the case?
Unfortunately it's part of a much larger codebase so I can't share the full code, but I can give excerpts of the relevant parts:
GstElement *uridecodebin = gst_element_factory_make ("uridecodebin", NULL);
GstElement *imagefreeze = gst_element_factory_make ("imagefreeze", NULL);
GstElement *sink = gst_element_factory_make ("fakesink", NULL);
g_object_set (G_OBJECT (uridecodebin), "uri",
    "file:///test.png", NULL);
g_signal_connect (uridecodebin, "pad-added",
G_CALLBACK (uri_pad_added_cb), NULL);
gst_bin_add_many (GST_BIN (bin), uridecodebin, imagefreeze, sink, NULL);
gst_element_link (imagefreeze, sink);
And then, the callback (at this point, just a stub):
static void
uri_pad_added_cb (GstElement * element, GstPad * pad, gpointer data)
{
  GST_WARNING ("uri_pad_added_cb");
}

Related

Gstreamer Elements could not be linked

I'm new to GStreamer. I followed Basic tutorial 2 on the GStreamer website and tried to make it work with a local mp4 file.
My problem is that I can't link "uridecodebin" and "autovideosink"; here is the relevant code:
GstElement *pipeline, *source, *sink;
GstBus *bus;
GstMessage *msg;
GstStateChangeReturn ret;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Create the elements */
source = gst_element_factory_make ("uridecodebin", "source");
sink = gst_element_factory_make ("autovideosink", "sink");
/* Create the empty pipeline */
pipeline = gst_pipeline_new ("pipeline");
if (!pipeline || !source || !sink) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Build the pipeline */
gst_bin_add_many (GST_BIN (pipeline), source, sink, NULL);
if (gst_element_link (source, sink) != TRUE) {
g_printerr ("Elements could not be linked.\n");
gst_object_unref (pipeline);
return -1;
}
gst_element_link (source, sink) always returns FALSE, but it works fine if I simply use the gst-launch-1.0 uridecodebin uri=file://MyPathToVideo/test.mp4 ! autovideosink command. What am I doing wrong?
Thanks a lot.
Try using gst_parse_launch() and giving it your pipeline. It is shorter this way, and I believe it takes care of some particularities in your case.
The main reason your approach is not working is that uridecodebin does not expose any pads, because at that point in time it does not know anything about your MP4 file: it could contain audio, video, both, or something else entirely. The correct approach is to implement delayed linking.
So instead of linking it directly you implement the pad-added signal on uridecodebin:
https://gstreamer.freedesktop.org/documentation/gstreamer/gstelement.html?gi-language=c#GstElement::pad-added
Then you start the pipeline with the elements disconnected.
This pad-added signal is triggered when uridecodebin has scanned your media file and exposes pads which can be linked. In case it is your video pad you can connect it to the autovideosink.
gst_parse_launch(), if I'm not mistaken, will take care of this automatically for you (at least that is what gst-launch-1.0 does; I'm not sure whether that specific functionality moved to that API as well).
P.S. You jumped the gun: Tutorial 2 does not use uridecodebin but more basic elements. Tutorial 3 covers dynamic pipelines.

Streaming a webm video from a URL into a C++ windows.h application

I'm using C++ to make my own windows application. There's a .webm video I'd like to play within this application, but I'd like to play it from a URL, as opposed to loading it in from the same directory that I'd put my .exe in. I'm running Windows 10, and just using Emacs and g++ to write/compile.
Does anyone know how I can accomplish this? What includes do I need, is it possible, etc.?
Note: the webm video can be converted to mp4 as well.
For clarification, by "windows application", I mean one of these:
HWND hwnd = CreateWindowEx(0, CLASS_NAME, L"WindowName", WS_OVERLAPPEDWINDOW, CW_USEDEFAULT, CW_USEDEFAULT, CW_USEDEFAULT, CW_USEDEFAULT, NULL, NULL, hInstance, NULL);
I'd like to keep using this, since I can make a nice, borderless window with it!
This sample uses IMFPMediaPlayer::CreateMediaItemFromURL; you can directly pass the URL to the PlayMediaFile function, like:
WCHAR uri[] = L"http://dl5.webmfiles.org/big-buck-bunny_trailer.webm";
hr = PlayMediaFile(hwnd, uri);
Use GStreamer with uridecodebin (you need to set the uri property). Note that uridecodebin has no static source pads, so the sink has to be linked from a "pad-added" handler rather than with a direct gst_element_link(). You may also need an extra autovideoconvert and/or videoscale element between source and sink.
static void
on_pad_added (GstElement * src, GstPad * pad, gpointer sink)
{
  GstPad *sinkpad = gst_element_get_static_pad (GST_ELEMENT (sink), "sink");
  if (!gst_pad_is_linked (sinkpad))
    gst_pad_link (pad, sinkpad);
  gst_object_unref (sinkpad);
}

GstElement *pipeline = gst_pipeline_new ("xvoverlay");
GstElement *src = gst_element_factory_make ("uridecodebin", NULL);
GstElement *sink = gst_element_factory_make ("d3dvideosink", NULL);
g_object_set (G_OBJECT (src), "uri", "some_url", NULL);
g_signal_connect (src, "pad-added", G_CALLBACK (on_pad_added), sink);
gst_bin_add_many (GST_BIN (pipeline), src, sink, NULL);
gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (sink), (guintptr) hwnd);
GstStateChangeReturn sret = gst_element_set_state (pipeline,
    GST_STATE_PLAYING);
If you have GStreamer installed, test your setup using:
gst-launch-1.0 uridecodebin uri="http://dl5.webmfiles.org/big-buck-bunny_trailer.webm" ! d3dvideosink

gstreamer filesrc is not working with error "sink-actual-sink-d3dvideo", but gst-launch is working correctly on windows

I want to read a file and play it back. Very simple.
(Windows 10, Visual Studio 2017 Community.)
In the command prompt, this works correctly:
gst-launch-1.0 filesrc location="C:/test.webm" ! decodebin ! autovideosink
but this code of mine is not working:
int main(int argc, char *argv[]) {
  /* ... declare variables ... */

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the elements */
  source = gst_element_factory_make ("filesrc", "source");
  decode = gst_element_factory_make ("decodebin", "decode");
  sink = gst_element_factory_make ("autovideosink", "sink");
  pipeline = gst_pipeline_new ("test-pipeline");
  gst_bin_add_many (GST_BIN (pipeline), source, decode, sink, NULL);
  g_object_set (G_OBJECT (source), "location", "C:/test.webm", NULL);
  bus = gst_element_get_bus (pipeline);

  /* ... error processing ... */
}
and my error is this:
Error received from element sink-actual-sink-d3dvideo: Output window was closed
Debugging information: ../sys/d3dvideosink/d3dhelpers.c(1911): d3d_render_buffer (): /GstPipeline:test-pipeline/GstAutoVideoSink:sink/GstD3DVideoSink:sink-actual-sink-d3dvideo
Please help me figure out what my problem is.
My code is almost the same as the official tutorial; I just changed videotestsrc to filesrc, added decodebin between source and sink, and set the property giving the media file location.

gstreamer pipeline EOS issues

I'm writing a program to mimic a GStreamer pipeline I have working from the command line.
I have been able to successfully trap some signals like:
g_signal_connect (data2.source, "pad-added", G_CALLBACK (pad_added_handler), &data2);
g_signal_connect (data2.source, "drained", G_CALLBACK (eos_cb), &data);
to add pads and to tell when the URL reader has reached end of stream (EOS).
I'm trying to create a trap to find when the bus has reached EOS, but am having issues. I've seen examples of trapping errors from the bus like this:
g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
I'm thinking something like this should work:
g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback)eos_cb_bus, &data);
But I do not know exactly what to look for (the 'message::eos' part).
Can someone help me? Thanks much!
The GStreamer hello world example is a good start to see how this should be handled:
https://gstreamer.freedesktop.org/documentation/application-development/basics/helloworld.html
Basically you set up a GstBus callback and pick the messages from there which you are interested in. In your case it will be EOS.
Compare your code with How to use a bus. Copying the example code from there:
bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
gst_bus_add_signal_watch (bus);
g_signal_connect (bus, "message::error", G_CALLBACK (cb_message_error), NULL);
g_signal_connect (bus, "message::eos", G_CALLBACK (cb_message_eos), NULL);
So "message::eos" is the correct signal name. Possibly you forgot gst_bus_add_signal_watch() in your code?
Compare also Difference between gst_bus_add_watch() and g_signal_connect().

GStreamer pre-recording

I'm trying to implement pre-recording.
I use 20 seconds of shared memory as a circular buffer, permanently recording video into it.
When an event occurs, I want to write the entire buffer to the file, and then keep recording video for another 40 seconds.
How can I instantly encode the video from shared memory and write to a file, and then continue to write from memory to the file for some time?
You can ask the GStreamer queue to do the pre-buffering as follows:
/* Unlimited byte/buffer limits; bound the queue by time only */
g_object_set (G_OBJECT (queue),
    "max-size-bytes", 0,
    "max-size-buffers", 0,
    "max-size-time", (guint64) threshold_time,
    NULL);
/* Drop old buffers when max-size-time is reached (leaky=downstream) */
g_object_set (G_OBJECT (queue), "leaky", 2, NULL);
Install a probe callback on the queue's src pad:
gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_BUFFER | GST_PAD_PROBE_TYPE_BLOCK,
(GstPadProbeCallback) callback, NULL, NULL);
Whenever you don't want to pass the buffers, return GST_PAD_PROBE_DROP from the callback; when you want to pass the buffers, return GST_PAD_PROBE_PASS.
Have a pipeline something like below:
appsrc --> queue --> encode --> mux --> filesink