GStreamer: dynamically link a tee while pipeline is PLAYING - c++

Using GStreamer 1.16.3 on Ubuntu 20, C/C++. At some point during runtime, I'm trying to link a tee element (its src_0 is already linked and playing) to a recording bin which ends with a filesink.
The following snippet creates the recording bin, gets the new filesink element, sets its location property, adds the recording bin to the pipeline, syncs the recording bin's state with the pipeline, gets the pipeline's tee element, requests a src pad from it, gets the recording bin's sink pad, and links the src and sink pads to complete the process.
auto error = (GError *)nullptr;
auto recording_bin = gst_parse_bin_from_description("queue ! filesink name=recording_filesink", true, &error);
auto filesink = gst_bin_get_by_name(GST_BIN(recording_bin), "recording_filesink");
g_object_set(G_OBJECT(filesink), "location", "/home/.../record.ts", NULL);
g_object_unref(filesink);
gst_bin_add(GST_BIN(pipeline), recording_bin);
gst_element_sync_state_with_parent(recording_bin);
auto tee = gst_bin_get_by_name(GST_BIN(pipeline), "tee");
auto tee_src = gst_element_get_request_pad(tee, "src_%u");
auto recording_sink = gst_element_get_static_pad(recording_bin, "sink");
auto ret = gst_pad_link(tee_src, recording_sink);
g_object_unref(recording_sink);
g_object_unref(tee_src);
g_object_unref(tee);
After executing the above, ret is GST_PAD_LINK_OK, but the pipeline gets stuck - neither displaying (1st tee src) nor recording (2nd tee src).
Is this the right way to dynamically link a tee src to a recording bin?

Related

Gstreamer: Signal RTP header extension to the payloader

I have an RTP streaming app which implements the following pipeline using the C API.
gst-launch-1.0 -v rtpbin name=rtpbin \
videotestsrc ! x264enc ! rtph264pay ! rtpbin.send_rtp_sink_0 \
rtpbin.send_rtp_src_0 ! udpsink port=5002 host=127.0.0.1 \
rtpbin.send_rtcp_src_0 ! udpsink port=5003 host=127.0.0.1 sync=false async=false \
udpsrc port=5007 ! rtpbin.recv_rtcp_sink_0
I want to add header extensions to the RTP packet; therefore I created an extension using the new GstRTPHeaderExtension class introduced in GStreamer v1.20. I want to set the attributes of the extension (e.g. color space properties for the example below). AFAIU this should be done by providing those as caps to the payloader element. However, I can't figure out how I should provide these caps exactly. Do I need to use a capsfilter here or what is the right way? In the current state, I can send the RTP packets and see that the extension is added but can't set the attributes.
Related parts of the code are below:
#define URN_COLORSPACE "http://www.webrtc.org/experiments/rtp-hdrext/color-space"
const GstVideoColorimetry colorimetry = {
    GST_VIDEO_COLOR_RANGE_0_255,
    GST_VIDEO_COLOR_MATRIX_BT601,
    GST_VIDEO_TRANSFER_BT2020_10,
    GST_VIDEO_COLOR_PRIMARIES_BT2020};
const GstVideoChromaSite chroma_site = GST_VIDEO_CHROMA_SITE_MPEG2;
ext = gst_rtp_header_extension_create_from_uri(URN_COLORSPACE);
gst_rtp_header_extension_set_id(ext, 1);
g_signal_emit_by_name(videopay, "add-extension", ext);
// other element definitions, links..
videopay = gst_element_factory_make("rtph264pay", "videopay");
colorimetry_str = gst_video_colorimetry_to_string(&colorimetry);
// How do I provide these caps to the payloader to set the extension properties?
caps = gst_caps_new_simple("application/x-rtp",
    "media", G_TYPE_STRING, "video",
    "clock-rate", G_TYPE_INT, 90000,
    "encoding-name", G_TYPE_STRING, "H264",
    "colorimetry", G_TYPE_STRING, colorimetry_str,
    "chroma-site", G_TYPE_STRING,
    gst_video_chroma_to_string(chroma_site), NULL);
The caps should be provided to the sink of the RTP payloader element using a capsfilter element:
GstElement *capsfilt;
capsfilt = gst_element_factory_make("capsfilter", "capsfilter");
g_object_set(capsfilt, "caps", caps, NULL);
gst_element_link_many(videosrc, videoenc, capsfilt, videopay, NULL);
where videosrc, videoenc, videopay are the source, encoder and payloader elements, respectively.
Also, the caps should have a media type matching the encoder element's output, e.g. video/x-h264 if the encoder element is an instance of x264enc.
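For reference, a minimal sketch of what that would look like, reusing the answer's capsfilt element and the colorimetry_str / chroma_site values from the question (the variable name enc_caps is introduced here):
// Hedged sketch: same colorimetry fields as above, but with the encoder's media type.
GstCaps *enc_caps = gst_caps_new_simple("video/x-h264",
    "colorimetry", G_TYPE_STRING, colorimetry_str,
    "chroma-site", G_TYPE_STRING,
    gst_video_chroma_to_string(chroma_site), NULL);
g_object_set(capsfilt, "caps", enc_caps, NULL);
gst_caps_unref(enc_caps);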
The payloader will try to automatically enable the extension with the attributes set in the caps by passing the caps to the extension, if auto-header-extension is enabled (set to true by default).
In a gst-launch pipeline, the caps are passed automatically when the header extension is inserted after the payloader.

Gstreamer command line pipeline to C++ code

I have a gstreamer pipeline that works in the command line and I am trying to convert it to C++ code. I have most of it, except I need to be able to reproduce the -e flag in C++, but I'm not sure how to add it to the pipeline. Here is the command line:
gst-launch-1.0 -e udpsrc port=8000 ! application/x-rtp, encoding-name=H264, payload=109 ! tee name=t t. ! rtph264depay ! h264parse ! queue ! avdec_h264 ! videoconvert ! autovideosink t. ! rtph264depay ! h264parse ! queue ! mp4mux ! filesink location=~/camera.mp4
Here is the C++ code I have. It works to display a live stream from a camera and write an mp4 file; however, the resulting file is not playable. The -e flag is what makes the file playable.
// [1] Create Elements
pipeline = gst_pipeline_new("xvoverlay");
src = gst_element_factory_make("udpsrc", NULL);
caps = gst_element_factory_make("capsfilter", NULL);
tee = gst_element_factory_make("tee", "tee");
// Display
rtpDepay = gst_element_factory_make("rtph264depay", NULL);
h264Parse = gst_element_factory_make("h264parse", NULL);
displayQueue = gst_element_factory_make("queue", NULL);
decoder = gst_element_factory_make("avdec_h264", NULL);
videoConvert = gst_element_factory_make("videoconvert", NULL);
upload = gst_element_factory_make("d3d11upload", NULL);
sink = gst_element_factory_make("d3d11videosink", NULL);
// Record
recordRtpDepay = gst_element_factory_make("rtph264depay", NULL);
recordH264Parse = gst_element_factory_make("h264parse", NULL);
recordQueue = gst_element_factory_make("queue", "save_queue");
mux = gst_element_factory_make("mp4mux", NULL);
filesink = gst_element_factory_make("filesink", NULL);
// [2] Set element properties
g_object_set(src, "port", port, NULL);
g_object_set(caps, "caps", gst_caps_from_string("application/x-rtp, encoding-name=H264, payload=109"), NULL);
g_object_set(filesink, "location", "camera.mp4", NULL);
//g_object_set(mux, "faststart", true, NULL);
// [3] Add elements to pipeline and link together
//gst_bin_add_many(GST_BIN(pipeline), src, caps, rtpDepay, h264Parse, displayQueue, decoder, videoConvert, upload, sink, NULL);
//gst_element_link_many(src, caps, rtpDepay, h264Parse, displayQueue, decoder, videoConvert, upload, sink, NULL);
gst_bin_add_many(GST_BIN(pipeline), src, caps, tee, rtpDepay, h264Parse, displayQueue, decoder, videoConvert, upload, sink, recordRtpDepay, recordH264Parse, recordQueue, mux, filesink, NULL);
if (!gst_element_link_many(src, caps, tee, NULL)
    || !gst_element_link_many(tee, rtpDepay, h264Parse, displayQueue, decoder, videoConvert, upload, sink, NULL)
    || !gst_element_link_many(tee, recordRtpDepay, recordH264Parse, recordQueue, mux, filesink, NULL))
{
    qDebug() << "Failed to link elements";
}
How do I add the -e flag as a GstElement? I've searched online and I can't find anyone trying to do this programmatically with that flag.
The -e flag sends an EOS at the end of the stream. While processing the EOS message, the video writer will write the header information needed for the video to be playable.
The solution is to change the way you stop your pipeline. Instead of however you currently do it (you did not include that code), you should send a GST_EVENT_EOS to your udpsrc element. This signals that the stream has ended. Each element will process what it needs to, and then forward the event further down the pipeline. It will reach the video writer (mp4mux), which will then write the needed header information to your video file before the pipeline shuts down.
On shutdown, call
gst_element_send_event(src, gst_event_new_eos());
This will send an EOS event downstream and cause the required metadata to be written.
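For illustration, a minimal shutdown sketch along those lines (src and pipeline are the elements from the code above; waiting for the EOS message on the bus before tearing down is an addition here):
// Ask the source to emit EOS so mp4mux can finalize the file.
gst_element_send_event(src, gst_event_new_eos());
// Wait until the EOS (or an error) has travelled through the pipeline.
GstBus *bus = gst_element_get_bus(pipeline);
GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
    (GstMessageType)(GST_MESSAGE_EOS | GST_MESSAGE_ERROR));
if (msg)
    gst_message_unref(msg);
gst_object_unref(bus);
// Only now shut the pipeline down.
gst_element_set_state(pipeline, GST_STATE_NULL);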
It's easier to use gst_parse_launch and then gst_bin_get_by_name on the elements you want to do fancy things with.
Note that -e (--eos-on-shutdown) is an option of the gst-launch-1.0 tool itself, not something gst_init or gst_parse_launch will handle; in application code you reproduce it by sending the EOS event on shutdown as shown above.
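As a rough sketch of the gst_parse_launch route (the pipeline string is taken from the question; the names t and rec_sink are labels introduced here):
GError *err = NULL;
GstElement *pipeline = gst_parse_launch(
    "udpsrc port=8000 ! application/x-rtp, encoding-name=H264, payload=109 ! tee name=t "
    "t. ! rtph264depay ! h264parse ! queue ! avdec_h264 ! videoconvert ! autovideosink "
    "t. ! rtph264depay ! h264parse ! queue ! mp4mux ! filesink name=rec_sink location=camera.mp4",
    &err);
// Fetch individual elements when you need to tweak them.
GstElement *rec_sink = gst_bin_get_by_name(GST_BIN(pipeline), "rec_sink");
// ... adjust properties here if needed, e.g. the filesink location ...
gst_object_unref(rec_sink);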

how to use x264enc and avdec_h264?

I'm new to gstreamer. I want to encode the video of my MacBook Pro built-in cam to h264 and then play it. On the command line I tried
gst-launch-1.0 autovideosrc ! queue ! x264enc ! avdec_h264 ! queue ! autovideosink
and it works, but when I run the C++ code it fails and only shows a green screen.
video_src = gst_element_factory_make("autovideosrc", "video_source");
video_enc = gst_element_factory_make("x264enc", "videoEncoder");
video_dec = gst_element_factory_make("avdec_h264", "videodecoder");
video_sink = gst_element_factory_make("osxvideosink", nullptr);
gst_bin_add_many...
gst_element_link_many (video_src, screen_queue, video_enc, video_dec, video_sink, NULL);
not sure how to correct it. thanks!

Gstreamer : 'src' element task going to PAUSE state when one of the elements in pipeline is added dynamically

I have the following pipeline :
v4l2src -> queue -> h264parse -> avdec_h264 -> identity ->
imagefreeze(added/removed dynamically) -> glupload -> glcolorconvert ->
gltransformation -> glimagesink
I have added a probe on the identity element's srcpad.
Based on the user input, I dynamically add or remove the imagefreeze element.
Here is the pseudo code :
#show live video on rendering window till no user input
#if user_input == 1:
insert_imagefreeze #analogous to image being displayed on rendering window
#if user_input == 2:
delete_imagefreeze #resume back showing live video as before
Inserting imagefreeze is no problem, it works fine; I can observe the results that I would expect with an imagefreeze.
However, after the imagefreeze element is added, the v4l2src element's task goes to a paused state. Here is the info log:
0:03:39.608226968 29510 0x1561c00 INFO v4l2src gstv4l2src.c:949:gst_v4l2src_create:<source> sync to 0:03:39.066664476 out ts 0:03:39.375180156
0:03:39.608449406 29510 0x1561c00 INFO basesrc gstbasesrc.c:2965:gst_base_src_loop:<source> pausing after gst_pad_push() = eos
0:03:39.608561724 29510 0x1561c00 INFO task gsttask.c:316:gst_task_func:<source:src> Task going to paused.
Can anyone explain why the source element of the pipeline goes to a paused state once a new element is added to the pipeline?
And here are snippets from the actual code:
def add_delete(self):
    if ui_input_cnt == 1:  # updated based on user input
        self.idsrcpad = self.identity.get_static_pad("src")
        self.in_idle_probe = False
        self.probeID = self.idsrcpad.add_probe(Gst.PadProbeType.IDLE, self.lengthen_pipeline)
    if ui_input_cnt == 2:
        self.probeID2 = self.idsrcpad.add_probe(Gst.PadProbeType.IDLE, self.shorten_pipeline)

def lengthen_pipeline(self, pad, info):
    print("entered callback")
    global pipeline
    # restrict this callback to a single call
    if self.in_idle_probe == True:
        print("callback for now restricted to one call per thread")
        return Gst.PadProbeReturn.OK
    if self.in_idle_probe == False:
        self.in_idle_probe = True
    # create imagefreeze element
    self.ifreeze = Gst.ElementFactory.make("imagefreeze", "ifreeze")
    # increment reference
    self.ifreeze.ref()
    # add imagefreeze to pipeline
    pipeline.add(self.ifreeze)
    # sync imagefreeze state with the main pipeline
    self.ifreeze.sync_state_with_parent()
    # unlink identity and upload
    # 1. get the sink pad of upload and the src pad of identity
    sinkpad = self.upload.get_static_pad("sink")
    srcpad = self.identity.get_static_pad("src")
    print("unlinking identity srcpad - upload sinkpad")
    if self.check_and_unlink(srcpad, sinkpad):
        # 2. get the sink pad of imagefreeze
        sinkpad = self.ifreeze.get_static_pad("sink")
        # 3. link identity src pad to imagefreeze sink pad
        print("linking identity srcpad - ifreeze sinkpad")
        self.check_and_link(srcpad, sinkpad)
        # 4. link imagefreeze src pad to upload sink pad
        # get imagefreeze srcpad and the sink pad of upload
        srcpad = self.ifreeze.get_static_pad("src")
        sinkpad = self.upload.get_static_pad("sink")
        print("linking ifreeze srcpad - upload sinkpad")
        if self.check_and_link(srcpad, sinkpad):
            return Gst.PadProbeReturn.REMOVE
        else:
            print("ERROR: unlinking")
            return -1
The functions check_and_link(srcpad, sinkpad) and check_and_unlink(srcpad, sinkpad) do no more than check the src and sink pads and then link or unlink them accordingly.
The sink sends an EOS event when the imagefreeze is added dynamically. This makes the pipeline go into a PAUSED state. However, why the sink receives an EOS event is still unclear and needs further investigation. More information on the observations is provided here.

gstreamer add an element to a pipeline created through gst_parse_launch

I am trying to create a pipeline and then add on a videosink after I've created it. I need to do this so I can set the video overlay window id of the videosink before I commit it to the pipeline.
so I have this code
pipeline = gst_parse_launch( "filesrc location=file.svg ! rsvgdec ! imagefreeze", &err );
sink = gst_element_factory_make( "glimagesink", nullptr );
gst_video_overlay_set_window_handle( GST_VIDEO_OVERLAY( sink ), this->winId() );
gst_bin_add_many( GST_BIN( pipeline ), sink, nullptr );
if ( !gst_element_link_many( pipeline, sink, nullptr ) )
{
qCritical() << "Unable to link elements";
}
When I run it, it fails to link the elements.
Any idea why this is happening? I'm assuming it's because I'm trying to link an element to a "bin" rather than to another element. However, I can't find any examples where someone adds an element to a pipeline that was created through gst_parse_launch.
You can't connect it to a bin. You need to specify a pad - or an element from which it tries to pick a pad. So you would need to iterate through the bin and pick the imagefreeze element from the list.
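As a rough sketch of that idea (instead of iterating the bin, the imagefreeze is given a name in the launch string here so it can be fetched directly; the name freeze is introduced for illustration):
pipeline = gst_parse_launch( "filesrc location=file.svg ! rsvgdec ! imagefreeze name=freeze", &err );
GstElement *freeze = gst_bin_get_by_name( GST_BIN( pipeline ), "freeze" );
// Add the sink to the pipeline, then link the element inside the bin (not the bin itself) to it.
gst_bin_add( GST_BIN( pipeline ), sink );
if ( !gst_element_link( freeze, sink ) )
{
    qCritical() << "Unable to link elements";
}
gst_object_unref( freeze );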
Alternative approach - add the sink and get it from the pipeline:
pipeline = gst_parse_launch( "filesrc location=file.svg ! rsvgdec ! imagefreeze ! glimagesink name=mysink", &err );
GstElement *sink = gst_bin_get_by_name( GST_BIN( pipeline ), "mysink" );
gst_video_overlay_set_window_handle( GST_VIDEO_OVERLAY( sink ), this->winId() );
gst_object_unref( sink );
You may have issues here as well though, as you may require a videoconvert to satisfy the sink's format requirements:
filesrc location=file.svg ! rsvgdec ! imagefreeze ! videoconvert ! glimagesink
or maybe
filesrc location=file.svg ! rsvgdec ! imagefreeze ! glupload ! glcolorconvert ! glimagesink