Assisted-autoplugging (cutting) of uridecodebin - gstreamer

gboolean demuxDone = FALSE;

/* "autoplug-continue" handler: returning FALSE stops uridecodebin from
 * autoplugging anything downstream of the pad that exposed these caps.
 * Note the gpointer user_data argument, which the signal always passes. */
gboolean
autopluggerCallback (GstElement *elem, GstPad *pad, GstCaps *caps,
    gpointer user_data)
{
    if (cmpType (caps, "video/x-h264")) {
        relayVideoPad = pad;
        demuxDone = TRUE;
    }
    if (cmpType (caps, "audio/x-ac3")) {
        relayAudioPad = pad;
        demuxDone = TRUE;
    }
    if (demuxDone) {
        return FALSE;
    }
    return TRUE;
}
I connected an autoplug-continue signal handler to uridecodebin. My goal is to prevent it from creating anything after the tsdemux, and then to connect the video/audio pads to an flvmux myself.
But the problem I am having is that one more element is still created: a multiqueue, linked right after tsdemux0. Why? I tried to detect the creation of the demuxer by catching the element-added signal instead of waiting for video/x-h264, but the result is the same.
The resulting pipeline is dumped to dot:
http://pastebin.com/acBUdfpi
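As an aside, a sketch of the element-added idea: the multiqueue is created inside the internal decodebin, so a plain "element-added" handler on uridecodebin itself never sees it, but GstBin's recursive "deep-element-added" signal (GStreamer 1.10+) does. "multiqueue" here is a hypothetical global, like relayVideoPad above:

static void
deep_element_added_cb (GstBin *bin, GstBin *sub_bin,
    GstElement *element, gpointer user_data)
{
    gchar *name = gst_element_get_name (element);
    if (g_str_has_prefix (name, "multiqueue"))
        multiqueue = element;   /* grab the pointer as soon as it appears */
    g_free (name);
}

g_signal_connect (uridecodebin, "deep-element-added",
    G_CALLBACK (deep_element_added_cb), NULL);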
Well, I can probably just connect the multiqueue to the flvmux instead, but then I do not know how to get the multiqueue pointer. I tried gst_pad_get_peer followed by gst_pad_get_parent_element (to go from the demuxer's video src pad to the next element), but gst_pad_get_parent_element returns NULL even though the peer is non-NULL.
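A sketch of that pad walk, assuming relayVideoPad from the callback above; gst_pad_get_parent_element() returns NULL when the pad's direct parent is not an element, e.g. when the peer is the internal proxy pad of a ghost pad, in which case you have to hop across the ghost pad first:

GstPad *peer = gst_pad_get_peer (relayVideoPad);
if (peer != NULL) {
    GstElement *next = gst_pad_get_parent_element (peer);
    if (next == NULL && GST_IS_PROXY_PAD (peer)) {
        /* the peer's parent is a ghost pad, not an element: hop across it */
        GstPad *ghost = GST_PAD (gst_proxy_pad_get_internal (GST_PROXY_PAD (peer)));
        next = gst_pad_get_parent_element (ghost);
        gst_object_unref (ghost);
    }
    if (next != NULL) {
        g_print ("downstream element: %s\n", GST_ELEMENT_NAME (next));
        gst_object_unref (next);
    }
    gst_object_unref (peer);
}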

Related

How to use splitmuxsink in a dynamic pipeline

What is the correct way of using splitmuxsink in a dynamic pipeline?
Previously I used filesink to record (no problems whatsoever), but there is a requirement to save the file in segments, so I tried to use splitmuxsink in a dynamic pipeline (recording starts and stops asynchronously). In doing so I ran into two problems.
When I try to stop the recording, I block the recording queue with an idle pad probe and run a callback that unlinks the recording branch (send EOS, set the elements in the recording bin to NULL, then remove the bin). I set a downstream data probe to notify me that the EOS has reached the splitmuxsink's sink pad before I do step 2 (setting the elements to NULL).
However, the end result is that I still get an empty last file (0 bytes). It seems the pipeline is not yet closed or has some other problem. As a workaround I split the video immediately when the recording stops (though I lose a few frames).
How should one stop such a dynamic branch?
The second problem: I now create the recording bin when the recording starts (using the pad-added signal, fired when a tee pad is requested, to connect the recording bin). Previously I created the recording bin up front (not inside the GLib loop I created), and that worked fine, but with the new approach splitmuxsink's internal filesink ends up in a locked state.
How should I work around this? What causes the locked state?
Here is my code
/// create record bin
static void
pad_added (GstElement *self, GstPad *new_pad, gpointer user_data)
{
    gchar *pad_name = gst_pad_get_name (new_pad);
    if (g_str_has_prefix (pad_name, "src"))  /* tee request pads are named src_%u */
    {
        //RECORD records;
        records.recording = gst_bin_new ("recording");
        records.queue = gst_element_factory_make ("queue", "queue");
        records.enc = gst_element_factory_make ("vpuenc_h264", "enc");
        records.parser = gst_element_factory_make ("h264parse", "parser");
        records.sink = gst_element_factory_make ("splitmuxsink", "sink");
        // Add the elements to the recording bin
        gst_bin_add_many (GST_BIN (records.recording),
            records.queue,
            records.enc,
            records.parser,
            records.sink, NULL);
        // link up the recording elements: queue -> enc -> parser -> sink
        gst_element_link_many (records.queue,
            records.enc,
            records.parser,
            records.sink, NULL);
        g_object_set (G_OBJECT (records.sink),   /* was records.fsink, which does not exist */
            //"location","video_%d.mp4",
            "max-size-time", (guint64) 10L * GST_SECOND,
            "async-handling", TRUE,
            "async-finalize", TRUE,
            NULL);
        // expose the queue's sink pad as a ghost pad on the bin
        records.queue_sink_pad = gst_element_get_static_pad (records.queue, "sink");
        records.ghost_pad = gst_ghost_pad_new ("sink", records.queue_sink_pad);
        gst_pad_set_active (records.ghost_pad, TRUE);
        gst_element_add_pad (GST_ELEMENT (records.recording), records.ghost_pad);
        g_signal_connect (records.sink, "format-location",
            (GCallback) format_location_callback,
            &records);
    }
    g_free (pad_name);
}
gboolean cmd_loop (gpointer user_data)
{
    // other commands not shown here
    if (RECORD)
    {
        // request a tee src pad;
        // this step triggers the pad_added callback
        tee_src_pad = gst_element_get_request_pad (tee, "src_%u");
        // ....other function
    }
    return TRUE;  // keep the timeout source running
}
int main()
{
    // ... pipeline setup not shown ...
    // add the pad-added signal handler
    g_signal_connect (tee, "pad-added", G_CALLBACK (pad_added), NULL);
    // used to construct the loop (cycles every 1 s)
    GSource *source = g_timeout_source_new (1000);
    // set the function that watches for commands
    g_source_set_callback (source,
        (GSourceFunc) cmd_loop,
        NULL,
        NULL);
    g_source_attach (source, NULL);  // was missing: the source must be attached to a main context
}
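For problem 1, a rough sketch of the stop sequence (tee, tee_src_pad and records are the globals used above; an empty last file usually means the bin was set to NULL before splitmuxsink finalized the fragment, so wait for the EOS message on the bus, not just the event at a probe):

static GstPadProbeReturn
stop_recording_cb (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
    // the pad is idle here, so no buffer is in flight on this branch
    gst_pad_unlink (tee_src_pad, records.ghost_pad);
    gst_element_release_request_pad (tee, tee_src_pad);
    // push EOS into the bin so splitmuxsink can finalize the current file
    gst_pad_send_event (records.ghost_pad, gst_event_new_eos ());
    return GST_PAD_PROBE_REMOVE;
}

static void
stop_recording (void)
{
    // forward the bin's EOS as a "GstBinForwarded" element message on the bus
    g_object_set (records.recording, "message-forward", TRUE, NULL);
    gst_pad_add_probe (tee_src_pad, GST_PAD_PROBE_TYPE_IDLE,
        stop_recording_cb, NULL, NULL);
    // then, in the bus handler, wait for that forwarded EOS before doing
    // gst_element_set_state (records.recording, GST_STATE_NULL) and
    // gst_bin_remove (GST_BIN (pipeline), records.recording)
}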

GStreamer sendonly to multiple WebRTC clients

I've been trying to setup a simple sendonly WebRTC client with GStreamer but I'm having issues with getting the actual video to display on the WebRTC receiver side. I am new to both GStreamer and WebRTC.
I'm using the examples from https://gitlab.freedesktop.org/gstreamer/gst-examples/-/tree/master/webrtc to try and come up with a combination of certain parts. I've had 1:1 communication working but I wanted to introduce the rooms so I can have more clients viewing the "view-only" stream from GStreamer.
My current code is based on the multiparty-sendrecv example where I swapped out the audio for video. Furthermore, I'm using a modified version of the signalling server and a modified version of the javascript webrtc client. If necessary I could provide code for all of the above, but to keep things simple I won't. This is because I don't think the problem lies in either the signalling server or webrtc client, because the ICE candidates have been successfully negotiated along with the SDP offer & answer according to chrome://webrtc-internals/. See the image below.
In order to figure out what's going on I've exported a graph that shows the GStreamer pipeline after a user has joined the room and was added to the pipeline. See graph below.
As far as I can tell I should be receiving video data on my frontend, but I'm not. I've had a single weird case where the videotestsrc did show up, but I haven't been able to reproduce it. Because of this, I suspect the pipeline itself isn't necessarily wrong; perhaps we're dealing with some kind of race condition.
I've added the modified example of multiparty-sendrecv below, please take a look at it. Most of the methods have purposely been left out due to Stackoverflow's character limit.
Main functions
static void
handle_media_stream (GstPad *pad, GstElement *pipe, const char *convert_name,
    const char *sink_name)
{
    GstPad *qpad;
    GstElement *q, *conv, *sink;
    GstPadLinkReturn ret;

    q = gst_element_factory_make ("queue", NULL);
    g_assert_nonnull (q);
    conv = gst_element_factory_make (convert_name, NULL);
    g_assert_nonnull (conv);
    sink = gst_element_factory_make (sink_name, NULL);
    g_assert_nonnull (sink);
    gst_bin_add_many (GST_BIN (pipe), q, conv, sink, NULL);
    gst_element_sync_state_with_parent (q);
    gst_element_sync_state_with_parent (conv);
    gst_element_sync_state_with_parent (sink);
    gst_element_link_many (q, conv, sink, NULL);

    qpad = gst_element_get_static_pad (q, "sink");
    ret = gst_pad_link (pad, qpad);
    g_assert_cmpint (ret, ==, GST_PAD_LINK_OK);
}
static void
on_incoming_decodebin_stream (GstElement *decodebin, GstPad *pad,
    GstElement *pipe)
{
    GstCaps *caps;
    const gchar *name;

    if (!gst_pad_has_current_caps (pad)) {
        g_printerr ("Pad '%s' has no caps, can't do anything, ignoring\n",
            GST_PAD_NAME (pad));
        return;
    }

    caps = gst_pad_get_current_caps (pad);
    name = gst_structure_get_name (gst_caps_get_structure (caps, 0));

    if (g_str_has_prefix (name, "video")) {
        handle_media_stream (pad, pipe, "videoconvert", "autovideosink");
    } else if (g_str_has_prefix (name, "audio")) {
        handle_media_stream (pad, pipe, "audioconvert", "autoaudiosink");
    } else {
        g_printerr ("Unknown pad %s, ignoring", GST_PAD_NAME (pad));
    }
}
static void
on_incoming_stream (GstElement *webrtc, GstPad *pad, GstElement *pipe)
{
    GstElement *decodebin;
    GstPad *sinkpad;

    if (GST_PAD_DIRECTION (pad) != GST_PAD_SRC)
        return;

    decodebin = gst_element_factory_make ("decodebin", NULL);
    g_signal_connect (decodebin, "pad-added",
        G_CALLBACK (on_incoming_decodebin_stream), pipe);
    gst_bin_add (GST_BIN (pipe), decodebin);
    gst_element_sync_state_with_parent (decodebin);

    sinkpad = gst_element_get_static_pad (decodebin, "sink");
    gst_pad_link (pad, sinkpad);
    gst_object_unref (sinkpad);
}
static void
add_peer_to_pipeline (const gchar *peer_id, gboolean offer)
{
    int ret;
    gchar *tmp;
    GstElement *tee, *webrtc, *q;
    GstPad *srcpad, *sinkpad;

    tmp = g_strdup_printf ("queue-%s", peer_id);
    q = gst_element_factory_make ("queue", tmp);
    g_free (tmp);
    webrtc = gst_element_factory_make ("webrtcbin", peer_id);
    g_object_set (webrtc, "bundle-policy", GST_WEBRTC_BUNDLE_POLICY_MAX_BUNDLE, NULL);

    gst_bin_add_many (GST_BIN (pipeline), q, webrtc, NULL);

    srcpad = gst_element_get_static_pad (q, "src");
    g_assert_nonnull (srcpad);
    sinkpad = gst_element_get_request_pad (webrtc, "sink_%u");
    g_assert_nonnull (sinkpad);
    ret = gst_pad_link (srcpad, sinkpad);
    g_assert_cmpint (ret, ==, GST_PAD_LINK_OK);
    gst_object_unref (srcpad);
    gst_object_unref (sinkpad);

    tee = gst_bin_get_by_name (GST_BIN (pipeline), "videotee");
    g_assert_nonnull (tee);
    srcpad = gst_element_get_request_pad (tee, "src_%u");
    g_assert_nonnull (srcpad);
    gst_object_unref (tee);
    sinkpad = gst_element_get_static_pad (q, "sink");
    g_assert_nonnull (sinkpad);
    ret = gst_pad_link (srcpad, sinkpad);
    g_assert_cmpint (ret, ==, GST_PAD_LINK_OK);
    gst_object_unref (srcpad);
    gst_object_unref (sinkpad);

    /* This is the gstwebrtc entry point where we create the offer and so on. It
     * will be called when the pipeline goes to PLAYING.
     * XXX: We must connect this after webrtcbin has been linked to a source via
     * get_request_pad() and before we go from NULL->READY otherwise webrtcbin
     * will create an SDP offer with no media lines in it. */
    if (offer)
        g_signal_connect (webrtc, "on-negotiation-needed",
            G_CALLBACK (on_negotiation_needed), (gpointer) peer_id);

    /* We need to transmit this ICE candidate to the browser via the websockets
     * signalling server. Incoming ice candidates from the browser need to be
     * added by us too, see on_server_message() */
    g_signal_connect (webrtc, "on-ice-candidate",
        G_CALLBACK (send_ice_candidate_message), (gpointer) peer_id);

    /* Incoming streams will be exposed via this signal */
    g_signal_connect (webrtc, "pad-added", G_CALLBACK (on_incoming_stream),
        pipeline);

    /* Set the pipeline branch to PLAYING */
    ret = gst_element_sync_state_with_parent (q);
    g_assert_true (ret);
    ret = gst_element_sync_state_with_parent (webrtc);
    g_assert_true (ret);

    GST_DEBUG_BIN_TO_DOT_FILE (GST_BIN (pipeline), GST_DEBUG_GRAPH_SHOW_ALL, "pipeline");
}
static gboolean
start_pipeline (void)
{
    GstStateChangeReturn ret;
    GError *error = NULL;

    /* NOTE: webrtcbin currently does not support dynamic addition/removal of
     * streams, so we use a separate webrtcbin for each peer, but all of them are
     * inside the same pipeline. We start by connecting it to a fakesink so that
     * we can preroll early. */
    pipeline = gst_parse_launch ("tee name=videotee ! queue ! fakesink "
        "videotestsrc is-live=true pattern=ball ! videoconvert ! queue ! vp8enc deadline=1 ! rtpvp8pay ! "
        "queue ! " RTP_CAPS_VP8 "96 ! videotee. ", &error);

    if (error) {
        g_printerr ("Failed to parse launch: %s\n", error->message);
        g_error_free (error);
        goto err;
    }

    g_print ("Starting pipeline, not transmitting yet\n");
    ret = gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
    if (ret == GST_STATE_CHANGE_FAILURE)
        goto err;

    return TRUE;

err:
    g_print ("State change failure\n");
    if (pipeline)
        g_clear_object (&pipeline);
    return FALSE;
}
/*
 * When we join a room, we are responsible for starting negotiation with each
 * peer in it by sending an SDP offer and ICE candidates.
 */
static void
do_join_room (const gchar *text)
{
    gint ii, len;
    gchar **peer_ids;

    if (app_state != ROOM_JOINING) {
        cleanup_and_quit_loop ("ERROR: Received ROOM_OK when not calling",
            ROOM_JOIN_ERROR);
        return;
    }

    app_state = ROOM_JOINED;
    g_print ("Room joined\n");

    /* Start recording, but not transmitting */
    if (!start_pipeline ()) {
        cleanup_and_quit_loop ("ERROR: Failed to start pipeline", ROOM_CALL_ERROR);
        return;
    }

    peer_ids = g_strsplit (text, " ", -1);
    g_assert_cmpstr (peer_ids[0], ==, "ROOM_OK");
    len = g_strv_length (peer_ids);

    /* There are peers in the room already. We need to start negotiation
     * (exchange SDP and ICE candidates) and transmission of media. */
    if (len > 1 && strlen (peer_ids[1]) > 0) {
        g_print ("Found %i peers already in room\n", len - 1);
        app_state = ROOM_CALL_OFFERING;
        for (ii = 1; ii < len; ii++) {
            gchar *peer_id = g_strdup (peer_ids[ii]);
            g_print ("Negotiating with peer %s\n", peer_id);
            /* This might fail asynchronously */
            call_peer (peer_id);
            peers = g_list_prepend (peers, peer_id);
        }
    }

    g_strfreev (peer_ids);
    return;
}
int
main (int argc, char *argv[])
{
    GOptionContext *context;
    GError *error = NULL;

    context = g_option_context_new ("- gstreamer webrtc sendrecv demo");
    g_option_context_add_main_entries (context, entries, NULL);
    g_option_context_add_group (context, gst_init_get_option_group ());
    if (!g_option_context_parse (context, &argc, &argv, &error)) {
        g_printerr ("Error initializing: %s\n", error->message);
        return -1;
    }

    if (!check_plugins ())
        return -1;

    if (!room_id) {
        g_printerr ("--room-id is a required argument\n");
        return -1;
    }

    if (!local_id)
        local_id = g_strdup_printf ("%s-%i", g_get_user_name (),
            g_random_int_range (10, 10000));
    /* Sanitize by removing whitespace, modifies string in-place */
    g_strdelimit (local_id, " \t\n\r", '-');
    g_print ("Our local id is %s\n", local_id);

    if (!server_url)
        server_url = g_strdup (default_server_url);

    /* Don't use strict ssl when running a localhost server, because
     * it's probably a test server with a self-signed certificate */
    {
        GstUri *uri = gst_uri_from_string (server_url);
        if (g_strcmp0 ("localhost", gst_uri_get_host (uri)) == 0 ||
            g_strcmp0 ("127.0.0.1", gst_uri_get_host (uri)) == 0)
            strict_ssl = FALSE;
        gst_uri_unref (uri);
    }

    loop = g_main_loop_new (NULL, FALSE);

    connect_to_websocket_server_async ();

    g_main_loop_run (loop);

    gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_NULL);
    g_print ("Pipeline stopped\n");
    gst_object_unref (pipeline);

    g_free (server_url);
    g_free (local_id);
    g_free (room_id);

    return 0;
}

GStreamer pipeline hangs on gst_element_get_state

I have the following very basic code using the GStreamer library (GStreamer 1.8.1 on Xubuntu 16.04, if it's important):
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    const gchar* pd =
        "filesrc location=some.mp4 ! qtdemux name=d "
        "d.video_0 ! fakesink "
        "d.audio_0 ! fakesink ";

    GError* error = nullptr;
    GstElement *pipeline = gst_parse_launch(pd, &error);

    GstState state; GstState pending;
    switch (gst_element_set_state(pipeline, GST_STATE_PAUSED)) {
    case GST_STATE_CHANGE_FAILURE:
    case GST_STATE_CHANGE_NO_PREROLL:
        return -1;
    case GST_STATE_CHANGE_ASYNC:
        gst_element_get_state(pipeline, &state, &pending, GST_CLOCK_TIME_NONE);
        /* fall through */
    case GST_STATE_CHANGE_SUCCESS:
        break;
    }

    GMainLoop* loop = g_main_loop_new(nullptr, false);
    g_main_loop_run(loop);

    gst_object_unref(pipeline);
    return 0;
}
The problem is that when I run this code it hangs on
gst_element_get_state(pipeline, &state, &pending, GST_CLOCK_TIME_NONE);
The question is: why does it hang? Especially considering that if I remove d.audio_0 ! fakesink from the pipeline description, it doesn't hang.
It is good practice to always add queues (or a multiqueue) after elements that produce multiple output branches in the pipeline, e.g. demuxers.
The reason is that sinks block waiting for the other sinks to receive their first buffer (preroll). With a single thread, as in your code, that blocks the only thread available to push data into the sinks: one thread goes from the demuxer to both sinks, and once one sink blocks, there is no way for data to arrive at the second one.
Using queues spawns new threads, so each sink gets a dedicated one.
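A sketch of the fix, changing only the pipeline description from the question:

const gchar* pd =
    "filesrc location=some.mp4 ! qtdemux name=d "
    "d.video_0 ! queue ! fakesink "
    "d.audio_0 ! queue ! fakesink ";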
That's quite an old thread but it probably hangs because you have an infinite timeout (GST_CLOCK_TIME_NONE).

Gstreamer. Write appsink to filesink

I have written code that goes from appsrc to appsink, and it works: I see the actual buffers. The stream is encoded in H.264 (vpuenc codec=avc). Now I want to save it to a file (filesink). How do I approach that?
app:
int main(int argc, char *argv[])
{
    gst_init(NULL, NULL);

    GstElement *pipeline, *sink;
    gchar *descr;
    GError *error = NULL;
    GstAppSink *appsink;

    descr = g_strdup_printf(
        "mfw_v4lsrc device=/dev/video1 capture_mode=0 ! "  // grab from mipi camera
        "ffmpegcolorspace ! vpuenc codec=avc ! "
        "appsink name=sink"
    );
    pipeline = gst_parse_launch(descr, &error);
    if (error != NULL) {
        g_print("could not construct pipeline: %s\n", error->message);
        g_error_free(error);
        exit(-1);
    }

    gst_element_set_state(pipeline, GST_STATE_PAUSED);

    sink = gst_bin_get_by_name(GST_BIN(pipeline), "sink");
    appsink = (GstAppSink *) sink;
    gst_app_sink_set_max_buffers(appsink, 2);  // limit number of buffers queued
    gst_app_sink_set_drop(appsink, true);      // drop old buffers in queue when full

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    while (!gst_app_sink_is_eos(appsink))
    {
        GstBuffer *buffer = gst_app_sink_pull_buffer(appsink);
        uint8_t* data = (uint8_t*)GST_BUFFER_DATA(buffer);
        uint32_t size = GST_BUFFER_SIZE(buffer);
        gst_buffer_unref(buffer);
    }

    return 0;
}
If, as mentioned in the comments, what you actually want to know is how to do a network video stream in GStreamer, you should probably close this question, because you're on the wrong path. You don't need an appsink or filesink for that. What you'll want to investigate are the GStreamer elements related to RTP, RTSP, RTMP, MPEGTS, or even MJPEG (if your image size is small enough).
Here are two basic send/receive video stream pipelines:
gst-launch-0.10 v4l2src ! ffmpegcolorspace ! videoscale ! video/x-raw-yuv,width=640,height=480 ! vpuenc ! h264parse ! rtph264pay ! udpsink host=localhost port=5555
gst-launch-0.10 udpsrc port=5555 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! ffdec_h264 ! videoconvert ! ximagesink
In this situation you don't write your own while loop. You register callbacks and wait for buffers (GStreamer 0.10) to arrive. If you're using GStreamer 1.0, you use samples instead of buffers. Samples are a huge pain in the ass compared to buffers but oh well.
Register the callback:
GstAppSinkCallbacks* appsink_callbacks = (GstAppSinkCallbacks*)malloc(sizeof(GstAppSinkCallbacks));
appsink_callbacks->eos = NULL;
appsink_callbacks->new_preroll = NULL;
appsink_callbacks->new_sample = app_sink_new_sample;
gst_app_sink_set_callbacks(GST_APP_SINK(appsink), appsink_callbacks, (gpointer)pointer_to_data_passed_to_the_callback, free);
And your callback:
GstFlowReturn app_sink_new_sample(GstAppSink *sink, gpointer user_data)
{
    prog_data* pd = (prog_data*)user_data;

    GstSample* sample = gst_app_sink_pull_sample(sink);
    if (sample == NULL) {
        return GST_FLOW_ERROR;
    }

    GstBuffer* buffer = gst_sample_get_buffer(sample);  /* was (src), which is undefined */
    GstMemory* memory = gst_buffer_get_all_memory(buffer);
    GstMapInfo map_info;

    if (!gst_memory_map(memory, &map_info, GST_MAP_READ)) {
        gst_memory_unref(memory);
        gst_sample_unref(sample);
        return GST_FLOW_ERROR;
    }

    //render using map_info.data

    gst_memory_unmap(memory, &map_info);
    gst_memory_unref(memory);
    gst_sample_unref(sample);

    return GST_FLOW_OK;
}
You can keep your while loop as it is--using gst_app_sink_is_eos()--but make sure to put a sleep in it. Most of the time I use something like the following instead:
GMainLoop* loop = g_main_loop_new(NULL, FALSE);
g_main_loop_run(loop);
g_main_loop_unref(loop);
Note: Unless you need to do something special with the data you can use the "filesink" element directly.
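For example, a rough 0.10-era sketch with the question's elements (the matroskamux is my assumption; a bare "vpuenc codec=avc ! filesink location=out.h264" would dump the raw bytestream instead):

gst-launch-0.10 mfw_v4lsrc device=/dev/video1 capture_mode=0 ! ffmpegcolorspace ! vpuenc codec=avc ! matroskamux ! filesink location=output.mkv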
A simpler option would be to write to the file directly in the appsink itself: when you get the callback that a buffer is done, write it to the file, and make sure you close the file on EOS.
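A minimal 0.10-style sketch of that, reusing the question's pull loop ("out.h264" is a hypothetical path):

FILE *out = fopen("out.h264", "wb");
while (!gst_app_sink_is_eos(appsink))
{
    GstBuffer *buffer = gst_app_sink_pull_buffer(appsink);
    if (buffer == NULL)
        break;  // pull returns NULL on EOS/flush
    fwrite(GST_BUFFER_DATA(buffer), 1, GST_BUFFER_SIZE(buffer), out);
    gst_buffer_unref(buffer);
}
fclose(out);  // close on EOS so the file is complete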
Hope that helps.

Gstreamer appsrc: odd behaviour of need-data callback

I'm implementing a gstreamer media player with my own data source, using appsrc. Everything works fine except one thing:
When the stream reaches its end, the callback emits the "end-of-stream" signal. The signal-emitting function g_signal_emit_by_name(appsrc, "end-of-stream", &ret) returns the GstFlowReturn value GST_FLOW_OK. But then it calls my need-data callback again, so it returns the "end-of-stream" signal again, and this time the GstFlowReturn value is (-3), which is GST_FLOW_UNEXPECTED. I assume it does not expect an "end-of-stream" signal when it has already received one, but why does it request more data then? Maybe it is because I didn't set the size value of the stream?
Gstreamer version is 0.10.
Callback function code (appsrc type is seekable btw):
static void cb_need_data (GstElement *appsrc, guint size, gpointer user_data)
{
    GstBuffer *buffer;
    GstFlowReturn ret;
    AppsrcData* data = static_cast<AppsrcData*>(user_data);

    buffer = gst_buffer_new_and_alloc(size);
    int read = fread(GST_BUFFER_DATA(buffer), 1, size, data->file);
    GST_BUFFER_SIZE(buffer) = read;

    g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
    if (ret != GST_FLOW_OK) {
        /* something wrong, stop pushing */
        g_printerr("GST_FLOW != OK, return value is %d\n", ret);
        g_main_loop_quit (data->loop);
    }

    if (feof(data->file) || read == 0)
    {
        g_signal_emit_by_name(appsrc, "end-of-stream", &ret);
        if (ret != GST_FLOW_OK) {
            g_printerr("EOF reached, GST_FLOW != OK, return value is %d\nAborting...", ret);
            g_main_loop_quit (data->loop);
        }
    }
}
You should make some corrections to your code (if they are not there already) that should alleviate your issue and help the overall application:
Never try to send a buffer without first checking whether it actually has data. Simply check the buffer's data and length, to make sure the data is not NULL and the length is >0.
You can flag that the stream has ended in your user_data. When you send your EOS, set a field in your user data to indicate that it has been sent, and if the appsrc requests more data, check that flag and do not push anything else (see the sketch after this list).
Listen for the EOS on your pipeline's bus so that you can destroy the stream and quit the loop when the EOS message is handled. That way you can be sure your media sink has received the EOS, and you can safely dispose of the pipeline and loop without losing any data.
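A sketch of points 1 and 2 combined, assuming a gboolean eos_sent field added to the question's AppsrcData struct:

static void cb_need_data (GstElement *appsrc, guint size, gpointer user_data)
{
    AppsrcData* data = static_cast<AppsrcData*>(user_data);
    GstFlowReturn ret;

    if (data->eos_sent)
        return;  // EOS already signalled; ignore further need-data requests

    GstBuffer *buffer = gst_buffer_new_and_alloc(size);
    int read = fread(GST_BUFFER_DATA(buffer), 1, size, data->file);

    if (read > 0) {  // point 1: only push buffers that actually contain data
        GST_BUFFER_SIZE(buffer) = read;
        g_signal_emit_by_name(appsrc, "push-buffer", buffer, &ret);
    } else {
        gst_buffer_unref(buffer);
    }

    if (feof(data->file) || read == 0) {
        g_signal_emit_by_name(appsrc, "end-of-stream", &ret);
        data->eos_sent = TRUE;  // point 2: remember we already sent EOS
    }
}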
Have you tried the method gst_app_src_end_of_stream()? I'm not sure what return code you should use after invoking it, but it should be either GST_FLOW_OK or GST_FLOW_UNEXPECTED.
In GStreamer 1.x you return GST_FLOW_EOS.