How to get video/x-raw from GstCaps - gstreamer

I'm trying to support all camera drivers, but I need to separate the audio drivers from the video drivers. So I need to get media types such as video/x-raw and image/jpeg from the GstCaps, but I couldn't find anything on the internet about how to get only video/x-raw from GstCaps.
I know how to get the height and width from GstCaps:
gst_structure_get_int(s, "height", &height);
gst_structure_get_int(s, "width", &width);
But I don't know how to get "video/x-raw", "image/jpeg", or "application/x-rtp". Which function does that?

Use gst_structure_get_name (const GstStructure *structure);
const gchar *media_type = gst_structure_get_name (s);
g_print("media_type is: %s\n", media_type);

Related

GStreamer - Pipeline how to connect filesrc to qmlglsink

I'm new to the world of GStreamer, so I can't figure out how it works and how to pair all the GstElements.
I want to merge video (MP4, for example, or any other video format) with QML (from Qt) as an overlay.
This example works perfectly fine.
GstElement *pipeline = gst_pipeline_new(NULL);
GstElement *src = gst_element_factory_make("videotestsrc",NULL);
GstElement *glupload = gst_element_factory_make("glupload",NULL);
GstElement *sink = gst_element_factory_make("qmlglsink", NULL);
g_assert(src && glupload && sink);
gst_bin_add_many(GST_BIN(pipeline), src, glupload, sink, NULL);
gst_element_link_many(src, glupload, sink, NULL);
But that example uses videotestsrc as Source, I would prefer to use something like filesrc.
I tried this code:
GstElement *pipeline = gst_pipeline_new (NULL);
GstElement *src = gst_element_factory_make ("filesrc", "file-source");
GstElement *parser = gst_element_factory_make("h264parse",NULL);
GstElement *decoder = gst_element_factory_make("avdec_h264",NULL);
GstElement *colors = gst_element_factory_make("glcolorconvert",NULL);
GstElement *glupload = gst_element_factory_make ("glupload", NULL);
GstElement *sink = gst_element_factory_make ("qmlglsink", NULL);
g_assert (src && parser && decoder && colors && glupload && sink);
g_object_set (G_OBJECT (src), "location", "file:///home/test.mp4", NULL);
gst_bin_add_many (GST_BIN (pipeline), src, parser, decoder, glupload, colors, sink, NULL);
gst_element_link_many (src, parser, decoder, glupload, colors, sink, NULL);
It compiles, but the output is just a black screen.
Since I'm not sure how the GStreamer pipeline works, here is my reasoning.
First, read the file from disk with filesrc, then parse it with h264parse and decode it with avdec_h264. Then forward that (I guess raw, uncompressed data) to glupload and fix the colors with glcolorconvert, since qmlglsink uses RGBA and avdec_h264 outputs I420. After the colors are adjusted, forward it to qmlglsink to be displayed in QML.
I'm missing something, and I don't know how to pair GstElements; as I said, I need to pair filesrc (any video format) with qmlglsink.
You can try something like the QML below:
MediaPlayer {
    id: playVideo
    source: "gst-pipeline: filesrc location=/home/root/skim-debris.mp4 ! qtdemux ! avdec_h264 ! qtvideosink"
    autoLoad: true
    autoPlay: true
    playbackRate: 1.0
    loops: 10
}
VideoOutput {
    anchors.fill: parent
    source: playVideo
}
It is easier to use a bin or one of GStreamer's auto-pluggers (such as decodebin).
But the main issue here is that you are trying to treat an MP4 file as a raw H.264 stream. That cannot work; you need to demux the media streams from your container.
E.g. the pipeline should be something like this:
gst-launch-1.0 filesrc location=/home/test.mp4 ! qtdemux ! \
h264parse ! avdec_h264 ! glupload ! glcolorconvert ! qmlglsink
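In C, the demuxing step needs a bit more care than a static link: qtdemux creates its source pads only after it has parsed the container, so you link it to the parser from a pad-added callback. A minimal sketch under that assumption (error handling omitted, variable names taken from the question's code):
static void on_pad_added (GstElement *demux, GstPad *pad, gpointer user_data)
{
    GstElement *parser = GST_ELEMENT (user_data);
    GstCaps *caps = gst_pad_get_current_caps (pad);
    const gchar *name = gst_structure_get_name (gst_caps_get_structure (caps, 0));
    /* qtdemux also exposes audio pads; only link the video one */
    if (g_str_has_prefix (name, "video/")) {
        GstPad *sinkpad = gst_element_get_static_pad (parser, "sink");
        gst_pad_link (pad, sinkpad);
        gst_object_unref (sinkpad);
    }
    gst_caps_unref (caps);
}

/* ... in the pipeline setup ... */
GstElement *demux = gst_element_factory_make ("qtdemux", NULL);
g_object_set (G_OBJECT (src), "location", "/home/test.mp4", NULL); /* a plain path; filesrc does not take a URI */
gst_bin_add (GST_BIN (pipeline), demux);
gst_element_link (src, demux);
gst_element_link_many (parser, decoder, glupload, colors, sink, NULL);
g_signal_connect (demux, "pad-added", G_CALLBACK (on_pad_added), parser);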

GStreamer. Probe after rtph265pay never called

I have an RTSP server and I want to extend the RTP buffer header. For this purpose I added a probe to the src pad of rtph265pay, but it is never called. My pipeline:
( appsrc name=vsrc ! nvvidconv ! video/x-raw(memory:NVMM),format=NV12 !
omxh265enc MeasureEncoderLatency=true bitrate=20000000 control-rate=2 !
rtph265pay name=pay0 pt=96 )
The code where I attach the probe:
static GstPadProbeReturn test_probe (GstPad *pad, GstPadProbeInfo *info,
gpointer user_data)
{
cout << "i'm here";
return GST_PAD_PROBE_OK;
}
void mediaConfigure (GstRTSPMediaFactory* factory, GstRTSPMedia* media,
gpointer user_data)
{
GstElement *element, *rtph265pay;
GstPad *pad;
element = gst_rtsp_media_get_element (media);
rtph265pay = gst_bin_get_by_name_recurse_up (GST_BIN (element), "pay0");
pad = gst_element_get_static_pad (rtph265pay, "src");
gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER,
(GstPadProbeCallback) test_probe, NULL, NULL);
gst_object_unref (pad);
}
If I set "sink" instead of "src", the probe works, but I need "src" to change the RTP buffer header...
What is wrong here?
Maybe because rtph265pay's src pad isn't linked to any other pad (meaning rtph265pay is the end of the pipeline), the element doesn't push any buffers to its src pad?
Try attaching a fakesink after the rtph265pay.
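Separately, once the probe does fire, modifying the RTP header in it could look roughly like this; a sketch assuming GStreamer 1.x and using the GstRTPBuffer API from <gst/rtp/rtp.h> (the extension id and the two-byte payload are made-up example values):
#include <gst/rtp/rtp.h>

static GstPadProbeReturn modify_rtp_probe (GstPad *pad, GstPadProbeInfo *info,
        gpointer user_data)
{
    GstBuffer *buffer = GST_PAD_PROBE_INFO_BUFFER (info);
    GstRTPBuffer rtp = GST_RTP_BUFFER_INIT;
    guint16 dummy = 0x1234;  /* hypothetical extension payload */

    /* the buffer must be writable before touching its header */
    buffer = gst_buffer_make_writable (buffer);
    if (gst_rtp_buffer_map (buffer, GST_MAP_READWRITE, &rtp)) {
        gst_rtp_buffer_add_extension_onebyte_header (&rtp, 1, &dummy, sizeof (dummy));
        gst_rtp_buffer_unmap (&rtp);
    }
    GST_PAD_PROBE_INFO_DATA (info) = buffer;  /* hand the (possibly new) buffer back */
    return GST_PAD_PROBE_OK;
}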

Gstreamer udp-source pcm playback

I have these gst-launch parameters that do what I want:
gst-launch-1.0.exe udpsrc port=22122 ! audio/x-raw,format=S16LE,rate=16000,channels=1 ! autoaudiosink
However, I cannot convert it into code. I'm trying the following:
GstElement *pipeline = gst_pipeline_new("audio-player");
GstBus *bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline));
guint bus_watch_id = gst_bus_add_watch(bus, bus_call, m_gstMainLoop);
gst_object_unref(bus);
GstElement *source = gst_element_factory_make("udpsrc", "udpsrc0");
GstElement *sink = gst_element_factory_make("autoaudiosink", "autoaudiosink0");
g_object_set(G_OBJECT(source), "port", 7200, "buffer-size", 1000000, NULL);
gst_bin_add_many(GST_BIN(pipeline), source, sink, NULL);
GstCaps *caps = gst_caps_new_simple("audio/x-raw",
"format", G_TYPE_STRING, "S16LE",
"layout", G_TYPE_STRING, "INTERLEAVED",
"rate", G_TYPE_INT, 16000,
"channels", G_TYPE_INT, 1,
NULL);
gst_element_link_filtered(source, sink, caps);
gst_caps_unref(caps);
gst_element_set_state(pipeline, GST_STATE_PLAYING);
g_main_loop_run(m_gstMainLoop);
In the dot-graph they look almost alike, but not entirely, though I can't figure out what I'm missing.
Not sure why, but it works if I leave out INTERLEAVED (which should be lower-case if entered), and I had also missed that I entered the wrong port number (doh!).
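For reference, a sketch of the corrected lines based on the fix described above (port back to 22122 to match the gst-launch line, and the layout value lower-case or omitted):
g_object_set(G_OBJECT(source), "port", 22122, "buffer-size", 1000000, NULL);
GstCaps *caps = gst_caps_new_simple("audio/x-raw",
    "format", G_TYPE_STRING, "S16LE",
    "layout", G_TYPE_STRING, "interleaved", /* lower-case; or drop this field */
    "rate", G_TYPE_INT, 16000,
    "channels", G_TYPE_INT, 1,
    NULL);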

GStreamer add probe with playbin

The following code adds a callback when each frame is displayed and it's working well:
pipeline = gst_parse_launch("filesrc location=/path ! decodebin ! autovideosink", &error);
video_sink = gst_bin_get_by_interface(GST_BIN(pipeline), GST_TYPE_VIDEO_OVERLAY);
GstPad *pad = gst_element_get_static_pad(video_sink, "sink");
gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_BUFFER, (GstPadProbeCallback)cb_have_data, data, NULL);
The following code adds the same callback, but it is never called:
pipeline = gst_parse_launch("playbin uri=file:///path", &error);
video_sink = gst_bin_get_by_interface(GST_BIN(pipeline), GST_TYPE_VIDEO_OVERLAY);
GstPad *pad = gst_element_get_static_pad(video_sink, "sink");
gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_BUFFER, (GstPadProbeCallback)cb_have_data, data, NULL);
Any idea why and how to fix that?
playbin has no input pads and no output pads, so you can't put a probe on it, as a probe has to go on a pad.
However, there is a get-video-pad action signal you can run on playbin, and you can attach a probe to the pad it returns.
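A minimal sketch of that approach, assuming pipeline is the playbin element from the question and that video stream 0 exists (the signal returns NULL otherwise):
GstPad *pad = NULL;
/* "get-video-pad" is an action signal on playbin: it returns the
 * sink pad feeding the given video stream, with a new reference */
g_signal_emit_by_name (pipeline, "get-video-pad", 0, &pad);
if (pad != NULL) {
    gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER,
        (GstPadProbeCallback) cb_have_data, data, NULL);
    gst_object_unref (pad);
}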

Pushing images into a gstreamer pipeline

After playing around with some toy applications, exploring the documentation, and googling around (including the mailing list archives), I am still puzzled by what I would think is a rather common use case.
I have existing code that generates images (in memory) and I would like to push these images into a GStreamer pipeline (to create an FLV video at the end).
I could not find an "obvious way to do it". My best guess would be to dig into the source code of GstMultiFileSrc and its parent GstPushSrc to figure it out.
Could any of you point me to the "obvious way" of doing this? Is there any related piece of documentation/tutorial/example on this?
Once I have the input right, the rest is a piece of cake, thanks to GStreamer awesomeness!
(something like "my magic input -> ffmpegcolorspace ! ffenc_flv ! flvmux ! filesink location=desktop.flv")
Thanks for your answers.
GStreamer uses plugins to do everything. Plugins that create data or take it from an external source are called "src" plugins.
The generic src plugin for injecting application-generated data into a pipeline is called appsrc. The API provided by appsrc is documented as part of the App Library.
Here's one example that demonstrates feeding appsrc with generated images: gdk-gstappsrc-stream.c. It seems to be derived from some test code in the GStreamer source tree: here.
Another approach would be to create your own src plugin. Look at the goom music visualization plugin for an example that seems to work in a way similar to what you have specified.
I found a solution (maybe) to this (I get the images with OpenCV) ... but I have an error with the pipeline: ERROR from element mysource: Internal data flow error.
Debugging info: gstbasesrc.c(2574): gst_base_src_loop (): /GstPipeline:pipeline0/GstAppSrc:mysource:
streaming task paused, reason not-negotiated (-4)
This is the code:
typedef struct _App App;
struct _App{
GstElement *pipeline;
GstElement *appsrc;
GMainLoop *loop;
guint sourceid;
GTimer *timer;
};
App s_app;
CvCapture *capture;
static gboolean read_data(App *app){
GstFlowReturn ret;
GstBuffer *buffer = gst_buffer_new();
IplImage* frame = cvQueryFrame(capture);
GST_BUFFER_DATA(buffer) = (uchar*)frame->imageData;
GST_BUFFER_SIZE(buffer) = frame->width*frame->height*sizeof(uchar*);
g_signal_emit_by_name(app->appsrc,"push-buffer",buffer,&ret);
gst_buffer_unref(buffer);
if(ret != GST_FLOW_OK){
GST_DEBUG("Error al alimentar buffer");
return FALSE;
}
return TRUE;
}
static void start_feed(GstElement* pipeline,guint size, App* app){
if(app->sourceid == 0){
GST_DEBUG("Alimentando");
app->sourceid = g_idle_add((GSourceFunc) read_data, app);
}
}
static void stop_feed(GstElement* pipeline, App* app){
if(app->sourceid !=0 ){
GST_DEBUG("Stop feeding");
g_source_remove(app->sourceid);
app->sourceid = 0;
}
}
static gboolean
bus_message (GstBus * bus, GstMessage * message, App * app)
{
GST_DEBUG ("got message %s",
gst_message_type_get_name (GST_MESSAGE_TYPE (message)));
switch (GST_MESSAGE_TYPE (message)) {
case GST_MESSAGE_ERROR: {
GError *err = NULL;
gchar *dbg_info = NULL;
gst_message_parse_error (message, &err, &dbg_info);
g_printerr ("ERROR from element %s: %s\n",
GST_OBJECT_NAME (message->src), err->message);
g_printerr ("Debugging info: %s\n", (dbg_info) ? dbg_info : "none");
g_error_free (err);
g_free (dbg_info);
g_main_loop_quit (app->loop);
break;
}
case GST_MESSAGE_EOS:
g_main_loop_quit (app->loop);
break;
default:
break;
}
return TRUE;
}
int main(int argc, char* argv[]){
App *app = &s_app;
GError *error = NULL;
GstBus *bus;
GstCaps *caps;
capture = cvCaptureFromCAM(0);
gst_init(&argc,&argv);
/* create a mainloop to get messages and to handle the idle handler that will
* feed data to appsrc. */
app->loop = g_main_loop_new (NULL, TRUE);
app->timer = g_timer_new();
app->pipeline = gst_parse_launch("appsrc name=mysource ! video/x-raw-rgb,width=640,height=480,bpp=24,depth=24 ! ffmpegcolorspace ! videoscale method=1 ! theoraenc bitrate=150 ! tcpserversink host=127.0.0.1 port=5000", NULL);
g_assert (app->pipeline);
bus = gst_pipeline_get_bus (GST_PIPELINE (app->pipeline));
g_assert(bus);
/* add watch for messages */
gst_bus_add_watch (bus, (GstBusFunc) bus_message, app);
/* get the appsrc */
app->appsrc = gst_bin_get_by_name (GST_BIN(app->pipeline), "mysource");
g_assert(app->appsrc);
g_assert(GST_IS_APP_SRC(app->appsrc));
g_signal_connect (app->appsrc, "need-data", G_CALLBACK (start_feed), app);
g_signal_connect (app->appsrc, "enough-data", G_CALLBACK (stop_feed), app);
/* set the caps on the source */
caps = gst_caps_new_simple ("video/x-raw-rgb",
"bpp",G_TYPE_INT,24,
"depth",G_TYPE_INT,24,
"width", G_TYPE_INT, 640,
"height", G_TYPE_INT, 480,
NULL);
gst_app_src_set_caps(GST_APP_SRC(app->appsrc), caps);
/* go to playing and wait in a mainloop. */
gst_element_set_state (app->pipeline, GST_STATE_PLAYING);
/* this mainloop is stopped when we receive an error or EOS */
g_main_loop_run (app->loop);
GST_DEBUG ("stopping");
gst_element_set_state (app->pipeline, GST_STATE_NULL);
gst_object_unref (bus);
g_main_loop_unref (app->loop);
cvReleaseCapture(&capture);
return 0;
}
Any idea?
You might try hacking imagefreeze to do what you want. appsrc might also do it.
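As a side note on the error above: not-negotiated usually means the caps on appsrc don't match what downstream accepts, and the code shown uses 0.10-era API (video/x-raw-rgb caps, the GST_BUFFER_DATA macro) that no longer exists in GStreamer 1.x. For comparison, here is a minimal 1.x sketch of feeding raw RGB frames to appsrc; the fixed 640x480 size and the assumption of tightly packed RGB data are illustrative:
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

/* caps for 1.x: a "format" string replaces the old bpp/depth fields,
 * and a framerate is required for negotiation with most video elements */
static void configure_appsrc (GstElement *appsrc)
{
    GstCaps *caps = gst_caps_new_simple ("video/x-raw",
        "format", G_TYPE_STRING, "RGB",
        "width", G_TYPE_INT, 640,
        "height", G_TYPE_INT, 480,
        "framerate", GST_TYPE_FRACTION, 30, 1,
        NULL);
    gst_app_src_set_caps (GST_APP_SRC (appsrc), caps);
    gst_caps_unref (caps);
}

/* push one frame; 'data' must hold width * height * 3 bytes */
static GstFlowReturn push_frame (GstElement *appsrc, const guint8 *data)
{
    gsize size = 640 * 480 * 3;
    GstBuffer *buffer = gst_buffer_new_allocate (NULL, size, NULL);
    gst_buffer_fill (buffer, 0, data, size);
    /* gst_app_src_push_buffer() takes ownership of the buffer */
    return gst_app_src_push_buffer (GST_APP_SRC (appsrc), buffer);
}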