Gst-rtsp-server 1.0 how to use my own pipeline - gstreamer

I have finished reading the user guide in the gst-rtsp-server GitHub repository.
The demos always use code like the following to construct a static pipeline:
factory = gst_rtsp_media_factory_new ();
gst_rtsp_media_factory_set_launch (factory,
"( rtspsrc location=rtsp://admin:Admin12345#192.168.1.126 ! rtph264depay ! h264parse ! rtph264pay pt=96 name=pay0 )");
But if I want to use my own pipeline, so that I can keep the GstElement* pointer of the pipeline for later use, how should I do that?
I have read the examples in the gst-rtsp-server GitHub repository, but they were no help.

To build your own pipeline you have to subclass GstRTSPMediaFactory and override its create_element virtual method.
As an example, you can look at the default GstRTSPMediaFactory implementation:
https://github.com/GStreamer/gst-rtsp-server/blob/master/gst/rtsp-server/rtsp-media-factory.c#L1636

You can use gst_parse_launch() and pass in the custom pipeline description you need.
Also take a look at the other functions provided at that link.
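Putting the two answers together, a minimal sketch of such a subclass could look like the following. The MyFactory/my_factory_* names and the test pipeline string are illustrative assumptions; only the GstRTSPMediaFactory types, the create_element vfunc, and gst_parse_launch() come from the library.

```c
/* Sketch: a GstRTSPMediaFactory subclass that builds its own pipeline.
 * MyFactory and the my_factory_* names are illustrative, not library API. */
#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>

typedef struct { GstRTSPMediaFactory parent; } MyFactory;
typedef struct { GstRTSPMediaFactoryClass parent; } MyFactoryClass;

G_DEFINE_TYPE (MyFactory, my_factory, GST_TYPE_RTSP_MEDIA_FACTORY);

static GstElement *
my_factory_create_element (GstRTSPMediaFactory * factory, const GstRTSPUrl * url)
{
  GError *error = NULL;

  /* Build the bin any way you like; the payloader must be named pay0
   * so the server can find it. */
  GstElement *bin = gst_parse_launch (
      "( videotestsrc ! x264enc ! rtph264pay name=pay0 pt=96 )", &error);

  if (error != NULL) {
    g_printerr ("pipeline error: %s\n", error->message);
    g_clear_error (&error);
    return NULL;
  }

  /* at this point you hold the GstElement* and can keep it for later use */
  return bin;
}

static void
my_factory_class_init (MyFactoryClass * klass)
{
  GST_RTSP_MEDIA_FACTORY_CLASS (klass)->create_element =
      my_factory_create_element;
}

static void
my_factory_init (MyFactory * self)
{
}
```

An instance of this factory can then be mounted with gst_rtsp_mount_points_add_factory() in place of the one returned by gst_rtsp_media_factory_new().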

Related

glimagesink render-rectangle can not set in code

I tried gst-launch-1.0 filesrc location=/mnt/baita.jpg ! decodebin ! videoscale ! video/x-raw,width=1920,height=1080 ! imagefreeze ! glimagesink render-rectangle="<0,0,1920,1080>" and it works fine.
But I want to do the same in code, and I cannot find the format of the render-rectangle parameter for the C language. How should I set it from C? Thank you very much!
The docs for glimagesink at https://gstreamer.freedesktop.org/documentation/opengl/glimagesinkelement.html?gi-language=c#glimagesinkelement:render-rectangle reveal that it takes a GstValueArray argument.
See https://gstreamer.freedesktop.org/documentation/gstreamer/gstvalue.html?gi-language=c#GstValueArray for details and functions to create such an array.
You can use Florian's suggestion to do it with g_object_set() and a GstValueArray, but you can even do it more cleanly.
The "render-rectangle" property of glimagesink comes from the fact that it implements the interface GstVideoOverlay. So if you want to set this directly, you can use gst_video_overlay_set_render_rectangle().
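As a sketch, both approaches mentioned above could be written like this in C; "sink" is assumed to be the glimagesink element of your pipeline.

```c
/* Sketch: two ways to set glimagesink's render rectangle from C. */
#include <gst/gst.h>
#include <gst/video/videooverlay.h>

static void
set_render_rect (GstElement * sink)
{
  /* via the GstVideoOverlay interface that glimagesink implements */
  gst_video_overlay_set_render_rectangle (GST_VIDEO_OVERLAY (sink),
      0, 0, 1920, 1080);

  /* or: let GStreamer deserialize the gst-launch style string
   * into the GstValueArray property */
  gst_util_set_object_arg (G_OBJECT (sink), "render-rectangle",
      "<0, 0, 1920, 1080>");
}
```

gst_util_set_object_arg() uses the same string parsing as gst-launch, which is why the "<0,0,1920,1080>" syntax from the command line carries over directly.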

GStreamer pipeline with Tee including two sink fails

I am trying to implement a GStreamer pipeline with a tee using the following elements:
gst_bin_add_many(GST_BIN (pipeline), <rpicamsrc>, <capsfilter>, <h264parse>, tee, <queue>, <rtph264pay>, <fakesink>, <queue>, <avdec_h264>, <videoconvert>, <capsfilter>, <customplugin>, <fakesink>, nullptr);
For better understanding I have provided the element names. The purpose is to create a tee pipeline as follows:
rpicamsrc ! capsfilter ! h264parse ! tee name=t t. ! queue ! rtph264pay ! fakesink t. ! queue ! avdec_h264 ! videoconvert ! capsfilter ! customplugin ! fakesink
But it always fails without reporting any error, and no video frames are captured. After some testing I identified that even this simple pipeline fails:
gst_element_link_many ( <rpicamsrc>, <capsfilter>, <h264parse>, <rtph264pay>, <fakesink>, nullptr))
Interestingly, if I remove the second fakesink from the gst_bin_add_many line above, it works. I am not sure what the problem is. I tried a different sink such as autovideosink, but no luck. When it fails, the pipeline never receives a GST_MESSAGE_ASYNC_DONE message on the bus, whereas in the success case it does. I get GST_STREAM_STATUS_TYPE_CREATE, GST_STREAM_STATUS_TYPE_ENTER and GST_MESSAGE_STREAM_START in both the failure and the success case. What am I doing wrong, any ideas?
gst_element_link_many() is a convenience wrapper for a non-branched pipeline: it links each element in the list to the next one. It does not know that you want to link the tee element in the middle of the pipeline to multiple downstream elements. In your case, for example, it tries to connect the first fakesink to the queue that follows it in the list.
Easy way
You can use gst_parse_launch() to let GStreamer figure out what links to what.
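For instance, a minimal sketch: gst_parse_launch() understands the same branched syntax as gst-launch-1.0, so the tee wiring is done for you. The description below is the one from the question, with the capsfilter/customplugin configuration left out for brevity.

```c
/* Sketch: building the branched tee pipeline from its textual description. */
#include <gst/gst.h>

static GstElement *
build_pipeline (void)
{
  GError *err = NULL;
  GstElement *pipeline = gst_parse_launch (
      "rpicamsrc ! h264parse ! tee name=t "
      "t. ! queue ! rtph264pay ! fakesink "
      "t. ! queue ! avdec_h264 ! videoconvert ! fakesink",
      &err);

  if (err != NULL) {
    g_printerr ("parse error: %s\n", err->message);
    g_clear_error (&err);
    return NULL;
  }
  return pipeline;
}
```

Individual elements can still be retrieved afterwards with gst_bin_get_by_name() if you need to configure them.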
By hand
If you have an element like tee, you must use gst_element_link() or gst_element_link_pads() to tell GStreamer which element connects to which.
It is possible to create the two linear chains with gst_element_link_many(),
rpicamsrc → capsfilter → h264parse → tee → queue → rtph264pay → fakesink
queue → avdec_h264 → videoconvert → capsfilter → customplugin → fakesink
and then link the tee element in the first chain to the queue in the second with gst_element_link_pads().
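A sketch of that by-hand step, assuming the elements have already been created with gst_element_factory_make() and added to the bin (as in the question's gst_bin_add_many() call):

```c
/* Sketch: attach each branch queue to the tee via request pads. */
#include <gst/gst.h>

static gboolean
link_tee_branches (GstElement * tee, GstElement * queue1, GstElement * queue2)
{
  GstElement *queues[2] = { queue1, queue2 };
  gboolean ok = TRUE;
  gint i;

  for (i = 0; i < 2; i++) {
    /* tee's src pads are request pads: ask for one per branch */
    GstPad *src = gst_element_get_request_pad (tee, "src_%u");
    GstPad *sink = gst_element_get_static_pad (queues[i], "sink");

    if (gst_pad_link (src, sink) != GST_PAD_LINK_OK) {
      g_printerr ("failed to link tee branch %d\n", i);
      ok = FALSE;
    }
    gst_object_unref (sink);
  }
  return ok;
}
```

Link the two linear chains with gst_element_link_many() first, then call this to attach the branches. (On GStreamer 1.20+ gst_element_get_request_pad() is renamed gst_element_request_pad_simple().)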

Creating a virtual webcam from jpeg using GStreamer

I'm trying to use a jpg file as a virtual webcam for Skype (or similar). The image file is rewritten every few seconds, and the pipeline should always transmit the newest image.
I started by creating a pipeline like this:
gst-launch filesrc location=~/image.jpg ! jpegdec ! ffmpegcolorspace ! freeze ! v4l2sink device=/dev/video2
but it only streams the first image and ignores newer versions of the file. I read about concat and dynamically changing the pipeline, but I couldn't get either working.
Could you give me any hints on how to get this working?
Dynamically refreshing the input file is NOT possible (at least with filesrc).
Besides, your example uses freeze, which prevents the image from changing.
One possible method is using multifilesrc and videorate instead.
multifilesrc can read many files (with a provided pattern similar to scanf/printf), and videorate can control the speed.
For example, you create 100 images named image0000.jpg, image0001.jpg, ..., image0100.jpg, then play them continuously, showing each image for 1 second:
gst-launch multifilesrc location=~/image%04d.jpg start-index=0 stop-index=100 loop=true caps="image/jpeg,framerate=\(fraction\)1/1" ! jpegdec ! ffmpegcolorspace ! videorate ! v4l2sink device=/dev/video2
Change the number of images via stop-index=100, and change the speed via caps="image/jpeg,framerate=\(fraction\)1/1".
For more information about these elements, refer to their documents at gstreamer.freedesktop.org/documentation/plugins.html
EDIT: It looks like you are using GStreamer 0.10, not 1.x.
In that case, please refer to the old documentation for multifilesrc and videorate.
You can use a generic file name with multifilesrc if you adjust some parameters and pair it with an identity element on a delay. It's a bit fragile, but it will do fine for a temporary one-off program as long as you keep your input images the same dimensions and format.
gst-launch-1.0 multifilesrc loop=true start-index=0 stop-index=0 location=/tmp/whatever ! decodebin ! identity sleep-time=1000000 ! videoconvert ! v4l2sink

How to stream in h265 using gstreamer?

I am trying to use the latest GStreamer and x265enc together. I saw that someone has already posted some commits in http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/log/ext/x265/gstx265enc.c
Can anyone please give an example pipeline that is known to work (a gst-launch-1.0 pipeline example would be very helpful)?
1)
What is the current status of the x265enc plugin for GStreamer? Does it really work?
Which branch of GStreamer do I need to build x265enc? I want to build the whole GStreamer source tree so that it is compatible with the x265enc plugin.
What are the system requirements for x265enc, and how do I build it? Any wiki/basic instructions would be very helpful.
My goal is to broadcast my ip cameras (h264 streams) as h265 stream on vaughnlive.tv
Currently, I am using following pipeline to broadcast in h264 format:
GST_DEBUG=2 gst-launch-1.0 flvmux name=mux streamable=true ! rtmpsink
sync=true location="rtmp://xxxxxxxxxxxx" rtspsrc
location="rtsp://xxxxxxx" caps="application/x-rtp,
media=(string)audio, clock-rate=(int)90000, encoding-name=(string)MPA,
payload=(int)96" ! rtpmpadepay ! mpegaudioparse ! queue ! mad !
audioconvert ! queue ! voaacenc bitrate=128000 ! aacparse !
audio/mpeg,mpegversion=4,stream-format=raw ! mux. rtspsrc
location="rtsp://xxxxxxx"
caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,
encoding-name=(string)H264" ! rtph264depay !
video/x-h264,stream-format=avc,alignment=au,byte-stream=false ! queue
! decodebin ! queue ! videorate ! "video/x-raw,framerate=30/1" ! queue
! x264enc threads=4 speed-preset=ultrafast bitrate=3072 ! mux.
2)
Can anyone please suggest on how should I change this pipeline to broadcast in h265 format using x265enc element?
A little late, but maybe some people will find this question when looking for information about H.265 support in GStreamer nowadays. This is with GStreamer 1.6.1 compiled from source on Ubuntu 15.10, which has packages ready for libx265.
1,
Encoder
There is x265enc, which will be enabled when the libx265-dev library is present.
The encoder is inside gst-plugins-bad, so after running autogen.sh you should see x265enc among the enabled plugins.
You may also need h265parse and rtph265pay/rtph265depay.
Decoder
I see two decoders; I don't know which one works. I would guess libde265dec; there is also avdec_h265.
Mux
For muxing x264 I was using mpegtsmux, but it does not support video/x-h265; some work has to be done there. matroskamux should work when writing to a filesink etc.
[16:39] hi, which container is suitable for x265enc, for x264enc I was using mpegtsmux?
[16:54] otopolsky: mpegts would work if you add support for h265 there, not very difficult
[16:55] slomo_: so we need to just add the caps compatibility?
[16:55] otopolsky: otherwise, matroskamux supports it. mp4mux/qtmux could get support for it relatively easily too
[16:55] otopolsky: a bit more than that. look at what tsdemux does for h265
[16:56] otopolsky: and check the gst_mpegts_descriptor_from_registration related code in tsmux
[17:00] slomo_: thanks
2,
The flvmux from the question also does not support H.265, only H.264.
matroskamux cannot be used for streaming, so the only way is to patch mpegtsmux or flvmux etc.
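Assuming a build where x265enc is enabled, a minimal sanity-check pipeline writing Matroska to a file (a sketch for local testing, not a broadcast setup; the bitrate/preset values and file name are arbitrary) could look like:

```shell
gst-launch-1.0 videotestsrc num-buffers=300 ! videoconvert ! \
  x265enc bitrate=3072 speed-preset=ultrafast ! h265parse ! \
  matroskamux ! filesink location=test_h265.mkv
```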

gstreamer link different sink/source caps to plugin

I have a plugin that works with raw video and can resize it while processing.
The plugin has two (compatible) video sink pads and one video source pad.
The input and output caps may be different,
but when I try to use different caps on the sink and the source, it fails to link.
Examples
This works fine:
gst-launch-1.0 videotestsrc ! video/x-raw,format=BGR,width=800,height=600 ! my_plugin name=t ! video/x-raw,format=BGR,width=800,height=600 ! fakesink videotestsrc ! video/x-raw,format=BGR,width=800,height=600 ! t.
But this doesn't: [WARNING: erroneous pipeline: could not link t to fakesink0]
gst-launch-1.0 videotestsrc ! video/x-raw,format=BGR,width=800,height=600 ! my_plugin make_small=1 name=t ! video/x-raw,format=BGR,width=750,height=600 ! fakesink videotestsrc ! video/x-raw,format=BGR,width=800,height=600 ! t.
I read docs/design/draft-klass.txt, looked at the videoscale plugin description, and changed my_plugin's description like this:
Factory Details:
..
Klass Mixer/Effect/Converter/Video/Scaler
..
But it still doesn't work. What am I missing?
Edit: my problem was caused by using GST_PAD_SET_PROXY_CAPS() on all sink/source pads. According to the documentation, this function simplifies event management, but it does so by guaranteeing that the caps on both sides of the element are compatible.
Removing GST_PAD_SET_PROXY_CAPS() is the answer.
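As a sketch of the fix during pad setup: omit the proxy-caps flag and answer caps queries yourself, so sink and source pads are free to carry different resolutions. MyPlugin, sink_template and my_plugin_sink_query are assumed names from the plugin's own code, not GStreamer API.

```c
/* Sketch: pad setup without GST_PAD_SET_PROXY_CAPS(). */
#include <gst/gst.h>

static void
my_plugin_setup_pads (GstElement * self)
{
  GstPad *sinkpad =
      gst_pad_new_from_static_template (&sink_template, "sink");

  /* GST_PAD_SET_PROXY_CAPS (sinkpad);   <-- omit this: it would force the
   * sink caps to mirror the source caps, forbidding resizing in between */
  gst_pad_set_query_function (sinkpad, my_plugin_sink_query);
  gst_element_add_pad (self, sinkpad);
}
```

The custom query function then reports the caps each pad actually accepts, instead of proxying them from the opposite side.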