I tried gst-launch-1.0 filesrc location=/mnt/baita.jpg ! decodebin ! videoscale ! video/x-raw,width=1920,height=1080 ! imagefreeze ! glimagesink render-rectangle="<0,0,1920,1080>" and it works fine!
But I want to put it in code, and I cannot find the format of the render-rectangle parameter for the C language. If I want to do this in C, how should I use it? Thank you very much!
The docs for glimagesink at https://gstreamer.freedesktop.org/documentation/opengl/glimagesinkelement.html?gi-language=c#glimagesinkelement:render-rectangle reveal that it takes a GstValueArray argument.
See https://gstreamer.freedesktop.org/documentation/gstreamer/gstvalue.html?gi-language=c#GstValueArray for details and functions to create such an array.
You can follow Florian's suggestion and do it with g_object_set() and a GstValueArray, but there is an even cleaner way.
The "render-rectangle" property of glimagesink exists because the element implements the GstVideoOverlay interface. So if you want to set it directly, you can use gst_video_overlay_set_render_rectangle().
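Both approaches can be sketched in C like this (a minimal, untested sketch; set_rectangle and the sink variable are illustrative names, and the glimagesink element is assumed to have been created elsewhere):

```c
#include <gst/gst.h>
#include <gst/video/videooverlay.h>

/* sink is assumed to be a glimagesink created earlier, e.g. with
 * gst_element_factory_make ("glimagesink", "sink"). */
static void
set_rectangle (GstElement * sink)
{
  /* Option 1: the GstVideoOverlay interface (cleanest). */
  gst_video_overlay_set_render_rectangle (GST_VIDEO_OVERLAY (sink),
      0, 0, 1920, 1080);

  /* Option 2: the "render-rectangle" property via a GstValueArray. */
  GValue array = G_VALUE_INIT;
  GValue item = G_VALUE_INIT;
  int coords[4] = { 0, 0, 1920, 1080 };
  int i;

  g_value_init (&array, GST_TYPE_ARRAY);
  for (i = 0; i < 4; i++) {
    g_value_init (&item, G_TYPE_INT);
    g_value_set_int (&item, coords[i]);
    gst_value_array_append_value (&array, &item);
    g_value_unset (&item);
  }
  g_object_set_property (G_OBJECT (sink), "render-rectangle", &array);
  g_value_unset (&array);
}
```

Remember to link against gstreamer-video-1.0 for the GstVideoOverlay interface.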
I have finished reading the user guide in the gst-rtsp-server GitHub repository,
and I found that the demos always use code like the following to construct a static pipeline:
factory = gst_rtsp_media_factory_new ();
gst_rtsp_media_factory_set_launch (factory,
"( rtspsrc location=rtsp://admin:Admin12345#192.168.1.126 ! rtph264depay ! h264parse ! rtph264pay pt=96 name=pay0 )");
But if I want to use my own pipeline, so that I can get the GstElement* pointer of the pipeline for further work, what should I do?
I have read the examples of gst-rtsp-server on GitHub, but they did not help.
To build your own pipeline you have to subclass GstRTSPMediaFactory and override its create_element virtual method.
As an example, you can look at the default GstRTSPMediaFactory implementation:
https://github.com/GStreamer/gst-rtsp-server/blob/master/gst/rtsp-server/rtsp-media-factory.c#L1636
You can use gst_parse_launch() and pass it the custom pipeline you need.
Also take a look at other functions provided in the link.
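A minimal sketch of such a subclass (untested; the MyFactory type and the test pipeline are made-up illustrations, but the create_element override is the documented extension point, and it lets you keep the GstElement* for later work):

```c
#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>

/* Hypothetical GstRTSPMediaFactory subclass. */
typedef struct { GstRTSPMediaFactory parent; } MyFactory;
typedef struct { GstRTSPMediaFactoryClass parent; } MyFactoryClass;

G_DEFINE_TYPE (MyFactory, my_factory, GST_TYPE_RTSP_MEDIA_FACTORY);

static GstElement *
my_factory_create_element (GstRTSPMediaFactory * factory,
    const GstRTSPUrl * url)
{
  /* Build the pipeline yourself; the payloader must be named pay0. */
  GstElement *bin = gst_parse_launch (
      "( videotestsrc ! x264enc ! rtph264pay pt=96 name=pay0 )", NULL);

  /* bin is the GstElement* you can keep for your further work. */
  return bin;
}

static void
my_factory_class_init (MyFactoryClass * klass)
{
  GST_RTSP_MEDIA_FACTORY_CLASS (klass)->create_element =
      my_factory_create_element;
}

static void
my_factory_init (MyFactory * self)
{
}
```

You would then create this factory with g_object_new() and mount it on the server instead of calling gst_rtsp_media_factory_set_launch().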
Is it possible to get a gst-launch string equivalent for any gst-play command?
For example, playing rtsp stream with gst-play could be:
gst-play-1.0.exe rtsp://path/to/source
That command connects to the server and opens GStreamer's internal window for playback.
An equivalent command might be (I'm not really sure):
gst-launch-1.0.exe uridecodebin uri=rtsp://path/to/source ! autovideosink
But how to get it in general case?
My main purpose is to redirect the video stream to an AVI file while I only know a working gst-play command. So I need to replace autovideosink with filesink in the resulting command.
After your update I would say that you have a few options:
1. Use gst-play with the --videosink option, but you would also need the AVI mux element there and the video must be encoded in H.264, so this approach would need some hacking in the gst-play source code, which you obviously do not want.
1a. You can also use playbin, as suggested by @thiagoss, with its video-sink parameter. You could then perhaps use a named bin and pass it in (not sure this is possible this way, but you may try):
gst-launch-1.0 playbin uri=rtsp video-sink=bin_avi \( name=bin_avi x264enc ! avimux ! filesink location=file.avi \)
2. Dump the pipeline graph, analyse it, and recreate the same thing manually. On Unix-like systems:
export GST_DEBUG_DUMP_DOT_DIR=`pwd`
gst-play-1.0 rtsp://...
# or use gst-launch with playbin.. it's basically the same thing
Check the generated *.dot files, choose the latest one (in the PLAYING state) and use the graphviz dot tool to turn it into a picture:
dot -T png *PLAYING.dot -o playbin.png
3. Use just uridecodebin and continue as in 1a:
uridecodebin ! video/x-raw ! x264enc ! ....
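Fleshed out into a full command with the elements mentioned in 1a, option 3 might look like this (an untested sketch; the RTSP URL and file name are placeholders):

```shell
gst-launch-1.0 uridecodebin uri=rtsp://path/to/source ! videoconvert ! x264enc ! avimux ! filesink location=file.avi
```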
HTH
Use playbin:
gst-launch-1.0.exe playbin uri=rtsp://path/to/source
I'm trying to use a JPG file as a virtual webcam for Skype (or similar). The image file is reloaded every few seconds, and the pipeline should always transmit the newest image.
I started by creating a pipeline like this:
gst-launch filesrc location=~/image.jpg ! jpegdec ! ffmpegcolorspace ! freeze ! v4l2sink device=/dev/video2
but it only streams the first image and ignores newer versions of the image file. I read something about concat and dynamically changing the pipeline, but I couldn't get this working.
Could you give me any hints on how to get this working?
Dynamically refreshing the input file is NOT possible (at least with filesrc).
Besides, your example uses freeze, which prevents the image from changing.
One possible approach is to use multifilesrc and videorate instead.
multifilesrc can read many files (with a filename pattern similar to scanf/printf), and videorate can control the playback speed.
For example, create 100 images named image0000.jpg, image0001.jpg, ..., image0100.jpg, then play them continuously, showing each image for 1 second:
gst-launch multifilesrc location=~/image%04d.jpg start-index=0 stop-index=100 loop=true caps="image/jpeg,framerate=\(fraction\)1/1" ! jpegdec ! ffmpegcolorspace ! videorate ! v4l2sink device=/dev/video2
Change the number of images with stop-index=100, and the speed with caps="image/jpeg,framerate=\(fraction\)1/1".
For more information about these elements, see their documentation at gstreamer.freedesktop.org/documentation/plugins.html
EDIT: It looks like you are using GStreamer 0.10, not 1.x.
In that case, please refer to the old 0.10 documentation for multifilesrc and videorate.
You can use a fixed file name with multifilesrc if you add some parameter adjustments and pair it with an identity element on a delay. It's a bit fragile, but it'll do fine for a temporary one-off program as long as you keep your input images the same dimensions and format.
gst-launch-1.0 multifilesrc loop=true start-index=0 stop-index=0 location=/tmp/whatever ! decodebin ! identity sleep-time=1000000 ! videoconvert ! v4l2sink
I have a plugin that works with raw video and can resize it.
The plugin has two (compatible) video inputs and one video output.
The caps of the input and output may differ,
but when I try to use different caps on the sink and source, they fail to link.
Examples
This works:
gst-launch-1.0 videotestsrc ! video/x-raw,format=BGR,width=800,height=600 ! my_plugin name=t ! video/x-raw,format=BGR,width=800,height=600 ! fakesink videotestsrc ! video/x-raw,format=BGR,width=800,height=600 ! t.
But this does not [WARNING: erroneous pipeline: could not link t to fakesink0]:
gst-launch-1.0 videotestsrc ! video/x-raw,format=BGR,width=800,height=600 ! my_plugin make_small=1 name=t ! video/x-raw,format=BGR,width=750,height=600 ! fakesink videotestsrc ! video/x-raw,format=BGR,width=800,height=600 ! t.
I read docs/design/draft-klass.txt, looked at the videoscale plugin description, and changed my plugin's description like this:
Factory Details:
..
Klass Mixer/Effect/Converter/Video/Scaler
..
But it still doesn't work. What am I missing?
Edit: My problem was caused by using GST_PAD_SET_PROXY_CAPS() on all sink/source pads. According to the documentation, this function simplifies event management by guaranteeing that the pads' caps are compatible.
As the question's edit found, the use of GST_PAD_SET_PROXY_CAPS() is the answer: proxied caps force the source caps to match the sink caps, so a resizing element must not use it.
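The difference is a single line in the pad setup; a minimal sketch, assuming a hypothetical MyPlugin element with one sink and one src pad (template names and struct are illustrative):

```c
#include <gst/gst.h>

/* Hypothetical element instance; only the fields used here. */
typedef struct {
  GstElement parent;
  GstPad *sinkpad, *srcpad;
} MyPlugin;

static GstStaticPadTemplate sink_template =
    GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS,
    GST_STATIC_CAPS ("video/x-raw"));
static GstStaticPadTemplate src_template =
    GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS,
    GST_STATIC_CAPS ("video/x-raw"));

static void
my_plugin_init (MyPlugin * self)
{
  self->sinkpad = gst_pad_new_from_static_template (&sink_template, "sink");
  self->srcpad = gst_pad_new_from_static_template (&src_template, "src");

  /* GST_PAD_SET_PROXY_CAPS (self->sinkpad);
   * Do NOT proxy caps on a resizing element: proxied caps force the
   * src caps to equal the sink caps, so 800x600 in / 750x600 out can
   * never link. Handle the CAPS query/event yourself instead. */

  gst_element_add_pad (GST_ELEMENT (self), self->sinkpad);
  gst_element_add_pad (GST_ELEMENT (self), self->srcpad);
}
```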
I have this pipeline:
gst-launch-1.0 rtspsrc location=rtsp://ip/cam ! rtph264depay ! h264parse ! mp4mux fragment-duration=10000 streamable=1 ! multifilesink next-file=2 location=file-%03d.mp4
The first segment plays well, the others do not. When I inspect the structure of a damaged MP4 I see an interesting bug:
MOOV
Some data
MOOF
MDAT
MOOF
MDAT
The most interesting thing is the "Some data" block: it has no header, the bytes are simply there. Judging by its size I think it is an MDAT. If I compute the block's size and prepend an MDAT header, the file immediately becomes valid and plays. But that unknown piece still cannot be played, because no MOOF header precedes it.
The problem occurs with both mp4mux and qtmux. Tested on GStreamer 1.1.0 and 1.2.2; the results are identical.
Am I using multifilesink incorrectly?
If you take a look at the documentation for multifilesink you will find the answer:
It is not possible to use this element to create independently playable mp4 files, use the splitmuxsink element for that instead. ...
So use splitmuxsink, and don't forget to send EOS when you are done so that the last file is finished correctly.
Update
It looks like splitmuxsink did not exist yet at the time the question was asked.
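Adapted to the question's pipeline, a splitmuxsink version might look like this (an untested sketch; max-size-time is in nanoseconds, so 10000000000 is 10 seconds). The -e flag makes gst-launch send EOS on Ctrl+C; from code you would send it with gst_element_send_event (pipeline, gst_event_new_eos ()):

```shell
gst-launch-1.0 -e rtspsrc location=rtsp://ip/cam ! rtph264depay ! h264parse ! splitmuxsink muxer=mp4mux max-size-time=10000000000 location=file-%03d.mp4
```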
Can this be reproduced using videotestsrc instead of RTSP?
Try replacing your H.264 receiving and depayloading with "videotestsrc num-buffers= ! x264enc ! mp4mux ..."
This might be a bug, please file it at https://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer so it gets proper attention from maintainers.
Also, how are you trying to play it?
Thanks