I'm working on a project and I need to make this pipeline work:
gst-launch-1.0 -e \
videomixer name=mix \
sink_0::xpos=0 sink_0::ypos=0 sink_0::alpha=0 \
sink_1::xpos=640 sink_1::ypos=0 sink_1::alpha=1 \
sink_2::xpos=0 sink_2::ypos=0 sink_2::alpha=1 \
! glshader location=distortion.frag ! glimagesink sync=false \
videotestsrc pattern="black" \
! video/x-raw,width=1280,height=720 \
! mix.sink_0 \
rtpbin name=rtpbinleft latency=250 ntp-sync=true do-retransmission=0 \
udpsrc caps=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264 port=5000 ! rtpbinleft.recv_rtp_sink_0 \
rtpbinleft. ! rtph264depay ! h264parse ! avdec_h264 ! videoscale add-borders=false ! video/x-raw,width=640,height=720 ! mix.sink_1 \
udpsrc port=5001 ! rtpbinleft.recv_rtcp_sink_0 \
rtpbinleft.send_rtcp_src_0 ! udpsink port=5005 host=192.168.0.17 sync=false async=false \
rtpbin name=rtpbinright latency=250 ntp-sync=true do-retransmission=0 \
udpsrc caps=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264 port=6000 ! rtpbinright.recv_rtp_sink_0 \
rtpbinright. ! rtph264depay ! h264parse ! avdec_h264 ! videoscale add-borders=false ! video/x-raw,width=640,height=720 ! mix.sink_2 \
udpsrc port=6001 ! rtpbinright.recv_rtcp_sink_0 \
rtpbinright.send_rtcp_src_0 ! udpsink port=6005 host=192.168.0.18 sync=false async=false
It's supposed to take two streams and apply a barrel distortion.
Here is the tutorial: https://www.raspberrypi.org/forums/viewtopic.php?f=43&t=65700&start=25
I've tried almost everything, but it always fails with:
ERROR: pipeline could not be constructed: syntax error.
Any help?
Apparently you've just forgotten the enclosing quotation marks around the caps values of the udpsrc elements feeding the rtpbins. On the command line you must write something like:
udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" port=5000
Outside the quotation marks, the '(string)' and '(int)' type annotations make no sense to gst-launch (the unquoted parentheses are interpreted by the shell instead).
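For instance, the first receiving branch of the pipeline in the question would become (just a sketch of that fragment; ports and pad names as in the question, everything else unchanged):

```shell
# Quoted caps so the (string)/(int) annotations reach gst-launch intact
udpsrc port=5000 \
  caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" \
  ! rtpbinleft.recv_rtp_sink_0
```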
I'm trying to get used to using the gstreamer compositor.
I have this basic boilerplate example working (compositing two videotestsrc elements next to each other):
gst-launch-1.0 compositor name=comp \
sink_0::alpha=1 sink_0::xpos=0 sink_0::ypos=0 \
sink_1::alpha=0.5 sink_1::xpos=320 sink_1::ypos=0 ! \
queue2 ! video/x-raw, width=800, height=600 ! videoconvert ! xvimagesink \
videotestsrc pattern=1 ! "video/x-raw" ! comp.sink_0 \
videotestsrc pattern=8 ! "video/x-raw" ! comp.sink_1
Then I tried changing one of the videotestsrc inputs to an mp4 file.
I know that this command line works:
gst-launch-1.0 filesrc location=tst.mp4 ! decodebin ! videoconvert ! autovideosink
So I tried combining these two working pipelines:
gst-launch-1.0 compositor name=comp \
sink_0::alpha=1 sink_0::xpos=0 sink_0::ypos=0 \
sink_1::alpha=0.5 sink_1::xpos=320 sink_1::ypos=0 ! \
queue2 ! decodebin ! video/x-raw, width=800, height=600 ! videoconvert ! xvimagesink \
videotestsrc pattern=1 ! "video/x-raw" ! comp.sink_0 \
filesrc location=tst.mp4 ! "video/x-raw" ! comp.sink_1
When I run this, I get an error saying that the filter caps do not completely specify the output format ("... output caps are unfixed").
I'm positive this must be a simple syntax error. Does anyone know how to fix my pipeline?
No, you need to keep most of the elements that made the standalone command line work, e.g.:
gst-launch-1.0 compositor name=comp \
sink_0::alpha=1 sink_0::xpos=0 sink_0::ypos=0 \
sink_1::alpha=0.5 sink_1::xpos=320 sink_1::ypos=0 ! \
queue2 ! decodebin ! video/x-raw, width=800, height=600 ! videoconvert ! xvimagesink \
videotestsrc pattern=1 ! "video/x-raw" ! comp.sink_0 \
filesrc location=tst.mp4 ! decodebin ! videoconvert ! comp.sink_1
Hi all,
I'm looking for a way to put two videos side by side and send them as H.264.
#!/bin/bash
gst-launch-1.0 -e \
filesrc location="/home/namako/tairyou_2.mp4" \
! decodebin \
! videoscale \
! capsfilter caps="video/x-raw,width=240" \
! videoconvert \
! videomixer.sink_0 \
\
filesrc location="/home/namako/tairyou_2.mp4" \
! decodebin \
! videoscale \
! capsfilter caps="video/x-raw,width=240" \
! videoconvert \
! videomixer.sink_1 \
\
videomixer background=1 name=videomixer sink_0::xpos=0 sink_0::ypos=0 sink_1::xpos=512 sink_1::ypos=0 \
! decodebin \
! videoscale \
! capsfilter caps="video/x-raw,width=240" \
! videoconvert \
! x264enc \
! rtph264pay \
! udpsink host=127.0.0.1 port=5000
I ran the above script, but the following error was displayed:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Redistribute latency...
Redistribute latency...
Redistribute latency...
Redistribute latency...
ERROR: from element /GstPipeline:pipeline0/GstX264Enc:x264enc0: Can not initialize x264 encoder.
Additional debug info:
gstx264enc.c(1587): gst_x264_enc_init_encoder (): /GstPipeline:pipeline0/GstX264Enc:x264enc0
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
It seems the H.264 stream cannot be encoded.
Also, the above script produces the following pipeline connection diagram (image omitted here).
In the connection diagram, the videomixer input is mpeg2 but the output is jpeg.
What should I do to get mpeg2 / H.264 output?
Thanks!
The video can be played with the following script.
#!/bin/bash
gst-launch-1.0 -e \
filesrc location="filename" \
! decodebin \
! videoscale \
! capsfilter caps="video/x-raw,width=480,height=270" \
! videoconvert \
! videomixer.sink_0 \
\
filesrc location="filename" \
! decodebin \
! videoscale \
! capsfilter caps="video/x-raw,width=480,height=270" \
! videoconvert \
! videomixer.sink_1 \
\
videomixer background=1 name=videomixer sink_0::xpos=0 sink_0::ypos=0 sink_1::xpos=480 sink_1::ypos=0 \
! decodebin \
! videoscale \
! capsfilter caps="video/x-raw,width=480,height=270" \
! videoconvert \
! x264enc \
! rtph264pay \
! udpsink host=127.0.0.1 port=5000
I ran the following command:
gst-launch-1.0 -e \
videomixer name=mix \
sink_0::xpos=0 sink_0::ypos=0 sink_0::alpha=0 \
sink_1::xpos=0 sink_1::ypos=0 \
sink_2::xpos=200 sink_2::ypos=0 \
sink_3::xpos=0 sink_3::ypos=100 \
sink_4::xpos=200 sink_4::ypos=100 \
rtmpsrc location='rtmp://streaming.example.com:1935/209147924' \
! decodebin ! videoconvert ! videoscale \
! video/x-raw,width=200,height=100 \
! mix.sink_1 \
rtmpsrc location='rtmp://streaming.example.com:1935/209147925' \
! decodebin ! videoconvert ! videoscale \
! video/x-raw,width=200,height=100 \
! mix.sink_2 \
rtmpsrc location='rtmp://streaming.example.com:1935/209147926' \
! decodebin ! videoconvert ! videoscale \
! video/x-raw,width=200,height=100 \
! mix.sink_3 \
rtmpsrc location='rtmp://streaming.example.com:1935/209147927' \
! decodebin ! videoconvert ! videoscale \
! video/x-raw,width=200,height=100 \
! mix.sink_4 \
mix. ! queue ! videoconvert ! x264enc ! flvmux streamable=true ! queue ! rtmpsink location='rtmp://streaming.example.com:1935/test'
Thank you, we solved the problem with the mosaic. The above is the working version.
There are two issues.
1) The main issue is that "videomixer" has only one src pad, but you are connecting it in two places:
gst-launch-1.0 -e \
videomixer name=mix \
sink_0::xpos=0 sink_0::ypos=0 sink_0::alpha=0 \
sink_1::xpos=0 sink_1::ypos=0 \
sink_2::xpos=200 sink_2::ypos=0 \
sink_3::xpos=0 sink_3::ypos=100 \
sink_4::xpos=200 sink_4::ypos=100 \
! xvimagesink
By doing this, you are connecting the videomixer src pad to the sink pad of xvimagesink.
Then again at the end, you are trying to connect the videomixer src pad to rtmpsink through queue and the other elements.
So you have to remove one of the two connections.
If you don't want to connect to xvimagesink, just remove "! xvimagesink".
If you don't want to connect to rtmpsink, remove the "mix. ! queue ! videoconvert ..." part.
2) If you want to keep the connection to queue, there is the following issue.
You are connecting mix.sink_4 to mix.src:
... ! mix.sink_4 \
! mix. ! queue ! videoconvert ! ...
Remove the leading "!" in the last line; "mix." (with the trailing dot) then refers back to the named videomixer instead of being a link target:
... ! mix.sink_4 \
mix. ! queue ! videoconvert ! ...
Then it should not give a syntax error.
EDIT 1
I think you have made another mistake. You are connecting the src of mix to mix.sink_0. I have corrected it.
gst-launch-1.0 -e \
videomixer name=mix \
sink_0::xpos=0 sink_0::ypos=0 sink_0::alpha=0 \
sink_1::xpos=0 sink_1::ypos=0 \
sink_2::xpos=200 sink_2::ypos=0 \
sink_3::xpos=0 sink_3::ypos=100 \
sink_4::xpos=200 sink_4::ypos=100 \
\ /* You should not add "! mix.sink_0" here. */
rtmpsrc location='rtmp://streaming.example.com:1935/209147924' \
! decodebin ! videoconvert ! videoscale \
! video/x-raw,width=200,height=100 \
! mix.sink_1 \
rtmpsrc location='rtmp://streaming.example.com:1935/209147925' \
! decodebin ! videoconvert ! videoscale \
! video/x-raw,width=200,height=100 \
! mix.sink_2 \
rtmpsrc location='rtmp://streaming.example.com:1935/209147926' \
! decodebin ! videoconvert ! videoscale \
! video/x-raw,width=200,height=100 \
! mix.sink_3 \
rtmpsrc location='rtmp://streaming.example.com:1935/209147927' \
! decodebin ! videoconvert ! videoscale \
! video/x-raw,width=200,height=100 \
! mix.sink_4 \
mix. ! queue ! videoconvert ! x264enc ! flvmux streamable=true ! queue ! rtmpsink location='rtmp://streaming.example.com:1935/test'
Let me give some information about the "name=" usage here.
You can name an element in a gstreamer pipeline and use that name to construct the pipeline. It is mostly useful in complex pipelines. Let me show its usage with a simple pipeline.
Assume the following is the required pipeline:
srcelem ! elem1 ! elem2 ! elem3 ! sinkelem
It can be written as below.
elem2 name=named_elem \ /* naming elem2 */
named_elem. ! elem3 ! sinkelem \ /* connecting elem2 to the downstream pipeline part; note there is no "!" before "named_elem." */
srcelem ! elem1 ! named_elem. /* connecting elem2 to the upstream pipeline part; note there is no "!" after "named_elem." */
If you read it carefully, it constructs the same pipeline which is mentioned earlier.
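As a concrete sketch of the same naming pattern with real elements (my own illustration, not from the thread; assumes a local display for autovideosink):

```shell
# "t" names the tee; each "t." reference attaches another link to it.
gst-launch-1.0 \
  tee name=t \
  t. ! queue ! autovideosink \
  t. ! queue ! fakesink \
  videotestsrc ! t.
```

As in the abstract example, the named element is declared first, its downstream links are attached by starting a segment with "t.", and the upstream source is linked into it by ending a segment with "! t.".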
Hi guys, with the help of rtranscode I managed to get this pipe to work. But it only records video; it does not include sound. What am I missing?
gst-launch-1.0 souphttpsrc location="http://example/2.ts" is-live=true keep-alive=true do-timestamp=true retries=10 typefind=true blocksize=16384 \
! tsdemux parse-private-sections=false program-number=-1 name=demux demux.audio_0101 \
! queue \
! aacparse \
! avdec_aac \
! audioconvert dithering=0 \
! audio/x-raw,channels=2 \
! voaacenc bitrate=65536 \
! matroskamux name=stream streamable=true demux. \
! queue \
! h264parse \
! video/x-h264,alignment=au \
! omxh264dec \
! video/x-raw,width=910,height=512 \
! omxh264enc target-bitrate=294912 control-rate=variable \
! h264parse \
! filesink location="test.h264"
This is what I'm trying to achieve:
I've been trying with the videobox plugin, but all I'm getting is four equally sized boxes.
I previously did something similar to what you described using 3 cameras.
Here's the pipe I used for it:
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw, width=800, height=480, framerate=25/1 ! alpha alpha=1.0 ! videobox left=-800 ! videomixer name=mix sink_00::xpos=0 sink_01::xpos=800 sink_02::xpos=0 sink_02::ypos=480 sink_00::alpha=1.0 sink_01::alpha=1.0 sink_02::alpha=1.0 ! videoconvert ! xvimagesink \
v4l2src device=/dev/video1 ! video/x-raw, width=1600, height=1200, framerate=25/1 ! alpha alpha=1.0 ! videobox border-alpha=0 top=-480 ! mix. \
v4l2src device=/dev/video2 ! video/x-raw, width=800, height=480, framerate=25/1 ! alpha alpha=1.0 ! videobox border-alpha=0 left=-00 ! mix. -e
You need a videobox element for each video capture pipe (in this case I used v4l2src, but you can use other sources such as filesrc ! decodebin or a network source as well) and combine them in a videomixer element.
In my case I used one video on the left and two videos on the right, but you can adjust the left and top parameters of videobox and the sink_0x::xpos and sink_0x::ypos properties of the videomixer element.
Remember to add alpha to each channel or your videos will be transparent.
So this depends very much on how and what you want to do. For example, this can be as simple as a GUI application that presents 4 videos on 4 different surfaces, with the GUI responsible for the layout.
If you really want to create a new single image that contains these 4 video streams, then videomixer sounds like the way to go. See here for an example:
https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good/html/gst-plugins-good-plugins-videomixer.html
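As a minimal sketch of such a 4-way mix (my own illustration, untested here; videotestsrc stands in for the real streams and autovideosink for whatever sink you actually need):

```shell
# 2x2 mosaic: four 320x240 tiles positioned on one videomixer canvas
gst-launch-1.0 -e videomixer name=mix \
    sink_0::xpos=0   sink_0::ypos=0 \
    sink_1::xpos=320 sink_1::ypos=0 \
    sink_2::xpos=0   sink_2::ypos=240 \
    sink_3::xpos=320 sink_3::ypos=240 \
  ! videoconvert ! autovideosink \
  videotestsrc pattern=smpte   ! video/x-raw,width=320,height=240 ! mix.sink_0 \
  videotestsrc pattern=snow    ! video/x-raw,width=320,height=240 ! mix.sink_1 \
  videotestsrc pattern=ball    ! video/x-raw,width=320,height=240 ! mix.sink_2 \
  videotestsrc pattern=smpte75 ! video/x-raw,width=320,height=240 ! mix.sink_3
```

Each input is scaled/capped to a fixed tile size and placed via the mixer pad's xpos/ypos properties; replacing a videotestsrc branch with a decoded network or file source follows the same pattern.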
I use something like this to mix running streams into a single one:
/usr/local/bin/gst-launch-1.0 -vv -e videomixer name=mix background=2 \
sink_0::alpha=1.0 \
sink_0::ypos=0 \
sink_0::xpos=0 \
sink_1::alpha=1.0 \
sink_1::ypos=80 \
sink_1::xpos=40 \
sink_2::alpha=1.0 \
sink_2::ypos=80 \
sink_2::xpos=410 \
sink_3::alpha=1.0 \
sink_3::ypos=80 \
sink_3::xpos=780 \
sink_0::zorder=1 \
sink_1::zorder=3 \
sink_2::zorder=3 \
sink_3::zorder=4 \
! clockoverlay auto-resize=false draw-shadow=false draw-outline=false halignment=left valignment=top \
! timeoverlay auto-resize=false draw-shadow=false draw-outline=false halignment=left valignment=bottom \
! queue \
! nvh264enc preset=1 bitrate=1500 rc-mode=2 gop-size=10 \
! h264parse config-interval=-1 \
! mpegtsmux ! rtpmp2tpay pt=33 \
! udpsink host=239.255.42.61 port=5004 multicast-iface=10g-1 ttl=4 ttl-mc=4 \
multifilesrc location=/IMG/logo.jpg caps="image/jpeg,framerate=1/1" \
! jpegdec ! videoconvert ! videoscale \
! video/x-raw,width=1920,height=1080 \
! mix.sink_0 \
udpsrc multicast-group=239.255.42.60 address=239.255.42.60 port=5004 multicast-iface=eth0 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T,payload=(int)33" \
! rtpjitterbuffer ! rtpmp2tdepay ! tsdemux ! h264parse config-interval=-1 \
! avdec_h264 skip-frame=1 output-corrupt=false ! videoconvert ! videoscale \
! video/x-raw,width=360,height=240 \
! mix.sink_1 \
udpsrc multicast-group=239.255.42.57 address=239.255.42.57 port=5004 multicast-iface=eth0 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T,payload=(int)33" \
! rtpjitterbuffer ! rtpmp2tdepay ! tsdemux ! h264parse config-interval=-1 \
! avdec_h264 skip-frame=1 output-corrupt=false ! videoconvert ! videoscale \
! video/x-raw,width=360,height=240 \
! mix.sink_2 \
udpsrc multicast-group=239.255.42.62 address=239.255.42.62 port=5004 multicast-iface=eth0 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T,payload=(int)33" \
! rtpjitterbuffer ! rtpmp2tdepay ! tsdemux ! h264parse config-interval=-1 \
! avdec_h264 skip-frame=1 output-corrupt=false ! videoconvert ! videoscale \
! video/x-raw,width=360,height=240 \
! mix.sink_3 \