GStreamer compositor using a filesrc mp4 file

I'm trying to get used to using the gstreamer compositor.
I have this basic boilerplate example working. (Compositing 2 videotestsrc next to each other):
gst-launch-1.0 compositor name=comp \
sink_0::alpha=1 sink_0::xpos=0 sink_0::ypos=0 \
sink_1::alpha=0.5 sink_1::xpos=320 sink_1::ypos=0 ! \
queue2 ! video/x-raw, width=800, height=600 ! videoconvert ! xvimagesink \
videotestsrc pattern=1 ! "video/x-raw" ! comp.sink_0 \
videotestsrc pattern=8 ! "video/x-raw" ! comp.sink_1
Then I tried changing one of the videotestsrc elements to an mp4 file.
I know that this command line works:
gst-launch-1.0 filesrc location=tst.mp4 ! decodebin ! videoconvert ! autovideosink
So I tried combining these two working pipelines:
gst-launch-1.0 compositor name=comp \
sink_0::alpha=1 sink_0::xpos=0 sink_0::ypos=0 \
sink_1::alpha=0.5 sink_1::xpos=320 sink_1::ypos=0 ! \
queue2 ! decodebin ! video/x-raw, width=800, height=600 ! videoconvert ! xvimagesink \
videotestsrc pattern=1 ! "video/x-raw" ! comp.sink_0 \
filesrc location=tst.mp4 ! "video/x-raw" ! comp.sink_1
When I run this I get an error saying that the filter caps do not completely specify the output format ("output caps are unfixed").
I'm positive this must be a simple syntax error. Does anyone know how to fix my pipeline?

No, you need to keep most of the elements that made the standalone command line work, e.g.:
gst-launch-1.0 compositor name=comp \
sink_0::alpha=1 sink_0::xpos=0 sink_0::ypos=0 \
sink_1::alpha=0.5 sink_1::xpos=320 sink_1::ypos=0 ! \
queue2 ! video/x-raw, width=800, height=600 ! videoconvert ! xvimagesink \
videotestsrc pattern=1 ! "video/x-raw" ! comp.sink_0 \
filesrc location=tst.mp4 ! decodebin ! videoconvert ! comp.sink_1
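If the decoded file turns out to be wider than the 320-pixel offset used for sink_1, the two branches will overlap; a variant that also scales the file down first might look like this (the 320x240 target size is just an assumption, adjust it to your layout):
gst-launch-1.0 compositor name=comp \
sink_0::alpha=1 sink_0::xpos=0 sink_0::ypos=0 \
sink_1::alpha=0.5 sink_1::xpos=320 sink_1::ypos=0 ! \
queue2 ! video/x-raw, width=800, height=600 ! videoconvert ! xvimagesink \
videotestsrc pattern=1 ! "video/x-raw" ! comp.sink_0 \
filesrc location=tst.mp4 ! decodebin ! videoconvert ! videoscale \
! video/x-raw, width=320, height=240 ! comp.sink_1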

Related

GStreamer Playing 3 videos side by side

Here is the code for playing 2 mp4 videos side by side in videoboxes:
gst-launch-1.0 filesrc location=1.mp4 ! decodebin ! queue ! videoconvert \
! videobox border-alpha=0 right=-100 \
! videomixer name=mix ! videoconvert ! autovideosink \
filesrc location=2.mp4 ! decodebin ! queue ! videoconvert \
! videobox border-alpha=0 left=-100 ! mix.
I have tried this code to play 3 videos:
gst-launch-1.0 filesrc location=Downloads/1.mp4 ! decodebin ! queue !
videoconvert ! videobox border-alpha=0 right=-100 ! videomixer
name=mix !
videoconvert ! autovideosink filesrc location=Downloads/2.mp4 !
decodebin ! queue ! videoconvert ! videobox border-alpha=0 left=-100 !
mix !
videoconvert ! autovideosink filesrc location=Downloads/3.mp4 !
decodebin ! queue ! videoconvert ! videobox border-alpha=0 left=-200 !
mix.
I get syntax error :(
Something like this with videomixer:
gst-launch-1.0 -e \
videomixer name=mix background=0 \
sink_1::xpos=0 sink_1::ypos=0 \
sink_2::xpos=200 sink_2::ypos=0 \
sink_3::xpos=100 sink_3::ypos=100 \
! autovideosink \
uridecodebin uri='file:///data/big_buck_bunny_trailer-360p.mp4' \
! videoscale \
! video/x-raw,width=200,height=100 \
! mix.sink_1 \
uridecodebin uri='file:///data/sintel_trailer-480p.webm' \
! videoscale \
! video/x-raw,width=200,height=100 \
! mix.sink_2 \
uridecodebin uri='file:///data/the_daily_dweebs-720p.mp4' \
! videoscale \
! video/x-raw,width=200,height=100 \
! mix.sink_3
Once you instantiate an element with a name (e.g. videomixer name=mix), you can later connect to it by referring to that name followed by a dot (e.g. mix.). You don't need to repeat autovideosink 3 times after that.
gst-launch-1.0 filesrc location=Downloads/1.mp4 ! decodebin ! queue ! videoconvert ! videobox border-alpha=0 right=-100 ! videomixer name=mix ! videoconvert ! autovideosink \
filesrc location=Downloads/2.mp4 ! decodebin ! queue ! videoconvert ! videobox border-alpha=0 left=-100 ! mix. \
filesrc location=Downloads/3.mp4 ! decodebin ! queue ! videoconvert ! videobox border-alpha=0 left=-200 ! mix.
Here we have created three branches and merged them into the single mix element.

How can I have 4 videos as 1 using GStreamer, with 1 large to the left and 3 smaller to the right (stacked one above the other)?

This is what I'm trying to achieve
I've been trying with the videobox plugin, but all I'm getting is 4 equally sized boxes.
I previously did something similar to what you described using 3 cameras.
Here's the pipe I used for it:
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw, width=800, height=480, framerate=25/1 ! alpha alpha=1.0 ! videobox left=-800 ! videomixer name=mix sink_00::xpos=0 sink_01::xpos=800 sink_02::xpos=0 sink_02::ypos=480 sink_00::alpha=1.0 sink_01::alpha=1.0 sink_02::alpha=1.0 ! videoconvert ! xvimagesink \
v4l2src device=/dev/video1 ! video/x-raw, width=1600, height=1200, framerate=25/1 ! alpha alpha=1.0 ! videobox border-alpha=0 top=-480 ! mix. \
v4l2src device=/dev/video2 ! video/x-raw, width=800, height=480, framerate=25/1 ! alpha alpha=1.0 ! videobox border-alpha=0 left=-00 ! mix. -e
You need a videobox element for each capture branch (in this case I used v4l2src, however you can use other sources like filesrc ! decodebin or a network source as well) and combine the branches in a videomixer element.
In my case I used one video on the left and two videos on the right, but you can adjust the left and top parameters of videobox and the sink_0x::xpos and sink_0x::ypos properties of the videomixer element.
Remember to set the alpha on each channel or your videos will be transparent.
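For the exact layout in the question (one large video on the left, three smaller ones stacked on the right), a rough sketch using videomixer pad positions and videotestsrc placeholders (the 640x480 and 320x160 sizes are assumptions; substitute your real sources, e.g. filesrc ! decodebin, and your own dimensions):
gst-launch-1.0 -e videomixer name=mix \
sink_0::xpos=0 sink_0::ypos=0 sink_0::alpha=1.0 \
sink_1::xpos=640 sink_1::ypos=0 sink_1::alpha=1.0 \
sink_2::xpos=640 sink_2::ypos=160 sink_2::alpha=1.0 \
sink_3::xpos=640 sink_3::ypos=320 sink_3::alpha=1.0 \
! videoconvert ! xvimagesink \
videotestsrc pattern=0 ! video/x-raw, width=640, height=480 ! mix.sink_0 \
videotestsrc pattern=1 ! video/x-raw, width=320, height=160 ! mix.sink_1 \
videotestsrc pattern=18 ! video/x-raw, width=320, height=160 ! mix.sink_2 \
videotestsrc pattern=11 ! video/x-raw, width=320, height=160 ! mix.sink_3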
So this depends very much on how and what you want to do. For example, this can be as simple as a GUI application that presents 4 videos on 4 different surfaces, with the GUI responsible for the layout.
If you really want to create a new single image that contains these 4 video streams then videomixer sounds like the way to go. See here for an example:
https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good/html/gst-plugins-good-plugins-videomixer.html
I use something like this to mix running streams into a single one:
/usr/local/bin/gst-launch-1.0 -vv -e videomixer name=mix background=2 \
sink_0::alpha=1.0 \
sink_0::ypos=0 \
sink_0::xpos=0 \
sink_1::alpha=1.0 \
sink_1::ypos=80 \
sink_1::xpos=40 \
sink_2::alpha=1.0 \
sink_2::ypos=80 \
sink_2::xpos=410 \
sink_3::alpha=1.0 \
sink_3::ypos=80 \
sink_3::xpos=780 \
sink_0::zorder=1 \
sink_1::zorder=3 \
sink_2::zorder=3 \
sink_3::zorder=4 \
! clockoverlay auto-resize=false draw-shadow=false draw-outline=false halignment=left valignment=top \
! timeoverlay auto-resize=false draw-shadow=false draw-outline=false halignment=left valignment=bottom \
! queue \
! nvh264enc preset=1 bitrate=1500 rc-mode=2 gop-size=10 \
! h264parse config-interval=-1 \
! mpegtsmux ! rtpmp2tpay pt=33 \
! udpsink host=239.255.42.61 port=5004 multicast-iface=10g-1 ttl=4 ttl-mc=4 \
multifilesrc location=/IMG/logo.jpg caps="image/jpeg,framerate=1/1" \
! jpegdec ! videoconvert ! videoscale \
! video/x-raw,width=1920,height=1080 \
! mix.sink_0 \
udpsrc multicast-group=239.255.42.60 address=239.255.42.60 port=5004 multicast-iface=eth0 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T,payload=(int)33" \
! rtpjitterbuffer ! rtpmp2tdepay ! tsdemux ! h264parse config-interval=-1 \
! avdec_h264 skip-frame=1 output-corrupt=false ! videoconvert ! videoscale \
! video/x-raw,width=360,height=240 \
! mix.sink_1 \
udpsrc multicast-group=239.255.42.57 address=239.255.42.57 port=5004 multicast-iface=eth0 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T,payload=(int)33" \
! rtpjitterbuffer ! rtpmp2tdepay ! tsdemux ! h264parse config-interval=-1 \
! avdec_h264 skip-frame=1 output-corrupt=false ! videoconvert ! videoscale \
! video/x-raw,width=360,height=240 \
! mix.sink_2 \
udpsrc multicast-group=239.255.42.62 address=239.255.42.62 port=5004 multicast-iface=eth0 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T,payload=(int)33" \
! rtpjitterbuffer ! rtpmp2tdepay ! tsdemux ! h264parse config-interval=-1 \
! avdec_h264 skip-frame=1 output-corrupt=false ! videoconvert ! videoscale \
! video/x-raw,width=360,height=240 \
! mix.sink_3

ERROR: pipeline could not be constructed: syntax error

I'm working on a project and I need to make this work:
gst-launch-1.0 -e \
videomixer name=mix \
sink_0::xpos=0 sink_0::ypos=0 sink_0::alpha=0 \
sink_1::xpos=640 sink_1::ypos=0 sink_1::alpha=1 \
sink_2::xpos=0 sink_2::ypos=0 sink_2::alpha=1 \
! glshader location=distortion.frag ! glimagesink sync=false \
videotestsrc pattern="black" \
! video/x-raw,width=1280,height=720 \
! mix.sink_0 \
rtpbin name=rtpbinleft latency=250 ntp-sync=true do-retransmission=0 \
udpsrc caps=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264 port=5000 ! rtpbinleft.recv_rtp_sink_0 \
rtpbinleft. ! rtph264depay ! h264parse ! avdec_h264 ! videoscale add-borders=false ! video/x-raw,width=640,height=720 ! mix.sink_1 \
udpsrc port=5001 ! rtpbinleft.recv_rtcp_sink_0 \
rtpbinleft.send_rtcp_src_0 ! udpsink port=5005 host=192.168.0.17 sync=false async=false \
rtpbin name=rtpbinright latency=250 ntp-sync=true do-retransmission=0 \
udpsrc caps=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264 port=6000 ! rtpbinright.recv_rtp_sink_0 \
rtpbinright. ! rtph264depay ! h264parse ! avdec_h264 ! videoscale add-borders=false ! video/x-raw,width=640,height=720 ! mix.sink_2 \
udpsrc port=6001 ! rtpbinright.recv_rtcp_sink_0 \
rtpbinright.send_rtcp_src_0 ! udpsink port=6005 host=192.168.0.18 sync=false async=false
It's supposed to take two streams and apply a barrel distortion.
Here is the tutorial: https://www.raspberrypi.org/forums/viewtopic.php?f=43&t=65700&start=25
I've tried almost everything but it always fails and says
ERROR: pipeline could not be constructed: syntax error.
Any help?
Apparently you've just forgotten the enclosing quotation marks around the caps values on the udpsrc elements that feed the rtpbins. On the command line you must write something like:
udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264"
Outside of the quotation marks, the '(string)' and '(int)' type annotations make no sense to gst-launch.
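Applied to the pipeline above, the two RTP udpsrc lines would then read something like this (same ports and rtpbin names, only the quotes added; the trailing backslashes are the line continuations from the original command):
udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" port=5000 ! rtpbinleft.recv_rtp_sink_0 \
udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" port=6000 ! rtpbinright.recv_rtp_sink_0 \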

GStreamer Missing Audio in MP4

This command adds a text to the video, but the audio is missing in the output MP4 file:
gst-launch-1.0 filesrc location=input.mp4 name=src ! decodebin ! textoverlay text="My Text" ! x264enc ! h264parse ! mp4mux ! filesink location=output.mp4
How can I fix this, so that the audio is preserved?
Thanks
This works, linking both the video and the audio pad of decodebin (named demuxer here) into mp4mux:
gst-launch-1.0 \
filesrc location=input.mp4 name=src \
! decodebin name=demuxer \
demuxer. ! queue \
! textoverlay text="My Text" \
! x264enc ! muxer. \
demuxer. ! queue \
! audioconvert ! voaacenc ! muxer. \
mp4mux name=muxer \
! filesink location=output.mp4
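To check that the resulting file really contains both an audio and a video stream, gst-discoverer-1.0 (shipped with gst-plugins-base) lists the streams it finds:
gst-discoverer-1.0 output.mp4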

How to combine uridecodebin and videomixer with a videoscale for each sink?

I am trying to build an array of images that I fetch from several URIs. I have succeeded in displaying one image with videomixer and uridecodebin plus a videoscale caps filter:
gst-launch -e videomixer name = mixer \
sink_0::xpos = 0 sink_0::ypos = 0 \
! xvimagesink \
uridecodebin uri=http://www.logotheque.fr/6396-2/logo+RMC+INFO.jpg \
! ffmpegcolorspace ! imagefreeze ! videoscale method = 1 \
! video/x-raw-yuv,width=100,height=100 ! queue ! mixer.sink_0.
But when I add the same image URI at another position in the videomixer with the same videoscale caps:
gst-launch -e videomixer name = mixer \
sink_0::xpos = 0 sink_0::ypos = 0 \
sink_1::xpos = 100 sink_1::ypos = 0 \
! xvimagesink \
uridecodebin uri=http://www.logotheque.fr/6396-2/logo+RMC+INFO.jpg \
! ffmpegcolorspace ! imagefreeze ! videoscale ! \
video/x-raw-yuv,width=100,height=100 ! queue2 ! mixer.sink_0. \
uridecodebin uri=http://www.logotheque.fr/6396-2/logo+RMC+INFO.jpg \
! ffmpegcolorspace ! imagefreeze ! videoscale ! \
video/x-raw-yuv,width=100, height=100 ! queue2 ! mixer.sink_1.
I get this error:
videoscale1 : not negotiated
gstbasetransform.c(2541): gst_base_transform_handle_buffer (): /GstPipeline:pipeline0/GstVideoScale:videoscale1:
So I don't understand why this error appears on the second sink, because this is the same process in both cases.
Edit :
I have found a partial solution for those interested.
gst-launch -e videomixer name=mix ! ffmpegcolorspace ! xvimagesink \
uridecodebin uri=http://www.logotheque.fr/6396-2/logo+RMC+INFO.jpg ! videoscale ! video/x-raw-yuv,width=100,height=100 \
! videobox top=0 left=0 ! imagefreeze ! mix. \
uridecodebin uri=http://upload.wikimedia.org/wikipedia/fr/1/14/Logo_vibration.JPG ! videoscale ! video/x-raw-yuv,width=100,height=100 \
! videobox top=0 left=-100 ! imagefreeze ! mix.
But this solution doesn't work with png files, and I don't know why, because uridecodebin is a universal decoder...
If anybody has an idea...
OK, try this pipeline. With this pipeline you can add a png file if you need to:
gst-launch -e videomixer2 name=mixer sink_0::xpos=0 sink_0::ypos=0 sink_1::xpos=100 sink_1::ypos=0 ! ffmpegcolorspace ! xvimagesink \
uridecodebin uri=http://www.logotheque.fr/6396-2/logo+RMC+INFO.jpg ! ffmpegcolorspace ! imagefreeze ! videoscale ! "video/x-raw-yuv, format=(fourcc)AYUV, width=100, height=100" ! queue2 ! mixer.sink_0. \
uridecodebin uri=http://www.logotheque.fr/6396-2/logo+RMC+INFO.jpg ! ffmpegcolorspace ! imagefreeze ! videoscale ! "video/x-raw-yuv, format=(fourcc)AYUV, width=100, height=100" ! queue2 ! mixer.sink_1. -v