I am trying to forward video between two GStreamer pipelines using shmsink/shmsrc, and have the receiving side encode the video.
The following is a command line for the sending side:
gst-launch-0.10 -v videotestsrc \
! 'video/x-raw-yuv, format=(fourcc)"I420", framerate=30/1, width=1280, height=720' \
! shmsink socket-path=/tmp/xxx shm-size=10000000 wait-for-connection=0 sync=false
The following is a command line for the receiving side:
gst-launch-0.10 -v shmsrc socket-path=/tmp/xxx \
! 'video/x-raw-yuv, format=(fourcc)"I420", framerate=30/1, width=1280, height=720' \
! x264enc \
! filesink location=/tmp/yyy
The problem is that nothing is recorded. It seems that the pipeline is not rolling. The output is shown below:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, framerate=(fraction)30/1, width=(int)1280, height=(int)720
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:src: caps = video/x-h264, width=(int)1280, height=(int)720, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, codec_data=(buffer)014d401fffe10018674d401feca02802dd8088000003000bb9aca00078c18cb001000468ebecb2, stream-format=(string)avc, alignment=(string)au, level=(string)3.1, profile=(string)main
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, framerate=(fraction)30/1, width=(int)1280, height=(int)720
When I remove x264enc as below, the pipeline rolls and the output file /tmp/yyy grows.
gst-launch-0.10 -v shmsrc socket-path=/tmp/xxx \
! 'video/x-raw-yuv, format=(fourcc)"I420", framerate=30/1, width=1280, height=720' \
! filesink location=/tmp/yyy
Interestingly, the output below shows "New clock: GstSystemClock", which was not shown previously.
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, framerate=(fraction)30/1, width=(int)1280, height=(int)720
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, framerate=(fraction)30/1, width=(int)1280, height=(int)720
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
I have no idea why the pipeline does not work with x264enc. Any help would be really appreciated.
The buffers output by shmsrc are not aligned to video frame boundaries, which is required by anything consuming video/x-raw caps.
In GStreamer 1.0, the rawvideoparse element was added to gather complete video frames before pushing them downstream. I don't believe GStreamer 0.10 has an equivalent element.
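For reference, here is a minimal GStreamer 1.0 sketch of the receiving side, assuming rawvideoparse (from gst-plugins-bad) is installed; untested here. For I420 at 1280x720 each frame is 1280 * 720 * 1.5 = 1382400 bytes, and rawvideoparse reassembles the byte stream into buffers of exactly that size:
gst-launch-1.0 -v shmsrc socket-path=/tmp/xxx \
! rawvideoparse format=i420 width=1280 height=720 framerate=30/1 \
! x264enc ! filesink location=/tmp/yyy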
Related
I have a GStreamer pipeline that plays back an RTP stream received from udpsrc:
udpsrc port=6000 caps=\"application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H265, payload=(int)96, seqnum-offset=(uint)5331, timestamp-offset=(uint)2500093283, ssrc=(uint)2046637718, a-framerate=(string)1\" ! rtph265depay ! avdec_h265 ! videoconvert ! autovideosink sync=false
That works fine; however, if I replace udpsrc with appsrc, it stops working. The only error I see is:
... Segment with non-TIME format not supported
Non-working pipeline with appsrc:
appsrc name=appsrc caps=\"application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H265, payload=(int)96, seqnum-offset=(uint)5331, timestamp-offset=(uint)2500093283, ssrc=(uint)2046637718, a-framerate=(string)1\" ! rtph265depay ! avdec_h265 ! videoconvert ! autovideosink sync=false
It turns out that appsrc needs to have its format property set to time.
Here is the fixed pipeline:
appsrc name=appsrc format=time caps=\"application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H265, payload=(int)96, seqnum-offset=(uint)5331, timestamp-offset=(uint)2500093283, ssrc=(uint)2046637718, a-framerate=(string)1\" ! rtph265depay ! avdec_h265 ! videoconvert ! autovideosink sync=false
I'm trying to save the video input from a camera (frame by frame is also fine), which I can display like this:
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! decodebin ! videoconvert ! autovideosink
I want to save this video to a file, either in video format or frame by frame. So I try to run
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! decodebin ! videoconvert ! avimux ! filesink location=video.avi
But my video.avi file is empty. What am I doing wrong? I am a beginner with GStreamer and can't find useful information online, so I can't figure out what each part of that pipeline does.
EDIT
Running with -v (verbose) I get this:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, encoding=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:sink: caps = application/x-rtp, encoding=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:src: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
[INFO] bitstreamMode 1, chromaInterleave 0, mapType 0, tiled2LinearEnable 0
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstImxVpuDecoder:imxvpudecoder0.GstPad:sink: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:sink: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstImxVpuDecoder:imxvpudecoder0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0.GstProxyPad:proxypad1: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
I was eventually able to write the stream of images using the following command:
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! matroskamux ! filesink location=video.mkv
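A multifilesink variant of the same idea should give frame-by-frame output, one JPEG file per frame (an untested sketch; the location pattern is just an example, %05d becomes a running frame counter):
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! multifilesink location=frame-%05d.jpg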
"encoding=JPEG" just specifies jpeg at that stage of the pipeline, but that is decoded by decodebin later, resulting in uncompressed, raw video.
You can check which encoders your GStreamer installation supports with:
gst-inspect-1.0 | grep enc
This also lists audio encoders. You probably have to install additional GStreamer packages to get more encoders, such as gstreamer1.0-plugins-bad or gstreamer1.0-plugins-ugly (the latter contains x264enc).
Then try the pipeline from @Alper again:
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay \
! decodebin ! videoconvert ! x264enc ! avimux ! filesink location=video.avi
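If re-encoding is not actually needed, the depayloaded JPEG frames can presumably be muxed directly as MJPEG; the -e flag makes gst-launch send EOS on Ctrl-C so avimux can finalize the file. An untested sketch:
gst-launch-1.0 -e udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! avimux ! filesink location=video.avi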
I want to launch an Opus/VP8 stream using GStreamer.
I'm starting from the following snippet:
#!/bin/sh
gst-launch-1.0 \
audiotestsrc ! \
audioresample ! audio/x-raw,channels=1,rate=16000 ! \
opusenc bitrate=20000 ! \
rtpopuspay ! udpsink host=127.0.0.1 port=5002 \
videotestsrc ! \
video/x-raw,width=320,height=240,framerate=15/1 ! \
videoscale ! videorate ! videoconvert ! timeoverlay ! \
vp8enc error-resilient=1 ! \
rtpvp8pay ! udpsink host=127.0.0.1 port=5004
The current folder now contains an MP4 video: I want to play it back via GStreamer, encoding both tracks.
I've tried:
gst-launch-1.0 filesrc location=video.mp4 ! video/x-raw,width=320,height=240,framerate=15/1 ! videoscale ! videorate ! videoconvert ! timeoverlay ! vp8enc error-resilient=1 ! rtpvp8pay ! udpsink host=127.0.0.1 port=5004
but gst-launch shows me this:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstCapsFilter:capsfilter0: Filter caps do not completely specify the output format
Additional debug info:
gstcapsfilter.c(454): gst_capsfilter_prepare_buf (): /GstPipeline:pipeline0/GstCapsFilter:capsfilter0:
Output caps are unfixed: video/x-raw, width=(int)320, height=(int)240, framerate=(fraction)15/1, format=(string){ YV12, YUY2, UYVY, AYUV, RGBx, BGRx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR, Y41B, Y42B, YVYU, Y444, v210, v216, NV12, NV21, NV16, NV61, NV24, GRAY8, GRAY16_BE, GRAY16_LE, v308, RGB16, BGR16, RGB15, BGR15, UYVP, A420, RGB8P, YUV9, YVU9, IYU1, ARGB64, AYUV64, r210, I420_10LE, I420_10BE, I422_10LE, I422_10BE, Y444_10LE, Y444_10BE, GBR, GBR_10LE, GBR_10BE, NV12_64Z32, A420_10LE, A420_10BE, A422_10LE, A422_10BE, A444_10LE, A444_10BE, I420 }
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
How can I solve this problem?
[Edit]: I've tried:
#!/bin/sh
gst-launch-1.0 \
filesrc video.mp4 ! \
video/x-raw,width=320,height=240,framerate=15/1 ! \
videoscale ! videorate ! videoconvert ! timeoverlay ! \
vp8enc error-resilient=1 ! \
rtpvp8pay ! udpsink host=127.0.0.1 port=5004
But there is the following error:
ERROR: pipeline could not be constructed: empty pipeline not allowed.
prova.sh: 8: prova.sh: filesrc: not found
Finally I've tried this:
gst-launch-1.0 filesrc location=video.mp4 ! decodebin ! videoscale ! videorate ! videoconvert ! timeoverlay ! vp8enc error-resilient=1 ! rtpvp8pay ! udpsink host=127.0.0.1 port=5004
And this is what the console shows:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
error: XDG_RUNTIME_DIR not set in the environment.
ERROR: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstVaapiDecodeBin:vaapidecodebin0/GstVaapiDecode:vaapidecode: Could not initialize supporting library.
Additional debug info:
gstvideodecoder.c(2492): gst_video_decoder_change_state (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstVaapiDecodeBin:vaapidecodebin0/GstVaapiDecode:vaapidecode:
Failed to open decoder
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
error: XDG_RUNTIME_DIR not set in the environment.
Freeing pipeline ...
(gst-launch-1.0:514): GStreamer-CRITICAL **: gst_pad_send_event: assertion 'GST_IS_PAD (pad)' failed
(gst-launch-1.0:514): GStreamer-CRITICAL **: gst_object_unref: assertion 'object != NULL' failed
Caught SIGSEGV
exec gdb failed: No such file or directory
Spinning. Please run 'gdb gst-launch-1.0 514' to continue debugging, Ctrl-C to quit, or Ctrl-\ to dump core.
Actually, videotestsrc produces raw video, but the MP4 must be decoded first, so this should work and negotiate caps automatically:
gst-launch-1.0 filesrc location=video.mp4 ! decodebin ! videoscale ! videorate ! videoconvert ! timeoverlay ! vp8enc error-resilient=1 ! rtpvp8pay ! udpsink host=127.0.0.1 port=5004
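To encode both tracks, decodebin can be given a name and both branches linked from it. A sketch, assuming the MP4 contains exactly one audio and one video stream (audioconvert/audioresample bring the audio to something opusenc accepts):
gst-launch-1.0 filesrc location=video.mp4 ! decodebin name=dec \
dec. ! queue ! audioconvert ! audioresample ! audio/x-raw,channels=1,rate=16000 ! \
opusenc bitrate=20000 ! rtpopuspay ! udpsink host=127.0.0.1 port=5002 \
dec. ! queue ! videoscale ! videorate ! videoconvert ! timeoverlay ! \
vp8enc error-resilient=1 ! rtpvp8pay ! udpsink host=127.0.0.1 port=5004
As for the vaapidecode failure in your log: it usually means the VA-API plugin cannot reach a display or driver. Setting XDG_RUNTIME_DIR (as the log itself hints) or removing the gstreamer1.0-vaapi package, so that decodebin falls back to a software decoder, are common workarounds.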
The following pipeline fails. How can I debug this? What is going wrong?
gst-launch-1.0 -v uvch264src device=/dev/video0 name=src \
auto-start=true src.vidsrc ! queue ! video/x-h264 ! \
h264parse ! avdec_h264 ! xvimagesink sync=false
Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = /dev/video0
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = /dev/video0
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, format=(string)YUY2, width=(int)2304, height=(int)1536, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)2/1
/GstPipeline:pipeline0/GstUvcH264Src:src.GstGhostPad:vfsrc: caps = video/x-raw, format=(string)YUY2, width=(int)2304, height=(int)1536, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)2/1
/GstPipeline:pipeline0/GstUvcH264Src:src.GstGhostPad:vfsrc.GstProxyPad:proxypad0: caps = video/x-raw, format=(string)YUY2, width=(int)2304, height=(int)1536, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)2/1
ERROR: from element /GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0:
streaming task paused, reason not-linked (-1)
Execution ended after 0:00:02.891955232
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
But vfsrc works fine:
gst-launch-1.0 -v -e uvch264src device=/dev/video0 name=src auto-start=true \
src.vfsrc ! queue ! video/x-raw,format=\(string\)YUY2,width=320,height=240,framerate=10/1 ! \
textoverlay text="Capture from vfsrc 79879 " font-desc="Sans 24" ! \
xvimagesink sync=false
Thanks,
Sneha
uvch264src requires the vfsrc pad to be linked. If you don't want to use it, you can link it to a fakesink:
gst-launch-1.0 -v uvch264src device=/dev/video0 name=src auto-start=true src.vidsrc ! queue ! video/x-h264 ! h264parse ! avdec_h264 ! xvimagesink sync=false src.vfsrc ! fakesink
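Alternatively, instead of discarding the viewfinder stream, vfsrc can feed a small preview window. An untested variation, reusing the vfsrc caps from the question:
gst-launch-1.0 -v uvch264src device=/dev/video0 name=src auto-start=true \
src.vidsrc ! queue ! video/x-h264 ! h264parse ! avdec_h264 ! xvimagesink sync=false \
src.vfsrc ! queue ! video/x-raw,width=320,height=240,framerate=10/1 ! xvimagesink sync=false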
I'm just trying to get an RTP sample working, but every example I've seen fails to run due to missing plugins or incorrect pads.
This one seems the most promising, but although the server and client appear to launch properly and go to PLAYING, nothing happens:
Server:
gst-launch -v videotestsrc ! \
video/x-raw-rgb, format=\(fourcc\)RGB, width=4, height=4, frame-rate=1/1 ! rtpvrawpay ! \
udpsink host=127.0.0.1
Server output:
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)RGB, depth=(string)8, width=(string)4, height=(string)4, colorimetry=(string)SMPTE240M, payload=(int)96, ssrc=(uint)3779397700, clock-base=(uint)1161131286, seqnum-base=(uint)43390
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
Client:
gst-launch-0.10 -v udpsrc caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)RGB, depth=(string)8, width=(string)4, height=(string)4, colorimetry=(string)SMPTE240M, payload=(int)96, ssrc=(uint)3779397700, clock-base=(uint)1161131286, seqnum-base=(uint)43390" ! rtpvrawdepay ! xvimagesink
Client output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:src: caps = video/x-raw-rgb, width=(int)4, height=(int)4, format=(fourcc)0x00000000, framerate=(fraction)0/1, endianness=(int)4321, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, bpp=(int)24, depth=(int)24
/GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)RGB, depth=(string)8, width=(string)4, height=(string)4, colorimetry=(string)SMPTE240M, ssrc=(uint)3779397700, clock-base=(uint)1161131286, seqnum-base=(uint)43390
These work:
Server:
gst-launch-0.10 -v \
gstrtpbin name=rtpbin1 \
videotestsrc ! x264enc ! rtph264pay ! rtpbin1.send_rtp_sink_0 \
rtpbin1.send_rtp_src_0 ! udpsink host=127.0.0.1 port=5011 \
rtpbin1.send_rtcp_src_0 ! udpsink host=127.0.0.1 port=5012 \
udpsrc port=5013 ! rtpbin1.recv_rtcp_sink_0
Client:
gst-launch-0.10 -v \
videomixer name=mix ! ffmpegcolorspace ! autovideosink sync=false async=false \
gstrtpbin name=rtpbin1 \
udpsrc port=5011 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0LAFdkBQfsBEAAAAwAXc1lAAPFi5IAA\\,aMuMsg\\=\\=\", ssrc=(uint)595281375, payload=(int)96, clock-base=(uint)3105254905, seqnum-base=(uint)59233" ! rtpbin1.recv_rtp_sink_0 rtpbin1. ! rtph264depay ! queue ! ffdec_h264 ! videobox border-alpha=0 top=0 left=0 ! mix. \
udpsrc port=5012 ! rtpbin1.recv_rtcp_sink_0 \
rtpbin1.send_rtcp_src_0 ! udpsink port=5013 host=127.0.0.1
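As a general debugging aid for the non-working raw-RTP attempt above: when a pipeline reaches PLAYING but nothing is rendered, raising the log level usually shows where buffers stop, for example:
GST_DEBUG=3 gst-launch-0.10 -v udpsrc caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)RGB, depth=(string)8, width=(string)4, height=(string)4, colorimetry=(string)SMPTE240M, payload=(int)96" ! rtpvrawdepay ! xvimagesink
Level 3 adds warnings and info messages from every element; higher values are progressively more verbose.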