I am trying to mux video and audio test sources into rtmpsink. This pipeline does not work:
gst-launch-1.0 \
videotestsrc ! queue ! x264enc ! \
flvmux name=mux ! \
rtmpsink location="rtmp://... live=1" \
audiotestsrc ! queue ! audioconvert ! mux.
This is the console output I get:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Redistribute latency...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
If I push audio and video separately, it works:
gst-launch-1.0 \
videotestsrc ! queue ! x264enc ! \
flvmux name=mux ! \
rtmpsink location="rtmp://... live=1"
Verbose (-v) logs:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)320\,\ height\=\(int\)240\,\ framerate\=\(fraction\)30/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)320\,\ height\=\(int\)240\,\ framerate\=\(fraction\)30/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)320\,\ height\=\(int\)240\,\ framerate\=\(fraction\)30/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"
/GstPipeline:pipeline0/GstAudioTestSrc:audiotestsrc0.GstPad:src: caps = "audio/x-raw\,\ format\=\(string\)S16LE\,\ layout\=\(string\)interleaved\,\ rate\=\(int\)44100\,\ channels\=\(int\)1"
/GstPipeline:pipeline0/GstQueue:queue1.GstPad:sink: caps = "audio/x-raw\,\ format\=\(string\)S16LE\,\ layout\=\(string\)interleaved\,\ rate\=\(int\)44100\,\ channels\=\(int\)1"
/GstPipeline:pipeline0/GstQueue:queue1.GstPad:src: caps = "audio/x-raw\,\ format\=\(string\)S16LE\,\ layout\=\(string\)interleaved\,\ rate\=\(int\)44100\,\ channels\=\(int\)1"
/GstPipeline:pipeline0/GstAudioConvert:audioconvert0.GstPad:src: caps = "audio/x-raw\,\ format\=\(string\)S16LE\,\ layout\=\(string\)interleaved\,\ rate\=\(int\)44100\,\ channels\=\(int\)1"
/GstPipeline:pipeline0/GstFlvMux:mux.GstPad:audio: caps = "audio/x-raw\,\ format\=\(string\)S16LE\,\ layout\=\(string\)interleaved\,\ rate\=\(int\)44100\,\ channels\=\(int\)1"
/GstPipeline:pipeline0/GstAudioConvert:audioconvert0.GstPad:sink: caps = "audio/x-raw\,\ format\=\(string\)S16LE\,\ layout\=\(string\)interleaved\,\ rate\=\(int\)44100\,\ channels\=\(int\)1"
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)320\,\ height\=\(int\)240\,\ framerate\=\(fraction\)30/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"
Redistribute latency...
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:src: caps = "video/x-h264\,\ codec_data\=\(buffer\)01640014ffe1001967640014acd94141fb0110000003001000000303c8f142996001000568ebecb22c\,\ stream-format\=\(string\)avc\,\ alignment\=\(string\)au\,\ level\=\(string\)2\,\ profile\=\(string\)high\,\ width\=\(int\)320\,\ height\=\(int\)240\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ framerate\=\(fraction\)30/1"
/GstPipeline:pipeline0/GstFlvMux:mux.GstPad:video: caps = "video/x-h264\,\ codec_data\=\(buffer\)01640014ffe1001967640014acd94141fb0110000003001000000303c8f142996001000568ebecb22c\,\ stream-format\=\(string\)avc\,\ alignment\=\(string\)au\,\ level\=\(string\)2\,\ profile\=\(string\)high\,\ width\=\(int\)320\,\ height\=\(int\)240\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ framerate\=\(fraction\)30/1"
/GstPipeline:pipeline0/GstFlvMux:mux: streamable = true
/GstPipeline:pipeline0/GstFlvMux:mux.GstPad:src: caps = "video/x-flv\,\ streamheader\=\(buffer\)\<\ ... buffer data ... \>"
/GstPipeline:pipeline0/GstRTMPSink:rtmpsink0.GstPad:sink: caps = "video/x-flv\,\ streamheader\=\(buffer\)\<\ ... buffer data .. \>"
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
I was just having a similar issue, and made it work by re-encoding the audio to AAC (faac produces MPEG-4 AAC, which flvmux accepts).
gst-launch-1.0 videotestsrc is-live=true ! videoconvert ! queue ! \
x264enc bitrate=2000 ! \
flvmux name=mux ! \
rtmpsink location="rtmp://localhost/test/test live=1" \
audiotestsrc is-live=true ! queue ! faac ! mux.
I tested this pipeline against an NGINX RTMP server, with VLC as the client. I'm not entirely sure why raw audio isn't working.
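If faac is not available in your build, another AAC encoder should work the same way. Here's an untested variant of the same pipeline using voaacenc from gst-plugins-bad (substitute whichever AAC encoder gst-inspect-1.0 shows you have):
gst-launch-1.0 videotestsrc is-live=true ! videoconvert ! queue ! \
x264enc bitrate=2000 ! \
flvmux name=mux ! \
rtmpsink location="rtmp://localhost/test/test live=1" \
audiotestsrc is-live=true ! queue ! audioconvert ! voaacenc ! mux.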
You can use rtmpsrc
Example:
gst-launch-1.0 rtmpsrc location="rtmp://<location> live=1" ! decodebin name=decoder decoder. ! queue ! videoconvert ! queue ! xvimagesink
You can build on this pipeline from there.
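For example, to play both the video and the audio of the stream (an untested sketch along the same lines; decodebin exposes one pad per elementary stream):
gst-launch-1.0 rtmpsrc location="rtmp://<location> live=1" ! decodebin name=decoder \
decoder. ! queue ! videoconvert ! autovideosink \
decoder. ! queue ! audioconvert ! audioresample ! autoaudiosink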
Related
I'm trying to save the video input from a camera (frame by frame would also be fine). I can display the input like this:
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! decodebin ! videoconvert ! autovideosink
I want to save this video to a file, either in video format or frame by frame. So I try to run
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! decodebin ! videoconvert ! avimux ! filesink location=video.avi
But my video.avi file is empty. What am I doing wrong? I am a beginner with GStreamer, and I can't find useful information online, so I can't figure out what each part of that pipeline is doing.
EDIT
Running with verbose I get this:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, encoding=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:sink: caps = application/x-rtp, encoding=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:src: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
[INFO] bitstreamMode 1, chromaInterleave 0, mapType 0, tiled2LinearEnable 0
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstImxVpuDecoder:imxvpudecoder0.GstPad:sink: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:sink: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstImxVpuDecoder:imxvpudecoder0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0.GstProxyPad:proxypad1: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
I could eventually write the stream of images using the following command:
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! matroskamux ! filesink location=video.mkv
"encoding=JPEG" just specifies jpeg at that stage of the pipeline, but that is decoded by decodebin later, resulting in uncompressed, raw video.
You can check which encoders your GStreamer install supports with
gst-inspect-1.0 | grep enc
This also lists audio encoders. You probably have to install additional GStreamer packages to get any or more encoders, like gstreamer1.0-plugins-bad or gstreamer1.0-plugins-ugly (the latter contains x264enc).
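To check for one element in particular (x264enc here, since that is the one this pipeline needs):
gst-inspect-1.0 x264enc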
Then try the pipeline from @Alper again:
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay \
! decodebin ! videoconvert ! x264enc ! avimux ! filesink location=video.avi
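If the file still comes out empty or unplayable, one more thing worth trying: avimux needs an EOS to write out its headers and index, which it never gets if you simply interrupt gst-launch. Adding -e (send EOS on shutdown, the same flag used elsewhere on this page) should help:
gst-launch-1.0 -e udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay \
! decodebin ! videoconvert ! x264enc ! avimux ! filesink location=video.avi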
I get errors when I run GStreamer audio multicast RTP on a BeagleBone Black.
Here is the GStreamer TX command:
# gst-launch-1.0 -v filesrc location="test.wav" ! decodebin ! audioconvert ! rtpL16pay ! queue ! udpsink host=224.0.0.10 auto-multicast=true port=5555 --gst-debug-level=3
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = audio/x-wav
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = NULL
0:00:00.300210292 23288 0x75490 FIXME default gstutils.c:3648:gst_pad_create_stream_id_printf_valist: Creating random stream-id, consider implementing a deterministic way of creating a stream-id
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstWavParse:wavparse0.GstPad:src: caps = audio/x-raw, format=(string)S16LE, layout=(string)interleaved, channels=(int)2, channel-mask=(bitmask)0x0000000000000003, rate=(int)48000
/GstPipeline:pipeline0/GstAudioConvert:audioconvert0.GstPad:src: caps = audio/x-raw, layout=(string)interleaved, rate=(int)48000, format=(string)S16BE, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0.GstPad:src: caps = application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96, ssrc=(uint)3613934853, timestamp-offset=(uint)2678744220, seqnum-offset=(uint)28780
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96, ssrc=(uint)3613934853, timestamp-offset=(uint)2678744220, seqnum-offset=(uint)28780
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96, ssrc=(uint)3613934853, timestamp-offset=(uint)2678744220, seqnum-offset=(uint)28780
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96, ssrc=(uint)3613934853, timestamp-offset=(uint)2678744220, seqnum-offset=(uint)28780
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0.GstPad:sink: caps = audio/x-raw, layout=(string)interleaved, rate=(int)48000, format=(string)S16BE, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
/GstPipeline:pipeline0/GstAudioConvert:audioconvert0.GstPad:sink: caps = audio/x-raw, format=(string)S16LE, layout=(string)interleaved, channels=(int)2, channel-mask=(bitmask)0x0000000000000003, rate=(int)48000
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0.GstProxyPad:proxypad1: caps = audio/x-raw, format=(string)S16LE, layout=(string)interleaved, channels=(int)2, channel-mask=(bitmask)0x0000000000000003, rate=(int)48000
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0: timestamp = 2678744220
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0: seqnum = 28780
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
And the GStreamer RX command:
# gst-launch-1.0 -v udpsrc multicast-group=224.0.0.10 auto-multicast=true port=5555 caps='application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96' ! rtpL16depay ! audioconvert ! alsasink --gst-debug-level=3
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstUDPSrc:udpsrc0.GstPad:src: caps = application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96
/GstPipeline:pipeline0/GstRtpL16Depay:rtpl16depay0.GstPad:src: caps = audio/x-raw, format=(string)S16BE, layout=(string)interleaved, rate=(int)48000, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
0:00:00.357371500 23249 0xefb20 WARN alsa conf.c:4694:snd_config_expand: alsalib error: Unknown parameters {AES0 0x02 AES1 0x82 AES2 0x00 AES3 0x02}
0:00:00.358214833 23249 0xefb20 WARN alsa pcm.c:2239:snd_pcm_open_noupdate: alsalib error: Unknown PCM default:{AES0 0x02 AES1 0x82 AES2 0x00 AES3 0x02}
/GstPipeline:pipeline0/GstAudioConvert:audioconvert0.GstPad:src: caps = audio/x-raw, layout=(string)interleaved, rate=(int)48000, format=(string)S16LE, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
/GstPipeline:pipeline0/GstAlsaSink:alsasink0.GstPad:sink: caps = audio/x-raw, layout=(string)interleaved, rate=(int)48000, format=(string)S16LE, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
/GstPipeline:pipeline0/GstAudioConvert:audioconvert0.GstPad:sink: caps = audio/x-raw, format=(string)S16BE, layout=(string)interleaved, rate=(int)48000, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
/GstPipeline:pipeline0/GstRtpL16Depay:rtpl16depay0.GstPad:sink: caps = application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96
0:00:08.472125917 23249 0xefb20 WARN audiobasesink gstaudiobasesink.c:1593:gst_audio_base_sink_get_alignment: Unexpected discontinuity in audio timestamps of +0:00:00.010583333, resyncing
0:00:51.970270047 23249 0xefb20 WARN audiobasesink gstaudiobasesink.c:1593:gst_audio_base_sink_get_alignment: Unexpected discontinuity in audio timestamps of +0:00:00.002729166, resyncing
0:00:51.982288589 23249 0xefb20 WARN audiobasesink gstaudiobasesink.c:1593:gst_audio_base_sink_get_alignment: Unexpected discontinuity in audio timestamps of +0:00:00.010437500, resyncing
0:00:52.010382256 23249 0xefb20 WARN audiobasesink gstaudiobasesink.c:1593:gst_audio_base_sink_get_alignment: Unexpected discontinuity in audio timestamps of +0:00:00.006937500, resyncing
0:00:52.029231922 23249 0xefb20 WARN audiobasesink gstaudiobasesink.c:1593:gst_audio_base_sink_get_alignment: Unexpected discontinuity in audio timestamps of +0:00:00.007354166, resyncing
As you can see above, warnings are printed to the console and the audio is crackling and noisy:
audiobasesink gstaudiobasesink.c:1593:gst_audio_base_sink_get_alignment: Unexpected discontinuity in audio timestamps of +x:xx:xx.xxxxxxxxx, resyncing
Any idea how to solve this issue?
Thank you for reading.
Note that both GStreamer commands run on the same machine; it effectively acts as a local audio loopback.
I think you want an rtpjitterbuffer on the receiver side. UDP does not guarantee that packets arrive in order, and this element takes care of reordering them (and of smoothing out arrival jitter).
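For example, inserted right after udpsrc in the receive pipeline above (an untested sketch; latency is in milliseconds and is a value to tune):
gst-launch-1.0 -v udpsrc multicast-group=224.0.0.10 auto-multicast=true port=5555 caps='application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96' ! rtpjitterbuffer latency=100 ! rtpL16depay ! audioconvert ! alsasink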
The following pipeline fails. How do I debug this? What is going wrong?
gst-launch-1.0 -v uvch264src device=/dev/video0 name=src \
auto-start=true src.vidsrc ! queue ! video/x-h264 ! \
h264parse ! avdec_h264 ! xvimagesink sync=false
Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = /dev/video0
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = /dev/video0
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, format=(string)YUY2, width=(int)2304, height=(int)1536, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)2/1
/GstPipeline:pipeline0/GstUvcH264Src:src.GstGhostPad:vfsrc: caps = video/x-raw, format=(string)YUY2, width=(int)2304, height=(int)1536, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)2/1
/GstPipeline:pipeline0/GstUvcH264Src:src.GstGhostPad:vfsrc.GstProxyPad:proxypad0: caps = video/x-raw, format=(string)YUY2, width=(int)2304, height=(int)1536, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, framerate=(fraction)2/1
ERROR: from element /GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0:
streaming task paused, reason not-linked (-1)
Execution ended after 0:00:02.891955232
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
But vfsrc is working fine.
gst-launch-1.0 -v -e uvch264src device=/dev/video0 name=src auto-start=true \
src.vfsrc ! queue ! video/x-raw,format=\(string\)YUY2,width=320,height=240,framerate=10/1 ! \
textoverlay text="Capture from vfsrc 79879 " font-desc="Sans 24" ! \
xvimagesink sync=false
Thanks,
Sneha
uvch264src requires the vfsrc pad to be linked. If you don't want to use it, you can link it to a fakesink.
gst-launch-1.0 -v uvch264src device=/dev/video0 name=src auto-start=true src.vidsrc ! queue ! video/x-h264 ! h264parse ! avdec_h264 ! xvimagesink sync=false src.vfsrc ! fakesink
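And if you want a live viewfinder rather than discarding the vfsrc stream, link it to a video sink instead. A sketch combining both branches (untested, reusing the caps from the vfsrc pipeline above):
gst-launch-1.0 -v -e uvch264src device=/dev/video0 name=src auto-start=true \
src.vidsrc ! queue ! video/x-h264 ! h264parse ! avdec_h264 ! xvimagesink sync=false \
src.vfsrc ! queue ! video/x-raw,width=320,height=240,framerate=10/1 ! xvimagesink sync=false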
I am trying to forward a video between two GStreamer pipelines by using shmsink/shmsrc, and make the receiving side to encode the video.
The following is a command line for the sending side:
gst-launch-0.10 -v videotestsrc \
! 'video/x-raw-yuv, format=(fourcc)"I420", framerate=30/1, width=1280, height=720' \
! shmsink socket-path=/tmp/xxx shm-size=10000000 wait-for-connection=0 sync=false
The following is a command line for the receiving side:
gst-launch-0.10 -v shmsrc socket-path=/tmp/xxx \
! 'video/x-raw-yuv, format=(fourcc)"I420", framerate=30/1, width=1280, height=720' \
! x264enc \
! filesink location=/tmp/yyy
The problem is that nothing is recorded; it seems that the pipeline is not rolling. Below is the output:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, framerate=(fraction)30/1, width=(int)1280, height=(int)720
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:src: caps = video/x-h264, width=(int)1280, height=(int)720, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, codec_data=(buffer)014d401fffe10018674d401feca02802dd8088000003000bb9aca00078c18cb001000468ebecb2, stream-format=(string)avc, alignment=(string)au, level=(string)3.1, profile=(string)main
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, framerate=(fraction)30/1, width=(int)1280, height=(int)720
When I remove x264enc as below, the pipeline rolls and the output file /tmp/yyy keeps growing.
gst-launch-0.10 -v shmsrc socket-path=/tmp/xxx \
! 'video/x-raw-yuv, format=(fourcc)"I420", framerate=30/1, width=1280, height=720' \
! filesink location=/tmp/yyy
Interestingly, the output below shows "New clock: GstSystemClock", which was not shown previously.
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, framerate=(fraction)30/1, width=(int)1280, height=(int)720
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, framerate=(fraction)30/1, width=(int)1280, height=(int)720
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
I have no idea why the pipeline does not work with x264enc. Any help would be really appreciated.
The buffers output by shmsrc are not aligned to video frame boundaries, which anything consuming video/x-raw caps requires.
With GStreamer 1.0, the rawvideoparse element has been added to allow gathering complete video frames to push downstream. I don't believe GStreamer 0.10 has that element available.
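A GStreamer 1.0 receiving side might therefore look like this (an untested sketch; the rawvideoparse properties must match the caps the sender actually produces):
gst-launch-1.0 -v shmsrc socket-path=/tmp/xxx \
! rawvideoparse format=i420 width=1280 height=720 framerate=30/1 \
! x264enc ! filesink location=/tmp/yyy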
I'm just trying to get an RTP sample working, but every example I've seen doesn't execute due to missing plugins or incorrectly linked pads.
This one seems the most promising, but although the server and client seem to launch properly and go to "PLAYING", nothing happens:
Server:
gst-launch -v videotestsrc ! \
video/x-raw-rgb, format=\(fourcc\)RGB, width=4, height=4, framerate=1/1 ! rtpvrawpay ! \
udpsink host=127.0.0.1
Server output:
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)RGB, depth=(string)8, width=(string)4, height=(string)4, colorimetry=(string)SMPTE240M, payload=(int)96, ssrc=(uint)3779397700, clock-base=(uint)1161131286, seqnum-base=(uint)43390
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
Client:
gst-launch-0.10 -v udpsrc caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)RGB, depth=(string)8, width=(string)4, height=(string)4, colorimetry=(string)SMPTE240M, payload=(int)96, ssrc=(uint)3779397700, clock-base=(uint)1161131286, seqnum-base=(uint)43390" ! rtpvrawdepay ! xvimagesink
Client output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:src: caps = video/x-raw-rgb, width=(int)4, height=(int)4, format=(fourcc)0x00000000, framerate=(fraction)0/1, endianness=(int)4321, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, bpp=(int)24, depth=(int)24
/GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)RGB, depth=(string)8, width=(string)4, height=(string)4, colorimetry=(string)SMPTE240M, ssrc=(uint)3779397700, clock-base=(uint)1161131286, seqnum-base=(uint)43390
These work:
Server:
gst-launch-0.10 -v \
gstrtpbin name=rtpbin1 \
videotestsrc ! x264enc ! rtph264pay ! rtpbin1.send_rtp_sink_0 \
rtpbin1.send_rtp_src_0 ! udpsink host=127.0.0.1 port=5011 \
rtpbin1.send_rtcp_src_0 ! udpsink host=127.0.0.1 port=5012 \
udpsrc port=5013 ! rtpbin1.recv_rtcp_sink_0
Client:
gst-launch-0.10 -v \
videomixer name=mix ! ffmpegcolorspace ! autovideosink sync=false async=false \
gstrtpbin name=rtpbin1 \
udpsrc port=5011 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0LAFdkBQfsBEAAAAwAXc1lAAPFi5IAA\\,aMuMsg\\=\\=\", ssrc=(uint)595281375, payload=(int)96, clock-base=(uint)3105254905, seqnum-base=(uint)59233" ! rtpbin1.recv_rtp_sink_0 rtpbin1. ! rtph264depay ! queue ! ffdec_h264 ! videobox border-alpha=0 top=0 left=0 ! mix. \
udpsrc port=5012 ! rtpbin1.recv_rtcp_sink_0 \
rtpbin1.send_rtcp_src_0 ! udpsink port=5013 host=127.0.0.1