I am struggling with the behavior of the compositor plugin when it comes to scaling. In my pipeline a video is first downscaled, then processed, and finally upscaled back to its original size.
A minimal example which illustrates this is the following:
gst-launch-1.0 -v videotestsrc ! video/x-raw,width=1280,height=720 ! \
videoscale ! video/x-raw,width=512,height=512 ! \
compositor sink_0::width=1280 sink_0::height=720 ! fakesink
I would expect compositor to output 1280 x 720 video, since those are the dimensions specified on its sink pad, and scaling the intermediate video back to its original size should "correct" the pixel aspect ratio back to 1/1 as well. Besides, the default scaling-policy ignores aspect ratio anyway.
The result I get is this:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = video/x-raw, format=(string)ABGR64_LE, width=(int)1280, height=(int)720, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, format=(string)ABGR64_LE, width=(int)1280, height=(int)720, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:src: caps = video/x-raw, format=(string)ABGR64_LE, width=(int)512, height=(int)512, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)16/9, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw, format=(string)ABGR64_LE, width=(int)512, height=(int)512, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)16/9, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCompositor:compositor0.GstCompositorPad:sink_0: caps = video/x-raw, format=(string)ABGR64_LE, width=(int)512, height=(int)512, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)16/9, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw, format=(string)ABGR64_LE, width=(int)512, height=(int)512, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)16/9, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:sink: caps = video/x-raw, format=(string)ABGR64_LE, width=(int)1280, height=(int)720, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)ABGR64_LE, width=(int)1280, height=(int)720, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCompositor:compositor0.GstAggregatorPad:src: caps = video/x-raw, format=(string)AYUV, width=(int)2275, height=(int)720, framerate=(fraction)30/1, chroma-site=(string)jpeg, colorimetry=(string)bt709
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw, format=(string)AYUV, width=(int)2275, height=(int)720, framerate=(fraction)30/1, chroma-site=(string)jpeg, colorimetry=(string)bt709
Redistribute latency...
/GstPipeline:pipeline0/GstCompositor:compositor0.GstAggregatorPad:src: caps = video/x-raw, format=(string)AYUV, width=(int)2275, height=(int)720, framerate=(fraction)30/1, chroma-site=(string)jpeg, colorimetry=(string)bt709
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw, format=(string)AYUV, width=(int)2275, height=(int)720, framerate=(fraction)30/1, chroma-site=(string)jpeg, colorimetry=(string)bt709
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
Instead of the expected result I get a 2275 x 720 video. The main question is whether my assumption about compositor's behavior is wrong, or whether I need an additional videoscale step before compositor. The latter would make it work, but it feels wrong since compositor can already scale the input by itself.
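One workaround I am considering is to pin the pixel aspect ratio back to 1/1 in the intermediate capsfilter, so that compositor no longer compensates for non-square pixels. A minimal sketch of that idea (untested):
gst-launch-1.0 -v videotestsrc ! video/x-raw,width=1280,height=720 ! \
videoscale ! video/x-raw,width=512,height=512,pixel-aspect-ratio=1/1 ! \
compositor sink_0::width=1280 sink_0::height=720 ! fakesink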
I am trying to use a Basler camera on an i.MX8M Plus board.
The camera is detected as:
root@ucm-imx8m-plus:~# v4l2-ctl --list-devices
():
/dev/v4l-subdev0
mxc-isi-cap (platform:32e00000.isi:cap_devic):
/dev/video0
FSL Capture Media Device (platform:mxc-md):
/dev/media0
It supports the following capture formats:
root@ucm-imx8m-plus:~# v4l2-ctl -d /dev/video0 --list-formats
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture Multiplanar
[0]: 'RGBP' (16-bit RGB 5-6-5)
[1]: 'RGB3' (24-bit RGB 8-8-8)
[2]: 'BGR3' (24-bit BGR 8-8-8)
[3]: 'YUYV' (YUYV 4:2:2)
[4]: 'YUV4' (32-bit A/XYUV 8-8-8-8)
[5]: 'NV12' (Y/CbCr 4:2:0)
[6]: 'YM24' (Planar YUV 4:4:4 (N-C))
[7]: 'XR24' (32-bit BGRX 8-8-8-8)
[8]: 'AR24' (32-bit BGRA 8-8-8-8)
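The list above shows only the pixel formats; the supported frame sizes and intervals per format can be enumerated with the extended listing (output omitted here, and assuming the driver implements frame-size enumeration):
root@ucm-imx8m-plus:~# v4l2-ctl -d /dev/video0 --list-formats-ext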
Starting gst-launch-1.0 results in this output:
root@ucm-imx8m-plus:~# gst-launch-1.0 -v v4l2src device=/dev/video0 ! videoconvert ! videoscale ! videorate ! video/x-raw,framerate=30/1,width=320,height=240 ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, framerate=(fraction)30/1, width=(int)320, height=(int)240, format=(string)YUY2, interlace-mode=(string)progressive, colorimetry=(string)1:4:5:1
[ 2318.660751] bypass csc
[ 2318.665343] input fmt YUV4
[ 2318.670921] output fmt YUYV
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, framerate=(fraction)30/1, width=(int)320, height=(int)240, format=(string)YUY2, interlace-mode=(string)progressive, colorimetry=(string)1:4:5:1
/GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:src: caps = video/x-raw, framerate=(fraction)30/1, width=(int)320, height=(int)240, format=(string)YUY2, interlace-mode=(string)progressive, colorimetry=(string)1:4:5:1
/GstPipeline:pipeline0/GstVideoRate:videorate0.GstPad:src: caps = video/x-raw, framerate=(fraction)30/1, width=(int)320, height=(int)240, format=(string)YUY2, interlace-mode=(string)progressive, colorimetry=(string)1:4:5:1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, framerate=(fraction)30/1, width=(int)320, height=(int)240, format=(string)YUY2, interlace-mode=(string)progressive, colorimetry=(string)1:4:5:1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw, framerate=(fraction)30/1, width=(int)320, height=(int)240, format=(string)YUY2, interlace-mode=(string)progressive, colorimetry=(string)1:4:5:1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstWaylandSink:autovideosink0-actual-sink-wayland.GstPad:sink: caps = video/x-raw, framerate=(fraction)30/1, width=(int)320, height=(int)240, format=(string)YUY2, interlace-mode=(string)progressive, colorimetry=(string)1:4:5:1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = video/x-raw, framerate=(fraction)30/1, width=(int)320, height=(int)240, format=(string)YUY2, interlace-mode=(string)progressive, colorimetry=(string)1:4:5:1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, framerate=(fraction)30/1, width=(int)320, height=(int)240, format=(string)YUY2, interlace-mode=(string)progressive, colorimetry=(string)1:4:5:1
/GstPipeline:pipeline0/GstVideoRate:videorate0.GstPad:sink: caps = video/x-raw, framerate=(fraction)30/1, width=(int)320, height=(int)240, format=(string)YUY2, interlace-mode=(string)progressive, colorimetry=(string)1:4:5:1
/GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:sink: caps = video/x-raw, framerate=(fraction)30/1, width=(int)320, height=(int)240, format=(string)YUY2, interlace-mode=(string)progressive, colorimetry=(string)1:4:5:1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, framerate=(fraction)30/1, width=(int)320, height=(int)240, format=(string)YUY2, interlace-mode=(string)progressive, colorimetry=(string)1:4:5:1
No video is shown while the pipeline is running.
videotestsrc works perfectly fine, and the camera works on other systems as well.
root@ucm-imx8m-plus:~# gst-launch-1.0 -v videotestsrc ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = video/x-raw, format=(string)BGRA, width=(int)320, height=(int)240, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw, format=(string)BGRA, width=(int)320, height=(int)240, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstWaylandSink:autovideosink0-actual-sink-wayland.GstPad:sink: caps = video/x-raw, format=(string)BGRA, width=(int)320, height=(int)240, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = video/x-raw, format=(string)BGRA, width=(int)320, height=(int)240, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
The system runs a Yocto image (Hardknott, kernel 5.10.72-2.2.0) from Compulab.
What do I need to change and/or test to display the camera stream correctly?
Edits:
root@ucm-imx8m-plus:/opt/dart-bcon-mipi/lib# gst-launch-1.0 -v v4l2src device=/dev/video0 ! fakesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, format=(string)YUY2, framerate=(fraction)120/1, width=(int)3840, height=(int)2160, interlace-mode=(string)progressive, colorimetry=(string)1:4:11:1
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw, format=(string)YUY2, framerate=(fraction)120/1, width=(int)3840, height=(int)2160, interlace-mode=(string)progressive, colorimetry=(string)1:4:11:1
[ 315.657117] bypass csc
[ 315.660021] input fmt YUV4
[ 315.665603] output fmt YUYV
root@ucm-imx8m-plus:/opt/dart-bcon-mipi/lib# gst-launch-1.0 -v v4l2src device=/dev/video0 ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, format=(string)YUY2, width=(int)3840, height=(int)2160, framerate=(fraction)120/1, interlace-mode=(string)progressive, colorimetry=(string)1:4:11:1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw, format=(string)YUY2, width=(int)3840, height=(int)2160, framerate=(fraction)120/1, interlace-mode=(string)progressive, colorimetry=(string)1:4:11:1
[ 362.197247] bypass csc
[ 362.201367] input fmt YUV4
[ 362.206942] output fmt YUYV
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstWaylandSink:autovideosink0-actual-sink-wayland.GstPad:sink: caps = video/x-raw, format=(string)YUY2, width=(int)3840, height=(int)2160, framerate=(fraction)120/1, interlace-mode=(string)progressive, colorimetry=(string)1:4:11:1
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = video/x-raw, format=(string)YUY2, width=(int)3840, height=(int)2160, framerate=(fraction)120/1, interlace-mode=(string)progressive, colorimetry=(string)1:4:11:1
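Since the camera negotiates 3840 x 2160 at 120 fps by default, my next test will be forcing a smaller capture mode directly at the source, in case the sink cannot keep up with the full rate. A sketch of what I intend to run (I have not yet verified that the sensor offers this mode):
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,format=YUY2,width=1280,height=720,framerate=30/1 ! autovideosink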
I am trying to capture video from a webcam and save it in chunks of approx. 10 seconds (splitmuxsink).
I have tried the same GStreamer command on Windows 10 and on Linux (Yocto), but it only works on Windows. The Windows machine is a regular laptop; the Linux machine is a Raspberry Pi 3.
On Linux the file gets created at start, but it does not grow in size and does not split after the configured time.
c:\gstreamer\1.0\x86\bin>gst-launch-1.0.exe -v souphttpsrc location=http://192.168.1.245:8080/video ! multipartdemux ! image/jpeg, framerate=25/1 ! jpegparse ! splitmuxsink location=file%02d.mkv max-size-time=10000000000 muxer=matroskamux
gst-launch-1.0 -v souphttpsrc location=http://192.168.1.245:8080/video ! multipartdemux ! image/jpeg, framerate=25/1 ! jpegparse ! splitmuxsink location=file%02d.mkv max-size-time=10000000000 muxer=matroskamux
I also tried to capture a single JPEG file; that worked similarly on both systems.
gst-launch-1.0.exe -v souphttpsrc location=http://192.168.1.245:8080/shot.jpg ! filesink location=capture1.jpg
GStreamer versions:
c:\gstreamer\1.0\x86\bin>gst-launch-1.0 --gst-version
GStreamer Core Library version 1.15.90
root@raspberrypi3:~# gst-launch-1.0 --gst-version
GStreamer Core Library version 1.14.4
The output is also quite similar, but on Linux I continuously get output like
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstQueue:queue0: max-size-buffers = 13
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstQueue:queue0: max-size-buffers = 14
all the time.
Is this output slowing everything down so much that it does not work?
Windows:
c:\gstreamer\1.0\x86\bin>gst-launch-1.0.exe -v souphttpsrc location=http://192.168.1.245:8080/video ! multipartdemux ! image/jpeg, framerate=25/1 ! jpegparse ! splitmuxsink location=file%02d.mkv max-size-time=10000000000 muxer=matroskamux
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstFileSink:sink: async = false
Got context from element 'souphttpsrc0': gst.soup.session=context, session=(SoupSession)NULL, force=(boolean)false;
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1: caps = image/jpeg, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = image/jpeg, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstJpegParse:jpegparse0.GstPad:sink: caps = image/jpeg, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = image/jpeg
/GstPipeline:pipeline0/GstJpegParse:jpegparse0.GstPad:src: caps = image/jpeg, parsed=(boolean)true, format=(string)I420, width=(int)1280, height=(int)720, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0.GstGhostPad:video.GstProxyPad:proxypad0: caps = image/jpeg, parsed=(boolean)true, format=(string)I420, width=(int)1280, height=(int)720, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstQueue:queue0.GstPad:sink: caps = image/jpeg, parsed=(boolean)true, format=(string)I420, width=(int)1280, height=(int)720, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0.GstGhostPad:video: caps = image/jpeg, parsed=(boolean)true, format=(string)I420, width=(int)1280, height=(int)720, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstQueue:queue0.GstPad:src: caps = image/jpeg, parsed=(boolean)true, format=(string)I420, width=(int)1280, height=(int)720, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstMatroskaMux:matroskamux0.GstMatroskamuxPad:video_0: caps = image/jpeg, parsed=(boolean)true, format=(string)I420, width=(int)1280, height=(int)720, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstMatroskaMux:matroskamux0.GstMatroskamuxPad:video_0: caps = NULL
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstFileSink:sink: location = file00.mkv
Pipeline is PREROLLED ...
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstMatroskaMux:matroskamux0.GstMatroskamuxPad:video_0: caps = image/jpeg, parsed=(boolean)true, format=(string)I420, width=(int)1280, height=(int)720, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstMatroskaMux:matroskamux0.GstPad:src: caps = video/x-matroska
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstFileSink:sink.GstPad:sink: caps = video/x-matroska
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Linux:
root@raspberrypi3:~# gst-launch-1.0 -v souphttpsrc location=http://192.168.1.245:8080/video ! multipartdemux ! image/jpeg, framerate=25/1 ! jpegparse ! splitmuxsink location=file%02d.mkv max-size-time=10000000000 muxer=matroskamux
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstFileSink:sink: async = false
Got context from element 'souphttpsrc0': gst.soup.session=context, session=(SoupSession)NULL, force=(boolean)false;
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1: caps = image/jpeg, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = image/jpeg, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstJpegParse:jpegparse0.GstPad:sink: caps = image/jpeg, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = image/jpeg
/GstPipeline:pipeline0/GstJpegParse:jpegparse0.GstPad:src: caps = image/jpeg, parsed=(boolean)true, format=(string)I420, width=(int)1280, height=(int)720, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0.GstGhostPad:video.GstProxyPad:proxypad0: caps = image/jpeg, parsed=(boolean)true, format=(string)I420, width=(int)1280, height=(int)720, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstQueue:queue0.GstPad:sink: caps = image/jpeg, parsed=(boolean)true, format=(string)I420, width=(int)1280, height=(int)720, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstQueue:queue0.GstPad:src: caps = image/jpeg, parsed=(boolean)true, format=(string)I420, width=(int)1280, height=(int)720, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstQueue:queue0.GstPad:src: caps = image/jpeg, parsed=(boolean)true, format=(string)I420, width=(int)1280, height=(int)720, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstMatroskaMux:matroskamux0.GstMatroskamuxPad:video_0: caps = image/jpeg, parsed=(boolean)true, format=(string)I420, width=(int)1280, height=(int)720, framerate=(fraction)25/1
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstMatroskaMux:matroskamux0.GstMatroskamuxPad:video_0: caps = NULL
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstFileSink:sink: location = file00.mkv
Pipeline is PREROLLED ...
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstMatroskaMux:matroskamux0.GstMatroskamuxPad:video_0: caps = image/jpeg, parsed=(boolean)true, format=(string)I420, width=(int)1280, height=(int)720, framerate=(fraction)25/1
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstQueue:queue0: max-size-buffers = 6
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstQueue:queue0: max-size-buffers = 7
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstQueue:queue0: max-size-buffers = 8
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstQueue:queue0: max-size-buffers = 9
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstQueue:queue0: max-size-buffers = 10
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstQueue:queue0: max-size-buffers = 11
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstQueue:queue0: max-size-buffers = 12
/GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstQueue:queue0: max-size-buffers = 13
[...]
Looking at the splitmuxsink source, it seems that the max-size-buffers = <N> logs come from splitmuxsink growing its internal queue whenever it is full.
It is hard to guess why without further information, so I would recommend setting GST_DEBUG=5 (or some other level) to get more output.
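For example, to capture targeted debug output to a file rather than flooding the console (the category selection here is just a suggestion):
GST_DEBUG=splitmuxsink:5,queue:5 gst-launch-1.0 -v souphttpsrc location=http://192.168.1.245:8080/video ! multipartdemux ! image/jpeg, framerate=25/1 ! jpegparse ! splitmuxsink location=file%02d.mkv max-size-time=10000000000 muxer=matroskamux 2> gst-debug.log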
Also check that you can write to your filesystem fast enough if you are working with high bitrates, or use an intermediate location that is mounted in RAM (as /tmp often is).
I found out after trying several things:
GStreamer's debug info is not really usable for me because it produces too much information.
I got my example working as follows: if I remove jpegparse and framerate=25/1, add do-timestamp=true, and reduce the image size (image/jpeg,width=640,height=480), it works.
After three weeks of testing, fiddling around with GStreamer is IMO not very user-friendly, but it is indeed a very powerful tool.
gst-launch-1.0 -v souphttpsrc location=http://192.168.1.133:8080/video do-timestamp=true ! multipartdemux ! image/jpeg,width=640,height=480 ! splitmuxsink location=FILE%02d.mkv max-size-time=10000000000 muxer=matroskamux
I'm trying to save the video input (it can also be frame by frame) from a camera, whose input I can display like this:
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! decodebin ! videoconvert ! autovideosink
I want to save this video to a file, either in video format or frame by frame. So I try to run
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! decodebin ! videoconvert ! avimux ! filesink location=video.avi
But my video.avi file is empty. What am I doing wrong? I am a beginner in GStreamer, and I can't find useful information online, so I can't figure out what each part of the pipeline is doing.
EDIT
Running with verbose I get this:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, encoding=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:sink: caps = application/x-rtp, encoding=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:src: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
[INFO] bitstreamMode 1, chromaInterleave 0, mapType 0, tiled2LinearEnable 0
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstImxVpuDecoder:imxvpudecoder0.GstPad:sink: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:sink: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstImxVpuDecoder:imxvpudecoder0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0.GstProxyPad:proxypad1: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
I could eventually write the stream of images using the following command:
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! matroskamux ! filesink location=video.mkv
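For frame-by-frame output instead of a single file, a multifilesink variant should also work (an untested sketch; the filename pattern is just an example):
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! multifilesink location=frame%05d.jpg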
"encoding=JPEG" just specifies jpeg at that stage of the pipeline, but that is decoded by decodebin later, resulting in uncompressed, raw video.
You can check which encoders your GStreamer installation supports with
gst-inspect-1.0 | grep enc
This also lists audio encoders. You probably have to install additional GStreamer packages to get any or more encoders, such as gstreamer1.0-plugins-bad or gstreamer1.0-plugins-ugly (the latter contains x264enc).
Then try the pipeline from @Alper again:
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay \
! decodebin ! videoconvert ! x264enc ! avimux ! filesink location=video.avi
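If the pipeline prerolls but never writes data, it may also be worth lowering x264enc's encoder latency, since it buffers a number of frames by default; for a live source something like this could help (an assumption to test, not a confirmed fix for this setup):
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay \
! decodebin ! videoconvert ! x264enc tune=zerolatency ! avimux ! filesink location=video.avi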
I get errors when I run GStreamer audio multicast RTP on a BeagleBone Black.
Here is the GStreamer TX command:
# gst-launch-1.0 -v filesrc location="test.wav" ! decodebin ! audioconvert ! rtpL16pay ! queue ! udpsink host=224.0.0.10 auto-multicast=true port=5555 --gst-debug-level=3
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = audio/x-wav
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = NULL
0:00:00.300210292 23288 0x75490 FIXME default gstutils.c:3648:gst_pad_create_stream_id_printf_valist: Creating random stream-id, consider implementing a deterministic way of creating a stream-id
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstWavParse:wavparse0.GstPad:src: caps = audio/x-raw, format=(string)S16LE, layout=(string)interleaved, channels=(int)2, channel-mask=(bitmask)0x0000000000000003, rate=(int)48000
/GstPipeline:pipeline0/GstAudioConvert:audioconvert0.GstPad:src: caps = audio/x-raw, layout=(string)interleaved, rate=(int)48000, format=(string)S16BE, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0.GstPad:src: caps = application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96, ssrc=(uint)3613934853, timestamp-offset=(uint)2678744220, seqnum-offset=(uint)28780
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96, ssrc=(uint)3613934853, timestamp-offset=(uint)2678744220, seqnum-offset=(uint)28780
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96, ssrc=(uint)3613934853, timestamp-offset=(uint)2678744220, seqnum-offset=(uint)28780
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96, ssrc=(uint)3613934853, timestamp-offset=(uint)2678744220, seqnum-offset=(uint)28780
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0.GstPad:sink: caps = audio/x-raw, layout=(string)interleaved, rate=(int)48000, format=(string)S16BE, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
/GstPipeline:pipeline0/GstAudioConvert:audioconvert0.GstPad:sink: caps = audio/x-raw, format=(string)S16LE, layout=(string)interleaved, channels=(int)2, channel-mask=(bitmask)0x0000000000000003, rate=(int)48000
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0.GstProxyPad:proxypad1: caps = audio/x-raw, format=(string)S16LE, layout=(string)interleaved, channels=(int)2, channel-mask=(bitmask)0x0000000000000003, rate=(int)48000
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0: timestamp = 2678744220
/GstPipeline:pipeline0/GstRtpL16Pay:rtpl16pay0: seqnum = 28780
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
And the GStreamer RX command:
# gst-launch-1.0 -v udpsrc multicast-group=224.0.0.10 auto-multicast=true port=5555 caps='application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96' ! rtpL16depay ! audioconvert ! alsasink --gst-debug-level=3
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstUDPSrc:udpsrc0.GstPad:src: caps = application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96
/GstPipeline:pipeline0/GstRtpL16Depay:rtpl16depay0.GstPad:src: caps = audio/x-raw, format=(string)S16BE, layout=(string)interleaved, rate=(int)48000, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
0:00:00.357371500 23249 0xefb20 WARN alsa conf.c:4694:snd_config_expand: alsalib error: Unknown parameters {AES0 0x02 AES1 0x82 AES2 0x00 AES3 0x02}
0:00:00.358214833 23249 0xefb20 WARN alsa pcm.c:2239:snd_pcm_open_noupdate: alsalib error: Unknown PCM default:{AES0 0x02 AES1 0x82 AES2 0x00 AES3 0x02}
/GstPipeline:pipeline0/GstAudioConvert:audioconvert0.GstPad:src: caps = audio/x-raw, layout=(string)interleaved, rate=(int)48000, format=(string)S16LE, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
/GstPipeline:pipeline0/GstAlsaSink:alsasink0.GstPad:sink: caps = audio/x-raw, layout=(string)interleaved, rate=(int)48000, format=(string)S16LE, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
/GstPipeline:pipeline0/GstAudioConvert:audioconvert0.GstPad:sink: caps = audio/x-raw, format=(string)S16BE, layout=(string)interleaved, rate=(int)48000, channels=(int)2, channel-mask=(bitmask)0x0000000000000003
/GstPipeline:pipeline0/GstRtpL16Depay:rtpl16depay0.GstPad:sink: caps = application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96
0:00:08.472125917 23249 0xefb20 WARN audiobasesink gstaudiobasesink.c:1593:gst_audio_base_sink_get_alignment: Unexpected discontinuity in audio timestamps of +0:00:00.010583333, resyncing
0:00:51.970270047 23249 0xefb20 WARN audiobasesink gstaudiobasesink.c:1593:gst_audio_base_sink_get_alignment: Unexpected discontinuity in audio timestamps of +0:00:00.002729166, resyncing
0:00:51.982288589 23249 0xefb20 WARN audiobasesink gstaudiobasesink.c:1593:gst_audio_base_sink_get_alignment: Unexpected discontinuity in audio timestamps of +0:00:00.010437500, resyncing
0:00:52.010382256 23249 0xefb20 WARN audiobasesink gstaudiobasesink.c:1593:gst_audio_base_sink_get_alignment: Unexpected discontinuity in audio timestamps of +0:00:00.006937500, resyncing
0:00:52.029231922 23249 0xefb20 WARN audiobasesink gstaudiobasesink.c:1593:gst_audio_base_sink_get_alignment: Unexpected discontinuity in audio timestamps of +0:00:00.007354166, resyncing
As you can see above, warnings are printed to the console, and the audio is crackling and noisy:
audiobasesink gstaudiobasesink.c:1593:gst_audio_base_sink_get_alignment: Unexpected discontinuity in audio timestamps of +x:xx:xx.xxxxxxxxx, resyncing
Any ideas on how to solve this issue?
Thank you for reading.
Note that both GStreamer commands run on the same machine; it effectively acts as a local audio loopback.
I think you want an rtpjitterbuffer on the receiver side. UDP does not guarantee the order or timing of packet arrival, and this element takes care of reordering and smoothing out the stream.
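A sketch of your receiver with a jitter buffer inserted (the latency value, in milliseconds, is just a starting point to tune):
gst-launch-1.0 -v udpsrc multicast-group=224.0.0.10 auto-multicast=true port=5555 caps='application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)L16, encoding-params=(string)2, channels=(int)2, payload=(int)96' ! rtpjitterbuffer latency=200 ! rtpL16depay ! audioconvert ! alsasink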
I am trying to forward video between two GStreamer pipelines using shmsink/shmsrc, and to have the receiving side encode the video.
The following is a command line for the sending side:
gst-launch-0.10 -v videotestsrc \
! 'video/x-raw-yuv, format=(fourcc)"I420", framerate=30/1, width=1280, height=720' \
! shmsink socket-path=/tmp/xxx shm-size=10000000 wait-for-connection=0 sync=false
The following is a command line for the receiving side:
gst-launch-0.10 -v shmsrc socket-path=/tmp/xxx \
! 'video/x-raw-yuv, format=(fourcc)"I420", framerate=30/1, width=1280, height=720' \
! x264enc \
! filesink location=/tmp/yyy
The problem is that nothing is recorded; it seems that the pipeline is not rolling. Below is the output:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, framerate=(fraction)30/1, width=(int)1280, height=(int)720
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:src: caps = video/x-h264, width=(int)1280, height=(int)720, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, codec_data=(buffer)014d401fffe10018674d401feca02802dd8088000003000bb9aca00078c18cb001000468ebecb2, stream-format=(string)avc, alignment=(string)au, level=(string)3.1, profile=(string)main
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, framerate=(fraction)30/1, width=(int)1280, height=(int)720
When I remove x264enc, as below, the pipeline rolls and the output file /tmp/yyy grows.
gst-launch-0.10 -v shmsrc socket-path=/tmp/xxx \
! 'video/x-raw-yuv, format=(fourcc)"I420", framerate=30/1, width=1280, height=720' \
! filesink location=/tmp/yyy
Interestingly, the output below shows "New clock: GstSystemClock", which was not shown previously.
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)I420, framerate=(fraction)30/1, width=(int)1280, height=(int)720
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)I420, framerate=(fraction)30/1, width=(int)1280, height=(int)720
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
I have no idea why the pipeline does not work with x264enc. Any help would be really appreciated.
The buffers output by shmsrc are not aligned to video frame boundaries, which anything consuming video/x-raw caps requires, so x264enc is likely never handed complete frames to encode.
With GStreamer 1.0, the rawvideoparse element was added to gather complete video frames and push them downstream. I don't believe GStreamer 0.10 has that element available.
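Under GStreamer 1.0, a receiving pipeline along these lines should work (an untested sketch; the caps values mirror your sending side):
gst-launch-1.0 -v shmsrc socket-path=/tmp/xxx \
! rawvideoparse format=i420 width=1280 height=720 framerate=30/1 \
! x264enc ! filesink location=/tmp/yyy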