I want to use GStreamer (gst-launch-1.0) to stream a video signal from a camera connected to a Raspberry Pi (CM4) to a remote client over UDP. The GStreamer pipeline I use always reverts to the uncompressed YUYV pixel format, even after I set the format to MJPG with v4l2-ctl.
This is my pipeline:
pi@cm4:~ $ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=MJPG
pi@cm4:~ $ gst-launch-1.0 -v v4l2src device=/dev/video0 ! "video/x-raw, width=1920, height=1080, pixelformat=MJPG" ! rndbuffersize max=65000 ! udpsink host=127.0.0.1 port=1234
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstRndBufferSize:rndbuffersize0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstRndBufferSize:rndbuffersize0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
Even though the pipeline seems to accept "pixelformat=(string)MJPG", the format is YUY2 and the maximum framerate is 5 fps. If I set the framerate to anything higher than 5/1, it fails with:
** (gst-launch-1.0:16205): CRITICAL **: 21:36:05.076: gst_adapter_take_buffer: assertion 'GST_IS_ADAPTER (adapter)' failed
** (gst-launch-1.0:16205): CRITICAL **: 21:36:05.076: gst_adapter_available: assertion 'GST_IS_ADAPTER (adapter)' failed
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
After the GStreamer pipeline runs, v4l2-ctl confirms that the video format has reverted to YUYV.
How can I force the gstreamer pipeline to use MJPG 1920x1080 and enable higher frame rates?
The camera is a Canon 5D Mark IV that produces a clean HDMI output up to full HD at 60 fps. The camera's HDMI output is connected to an HDMI-to-USB video capture device (Mirabox) that supports 1920x1080 at 60 fps. The capture device is connected to the CM4 via a USB3-PCIe adapter.
This is the list of supported formats:
pi@cm4:~ $ v4l2-ctl -d 0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
    Type: Video Capture

    [0]: 'MJPG' (Motion-JPEG, compressed)
        Size: Discrete 1920x1080
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.040s (25.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
        [..... deleted lower resolution formats...]
    [1]: 'YUYV' (YUYV 4:2:2)
        Size: Discrete 1920x1080
            Interval: Discrete 0.200s (5.000 fps)
        Size: Discrete 1600x1200
            Interval: Discrete 0.200s (5.000 fps)
        [..... deleted lower resolution formats...]
Setting the pixel format this way is actually incorrect. MJPEG is not a pixel format for "raw" video: with video/x-raw caps, v4l2src negotiates an uncompressed format (here YUY2) and the pixelformat field does nothing. Request image/jpeg caps instead. Try
v4l2src device=/dev/video0 ! image/jpeg, width=1920, height=1080, framerate=30/1 ! ..
Note that the capture device will return JPEG image data, so you will need a JPEG decoder if you want to display the image.
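As a hedged, untested sketch (not from the original answer): a complete sender/receiver pair along those lines could swap the rndbuffersize chunking for RTP JPEG payloading, so the receiver can reassemble frames:
gst-launch-1.0 v4l2src device=/dev/video0 ! "image/jpeg, width=1920, height=1080, framerate=30/1" ! rtpjpegpay ! udpsink host=127.0.0.1 port=1234
and on the receiving side:
gst-launch-1.0 udpsrc port=1234 ! "application/x-rtp, media=video, clock-rate=90000, encoding-name=JPEG, payload=26" ! rtpjpegdepay ! jpegdec ! autovideosink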
Related
I am trying to stream video from a Raspberry Pi 4 with a Raspberry Pi Camera Module 2 to a laptop, and the results vary depending on the interface (Ethernet or WiFi) and also on the router I am using.
On the Raspberry Pi 4 (bullseye, GStreamer 1.18.4) I am running the following command
gst-launch-1.0 libcamerasrc ! video/x-raw, width=1280, height=720 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host=192.168.0.100 port=5200
On the laptop (bullseye, gstreamer 1.18.4) I am accessing the stream as follows
gst-launch-1.0 -v udpsrc port=5200 ! application/x-rtp, media=video, clock-rate=90000, payload=96 ! rtpjpegdepay ! jpegdec ! videoconvert ! queue ! fpsdisplaysink
When both devices are connected over WiFi to an access point that supports WiFi 5 (TP-Link Archer MR600), I get very smooth video on the laptop at 30 FPS with almost no dropped frames. However, when connecting both devices over WiFi 6 (TP-Link Archer AX50), there are still no dropped frames but the frame rate is 1-3 FPS. Manually lowering the WiFi standard on this router doesn't solve the problem. Connecting both devices over Ethernet also gives 1-3 FPS; only connecting the laptop over Ethernet while the Raspberry Pi stays on WiFi works at 30 FPS.
When streaming I have collected the incoming packets with tcpdump
sudo tcpdump -i wlp0s20f3 -s 0 -w dump.pcap host 192.168.0.201 and udp
and then played them back with
gst-launch-1.0 filesrc location=dump.pcap ! pcapparse ! application/x-rtp, media=video, clock-rate=90000, payload=96 ! rtpjpegdepay ! jpegdec ! videoconvert ! queue ! fpsdisplaysink
which plays the original stream back perfectly at 30 FPS in every one of the cases described above. I only noticed that the WiFi 6 and Ethernet captures contain slightly more rendered frames over the same time period than the WiFi 5 capture.
To me it looks like the video stream arrives correctly but for some reason cannot be rendered live by GStreamer, only after capturing it.
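Not an answer from the original thread, just a hedged experiment: if the stream really does arrive intact, adding an explicit rtpjitterbuffer with a generous latency on the receiver would test whether RTP timing is what breaks live rendering (pcap playback does not depend on it in the same way):
gst-launch-1.0 -v udpsrc port=5200 ! application/x-rtp, media=video, clock-rate=90000, payload=96 ! rtpjitterbuffer latency=200 ! rtpjpegdepay ! jpegdec ! videoconvert ! queue ! fpsdisplaysink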
I am trying to decode an RTSP stream to a V4L2 device on a Raspberry Pi Zero.
First I create a loopback device:
sudo modprobe v4l2loopback video_nr=25
Then I try to decode the RTSP stream to the virtual V4L2 device. I would prefer to sink it in YUYV format:
sudo gst-launch-1.0 rtspsrc location=rtsp://192.168.1.138:8554/unicast ! rtph264depay ! h264parse ! decodebin ! videoconvert ! video/x-raw,format=YUY2 ! tee ! v4l2sink device=/dev/video25
When I inspect the V4L2 device with v4l2-ctl -d 25 --list-formats, I get this:
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture
When I try to play it back with VLC, nothing happens:
sudo cvlc --v4l2-dev=25 --v4l2-width=640 --v4l2-height=480 --v4l2-fps=30 --vout=mmal_vout v4l2:///dev/video25
I suspect the GStreamer pipeline is not correct. I am new to GStreamer and am poking around in the dark a bit here.
Could anybody give me some tips on what I am doing wrong?
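One hedged simplification to try (an untested sketch: the single-branch tee does nothing here and can be dropped, and avdec_h264 is assumed to be installed, to pin the decoder explicitly):
sudo gst-launch-1.0 rtspsrc location=rtsp://192.168.1.138:8554/unicast latency=100 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! video/x-raw,format=YUY2 ! v4l2sink device=/dev/video25
Also note that a loopback device's negotiated format shows up on the output side, i.e. under v4l2-ctl -d 25 --list-formats-out rather than --list-formats.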
I have a Kodak PIXPRO SP360 4K camera connected to a Jetson Nano or TX2 via USB cable.
I want to be able to view that video in a browser, either via an RTSP stream, WebRTC, or something else.
It doesn't matter which technology it uses, as long as it works. So if you have any ideas or suggestions, feel free to share them.
I'm currently trying to run the basic setup.
./test-launch "nvarguscamerasrc ! video/x-raw(memory:NVMM), format=NV12, width=1920, height=1080, framerate=30/1 ! nvvidconv ! video/x-raw, width=640, height=480, format=NV12, framerate=30/1 ! omxh265enc ! rtph265pay name=pay0 pt=96 config-interval=1"
and
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! queue ! decodebin ! videoconvert ! xvimagesink
and I'm getting this error:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://127.0.0.1:8554/test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Could not get/set settings from/on resource.
Additional debug info:
gstrtspsrc.c(6999): gst_rtspsrc_setup_streams_start (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
SDP contains no streams
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
(test-launch:22440): GLib-GObject-WARNING **: 11:36:46.018: invalid cast from 'GstRtpH265Pay' to 'GstBin'
(test-launch:22440): GStreamer-CRITICAL **: 11:36:46.018: gst_bin_get_by_name: assertion 'GST_IS_BIN (bin)' failed
(test-launch:22440): GLib-GObject-WARNING **: 11:36:46.018: invalid cast from 'GstRtpH265Pay' to 'GstBin'
(test-launch:22440): GStreamer-CRITICAL **: 11:36:46.018: gst_bin_get_by_name: assertion 'GST_IS_BIN (bin)' failed
(test-launch:22440): GLib-GObject-WARNING **: 11:36:46.018: invalid cast from 'GstRtpH265Pay' to 'GstBin'
(test-launch:22440): GStreamer-CRITICAL **: 11:36:46.018: gst_bin_get_by_name: assertion 'GST_IS_BIN (bin)' failed
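One hedged observation, not a confirmed fix: the gst-rtsp-server examples pass the pipeline description wrapped in parentheses so it is parsed as a bin (compare the test-launch invocation further down this page), and the "invalid cast from 'GstRtpH265Pay' to 'GstBin'" warnings are consistent with that wrapping being absent. The same command with the parentheses added would be:
./test-launch "( nvarguscamerasrc ! video/x-raw(memory:NVMM), format=NV12, width=1920, height=1080, framerate=30/1 ! nvvidconv ! video/x-raw, width=640, height=480, format=NV12, framerate=30/1 ! omxh265enc ! rtph265pay name=pay0 pt=96 config-interval=1 )"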
I have also tried an approach that worked for me on a PC, but I can't get it to work on the Jetson. The setup goes as follows.
Download Streameye from https://github.com/ccrisan/streameye and run:
netcat -l 8700 | ./streameye -p 1337
To send the webcam stream I run:
gst-launch-1.0 v4l2src device=/dev/video0 ! decodebin ! videoconvert ! videoscale ! videorate ! jpegenc quality=30 ! tcpclientsink host=127.0.0.1 port=8700
After this I get:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3064): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:03.944998186
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
The output of this command for my camera is:
v4l2-ctl -d /dev/video1 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
    Index       : 0
    Type        : Video Capture
    Pixel Format: 'MJPG' (compressed)
    Name        : Motion-JPEG
        Size: Discrete 3840x2160
            Interval: Discrete 0.200s (5.000 fps)
        Size: Discrete 2880x2880
            Interval: Discrete 0.200s (5.000 fps)
        Size: Discrete 2048x2048
            Interval: Discrete 0.200s (5.000 fps)
        Size: Discrete 1440x1440
            Interval: Discrete 0.067s (15.000 fps)
            Interval: Discrete 0.200s (5.000 fps)
        Size: Discrete 1920x1080
            Interval: Discrete 0.067s (15.000 fps)
            Interval: Discrete 0.200s (5.000 fps)
        Size: Discrete 1280x720
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.067s (15.000 fps)
            Interval: Discrete 0.200s (5.000 fps)
        Size: Discrete 640x360
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.067s (15.000 fps)
            Interval: Discrete 0.200s (5.000 fps)
Run your pipeline with -v like this and show the result:
gst-launch-1.0 v4l2src device=/dev/video0 ! decodebin ! videoconvert ! videoscale ! videorate ! jpegenc quality=30 ! tcpclientsink host=127.0.0.1 port=8700 -v
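While waiting for the verbose log, a hedged guess: the formats above were listed for /dev/video1 while the pipeline opens /dev/video0, so the device node is worth double-checking. Beyond that, since the camera only advertises MJPG, forcing image/jpeg caps and decoding explicitly (instead of relying on decodebin's negotiation) might get past the not-negotiated error. An untested sketch:
gst-launch-1.0 v4l2src device=/dev/video1 ! image/jpeg, width=1280, height=720, framerate=30/1 ! jpegdec ! videoconvert ! videoscale ! videorate ! jpegenc quality=30 ! tcpclientsink host=127.0.0.1 port=8700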
If you want to stream it, the simplest way is to use gst-rtsp-launch, which is part of the GStreamer prebuilt binaries:
gst-rtsp-launch '( v4l2src device=/dev/video0 ! videoconvert ! queue ! x264enc tune="zerolatency" byte-stream=true bitrate=10000 ! rtph264pay name=pay0 pt=96 )'
Later on you can tune the codec and bitrate, but for me this is enough (playable in VLC at rtsp://127.0.0.1:8554/test).
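For the MJPG-only camera in this question, a hedged variant of that pipeline that requests the compressed format and decodes it before re-encoding (same sketch caveats as above, including the /dev/video1 assumption):
gst-rtsp-launch '( v4l2src device=/dev/video1 ! image/jpeg, width=1280, height=720, framerate=30/1 ! jpegdec ! videoconvert ! queue ! x264enc tune="zerolatency" byte-stream=true bitrate=10000 ! rtph264pay name=pay0 pt=96 )'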
I have built GStreamer, the GStreamer RTSP Server, and some related plugins for streaming via RTSP. The GStreamer RTSP Server examples can use sources such as a webcam (/dev/video0) with v4l2src, videotestsrc, or an .MP4 file with filesrc.
So, how can I stream from the framebuffer (/dev/fb0) via RTSP?
You can grab framebuffers with GStreamer.
Here is an example:
gst-launch-1.0 -v multifilesrc location=/dev/fb0 ! videoparse format=29 width=1680 height=1080 framerate=30/1 ! decodebin ! videoconvert ! autovideosink sync=false
You then have to adapt it to your RTSP application.
I typed this command in gst-rtsp-server/examples:
sudo ./test-launch "( multifilesrc location=/dev/fb0 ! videoparse format=29 framerate=30/1 ! decodebin ! videoconvert ! x264enc ! rtph264pay name=pay0 pt=96 )"
But I got this error:
stream ready at rtsp://127.0.0.1:8554/test
x264 [error]: baseline profile doesn't support 4:4:4
Viewing it with VLC:
vlc rtsp://127.0.0.1:8554/test
I only get a black screen.
Framebuffer info:
mode "1280x720"
geometry 1280 720 1280 720 32
timings 0 0 0 0 0 0 0
rgba 8/0,8/8,8/16,8/24
endmode
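The x264 message is the clue here: the RGBA framebuffer reaches x264enc as 4:4:4 chroma, which the baseline profile rejects. A hedged, untested sketch that forces a 4:2:0 conversion before encoding, passes the actual framebuffer geometry (1280x720 per the info above) to videoparse, and drops the redundant decodebin:
sudo ./test-launch "( multifilesrc location=/dev/fb0 ! videoparse format=29 width=1280 height=720 framerate=30/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"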
I have an MPEG-TS video file recorded by a SiliconDust HDHomeRun tuner. The pipeline I currently have:
gst-launch-0.10 filesrc location=filename.mpg ! decodebin name=decoder decoder. ! queue ! audioconvert ! audioresample ! alsasink device=front decoder. ! deinterlace ! ffmpegcolorspace ! glimagesink
works well, except that it does not play all of the audio channels. I found this out tonight when I recorded a preseason football game and the announcers were not audible while the ref and the crowd noise were. The same file plays fine with all audio channels in xine.
Here is the output of ffmpeg, which describes the streams:
Stream #0:0[0x31]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p, 1280x720 [SAR 1:1 DAR 16:9], 14950 kb/s, 64.96 fps, 59.94 tbr, 90k tbn, 119.88 tbc
Stream #0:1[0x34](eng): Audio: ac3 (AC-3 / 0x332D4341), 48000 Hz, 5.1(side), s16, 448 kb/s
Stream #0:2[0x35](spa): Audio: ac3 (AC-3 / 0x332D4341), 48000 Hz, stereo, s16, 192 kb/s (visual impaired)
How can I get all audio channels to playback from a surround sound mpeg in gstreamer?
Extra info:
Linux OS
ALSA sound system
Update:
This problem is actually quite strange. Randomly it plays back all the required channels, and I'll think I have found the solution, but then the newfound solution stops working and some of the audio channels are missing again.
Even playbin2 is randomly including and excluding these channels:
gst-launch-0.10 -v playbin2 uri=file:filename.mpg
I just submitted a bug report on bugzilla.gnome.org after determining that the intermittent behavior was also present using playbin2.
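Not from the original thread, but given the ffmpeg output above (English AC-3 on PID 0x34, Spanish on 0x35), one hedged workaround is to bypass decodebin's arbitrary stream pick and request the desired PID from the demuxer directly. The mpegtsdemux pad names below follow the 0.10-era audio_%04x/video_%04x convention but are a guess, so treat this purely as a sketch:
gst-launch-0.10 filesrc location=filename.mpg ! mpegtsdemux name=demux demux.audio_0034 ! queue ! a52dec ! audioconvert ! audioresample ! alsasink device=front demux.video_0031 ! queue ! mpeg2dec ! deinterlace ! ffmpegcolorspace ! glimagesink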