Stream webcam with GStreamer (RTSP)
I have a Kodak PIXPRO SP360 4K camera connected to a Jetson Nano or TX2 via USB cable.
I want to be able to see that video in a browser, either as an RTSP stream, WebRTC, or something else.
It doesn't matter how it works in terms of technology, as long as it works, so if you have any ideas or suggestions feel free to share them.
I'm currently trying to run the basic setup.
./test-launch "nvarguscamerasrc ! video/x-raw(memory:NVMM), format=NV12, width=1920, height=1080, framerate=30/1 ! nvvidconv ! video/x-raw, width=640, height=480, format=NV12, framerate=30/1 ! omxh265enc ! rtph265pay name=pay0 pt=96 config-interval=1"
and
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! queue ! decodebin ! videoconvert ! xvimagesink
and I'm getting the following error:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://127.0.0.1:8554/test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Could not get/set settings from/on resource.
Additional debug info:
gstrtspsrc.c(6999): gst_rtspsrc_setup_streams_start (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
SDP contains no streams
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
(test-launch:22440): GLib-GObject-WARNING **: 11:36:46.018: invalid cast from 'GstRtpH265Pay' to 'GstBin'
(test-launch:22440): GStreamer-CRITICAL **: 11:36:46.018: gst_bin_get_by_name: assertion 'GST_IS_BIN (bin)' failed
(test-launch:22440): GLib-GObject-WARNING **: 11:36:46.018: invalid cast from 'GstRtpH265Pay' to 'GstBin'
(test-launch:22440): GStreamer-CRITICAL **: 11:36:46.018: gst_bin_get_by_name: assertion 'GST_IS_BIN (bin)' failed
(test-launch:22440): GLib-GObject-WARNING **: 11:36:46.018: invalid cast from 'GstRtpH265Pay' to 'GstBin'
(test-launch:22440): GStreamer-CRITICAL **: 11:36:46.018: gst_bin_get_by_name: assertion 'GST_IS_BIN (bin)' failed
I have also tried an approach that worked for me on a PC, but I can't get it to work on the Jetson. The setup goes as follows.
Download Streameye from https://github.com/ccrisan/streameye and run:
netcat -l 8700 | ./streameye -p 1337
To send the webcam stream I run:
gst-launch-1.0 v4l2src device=/dev/video0 ! decodebin ! videoconvert ! videoscale ! videorate ! jpegenc quality=30 ! tcpclientsink host=127.0.0.1 port=8700
After this I get:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3064): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:03.944998186
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Output of this command for my camera is:
v4l2-ctl -d /dev/video1 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
Index : 0
Type : Video Capture
Pixel Format: 'MJPG' (compressed)
Name : Motion-JPEG
Size: Discrete 3840x2160
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 2880x2880
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 2048x2048
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 1440x1440
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 1920x1080
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 1280x720
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 640x360
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.200s (5.000 fps)
Run your pipeline with -v like this and show the result:
gst-launch-1.0 v4l2src device=/dev/video0 ! decodebin ! videoconvert ! videoscale ! videorate ! jpegenc quality=30 ! tcpclientsink host=127.0.0.1 port=8700 -v
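Looking at the v4l2-ctl output above, the camera only advertises MJPG, so the not-negotiated error may simply be a caps mismatch between v4l2src and the rest of the chain (a guess until the -v output confirms it). A sketch that requests the JPEG caps explicitly and decodes them before re-encoding for streameye; adjust the device node (the format listing above was taken from /dev/video1):
gst-launch-1.0 v4l2src device=/dev/video1 ! image/jpeg,width=1280,height=720,framerate=30/1 ! jpegdec ! videoconvert ! videoscale ! videorate ! jpegenc quality=30 ! tcpclientsink host=127.0.0.1 port=8700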
If you want to stream it, the simplest way is to use gst-rtsp-launch, which is part of the prebuilt GStreamer binaries:
gst-rtsp-launch '( v4l2src device=/dev/video0 ! videoconvert ! queue ! x264enc tune="zerolatency" byte-stream=true bitrate=10000 ! rtph264pay name=pay0 pt=96 )'
Later on you can tune the codec and bitrate, but for me this is enough (playable in VLC at rtsp://127.0.0.1:8554/test).
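If you would rather keep using test-launch from the gst-rtsp-server examples, a similar pipeline can be passed to it. Two assumptions on my side, not confirmed in this thread: nvarguscamerasrc targets CSI cameras through the Argus API, so a USB UVC camera like the PIXPRO is normally captured with v4l2src instead, and the gst_rtsp_media_factory_set_launch documentation asks for the description to be enclosed in "( )" so it parses into a bin containing the pay0 payloader, which may be related to the "invalid cast ... to 'GstBin'" warnings above. A sketch along those lines (device node, resolution and x264enc availability assumed):
./test-launch "( v4l2src device=/dev/video1 ! image/jpeg,width=1280,height=720,framerate=30/1 ! jpegdec ! videoconvert ! x264enc tune=zerolatency bitrate=4000 ! rtph264pay name=pay0 pt=96 )"
It should then be viewable the same way, e.g. vlc rtsp://<jetson-ip>:8554/test.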
Related
GStreamer pipeline stops playing after fast and shaky camera movement
I'm working on a video-streaming wearable device. During tests it turned out that the pipeline clock and the stream stop while fast walking or running. It's bizarre behaviour, because the debug messages show no errors about a broken pipeline, only lost frames. It just freezes and only a restart helps. Can you guess what causes the problem? The pipelines I use:
Streaming device:
gst-launch-1.0 -vem --gst-debug=3 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=\(fraction\)30/1 ! v4l2h264enc extra-controls=s,video_bitrate=250000 capture-io-mode=4 output-io-mode=4 ! "video/x-h264,level=(string)4" ! rtph264pay config-interval=1 ! multiudpsink clients="127.0.0.1:5008,10.123.0.2:5008"
Client:
udpsrc port=5008 do-timestamp=true ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96 ! rtpjitterbuffer latency=100 drop-on-latency=true drop-messages-interval=100000000 ! queue max-size-buffers=20000 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! glupload ! qmlglsink name=qmlglsink sync=false
The hardware is a PS3 Eye cam and an LTE modem with a fairly low uplink of 1-2 Mbit/s, everything running on a Raspberry Pi 3B+ with 1 GB of RAM. For more debug info there are also pictures of the log file after the last registered dropped frame: every following "cycle" sends a new query that loops over the GstElements from sink to source (my camera) and ends with the maximum query duration (the highlighted query to v4l2src). Do you know how to overcome this problem?
The problem has been resolved. The issue was not the variable encoder bitrate. A more detailed inspection and the pipeline that works for me are in this GStreamer issue page.
setting the video format in v4l2 for streaming with gstreamer
I want to use gstreamer (gst-launch-1.0) to stream a video signal from a camera connected to a Raspberry Pi (CM4) to a remote client over UDP. The gstreamer pipelines that I use always revert to the uncompressed YUYV pixel format, even after I set the format to MJPG with v4l2. This is my pipeline:
pi#cm4:~ $ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=MJPG
pi#cm4:~ $ gst-launch-1.0 -v v4l2src device=/dev/video0 ! "video/x-raw, width=1920, height=1080, pixelformat=MJPG" ! rndbuffersize max=65000 ! udpsink host=127.0.0.1 port=1234
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstRndBufferSize:rndbuffersize0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstRndBufferSize:rndbuffersize0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, pixelformat=(string)MJPG, format=(string)YUY2, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1, colorimetry=(string)2:4:5:1, interlace-mode=(string)progressive
Even though the pipeline seems to accept "pixelformat=(string)MJPG", the format is YUY2 and the maximum framerate is 5 fps. If I set the framerate to anything higher than 5/1, it fails with:
** (gst-launch-1.0:16205): CRITICAL **: 21:36:05.076: gst_adapter_take_buffer: assertion 'GST_IS_ADAPTER (adapter)' failed
** (gst-launch-1.0:16205): CRITICAL **: 21:36:05.076: gst_adapter_available: assertion 'GST_IS_ADAPTER (adapter)' failed
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
After the gstreamer pipeline has run, v4l2-ctl confirms that the video format reverted to YUYV. How can I force the gstreamer pipeline to use MJPG at 1920x1080 and enable higher frame rates? The camera is a Canon 5D iv that produces a clean HDMI output up to full HD at 60 fps. The camera HDMI output is connected to an HDMI-to-USB video capture box (Mirabox) that supports 1920x1080 at 60 fps. The video capture box is connected to the CM4 via a USB3-PCIe adapter.
This is the list of supported formats:
pi#cm4:~ $ v4l2-ctl -d 0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture
[0]: 'MJPG' (Motion-JPEG, compressed)
Size: Discrete 1920x1080
Interval: Discrete 0.017s (60.000 fps)
Interval: Discrete 0.033s (30.000 fps)
Interval: Discrete 0.040s (25.000 fps)
Interval: Discrete 0.050s (20.000 fps)
Interval: Discrete 0.100s (10.000 fps)
[..... deleted lower resolution formats...]
[1]: 'YUYV' (YUYV 4:2:2)
Size: Discrete 1920x1080
Interval: Discrete 0.200s (5.000 fps)
Size: Discrete 1600x1200
Interval: Discrete 0.200s (5.000 fps)
[..... deleted lower resolution formats...]
Setting the pixel format that way is actually incorrect here: MJPEG is not a pixel format for "raw" video. Try:
v4l2src device=/dev/video0 ! image/jpeg, width=1920, height=1080, framerate=30/1 ! ..
Note that the camera will return JPEG image data, so you will need a JPEG decoder if you want to display the image.
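For completeness, a minimal end-to-end sketch of the suggestion above, with stock GStreamer element names assumed (jpegdec for decoding, autovideosink as a generic display sink):
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg, width=1920, height=1080, framerate=30/1 ! jpegdec ! videoconvert ! autovideosink
For the original UDP use case, the JPEG frames could also be payloaded directly instead of being chunked as raw buffers, e.g. replacing the tail with ! rtpjpegpay ! udpsink host=127.0.0.1 port=1234.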
Need to decode a RTSP stream to V4l2loopback with gstreamer
I am trying to decode an RTSP stream to a V4L2 device on a Raspberry Pi Zero. First I create a loopback device this way:
sudo modprobe v4l2loopback video_nr=25
Then I try to decode the RTSP stream to the virtual V4L2 device. I would prefer to sink it in YUYV format:
sudo gst-launch-1.0 rtspsrc location=rtsp://192.168.1.138:8554/unicast ! rtph264depay ! h264parse ! decodebin ! videoconvert ! video/x-raw,format=YUY2 ! tee ! v4l2sink device=/dev/video25
When I inspect the V4L2 device with v4l2-ctl -d 25 --list-formats, I get this:
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture
When I try to play it back with VLC, nothing happens:
sudo cvlc --v4l2-dev=25 --v4l2-width=640 --v4l2-height=480 --v4l2-fps=30 --vout=mmal_vout v4l2:///dev/video25
I suspect the gstreamer pipeline is not correct. I am new to gstreamer and I am poking a little bit in the dark here. Could anybody give me some tips on what I am doing wrong?
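Not a definitive answer, but a simplification worth trying (a sketch, not verified on a Pi Zero): drop the tee, which has only one branch here, decode explicitly with avdec_h264, and let v4l2sink negotiate the YUY2 format with the loopback device:
sudo gst-launch-1.0 rtspsrc location=rtsp://192.168.1.138:8554/unicast latency=100 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! video/x-raw,format=YUY2 ! v4l2sink device=/dev/video25
If v4l2-ctl -d 25 --list-formats still lists no formats afterwards, the pipeline probably never reached PLAYING, and running it with -v should show where negotiation stops.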
Using gstreamer to stream from Framebufferr via RTSP
I have built GStreamer, GStreamer RTSP Server, and some related plugins for streaming via RTSP. The GStreamer RTSP Server examples can use sources such as a webcam (/dev/video0) with v4l2src, videotestsrc, or an .MP4 file with filesrc. So, how can I stream from the framebuffer (/dev/fb0) via RTSP?
You can grab framebuffers with GStreamer. Here is an example:
gst-launch-1.0 -v multifilesrc location=/dev/fb0 ! videoparse format=29 width=1680 height=1080 framerate=30/1 ! decodebin ! videoconvert ! autovideosink sync=false
You then have to adapt it to your RTSP application.
I typed this command in /gst-rtsp-server/examples:
sudo ./test-launch "( multifilesrc location=/dev/fb0 ! videoparse format=29 framerate=30/1 ! decodebin ! videoconvert ! x264enc ! rtph264pay name=pay0 pt=96 )"
But I got this error:
stream ready at rtsp://127.0.0.1:8554/test
x264 [error]: baseline profile doesn't support 4:4:4
Viewing with VLC:
vlc rtsp://127.0.0.1:8554/test
it's only a black screen. Framebuffer info:
mode "1280x720"
geometry 1280 720 1280 720 32
timings 0 0 0 0 0 0 0
rgba 8/0,8/8,8/16,8/24
endmode
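The x264 message is the main hint here: "baseline profile doesn't support 4:4:4" means the encoder is being offered a 4:4:4-style format, which is what videoconvert tends to negotiate when it is fed RGB framebuffer data. Forcing a 4:2:0 format such as I420 in front of x264enc should at least let it encode. A hedged sketch, keeping the rest of your command (width/height taken from the fbset output above; note that the videoparse format value must also match your 32-bit framebuffer layout, which is worth double-checking with gst-inspect-1.0 videoparse):
sudo ./test-launch "( multifilesrc location=/dev/fb0 ! videoparse format=29 width=1280 height=720 framerate=30/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"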
Minimize gstreamer CPU usage streaming filesrc as RTSP to udpsink
I stream h264 content as rtsp from a filesrc to a udpsink:
gst-launch-0.10 filesrc location=./test.movie ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=9004
This pipeline works well but gives quite a high CPU load. My question is whether there is any way of minimizing the CPU load by limiting the processing rate. It feels like, since there is a filesrc and a udpsink, there is really nothing stopping gstreamer from working as quickly as it can: a 35-second movie takes less than 0.5 seconds for gstreamer to stream with the given pipeline.
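One idea, heavily hedged because I have not verified that h264parse in GStreamer 0.10 will timestamp a raw bytestream this way: if the parser knows the framerate it can put timestamps on the buffers, and a sink that syncs to the clock will then push packets at playback rate instead of as fast as the file can be read, which should also cut the CPU load. A sketch assuming a 25 fps stream (adjust to your movie):
gst-launch-0.10 filesrc location=./test.movie ! h264parse ! video/x-h264,framerate=25/1 ! rtph264pay ! udpsink host=127.0.0.1 port=9004 sync=true
If the parser does not produce usable timestamps, the pipeline will still run flat-out, so this is only a sketch of the direction (sync the sink, give the stream a known framerate), not a confirmed fix.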