I am trying to stream a live video feed from a camera connected to a Jetson NX to a computer on the same network. The network works as wireless Ethernet, meaning the Jetson sees it as a wired connection, but in reality it is wireless and limited in bitrate.
On the Jetson side, I am using OpenCV's VideoWriter to send frames over the network with this pipeline:
cv::VideoWriter video_write(
    "appsrc ! video/x-raw,format=BGR ! queue ! videoconvert "
    "! video/x-raw,format=BGRx ! nvvidconv "
    "! video/x-raw(memory:NVMM),format=NV12,width=640,height=360,framerate=30/1 "
    "! nvv4l2h264enc insert-sps-pps=1 insert-vui=1 idrinterval=30 "
    "bitrate=1800000 EnableTwopassCBR=1 "
    "! h264parse ! rtph264pay ! udpsink host=169.254.84.2 port=5004 auto-multicast=0",
    cv::CAP_GSTREAMER, 30, cv::Size(640, 360));
On the receiving computer, my VideoCapture is:
cv::VideoCapture video(
    "udpsrc port=5004 auto-multicast=0 "
    "! application/x-rtp,media=video,encoding-name=H264 ! rtpjitterbuffer latency=0 "
    "! rtph264depay ! decodebin ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1",
    cv::CAP_GSTREAMER);
The problem is that if the camera jitters or moves a lot in any direction, the video stream either freezes or completely pixelates. I was hoping to get suggestions for a better encoder (I am not limited to nvv4l2h264enc, or to H.264 in general), a solution for the pixelation and freezing, or even a better way to stream the video than VideoWriter.
I am trying to stream 360p video at 30 fps; my bitrate is limited to either 6 Mbps or 2.5 Mbps, depending on the distance I want to limit myself to. It does not seem to be a network problem, simply because changing the codec (MJPEG instead of H.264, for example) changes the behaviour of the video feed: in my case it lowers the amount of freezing but makes the pixelation worse.
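Edit: one mitigation I am looking at is capping the encoder's virtual buffer so that no single frame (especially an IDR frame) can burst far above the CBR target, since those bursts are what overflow a rate-limited link and produce exactly this freeze-then-pixelate pattern. A minimal sketch, assuming the vbv-size and control-rate properties exist on my L4T release (verify with gst-inspect-1.0 nvv4l2h264enc); the bitrate, mtu, and vbv-size numbers are illustrative, with vbv-size = bitrate/framerate as the usual rule of thumb:

#include <opencv2/opencv.hpp>

int main() {
    // Sender sketch: CBR (control-rate=1) with a small virtual buffer bounds
    // per-frame size, and mtu=1200 keeps RTP packets below common wireless
    // fragmentation limits.
    cv::VideoWriter writer(
        "appsrc ! video/x-raw,format=BGR ! queue ! videoconvert "
        "! video/x-raw,format=BGRx ! nvvidconv "
        "! video/x-raw(memory:NVMM),format=NV12,width=640,height=360,framerate=30/1 "
        "! nvv4l2h264enc insert-sps-pps=1 insert-vui=1 idrinterval=30 "
        "control-rate=1 bitrate=1500000 EnableTwopassCBR=1 vbv-size=50000 "
        "! h264parse ! rtph264pay mtu=1200 "
        "! udpsink host=169.254.84.2 port=5004 auto-multicast=0 sync=false",
        cv::CAP_GSTREAMER, 30, cv::Size(640, 360));
    if (!writer.isOpened()) return 1;

    cv::Mat frame(360, 640, CV_8UC3, cv::Scalar::all(0)); // stand-in for camera frames
    for (int i = 0; i < 300; ++i) writer.write(frame);    // ~10 s at 30 fps
    return 0;
}

On the receiving side, rtpjitterbuffer latency=0 effectively disables reordering; raising it to something like 100 ms trades a little delay for far fewer discarded late packets.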
Related
I am currently streaming my entire Surface Pro's desktop to another device on my local network.
I would like to optimize and reduce lag as much as possible between the two devices.
I currently have around 500 ms of latency between what is on the screen of the streaming device and what I see on the device receiving and displaying the stream.
I have used UDP instead of TCP, and encoded with H.264:
gst-launch-1.0.exe dxgiscreencapsrc width=2880 height=1920 monitor=0 ! video/x-raw,framerate=30/1 ! queue !
videoconvert ! videoscale ! video/x-raw,format=NV12,width=1440,height=960 ! videoconvert !
videocrop top=50 left=20 bottom=280 right=20 ! queue ! mfh264enc rc-mode=0 low-latency=true
bitrate=1500 gop-size=10 ! queue ! rtph264pay ! udpsink host=127.0.0.1 port=49800
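Edit: one reordering I am testing is cropping at full resolution before scaling, so videoscale, videoconvert, and the encoder all touch fewer pixels, plus sync=false on udpsink so it never waits on the pipeline clock. The crop values below are my originals doubled (hypothetical, since the crop now happens before the 2:1 downscale); gst_parse_launch() takes the same string as gst-launch-1.0, so this also works from a program:

#include <gst/gst.h>

int main(int argc, char **argv) {
    gst_init(&argc, &argv);
    GError *err = NULL;
    GstElement *pipeline = gst_parse_launch(
        "dxgiscreencapsrc width=2880 height=1920 monitor=0 "
        "! video/x-raw,framerate=30/1 ! queue "
        // crop first, at capture resolution (values are 2x the post-scale ones)
        "! videocrop top=100 left=40 bottom=560 right=40 "
        "! videoconvert ! videoscale "
        "! video/x-raw,format=NV12,width=1400,height=630 "
        "! queue ! mfh264enc rc-mode=0 low-latency=true bitrate=1500 gop-size=10 "
        "! queue ! rtph264pay ! udpsink host=127.0.0.1 port=49800 sync=false",
        &err);
    if (!pipeline) { g_printerr("parse error: %s\n", err->message); return 1; }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    g_main_loop_run(g_main_loop_new(NULL, FALSE)); // run until interrupted
    return 0;
}

Most of a 500 ms gap usually lives on the receive side, though: it is worth checking that the receiver's rtpjitterbuffer latency is small and that its video sink runs with sync=false.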
I'm working on a video-streaming wearable device. During the tests, it came up that the pipeline clock and the stream stop while fast walking or running. It's bizarre behaviour, because the debug messages show no errors about a broken pipeline, apart from lost frames. The stream is frozen and only restarting helps. Can you guess what causes the problem?
The pipelines I use:
streaming device:
gst-launch-1.0 -vem --gst-debug=3 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=\(fraction\)30/1 ! v4l2h264enc extra-controls=s,video_bitrate=250000 capture-io-mode=4 output-io-mode=4 ! "video/x-h264,level=(string)4" ! rtph264pay config-interval=1 ! multiudpsink clients="127.0.0.1:5008,10.123.0.2:5008"
client:
udpsrc port=5008 do-timestamp=true ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96 ! rtpjitterbuffer latency=100 drop-on-latency=true drop-messages-interval=100000000 ! queue max-size-buffers=20000 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! glupload ! qmlglsink name=qmlglsink sync=false
The hardware I use is a PS3 Eye cam and an LTE modem with a pretty low uplink of 1-2 Mbit/s to transmit the video, everything running on a Raspberry Pi 3B+ 1GB.
For more debug info, there are also pictures of the log file after the last registered dropped frame: every subsequent "cycle" sends a new query, loops over the GstElements from sink to source (which is my camera), and ends with the maximum query duration (the highlighted query goes to v4l2src).
Do you know how to overcome this problem?
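Edit: one way to localize such a stall (a diagnostic sketch, not a fix): wrap the sender pipeline in a small program and hang a buffer probe on the encoder's src pad. If the counter stops advancing while the pipeline stays PLAYING, the stall is at or before the encoder; if it keeps counting, the problem is downstream in the RTP/UDP path. The element name "enc" is added here only for the lookup:

#include <gst/gst.h>

static guint64 buffer_count = 0; // unsynchronized counter, fine for eyeballing

static GstPadProbeReturn on_buffer(GstPad *, GstPadProbeInfo *, gpointer) {
    ++buffer_count;
    return GST_PAD_PROBE_OK;
}

static gboolean report(gpointer) {
    g_print("encoded buffers so far: %" G_GUINT64_FORMAT "\n", buffer_count);
    return G_SOURCE_CONTINUE;
}

int main(int argc, char **argv) {
    gst_init(&argc, &argv);
    GstElement *pipeline = gst_parse_launch(
        "v4l2src device=/dev/video0 "
        "! video/x-raw,width=640,height=480,framerate=30/1 "
        "! v4l2h264enc name=enc extra-controls=s,video_bitrate=250000 "
        "capture-io-mode=4 output-io-mode=4 "
        "! video/x-h264,level=(string)4 ! rtph264pay config-interval=1 "
        "! multiudpsink clients=\"127.0.0.1:5008,10.123.0.2:5008\"", NULL);

    GstElement *enc = gst_bin_get_by_name(GST_BIN(pipeline), "enc");
    GstPad *src = gst_element_get_static_pad(enc, "src");
    gst_pad_add_probe(src, GST_PAD_PROBE_TYPE_BUFFER, on_buffer, NULL, NULL);

    g_timeout_add_seconds(1, report, NULL); // print the counter once a second
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    g_main_loop_run(g_main_loop_new(NULL, FALSE));
    return 0;
}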
The problem has been resolved. The issue was not variable encoder bitrate.
A more detailed inspection, and a pipeline that works for me, can be found on this GStreamer issue page.
I am trying to display the camera data, store the video, and stream the video over Wi-Fi, all at the same time.
test-launch "(v4l2src device=/dev/video0 ! video/x-raw,framerate=30/1,width=720,height=480 ! tee name=splitter ! queue ! imxvpuenc_h264 bitrate=2000 ! rtph264pay name=pay0 pt=96 splitter. ! queue ! videoconvert ! video/x-raw,width=720,height=480 ! fbdevsink)" &
This command mostly works, with one problem: the video does not appear on the display until I connect the Wi-Fi and start the streaming. I would like the playback on the display to start right away; I don't want it to be dependent on the Wi-Fi streaming.
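Edit: the decoupling I am considering, assuming plain RTP/UDP is acceptable in place of test-launch's RTSP: test-launch only builds its pipeline when a client connects, which is why the display waits for the Wi-Fi. One ordinary pipeline with a tee starts the fbdevsink branch as soon as it goes PLAYING, while udpsink just sends packets whether or not anyone is listening (the host address below is a placeholder):

#include <gst/gst.h>

int main(int argc, char **argv) {
    gst_init(&argc, &argv);
    GError *err = NULL;
    GstElement *pipeline = gst_parse_launch(
        "v4l2src device=/dev/video0 "
        "! video/x-raw,framerate=30/1,width=720,height=480 ! tee name=splitter "
        // display branch: runs immediately, no client handshake involved
        "splitter. ! queue ! videoconvert ! fbdevsink "
        // network branch: fire-and-forget UDP
        "splitter. ! queue ! imxvpuenc_h264 bitrate=2000 ! rtph264pay pt=96 "
        "! udpsink host=192.168.1.100 port=5000",
        &err);
    if (!pipeline) { g_printerr("parse error: %s\n", err->message); return 1; }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    g_main_loop_run(g_main_loop_new(NULL, FALSE));
    return 0;
}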
The objective I am trying to achieve is to stream 1080p video from the Raspberry Pi camera and record the video simultaneously.
I tried recording with the HTTP stream as the source, but it didn't work at 30 fps; a lot of frames were missing and I got only about 8 fps.
As a second approach, I am trying to record the file directly from the camera and then stream the "recording in progress/buffer" file. For this I am trying to use GStreamer. Please suggest whether this is a good option or whether I should try something else.
For recording with GStreamer I used:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw,width=1920,height=1080,framerate=30/1" !
videoflip method=clockwise ! videoflip method=clockwise ! videoconvert ! videorate ! x264enc ! avimux ! filesink location=test_video.h264
Result: the recorded video shows 1080p and 30 fps, but frames are dropping heavily.
For streaming the video buffer I used UDP in GStreamer:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw,width=640,height=480,framerate=30/1" ! x264enc ! queue ! rtph264pay ! udpsink host=192.168.5.1 port=8080
Result: no specific errors on the terminal, but I can't get the stream in VLC.
Please suggest the best method here.
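Edit: the variant I am evaluating encodes once and splits the compressed H.264 with a tee, so recording and streaming do not each pay for their own x264enc; tune=zerolatency with an ultrafast preset is what typically lets a Pi keep up at 30 fps. This is a sketch with illustrative bitrate and resolution, and matroskamux is chosen because it survives abrupt stops better than mp4mux:

#include <gst/gst.h>

int main(int argc, char **argv) {
    gst_init(&argc, &argv);
    GError *err = NULL;
    GstElement *pipeline = gst_parse_launch(
        "v4l2src device=/dev/video0 "
        "! video/x-raw,width=1920,height=1080,framerate=30/1 ! videoconvert "
        "! x264enc tune=zerolatency speed-preset=ultrafast bitrate=4000 "
        "! tee name=t "
        // recording branch: mux the already-encoded stream, no second encoder
        "t. ! queue ! h264parse ! matroskamux ! filesink location=test_video.mkv "
        // streaming branch: RTP over UDP, same encoded stream
        "t. ! queue ! h264parse ! rtph264pay config-interval=1 pt=96 "
        "! udpsink host=192.168.5.1 port=8080",
        &err);
    if (!pipeline) { g_printerr("parse error: %s\n", err->message); return 1; }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    g_main_loop_run(g_main_loop_new(NULL, FALSE));
    return 0;
}

As for VLC: it cannot latch onto a raw RTP/UDP stream from a URL alone; it generally needs a small .sdp file describing the payload, which would explain why nothing showed up despite a clean terminal.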
I am currently working on a project that uses an Nvidia Jetson. We need to stream 3 cameras over RTP/UDP to a single destination (unicast), while saving the contents of all three cameras.
I am having issues with my pipeline; it is probably a simple mistake somewhere that I am simply not seeing.
gst-launch-1.0 -e \
  v4l2src device=/dev/video0 ! 'video/x-raw, width=(int)640, height=(int)480' ! tee name=c \
    c. ! queue ! omxvp8enc bitrate=1500000 ! rtpvp8pay ! udpsink bind-port=8574 host=129.21.57.204 port=8574 loop=false \
    c. ! queue ! omxh264enc bitrate=1500000 ! mp4mux ! queue ! filesink location=test-RightFacingCamera.mp4 \
  v4l2src device=/dev/video1 ! 'video/x-raw, width=(int)640, height=(int)480' ! tee name=b \
    b. ! queue ! omxvp8enc bitrate=1500000 ! rtpvp8pay ! udpsink bind-port=8564 host=129.21.57.204 port=8564 loop=false \
    b. ! queue ! omxh264enc bitrate=1500000 ! mp4mux ! queue ! filesink location=test-LeftFacingCamera.mp4 \
  v4l2src device=/dev/video2 ! 'video/x-raw, width=(int)640, height=(int)480' ! tee name=a \
    a. ! queue ! omxvp8enc bitrate=1500000 ! rtpvp8pay ! udpsink bind-port=8554 host=129.21.57.204 port=8554 loop=false \
    a. ! queue ! omxh264enc bitrate=1500000 ! mp4mux ! queue ! filesink location=test-FrontFacingCamera.mp4
Now the issue here is that 2 of the 3 streams will simply stop without cause. There is no debug information at all; they simply cease to stream and write to the files after about 2 minutes of uptime.
Additionally, I have considered converting this into C/C++ with GStreamer, but I would not know where to begin, if someone would like to point me in a direction. Currently I have JavaScript code written up that detects each camera by serial number and assigns a port to the given camera, then runs this command.
Thanks for any help.
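Edit: for the C/C++ direction, a minimal skeleton to begin from could be gst_parse_launch() plus a bus watch: gst_parse_launch() accepts exactly the gst-launch-1.0 syntax, and the bus watch surfaces the error messages that the silent gst-launch runs never showed. Only one camera branch is written out here; the other two are the same string with /dev/video1, /dev/video2 and their ports appended:

#include <gst/gst.h>

static gboolean on_bus_message(GstBus *, GstMessage *msg, gpointer loop) {
    switch (GST_MESSAGE_TYPE(msg)) {
    case GST_MESSAGE_ERROR: {
        GError *err = NULL;
        gchar *dbg = NULL;
        gst_message_parse_error(msg, &err, &dbg);
        // this is the detail the silent gst-launch runs never printed
        g_printerr("ERROR from %s: %s\n%s\n",
                   GST_OBJECT_NAME(msg->src), err->message, dbg ? dbg : "");
        g_main_loop_quit((GMainLoop *)loop);
        break;
    }
    case GST_MESSAGE_EOS:
        g_main_loop_quit((GMainLoop *)loop);
        break;
    default:
        break;
    }
    return TRUE;
}

int main(int argc, char **argv) {
    gst_init(&argc, &argv);
    GMainLoop *loop = g_main_loop_new(NULL, FALSE);
    GstElement *pipeline = gst_parse_launch(
        "v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480 "
        "! tee name=c "
        "c. ! queue ! omxvp8enc bitrate=1500000 ! rtpvp8pay "
        "! udpsink bind-port=8574 host=129.21.57.204 port=8574 "
        "c. ! queue ! omxh264enc bitrate=1500000 ! mp4mux ! queue "
        "! filesink location=test-RightFacingCamera.mp4", NULL);

    GstBus *bus = gst_element_get_bus(pipeline);
    gst_bus_add_watch(bus, on_bus_message, loop);
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    g_main_loop_run(loop);
    // note: like `gst-launch -e`, mp4mux needs an EOS before shutdown,
    // e.g. gst_element_send_event(pipeline, gst_event_new_eos()),
    // or the .mp4 files will not be finalized
    gst_element_set_state(pipeline, GST_STATE_NULL);
    return 0;
}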
This issue was caused by the cameras themselves. It turns out that ECON-brand cameras have an issue where three identical cameras will not work together in v4l2. My team and I bought new cameras, all of an identical model, to test, and it works fine.
We were using the ECONs because of their supposed scientific quality and USB 3 speeds. Unfortunately we do not have USB 3 speeds or bandwidth, so we are stuck at a lower resolution.
Hope that helps anyone who runs into a similar problem; the cameras that currently seem to all work asynchronously over USB 2.0 are Logitech C922s.
This is a USB bandwidth limitation of the Jetson. We can support 3 cameras at a time by compromising on the frame rate. Compare with the Logitech camera: it is an H.264 camera (it delivers compressed frames), so it can afford to give 60 fps within the available bandwidth.