I'm working on a video streaming wearable device. During testing it turned out that the pipeline clock and the stream stop while fast walking or running. It's bizarre behaviour, because the debug messages show no errors about a broken pipeline, only lost frames. The pipeline is frozen and only a restart helps. Can anyone guess what causes the problem?
The pipelines I use:
streaming device:
gst-launch-1.0 -vem --gst-debug=3 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=\(fraction\)30/1 ! v4l2h264enc extra-controls=s,video_bitrate=250000 capture-io-mode=4 output-io-mode=4 ! "video/x-h264,level=(string)4" ! rtph264pay config-interval=1 ! multiudpsink clients="127.0.0.1:5008,10.123.0.2:5008"
client:
udpsrc port=5008 do-timestamp=true ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96 ! rtpjitterbuffer latency=100 drop-on-latency=true drop-messages-interval=100000000 ! queue max-size-buffers=20000 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! glupload ! qmlglsink name=qmlglsink sync=false
The hardware I use is a PS3 Eye cam and an LTE modem with a pretty low uplink of 1-2 Mbit/s to transmit the video, everything running on a Raspberry Pi 3B+ with 1 GB of RAM.
For more debug info there are also pictures of the log file taken after the last registered dropped frame: every following "cycle" sends a new query, loops over the GstElements from the sink to the source (which is my camera), and ends with the maximum query duration (the highlighted query to v4l2src).
Do you know how to overcome this problem?
The problem has been resolved. The issue was not variable encoder bitrate.
A more detailed inspection, and the pipeline that works for me, can be found on this GStreamer issue page.
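Since the linked page is not reproduced here, below is a minimal sketch of the kind of change that tends to keep such a pipeline alive when a weak LTE uplink stalls: a leaky queue in front of the payloader so the sender drops data instead of blocking, plus sync=false/async=false on the sink. This is my illustration, not the fix from the issue page; device, caps, and clients are taken from the question as-is.

gst-launch-1.0 -vem --gst-debug=3 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=\(fraction\)30/1 ! v4l2h264enc extra-controls=s,video_bitrate=250000 capture-io-mode=4 output-io-mode=4 ! "video/x-h264,level=(string)4" ! queue leaky=downstream max-size-buffers=5 ! rtph264pay config-interval=1 ! multiudpsink clients="127.0.0.1:5008,10.123.0.2:5008" sync=false async=false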
Related
I am trying to stream a live video feed from a camera connected to a Jetson NX to a computer on the same network. The network works as wireless Ethernet, meaning the Jetson sees it as a wired connection, but in reality it's wireless and limited in bitrate.
On the Jetson side, I am using OpenCV's VideoWriter to send frames over the network using this pipeline:
cv::VideoWriter video_write(
    "appsrc ! video/x-raw,format=BGR ! queue ! videoconvert ! video/x-raw,format=BGRx ! nvvidconv "
    "! video/x-raw(memory:NVMM),format=NV12,width=640,height=360,framerate=30/1 ! nvv4l2h264enc insert-sps-pps=1 insert-vui=1 idrinterval=30 "
    "bitrate=1800000 EnableTwopassCBR=1 ! h264parse ! rtph264pay ! udpsink host=169.254.84.2 port=5004 auto-multicast=0",
    cv::CAP_GSTREAMER, 0 /* fourcc; 0 when passing a full GStreamer pipeline */, 30, cv::Size(640, 360));
on the receiving computer my video capture is :
cv::VideoCapture video(
    "udpsrc port=5004 auto-multicast=0 ! "
    "application/x-rtp,media=video,encoding-name=H264 ! rtpjitterbuffer latency=0 ! "
    "rtph264depay ! decodebin ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1",
    cv::CAP_GSTREAMER);
Problem is, if the camera jitters a lot or moves a lot in any direction, the video stream either freezes or completely pixelates. I was hoping I could get suggestions for either a better encoder (I'm not limited to nvv4l2h264enc or H.264 in general), a solution for the pixelation and freezing, or maybe even a better way to stream the video other than VideoWriter.
I am trying to stream 360p video at 30 fps; my bitrate is limited to either 6 Mbit/s or 2.5 Mbit/s depending on the distance I want to allow. It does not seem like a network problem, simply because if I change parameters such as the codec (MJPG through plain OpenCV instead of the GStreamer pipeline, for example) the behaviour of the video feed changes; in my case it lowers the amount of freezing but makes the pixelation worse.
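One receiver-side change worth trying, sketched under the question's own setup: with rtpjitterbuffer latency=0 there is no headroom to absorb bursts or reordering, so every hiccup surfaces as a freeze or as corrupted (pixelated) frames. Raising the latency trades a small fixed delay for robustness; the 200 ms value below is illustrative, not a tested number.

cv::VideoCapture video(
    "udpsrc port=5004 auto-multicast=0 ! "
    "application/x-rtp,media=video,encoding-name=H264 ! "
    // ~200 ms of jitter headroom; drop-on-latency keeps the delay bounded.
    "rtpjitterbuffer latency=200 drop-on-latency=true ! "
    "rtph264depay ! decodebin ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1",
    cv::CAP_GSTREAMER);

On the sender, the short idrinterval (already 30 here) together with insert-sps-pps=1 is what lets the decoder resynchronize after loss, so the pixelation during motion is most likely packets being dropped by the radio link when high motion strains the CBR budget.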
I am currently streaming my entire Surface Pro's desktop to another device on my local network.
I would like to optimize and reduce lag as much as possible between the two devices.
I currently have around 500 ms of latency between what is on the screen of the streaming device and what I see on the device receiving and displaying the stream.
I have used UDP instead of TCP and encoded with H264.
gst-launch-1.0.exe dxgiscreencapsrc width=2880 height=1920 monitor=0 ! video/x-raw,framerate=30/1 ! queue !
videoconvert ! videoscale ! video/x-raw,format=NV12,width=1440,height=960 ! videoconvert !
videocrop top=50 left=20 bottom=280 right=20 ! queue ! mfh264enc rc-mode=0 low-latency=true
bitrate=1500 gop-size=10 ! queue ! rtph264pay ! udpsink host=127.0.0.1 port=49800
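A sketch of small changes that may shave latency, using only elements already in the question: the second videoconvert is redundant (videocrop can usually operate on NV12 directly; if it can't on your version, keep the conversion), config-interval=1 lets a late-joining receiver sync faster, and sync=false stops udpsink from waiting on buffer timestamps. Whether this gets well under 500 ms also depends on the receiver's jitterbuffer and sink settings, which are not shown in the question.

gst-launch-1.0.exe dxgiscreencapsrc width=2880 height=1920 monitor=0 ! video/x-raw,framerate=30/1 ! queue ! videoconvert ! videoscale ! video/x-raw,format=NV12,width=1440,height=960 ! videocrop top=50 left=20 bottom=280 right=20 ! queue ! mfh264enc rc-mode=0 low-latency=true bitrate=1500 gop-size=10 ! queue ! rtph264pay config-interval=1 ! udpsink host=127.0.0.1 port=49800 sync=false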
I am developing an application where a wave file read from a location feeds one end of a pipeline and a udpsink sits at the other end of it.
gst-launch-1.0 filesrc location=/path/to/wave/file/Tornado.wav ! wavparse ! audioconvert ! audio/x-raw,channels=1,depth=16,width=16,rate=44100 ! rtpL16pay ! udpsink host=xxx.xxx.xxx.xxx port=5000
The above wave file has a sampling rate of 44100 Hz and a single channel (mono).
On the same PC I am using a C++ application to catch these packets and depayload them into a headerless audio file (say Tornado.raw).
The pipeline I am creating for this is basically
gst-launch-1.0 udpsrc port=5000 ! "application/x-rtp,media=(string)audio, clock-rate=(int)44100, width=16, height=16, encoding-name=(string)L16, encoding-params=(string)1, channels=(int)1, channel-positions=(int)1, payload=(int)96" ! rtpL16depay ! filesink location=Tornado.raw
Now this works fine. I get the headerless data, and when I play it using Audacity it plays great!
I am trying to resample this audio while it is in the pipeline, from 44100 Hz down to 8000 Hz.
Simply changing clock-rate=(int)44100 to clock-rate=(int)8000 does not help (and is also logically absurd).
I am looking for how to get the headerless file at the pipeline output with 8000 Hz sampling.
Also, the data that I am getting now is big-endian, but I want little-endian output. How do I set that in the pipeline?
You might relate this to one of my earlier questions.
First, you have some weird caps in your pipeline: width and height belong to video, not audio, so they will probably just be ignored. I'm not sure about some of the other fields either.
For the actual question: just use GStreamer's audioresample and audioconvert elements to convert to your desired format.
E.g.
[..] ! rtpL16depay ! audioresample ! audioconvert ! \
audio/x-raw, rate=8000, format=S16LE ! filesink location=Tornado.raw
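Put together with the question's own port and caps (and with the width/height fields dropped, per the note above), the full receiving pipeline would look something like this untested sketch. Note the RTP caps still say clock-rate=(int)44100, because that is what is on the wire; the resampling happens after depayloading:

gst-launch-1.0 udpsrc port=5000 ! "application/x-rtp, media=(string)audio, clock-rate=(int)44100, encoding-name=(string)L16, encoding-params=(string)1, channels=(int)1, payload=(int)96" ! rtpL16depay ! audioresample ! audioconvert ! audio/x-raw, rate=8000, format=S16LE ! filesink location=Tornado.raw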
I am currently working on a project that utilizes a Nvidia Jetson. We need to stream 3 cameras over UDP RTP to a single source (unicast), while saving the contents of all three cameras.
I am having issues with my pipeline; it is probably a simple mistake somewhere that I am simply not seeing.
gst-launch-1.0 -e v4l2src device=/dev/video0 ! 'video/x-raw, width=(int)640, height=(int)480' ! tee name=c c. ! queue ! omxvp8enc bitrate=1500000 ! rtpvp8pay ! udpsink bind-port=8574 host=129.21.57.204 port=8574 loop=false c. ! queue ! omxh264enc bitrate=1500000 ! mp4mux ! queue ! filesink location=test-RightFacingCamera.mp4 v4l2src device=/dev/video1 ! 'video/x-raw, width=(int)640, height=(int)480' ! tee name=b b. ! queue ! omxvp8enc bitrate=1500000 ! rtpvp8pay ! udpsink bind-port=8564 host=129.21.57.204 port=8564 loop=false b. ! queue ! omxh264enc bitrate=1500000 ! mp4mux ! queue ! filesink location=test-LeftFacingCamera.mp4 v4l2src device=/dev/video2 ! 'video/x-raw, width=(int)640, height=(int)480' ! tee name=a a. ! queue ! omxvp8enc bitrate=1500000 ! rtpvp8pay ! udpsink bind-port=8554 host=129.21.57.204 port=8554 loop=false a. ! queue ! omxh264enc bitrate=1500000 ! mp4mux ! queue ! filesink location=test-FrontFacingCamera.mp4
Now the issue here is that 2 of the 3 streams will simply stop without cause. There is no debug information at all; they simply cease to stream and write to the file after about 2 minutes of uptime.
Additionally, I have considered converting this into C/C++ with GStreamer, but I would not know where to begin if someone would like to point me in a direction (a starting-point sketch follows below). Currently I have JavaScript code written up that detects each camera by serial number, assigns a port to the given camera, and then runs this command.
Thanks for any help.
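On the C/C++ side, a minimal starting point is to hand the exact same launch string to gst_parse_launch and watch the bus; the existing port-assignment logic can keep building the string. The sketch below shows a single camera branch for brevity and is untested on the Jetson:

#include <gst/gst.h>
#include <stdio.h>

int main(int argc, char *argv[]) {
    gst_init(&argc, &argv);

    /* Same syntax as gst-launch-1.0; one of the three camera branches for brevity. */
    GError *error = NULL;
    GstElement *pipeline = gst_parse_launch(
        "v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480 ! tee name=c "
        "c. ! queue ! omxvp8enc bitrate=1500000 ! rtpvp8pay ! "
        "udpsink bind-port=8574 host=129.21.57.204 port=8574 "
        "c. ! queue ! omxh264enc bitrate=1500000 ! mp4mux ! queue ! "
        "filesink location=test-RightFacingCamera.mp4",
        &error);
    if (pipeline == NULL) {
        g_printerr("Parse error: %s\n", error->message);
        g_error_free(error);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    g_print("Streaming; press Enter to stop.\n");
    getchar();

    /* Like gst-launch's -e flag: send EOS and wait for it to drain,
     * so mp4mux can finalize the recording before shutdown. */
    gst_element_send_event(pipeline, gst_event_new_eos());
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg = gst_bus_timed_pop_filtered(
        bus, GST_CLOCK_TIME_NONE,
        (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
    if (msg != NULL)
        gst_message_unref(msg);

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(bus);
    gst_object_unref(pipeline);
    return 0;
}

From there, error and state-change messages on the bus are the place to log which stream stopped and why.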
This issue was caused by the cameras themselves. It turns out that ECON brand cameras have an issue where three identical cameras will not work together in v4l2. My team and I bought new cameras, all of the same model, to test, and everything works fine.
We were using the ECONs because of their supposed scientific quality and USB 3.0 speeds. Unfortunately we do not have USB 3.0 speeds or bandwidth, so we are stuck at a lower resolution.
Hope that helps anyone who runs into a similar problem; the current cameras that all work simultaneously over USB 2.0 are Logitech C922s.
This is a USB bandwidth limitation of the Jetson. It can support 3 cameras at a time by compromising on the frame rate. The Logitech camera compared above is an H.264 camera (it delivers compressed frames), so it can afford to give 60 fps within that bandwidth.
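If the cameras can deliver compressed frames themselves, asking v4l2src for MJPG instead of raw video also sidesteps most of the USB bandwidth limit. A sketch for one camera, assuming the camera actually advertises MJPG at this resolution (check with v4l2-ctl --list-formats-ext) and with a test sink standing in for the tee branches above:

gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=640,height=480,framerate=30/1 ! jpegdec ! videoconvert ! autovideosink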
I am on GStreamer 1.2.4.
I have developed a video sink based on xvimagesink.
When I set the sync property of the sink to 0 the pipeline hangs after displaying a few frames.
I used the following pipeline
gst-launch-1.0 filesrc location=test.mp4 ! qtdemux ! h264parse ! avdec_h264 ! myvideosink sync=0
I noticed that even the decoder stops receiving frames.
This issue occurs even with videotestsrc
gst-launch-1.0 videotestsrc ! myvideosink sync=0
This does not occur with xvimagesink.
What could be the problem?
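Without the sink's source it is hard to say, but since xvimagesink itself works, the first suspects are the custom sink's buffer handling (for example, never returning buffers upstream) once sync=0 removes the render-time throttling. A stock diagnostic that may narrow it down, assuming the element is registered as myvideosink: raise the log level for the base sink class and watch where the last buffer stops.

GST_DEBUG=basesink:6,*:3 gst-launch-1.0 videotestsrc ! myvideosink sync=0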