gstreamer - Reducing Latency

I am experiencing latency with gstreamer. Can anyone recommend an optimised command for the two web cameras I am trying to use?
Any recommendations on what the command would look like with audio added as well?
Logitech Brio - Send
gst-launch-1.0 -v -e v4l2src device=$CAMERA ! image/jpeg,width=1920,height=1080,framerate=60/1 ! jpegparse ! rtpjpegpay ! udpsink host=$HOSTIP port=$PORT sync=false
Logitech Brio - Receive (OBS)
udpsrc port=8555 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, payload=(int)26" ! rtpjpegdepay ! jpegparse ! jpegdec ! video.
Logitech c920 - Send
gst-launch-1.0 -v -e v4l2src device=$CAMERA ! video/x-raw,width=320,height=240,framerate=30/1 ! x264enc bitrate=6000 pass=pass1 speed-preset=ultrafast tune=zerolatency sliced-threads=true threads=6 ! h264parse ! rtph264pay ! udpsink host=$HOSTIP port=$PORT sync=false
Logitech c920 - Receive (OBS)
udpsrc port=8554 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! h264parse ! avdec_h264 ! video.
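On the receive side, much of the remaining latency typically sits in the RTP jitter buffer and in sink synchronisation. A minimal sketch, assuming the Brio MJPEG stream above, a hypothetical extra UDP port (8557) for audio, and an illustrative ALSA device and Opus codec (none of these are from the question):

```shell
# Standalone low-latency receive for the MJPEG stream: a small explicit
# jitter buffer plus sync=false on the sink (placeholder port 8555).
gst-launch-1.0 udpsrc port=8555 \
    caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, payload=(int)26" \
  ! rtpjitterbuffer latency=10 ! rtpjpegdepay ! jpegdec ! autovideosink sync=false

# Audio can travel as its own RTP stream next to the video. Send:
gst-launch-1.0 alsasrc device=hw:1 ! audioconvert ! audioresample \
  ! opusenc ! rtpopuspay ! udpsink host=$HOSTIP port=8557 sync=false

# ...and receive:
gst-launch-1.0 udpsrc port=8557 \
    caps="application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)OPUS, payload=(int)96" \
  ! rtpjitterbuffer latency=10 ! rtpopusdepay ! opusdec ! autoaudiosink sync=false
```

Keeping audio and video as two separate RTP streams avoids muxing delay at the sender; lip-sync across the two streams would need RTCP/`rtpbin`, which this sketch does not attempt.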

Related

Gstreamer x-raw to h264 mp4

The documentation for some software I'm using says to use this gstreamer pipeline to stream video from a camera:
gst-launch-1.0 v4l2src device=/dev/video5 ! video/x-raw ! videoconvert ! v4l2h264enc ! h264parse config-interval=3 ! rtph264pay mtu=1024 ! udpsink host=127.0.0.1 port=5600
If I wanted to adapt this to pipe to a .mp4, I thought something like this would work:
gst-launch-1.0 v4l2src device=/dev/video5 ! video/x-raw ! videoconvert ! v4l2h264enc ! h264parse config-interval=3 ! filesink location=test.mp4
but the resulting file is not playable in vlc.
What am I missing?
Thanks in advance.
You would use a container muxer (such as qtmux here):
# For recording 100 frames:
gst-launch-1.0 v4l2src device=/dev/video5 num-buffers=100 ! video/x-raw ! videoconvert ! v4l2h264enc ! h264parse config-interval=3 ! qtmux ! filesink location=test.mp4
# If you want to stop manually with Ctrl-C, add -e so EOS is sent:
gst-launch-1.0 -e v4l2src device=/dev/video5 ! video/x-raw ! videoconvert ! v4l2h264enc ! h264parse config-interval=3 ! qtmux ! filesink location=test.mp4
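If the recording may be interrupted (power loss, kill -9), a Matroska container is worth considering as a sketch: MP4's moov atom is written at EOS, so a truncated .mp4 is unplayable, whereas a truncated .mkv usually still plays. Only the device path is from the question; the rest is illustrative.

```shell
# Hedged alternative: matroskamux instead of qtmux. -e still finalizes
# the file cleanly on Ctrl-C, but a truncated .mkv tends to remain playable.
gst-launch-1.0 -e v4l2src device=/dev/video5 ! video/x-raw ! videoconvert \
  ! v4l2h264enc ! h264parse ! matroskamux ! filesink location=test.mkv
```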

Blurring when sending GRAY8 video over UDP with GStreamer

These commands produce a video image that has blurring around the edge of the changing static in the bottom right corner. If I remove the GRAY8 format, the blurring disappears.
gst-launch-1.0 videotestsrc ! video/x-raw,format=GRAY8 ! videoconvert ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp ! rtph264depay ! avdec_h264 ! videoconvert ! ximagesink
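One plausible cause (an assumption, not confirmed by the question) is that once videoconvert turns the GRAY8 frames into 4:2:0 for x264enc, the encoder's default rate control smears the high-frequency static. A sketch of an experiment that pins the conversion explicitly and raises encoder quality to separate the two effects:

```shell
# Hypothetical diagnostic: force I420 after videoconvert and use a low
# fixed quantizer (near-lossless) so encoder artifacts can be ruled out.
gst-launch-1.0 videotestsrc ! video/x-raw,format=GRAY8 ! videoconvert \
  ! video/x-raw,format=I420 ! x264enc tune=zerolatency pass=quant quantizer=10 \
  ! rtph264pay ! udpsink host=127.0.0.1 port=5000
```

If the blur disappears at quantizer=10, the original symptom was a bitrate/quantisation artifact rather than the format conversion itself.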

Use gstreamer to stream video and audio of Logitech C920

I'm quite a newbie at using gstreamer. I want to stream video and audio from my C920 webcam to another PC, but I keep going wrong when combining things.
I can now stream h264 video from my C920 to another PC using:
gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-h264,width=1280,height=720,framerate=30/1 ! h264parse ! rtph264pay pt=127 config-interval=4 ! udpsink host=172.19.3.103 port=1234
And view it with:
gst-launch-1.0 udpsrc port=1234 ! application/x-rtp, payload=127 ! rtph264depay ! avdec_h264 ! xvimagesink sync=false
I can also get the audio from the C920 and record it to a file together with a test-image:
gst-launch videotestsrc ! videorate ! video/x-raw-yuv,framerate=5/1 ! queue ! theoraenc ! queue ! mux. pulsesrc device="alsa_input.usb-046d_HD_Pro_Webcam_C920_F1894590-02-C920.analog-stereo" ! audio/x-raw-int,rate=48000,channels=2,depth=16 ! queue ! audioconvert ! queue ! vorbisenc ! queue ! mux. oggmux name=mux ! filesink location=stream.ogv
But I'm trying to get something like this (below) to work. This one is not working; presumably it's even a very bad combination I made!
gst-launch v4l2src device=/dev/video1 ! video/x-h264,width=1280,height=720,framerate=30/1 ! queue ! mux. pulsesrc device="alsa_input.usb-046d_HD_Pro_Webcam_C920_F1894590-02-C920.analog-stereo" ! audio/x-raw-int,rate=48000,channels=2,depth=16 ! queue ! audioconvert ! queue ! x264enc ! queue ! udpsink host=127.0.0.1 port=1234
You should encode your video before linking it against the mux. Also, I do not see you declaring the type of muxer you are using and you do not put the audio in the mux.
I am not sure it is even possible to send audio AND video over the same RTP stream in this manner in gstreamer. I know that the RTSP server implementation in gstreamer allows audio and video together, but even there I am not sure whether it is still two streams just being abstracted away from the implementation.
You may want to just use two separate streams and pass them through a gstrtpbin element.
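A sketch of the two-separate-streams approach, built from the working pipelines above: the camera's own H.264 on one port, Opus-encoded audio on another. The pulsesrc device string and host are from the question; the ports and the Opus codec choice are assumptions.

```shell
# Send: two independent RTP streams from one gst-launch invocation.
gst-launch-1.0 -v \
  v4l2src device=/dev/video1 ! video/x-h264,width=1280,height=720,framerate=30/1 \
    ! h264parse ! rtph264pay pt=127 config-interval=4 \
    ! udpsink host=172.19.3.103 port=1234 \
  pulsesrc device="alsa_input.usb-046d_HD_Pro_Webcam_C920_F1894590-02-C920.analog-stereo" \
    ! audioconvert ! opusenc ! rtpopuspay \
    ! udpsink host=172.19.3.103 port=1235

# Receive: matching pair of depayloaders on the other PC.
gst-launch-1.0 \
  udpsrc port=1234 caps="application/x-rtp,payload=127" \
    ! rtph264depay ! avdec_h264 ! xvimagesink sync=false \
  udpsrc port=1235 \
    caps="application/x-rtp,media=(string)audio,clock-rate=(int)48000,encoding-name=(string)OPUS,payload=(int)96" \
    ! rtpopusdepay ! opusdec ! autoaudiosink
```

This keeps the streams unsynchronised; proper lip-sync would route both through rtpbin with RTCP.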

GStreamer - How to compose three streams (two video files with audio)

I have two avi files (one contains only video; the second contains both video and an audio stream). I would like to composite/mix the two avi files into one video file (side by side) and preserve the audio.
I can compose video streams (using videomixer) but I don't know how to add audio stream:
gst-launch filesrc location=test01.avi ! decodebin ! videoscale ! ffmpegcolorspace ! video/x-raw-yuv, width=640, height=480 ! videobox border-alpha=0 left=-640 ! videomixer name=mix ! ffmpegcolorspace ! queue ! avimux ! filesink location=videoTestMix666.avi filesrc location=test02.avi ! decodebin ! videoscale ! ffmpegcolorspace ! video/x-raw-yuv, width=640, height=480 ! videobox right=-640 ! mix.
Could you advise me how to do this - how to add the audio? Many thanks.
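One possible extension of the pipeline above, in the same gst-launch 0.10 syntax: name the muxer and the second file's decodebin so its audio pad can be routed into avimux next to the mixed video. This is a hedged sketch; the element name `dec` is hypothetical, and it assumes test02.avi carries PCM-compatible audio that avimux accepts.

```shell
# avimux is named up front so both the mixed video and the audio branch
# can link to it; dec. pulls the audio pad out of the second decodebin.
gst-launch avimux name=mux ! filesink location=videoTestMix666.avi \
  filesrc location=test01.avi ! decodebin ! videoscale ! ffmpegcolorspace \
    ! video/x-raw-yuv,width=640,height=480 ! videobox border-alpha=0 left=-640 \
    ! videomixer name=mix ! ffmpegcolorspace ! queue ! mux. \
  filesrc location=test02.avi ! decodebin name=dec ! videoscale ! ffmpegcolorspace \
    ! video/x-raw-yuv,width=640,height=480 ! videobox right=-640 ! mix. \
  dec. ! audioconvert ! queue ! mux.
```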

Can't mix videotestsrc and uridecodebin with GStreamer/videomixer

How to mix a live source and a non-live source with GStreamer videomixer plug-in?
gst-launch shows nothing when mixing uridecodebin (some MPEG video) and videotestsrc:
gst-launch \
videomixer name=mix sink_0::zorder=0 sink_1::zorder=1 ! ffmpegcolorspace ! autovideosink \
uridecodebin uri=file:///test.mpg ! timeoverlay ! videoscale ! video/x-raw-yuv,width=704,height=576 ! queue ! mix.sink_0 \
videotestsrc ! video/x-raw-yuv, width=176,height=144 ! queue ! mix.sink_1
But it works if I change both of the sources to the mpeg video:
gst-launch \
videomixer name=mix sink_0::zorder=0 sink_1::zorder=1 ! ffmpegcolorspace ! autovideosink \
uridecodebin uri=file:///test.mpg ! timeoverlay ! videoscale ! video/x-raw-yuv,width=704,height=576 ! queue ! mix.sink_0 \
uridecodebin uri=file:///test.mpg ! timeoverlay ! videoscale ! video/x-raw-yuv,width=176,height=144 ! queue ! mix.sink_1
It seems that I made a silly mistake. Here is the answer to my own question:
With the above command and the tested video clip, the two sources output different video formats.
After forcing the video format of videotestsrc to I420, it works fine.
Here is a command that mixes:
gst-launch -v \
videomixer name=mix sink_0::zorder=0 sink_1::zorder=1 ! ffmpegcolorspace ! autovideosink \
uridecodebin uri=file:///media/sf_share/test.mpg ! timeoverlay ! videoscale ! video/x-raw-yuv,width=704,height=576 ! videorate force-fps=-1 ! queue ! mix.sink_0 \
videotestsrc ! video/x-raw-yuv,width=352,height=288,format=(fourcc)I420 ! timeoverlay ! queue ! mix.sink_1