These commands produce a video image with blurring around the edges of the changing static in the bottom-right corner. If I remove the GRAY8 caps filter, the blurring disappears.
gst-launch-1.0 videotestsrc ! video/x-raw,format=GRAY8 ! videoconvert ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp ! rtph264depay ! avdec_h264 ! videoconvert ! ximagesink
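For reference, the working variant simply drops the GRAY8 caps filter; pinning the encoder input to I420 is another experiment worth trying (an assumption about where the artifact is introduced, not a confirmed fix):
# Variant without the GRAY8 caps filter (reported not to blur)
gst-launch-1.0 videotestsrc ! videoconvert ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000
# Keep GRAY8 but pin the format x264enc receives to I420 (assumption)
gst-launch-1.0 videotestsrc ! video/x-raw,format=GRAY8 ! videoconvert ! video/x-raw,format=I420 ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000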
I am experiencing latency with GStreamer. Can anyone recommend an optimised command for the two web cameras I am trying to use?
Any recommendations on what the command would look like with audio added as well?
Logitech Brio - Send
gst-launch-1.0 -v -e v4l2src device=$CAMERA ! image/jpeg,format=MJPG,width=1920,height=1080,framerate=60/1 ! jpegparse ! rtpjpegpay ! udpsink host=$HOSTIP port=$PORT sync=false
Logitech Brio - Receive (OBS)
udpsrc port=8555 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, payload=(int)26" ! rtpjpegdepay ! jpegparse ! jpegdec ! video.
Logitech c920 - Send
gst-launch-1.0 -v -e v4l2src device=$CAMERA ! video/x-raw,width=320,height=240,framerate=30/1 ! videoconvert ! x264enc bitrate=6000 pass=pass1 speed-preset=ultrafast tune=zerolatency sliced-threads=true threads=6 ! h264parse ! rtph264pay ! udpsink host=$HOSTIP port=$PORT sync=false
Logitech c920 - Receive (OBS)
udpsrc port=8554 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! h264parse ! avdec_h264 ! video.
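The audio part of the question isn't answered above; a minimal sketch using Opus over RTP might look like the following (the ALSA device, port, and caps are assumptions, not tested values):
# Send audio (assumed ALSA device; list yours with `arecord -l`)
gst-launch-1.0 alsasrc device=hw:1 ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! udpsink host=$HOSTIP port=5002 sync=false
# Receive audio
gst-launch-1.0 udpsrc port=5002 caps="application/x-rtp,media=audio,clock-rate=48000,encoding-name=OPUS,payload=96" ! rtpopusdepay ! opusdec ! audioconvert ! autoaudiosink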
The documentation for some software I'm using says to use this gstreamer pipeline to stream video from a camera:
gst-launch-1.0 v4l2src device=/dev/video5 ! video/x-raw ! videoconvert ! v4l2h264enc ! h264parse config-interval=3 ! rtph264pay mtu=1024 ! udpsink host=127.0.0.1 port=5600
If I wanted to adapt this to write to an .mp4 file, I thought something like this would work:
gst-launch-1.0 v4l2src device=/dev/video5 ! video/x-raw ! videoconvert ! v4l2h264enc ! h264parse config-interval=3 ! filesink location=test.mp4
but the resulting file is not playable in vlc.
What am I missing?
Thanks in advance.
You would use a container muxer (such as qtmux here):
# For recording 100 frames:
gst-launch-1.0 v4l2src device=/dev/video5 num-buffers=100 ! video/x-raw ! videoconvert ! v4l2h264enc ! h264parse config-interval=3 ! qtmux ! filesink location=test.mp4
# If you want to stop manually with Ctrl-C, add -e so an EOS is sent and the file is finalized:
gst-launch-1.0 -e v4l2src device=/dev/video5 ! video/x-raw ! videoconvert ! v4l2h264enc ! h264parse config-interval=3 ! qtmux ! filesink location=test.mp4
OK, this works.
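If the recording might be killed before EOS is delivered, one alternative (not from the answer above, just a common workaround) is to mux into Matroska, which remains playable even when truncated:
# matroskamux tolerates truncation better than MP4
gst-launch-1.0 -e v4l2src device=/dev/video5 ! video/x-raw ! videoconvert ! v4l2h264enc ! h264parse config-interval=3 ! matroskamux ! filesink location=test.mkv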
This gstreamer pipeline works well to save my camera video stream to a file on my raspberry pi.
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw,framerate=30/1,format=UYVY' ! v4l2h264enc ! 'video/x-h264,level=(string)4' ! filesink location=test_video6.h264
But what is the correct pipeline to display a live video stream from my camera, so I can watch it in real time on my monitor instead of saving it to a file to view later with VLC?
For example, I have tried adding
! videoconvert ! autovideosink
to the above pipeline, but it does not work.
Try this:
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw,framerate=30/1,format=UYVY' ! v4l2h264enc ! 'video/x-h264,level=(string)4' ! decodebin ! videoconvert ! autovideosink
If this doesn't work, you can start from the general example of a video pipeline and use:
gst-launch-1.0 v4l2src ! decodebin ! videoconvert ! autovideosink
From there you can add the settings you want.
EDIT: Another option is to tee the stream and send one branch to live playback through a queue. In this case you do:
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw,framerate=30/1,format=UYVY' ! v4l2h264enc ! 'video/x-h264,level=(string)4' ! tee name="source" ! queue ! filesink location=test_video6.h264 source. ! queue ! decodebin ! videoconvert ! autovideosink
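If the Pi has enough headroom to convert raw frames, teeing before the encoder avoids the encode-then-decode round trip in the pipeline above. A sketch, assuming videoconvert can keep up with UYVY at 30 fps:
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw,framerate=30/1,format=UYVY' ! tee name=t \
t. ! queue ! v4l2h264enc ! 'video/x-h264,level=(string)4' ! filesink location=test_video6.h264 \
t. ! queue ! videoconvert ! autovideosink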
I would like to receive an RTMP stream and create a pipeline with v4l2sink as output:
gst-launch rtmpsrc location="rtmp://localhost/live/test" ! "video/x-raw-yuv,width=640,height=480,framerate=30/1,format=(fourcc)YUY2" ! videorate ! v4l2sink device=/dev/video1
But I get only a green screen: https://www.dropbox.com/s/yq9oqi9m62c5afo/screencast1422465570.webm?dl=0
Your pipeline is telling GStreamer to treat encoded, muxed RTMP data as YUV video buffers.
Instead you need to parse, demux, and decode the video part of the RTMP data. I don't have a sample stream to test on, but you may be able to just use decodebin (which in GStreamer 0.10 was called decodebin2 for whatever reason). You'll also want to move videorate in front of the framerate caps, so it knows what rate to convert to.
Wild stab in the dark:
gst-launch rtmpsrc location="rtmp://localhost/live/test" ! decodebin2 ! videoscale ! ffmpegcolorspace ! videorate ! "video/x-raw-yuv,width=640,height=480,framerate=30/1,format=(fourcc)YUY2" ! v4l2sink device=/dev/video1
Now it works:
gst-launch rtmpsrc location="rtmp://localhost/live/test" ! decodebin2 ! videoscale ! ffmpegcolorspace ! videorate ! "video/x-raw-yuv,width=1920,height=1080,framerate=30/1,format=(fourcc)YUY2" ! v4l2sink device=/dev/video1
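Note that everything in this thread is GStreamer 0.10 syntax (gst-launch, decodebin2, ffmpegcolorspace, video/x-raw-yuv). A rough, untested 1.x equivalent would be:
gst-launch-1.0 rtmpsrc location="rtmp://localhost/live/test" ! decodebin ! videoscale ! videoconvert ! videorate ! "video/x-raw,width=1920,height=1080,framerate=30/1,format=YUY2" ! v4l2sink device=/dev/video1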
How to mix a live source and a non-live source with GStreamer videomixer plug-in?
gst-launch shows nothing when mixing uridecodebin (some MPEG video) and videotestsrc:
gst-launch \
videomixer name=mix sink_0::zorder=0 sink_1::zorder=1 ! ffmpegcolorspace ! autovideosink \
uridecodebin uri=file:///test.mpg ! timeoverlay ! videoscale ! video/x-raw-yuv,width=704,height=576 ! queue ! mix.sink_0 \
videotestsrc ! video/x-raw-yuv,width=176,height=144 ! queue ! mix.sink_1
But it works if I change both sources to the MPEG video:
gst-launch \
videomixer name=mix sink_0::zorder=0 sink_1::zorder=1 ! ffmpegcolorspace ! autovideosink \
uridecodebin uri=file:///test.mpg ! timeoverlay ! videoscale ! video/x-raw-yuv,width=704,height=576 ! queue ! mix.sink_0 \
uridecodebin uri=file:///test.mpg ! timeoverlay ! videoscale ! video/x-raw-yuv,width=176,height=144 ! queue ! mix.sink_1
It seems that I made a silly mistake. Here is the answer to my own question:
With the above command, videotestsrc and the decoded video clip output different video formats.
After forcing the video format of videotestsrc to I420, it works fine.
Here is a command that mixes:
gst-launch -v \
videomixer name=mix sink_0::zorder=0 sink_1::zorder=1 ! ffmpegcolorspace ! autovideosink \
uridecodebin uri=file:///media/sf_share/test.mpg ! timeoverlay ! videoscale ! video/x-raw-yuv,width=704,height=576 ! videorate force-fps=-1 ! queue ! mix.sink_0 \
videotestsrc ! video/x-raw-yuv,width=352,height=288,format=(fourcc)I420 ! timeoverlay ! queue ! mix.sink_1
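As with the previous thread, this is 0.10 syntax. On GStreamer 1.x, where compositor supersedes videomixer and ffmpegcolorspace became videoconvert, a rough untested equivalent is:
gst-launch-1.0 -v \
compositor name=mix sink_0::zorder=0 sink_1::zorder=1 ! videoconvert ! autovideosink \
uridecodebin uri=file:///media/sf_share/test.mpg ! timeoverlay ! videoscale ! video/x-raw,width=704,height=576 ! videorate ! queue ! mix.sink_0 \
videotestsrc ! video/x-raw,width=352,height=288,format=I420 ! timeoverlay ! queue ! mix.sink_1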