Need to decode an RTSP stream to v4l2loopback with GStreamer

I am trying to decode an RTSP stream to a V4L2 device on a Raspberry Pi Zero.
First I create a loopback device this way:
sudo modprobe v4l2loopback video_nr=25
Then I try to decode the RTSP stream to the virtual V4L2 device. I would prefer to sink it in YUYV format:
sudo gst-launch-1.0 rtspsrc location=rtsp://192.168.1.138:8554/unicast ! rtph264depay ! h264parse ! decodebin ! videoconvert ! video/x-raw,format=YUY2 ! tee ! v4l2sink device=/dev/video25
When I inspect the V4L2 device with this:
v4l2-ctl -d 25 --list-formats
I get this:
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture
When I try to play it back with VLC, nothing happens:
sudo cvlc --v4l2-dev=25 --v4l2-width=640 --v4l2-height=480 --v4l2-fps=30 --vout=mmal_vout v4l2:///dev/video25
I suspect the GStreamer pipeline is not correct. I am new to GStreamer and I am poking around a little bit in the dark here.
Could anybody give me some tips on what I am doing wrong?
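In case it helps: a sketch of how this kind of pipeline is often wired up for v4l2loopback (untested on this setup; the exclusive_caps=1 module option and the identity drop-allocation=1 element are assumptions based on common v4l2loopback usage, the latter being the same workaround mentioned in an answer further down):
sudo modprobe v4l2loopback video_nr=25 exclusive_caps=1
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.138:8554/unicast ! rtph264depay ! h264parse ! decodebin ! videoconvert ! video/x-raw,format=YUY2 ! identity drop-allocation=1 ! v4l2sink device=/dev/video25
The tee in the original pipeline has only one branch and can simply be dropped; exclusive_caps=1 makes the loopback device advertise only the format actually being fed into it, which some players need before they will open the device.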

Related

How to display fps of streaming video in gstreamer?

I'm using an ov5647 MIPI sensor with a Raspberry Pi 4 Model B. To stream video I'm using GStreamer with v4l2.
Command:
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! xvimagesink sync=false
What modification do I need to make to the above command to display the fps?
Use the fpsdisplaysink element as follows:
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! fpsdisplaysink video-sink=xvimagesink text-overlay=false sync=false -v 2>&1
-v 2>&1 - prints verbose output and redirects stderr to stdout
text-overlay=true - would render the FPS information into the video stream (disabled in the command above)
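If no X display is available (for example when running headless over SSH), a variant that is sometimes used is to point fpsdisplaysink at a fakesink, so only the measured frame rate is reported rather than the video; this is a sketch along the same lines as the answer above, not part of it:
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! fpsdisplaysink video-sink=fakesink text-overlay=false sync=false -v 2>&1
The measurements then show up in the verbose output as the element's last-message property (rendered, dropped, current and average fps).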

Trying to stream a virtually created camera (video0) device as an RTSP stream for VLC using GStreamer in 24-bit RGB format

I have created a video0 device using v4l2loopback and used the following sample Git code, V4l2Loopback_cpp, as an application to stream jpg images from a folder sequentially, after altering some conditions in the code. The code reads the images as 24-bit RGB and sends them to the video0 device, which is fine, because the images play like a proper video in VLC's video device capture. As I mentioned earlier, when I check the VLC properties of the video it shows the following content.
I need this video0 device to stream RTSP H264 video in VLC using the GStreamer library.
I have used the following command on the command line for testing, but it shows an internal process error:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=590,height=332,framerate=30/1 ! x264enc ! rtph264pay name=pay0 pt=96
I don't know what the problem is here: is it the 24-bit image format or the GStreamer command I use? I need a proper GStreamer command line to stream H264 RTSP video from the video0 device. Any help is appreciated, thank you.
Image format - jpg (sequence of images passed)
video0 receives - 24-bit RGB image
Output needed - H264 RTSP stream from video0
Not sure this is a solution, but the following may help:
You may try adjusting the resolution according to what V4L reports (width=584):
v4l2src device=/dev/video0 ! video/x-raw,format=RGB,width=584,height=332,framerate=30/1 ! videoconvert ! x264enc insert-vui=1 ! h264parse ! rtph264pay name=pay0 pt=96
Note that this error may be reported on the v4l2loopback receiver side while actually being a sender error. If you're feeding v4l2loopback from a GStreamer pipeline into v4l2sink, you could try adding an identity element such as:
... ! video/x-raw,format=RGB,width=584,height=332,framerate=30/1 ! identity drop-allocation=1 ! v4l2sink
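Since the launch line above already ends in rtph264pay name=pay0 pt=96, it can be served over RTSP with the test-launch example from gst-rtsp-server and opened in VLC; a rough sketch (the location of the test-launch binary depends on where the examples were built):
./test-launch "( v4l2src device=/dev/video0 ! video/x-raw,format=RGB,width=584,height=332,framerate=30/1 ! videoconvert ! x264enc insert-vui=1 ! h264parse ! rtph264pay name=pay0 pt=96 )"
vlc rtsp://127.0.0.1:8554/test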

How to record video (1080p 30fps) from raspberry pi camera and stream the 'recording in progress' file simultaneously?

The objective I am trying to achieve is streaming 1080p video from the Raspberry Pi camera and recording the video simultaneously.
I tried recording with the HTTP stream as the source, but that didn't work at 30fps: a lot of frames were missing and I got only about 8fps.
As a second approach, I am trying to record the file directly from the camera and then stream the "recording in progress/buffer" file. For this I am trying to use GStreamer. Please suggest whether this is a good option or whether I should try something else.
For Recording using GStreamer I used
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw,width=1920,height=1080,framerate=30/1" ! \
videoflip method=clockwise ! videoflip method=clockwise ! videoconvert ! videorate ! x264enc ! avimux ! filesink location=test_video.h264
Result: the recorded video shows 1080p and 30fps, but frames are dropping heavily.
For streaming the video buffer I have used UDP in GStreamer:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw,width=640,height=480,framerate=30/1" ! x264enc ! queue ! rtph264pay ! udpsink host=192.168.5.1 port=8080
Result: no specific errors on the terminal, but I can't get the stream in VLC.
Please suggest the best method here.
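One approach that is often suggested for exactly this use case is to encode once and tee the H264 stream into a recording branch and a network branch, so the camera is read and encoded only a single time. A sketch along those lines (host, port and resolution are taken from the commands above; the filename, the zerolatency tuning and config-interval are assumptions to be tuned):
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080,framerate=30/1 ! \
videoconvert ! x264enc tune=zerolatency ! h264parse config-interval=1 ! tee name=t \
t. ! queue ! avimux ! filesink location=test_video.avi \
t. ! queue ! rtph264pay ! udpsink host=192.168.5.1 port=8080
Whether x264enc can keep up with 1080p30 in software on a Pi is a separate question; the Pi's hardware encoder (v4l2h264enc or omxh264enc, depending on the OS image) is usually the better fit at that resolution.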

Using GStreamer to stream from the framebuffer via RTSP

I have built GStreamer, GStreamer RTSP Server and some related plugins, and streaming via RTSP works. The GStreamer RTSP Server examples can use sources such as a webcam (/dev/video0) with v4l2src, videotestsrc, or an .MP4 file with filesrc.
So, how can I stream from the framebuffer source (/dev/fb0) via RTSP?
You can grab framebuffers with GStreamer.
Here is an example:
gst-launch-1.0 -v multifilesrc location=/dev/fb0 ! videoparse format=29 width=1680 height=1080 framerate=30/1 ! decodebin ! videoconvert ! autovideosink sync=false
You then have to adapt it to your RTSP application.
I typed this command in /gst-rtsp-server/example:
sudo ./test-launch "( multifilesrc location=/dev/fb0 ! videoparse format=29 framerate=30/1 ! decodebin ! videoconvert ! x264enc ! rtph264pay name=pay0 pt=96 )"
But I got this error:
stream ready at rtsp://127.0.0.1:8554/test
x264 [error]: baseline profile doesn't support 4:4:4
Viewing with VLC:
vlc rtsp://127.0.0.1:8554/test
I only get a black screen.
Framebuffer info:
mode "1280x720"
geometry 1280 720 1280 720 32
timings 0 0 0 0 0 0 0
rgba 8/0,8/8,8/16,8/24
endmode
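Regarding the x264 error: the framebuffer is 32 bits per pixel, and the conversion in front of x264enc apparently ends up negotiating an RGB/4:4:4 format, which the baseline profile does not allow. The usual workaround is to pin an I420 conversion in front of the encoder; a sketch (untested against this framebuffer, and the width/height here follow the fbset output above rather than the 1680x1080 of the earlier example):
sudo ./test-launch "( multifilesrc location=/dev/fb0 ! videoparse format=29 width=1280 height=720 framerate=30/1 ! videoconvert ! video/x-raw,format=I420 ! x264enc ! rtph264pay name=pay0 pt=96 )"
Leaving width and height off videoparse, as in the failing command, also means the parsed geometry will not match the real framebuffer, which by itself can produce a black or garbled picture in VLC.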

Streaming H264 video from a Logitech C920 using GStreamer produces a relief

I'm trying to stream the native H264 video from a Logitech C920 camera using GStreamer 1.2.4:
gst-launch-1.0 v4l2src ! video/x-h264,width=640,height=480,framerate=10/1 ! \
h264parse ! decodebin ! videoconvert ! ximagesink sync=false
This produces an image output which looks like a kind of relief:
As soon as I add more movement to the scene, the image quality improves, but it is still far from sufficient. It seems that the static parts of the video stream are somehow not decoded.
Any ideas?
I'm using GStreamer 1.2.4 on a Banana Pi (Debian Wheezy).
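One debugging step that is sometimes suggested for this kind of "only moving areas get refreshed" picture (an assumption here, not a confirmed fix for the C920) is to replace decodebin with an explicitly chosen software decoder, so that a misbehaving auto-selected decoder on the Banana Pi can be ruled out:
gst-launch-1.0 v4l2src ! video/x-h264,width=640,height=480,framerate=10/1 ! \
h264parse ! avdec_h264 ! videoconvert ! ximagesink sync=false
If the picture is correct with avdec_h264, the problem lies with whichever decoder decodebin picked; if it looks the same, the stream coming out of the camera (for example missing keyframes) is the more likely culprit.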