How to display FPS of streaming video in GStreamer? - gstreamer

I'm using an OV5647 MIPI sensor with a Raspberry Pi 4 Model B, and I'm streaming video with GStreamer via v4l2.
Command:
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! xvimagesink sync=false
What modification do I need to make to the above command to display the FPS?

Use the fpsdisplaysink element as follows:
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! fpsdisplaysink video-sink=xvimagesink text-overlay=false sync=false -v 2>&1
-v 2>&1 - -v enables verbose output, and 2>&1 redirects stderr to stdout so the FPS measurements appear in the console.
text-overlay=true - renders the FPS information into the video stream itself; the command above sets it to false and relies on the console output instead.
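If you want the FPS drawn on the video rather than printed to the console, the same pipeline with the overlay enabled should do it:
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! fpsdisplaysink video-sink=xvimagesink text-overlay=true sync=false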

Related

How to send/receive videotestsrc images encoded as JPEG using RTP with GStreamer?

I want to send/receive videotestsrc images encoded as JPEG using RTP with GStreamer.
I tried the following commands. The test image can be sent and received, but the received image is not correct: it is broken, and appears dark and misaligned.
How can I send and receive the images correctly?
send command
gst-launch-1.0 videotestsrc ! videoconvert ! video/x-raw, format=YUY2 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000
receive command
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, media=video, encoding-name=JPEG, framerate=30/1, payload=26, clock-rate=90000 ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink
Result: [screenshot of the broken received image]
Environment:
Windows 10
GStreamer 1.18.3
This works great under Linux (Ubuntu 20.04), so I suppose there's something about Windows and/or Direct3D. On Linux the received image looks correct, so the pipeline itself seems good.
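If the Direct3D sink is the suspect, one way to test is to swap autovideosink for an explicit sink on the Windows side. A sketch, assuming glimagesink is available in your Windows GStreamer 1.18 build:
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, media=video, encoding-name=JPEG, framerate=30/1, payload=26, clock-rate=90000 ! rtpjpegdepay ! jpegdec ! videoconvert ! glimagesink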

Using v4l2loopback and GStreamer with MJPEG cameras

I have a 4K camera which supports MJPEG and YUY2 formats. Currently, I can run
$ gst-launch-1.0 v4l2src device=/dev/video1 ! "video/x-raw,format=YUY2,width=640,height=480,framerate=30/1" ! tee name=t ! queue ! v4l2sink device=/dev/video20 t. ! queue ! v4l2sink device=/dev/video21
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
and stream the video1 image to two different devices.
Q: How can I pass the MJPEG image from video1 to both video20 and video21, which expect YUY2 format?
In the MJPEG case you need to add image/jpeg caps to v4l2src, and after v4l2src you need to convert it to raw video.
GStreamer has jpegdec and avdec_mjpeg plugins. In my current version jpegdec does not support YUY2 output, so I would use avdec_mjpeg. Alternatively you can use jpegdec with videoconvert (i.e. ... ! jpegdec ! videoconvert ! ...).
The following line should do it:
gst-launch-1.0 v4l2src device=/dev/video1 ! "image/jpeg,width=3840,height=2160,framerate=30/1" ! avdec_mjpeg ! "video/x-raw,format=YUY2,width=3840,height=2160,framerate=30/1" ! tee name=t ! queue ! v4l2sink device=/dev/video20 t. ! queue ! v4l2sink device=/dev/video21
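The jpegdec alternative mentioned above would look like this (same caps, just decoding with jpegdec and letting videoconvert produce YUY2; an untested sketch):
gst-launch-1.0 v4l2src device=/dev/video1 ! "image/jpeg,width=3840,height=2160,framerate=30/1" ! jpegdec ! videoconvert ! "video/x-raw,format=YUY2,width=3840,height=2160,framerate=30/1" ! tee name=t ! queue ! v4l2sink device=/dev/video20 t. ! queue ! v4l2sink device=/dev/video21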

How to record video (1080p 30fps) from raspberry pi camera and stream the 'recording in progress' file simultaneously?

The objective I am trying to achieve is to stream 1080p video from the Raspberry Pi camera and record the video simultaneously.
I tried recording with the HTTP stream as the source, but it didn't work at 30fps: a lot of frames were missing and I got only about 8fps.
As a second approach, I am trying to record the file directly from the camera and then stream the "recording in progress/buffer" file. For this I am trying to use GStreamer. Please suggest whether this is a good option or whether I should try another.
For recording with GStreamer I used:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw,width=1920,height=1080,framerate=30/1" ! videoflip method=clockwise ! videoflip method=clockwise ! videoconvert ! videorate ! x264enc ! avimux ! filesink location=test_video.h264
Result: the recorded video shows 1080p and 30fps, but frames are dropped heavily.
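Software H.264 encoding is usually the bottleneck at 1080p30 on a Pi. A sketch worth trying, assuming x264enc is the culprit (it lowers the encoder load, replaces the two clockwise flips with a single 180° rotation, and uses an .avi extension to match the avimux output):
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080,framerate=30/1 ! videoflip method=rotate-180 ! videoconvert ! x264enc tune=zerolatency speed-preset=ultrafast ! avimux ! filesink location=test_video.avi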
For streaming the video buffer I have used UDP in GStreamer as:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw,width=640,height=480,framerate=30/1" ! x264enc ! queue ! rtph264pay ! udpsink host=192.168.5.1 port=8080
Result: no specific errors in the terminal, but I can't get the stream in VLC.
Please suggest the best method here.
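One note on the VLC side: VLC generally cannot open a raw RTP/H.264 stream without an .sdp file describing it, so the sender may actually be fine. To verify the stream independently of VLC, a minimal GStreamer receiver on the 192.168.5.1 machine would be (a sketch, assuming the default payload type 96):
gst-launch-1.0 udpsrc port=8080 caps="application/x-rtp,media=video,encoding-name=H264,clock-rate=90000,payload=96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink
Adding tune=zerolatency to x264enc and config-interval=1 to rtph264pay on the sender also tends to help live playback, since SPS/PPS are then re-sent periodically.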

Using GStreamer to stream from framebuffer via RTSP

I have built GStreamer, GStreamer RTSP Server, and some related plugins to run streaming via RTSP. The GStreamer RTSP Server examples can use sources such as a webcam (/dev/video0) with v4l2src, videotestsrc, or an .MP4 file with filesrc.
So, how can I stream from the framebuffer (/dev/fb0) via RTSP?
You can grab framebuffers with GStreamer.
Here is an example:
gst-launch-1.0 -v multifilesrc location=/dev/fb0 ! videoparse format=29 width=1680 height=1080 framerate=30/1 ! decodebin ! videoconvert ! autovideosink sync=false
You then have to adapt it to your RTSP application.
I typed this command in /gst-rtsp-server/example:
sudo ./test-launch "( multifilesrc location=/dev/fb0 ! videoparse format=29 framerate=30/1 ! decodebin ! videoconvert ! x264enc ! rtph264pay name=pay0 pt=96 )"
But I got this error:
stream ready at rtsp://127.0.0.1:8554/test
x264 [error]: baseline profile doesn't support 4:4:4
Viewing with VLC:
vlc rtsp://127.0.0.1:8554/test
I only get a black screen.
Framebuffer info:
mode "1280x720"
geometry 1280 720 1280 720 32
timings 0 0 0 0 0 0 0
rgba 8/0,8/8,8/16,8/24
endmode
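The x264 error indicates x264enc is being fed 4:4:4 video, presumably because videoconvert keeps the 32-bit RGBA framebuffer at full chroma resolution, which the baseline profile doesn't support. A sketch of a possible fix, forcing I420 before the encoder and adding the width/height from the fbset output above (untested):
sudo ./test-launch "( multifilesrc location=/dev/fb0 ! videoparse format=29 width=1280 height=720 framerate=30/1 ! decodebin ! videoconvert ! video/x-raw,format=I420 ! x264enc ! rtph264pay name=pay0 pt=96 )"
The missing width/height in the original command could also explain the black screen, since videoparse would slice the buffer at its default frame size.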

How do I use GStreamer gst-launch to re-mux an interlaced h264 stream from an RTSP source?

Is there any way to take an interlaced h264 stream and re-mux it into Matroska or QT container form in a way that players will correctly identify it as interlaced? My basic pipeline is:
gst-launch-1.0 -e souphttpsrc location="http://hostname/stream1.sdp" ! application/sdp ! sdpdemux ! rtpjitterbuffer ! rtph264depay ! h264parse ! matroskamux ! filesink location=test.mkv
This saves the file just fine, but in all the players I've tried, the interlaced form of the video was not detected.
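For reference, the QT-container variant of the same pipeline would be the following (a sketch; note that swapping the muxer alone may not change how players detect interlacing):
gst-launch-1.0 -e souphttpsrc location="http://hostname/stream1.sdp" ! application/sdp ! sdpdemux ! rtpjitterbuffer ! rtph264depay ! h264parse ! qtmux ! filesink location=test.mov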