Capture still images using GStreamer on RPi4

I am trying to use GStreamer to capture images from a camera at a specific time interval. While searching the web I found the following pipeline:
gst-launch-1.0 -v videotestsrc is-live=true ! clockoverlay font-desc="Sans, 48" ! videoconvert ! videorate ! video/x-raw,framerate=1/3 ! jpegenc ! multifilesink location=file-%02d.jpg
I believe it would work great, but unfortunately I don't know how to make it work with my specific camera. That is, I don't know how to identify my video source on the RPi4, or whether that is the only thing I have to change. I would appreciate either help with the video source, or any other method to capture those images.
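On a Raspberry Pi the camera usually appears as a V4L2 device, which you can list with v4l2-ctl (from the v4l-utils package):

v4l2-ctl --list-devices

If the camera shows up as, say, /dev/video0, swapping videotestsrc for v4l2src should be the only change needed. A sketch, assuming /dev/video0 and a camera that can output raw video:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! videoconvert ! videorate ! video/x-raw,framerate=1/3 ! jpegenc ! multifilesink location=file-%02d.jpg

Note that on recent Raspberry Pi OS releases the official camera module is driven by libcamera, so you may need libcamerasrc (from the gstreamer1.0-libcamera package) in place of v4l2src.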

Related

Gstreamer screenshot from RTSP stream is always gray

I'm trying to create a screenshot (i.e. grab one frame) from an RTSP camera stream using a gstreamer pipeline.
The pipeline used looks like this:
gst-launch-1.0 rtspsrc location=$CAM_URL is_live=true ! decodebin ! videoconvert ! jpegenc snapshot=true ! filesink location=/tmp/frame.jpg
The problem is that the resulting image is always gray, with random artifacts. It looks like the pipeline grabs the very first frame and doesn't wait for a key frame.
Is there any way to modify the pipeline so that it grabs the first valid frame of the video? Or to simply wait long enough to be sure at least one key frame has already arrived?
I'm unsure why, but after some trial and error it is now working with decodebin3 instead of decodebin. The documentation is still a bit discouraging, though, stating that decodebin3 is still an experimental API and a technology preview, and that its behaviour and exposed API are subject to change.
Full pipeline looks like this:
gst-launch-1.0 rtspsrc location=$CAM_URL is_live=true ! decodebin3 ! videoconvert ! jpegenc snapshot=true ! filesink location=/tmp/frame.jpg
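If decodebin3 is unavailable, an alternative (an untested sketch, assuming the camera sends H.264) is to depay and decode explicitly; h264parse plus a real decoder will typically not emit a frame until a decodable key frame has arrived:

gst-launch-1.0 rtspsrc location=$CAM_URL ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! jpegenc snapshot=true ! filesink location=/tmp/frame.jpg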

Gstreamer: Save image/jpeg using multifilesink every 5 seconds

I am trying to figure out how to save an image using multifilesink every N seconds (let's say 5). My gst-launch-1.0 pipeline is below:
gst-launch-1.0 videotestsrc ! 'video/x-raw, format=I420, width=400, height=400, framerate=1/5' ! jpegenc ! multifilesink location=/some/location/img_%06d.jpg
I was thinking the framerate option could control the capture rate, but it seems not to affect anything. How can I slow this pipeline down so that it only saves a JPEG every N seconds?
Edit: So I figured out that this will work with videotestsrc if you set "is-live=true", but I would like to do this with an nvcamerasrc or nvarguscamerasrc.
When the videotestsrc is not running as a live source, it will pump out frames as fast as it can, updating timestamps based on the output framerate configured on the source pad.
Setting it to live-mode will ensure that it actually matches the expected framerate.
This shouldn't be an issue with a true live source like a camera source.
However, something like this can force synchronization with videotestsrc:
gst-launch-1.0.exe videotestsrc ! video/x-raw, format=I420, width=400, height=400, framerate=1/5 ! identity sync=true ! timeoverlay ! jpegenc ! multifilesink location="/some/location/img_%06d.jpg"
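For a real camera source the same idea should carry over: let the camera run at its native rate and have videorate drop frames down to one every 5 seconds. An untested sketch for a Jetson-style nvarguscamerasrc (the NVMM caps and the nvvidconv element are NVIDIA-specific assumptions):

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1' ! nvvidconv ! videorate ! video/x-raw,framerate=1/5 ! jpegenc ! multifilesink location=/some/location/img_%06d.jpg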

How to record video (1080p 30fps) from raspberry pi camera and stream the 'recording in progress' file simultaneously?

The objective I am trying to achieve is streaming 1080p video from the Raspberry Pi camera and recording the video simultaneously.
I tried recording with the HTTP stream as the source, but it didn't work at 30fps: a lot of frames were missing and I got only about 8fps.
As a second approach, I am trying to record the file directly from the camera and then stream the "recording in progress/buffer" file. For this I am trying to use GStreamer. Please suggest whether this is a good option or whether I should try something else.
For recording with GStreamer I used:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw,width=1920,height=1080,framerate=30/1" ! videoflip method=clockwise ! videoflip method=clockwise ! videoconvert ! videorate ! x264enc ! avimux ! filesink location=test_video.h264
Result: the recorded video shows 1080p and 30fps, but frames drop heavily.
For streaming the video buffer I used UDP in GStreamer:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw,width=640,height=480,framerate=30/1" ! x264enc ! queue ! rtph264pay ! udpsink host=192.168.5.1 port=8080
Result: no specific errors on the terminal, but I can't get the stream in VLC.
Please suggest the best method here.
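One common approach (a sketch, not from this thread) is to encode once and tee the encoded stream into a recording branch and a streaming branch, so the Pi only pays for a single x264enc; tune=zerolatency and speed-preset=ultrafast reduce the encoder load that is likely causing the dropped frames:

gst-launch-1.0 -e v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080,framerate=30/1 ! videoconvert ! x264enc tune=zerolatency speed-preset=ultrafast ! tee name=t \
  t. ! queue ! h264parse ! matroskamux ! filesink location=test_video.mkv \
  t. ! queue ! rtph264pay config-interval=1 ! udpsink host=192.168.5.1 port=8080

Also note that VLC generally cannot open a raw RTP-over-UDP stream without an SDP file describing it, which may be why the streaming pipeline produced no picture.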

GStreamer Playbin video speed is too fast

I am trying to create a virtual webcam using GStreamer and v4l2loopback. The problem is that I want to use playbin, but the video speed is too fast when I use it. For instance, it happens when I execute the following command:
gst-launch-1.0 -v playbin uri=file:/vagrant/test.avi
video-sink="videoconvert
! videoscale
! video/x-raw,format=YUY2,width=320,height=320
! v4l2sink device=/dev/video0"
Adding "framerate=20/1" to the caps throws "Not negotiated error" while setting it to "30/1" works but doesn't help to fix the issue with the speed.
On the other hand, I am getting normal speed when executing the following command:
gst-launch-1.0 -v filesrc location=/vagrant/test.avi
! avidemux
! decodebin
! videoconvert
! videoscale
! "video/x-raw,format=YUY2,width=320,height=320"
! v4l2sink device=/dev/video0
I tried a lot of combinations of filters from the last example with playbin, but none of them helped.
Any help would be highly appreciated!
The problem was with the virtual machine running on top of VirtualBox. To be more precise, I had 3D acceleration turned on, which caused all videos to play at 2x speed.
Turning off 3D acceleration by setting --accelerate3d=off solved the issue.
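For reference, the setting can be changed with VBoxManage while the VM is powered off ("vm-name" below is a placeholder for your machine's name):

VBoxManage modifyvm "vm-name" --accelerate3d off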

Displaying gstreamer in Oculus Rift

So I have a fisheye camera piped through GStreamer, over the internet, to another PC where I want to display it on an Oculus Rift. The Oculus expects a 1280×800 input just like a normal monitor, but the left 640×800 of the screen displays in the left eye and the right 640×800 in the right eye.
I need to modify this:
gst-launch-1.0 -e -v udpsrc port=5001 ! application/x-rtp, payload=96 ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! fpsdisplaysink sync=false text-overlay=false
to show the stream twice, side by side. If I run this command and press Win+Left Arrow, it displays really well in one eye. The Oculus even crops out the edges (read: window decorations). But GStreamer won't let me run gst-launch twice at the same time. Any way to make it work? Admittedly, it's quite a hack, but it seemed to work quite well in the one eye.
Alternatively, can someone help me use videomixer?
Windows 8, btw.
Thanks!
You should be able to duplicate the video with a tee:
... ! tee name=t ! queue ! videomixer name=m sink_0::xpos=0 sink_1::xpos=640 ! ... t. ! queue ! m.
The key is to use the videomixer's pad properties to position the two copies.
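Filled into the original pipeline, that would look something like the following (an untested sketch; the decoded stream is teed into both videomixer inputs, with the second copy offset 640 pixels to the right):

gst-launch-1.0 -e -v udpsrc port=5001 ! application/x-rtp, payload=96 ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! tee name=t ! queue ! videomixer name=m sink_0::xpos=0 sink_1::xpos=640 ! videoconvert ! fpsdisplaysink sync=false text-overlay=false t. ! queue ! m.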